Sample records for activation analysis and analytical methods

  1. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and fingerprint.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, corona/lens-based collection provides a new path to high sensitivity. An active-matrix-based analyte collection approach referred to as an "airborne analyte memory chip/recorder" is demonstrated, which takes and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Analyte detection using an active assay

    DOEpatents

    Morozov, Victor; Bailey, Charles L.; Evanskey, Melissa R.

    2010-11-02

    Analytes using an active assay may be detected by introducing an analyte solution containing a plurality of analytes to a lacquered membrane. The lacquered membrane may be a membrane having at least one surface treated with a layer of polymers. The lacquered membrane may be semi-permeable to nonanalytes. The layer of polymers may include cross-linked polymers. A plurality of probe molecules may be arrayed and immobilized on the lacquered membrane. An external force may be applied to the analyte solution to move the analytes towards the lacquered membrane. Movement may cause some or all of the analytes to bind to the lacquered membrane. In cases where probe molecules are present, some or all of the analytes may bind to probe molecules. The direction of the external force may be reversed to remove unbound or weakly bound analytes. Bound analytes may be detected using known detection methods.

  3. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity, outlines the scope of the Analytical Chemistry and Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and considers the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions relating to analytical and/or environmental chemistry, whether new data, methods, case studies, and instrumentation, or new interpretations and developments of existing ones, are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.

  4. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of the 2015 Wepal proficiency test are presented. Only elements with radioactive isotopes of medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results obtained |Z-scores| ≤ 3.
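
    The |Z-score| ≤ 3 acceptance criterion used in proficiency testing can be sketched as follows; the element values and standard deviations below are made up for illustration, not taken from the Wepal round.

```python
# Proficiency-test Z-score: Z = (x_lab - x_assigned) / sigma_p,
# where sigma_p is the standard deviation for proficiency assessment.
def z_score(x_lab, x_assigned, sigma_p):
    return (x_lab - x_assigned) / sigma_p

# Hypothetical lab results: element -> (lab value, assigned value, sigma_p)
results = {"Fe": (4.61, 4.50, 0.15), "Zn": (102.0, 95.0, 4.0)}

for element, (lab, assigned, sigma) in results.items():
    z = z_score(lab, assigned, sigma)
    verdict = "satisfactory" if abs(z) <= 3 else "unsatisfactory"
    print(f"{element}: Z = {z:+.2f} ({verdict})")
```

    A |Z| ≤ 2 is conventionally "satisfactory", 2 < |Z| < 3 "questionable", and |Z| ≥ 3 "unsatisfactory"; the abstract reports only the ≤ 3 threshold, so the sketch keeps that single cut-off.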

  5. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies where known patterns can be mined for. With a human in the loop, analysts can also bring domain knowledge and subject matter expertise to the process. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive…

  6. Calibration-free concentration analysis for an analyte prone to self-association.

    PubMed

    Imamura, Hiroshi; Honda, Shinya

    2017-01-01

    Calibration-free concentration analysis (CFCA) based on surface plasmon resonance uses the diffusion coefficient of an analyte to determine the concentration of that analyte in a bulk solution. In general, CFCA is avoided when investigating analytes prone to self-association, as the heterogeneous diffusion coefficient results in a loss of precision. Here, the derivation is extended to self-associating analytes. By using the diffusion coefficient of the monomeric state, CFCA provides the lowest possible concentration even when the analyte is self-associated. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. FASP, an analytic resource appraisal program for petroleum play analysis

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.

  8. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were quantified by Risk Priority Numbers (RPN = O × D × S). Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
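
    The RPN ranking described in the record above can be sketched in a few lines; the failure modes and scores below are hypothetical, not taken from the study.

```python
# FMEA Risk Priority Number: RPN = O (occurrence) * D (detectability) * S (severity),
# each scored on a 1-10 scale. Higher RPN = higher-priority failure mode.
failure_modes = [
    # (description, O, D, S) -- illustrative values only
    ("wrong reference spectrum selected", 4, 7, 8),
    ("sample vial mislabelled",           3, 8, 9),
    ("instrument drift not detected",     2, 6, 5),
]

def rpn(mode):
    _, o, d, s = mode
    return o * d * s

# Rank failure modes so corrective actions target the highest RPN first.
ranked = sorted(failure_modes, key=rpn, reverse=True)
for name, o, d, s in ranked:
    print(f"RPN={o * d * s:3d}  {name}")
```

    After corrective actions, the scoring is repeated and an improvement index can be taken as the ratio of old to new RPN, which is how the abstract's "improvement indices up to 5.0" would arise.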

  9. An overview on forensic analysis devoted to analytical chemists.

    PubMed

    Castillo-Peinado, L S; Luque de Castro, M D

    2017-05-15

    The main aim of the present article is to show analytical chemists interested in forensic analysis the world they will face should they decide to become forensic analytical chemists. With this purpose, the most outstanding aspects of forensic analysis in dealing with sampling (involving both bodily and non-bodily samples), sample preparation, and the analytical equipment used in detection, identification and quantitation of key sample components are critically discussed. The role of the great omics in forensic analysis, and the growing role of the youngest of them, metabolomics, are also discussed. The foreseeable role of integrative omics is also outlined. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Analytic process and dreaming about analysis.

    PubMed

    Sirois, François

    2016-12-01

    Dreams about the analytic session feature a manifest content in which the analytic setting is subject to distortion while the analyst appears undisguised. Such dreams are a consistent yet infrequent occurrence in most analyses. Their specificity consists in never reproducing the material conditions of the analysis as such. This paper puts forward the following hypothesis: dreams about the session relate to some aspects of the analyst's activity. In this sense, such dreams are indicative of the transference neurosis, prefiguring transference resistances to the analytic elaboration of key conflicts. The parts taken by the patient and by the analyst are discussed in terms of their ability to signal a deepening of the analysis. Copyright © 2016 Institute of Psychoanalysis.

  11. An analytic data analysis method for oscillatory slug tests.

    PubMed

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extremities, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.

  12. Validation of an Active Gear, Flexible Aircraft Take-off and Landing analysis (AGFATL)

    NASA Technical Reports Server (NTRS)

    Mcgehee, J. R.

    1984-01-01

    The results of an analytical investigation using a computer program for active gear, flexible aircraft take-off and landing analysis (AGFATL) are compared with experimental data from shaker tests, drop tests, and simulated landing tests to validate the AGFATL computer program. Comparison of experimental and analytical responses for both passive and active gears indicates good agreement for shaker tests and drop tests. For the simulated landing tests, the passive and active gears were influenced by large strut binding friction forces. The inclusion of these undefined forces in the analytical simulations was difficult, and consequently only fair to good agreement was obtained. An assessment of the results from the investigation indicates that the AGFATL computer program is a valid tool for the study and initial design of series hydraulic active control landing gear systems.

  13. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  14. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  15. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    NASA Astrophysics Data System (ADS)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.

  16. An analytical pipeline to compare and characterise the anthocyanin antioxidant activities of purple sweet potato cultivars.

    PubMed

    Hu, Yijie; Deng, Liqing; Chen, Jinwu; Zhou, Siyu; Liu, Shuang; Fu, Yufan; Yang, Chunxian; Liao, Zhihua; Chen, Min

    2016-03-01

    Purple sweet potato (Ipomoea batatas L.) is rich in anthocyanin pigments, which are valuable constituents of the human diet. Techniques to identify and quantify anthocyanins and their antioxidant potential are desirable for cultivar selection and breeding. In this study, we performed a quantitative and qualitative chemical analysis of 30 purple sweet potato (PSP) cultivars, using various assays to measure reducing power, radical-scavenging activities, and linoleic acid autoxidation inhibition activity. Grey relational analysis (GRA) was applied to establish relationships between the antioxidant activities and the chemical fingerprints, in order to identify key bioactive compounds. The results indicated that four peonidin-based anthocyanins and three cyanidin-based anthocyanins make significant contributions to antioxidant activity. We conclude that the analytical pipeline described here represents an effective method to evaluate the antioxidant potential of, and the contributing compounds present in, PSP cultivars. This approach may be used to guide future breeding strategies. Copyright © 2015. Published by Elsevier Ltd.

  17. Exploratory Analysis in Learning Analytics

    ERIC Educational Resources Information Center

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  18. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  19. An analytical technique for predicting the characteristics of a flexible wing equipped with an active flutter-suppression system and comparison with wind-tunnel data

    NASA Technical Reports Server (NTRS)

    Abel, I.

    1979-01-01

    An analytical technique for predicting the performance of an active flutter-suppression system is presented. This technique is based on the use of an interpolating function to approximate the unsteady aerodynamics. The resulting equations are formulated in terms of linear, ordinary differential equations with constant coefficients. This technique is then applied to an aeroelastic model wing equipped with an active flutter-suppression system. Comparisons between wind-tunnel data and analysis are presented for the wing both with and without active flutter suppression. Results indicate that the wing flutter characteristics without flutter suppression can be predicted very well but that a more adequate model of wind-tunnel turbulence is required when the active flutter-suppression system is used.

  20. Active Flash: Out-of-core Data Analytics on Flash Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boboila, Simona; Kim, Youngjae; Vazhkudai, Sudharshan S

    2012-01-01

    Next generation science will increasingly come to rely on the ability to perform efficient, on-the-fly analytics of data generated by high-performance computing (HPC) simulations, modeling complex physical phenomena. Scientific computing workflows are stymied by the traditional chaining of simulation and data analysis, creating multiple rounds of redundant reads and writes to the storage system, which grows in cost with the ever-increasing gap between compute and storage speeds in HPC clusters. Recent HPC acquisitions have introduced compute node-local flash storage as a means to alleviate this I/O bottleneck. We propose a novel approach, Active Flash, to expedite data analysis pipelines by migrating to the location of the data, the flash device itself. We argue that Active Flash has the potential to enable true out-of-core data analytics by freeing up both the compute core and the associated main memory. By performing analysis locally, dependence on limited bandwidth to a central storage system is reduced, while allowing this analysis to proceed in parallel with the main application. In addition, offloading work from the host to the more power-efficient controller reduces peak system power usage, which is already in the megawatt range and poses a major barrier to HPC system scalability. We propose an architecture for Active Flash, explore energy and performance trade-offs in moving computation from host to storage, demonstrate the ability of appropriate embedded controllers to perform data analysis and reduction tasks at speeds sufficient for this application, and present a simulation study of Active Flash scheduling policies. These results show the viability of the Active Flash model, and its capability to potentially have a transformative impact on scientific data analysis.

  1. Analytical study of acoustically perturbed Brillouin active magnetized semiconductor plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shukla, Arun, E-mail: arunshuklaujn@gmail.com; Jat, K. L.

    2015-07-31

    An analytical study of acoustically perturbed Brillouin active magnetized semiconductor plasma has been reported. In the present analytical investigation, the lattice displacement, acousto-optical polarization, susceptibility, and acousto-optical gain constant arising due to the induced nonlinear current density and the acousto-optical process are deduced in an acoustically perturbed Brillouin active magnetized semiconductor plasma using the hydrodynamical model of plasma and a coupled mode scheme. The influence of wave number and magnetic field has been explored. The analysis has been applied to a centrosymmetric crystal. Numerical estimates are made for an n-type InSb crystal duly irradiated by a frequency-doubled 10.6 µm CO2 laser. It is found that lattice displacement, susceptibility and acousto-optical gain increase linearly with incident wave number and applied dc magnetic field, while decreasing with scattering angle. The gain also increases with the electric amplitude of the incident laser beam. Results are found to be well in agreement with the available literature.

  2. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  3. An analytical study of a six degree-of-freedom active truss for use in vibration control

    NASA Technical Reports Server (NTRS)

    Wynn, Robert H., Jr.; Robertshaw, Harry H.; Horner, C. Garnett

    1990-01-01

    An analytical study of the vibration control capabilities of three configurations of an active truss is presented. The truss studied is composed of two bays of an octahedral-octahedral configuration. The three configurations of the active truss studied are: all six battens activated (6 DOF), the top three battens activated (3 DOF), and the bottom three battens activated (3 DOF). The closed-loop vibration control responses of these three configurations are studied with respect to vibration attenuation, energy utilized, and the effects of motor drive amplifier saturation non-linearities.

  4. Analytical determination of flavonoids aimed to analysis of natural samples and active packaging applications.

    PubMed

    Castro-López, María del Mar; López-Vilariño, José Manuel; González-Rodríguez, María Victoria

    2014-05-01

    Several HPLC and UHPLC methods were developed and compared for the analysis of the natural antioxidants catechins and quercetin used in active packaging and functional foods. A photodiode array detector coupled with a fluorescence detector was used and compared with LTQ-Orbitrap-MS. UHPLC was investigated as a quick alternative without compromising the separation, shortening analysis time up to 6-fold. The feasibility of the four developed methods was compared. Linearity up to 0.9995, low detection limits (between 0.02 and 0.7 for HPLC-PDA, 2- to 7-fold lower for HPLC-LTQ-Orbitrap-MS, and from 0.2 to 2 mg L⁻¹ for UHPLC-PDA) and good precision parameters (RSD lower than 0.06%) were obtained. All methods were successfully applied to natural samples. LTQ-Orbitrap-MS also allowed the identification of other analytes of interest. Good feasibility of the methods was also concluded from the analysis of catechin and quercetin release from new active packaging materials based on polypropylene with added catechins and green tea. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. ANALYTiC: An Active Learning System for Trajectory Classification.

    PubMed

    Soares Junior, Amilcar; Renso, Chiara; Matwin, Stan

    2017-01-01

    The increasing availability and use of positioning devices has resulted in large volumes of trajectory data. However, semantic annotations for such data are typically added by domain experts, which is a time-consuming task. Machine-learning algorithms can help infer semantic annotations from trajectory data by learning from sets of labeled data. Specifically, active learning approaches can minimize the set of trajectories to be annotated while preserving good performance measures. The ANALYTiC web-based interactive tool visually guides users through this annotation process.
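
    The core active-learning idea in the record above, labeling only the trajectories the current model is least certain about, can be sketched generically; the uncertainty measure, toy model, and data below are illustrative assumptions, not the ANALYTiC system's actual method.

```python
# Generic uncertainty sampling: rank unlabeled items by how close the model's
# predicted probability is to 0.5, and send the most ambiguous ones to the
# human annotator. This minimizes the number of trajectories to label.
def uncertainty(model, item):
    p = model(item)                 # predicted probability of class 1
    return 1.0 - abs(p - 0.5) * 2   # 1.0 = maximally uncertain, 0.0 = certain

def select_for_annotation(model, unlabeled, budget):
    # Most-uncertain-first; annotate only `budget` items per round.
    return sorted(unlabeled, key=lambda item: uncertainty(model, item),
                  reverse=True)[:budget]

# Toy "model": probability of class 1 grows with the trajectory's mean speed.
toy_model = lambda speed: min(max(speed / 100.0, 0.0), 1.0)
unlabeled_speeds = [5, 48, 52, 95, 70]
print(select_for_annotation(toy_model, unlabeled_speeds, budget=2))  # [48, 52]
```

    In a real pipeline the model would be retrained after each annotation round, so the uncertainty ranking shifts as labels accumulate.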

  6. Focused analyte spray emission apparatus and process for mass spectrometric analysis

    DOEpatents

    Roach, Patrick J [Kennewick, WA; Laskin, Julia [Richland, WA; Laskin, Alexander [Richland, WA

    2012-01-17

    An apparatus and process are disclosed that deliver an analyte deposited on a substrate to a mass spectrometer that provides for trace analysis of complex organic analytes. Analytes are probed using a small droplet of solvent that is formed at the junction between two capillaries. A supply capillary maintains the droplet of solvent on the substrate; a collection capillary collects analyte desorbed from the surface and emits analyte ions as a focused spray to the inlet of a mass spectrometer for analysis. The invention enables efficient separation of desorption and ionization events, providing enhanced control over transport and ionization of the analyte.

  7. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  8. An Overview of Learning Analytics

    ERIC Educational Resources Information Center

    Zilvinskis, John; Willis, James, III; Borden, Victor M. H.

    2017-01-01

    The purpose of this chapter is to provide administrators and faculty with an understanding of learning analytics and its relationship to existing roles and functions so better institutional decisions can be made about investments and activities related to these technologies.

  9. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed to cross-check the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca contents found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.

  10. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    ERIC Educational Resources Information Center

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  11. Dynamic analysis of a flexible spacecraft with rotating components. Volume 1: Analytical developments

    NASA Technical Reports Server (NTRS)

    Bodley, C. S.; Devers, A. D.; Park, A. C.

    1975-01-01

    Analytical procedures and digital computer code are presented for the dynamic analysis of a flexible spacecraft with rotating components. Topics considered include: (1) nonlinear response in the time domain, and (2) linear response in the frequency domain. The spacecraft is assumed to consist of an assembly of connected rigid or flexible subassemblies. The total system is not restricted to a particular topological connection arrangement and may be acting under the influence of passive or active control systems and external environments. The analysis and associated digital code provide the user with the capability to establish spacecraft system nonlinear total response for specified initial conditions, linear perturbation response about a calculated or specified nominal motion, general frequency response and graphical display, and spacecraft system stability analysis.

  12. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  13. Wearable Networked Sensing for Human Mobility and Activity Analytics: A Systems Study.

    PubMed

    Dong, Bo; Biswas, Subir

    2012-01-01

    This paper presents implementation details, system characterization, and the performance of a wearable sensor network that was designed for human activity analysis. Specific machine learning mechanisms are implemented for recognizing a target set of activities with both out-of-body and on-body processing arrangements. Impacts of energy consumption by the on-body sensors are analyzed in terms of activity detection accuracy for out-of-body processing. Impacts of limited processing abilities in the on-body scenario are also characterized in terms of detection accuracy, by varying the background processing load in the sensor units. Through a rigorous systems study, it is shown that an efficient human activity analytics system can be designed and operated even under energy and processing constraints of tiny on-body wearable sensors.
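
    As an illustration of the recognition step described above, here is a minimal sketch of activity classification from windowed sensor features; the features, classes, and classifier are invented for the example and are not taken from the paper:

```python
# Hypothetical activity recognition from wearable-style features: each window
# of sensor data is summarized as [mean acceleration, variance] and classified.
# Synthetic, well-separated clusters stand in for real accelerometer data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
walking = rng.normal([1.0, 0.5], 0.1, size=(50, 2))    # high mean, high variance
sitting = rng.normal([0.1, 0.05], 0.02, size=(50, 2))  # low mean, low variance
X = np.vstack([walking, sitting])
y = ["walking"] * 50 + ["sitting"] * 50

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[0.95, 0.45]])[0])  # point deep in the walking cluster
```

    On real on-body hardware, the paper's point is that the choice of classifier and feature set must also respect the energy and processing budgets of the sensor units.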

  14. An analysis of beam parameters on proton-acoustic waves through an analytic approach.

    PubMed

    Kipergil, Esra Aytac; Erkol, Hakan; Kaya, Serhat; Gulsen, Gultekin; Unlu, Mehmet Burcin

    2017-06-21

    It has been reported that acoustic waves are generated when a high-energy pulsed proton beam is deposited in a small volume within tissue. One possible application of proton-induced acoustics is to obtain real-time feedback for intra-treatment adjustments by monitoring such acoustic waves. A high spatial resolution in ultrasound imaging may reduce proton range uncertainty. Thus, it is crucial to understand the dependence of the acoustic waves on the proton beam characteristics. In this manuscript, an analytic solution for the proton-induced acoustic wave is first presented to reveal the dependence of the signal on the beam parameters; it is then combined with an analytic approximation of the Bragg curve. The influence of beam energy, pulse duration and beam diameter variation on the acoustic waveform is investigated. Further analysis is performed on the Fourier decomposition of the proton-acoustic signals. Our results show that a smaller spill time of the proton beam increases the amplitude of the acoustic wave for a constant number of protons, which is hence beneficial for dose monitoring. An increase in the energy of each individual proton in the beam leads to spatial broadening of the Bragg curve, which also yields acoustic waves of greater amplitude. The pulse duration and the beam width of the proton beam do not affect the central frequency of the acoustic wave, but they change the amplitude of the spectral components.

  15. SOMFlow: Guided Exploratory Cluster Analysis with Self-Organizing Maps and Analytic Provenance.

    PubMed

    Sacha, Dominik; Kraus, Matthias; Bernard, Jurgen; Behrisch, Michael; Schreck, Tobias; Asano, Yuki; Keim, Daniel A

    2018-01-01

    Clustering is a core building block for data analysis, aiming to extract otherwise hidden structures and relations from raw datasets, such as particular groups that can be effectively related, compared, and interpreted. A plethora of visual-interactive cluster analysis techniques has been proposed to date; however, arriving at useful clusterings often requires several rounds of user interaction to fine-tune the data preprocessing and algorithms. We present a multi-stage Visual Analytics (VA) approach for iterative cluster refinement together with an implementation (SOMFlow) that uses Self-Organizing Maps (SOM) to analyze time series data. It supports exploration by offering the analyst a visual platform to analyze intermediate results, adapt the underlying computations, iteratively partition the data, and reflect on previous analytical activities. The history of previous decisions is explicitly visualized within a flow graph, allowing the analyst to compare earlier cluster refinements and to explore relations. We further leverage quality and interestingness measures to guide the analyst in the discovery of useful patterns, relations, and data partitions. We conducted two pair analytics experiments together with a subject matter expert in speech intonation research to demonstrate that the approach is effective for interactive data analysis, supporting enhanced understanding of clustering results as well as the interactive process itself.

  16. Analytical Essay Writing: A New Activity Introduced to a Traditional Curriculum

    ERIC Educational Resources Information Center

    Kommalage, Mahinda

    2012-01-01

    Medical students following a traditional curriculum get few opportunities to engage in activities such as a literature search, scientific writing, and active and collaborative learning. An analytical essay writing activity (AEWA) in physiology was introduced to first-year students. Each student prepared an essay incorporating new research findings…

  17. Determining an Effective Intervention within a Brief Experimental Analysis for Reading: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Burns, Matthew K.; Wagner, Dana

    2008-01-01

    The current study applied meta-analytic procedures to brief experimental analysis research of reading fluency interventions to better inform practice and suggest areas for future research. Thirteen studies were examined to determine what magnitude of effect was needed to identify an intervention as the most effective within a brief experimental…

  18. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  19. An analytics of electricity consumption characteristics based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Feng, Junshu

    2018-02-01

    More detailed analysis of electricity consumption characteristics can make demand side management (DSM) much more targeted. In this paper, an analysis of electricity consumption characteristics based on principal component analysis (PCA) is presented, in which the PCA method is used to extract the main typical characteristics of electricity consumers. An electricity consumption characteristics matrix is then designed, enabling comparison of the typical consumption characteristics of different types of consumers, such as industrial consumers, commercial consumers and residents. In our case study, electricity consumption is divided into four main characteristics: extreme peak using, peak using, peak-shifting using and others. Moreover, it was found that industrial consumers often shift their peak load, whereas commercial and residential consumers have more peak-time consumption. These conclusions can provide decision support for DSM to the government and power providers.
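
    The characteristic-extraction step described above can be sketched with scikit-learn; the load profiles and the number of components below are illustrative assumptions, not the paper's data:

```python
# Sketch of extracting consumption characteristics with PCA: rows are
# consumers, columns are 24 hourly load readings (synthetic data).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
profiles = rng.random((100, 24))  # hypothetical daily load profiles

# Project each consumer onto a small number of principal components,
# which play the role of the "typical characteristics" in the abstract.
pca = PCA(n_components=4)
scores = pca.fit_transform(profiles)

print(scores.shape)  # each consumer is now described by 4 characteristic scores
print(f"variance explained: {pca.explained_variance_ratio_.sum():.2f}")
```

    Comparing the component scores across consumer groups (industrial, commercial, residential) would then yield the kind of characteristics matrix the paper describes.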

  20. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  1. An analytical design approach for self-powered active lateral secondary suspensions for railway vehicles

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Li, Hong; Zhang, Jiye; Mei, TX

    2015-10-01

    In this paper, an analytical design approach for the development of self-powered active suspensions is investigated and applied to optimise the control system design of an active lateral secondary suspension for railway vehicles. The conditions for energy balance are analysed and the relationship between ride quality improvement and energy consumption is discussed in detail. Modal skyhook control is applied to analyse the energy consumption of this suspension by separating its dynamics into the lateral and yaw modes, and, based on a simplified model, the average power consumption of the actuators is computed in the frequency domain using the power spectral density of the lateral alignment of track irregularities. The impact of control gains and key actuator parameters on both vibration suppression and energy recovery/storage performance is then analysed. Computer simulation is used to verify the obtained energy balance condition and to demonstrate that improved ride comfort is achieved by this self-powered active suspension without any external power supply.

  2. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Analytical Tools for Affordability Analysis. David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882. Topics include: unit cost as a function of learning and rate (Womer); and learning with forgetting, in which learning depreciates over time (Benkard).

  3. An interlaboratory transfer of a multi-analyte assay between continents.

    PubMed

    Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew

    2015-01-01

    Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities, as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to focus on the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex works to support DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late-stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support and a laboratory needed for new or additional support can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove significant when transferring an assay between laboratories on different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, a subsequent investigation follows, which may also prove challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.

  4. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Analytical and multibody modeling for the power analysis of standing jumps.

    PubMed

    Palmieri, G; Callegari, M; Fioretti, S

    2015-01-01

    Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity at three specified instants of the jump. The second method is based on a multibody model that simulates the jumps by processing the data obtained by a three-dimensional (3D) motion capture system and the dynamometric measurements obtained from the force platforms. The multibody model is developed with OpenSim, an open-source software package which provides tools for the kinematic and dynamic analysis of 3D human body models. The study focuses on two of the typical tests used to evaluate the muscular activity of the lower limbs: the countermovement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of the total mechanical work and mean power exerted in standing jumps.
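
    As a plausible reading of the analytical approach described above (not the paper's exact formulation), mechanical work and mean power in a jump can be estimated from center-of-gravity heights alone; all numbers below are hypothetical:

```python
# Basic energetics of a standing jump from center-of-gravity (CoG) heights:
# work against gravity during push-off and the mean power over that phase.
# Mass, heights, and push-off time are illustrative values.
m = 70.0   # body mass, kg (hypothetical)
g = 9.81   # gravitational acceleration, m/s^2

h_lowest, h_takeoff = 0.70, 1.05   # CoG height at deepest crouch and at takeoff, m
push_off_time = 0.30               # duration of the push-off phase, s (hypothetical)

work = m * g * (h_takeoff - h_lowest)  # J, work raising the CoG during push-off
mean_power = work / push_off_time      # W, mean power over the push-off phase
print(f"work: {work:.1f} J, mean power: {mean_power:.1f} W")
```

    This is the level of approximation at which three CoG positions suffice; the multibody model in the paper refines it with full kinematic and force-platform data.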

  6. Analytical-HZETRN Model for Rapid Assessment of Active Magnetic Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Washburn, S. A.; Blattnig, S. R.; Singleterry, R. C.; Westover, S. C.

    2014-01-01

    The use of active radiation shielding designs has the potential to reduce the radiation exposure received by astronauts on deep-space missions at a significantly lower mass penalty than designs utilizing only passive shielding. Unfortunately, the determination of the radiation exposure inside these shielded environments often involves lengthy and computationally intensive Monte Carlo analysis. In order to evaluate the large trade space of design parameters associated with a magnetic radiation shield design, an analytical model was developed for the determination of flux inside a solenoid magnetic field due to the Galactic Cosmic Radiation (GCR) radiation environment. This analytical model was then coupled with NASA's radiation transport code, HZETRN, to account for the effects of passive/structural shielding mass. The resulting model can rapidly obtain results for a given configuration and can therefore be used to analyze an entire trade space of potential variables in less time than is required for even a single Monte Carlo run. Analyzing this trade space for a solenoid magnetic shield design indicates that active shield bending powers greater than 15 Tm and passive/structural shielding thicknesses greater than 40 g/cm2 have a limited impact on reducing dose equivalent values. Also, it is shown that higher magnetic field strengths are more effective than thicker magnetic fields at reducing dose equivalent.

  7. An analytical model of prominence dynamics

    NASA Astrophysics Data System (ADS)

    Routh, Swati; Saha, Snehanshu; Bhat, Atul; Sundar, M. N.

    2018-01-01

    Solar prominences are magnetic structures incarcerating cool, dense gas in an otherwise hot solar corona. Prominences can be categorized as quiescent or active. Their origin, and the presence of cool gas (~10^4 K) within the hot (~10^6 K) solar corona, remains poorly understood. The structure and dynamics of solar prominences have been investigated in a large number of observational and theoretical (both analytical and numerical) studies. In this paper, an analytic model of a quiescent solar prominence is developed and used to demonstrate that the prominence velocity increases exponentially, which means that some gas falls downward towards the solar surface, and that Alfvén waves are naturally present in solar prominences. These theoretical predictions are consistent with current observational data on solar quiescent prominences.

  8. An overview of city analytics

    PubMed Central

    Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter

    2017-01-01

    We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain datasets that are available to researchers. PMID:28386454

  9. Putting an Ethical Lens on Learning Analytics

    ERIC Educational Resources Information Center

    West, Deborah; Huijser, Henk; Heath, David

    2016-01-01

    As learning analytics activity has increased, a variety of ethical implications and considerations have emerged, though a significant research gap remains in explicitly investigating the views of key stakeholders, such as academic staff. This paper draws on ethics-related findings from an Australian study featuring two surveys, one of…

  10. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  12. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial gap in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  13. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  14. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted through this probabilistic modification. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
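
    The contrast between the two scoring schemes can be sketched as follows; the scores and frequencies are illustrative, and the RPN product is the standard FMEA definition rather than a formula quoted from the paper:

```python
# Traditional FMEA: categorical 1-10 scores for occurrence, detection and
# severity, multiplied into a Risk Priority Number (RPN). Illustrative values.
occurrence_score, detection_score, severity_score = 4, 6, 8
rpn = occurrence_score * detection_score * severity_score
print("RPN:", rpn)  # failure modes with high RPN are selected for correction

# Probabilistic modification (as described in the abstract): occurrence and
# detection are replaced by estimated relative frequencies, while severity
# keeps its categorical score. The product of the two frequencies estimates
# how often a failure occurs AND escapes detection.
p_occurrence = 0.02    # estimated frequency of the failure mode occurring
p_not_detected = 0.10  # estimated frequency of the failure escaping detection
freq_undetected_failure = p_occurrence * p_not_detected
print("frequency of undetected failure:", freq_undetected_failure)
```

    Summing such frequencies over a set of failure modes would give the procedure-level estimate the abstract mentions.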

  15. a Multidisciplinary Analytical Framework for Studying Active Mobility Patterns

    NASA Astrophysics Data System (ADS)

    Orellana, D.; Hermida, C.; Osorio, P.

    2016-06-01

    Intermediate cities are urged to change and adapt their mobility systems from a high energy-demanding motorized model to a sustainable low-motorized model. In order to accomplish such a model, city administrations need to better understand active mobility patterns and their links to socio-demographic and cultural aspects of the population. During the last decade, researchers have demonstrated the potential of geo-location technologies and mobile devices to gather massive amounts of data for mobility studies. However, the analysis and interpretation of this data has been carried out by specialized research groups with relatively narrow approaches from different disciplines. Consequently, broader questions remain less explored, mainly those relating the spatial behaviour of individuals and populations to their geographic environment and the motivations and perceptions shaping such behaviour. Understanding sustainable mobility and exploring new research paths require an interdisciplinary approach given the complex nature of mobility systems and their social, economic and environmental impacts. Here, we introduce the elements of a multidisciplinary analytical framework for studying active mobility patterns, comprising three components: a) Methodological, b) Behavioural, and c) Perceptual. We demonstrate the applicability of the framework by analysing mobility patterns of cyclists and pedestrians in an intermediate city, integrating a range of techniques, including: GPS tracking, spatial analysis, auto-ethnography, and perceptual mapping. The results demonstrate the existence of non-evident spatial behaviours and how perceptual features affect mobility. This knowledge is useful for developing policies and practices for sustainable mobility planning.

  16. Composable Analytic Systems for next-generation intelligence analysis

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases the flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  17. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of
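The reliability measure described above can be sketched in a few lines: run the scan at several maximum-cluster-size settings, record which locations fall inside a significant cluster in each run, and score each location by the fraction of runs that include it. The cluster memberships below are invented stand-ins for real SaTScan output, not data from the study.

```python
def reliability(cluster_membership):
    """cluster_membership: dict mapping scan parameter -> set of location ids
    flagged as belonging to a significant cluster in that run."""
    runs = list(cluster_membership.values())
    locations = set().union(*runs)
    return {loc: sum(loc in run for run in runs) / len(runs)
            for loc in locations}

# Hypothetical results from scans at 50%, 25%, and 10% maximum cluster size:
scans = {
    0.50: {"A", "B", "C", "D"},
    0.25: {"A", "B", "C"},
    0.10: {"A", "B"},
}
scores = reliability(scans)
# Locations "A" and "B" persist across all analysis scales (score 1.0),
# while "D" appears only in the largest-window scan.
```

A location with a score near 1.0 is a candidate member of a homogeneous cluster that is stable across analysis scales, which is the selection criterion the abstract proposes.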

  18. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed

  19. Development of Analytical Algorithm for the Performance Analysis of Power Train System of an Electric Vehicle

    NASA Astrophysics Data System (ADS)

    Kim, Chul-Ho; Lee, Kee-Man; Lee, Sang-Heon

    Power train system design is one of the key R&D areas in the development process of a new automobile, because an optimum engine size with an adaptable power transmission that meets the design requirements of the new vehicle can be obtained through the system design. For electric vehicle design in particular, a very reliable design algorithm for the power train system is required for energy efficiency. In this study, an analytical simulation algorithm is developed to estimate the driving performance of a designed power train system of an electric vehicle. The principal theory of the simulation algorithm is conservation of energy, combined with several analytical and experimental data such as rolling resistance, aerodynamic drag, and the mechanical efficiency of the power transmission. From the analytical calculation results, the running resistance of a designed vehicle is obtained as the operating conditions of the vehicle, such as road inclination and vehicle speed, change. The tractive performance of the model vehicle with a given power train system is also calculated at each gear ratio of the transmission. Through analysis of these two calculation results, running resistance and tractive performance, the driving performance of a designed electric vehicle is estimated, and it can be used to evaluate the adaptability of the designed power train system to the vehicle.
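The two quantities the abstract compares, running resistance and tractive performance, follow from standard road-load and drivetrain relations. A minimal sketch of that balance is below; every vehicle parameter (mass, drag coefficient, gear ratios, wheel radius, efficiency) is a hypothetical value for illustration, not taken from the paper.

```python
import math

def running_resistance(mass_kg, grade_rad, speed_ms,
                       c_rr=0.012, rho=1.225, cd=0.30, area_m2=2.2, g=9.81):
    """Road-load force [N]: rolling resistance + grade load + aerodynamic drag."""
    rolling = mass_kg * g * c_rr * math.cos(grade_rad)
    grade = mass_kg * g * math.sin(grade_rad)
    aero = 0.5 * rho * cd * area_m2 * speed_ms ** 2
    return rolling + grade + aero

def tractive_force(motor_torque_nm, gear_ratio, final_drive,
                   wheel_radius_m=0.3, efficiency=0.92):
    """Force at the wheel [N] for a given motor torque and gear ratio."""
    return motor_torque_nm * gear_ratio * final_drive * efficiency / wheel_radius_m

# A design point is adequate when tractive force exceeds running resistance:
resistance = running_resistance(mass_kg=1500, grade_rad=math.radians(5), speed_ms=20)
traction = tractive_force(motor_torque_nm=250, gear_ratio=3.0, final_drive=3.5)
surplus = traction - resistance  # > 0 means the vehicle can accelerate or climb
```

Repeating the comparison over a grid of speeds, grades, and gear ratios reproduces the kind of adaptability assessment the abstract describes.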

  20. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  1. Stability analysis of magnetized neutron stars - a semi-analytic approach

    NASA Astrophysics Data System (ADS)

    Herbrik, Marlene; Kokkotas, Kostas D.

    2017-04-01

    We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme on polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies whose spectrum of model systems we extend by lifting former simplifications.

  2. Integrated sudomotor axon reflex sweat stimulation for continuous sweat analyte analysis with individuals at rest.

    PubMed

    Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason

    2017-07-25

    Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants & sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those interested in on-skin sensing integrated with drug-delivery.

  3. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
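The intra- and inter-assay coefficients of variation quoted above are the ratio of replicate standard deviation to replicate mean. A minimal illustration follows; the replicate TPMT activity values are invented for the example, not data from the review.

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation of a set of replicate measurements, in %."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100

# Hypothetical replicate activities for one sample measured in one run:
intra_assay = [12.1, 12.4, 11.9, 12.2, 12.0]
cv = cv_percent(intra_assay)  # well under the 10% threshold cited above
```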

  4. Design and analysis of an intelligent controller for active geometry suspension systems

    NASA Astrophysics Data System (ADS)

    Goodarzi, Avesta; Oloomi, Ehsan; Esmailzadeh, Ebrahim

    2011-02-01

    An active geometry suspension (AGS) system is a device that optimises suspension-related factors such as toe angle and roll centre height by controlling the vehicle's suspension geometry. The suspension geometry can be changed by controlling the positions of the suspension mounting points. In this paper, the analysis and control of an AGS system are addressed. First, the effects of suspension geometry changes on roll centre height and toe angle are studied. Then, based on an analytical approach, the improvement in the vehicle's stability and handling due to the control of suspension geometry is investigated. Next, an eight-degree-of-freedom handling model of a sport utility vehicle equipped with an AGS system is introduced. Finally, a self-tuning proportional-integral controller has been designed, using fuzzy control theory, to control the actuator that changes the geometry of the suspension system. The simulation results show that an AGS system can improve the handling and stability of the vehicle.
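A self-tuning PI loop of the kind named above can be sketched minimally. The fuzzy tuner is reduced here to a crude rule that scales the proportional gain with error magnitude; the gains, the first-order plant, and the setpoint are all invented for illustration and are not the paper's controller or vehicle model.

```python
class SelfTuningPI:
    """PI controller whose proportional gain is adapted online."""

    def __init__(self, kp=1.0, ki=0.5, dt=0.01):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def tuned_kp(self, error):
        # Stand-in for the fuzzy rule base: larger errors raise Kp.
        return self.kp * (1.0 + min(abs(error), 1.0))

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        return self.tuned_kp(error) * error + self.ki * self.integral

# Drive a simple first-order actuator model toward a toe-angle setpoint:
pi = SelfTuningPI()
angle = 0.0
for _ in range(5000):  # 50 s of simulated time at dt = 0.01 s
    u = pi.update(setpoint=1.0, measurement=angle)
    angle += (u - angle) * pi.dt  # first-order plant response
```

The real design replaces the gain rule with a fuzzy inference system and the one-line plant with the eight-degree-of-freedom handling model.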

  5. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates, or as ingredients in foods such as beverages and infant milk products, demands tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, due to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach which takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.
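A one-phase decay model of the kind described above has the general form R(t) = plateau + (R0 - plateau)·exp(-k·t), and once fitted it lets a measured peak area be rescaled to the electrode's initial sensitivity. The sketch below uses hypothetical rate and plateau values, not the paper's analyte-specific fits.

```python
import math

def relative_response(t_hours, k=0.01, plateau_frac=0.6):
    """One-phase decay: fraction of the initial PAD response remaining
    after t hours of electrode use (hypothetical parameters)."""
    return plateau_frac + (1.0 - plateau_frac) * math.exp(-k * t_hours)

def drift_corrected(peak_area, t_hours, **model_params):
    """Rescale a measured peak area back to the electrode's initial sensitivity."""
    return peak_area / relative_response(t_hours, **model_params)

# A peak measured after 100 h of electrode recession is scaled back up:
corrected = drift_corrected(peak_area=850.0, t_hours=100.0)
```

In the paper's scheme the same correction is applied analyte by analyte, with an internal standard anchoring the fit.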

  6. Analytical and numerical analysis of charge carriers extracted by linearly increasing voltage in a metal-insulator-semiconductor structure relevant to bulk heterojunction organic solar cells

    NASA Astrophysics Data System (ADS)

    Yumnam, Nivedita; Hirwa, Hippolyte; Wagner, Veit

    2017-12-01

    Analysis of charge extraction by linearly increasing voltage is conducted on metal-insulator-semiconductor capacitors in a structure relevant to organic solar cells. For this analysis, an analytical model is developed and used to determine the conductivity of the active layer. Numerical simulations of the transient current were performed to confirm the applicability of our analytical model and of other analytical models existing in the literature. Our analysis is applied to poly(3-hexylthiophene) (P3HT):phenyl-C61-butyric acid methyl ester (PCBM), which allows the electron and hole mobilities to be determined independently. A combination of experimental data analysis and numerical simulations reveals the effect of trap states on the transient current and shows where this contribution is crucial for data analysis.

  7. Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.

    PubMed

    Chen, Tiffany J; Kotecha, Nikesh

    2014-01-01

    Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.

  8. An Experimental Introduction to Interlaboratory Exercises in Analytical Chemistry

    ERIC Educational Resources Information Center

    Puignou, L.; Llaurado, M.

    2005-01-01

    An experimental exercise on analytical proficiency studies in collaborative trials is proposed. This practical provides students in advanced undergraduate courses in chemistry, pharmacy, and biochemistry, with the opportunity to improve their quality assurance skills. It involves an environmental analysis, determining the concentration of a…

  9. ^10B analysis using Charged Particle Activation Analysis

    NASA Astrophysics Data System (ADS)

    Guo, B. N.; Jin, J. Y.; Duggan, J. D.; McDaniel, F. D.

    1997-10-01

    Charged particle activation analysis (CPAA) is an analytical technique used to determine trace quantities of an element, usually on the surface of a substrate. The beam from the accelerator induces the required nuclear reaction, which leaves residual activity with a measurable half-life. Gamma rays from the residual activity are measured to determine the trace quantities of the elements being studied. We have used this technique to study re-entry cloth coatings for space and aircraft vehicles. The cloths, made of 20 μm SiC fibers, are coated with boron nitride. CPAA was used to determine the relative thicknesses of the boron coatings; in particular, the ^10B(p,γ)^11C reaction was used. A fast coincidence setup was used to measure the 0.511 MeV annihilation radiation from the 20.38 minute ^11C activity. Rutherford Backscattering (RBS) results will be presented as a comparison. Details of the process and the experiment will be discussed.
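Because the 20.38-minute ^11C activity decays appreciably between bombardment and counting, the measured annihilation rate must be decay-corrected back to end of bombardment. The arithmetic is the standard exponential law; the count rate and delay below are invented for illustration.

```python
import math

HALF_LIFE_MIN = 20.38                      # 11C half-life, minutes
LAMBDA = math.log(2) / HALF_LIFE_MIN       # decay constant, 1/min

def activity_at_eob(measured_rate, delay_min):
    """Back-correct a measured count rate to the activity at the
    end of bombardment: A0 = A * exp(lambda * t)."""
    return measured_rate * math.exp(LAMBDA * delay_min)

# After exactly one half-life the correction factor is 2:
corrected = activity_at_eob(measured_rate=1000.0, delay_min=HALF_LIFE_MIN)
```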

  10. Comparative analysis of methods for real-time analytical control of chemotherapies preparations.

    PubMed

    Bazin, Christophe; Cassard, Bruno; Caudron, Eric; Prognon, Patrice; Havard, Laurent

    2015-10-15

    Control of chemotherapy preparations is now an obligation in France, though analytical control is not compulsory. Several methods are available, and none of them can be presumed ideal. We wanted to compare them in order to determine which could be the best choice. We compared non-analytical (visual and video-assisted, gravimetric) and analytical (HPLC/FIA, UV/FT-IR, UV/Raman, Raman) methods on the basis of our experience and a SWOT analysis. The results of the analysis show great differences between the techniques but, as expected, none of them is without defects. However, they can probably be used in synergy. Overall, for the pharmacist willing to get involved, the implementation of control for chemotherapy preparations must be planned well in advance, with every parameter listed, and remains, in our view, an analyst's job. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  12. Overcoming Intuition: Metacognitive Difficulty Activates Analytic Reasoning

    ERIC Educational Resources Information Center

    Alter, Adam L.; Oppenheimer, Daniel M.; Epley, Nicholas; Eyre, Rebecca N.

    2007-01-01

    Humans appear to reason using two processing styles: System 1 processes that are quick, intuitive, and effortless, and System 2 processes that are slow, analytical, and deliberate, and that occasionally correct the output of System 1. Four experiments suggest that System 2 processes are activated by metacognitive experiences of difficulty or disfluency…

  13. Advanced Video Activity Analytics (AVAA): Human Performance Model Report

    DTIC Science & Technology

    2017-12-01

    …Advanced Video Activity Analytics (AVAA) system. AVAA was designed to help US Army Intelligence Analysts exploit full-motion video more efficiently and…

  14. Defining Behavior-Environment Interactions: Translating and Developing An Experimental and Applied Behavior-Analytic Vocabulary in and to the National Language

    ERIC Educational Resources Information Center

    Tuomisto, Marti T.; Parkkinen, Lauri

    2012-01-01

    Verbal behavior, as in the use of terms, is an important part of scientific activity in general and behavior analysis in particular. Many glossaries and dictionaries of behavior analysis have been published in English, but few in any other language. Here we review the area of behavior analytic terminology, its translations, and development in…

  15. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
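Of the six methodology categories named above, the waste generation rate method is the simplest to express: waste is estimated as activity size times an empirical per-unit generation rate. The rates and project areas below are hypothetical values for illustration only.

```python
# Hypothetical per-unit waste generation rates (kg per m2 of floor area):
WASTE_RATE_KG_PER_M2 = {"construction": 50.0, "demolition": 1000.0}

def estimate_waste_kg(activity, floor_area_m2):
    """Waste generation rate method: waste = rate * activity size."""
    return WASTE_RATE_KG_PER_M2[activity] * floor_area_m2

# A hypothetical regional estimate combining both activity types:
total = (estimate_waste_kg("construction", 2000)
         + estimate_waste_kg("demolition", 500))
```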

  16. Visual Analytics for Power Grid Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Huang, Zhenyu; Chen, Yousu

    2014-01-20

    Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and the society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.

  17. Pain anticipation: an activation likelihood estimation meta-analysis of brain imaging studies.

    PubMed

    Palermo, Sara; Benedetti, Fabrizio; Costa, Tommaso; Amanzio, Martina

    2015-05-01

    The anticipation of pain has been investigated in a variety of brain imaging studies. Importantly, there is today no clear overall picture of the areas that are involved across studies, and the exact role of these regions in pain expectation remains largely unexplored. To address this issue, we used activation likelihood estimation meta-analysis to analyze pain anticipation across several neuroimaging studies. A total of 19 functional magnetic resonance imaging studies were included in the analysis to search for the cortical areas involved in pain anticipation in human experimental models. During anticipation, activated foci were found in the dorsolateral prefrontal, midcingulate and anterior insula cortices, medial and inferior frontal gyri, inferior parietal lobule, middle and superior temporal gyrus, thalamus, and caudate. Deactivated foci were found in the anterior cingulate, superior frontal gyrus, parahippocampal gyrus and the claustrum. The results of the meta-analytic connectivity analysis provide an overall view of the brain responses triggered by the anticipation of a noxious stimulus. Such a highly distributed perceptual set of self-regulation may prime brain regions to process information where emotion, action and perception, as well as their related subcategories, play a central role. Not only do these findings provide important information on the neural events when anticipating pain, but they may also give a perspective into nocebo responses, whereby negative expectations may lead to pain worsening. © 2014 Wiley Periodicals, Inc.
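The core of activation likelihood estimation can be sketched in a simplified one-dimensional form: each reported focus is blurred with a Gaussian to build a "modeled activation" map per study, and the ALE score at a voxel is the union 1 - prod(1 - MA_i) across studies. Real ALE operates on 3D brain volumes with sample-size-dependent kernels; the coordinates and kernel width below are invented.

```python
import math

def modeled_activation(grid, foci, sigma=2.0):
    """Per-study modeled activation map: Gaussian blur of reported foci,
    capped at 1 so it reads as a probability."""
    return [min(1.0, sum(math.exp(-((x - f) ** 2) / (2 * sigma ** 2))
                         for f in foci)) for x in grid]

def ale(grid, studies, sigma=2.0):
    """ALE score per voxel: probability that at least one study activates it."""
    maps = [modeled_activation(grid, foci, sigma) for foci in studies]
    return [1.0 - math.prod(1.0 - ma[i] for ma in maps)
            for i in range(len(grid))]

grid = list(range(40))
studies = [[10, 25], [11], [9, 26]]  # foci reported by three hypothetical studies
scores = ale(grid, studies)
# Voxels near x = 10, where all three studies report foci, score highest.
```

Significance testing in the real method then compares these scores against a null distribution of randomly relocated foci.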

  18. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
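The point above can be made analytically for the simplest case, a linear model y = a1·x1 + a2·x2: with correlated inputs the response variance picks up a covariance term, Var(y) = a1²σ1² + a2²σ2² + 2·a1·a2·Cov(x1, x2), that the independence assumption drops. The coefficients, variances, and correlation below are illustrative values, not the paper's examples.

```python
def linear_model_variance(a1, a2, var1, var2, corr=0.0):
    """Exact response variance of y = a1*x1 + a2*x2 with correlated inputs."""
    cov = corr * (var1 ** 0.5) * (var2 ** 0.5)
    return a1 ** 2 * var1 + a2 ** 2 * var2 + 2 * a1 * a2 * cov

independent = linear_model_variance(2.0, 3.0, 1.0, 1.0, corr=0.0)  # 13.0
correlated = linear_model_variance(2.0, 3.0, 1.0, 1.0, corr=0.5)   # 19.0
# Ignoring a +0.5 correlation here understates the output variance by ~32%.
```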

  19. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
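The advantage claimed above can be illustrated with a toy function standing in for an engine-cycle response (this is not Pycycle code): a finite-difference gradient carries a step-size-dependent truncation and round-off error that an analytic derivative avoids entirely.

```python
def f(x):
    """Toy response function standing in for a cycle-model output."""
    return x ** 3

def analytic_grad(x):
    """Exact derivative, analogous to the analytic derivatives Pycycle provides."""
    return 3 * x ** 2

def finite_difference_grad(x, h=1e-6):
    """Forward-difference approximation, analogous to the method being replaced."""
    return (f(x + h) - f(x)) / h

x = 2.0
exact = analytic_grad(x)            # 12.0, independent of any step size
approx = finite_difference_grad(x)  # close to 12, but step-size dependent
fd_error = abs(approx - exact)
```

In a gradient-based optimizer this small but nonzero error, and its sensitivity to the step size h, is what degrades the stability that the abstract attributes to analytic derivatives.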

  20. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  1. Analytical structure, dynamics, and coarse graining of a kinetic model of an active fluid

    NASA Astrophysics Data System (ADS)

    Gao, Tong; Betterton, Meredith D.; Jhang, An-Sheng; Shelley, Michael J.

    2017-09-01

    We analyze one of the simplest active suspensions with complex dynamics: a suspension of immotile "extensor" particles that exert active extensile dipolar stresses on the fluid in which they are immersed. This is relevant to several experimental systems, such as recently studied tripartite rods that create extensile flows by consuming a chemical fuel. We first describe the system through a Doi-Onsager kinetic theory based on microscopic modeling. This theory captures the active stresses produced by the particles that can drive hydrodynamic instabilities, as well as the steric interactions of rodlike particles that lead to nematic alignment. This active nematic system yields complex flows and disclination defect dynamics very similar to phenomenological Landau-de Gennes Q-tensor theories for active nematic fluids, as well as to more complex Doi-Onsager theories for polar microtubule-motor-protein systems. We apply the quasiequilibrium Bingham closure, used to study suspensions of passive microscopic rods, to develop a nonstandard Q-tensor theory. We demonstrate through simulation that this BQ-tensor theory gives an excellent analytical and statistical accounting of the suspension's complex dynamics, at a far reduced computational cost. Finally, we apply the BQ-tensor model to study the dynamics of extensor suspensions in circular and biconcave domains. In circular domains, we reproduce previous results for systems with weak nematic alignment, but for strong alignment we find unusual dynamics with activity-controlled defect production and absorption at the boundaries of the domain. In biconcave domains, a Freedericksz-like transition occurs as the width of the neck connecting the two disks is varied.

  2. An Overview of Learning Analytics

    ERIC Educational Resources Information Center

    Clow, Doug

    2013-01-01

    Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical…

  3. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors

    PubMed Central

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-01-01

    Objectives Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. Methods This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. Results After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P <0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Conclusion Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors. PMID:29062553
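    The headline result (0.29% to 0.07% non-conforming samples despite 127.84% growth in specimen volume) can be sanity-checked with a standard two-proportion z-test. The counts below are hypothetical, invented only to be consistent with the reported rates; the abstract gives percentages, not raw counts:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test for a change in an error rate."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts matching 0.29% of 50,000 specimens (2007) and
# 0.07% of 113,920 specimens (2015, a 127.84% larger volume):
z = two_proportion_z(145, 50_000, 80, 113_920)
```

    Any z beyond roughly 1.96 corresponds to P < 0.05, consistent with the significance the authors report.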

  4. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.

  5. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    This paper presents a comparison of analysis and flight test data for a drone aircraft equipped with an active flutter suppression system. Emphasis is placed on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are presented for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. In addition to presenting the mathematical models and a brief description of existing analytical techniques, an alternative analytical technique for obtaining closed-loop results is presented.

  6. An analytical method for free vibration analysis of functionally graded beams with edge cracks

    NASA Astrophysics Data System (ADS)

    Wei, Dong; Liu, Yinghua; Xiang, Zhihai

    2012-03-01

In this paper, an analytical method is proposed for solving the free vibration of cracked functionally graded material (FGM) beams with axial loading, rotary inertia and shear deformation. The governing differential equations of motion for an FGM beam are established and the corresponding solutions are found first. The discontinuity of rotation caused by the cracks is simulated by means of the rotational spring model. Based on the transfer matrix method, a recurrence formula is then developed to obtain the eigenvalue equations of free vibration of FGM beams. The main advantage of the proposed method is that the eigenvalue equation for vibrating beams with an arbitrary number of cracks can be conveniently determined from a third-order determinant. Due to the decrease in the determinant order as compared with previous methods, the developed method is simpler and more convenient for analytically solving the free vibration problem of cracked FGM beams. Moreover, free vibration analyses of the Euler-Bernoulli and Timoshenko beams with any number of cracks can be conducted using the unified procedure based on the developed method. These advantages of the proposed procedure become more pronounced as the number of cracks increases. A comprehensive analysis is conducted to investigate the influences of the location and total number of cracks, material properties, axial load, inertia and end supports on the natural frequencies and vibration mode shapes of FGM beams. The present work may be useful for the design and control of damaged structures.

  7. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  8. Maternal and infant activity: Analytic approaches for the study of circadian rhythm.

    PubMed

    Thomas, Karen A; Burr, Robert L; Spieker, Susan

    2015-11-01

    The study of infant and mother circadian rhythm entails choice of instruments appropriate for use in the home environment as well as selection of analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of 24-h period (cosinor magnitude and R(2), NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and additionally the effect size (eta(2)) for change over time evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and

  9. Maternal and infant activity: Analytic approaches for the study of circadian rhythm

    PubMed Central

    Thomas, Karen A.; Burr, Robert L.; Spieker, Susan

    2015-01-01

    The study of infant and mother circadian rhythm entails choice of instruments appropriate for use in the home environment as well as selection of analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72 h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of 24-h period (cosinor magnitude and R2, NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and additionally the effect size (eta2) for change over time evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and
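    The cosinor parameters compared above (mesor, amplitude, acrophase) come from fitting a 24-h cosine to the activity series; because y = M + A·cos(ω(t − φ)) is linear in cos(ωt) and sin(ωt), ordinary least squares suffices. A minimal sketch on synthetic actigraphy-like data (all values hypothetical, not from the study):

```python
import math

def cosinor_fit(times_h, values, period_h=24.0):
    """Single-component cosinor via ordinary least squares.

    Fits y = mesor + beta*cos(w t) + gamma*sin(w t) and returns
    (mesor, amplitude, acrophase_h), acrophase in hours after midnight.
    """
    w = 2 * math.pi / period_h
    # Design matrix of the 3-parameter linear model.
    X = [[1.0, math.cos(w * t), math.sin(w * t)] for t in times_h]
    # Normal equations (X^T X) b = X^T y, solved by Gauss-Jordan.
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, values)) for i in range(3)]
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    mesor, beta, gamma = (A[i][3] / A[i][i] for i in range(3))
    amplitude = math.hypot(beta, gamma)
    acrophase = (math.atan2(gamma, beta) / w) % period_h
    return mesor, amplitude, acrophase

# Synthetic activity counts: 72 h at 30-min epochs, peak at 15:00,
# mesor 50, amplitude 20 (hypothetical numbers):
ts = [i * 0.5 for i in range(144)]
ys = [50 + 20 * math.cos(2 * math.pi * (t - 15) / 24) for t in ts]
m, a, phi = cosinor_fit(ts, ys)
```

    On noiseless data the fit recovers the generating parameters exactly; real actigraphy series add noise, and the NPCRA and ACF measures in the study capture non-sinusoidal structure that a single cosine misses.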

  10. An Analytical Hierarchy Process Model for the Evaluation of College Experimental Teaching Quality

    ERIC Educational Resources Information Center

    Yin, Qingli

    2013-01-01

Taking into account the characteristics of college experimental teaching, through investigation and analysis, evaluation indices and an Analytical Hierarchy Process (AHP) model of experimental teaching quality have been established following the analytical hierarchy process method, and the evaluation indices have been given reasonable weights. An…
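    An AHP model of this kind reduces pairwise comparison judgments to criterion weights and checks their consistency. A minimal sketch using the geometric-mean approximation to the principal eigenvector; the comparison matrix and the criteria it compares are hypothetical, not taken from the article:

```python
import math

def ahp_weights(M):
    """AHP priority weights via the geometric-mean (log least squares)
    approximation, plus Saaty's consistency ratio."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    w = [g / total for g in gm]
    # lambda_max estimated from (M w) / w, averaged over rows.
    Mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Mw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random indices
    return w, ci / ri                      # weights, consistency ratio

# Hypothetical 3-criterion comparison (e.g. preparation vs. operation
# vs. reporting quality), on Saaty's 1-9 scale:
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights(M)
```

    A consistency ratio below 0.1 is the conventional acceptance threshold; here the judgments pass, and the first criterion receives the largest weight.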

  11. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  12. Integrating bio-inorganic and analytical chemistry into an undergraduate biochemistry laboratory.

    PubMed

    Erasmus, Daniel J; Brewer, Sharon E; Cinel, Bruno

    2015-01-01

    Undergraduate laboratories expose students to a wide variety of topics and techniques in a limited amount of time. This can be a challenge and lead to less exposure to concepts and activities in bio-inorganic chemistry and analytical chemistry that are closely-related to biochemistry. To address this, we incorporated a new iron determination by atomic absorption spectroscopy exercise as part of a five-week long laboratory-based project on the purification of myoglobin from beef. Students were required to prepare samples for chemical analysis, operate an atomic absorption spectrophotometer, critically evaluate their iron data, and integrate these data into a study of myoglobin. © 2015 The International Union of Biochemistry and Molecular Biology.
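    The iron determination exercise hinges on a linear Beer-Lambert calibration: absorbances of standards define a line, and the unknown is read back from it. A sketch with hypothetical Fe standards and readings (not the course's actual data):

```python
def linear_calibration(conc, absorbance):
    """Least-squares calibration line A = m*c + b, as used to read an
    unknown iron concentration off an AAS standard curve."""
    n = len(conc)
    sx, sy = sum(conc), sum(absorbance)
    sxx = sum(c * c for c in conc)
    sxy = sum(c * a for c, a in zip(conc, absorbance))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    b = (sy - m * sx) / n                           # intercept
    return m, b

# Hypothetical Fe standards (mg/L) and their absorbances:
standards = [0.0, 1.0, 2.0, 4.0]
readings = [0.002, 0.105, 0.208, 0.414]
m, b = linear_calibration(standards, readings)
# Back-calculate a sample that read A = 0.260:
unknown = (0.260 - b) / m
```

    Critically evaluating the data, as the exercise asks, means checking linearity of the standards and whether the unknown falls inside the calibrated range before trusting the back-calculated value.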

  13. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

Smelling is one of the five senses and plays an important role in our everyday lives. Volatile compounds are, for example, characteristics of food where some of them can be perceivable by humans because of their aroma. They have a great influence on the decision making of consumers when they choose to use a product or not. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product leading to developing new products, which are more acceptable by consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Such data gathered based on different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also of the use of different analytical instruments in this field, highlighting the research needed for future focus.

  14. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
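    The benefit Pycycle exploits, analytic derivatives instead of finite differencing, can be illustrated on a toy objective: the analytic gradient is exact and costs one evaluation, while a central difference needs two evaluations per variable and carries truncation and round-off error. Everything below is a hypothetical stand-in, not Pycycle or OpenMDAO code:

```python
def f(x):
    """Toy smooth objective (stand-in for a cycle metric), minimum at x = 3."""
    return (x - 3.0) ** 2 + 1.0

def dfdx(x):
    """Analytic derivative of f."""
    return 2.0 * (x - 3.0)

def fd(x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Gradient descent driven by the analytic derivative:
x = 0.0
for _ in range(100):
    x -= 0.1 * dfdx(x)
```

    In a real cycle model the finite-difference step size must also be tuned per variable, which is exactly the fragility that analytic derivatives remove.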

  15. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis and high-resolution mass spectroscopy.

  16. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  17. Instrumental neutron activation analysis for studying size-fractionated aerosols

    NASA Astrophysics Data System (ADS)

    Salma, Imre; Zemplén-Papp, Éva

    1999-10-01

Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in an amount of about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute (k0) standardization method. The calibration procedures, application protocol and the data evaluation process are described and discussed. They now make it possible to analyse a considerable number of samples while assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be better than the uncertainty represented by the sampling techniques and sample variability.
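    Detection limits like those reported for INAA are conventionally derived from the background counts under each γ-peak using Currie's formulation. A minimal sketch; the 400-count background is a hypothetical example value:

```python
import math

def currie_limits(background_counts):
    """Currie decision threshold and a priori detection limit (in counts)
    for a gamma-ray peak over a known background, at ~95% confidence."""
    b = background_counts
    l_c = 2.33 * math.sqrt(b)          # decision threshold L_C
    l_d = 2.71 + 4.65 * math.sqrt(b)   # detection limit L_D
    return l_c, l_d

l_c, l_d = currie_limits(400.0)
```

    Converting L_D from counts to an atmospheric concentration then folds in irradiation, decay and counting factors plus the sampled air volume, which is why the reported limits vary by element.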

  18. Microemulsification: an approach for analytical determinations.

    PubMed

    Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L

    2014-09-16

We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance taking into account parameters such as precision, linearity, robustness, and accuracy. This approach relies on the effect of the analyte content over the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). Such phenomenon was expressed by the minimum volume fraction of amphiphile required to form microemulsion (Φ(ME)), which was the analytical signal of the method. Thus, the measurements can be taken by visually monitoring the transition of the dispersions from cloudy to transparent during the microemulsification, like a titration. It bypasses the need for electric energy. The performed studies were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameter values for the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The procedures of microemulsification were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation that is based on the visual analyses of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of the ME, showing the flexibility of the developed method. The linear range was fairly broad with limits of linearity up to 70.00% water in ethanol. For monoethylene glycol in

  19. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  20. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
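    A simple instance of the noise-based threshold idea: characterize background reads from negative-control runs, then place the analytical threshold a fixed number of standard deviations above their mean. This is a hedged sketch of the general construction only, not the paper's specific distribution-based procedure, and the noise counts are hypothetical:

```python
import statistics

def analytical_threshold(noise_reads, k=3.0):
    """Analytical threshold as mean background noise plus k standard
    deviations; calls below this read count are treated as noise."""
    mu = statistics.mean(noise_reads)
    sd = statistics.stdev(noise_reads)
    return mu + k * sd

# Hypothetical per-locus noise read counts from negative-control runs:
noise = [12, 8, 15, 10, 9, 14, 11, 13, 10, 12]
t = analytical_threshold(noise)
```

    The choice of k trades false detections against drop-out of genuine low-level alleles, which is why the paper argues the threshold must be objective and defensible rather than ad hoc.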

  1. Triangular dislocation: an analytical, artefact-free solution

    NASA Astrophysics Data System (ADS)

    Nikkhoo, Mehdi; Walter, Thomas R.

    2015-05-01

Displacements and stress-field changes associated with earthquakes, volcanoes, landslides and human activity are often simulated using numerical models in an attempt to understand the underlying processes and their governing physics. The application of elastic dislocation theory to these problems, however, may be biased because of numerical instabilities in the calculations. Here, we present a new method that is free of artefact singularities and numerical instabilities in analytical solutions for triangular dislocations (TDs) in both full-space and half-space. We apply the method to both the displacement and the stress fields. The entire 3-D Euclidean space ℝ³ is divided into two complementary subspaces, in the sense that in each one, a particular analytical formulation fulfils the requirements for the ideal, artefact-free solution for a TD. The primary advantage of the presented method is that the development of our solutions involves neither numerical approximations nor series expansion methods. As a result, the final outputs are independent of the scale of the input parameters, including the size and position of the dislocation as well as its corresponding slip vector components. Our solutions are therefore well suited for application at various scales in geoscience, physics and engineering. We validate the solutions through comparison to other well-known analytical methods and provide the MATLAB codes.

  2. Defining behavior-environment interactions: translating and developing an experimental and applied behavior-analytic vocabulary in and to the national language.

    PubMed

    Tuomisto, Martti T; Parkkinen, Lauri

    2012-05-01

    Verbal behavior, as in the use of terms, is an important part of scientific activity in general and behavior analysis in particular. Many glossaries and dictionaries of behavior analysis have been published in English, but few in any other language. Here we review the area of behavior analytic terminology, its translations, and development in languages other than English. As an example, we use our own mother tongue, Finnish, which provides a suitable example of the process of translation and development of behavior analytic terminology, because it differs from Indo-European languages and entails specific advantages and challenges in the translation process. We have published three editions of a general dictionary of behavior analysis including 801 terms relevant to the experimental analysis of behavior and applied behavior analysis and one edition of a dictionary of applied and clinical behavior analysis containing 280 terms. Because this work has been important to us, we hope this review will encourage similar work by behavior analysts in other countries whose native language is not English. Behavior analysis as an advanced science deserves widespread international dissemination and proper translations are essential to that goal.

  3. Defining Behavior–Environment Interactions: Translating and Developing an Experimental and Applied Behavior-Analytic Vocabulary in and to the National Language

    PubMed Central

    Tuomisto, Martti T; Parkkinen, Lauri

    2012-01-01

    Verbal behavior, as in the use of terms, is an important part of scientific activity in general and behavior analysis in particular. Many glossaries and dictionaries of behavior analysis have been published in English, but few in any other language. Here we review the area of behavior analytic terminology, its translations, and development in languages other than English. As an example, we use our own mother tongue, Finnish, which provides a suitable example of the process of translation and development of behavior analytic terminology, because it differs from Indo-European languages and entails specific advantages and challenges in the translation process. We have published three editions of a general dictionary of behavior analysis including 801 terms relevant to the experimental analysis of behavior and applied behavior analysis and one edition of a dictionary of applied and clinical behavior analysis containing 280 terms. Because this work has been important to us, we hope this review will encourage similar work by behavior analysts in other countries whose native language is not English. Behavior analysis as an advanced science deserves widespread international dissemination and proper translations are essential to that goal. PMID:22693363

  4. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists, the decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires that were sent to experts, giving three different scenarios for MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir-bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
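    PROMETHEE builds pairwise preference degrees per criterion, aggregates them with the criterion weights, and ranks alternatives by net outranking flow. A minimal PROMETHEE II sketch with the "usual" (0/1) preference function; the three procedures, their scores and the weights are hypothetical stand-ins, not the study's 25 alternatives and 9 criteria:

```python
def promethee_ii(alternatives, weights, maximize):
    """Minimal PROMETHEE II with the 'usual' preference function:
    P(a,b) = 1 on a criterion if a beats b, else 0. Returns net flows."""
    n, m = len(alternatives), len(weights)

    def pref(a, b):
        s = 0.0
        for j in range(m):
            diff = a[j] - b[j] if maximize[j] else b[j] - a[j]
            if diff > 0:
                s += weights[j]
        return s

    flows = []
    for i, a in enumerate(alternatives):
        plus = sum(pref(a, b) for k, b in enumerate(alternatives) if k != i)
        minus = sum(pref(b, a) for k, b in enumerate(alternatives) if k != i)
        flows.append((plus - minus) / (n - 1))   # net outranking flow
    return flows

# Hypothetical procedures scored on (recovery %, solvent use mL, cost):
alts = [(95, 50, 3),   # LLE-like: good recovery, much solvent
        (90, 5, 2),    # SPME-like
        (85, 1, 1)]    # LPME-like: least solvent, cheapest
w = [0.4, 0.4, 0.2]
flows = promethee_ii(alts, w, maximize=[True, False, False])
```

    With these (hypothetical) weights the low-solvent microextraction-like alternative tops the ranking and the liquid-liquid extraction-like one lands at the bottom, mirroring the study's qualitative outcome.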

  5. An analytic approach for the study of pulsar spindown

    NASA Astrophysics Data System (ADS)

    Chishtie, F. A.; Zhang, Xiyang; Valluri, S. R.

    2018-07-01

    In this work we develop an analytic approach to study pulsar spindown. We use the monopolar spindown model by Alvarez and Carramiñana (2004 Astron. Astrophys. 414 651–8), which assumes an inverse linear law of magnetic field decay of the pulsar, to extract an all-order formula for the spindown parameters using the Taylor series representation of Jaranowski et al (1998 Phys. Rev. D 58 6300). We further extend the analytic model to incorporate the quadrupole term that accounts for the emission of gravitational radiation, and obtain expressions for the period P and frequency f in terms of transcendental equations. We derive the analytic solution for pulsar frequency spindown in the absence of glitches. We examine the different cases that arise in the analysis of the roots in the solution of the non-linear differential equation for pulsar period evolution. We provide expressions for the spin-down parameters and find that the spindown values are in reasonable agreement with observations. A detection of gravitational waves from pulsars will be the next landmark in the field of multi-messenger gravitational wave astronomy.

  6. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82

  7. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  8. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  9. Promoting Active Learning by Practicing the "Self-Assembly" of Model Analytical Instruments

    ERIC Educational Resources Information Center

    Algar, W. Russ; Krull, Ulrich J.

    2010-01-01

    In our upper-year instrumental analytical chemistry course, we have developed "cut-and-paste" exercises where students "build" models of analytical instruments from individual schematic images of components. These exercises encourage active learning by students. Instead of trying to memorize diagrams, students are required to think deeply about…

  10. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
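    The conventional cross-product procedure that this paper argues is too narrow can be sketched with a short synthetic example. This is a generic illustration of the interaction-term approach, not the authors' proposed framework; all coefficients and the data-generating model below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
x = rng.normal(size=n)               # predictor
m = rng.binomial(1, 0.5, size=n)     # binary moderator (e.g. group membership)

# Hypothetical generative model: the moderator shifts the slope of x by 0.8.
y = 2.0 + 1.0 * x + 0.5 * m + 0.8 * (x * m) + rng.normal(scale=0.5, size=n)

# Conventional moderation analysis: add the cross-variable product x*m
# to the regression and examine its coefficient.
design = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
interaction = beta[3]   # estimates the moderation effect (true value 0.8)
```

The fitted `interaction` coefficient recovers the simulated slope shift; the paper's point is that testing this single product term captures only one special case of the broader conceptual definition of moderation.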

  11. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on analyzing the relationships between familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  12. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  13. Long-term Results of an Analytical Assessment of Student Compounded Preparations.

    PubMed

    Roark, Angie M; Anksorus, Heidi N; Shrewsbury, Robert P

    2014-11-15

    To investigate the long-term (ie, 6-year) impact of a required remake vs an optional remake on student performance in a compounding laboratory course in which students' compounded preparations were analyzed. The analysis data for several preparations made by students were compared for differences in the analyzed content of the active pharmaceutical ingredient (API) and the number of students who successfully compounded the preparation on the first attempt. There was a consistent statistical difference in the API amount or concentration in 4 of the preparations (diphenhydramine, ketoprofen, metoprolol, and progesterone) in each optional remake year compared to the required remake year. As the analysis requirement was continued, the outcome for each preparation approached and/or attained the expected API result. Two preparations required more than 1 year to demonstrate a statistical difference. The analytical assessment resulted in a consistent, long-term improvement in student performance during the 5-year period after the optional remake policy was instituted. Our assumption is that investment in such an assessment would result in similar benefits at other colleges and schools of pharmacy.

  14. Microplasmas for chemical analysis: analytical tools or research toys?

    NASA Astrophysics Data System (ADS)

    Karanassios, Vassili

    2004-07-01

    An overview of the activities of the research groups that have been involved in the fabrication, development, and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between "liquid" electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state of the art of analytical microplasma research is provided.

  15. The design, analysis, and testing of a low-budget wind-tunnel flutter model with active aerodynamic controls

    NASA Technical Reports Server (NTRS)

    Bolding, R. M.; Stearman, R. O.

    1976-01-01

    A low-budget flutter model incorporating active aerodynamic controls for flutter suppression studies was designed as both an educational and research tool to study the interfering lifting-surface flutter phenomenon in the form of a swept wing-tail configuration. A flutter suppression mechanism was demonstrated on a simple semirigid three-degree-of-freedom flutter model of this configuration employing an active stabilator control, and was then verified analytically using a doublet lattice lifting surface code and the model's measured mass, mode shapes, and frequencies in a flutter analysis. Preliminary studies were sufficiently encouraging to extend the analysis to the larger-degree-of-freedom AFFDL wing-tail flutter model, where additional analytical flutter suppression studies indicated significant gains in flutter margins could be achieved. The analytical and experimental design of a flutter suppression system for the AFFDL model is presented along with the results of a preliminary passive flutter test.

  16. Active Storage with Analytics Capabilities and I/O Runtime System for Petascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok

    Computational scientists must understand results from experimental, observational and computational simulation generated data to gain insights and perform knowledge discovery. As systems approach the petascale range, problems that were unimaginable a few years ago are within reach. With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive I/O, storage, acceleration of data manipulation, analysis, and mining tools. Scientists require techniques, tools and infrastructure to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis, statistical analysis and knowledge discovery. The goal of this work is to enable more effective analysis of scientific datasets through the integration of enhancements in the I/O stack, from active storage support at the file system layer to MPI-IO and high-level I/O library layers. We propose to provide software components to accelerate data analytics, mining, I/O, and knowledge discovery for large-scale scientific applications, thereby increasing productivity of both scientists and the systems. Our approaches include: 1) designing interfaces in high-level I/O libraries, such as parallel netCDF, for applications to activate data mining operations at the lower I/O layers; 2) enhancing MPI-IO runtime systems to incorporate the functionality developed as part of the runtime system design; 3) developing parallel data mining programs as part of the runtime library for the server-side file system in the PVFS file system; and 4) prototyping an active storage cluster, which will utilize multicore CPUs, GPUs, and FPGAs to carry out the data mining workload.

  17. Rapid B-rep model preprocessing for immersogeometric analysis using analytic surfaces

    PubMed Central

    Wang, Chenglong; Xu, Fei; Hsu, Ming-Chen; Krishnamurthy, Adarsh

    2017-01-01

    to immersogeometric simulations of the same model with NURBS surfaces. We also compare the results of our immersogeometric method with those obtained using boundary-fitted CFD of a tessellated torpedo shape, and quantities of interest such as drag coefficient are in good agreement. Finally, we demonstrate the effectiveness of our immersogeometric method for high-fidelity industrial scale simulations by performing an aerodynamic analysis of a truck that has a large percentage of analytic surfaces. Using analytic surfaces over NURBS avoids unnecessary surface type conversion and significantly reduces model-preprocessing time, while providing the same accuracy for the aerodynamic quantities of interest. PMID:29051678

  18. Clustering in analytical chemistry.

    PubMed

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

    Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, principles of data clustering are presented with a direct focus on clustering of analytical data. The role of the clustering process in the analytical workflow is underlined, and its potential impact on the analytical workflow is emphasized.

  19. Analytical design and evaluation of an active control system for helicopter vibration reduction and gust response alleviation

    NASA Technical Reports Server (NTRS)

    Taylor, R. B.; Zwicke, P. E.; Gold, P.; Miao, W.

    1980-01-01

    An analytical study was conducted to define the basic configuration of an active control system for helicopter vibration and gust response alleviation. The study culminated in a control system design with two separate loops: a narrow band loop for vibration reduction and a wider band loop for gust response alleviation. The narrow band vibration loop utilizes the standard swashplate control configuration to input control signals. The controller for the vibration loop is based on adaptive optimal control theory and is designed to adapt to any flight condition, including maneuvers and transients. The prime characteristic of the vibration control system is its real-time capability. The gust alleviation control system studied consists of optimal sampled data feedback gains together with an optimal one-step-ahead prediction. The prediction permits the estimation of the gust disturbance, which can then be used to minimize the gust effects on the helicopter.

  20. An analytic description of electrodynamic dispersion in free-flow zone electrophoresis.

    PubMed

    Dutta, Debashis

    2015-07-24

    The present work analyzes the electrodynamic dispersion of sample streams in a free-flow zone electrophoresis (FFZE) chamber resulting from partial or complete blockage of electroosmotic flow (EOF) across the channel width by the sidewalls of the conduit. This blockage of EOF has been assumed to generate a pressure-driven backflow in the transverse direction for maintaining flow balance in the system. A parallel-plate based FFZE device with the analyte stream located far away from the channel side regions has been considered to simplify the current analysis. Applying a method-of-moments formulation, an analytic expression was derived for the variance of the sample zone at steady state as a function of its position in the separation chamber under these conditions. It has been shown that the increase in stream broadening due to the electrodynamic dispersion phenomenon is additive to the contributions from molecular diffusion and sample injection, and simply modifies the coefficient for the hydrodynamic dispersion term for a fixed lateral migration distance of the sample stream. Moreover, this dispersion mechanism can dominate the overall spatial variance of analyte zones when a significant fraction of the EOF is blocked by the channel sidewalls. The analysis also shows that analyte streams do not undergo any hydrodynamic broadening due to unwanted pressure-driven cross-flows in an FFZE chamber in the absence of a transverse electric field. The noted results have been validated using Monte Carlo simulations, which further demonstrate that the sample concentration profile at the channel outlet approaches a Gaussian distribution only in FFZE chambers substantially longer than the product of the axial pressure-driven velocity and the characteristic diffusion time in the system. Nevertheless, the spatial variance of the exiting analyte stream is well described by the Taylor-Aris dispersion limit even in analysis ducts much shorter than this length scale. Copyright © 2015
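    For context, the Taylor-Aris dispersion limit mentioned above has a standard textbook form for pressure-driven flow between parallel plates; the expression below is that classical result (with gap h, mean axial velocity u, and molecular diffusivity D), not a formula quoted from this paper.

```latex
% Classical Taylor-Aris effective axial dispersion coefficient
% for a parallel-plate channel (gap h, mean velocity u, diffusivity D):
D_{\mathrm{eff}} = D\left(1 + \frac{\mathrm{Pe}^2}{210}\right),
\qquad \mathrm{Pe} = \frac{u h}{D},
% so the spatial variance of a zone grows linearly in time:
\sigma^2(t) = \sigma_0^2 + 2\, D_{\mathrm{eff}}\, t .
```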

  1. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  2. Semi-analytical Model for Estimating Absorption Coefficients of Optically Active Constituents in Coastal Waters

    NASA Astrophysics Data System (ADS)

    Wang, D.; Cui, Y.

    2015-12-01

    The objectives of this paper are to validate the applicability of a multi-band quasi-analytical algorithm (QAA) for retrieving absorption coefficients of optically active constituents in turbid coastal waters, and to further improve the model using a proposed semi-analytical model (SAA). The ap(531) and ag(531) values derived semi-analytically with the SAA model differ from those of the QAA retrieval procedure, in which ap(531) and ag(531) are derived from the empirical retrieval results of a(531) and a(551). The two models are calibrated and evaluated against datasets taken from 19 independent cruises in the West Florida Shelf in 1999-2003, provided by SeaBASS. The results indicate that the SAA model produces superior performance to the QAA model in absorption retrieval. Use of the SAA model in retrieving absorption coefficients of optically active constituents from the West Florida Shelf decreases the random uncertainty of estimation by >23.05% relative to the QAA model. This study demonstrates the potential of the SAA model for estimating absorption coefficients of optically active constituents even in turbid coastal waters. Keywords: Remote sensing; Coastal Water; Absorption Coefficient; Semi-analytical Model

  3. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Joint Analysis of X-Ray and Sunyaev-Zel'Dovich Observations of Galaxy Clusters Using an Analytic Model of the Intracluster Medium

    NASA Technical Reports Server (NTRS)

    Hasler, Nicole; Bulbul, Esra; Bonamente, Massimiliano; Carlstrom, John E.; Culverhouse, Thomas L.; Gralla, Megan; Greer, Christopher; Lamb, James W.; Hawkins, David; Hennessy, Ryan

    2012-01-01

    We perform a joint analysis of X-ray and Sunyaev-Zel'dovich effect data using an analytic model that describes the gas properties of galaxy clusters. The joint analysis allows the measurement of the cluster gas mass fraction profile and Hubble constant independent of cosmological parameters. Weak cosmological priors are used to calculate the overdensity radius within which the gas mass fractions are reported. Such an analysis can provide direct constraints on the evolution of the cluster gas mass fraction with redshift. We validate the model and the joint analysis on high signal-to-noise data from the Chandra X-ray Observatory and the Sunyaev-Zel'dovich Array for two clusters, A2631 and A2204.

  5. Performing data analytics on information obtained from various sensors on an OSUS compliant system

    NASA Astrophysics Data System (ADS)

    Cashion, Kelly; Landoll, Darian; Klawon, Kevin; Powar, Nilesh

    2017-05-01

    The Open Standard for Unattended Sensors (OSUS) was developed by DIA and ARL to provide a plug-n-play platform for sensor interoperability. Our objective is to use the standardized data produced by OSUS in performing data analytics on information obtained from various sensors. Data analytics can be integrated in one of three ways: within an asset itself; as an independent plug-in designed for one type of asset (e.g., a camera or seismic sensor); or as an independent plug-in designed to incorporate data from multiple assets. As a proof-of-concept, we develop a model of the second of these types: an independent component for camera images. The dataset used was collected as part of a demonstration and test of OSUS capabilities. The image data includes images of empty outdoor scenes and scenes with human or vehicle activity. We design, train, and test a convolutional neural network (CNN) to analyze these images and assess the presence of activity in each image. The resulting classifier labels input images as "empty" or "activity" with 86.93% accuracy, demonstrating the promising opportunities for deep learning, machine learning, and predictive analytics as an extension of OSUS's already robust suite of capabilities.
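    The building block of such a CNN classifier, a convolution filter followed by a nonlinearity and pooling, can be illustrated in plain numpy. This is a toy sketch, not the trained network from the paper: the 6x6 "scene" and the hand-written vertical-edge kernel below are hypothetical stand-ins for what a trained CNN would learn from the OSUS camera data.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the basic operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x6 scene: uniform (empty) on the left, a bright region on the
# right, so the only structure is a single vertical edge.
scene = np.zeros((6, 6))
scene[:, 3:] = 1.0

# Hypothetical vertical-edge filter; a trained CNN learns such kernels.
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

feature_map = np.maximum(conv2d(scene, kernel), 0.0)  # ReLU activation
activity_score = feature_map.max()                    # global max-pooling
label = "activity" if activity_score > 1.0 else "empty"
```

A real classifier stacks many such learned filters and layers and fits a decision boundary from labeled data; the thresholded score here only mimics the final empty/activity labeling step.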

  6. An analytical and experimental study of crack extension in center-notched composites

    NASA Technical Reports Server (NTRS)

    Beuth, Jack L., Jr.; Herakovich, Carl T.

    1987-01-01

    The normal stress ratio theory for crack extension in anisotropic materials is studied analytically and experimentally. The theory is applied within a microscopic-level analysis of a single center notch of arbitrary orientation in a unidirectional composite material. The bulk of the analytical work of this study applies an elasticity solution for an infinite plate with a center line crack to obtain critical stress and crack growth direction predictions. An elasticity solution for an infinite plate with a center elliptical flaw is also used to obtain qualitative predictions of the location of crack initiation on the border of a rounded notch tip. The analytical portion of the study includes the formulation of a new crack growth theory that includes local shear stress. Normal stress ratio theory predictions are obtained for notched unidirectional tensile coupons and unidirectional Iosipescu shear specimens. These predictions are subsequently compared to experimental results.

  7. Proteomic analysis of serum and sputum analytes distinguishes controlled and poorly controlled asthmatics.

    PubMed

    Kasaian, M T; Lee, J; Brennan, A; Danto, S I; Black, K E; Fitz, L; Dixon, A E

    2018-04-17

    A major goal of asthma therapy is to achieve disease control, with maintenance of lung function, reduced need for rescue medication, and prevention of exacerbation. Despite current standard of care, up to 70% of patients with asthma remain poorly controlled. Analysis of serum and sputum biomarkers could offer insights into parameters associated with poor asthma control. To identify signatures as determinants of asthma disease control, we performed proteomics using Olink proximity extension analysis. Up to 3 longitudinal serum samples were collected from 23 controlled and 25 poorly controlled asthmatics. Nine of the controlled and 8 of the poorly controlled subjects also provided 2 longitudinal sputum samples. The study included an additional cohort of 9 subjects whose serum was collected within 48 hours of asthma exacerbation. Two separate pre-defined Proseek Multiplex panels (INF and CVDIII) were run to quantify 181 separate protein analytes in serum and sputum. Panels consisting of 9 markers in serum (CCL19, CCL25, CDCP1, CCL11, FGF21, FGF23, Flt3L, IL-10Rβ, IL-6) and 16 markers in sputum (tPA, KLK6, RETN, ADA, MMP9, Chit1, GRN, PGLYRP1, MPO, HGF, PRTN3, DNER, PI3, Chi3L1, AZU1, and OPG) distinguished controlled and poorly controlled asthmatics. The sputum analytes were consistent with a pattern of neutrophil activation associated with poor asthma control. The serum analyte profile of the exacerbation cohort resembled that of the controlled group rather than that of the poorly controlled asthmatics, possibly reflecting a therapeutic response to systemic corticosteroids. Proteomic profiles in serum and sputum distinguished controlled and poorly controlled asthmatics, and were maintained over time. Findings support a link between sputum neutrophil markers and loss of asthma control. © 2018 John Wiley & Sons Ltd.

  8. Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective

    PubMed Central

    Mench-Bressan, Nadja; McGregor, Carolyn; Pugh, James Edward

    2015-01-01

    The effective use of data within intensive care units (ICUs) has great potential to create new cloud-based health analytics solutions for disease prevention or earlier condition onset detection. The Artemis project aims to achieve the above goals in the area of neonatal ICUs (NICU). In this paper, we propose an analytical model for the Artemis cloud project, which will be deployed at McMaster Children’s Hospital in Hamilton. We collect not only physiological data but also data from the infusion pumps attached to NICU beds. Using the proposed analytical model, we predict the amount of storage, memory, and computation power required for the system. Capacity planning and tradeoff analysis become more accurate and systematic by applying the proposed analytical model. Numerical results are obtained using real inputs acquired from McMaster Children’s Hospital and a pilot deployment of the system at The Hospital for Sick Children (SickKids) in Toronto. PMID:27170907
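The storage prediction described above can be sketched as a back-of-envelope capacity calculation. Every number below (bed count, streams per bed, sampling rate, sample size) is an illustrative assumption, not a figure from the Artemis deployment:

```python
# Hypothetical capacity estimate in the spirit of the Artemis analytical
# model: raw storage scales with beds, device streams, sampling rate,
# and sample size.

def daily_storage_bytes(beds, streams_per_bed, samples_per_sec, bytes_per_sample):
    """Raw storage needed per day for continuously acquired bedside data."""
    seconds_per_day = 86_400
    return beds * streams_per_bed * samples_per_sec * bytes_per_sample * seconds_per_day

# e.g. 40 NICU beds, 6 device streams each (ECG, SpO2, infusion pumps, ...),
# sampled once per second at 8 bytes per sample
estimate = daily_storage_bytes(40, 6, 1, 8)
print(f"{estimate / 1e9:.2f} GB/day")  # 0.17 GB/day
```

The same function, evaluated over candidate configurations, supports the kind of tradeoff analysis the abstract describes.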

  9. Size analysis of polyglutamine protein aggregates using fluorescence detection in an analytical ultracentrifuge.

    PubMed

    Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong

    2013-01-01

    Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence-detected analytical ultracentrifugation is one approach that can offer significant insight into aggregate formation and kinetics. While this technique has traditionally been used with purified proteins, substantial information can now be collected from studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and for setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.

  10. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells (MFCs) were rediscovered twenty years ago and are now a very active research area. The reasons behind this renewed activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately have a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. Part I reviewed a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest application, use as a BOD sensor. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are well suited to measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms must be integrated). We explore here the methods proposed to measure toxicity and microbial metabolism and, of special interest to space exploration, life sensors. Some methods with higher specificity, proposed to detect a single analyte, are also presented. Different possibilities for increasing selectivity and sensitivity by using molecular biology or other modern techniques are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Compensation for matrix effects in the gas chromatography-mass spectrometry analysis of 186 pesticides in tea matrices using analyte protectants.

    PubMed

    Li, Yan; Chen, Xi; Fan, Chunlin; Pang, Guofang

    2012-11-30

    A gas chromatography-mass spectrometry (GC-MS) analytical method was developed for simultaneously determining 186 pesticides in tea matrices, using analyte protectants to counteract the matrix-induced effect. The matrix effects were evaluated for green, oolong and black tea, representing unfermented, partially fermented and completely fermented teas, respectively. Depending on the type of tea, 72%, 94% and 94% of the pesticides presented a strong response-enhancement effect. Several analyte protectants, as well as certain combinations of these protectants, were evaluated to check their compensation effects. A mixture of triglycerol and d-ribonic acid-γ-lactone (both at 2 mg/mL in the injected samples) was found to be the most effective in improving the chromatographic behavior of the 186 pesticides. More than 96% of the 186 pesticides achieved recoveries within the range of 70-120% when using the selected mixture of analyte protectants. The simple addition of analyte protectants offers a more convenient solution to overcoming matrix effects, leaves fewer active sites exposed than matrix-matched standardization, and can be an effective approach to compensating for matrix effects in the GC-MS analysis of pesticide residues. Copyright © 2012 Elsevier B.V. All rights reserved.
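The 70-120% acceptance criterion above is a simple relative-recovery check; a minimal sketch, with spiked and measured amounts invented for illustration:

```python
# Recovery check of the kind used in method validation: recovery is the
# measured amount as a percentage of the spiked amount, accepted here when
# it falls inside the 70-120% window cited above.

def recovery_ok(spiked, measured, window=(70.0, 120.0)):
    """Return (acceptable?, recovery %) for a spiked-sample measurement."""
    recovery = measured / spiked * 100
    return window[0] <= recovery <= window[1], recovery

ok, rec = recovery_ok(100.0, 93.5)   # e.g. 100 ng spiked, 93.5 ng found
print(ok, f"{rec:.1f}%")  # True 93.5%
```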

  12. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  13. Paper-based assay of antioxidant activity using analyte-mediated on-paper nucleation of gold nanoparticles as colorimetric probes.

    PubMed

    Choleva, Tatiana G; Kappi, Foteini A; Giokas, Dimosthenis L; Vlessidis, Athanasios G

    2015-02-20

    With the increasing interest in the health benefits arising from the consumption of dietary products rich in antioxidants, there is a clear demand for easy-to-use and cost-effective tests that can identify the antioxidant power of food products. Paper-based analytical devices constitute a remarkable platform for such expedient and low-cost assays with minimal external resources, but efforts in this direction are still scarce. In this work we introduce a new paper-based device in the form of a sensor patch that enables the determination of antioxidant activity through analyte-driven on-paper formation of gold nanoparticles. The principle of detection capitalizes, for the first time, on the on-paper nucleation of gold ions into their respective nanoparticles upon reduction by antioxidant compounds present in an aqueous sample. The ensuing chromatic transitions, induced on the paper surface, are used as an optical "signature" of the antioxidant strength of the solution. The response of the paper-based sensor was evaluated against a large variety of antioxidant species and the respective dose-response curves were constructed. On the basis of these data, the contribution of each species according to its chemical structure was elucidated. For the analysis of real samples, a concentration-dependent colorimetric response was established against gallic acid equivalents over a linear range of 10 μM-1.0 mM, with detection limits at the low and ultra-low μM levels (i.e. <1.0 μM) and satisfactory precision (RSD = 3.6-12.6%). The sensor has been tested for the assessment of antioxidant activity in real samples (teas and wines) and the results correlated well with commonly used antioxidant detection methods. Importantly, the sensor performed favorably for long periods of time when stored under moisture-free and low-temperature conditions without losing its activity, thus posing as an attractive alternative for the assessment of antioxidant activity without
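A calibration against gallic acid equivalents of the kind described above amounts to fitting and inverting a straight line over the linear range. The (concentration, signal) pairs below are hypothetical, not data from the study:

```python
# Linear calibration against gallic acid equivalents: fit a line to
# standards, then invert it to read an unknown sample off the curve.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc_uM = [10, 50, 100, 500, 1000]          # reported linear range: 10 uM - 1.0 mM
signal  = [0.02, 0.10, 0.21, 1.02, 1.99]    # hypothetical colour intensities

m, b = fit_line(conc_uM, signal)
unknown = (0.55 - b) / m  # invert the calibration for a sample reading of 0.55
print(f"~{unknown:.0f} uM gallic acid equivalents")
```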

  14. Addressing fundamental architectural challenges of an activity-based intelligence and advanced analytics (ABIAA) system

    NASA Astrophysics Data System (ADS)

    Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.

    2015-06-01

    The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.

  15. Long-term Results of an Analytical Assessment of Student Compounded Preparations

    PubMed Central

    Roark, Angie M.; Anksorus, Heidi N.

    2014-01-01

    Objective. To investigate the long-term (ie, 6-year) impact of a required remake vs an optional remake on student performance in a compounding laboratory course in which students’ compounded preparations were analyzed. Methods. The analysis data for several preparations made by students were compared for differences in the analyzed content of the active pharmaceutical ingredient (API) and the number of students who successfully compounded the preparation on the first attempt. Results. There was a consistent statistical difference in the API amount or concentration in 4 of the preparations (diphenhydramine, ketoprofen, metoprolol, and progesterone) in each optional remake year compared to the required remake year. As the analysis requirement was continued, the outcome for each preparation approached and/or attained the expected API result. Two preparations required more than 1 year to demonstrate a statistical difference. Conclusion. The analytical assessment resulted in a consistent, long-term improvement in student performance during the 5-year period after the optional remake policy was instituted. Our assumption is that investment in such an assessment would result in similar benefits at other colleges and schools of pharmacy. PMID:26056402

  16. Analytical and Radiochemistry for Nuclear Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott

    Information about nonproliferation nuclear forensics, forensics activities at Los Alamos National Laboratory, radioanalytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.

  17. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples.

    PubMed

    Artigues, Margalida; Abellà, Jordi; Colominas, Sergi

    2017-11-14

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO₂NTAs) has been evaluated. The GOx-Chitosan/TiO₂NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95-105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were obtained. In addition, the storage stability was examined: after 30 days, the GOx-Chitosan/TiO₂NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated.
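Converting a measured current into a glucose concentration with the reported sensitivity is a one-line calculation. The zero intercept assumed below is a simplification: the abstract reports only the slope and linear range, so a real calibration would also fit the intercept.

```python
# Current-to-concentration conversion using the reported sensitivity
# (5.46 uA per mM) over the validated 0.3-1.5 mM linear range.

SENSITIVITY_UA_PER_MM = 5.46
LINEAR_RANGE_MM = (0.3, 1.5)

def glucose_mM(current_uA):
    conc = current_uA / SENSITIVITY_UA_PER_MM
    low, high = LINEAR_RANGE_MM
    if not low <= conc <= high:
        raise ValueError(f"{conc:.2f} mM lies outside the validated linear range")
    return conc

print(f"{glucose_mM(5.46):.2f} mM")  # 1.00 mM
```

Readings outside the linear range raise an error, mirroring the practice of diluting or re-measuring such samples.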

  18. Analytical Chemistry Division annual progress report for period ending November 30, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1978-03-01

    Activities for the year are summarized in sections on analytical methodology, mass and emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)

  19. Solid-State Thermionic Power Generators: An Analytical Analysis in the Nonlinear Regime

    NASA Astrophysics Data System (ADS)

    Zebarjadi, M.

    2017-07-01

    Solid-state thermionic power generators are an alternative to thermoelectric modules. In this paper, we develop an analytical model to investigate the performance of these generators in the nonlinear regime. We identify dimensionless parameters determining their performance and provide measures to estimate an acceptable range of thermal and electrical resistances of thermionic generators. We find the relation between the optimum load resistance and the internal resistance and suggest guidelines for the design of thermionic power generators. Finally, we show that in the nonlinear regime, thermionic power generators can have efficiency values higher than the state-of-the-art thermoelectric modules.
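As a baseline for the load-matching relation the abstract mentions, the familiar linear-limit result can be sketched: a source with internal resistance delivers maximum power when the load matches it. The open-circuit voltage and resistances below are illustrative, and the paper's own analysis concerns how the nonlinear regime shifts this optimum:

```python
# Linear-limit baseline for generator load matching: power delivered to a
# load R_load by an ideal source V_oc in series with internal resistance
# R_int peaks at R_load == R_int.

def delivered_power(V_oc, R_int, R_load):
    """Power dissipated in the load."""
    current = V_oc / (R_int + R_load)
    return current ** 2 * R_load

V, Ri = 1.0, 2.0   # illustrative values
powers = {RL: delivered_power(V, Ri, RL) for RL in (0.5 * Ri, Ri, 2.0 * Ri)}
best = max(powers, key=powers.get)
print(best == Ri)  # the matched load wins in the linear limit
```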

  20. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    NASA Astrophysics Data System (ADS)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  1. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; this theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis, along with technical advances that may enable μPADs to be more widely implemented in field testing.

  2. Approximated analytical solution to an Ebola optimal control problem

    NASA Astrophysics Data System (ADS)

    Hincapié-Palacio, Doracelly; Ospina, Juan; Torres, Delfim F. M.

    2016-11-01

    An analytical expression for the optimal control of an Ebola problem is obtained. The analytical solution is found as a first-order approximation to the Pontryagin Maximum Principle via the Euler-Lagrange equation. An implementation of the method is given using the computer algebra system Maple. Our analytical solutions confirm the results recently reported in the literature using numerical methods.

  3. An analytical and experimental investigation of active structural acoustic control of noise transmission through double panel systems

    NASA Astrophysics Data System (ADS)

    Carneal, James P.; Fuller, Chris R.

    2004-05-01

    An analytical and experimental investigation of active control of sound transmission through double panel systems has been performed. The technique used was active structural acoustic control (ASAC), where the control inputs, in the form of piezoelectric actuators, were applied to the structure while the radiating pressure field was minimized. Results verify earlier experimental investigations and indicate that applying control inputs to the radiating panel of the double panel system resulted in greater transmission loss (TL) due to its direct effect on the nature of the structural-acoustic (or radiation) coupling between the radiating panel and the receiving acoustic space. Increased control performance was seen in a double panel system consisting of a stiffer radiating panel due to its lower modal density and also as a result of better impedance matching between the piezoelectric actuator and the radiating plate. In general, the results validate the ASAC approach for double panel systems, demonstrating that it is possible to take advantage of double panel system passive behavior to enhance control performance, and provide design guidelines.

  4. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  5. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples

    PubMed Central

    2017-01-01

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO2NTAs) has been evaluated. The GOx–Chitosan/TiO2NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95–105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were obtained. In addition, the storage stability was examined: after 30 days, the GOx–Chitosan/TiO2NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated. PMID:29135931

  6. Development and optimization of an analytical system for volatile organic compound analysis coming from the heating of interstellar/cometary ice analogues.

    PubMed

    Abou Mrad, Ninette; Duvernay, Fabrice; Theulé, Patrice; Chiavassa, Thierry; Danger, Grégoire

    2014-08-19

    This contribution presents an original analytical system for studying volatile organic compounds (VOC) coming from the heating and/or irradiation of interstellar/cometary ice analogues (VAHIIA system) through laboratory experiments. The VAHIIA system brings solutions to three analytical constraints regarding chromatography analysis: the low desorption kinetics of VOC (many hours) in the vacuum chamber during laboratory experiments, the low pressure under which they sublime (10−9 mbar), and the presence of water in ice analogues. The VAHIIA system which we developed, calibrated, and optimized is composed of two units. The first is a preconcentration unit providing the VOC recovery. This unit is based on a cryogenic trapping which allows VOC preconcentration and provides an adequate pressure allowing their subsequent transfer to an injection unit. The latter is a gaseous injection unit allowing the direct injection into the GC-MS of the VOC previously transferred from the preconcentration unit. The feasibility of the online transfer through this interface is demonstrated. Nanomoles of VOC can be detected with the VAHIIA system, and the variability in replicate measurements is lower than 13%. The advantages of the GC-MS in comparison to infrared spectroscopy are pointed out, the GC-MS allowing an unambiguous identification of compounds coming from complex mixtures. Beyond the application to astrophysical subjects, these analytical developments can be used for all systems requiring vacuum/cryogenic environments.

  7. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figures: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.)
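The storage-based MapReduce pattern can be illustrated with a toy reduction: mappers emit partial (sum, count) pairs per variable and the reducer merges them, so a mean never requires the whole series in one place. The keys and values below are invented; the real service operates on gridded MERRA files:

```python
# Toy MapReduce-style mean: map emits (key, (value, 1)); reduce merges
# partial sums and counts per key, then divides.

from collections import defaultdict

def map_phase(records):
    for key, value in records:
        yield key, (value, 1)

def reduce_phase(mapped):
    acc = defaultdict(lambda: [0.0, 0])
    for key, (partial_sum, count) in mapped:
        acc[key][0] += partial_sum
        acc[key][1] += count
    return {k: s / c for k, (s, c) in acc.items()}

partition_a = [("t2m", 271.0), ("t2m", 273.0)]  # e.g. 2 m air temperature, K
partition_b = [("t2m", 275.0)]
means = reduce_phase(list(map_phase(partition_a)) + list(map_phase(partition_b)))
print(means)  # {'t2m': 273.0}
```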

  8. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in both of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths: the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
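A Monte Carlo confidence interval for an indirect effect a*b, one of the inference methods the article generalizes to within-participant designs, can be sketched as follows. The path estimates and standard errors below are invented for illustration:

```python
# Monte Carlo CI for an indirect effect: draw each path estimate from a
# normal distribution with its standard error, take the product, and read
# off the 2.5th and 97.5th percentiles of the simulated products.

import random

random.seed(1)
a, se_a = 0.40, 0.10   # path: condition -> mediator M (hypothetical)
b, se_b = 0.50, 0.12   # path: M -> outcome Y, adjusting for condition

draws = sorted(random.gauss(a, se_a) * random.gauss(b, se_b) for _ in range(20_000))
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"indirect effect = {a * b:.2f}, 95% MC CI [{lo:.2f}, {hi:.2f}]")
# a CI excluding zero supports a claim of mediation
```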

  9. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    PubMed

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.

    PubMed

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-08-01

    Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P < 0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.
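The reported 75.86% figure is the relative reduction in the non-conforming sample rate between the 2007 baseline and 2015, which can be verified directly:

```python
# Relative reduction in the non-conforming sample rate, 2007 vs 2015.

before, after = 0.29, 0.07   # % of samples non-conforming
improvement = (before - after) / before * 100
print(f"{improvement:.2f}%")  # 75.86%
```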

  11. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  12. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    PubMed

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

Substantial progress has recently been made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and a few methods are currently available that allow reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg⁻¹ for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy or other parameters of the method.
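The accuracy and repeatability figures quoted above come down to two simple figures of merit. A minimal sketch with made-up replicate data:

```python
import statistics

def recovery_pct(measured, spiked):
    """Mean recovery (%) of a spiked analyte."""
    return statistics.mean(m / spiked * 100 for m in measured)

def rsd_pct(values):
    """Relative standard deviation (%), the repeatability metric
    typically reported in method validation."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical replicate results for bound 3-MCPD spiked at 1.00 mg/kg
replicates = [0.98, 1.02, 1.00, 0.99, 1.01]
rec = recovery_pct(replicates, 1.00)
rsd = rsd_pct(replicates)
```

With these invented replicates, the recovery falls inside the 97-106% window and the RSD under 2% that the study reports as acceptable.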

  13. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  14. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks that is noninvasive, nondestructive and applicable in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan periods) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  15. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
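The core quantitation step of qHNMR with an internal calibrant reduces to a single relation between integrals, proton counts, molar masses and weighed masses. A minimal sketch with hypothetical numbers (the review also covers external calibration, which this simple form does not show):

```python
def qhnmr_content(I_a, N_a, M_a, m_a, I_c, N_c, M_c, m_c, purity_c=1.0):
    """Analyte content (mass fraction) from 1H NMR integrals via an
    internal calibrant: the integral ratio is normalized per proton,
    then scaled by molar masses and weighed masses."""
    return (I_a / I_c) * (N_c / N_a) * (M_a / M_c) * (m_c / m_a) * purity_c

# Hypothetical example: a 2-proton analyte signal against a 3-proton
# calibrant signal; all masses and integrals are invented.
content = qhnmr_content(I_a=1.95, N_a=2, M_a=300.0, m_a=10.0,
                        I_c=3.00, N_c=3, M_c=150.0, m_c=5.0,
                        purity_c=0.999)
```

Because every quantity enters as a ratio, no identical reference material is required, which is one of the advantages the review emphasizes.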

  16. An Analysis of Activities in Preschool Settings. Final Report.

    ERIC Educational Resources Information Center

    Berk, Laura E.

    This research was aimed at an analysis of classroom activities which make up educational programs for young children. Its broad purpose was to analyze systematically and to make comparisons among six preschool programs in order to describe the patterns of activity settings used; the objectives activity settings were designed to reach from the…

  17. Timing variation in an analytically solvable chaotic system

    NASA Astrophysics Data System (ADS)

    Blakely, J. N.; Milosavljevic, M. S.; Corron, N. J.

    2017-02-01

We present analytic solutions for a chaotic dynamical system that do not have the regular timing characteristic of recently reported solvable chaotic systems. The dynamical system can be viewed as a first-order filter with binary feedback. The feedback state may be switched only at instants defined by an external clock signal. Generalizing from a period-one clock, we show analytic solutions for period-two and higher-period clocks. We show that even when the clock 'ticks' randomly the chaotic system has an analytic solution. These solutions can be visualized in a stroboscopic map whose complexity increases with the complexity of the clock. We provide both analytic results and experimental data from an electronic circuit implementation of the system. Our findings bridge the gap between the irregular timing of well-known chaotic systems such as Lorenz and Rössler and the well-regulated oscillations of recently reported solvable chaotic systems.
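A minimal sketch of the stroboscopic map for such a system, under simplifying assumptions: a first-order unstable filter x' = gamma*(x - s) whose binary feedback s = sign(x) is latched only at the ticks of a fixed period-one clock, with the tick spacing chosen so the growth factor per period is 2. These equations are a simplified stand-in, not the exact system of the paper.

```python
def stroboscopic_step(x, growth=2.0):
    """One clock period of the unstable filter x' = gamma*(x - s),
    with binary feedback s = sign(x) latched at the tick.
    Exact solution over one period: x -> s + growth*(x - s)."""
    s = 1.0 if x >= 0 else -1.0
    return s + growth * (x - s)

# Iterate the map: the orbit stays bounded in [-1, 1] even though the
# filter itself is unstable, because the latched feedback re-folds it.
x = 0.1234
orbit = []
for _ in range(1000):
    x = stroboscopic_step(x)
    orbit.append(x)
```

With growth factor 2 this stroboscopic map is x -> 2x - sign(x), conjugate to a Bernoulli shift, which is why a closed-form solution and chaos can coexist.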

  18. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  19. Intersubject synchronisation analysis of brain activity associated with the instant effects of acupuncture: an fMRI study.

    PubMed

    Jin, Lingmin; Sun, Jinbo; Xu, Ziliang; Yang, Xuejuan; Liu, Peng; Qin, Wei

    2018-02-01

    To use a promising analytical method, namely intersubject synchronisation (ISS), to evaluate the brain activity associated with the instant effects of acupuncture and compare the findings with traditional general linear model (GLM) methods. 30 healthy volunteers were recruited for this study. Block-designed manual acupuncture stimuli were delivered at SP6, and de qi sensations were measured after acupuncture stimulation. All subjects underwent functional MRI (fMRI) scanning during the acupuncture stimuli. The fMRI data were separately analysed by ISS and traditional GLM methods. All subjects experienced de qi sensations. ISS analysis showed that the regions activated during acupuncture stimulation at SP6 were mainly divided into five clusters based on the time courses. The time courses of clusters 1 and 2 were in line with the acupuncture stimulation pattern, and the active regions were mainly involved in the sensorimotor system and salience network. Clusters 3, 4 and 5 displayed an almost contrary time course relative to the stimulation pattern. The brain regions activated included the default mode network, descending pain modulation pathway and visual cortices. GLM analysis indicated that the brain responses associated with the instant effects of acupuncture were largely implicated in sensory and motor processing and sensory integration. The ISS analysis considered the sustained effect of acupuncture and uncovered additional information not shown by GLM analysis. We suggest that ISS may be a suitable approach to investigate the brain responses associated with the instant effects of acupuncture. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
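The core of an ISS analysis can be sketched as a leave-one-out intersubject correlation: regions driven by the shared stimulus synchronize across subjects, while subject-specific spontaneous activity does not. The synthetic data below (block design, noise level, subject count) are illustrative assumptions, not the study's acquisition parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_t = 30, 200

# Block-design stimulus time course shared by all subjects
stim = np.tile(np.r_[np.ones(20), np.zeros(20)], 5)

# Synthetic regional time courses: one region driven by the stimulus,
# one containing only subject-specific noise.
driven = stim + 0.5 * rng.standard_normal((n_subj, n_t))
noise = rng.standard_normal((n_subj, n_t))

def iss(ts):
    """Leave-one-out intersubject synchronisation: mean correlation of
    each subject's time course with the average of all other subjects."""
    rs = []
    for i in range(len(ts)):
        others = np.delete(ts, i, axis=0).mean(axis=0)
        rs.append(np.corrcoef(ts[i], others)[0, 1])
    return float(np.mean(rs))

iss_driven, iss_noise = iss(driven), iss(noise)
```

Unlike a GLM, this measure needs no model of the expected response shape, which is why it can also pick up sustained effects whose time course departs from the stimulation pattern.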

  20. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when the analytical signals are highly overlapped. A method with regression by partial least squares is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and tramazoline hydrochloride. These sympathomimetic agents are frequently combined in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. Adequate selection of the spectral region proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among excipients, the spectral region between 250 and 290 nm was selected. Recoveries for the vasoconstrictors were 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
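The principle behind resolving such overlapped signals can be sketched with synthetic spectra. A hedge is in order: the paper uses partial least squares, while for brevity this sketch uses classical least squares against known pure-component spectra; it illustrates the same point, that a full spectral window resolves what single wavelengths cannot. Band positions, widths and concentrations are invented.

```python
import numpy as np

wl = np.linspace(250, 290, 81)  # nm, the selected spectral window

def band(center, width):
    """Gaussian absorption band on the wavelength grid."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical pure-component spectra of the two vasoconstrictors:
# strongly overlapped bands, centers only 4 nm apart.
s1, s2 = band(268.0, 6.0), band(272.0, 6.0)
S = np.column_stack([s1, s2])

# Mixture spectrum: 0.8 * component 1 + 0.5 * component 2 + noise
rng = np.random.default_rng(1)
mix = 0.8 * s1 + 0.5 * s2 + 0.002 * rng.standard_normal(wl.size)

# Multivariate calibration over the whole window: least-squares
# projection of the mixture onto the pure-component spectra.
conc, *_ = np.linalg.lstsq(S, mix, rcond=None)
recovery = conc / np.array([0.8, 0.5]) * 100
```

Even with heavily overlapped bands, using all 81 wavelengths jointly recovers both concentrations to within a few percent, mirroring the 98-101% recoveries reported.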

  1. Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.

    PubMed

    Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin

    2016-05-01

Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and provide a more detailed assignment of bands through the parallel study of erythritol [15N4] tetranitrate. In the case of powder diffraction, recently published data were verified, and the 1H, 13C and 15N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.

  2. An Analytic Approximation to Very High Specific Impulse and Specific Power Interplanetary Space Mission Analysis

    NASA Technical Reports Server (NTRS)

    Williams, Craig Hamilton

    1995-01-01

    A simple, analytic approximation is derived to calculate trip time and performance for propulsion systems of very high specific impulse (50,000 to 200,000 seconds) and very high specific power (10 to 1000 kW/kg) for human interplanetary space missions. The approach assumed field-free space, constant thrust/constant specific power, and near straight line (radial) trajectories between the planets. Closed form, one dimensional equations of motion for two-burn rendezvous and four-burn round trip missions are derived as a function of specific impulse, specific power, and propellant mass ratio. The equations are coupled to an optimizing parameter that maximizes performance and minimizes trip time. Data generated for hypothetical one-way and round trip human missions to Jupiter were found to be within 1% and 6% accuracy of integrated solutions respectively, verifying that for these systems, credible analysis does not require computationally intensive numerical techniques.
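A rougher, constant-acceleration version of the paper's straight-line, field-free reasoning shows the scale of the numbers involved. The efficiency, specific-power value, leg distance and the accelerate-to-midpoint profile are assumptions of this sketch, not the paper's two-burn formulation (which keeps specific power constant while mass varies).

```python
import math

AU = 1.495978707e11  # m
g0 = 9.80665         # m/s^2

def accel_from_specific_power(alpha_kw_per_kg, isp_s, efficiency=1.0):
    """Initial acceleration of a power-limited rocket: jet power
    P = F*ve/2, so F/m = 2*eta*alpha/ve (alpha converted to W/kg)."""
    ve = isp_s * g0
    return 2.0 * efficiency * alpha_kw_per_kg * 1e3 / ve

def trip_time_days(distance_m, accel):
    """Accelerate to the midpoint, flip, decelerate: d = a*t^2/4,
    hence t = 2*sqrt(d/a). Field-free, straight-line, constant a."""
    return 2.0 * math.sqrt(distance_m / accel) / 86400.0

# Hypothetical Earth -> Jupiter leg of ~4.2 AU, at the upper end of the
# paper's parameter ranges (100 kW/kg, 50,000 s specific impulse)
a0 = accel_from_specific_power(alpha_kw_per_kg=100.0, isp_s=50_000)
t_days = trip_time_days(4.2 * AU, a0)
```

Under these assumptions the acceleration is a few tenths of a m/s² and the one-way time is on the order of weeks, which conveys why such extreme specific powers make fast human interplanetary missions conceivable.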

  3. Current trends in green liquid chromatography for the analysis of pharmaceutically active compounds in the environmental water compartments.

    PubMed

    Shaaban, Heba; Górecki, Tadeusz

    2015-01-01

Green analytical chemistry is an aspect of green chemistry which was introduced in the late 1990s. The main objectives of green analytical chemistry are to develop new analytical technologies or to modify old methods to incorporate procedures that use less hazardous chemicals. There are several approaches to achieving this goal, such as using environmentally benign solvents and reagents, reducing chromatographic separation times and miniaturizing analytical devices. Traditional methods used for the analysis of pharmaceutically active compounds require large volumes of organic solvents and generate large amounts of waste. Most of these solvents are volatile and harmful to the environment. With growing environmental awareness, the development of green technologies has been receiving increasing attention, aiming at eliminating or reducing the amount of organic solvents consumed every day worldwide without loss in chromatographic performance. This review provides the state of the art of green analytical methodologies for environmental analysis of pharmaceutically active compounds in the aquatic environment, with special emphasis on strategies for greening liquid chromatography (LC). The current trends of fast LC applied to environmental analysis, including elevated mobile phase temperature, as well as different column technologies such as monolithic columns, fully porous sub-2 μm and superficially porous particles, are presented. In addition, green aspects of gas chromatography (GC) and supercritical fluid chromatography (SFC) are discussed. We pay special attention to new green approaches such as automation, miniaturization, direct analysis and the possibility of locating the chromatograph on-line or at-line as a step forward in reducing the environmental impact of chromatographic analyses. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper, an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
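Four-pole (transmission matrix) elements compose by matrix multiplication, which is what makes the theory convenient for chaining subsystems along a transfer path. A minimal sketch under one common sign convention, with a rigid mass cascaded into a massless spring; the element values are invented.

```python
import numpy as np

def four_pole_mass(m, omega):
    """Four-pole matrix of a rigid mass under one common convention:
    F1 = F2 + j*w*m*v2, v1 = v2."""
    return np.array([[1.0, 1j * omega * m],
                     [0.0, 1.0]])

def four_pole_spring(k, omega):
    """Four-pole matrix of a massless spring: F1 = F2,
    v1 = v2 + j*w*F2/k."""
    return np.array([[1.0, 0.0],
                     [1j * omega / k, 1.0]])

# Cascade mass -> spring (e.g. a machine on an isolator) at 50 Hz:
# the overall four-pole matrix is just the ordered matrix product.
omega = 2 * np.pi * 50.0
T = four_pole_mass(10.0, omega) @ four_pole_spring(1.0e6, omega)

# Reciprocity: each passive element, and hence any cascade, has det = 1.
det_T = np.linalg.det(T)
```

The unit determinant is a useful sanity check when assembling longer chains of elements extracted from a bond graph model.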

  5. Meta-analytically informed network analysis of resting state FMRI reveals hyperconnectivity in an introspective socio-affective network in depression.

    PubMed

    Schilbach, Leonhard; Müller, Veronika I; Hoffstaedter, Felix; Clos, Mareike; Goya-Maldonado, Roberto; Gruber, Oliver; Eickhoff, Simon B

    2014-01-01

    Alterations of social cognition and dysfunctional interpersonal expectations are thought to play an important role in the etiology of depression and have, thus, become a key target of psychotherapeutic interventions. The underlying neurobiology, however, remains elusive. Based upon the idea of a close link between affective and introspective processes relevant for social interactions and alterations thereof in states of depression, we used a meta-analytically informed network analysis to investigate resting-state functional connectivity in an introspective socio-affective (ISA) network in individuals with and without depression. Results of our analysis demonstrate significant differences between the groups with depressed individuals showing hyperconnectivity of the ISA network. These findings demonstrate that neurofunctional alterations exist in individuals with depression in a neural network relevant for introspection and socio-affective processing, which may contribute to the interpersonal difficulties that are linked to depressive symptomatology.

  6. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    NASA Astrophysics Data System (ADS)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

At present, there are two main ideas for gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE method is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  7. Strategic, Analytic and Operational Domains of Information Management.

    ERIC Educational Resources Information Center

    Diener, Richard AV

    1992-01-01

    Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…

  8. Performance analysis of junctionless double gate VeSFET considering the effects of thermal variation - An explicit 2D analytical model

    NASA Astrophysics Data System (ADS)

    Chaudhary, Tarun; Khanna, Gargi

    2017-03-01

The purpose of this paper is to explore the junctionless double gate vertical slit field effect transistor (JLDG VeSFET) with reduced short channel effects and to develop an analytical threshold voltage model for the device, considering the impact of thermal variations for the first time. The model has been derived by solving the 2D Poisson's equation, and the effects of temperature variation on various electrical parameters of the device, such as Rout, drain current, mobility, subthreshold slope and DIBL, have been studied and described in the paper. The model provides deep physical insight into the device behavior and is also very helpful in the design space exploration for the JLDG VeSFET. The proposed model is verified against simulation at different radii of the device, and good agreement between the analytical model and simulation results has been observed.

  9. An Overview of Conventional and Emerging Analytical Methods for the Determination of Mycotoxins

    PubMed Central

    Cigić, Irena Kralj; Prosen, Helena

    2009-01-01

    Mycotoxins are a group of compounds produced by various fungi and excreted into the matrices on which they grow, often food intended for human consumption or animal feed. The high toxicity and carcinogenicity of these compounds and their ability to cause various pathological conditions has led to widespread screening of foods and feeds potentially polluted with them. Maximum permissible levels in different matrices have also been established for some toxins. As these are quite low, analytical methods for determination of mycotoxins have to be both sensitive and specific. In addition, an appropriate sample preparation and pre-concentration method is needed to isolate analytes from rather complicated samples. In this article, an overview of methods for analysis and sample preparation published in the last ten years is given for the most often encountered mycotoxins in different samples, mainly in food. Special emphasis is on liquid chromatography with fluorescence and mass spectrometric detection, while in the field of sample preparation various solid-phase extraction approaches are discussed. However, an overview of other analytical and sample preparation methods less often used is also given. Finally, different matrices where mycotoxins have to be determined are discussed with the emphasis on their specific characteristics important for the analysis (human food and beverages, animal feed, biological samples, environmental samples). Various issues important for accurate qualitative and quantitative analyses are critically discussed: sampling and choice of representative sample, sample preparation and possible bias associated with it, specificity of the analytical method and critical evaluation of results. PMID:19333436

  10. Analytical and Numerical Studies of Active and Passive Microwave Ocean Remote Sensing

    DTIC Science & Technology

    2001-09-30

    of both analytical and efficient numerical methods for electromagnetics and hydrodynamics. New insights regarding these phenomena can then be applied to improve microwave active and passive remote sensing of the ocean surface.

  11. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified; the journal is the most significant new initiative of SoLAR.

  12. Imagining with the body in analytical psychology. Movement as active imagination: an interdisciplinary perspective from philosophy and neuroscience.

    PubMed

    Deligiannis, Ana

    2018-04-01

    This article explores how the body and imagination operate as pathways of knowledge through the use of Movement as Active Imagination in clinical practice. This method activates the transcendent function, thus encouraging new therapeutic responses. A philosophical perspective (Spinoza, Nietzsche, Merleau-Ponty) and some concepts from neuroscience (embodied cognition, somatic markers, image schema, mirror neurons, neuronal plasticity) will accompany us throughout this work, illustrated with a clinical vignette. Three levels of integration: 1) body, 2) body-emotion, 3) body-emotion-imagination are proposed: these mark a progressive sense of articulation and complexity. Finally the relation between creativity and neuronal plasticity will be considered. © 2018, The Society of Analytical Psychology.

  13. Towards an Analytical Framework for Understanding the Development of a Quality Assurance System in an International Joint Programme

    ERIC Educational Resources Information Center

    Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang

    2017-01-01

    This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…

  14. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

The automated analysis of indirect immunofluorescence (IIF) images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automation of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automation of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of…

  15. 40 CFR 158.355 - Enforcement analytical method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Enforcement analytical method. 158.355... DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An analytical method suitable for enforcement purposes must be provided for each active ingredient in the...

  16. Analytical Problems and Suggestions in the Analysis of Behavioral Economic Demand Curves.

    PubMed

    Yu, Jihnhee; Liu, Liu; Collins, R Lorraine; Vincent, Paula C; Epstein, Leonard H

    2014-01-01

    Behavioral economic demand curves (Hursh, Raslear, Shurtleff, Bauman, & Simmons, 1988) are innovative approaches to characterize the relationships between consumption of a substance and its price. In this article, we investigate common analytical issues in the use of behavioral economic demand curves, which can cause inconsistent interpretations of demand curves, and then we provide methodological suggestions to address those analytical issues. We first demonstrate that log transformation with different added values for handling zeros changes model parameter estimates dramatically. Second, demand curves are often analyzed using an overparameterized model that results in an inefficient use of the available data and a lack of assessment of the variability among individuals. To address these issues, we apply a nonlinear mixed effects model based on multivariate error structures that has not previously been used to analyze behavioral economic demand curves in the literature. We also propose analytical formulas for the relevant standard errors of derived values such as Pmax, Omax, and elasticity. The proposed model stabilizes the derived values regardless of the added increment used and provides substantially smaller standard errors. We illustrate the data analysis procedure using data from a relative reinforcement efficacy study of simulated marijuana purchasing.
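    The first issue the authors raise (sensitivity of the log transformation to the increment added for zero consumption) is easy to reproduce numerically. The data and the simple log-log slope fit below are hypothetical stand-ins, not the article's model:

```python
import numpy as np

# Hypothetical demand data: consumption falls with price, with zeros at high prices.
price = np.array([0.5, 1, 2, 4, 8, 16, 32], dtype=float)
consumption = np.array([10, 9, 7, 4, 2, 0, 0], dtype=float)

def fitted_slope(added):
    """Slope of log10(consumption + added) on log10(price):
    a crude stand-in for demand-curve slope/elasticity estimation."""
    y = np.log10(consumption + added)
    x = np.log10(price)
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

# The estimated slope changes dramatically with the added increment.
for a in (0.01, 0.1, 1.0):
    print(f"added={a}: slope={fitted_slope(a):.3f}")
```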

  17. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  18. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. Drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used to choose the FAA's investments, taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to multiple assets and realistic parameter values to draw an efficient risk-return frontier for the FAA's entire investment program.

  19. Activating analytic thinking enhances the value given to individualizing moral foundations.

    PubMed

    Yilmaz, Onurcan; Saribay, S Adil

    2017-08-01

    Two central debates within Moral Foundations Theory concern (1) which moral foundations are core and (2) how conflict between ideological camps stemming from valuing different moral foundations can be resolved. Previous studies have attempted to answer the first question by imposing cognitive load on participants to direct them toward intuitive and automatic thought. However, this method has limitations and has produced mixed findings. In the present research, in two experiments, instead of directing participants toward intuitive thought, we tested the effects of activating high-effort, analytic thought on participants' moral foundations. In both experiments, activating analytic thought caused participants to value individualizing foundations more than participants in the control condition did. This effect was not qualified by participants' political orientation. No effect was observed on binding foundations. The results are consistent with the idea that upholding individualizing foundations requires mental effort and may provide a basis for reconciliation between different ideological camps. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  1. An active learning representative subset selection method using net analyte signal

    NASA Astrophysics Data System (ADS)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, it is generally not known whether a sample is representative prior to measuring its concentration, a measurement that is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by computing its norm. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. A validation test shows that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.

  2. An active learning representative subset selection method using net analyte signal.

    PubMed

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-05

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, it is generally not known whether a sample is representative prior to measuring its concentration, a measurement that is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by computing its norm. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. A validation test shows that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced. Copyright © 2018 Elsevier B.V. All rights reserved.
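    The selection procedure described in the abstract can be sketched roughly as follows. The spectra are random stand-ins, and the projection matrix is built from hypothetical interferent spectra; this is an assumed reading of the method, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy spectra: rows are samples, columns are wavelengths (hypothetical data).
interferents = rng.random((3, 20))   # spectra spanning the interferent space
candidates = rng.random((30, 20))    # unlabeled candidate samples

# Projection onto the space orthogonal to the interferents: P = I - A+ A.
P = np.eye(20) - np.linalg.pinv(interferents) @ interferents

def nas_norm(spectrum):
    """Euclidean norm of the net analyte signal (NAS) vector."""
    return np.linalg.norm(P @ spectrum)

def select_representative(candidates, n_select):
    """Sequentially pick the candidate whose NAS norm is farthest
    from every already-selected sample's NAS norm."""
    norms = np.array([nas_norm(s) for s in candidates])
    selected = [int(np.argmax(norms))]       # seed with the largest NAS
    while len(selected) < n_select:
        dist = np.array([min(abs(norms[i] - norms[j]) for j in selected)
                         for i in range(len(candidates))])
        dist[selected] = -1                  # never re-pick a sample
        selected.append(int(np.argmax(dist)))
    return selected

picked = select_representative(candidates, 5)
print(picked)   # indices of samples worth sending for reference measurement
```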

  3. Human Factors in Field Experimentation Design and Analysis of Analytical Suppression Model

    DTIC Science & Technology

    1978-09-01

    ..."men in man-machine systems" supports the development of new doctrines, design of weapon systems, as well as training programs for troops. ... Human Factors in Field Experimentation Design and Analysis of an Analytical Suppression Model (Master's thesis). ... influences on suppression. Techniques are examined for including the suppressive effects of weapon systems in Lanchester-type combat models, which may be...

  4. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has focused attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative factors, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra-)high-performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover the analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  5. Physical activity and body image among men and boys: A meta-analysis.

    PubMed

    Bassett-Gunter, Rebecca; McEwan, Desmond; Kamarhie, Aria

    2017-09-01

    Three meta-analytic reviews have concluded that physical activity is positively related to body image. Historically, research regarding physical activity and body image has been disproportionately focused on female samples. For example, the most recent meta-analysis (2009) extracted 56 effect sizes for women and only 12 for men. The current paper provides an update to the literature regarding the relationship between physical activity and body image among men and boys across 84 individual effect sizes. The analysis also provides insight regarding moderator variables, including participant age and physical activity type and intensity. Overall, physical activity was positively related to body image among men and boys, with various moderator variables warranting further investigation. Pragmatic implications are discussed, as well as the limitations of existing research and the need for additional research to further understand moderator and mediator variables. Copyright © 2017 Elsevier Ltd. All rights reserved.
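    Pooling effect sizes in a meta-analysis of this kind typically uses inverse-variance weighting; a fixed-effect sketch with hypothetical effect sizes (not the paper's 84) might look like:

```python
import math

# Hypothetical standardized effect sizes (d) and variances from individual studies.
effects =   [0.35, 0.10, 0.42, 0.25, 0.18]
variances = [0.02, 0.05, 0.03, 0.01, 0.04]

# Fixed-effect model: weight each study by the inverse of its variance.
weights = [1.0 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled d = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

A random-effects model (as moderator analyses usually require) would add a between-study variance component to each weight.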

  6. Nine-analyte detection using an array-based biosensor

    NASA Technical Reports Server (NTRS)

    Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.

    2002-01-01

    A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.

  7. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  8. An analytical study on groundwater flow in drainage basins with horizontal wells

    NASA Astrophysics Data System (ADS)

    Wang, Jun-Zhi; Jiang, Xiao-Wei; Wan, Li; Wang, Xu-Sheng; Li, Hailong

    2014-06-01

    Analytical studies on release/capture zones are often limited to a uniform background groundwater flow. In fact, for basin-scale problems, the undulating water table leads to the development of hierarchically nested flow systems, which are more complex than a uniform flow. Under the premise that the water table is a replica of the undulating topography and is hardly influenced by wells, an analytical solution of hydraulic head is derived for a two-dimensional cross section of a drainage basin with horizontal injection/pumping wells. Based on the analytical solution, distributions of hydraulic head, stagnation points, and flow systems (including release/capture zones) are explored. The superposition of injection/pumping wells onto the background flow field leads to the development of new internal stagnation points and new flow systems (including release/capture zones). Generally speaking, the existence of n injection/pumping wells results in up to n new internal stagnation points and up to 2n new flow systems (including release/capture zones). The analytical study presented, which integrates traditional well hydraulics with the theory of regional groundwater flow, is useful in understanding basin-scale groundwater flow influenced by human activities.
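    The superposition principle the study builds on can be illustrated in a much simpler setting: a uniform background flow plus a single pumping well, whose combined field has an analytically known stagnation point. The parameter values are arbitrary, and this is not the paper's Tóth-type basin solution:

```python
import numpy as np

U = 1.0      # background (regional) flow speed, arbitrary units
Q = 2.0      # pumping rate of a well at the origin, arbitrary units

def velocity(x, y):
    """Superposition: uniform flow in +x plus a pumping well (sink) at (0, 0)."""
    r2 = x**2 + y**2
    u = U - Q * x / (2 * np.pi * r2)
    v = -Q * y / (2 * np.pi * r2)
    return u, v

# The combined field has a stagnation point at y = 0, x = Q / (2*pi*U),
# which bounds the well's capture zone.
x_stag = Q / (2 * np.pi * U)
u, v = velocity(x_stag, 1e-12)
print(f"stagnation at x = {x_stag:.4f}; velocity there = ({u:.2e}, {v:.2e})")
```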

  9. An alternative analytical method based on ultrasound micro bath hydrolysis and GC-MS analysis for the characterization of organic biomarkers in archaeological ceramics.

    PubMed

    Blanco-Zubiaguirre, Laura; Olivares, Maitane; Castro, Kepa; Iñañez, Javier G; Madariaga, Juan Manuel

    2016-11-01

    The analysis of organic biomarkers in ancient and valuable archaeological remains provides a worthwhile source of information regarding their management. This work was focused on the development of an analytical procedure to characterize organic residues that have remained in archaeological ceramic samples. A novel analytical approach based on an alkaline hydrolysis by means of an ultrasound micro bath followed by liquid extraction was proposed to isolate saturated and unsaturated fatty acids, degradation products such as dihydroxy acids or dienoic fatty acids, isoprenoid fatty acids, and many other biomarkers from archaeological remains. This main goal has been achieved after the optimization of the main parameters affecting the hydrolysis step, the extraction procedure, and the derivatization step prior to the gas chromatography-mass spectrometry analysis. In this work, archaeological ceramic remains suspected to have been used by Basque Whalers to store whale oil in the period from the sixteenth to the seventeenth century were studied. Nevertheless, the proposed method is useful to determine the organic remains preserved in many other archaeological ceramic remains. Moreover, this methodology can be used to determine organic remains in any porous ceramic, archaeological or not. The preliminary results of the analysis of ceramic vessels led to the determination of some interesting unsaturated compounds such as 11-eicosenoic acid, an important biomarker of marine commodities, and several saturated fatty acids, which could be indicative of having used the vessels to store whale oil. Graphical abstract ᅟ.

  10. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
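    The mapping of a time-varying surface temperature to an equivalent square pulse can be illustrated with a first-order lumped model (a drastic simplification of the insulated-structure problem; the time constant and pulse shape below are hypothetical, not the paper's parameters):

```python
import numpy as np

def structure_response(t, T_surf, tau=200.0):
    """First-order lumped model: structure temperature driven through
    insulation with time constant tau (explicit Euler integration)."""
    Ts = np.zeros_like(t)
    for i in range(1, len(t)):
        dt = t[i] - t[i-1]
        Ts[i] = Ts[i-1] + dt * (T_surf[i-1] - Ts[i-1]) / tau
    return Ts

t = np.linspace(0.0, 2000.0, 4001)
# Triangular surface-temperature history (hypothetical entry heating).
tri = np.interp(t, [0, 400, 1200], [0, 1000, 0])

# Equivalent square pulse: same peak, duration chosen to conserve the
# time-integrated surface temperature.
area = float(np.sum(0.5 * (tri[1:] + tri[:-1]) * np.diff(t)))
width = area / 1000.0
square = np.where(t <= width, 1000.0, 0.0)

peak_tri = structure_response(t, tri).max()
peak_sq = structure_response(t, square).max()
print(f"peak structure temperature: triangular={peak_tri:.1f}, square={peak_sq:.1f}")
```

The square pulse is a conservative surrogate here: it holds the peak surface temperature for the full equivalent duration, so it bounds the triangular history's structural response from above.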

  11. Incorporating nuclear vibrational energies into the "atom in molecules" analysis: An analytical study

    NASA Astrophysics Data System (ADS)

    Gharabaghi, Masumeh; Shahbazian, Shant

    2017-04-01

    The quantum theory of atoms in molecules (QTAIM) is based on the clamped-nucleus paradigm and works solely with electronic wavefunctions, so it does not include nuclear vibrations in the AIM analysis. On the other hand, the recently extended version of the QTAIM, called the multi-component QTAIM (MC-QTAIM), incorporates both electrons and quantum nuclei, i.e., those nuclei treated as quantum waves instead of clamped point charges, into the AIM analysis using non-adiabatic wavefunctions. Thus, the MC-QTAIM is the natural framework in which to incorporate the role of nuclear vibrations into the AIM analysis. In this study, within the context of the MC-QTAIM, the formalism of including nuclear vibrational energy in the atomic basin energy is developed in detail and its contribution is derived analytically using the recently proposed non-adiabatic Hartree product nuclear wavefunction. It is demonstrated that within the context of this wavefunction, the quantum nuclei may be conceived pseudo-adiabatically as quantum oscillators, and both isotropic harmonic and anisotropic anharmonic oscillator models are used to compute the zero-point nuclear vibrational energy contribution to the basin energies explicitly. Inspired by the results gained within the context of the MC-QTAIM analysis, a heuristic approach is proposed within the context of the QTAIM to include nuclear vibrational energy in the basin energy from the vibrational wavefunction derived adiabatically. The explicit calculation of the basin contribution of the zero-point vibrational energy using the uncoupled harmonic oscillator model leads to results consistent with those derived from the MC-QTAIM.

  12. Incorporating nuclear vibrational energies into the "atom in molecules" analysis: An analytical study.

    PubMed

    Gharabaghi, Masumeh; Shahbazian, Shant

    2017-04-21

    The quantum theory of atoms in molecules (QTAIM) is based on the clamped-nucleus paradigm and works solely with electronic wavefunctions, so it does not include nuclear vibrations in the AIM analysis. On the other hand, the recently extended version of the QTAIM, called the multi-component QTAIM (MC-QTAIM), incorporates both electrons and quantum nuclei, i.e., those nuclei treated as quantum waves instead of clamped point charges, into the AIM analysis using non-adiabatic wavefunctions. Thus, the MC-QTAIM is the natural framework in which to incorporate the role of nuclear vibrations into the AIM analysis. In this study, within the context of the MC-QTAIM, the formalism of including nuclear vibrational energy in the atomic basin energy is developed in detail and its contribution is derived analytically using the recently proposed non-adiabatic Hartree product nuclear wavefunction. It is demonstrated that within the context of this wavefunction, the quantum nuclei may be conceived pseudo-adiabatically as quantum oscillators, and both isotropic harmonic and anisotropic anharmonic oscillator models are used to compute the zero-point nuclear vibrational energy contribution to the basin energies explicitly. Inspired by the results gained within the context of the MC-QTAIM analysis, a heuristic approach is proposed within the context of the QTAIM to include nuclear vibrational energy in the basin energy from the vibrational wavefunction derived adiabatically. The explicit calculation of the basin contribution of the zero-point vibrational energy using the uncoupled harmonic oscillator model leads to results consistent with those derived from the MC-QTAIM.
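    The uncoupled harmonic oscillator contribution mentioned at the end, ZPE = (1/2) Σ h·ν_i, is straightforward to evaluate; for example, from approximate harmonic wavenumbers of water's three normal modes (the values below are textbook approximations, not the paper's data):

```python
# Zero-point vibrational energy under the uncoupled harmonic oscillator
# model: ZPE = (1/2) * sum(h * c * wavenumber) per molecule.
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e10       # speed of light, cm/s
NA = 6.02214076e23      # Avogadro constant, 1/mol

def zpe_kj_per_mol(wavenumbers_cm1):
    e_per_molecule = 0.5 * sum(H * C * w for w in wavenumbers_cm1)
    return e_per_molecule * NA / 1000.0

# Water's three normal modes (approximate wavenumbers, cm^-1): bend,
# symmetric stretch, antisymmetric stretch.
print(f"{zpe_kj_per_mol([1595.0, 3657.0, 3756.0]):.1f} kJ/mol")
```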

  13. Promising Ideas for Collective Advancement of Communal Knowledge Using Temporal Analytics and Cluster Analysis

    ERIC Educational Resources Information Center

    Lee, Alwyn Vwen Yen; Tan, Seng Chee

    2017-01-01

    Understanding ideas in a discourse is challenging, especially in textual discourse analysis. We propose using temporal analytics with unsupervised machine learning techniques to investigate promising ideas for the collective advancement of communal knowledge in an online knowledge building discourse. A discourse unit network was constructed and…

  14. An analytical model accounting for tip shape evolution during atom probe analysis of heterogeneous materials.

    PubMed

    Rolland, N; Larson, D J; Geiser, B P; Duguay, S; Vurpillot, F; Blavette, D

    2015-12-01

    An analytical model describing the field evaporation dynamics of a tip made of a thin layer deposited on a substrate is presented in this paper. The difference in evaporation field between the materials is taken into account in this approach, in which the tip shape is modeled at a mesoscopic scale. It was found that the non-existence of a sharp edge on the surface is a sufficient condition to derive the morphological evolution during successive evaporation of the layers. This modeling gives an instantaneous and smooth analytical representation of the surface that shows good agreement with finite-difference simulation results, and a specific regime of evaporation was highlighted when the substrate is a low-evaporation-field phase. In addition, the model makes it possible to calculate the analyzed volume of the tip theoretically, potentially opening up new horizons for atom probe tomographic reconstruction. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Labyrinth Seal Analysis. Volume 3. Analytical and Experimental Development of a Design Model for Labyrinth Seals

    DTIC Science & Technology

    1986-01-01

    ...the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an... labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an *analysis* model and an improved empirical *design* model. Supporting rig tests...

  16. Meta-Analytically Informed Network Analysis of Resting State fMRI Reveals Hyperconnectivity in an Introspective Socio-Affective Network in Depression

    PubMed Central

    Schilbach, Leonhard; Müller, Veronika I.; Hoffstaedter, Felix; Clos, Mareike; Goya-Maldonado, Roberto

    2014-01-01

    Alterations of social cognition and dysfunctional interpersonal expectations are thought to play an important role in the etiology of depression and have, thus, become a key target of psychotherapeutic interventions. The underlying neurobiology, however, remains elusive. Based upon the idea of a close link between affective and introspective processes relevant for social interactions and alterations thereof in states of depression, we used a meta-analytically informed network analysis to investigate resting-state functional connectivity in an introspective socio-affective (ISA) network in individuals with and without depression. Results of our analysis demonstrate significant differences between the groups with depressed individuals showing hyperconnectivity of the ISA network. These findings demonstrate that neurofunctional alterations exist in individuals with depression in a neural network relevant for introspection and socio-affective processing, which may contribute to the interpersonal difficulties that are linked to depressive symptomatology. PMID:24759619
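    The basic quantity behind such a resting-state connectivity analysis is a pairwise correlation matrix of regional time series; a toy sketch follows (random data with one induced coupling, not the study's fMRI data):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical resting-state data: 5 network regions x 200 time points.
n_regions, n_tp = 5, 200
timeseries = rng.standard_normal((n_regions, n_tp))
timeseries[1] += 0.8 * timeseries[0]    # induce coupling between regions 0 and 1

# Functional connectivity = pairwise Pearson correlation of ROI time series.
fc = np.corrcoef(timeseries)

# Mean off-diagonal connectivity: the kind of network-level statistic
# compared between groups (e.g., depressed vs. control).
mask = ~np.eye(n_regions, dtype=bool)
print(f"r(0,1) = {fc[0, 1]:.2f}, mean off-diagonal = {fc[mask].mean():.2f}")
```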

  17. Climate Analytics as a Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.

    2014-01-01

    Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
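    The MapReduce pattern used by MERRA/AS can be illustrated at toy scale: map each data chunk to a partial (sum, count) pair, then reduce the partials associatively into a global statistic. The data below are hypothetical:

```python
from functools import reduce

# Hypothetical gridded temperature anomalies, split into chunks the way a
# MapReduce framework would distribute them across workers.
chunks = [
    [0.2, 0.5, -0.1, 0.4],
    [0.7, 0.3, 0.1],
    [-0.2, 0.6, 0.8, 0.0, 0.4],
]

# Map: each chunk -> a partial (sum, count).
mapped = [(sum(c), len(c)) for c in chunks]

# Reduce: combine the partials associatively, then finalize the global mean.
total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)
print(f"global mean anomaly = {total / count:.3f}")
```

Because the combine step is associative, the partials can be reduced in any order or grouping, which is what lets the framework parallelize the computation.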

  18. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  19. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  20. An activity-based methodology for operations cost analysis

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David; Bilby, Curt; Frizzell, R. A.

    1991-01-01

    This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.

  1. Functional magnetic resonance imaging during emotion recognition in social anxiety disorder: an activation likelihood meta-analysis

    PubMed Central

    Hattingh, Coenraad J.; Ipser, J.; Tromp, S. A.; Syal, S.; Lochner, C.; Brooks, S. J.; Stein, D. J.

    2012-01-01

    Background: Social anxiety disorder (SAD) is characterized by abnormal fear and anxiety in social situations. Functional magnetic resonance imaging (fMRI) is a brain imaging technique that can be used to demonstrate neural activation to emotionally salient stimuli. However, no attempt has yet been made to statistically collate fMRI studies of brain activation, using the activation likelihood-estimate (ALE) technique, in response to emotion recognition tasks in individuals with SAD. Methods: A systematic search of fMRI studies of neural responses to socially emotive cues in SAD was undertaken. ALE meta-analysis, a voxel-based meta-analytic technique, was used to estimate the most significant activations during emotional recognition. Results: Seven studies were eligible for inclusion in the meta-analysis, constituting a total of 91 subjects with SAD, and 93 healthy controls. The most significant areas of activation during emotional vs. neutral stimuli in individuals with SAD compared to controls were: bilateral amygdala, left medial temporal lobe encompassing the entorhinal cortex, left medial aspect of the inferior temporal lobe encompassing perirhinal cortex and parahippocampus, right anterior cingulate, right globus pallidus, and distal tip of right postcentral gyrus. Conclusion: The results are consistent with neuroanatomic models of the role of the amygdala in fear conditioning, and the importance of the limbic circuitry in mediating anxiety symptoms. PMID:23335892

  2. Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).

    PubMed

    Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D

    2017-01-01

    Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.

  3. Instrumental Analysis of Biodiesel Content in Commercial Diesel Blends: An Experiment for Undergraduate Analytical Chemistry

    ERIC Educational Resources Information Center

    Feng, Z. Vivian; Buchman, Joseph T.

    2012-01-01

    The potential of replacing petroleum fuels with renewable biofuels has drawn significant public interest. Many states have imposed biodiesel mandates or incentives to use commercial biodiesel blends. We present an inquiry-driven experiment where students are given the tasks to gather samples, develop analytical methods using various instrumental…

  4. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics and the related issues as they apply to visual analytics and identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process.
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  5. Characterising infant inter-breath interval patterns during active and quiet sleep using recurrence plot analysis.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M

    2009-01-01

    Breathing patterns are characteristically different between active and quiet sleep states in infants. It has been previously identified that breathing dynamics are governed by a nonlinear controller, which implies the need for a nonlinear analytical tool. Further, it has been shown that quantified nonlinear variables are different between adult sleep states. This study aims to determine whether a nonlinear analytical tool known as recurrence plot analysis can characterize breath intervals of active and quiet sleep states in infants. Overnight polysomnograms were obtained from 32 healthy infants. The 6 longest periods each of active and quiet sleep were identified and a software routine extracted inter-breath interval data for recurrence plot analysis. Determinism (DET), laminarity (LAM) and radius (RAD) values were calculated for an embedding dimension of 4, 6, 8 and 16, and fixed recurrence of 0.5, 1, 2, 3.5 and 5%. Recurrence plots exhibited characteristically different patterns for active and quiet sleep. Active sleep periods typically had higher values of RAD, DET and LAM than quiet sleep, and this trend was invariant to a specific choice of embedding dimension or fixed recurrence. These differences may provide a basis for automated sleep state classification, and the quantitative investigation of pathological breathing patterns.
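The pipeline this record describes (time-delay embedding, a radius chosen to hit a fixed recurrence rate, and the determinism measure DET) can be sketched as below. This is a generic recurrence-quantification illustration, not the study's code; the synthetic inter-breath-interval series, embedding dimension, and 5% recurrence rate are stand-in values.

```python
import numpy as np

def embed(series, dim, tau=1):
    # Time-delay embedding: each row is a delay vector of length `dim`.
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def recurrence_matrix(series, dim, rate):
    # Fixed-recurrence construction: choose the radius as the `rate`
    # quantile of all pairwise distances, so ~rate of point pairs recur.
    vecs = embed(series, dim)
    d = np.linalg.norm(vecs[:, None, :] - vecs[None, :, :], axis=2)
    radius = np.quantile(d[np.triu_indices_from(d, k=1)], rate)
    return (d <= radius).astype(int), radius

def determinism(R, lmin=2):
    # DET: fraction of recurrent points lying on diagonal lines of
    # length >= lmin (the line of identity is excluded).
    lengths = []
    for k in range(1, R.shape[0]):
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    total = sum(lengths)
    return sum(l for l in lengths if l >= lmin) / total if total else 0.0

# Illustrative quasi-periodic "inter-breath intervals" (seconds):
rng = np.random.default_rng(0)
ibi = 2.0 + 0.1 * np.sin(np.arange(200) / 5) + 0.02 * rng.standard_normal(200)
R, radius = recurrence_matrix(ibi, dim=4, rate=0.05)
det = determinism(R)  # more regular breathing dynamics give higher DET
```

Sweeping `dim` over 4, 6, 8, 16 and `rate` over 0.5-5%, as the study did, would test whether the DET contrast between sleep states is robust to those choices.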

  6. Assembling the puzzle for promoting physical activity in Brazil: a social network analysis.

    PubMed

    Brownson, Ross C; Parra, Diana C; Dauti, Marsela; Harris, Jenine K; Hallal, Pedro C; Hoehner, Christine; Malta, Deborah Carvalho; Reis, Rodrigo S; Ramos, Luiz Roberto; Ribeiro, Isabela C; Soares, Jesus; Pratt, Michael

    2010-07-01

    Physical inactivity is a significant public health problem in Brazil that may be addressed by partnerships and networks. In conjunction with Project GUIA (Guide for Useful Interventions for Physical Activity in Brazil and Latin America), the aim of this study was to conduct a social network analysis of physical activity in Brazil. An online survey was completed by 28 of 35 organizations contacted from December 2008 through March 2009. Network analytic methods examined measures of collaboration, importance, leadership, and attributes of the respondent and organization. Leadership nominations for organizations studied ranged from 0 to 23. Positive predictors of collaboration included: south region, GUIA membership, years working in physical activity, and research, education, and promotion/practice areas of physical activity. The most frequently reported barrier to collaboration was bureaucracy. Social network analysis identified factors that are likely to improve collaboration among organizations in Brazil.

  7. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here to also reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass, and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
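For context, a common data-reduction step behind SP-ICPMS work of this kind (a generic sketch, not the authors' protocol) is separating particle-event readings from the dissolved background with an iterative mean-plus-k-sigma threshold; the Poisson background and spike heights below are purely illustrative.

```python
import numpy as np

def split_particle_events(signal, k=3.0, max_iter=20):
    # Iteratively estimate the dissolved background: compute mean + k*sigma,
    # drop readings above it, and repeat until the estimate stabilises.
    # Readings above the converged threshold are treated as particle events.
    s = np.asarray(signal, float)
    background = s
    for _ in range(max_iter):
        thresh = background.mean() + k * background.std()
        keep = background <= thresh
        if keep.all():
            break
        background = background[keep]
    thresh = background.mean() + k * background.std()
    return s[s > thresh], s[s <= thresh], thresh

# Synthetic trace: Poisson dissolved background plus 10 particle spikes.
rng = np.random.default_rng(1)
signal = rng.poisson(10, 1000).astype(float)
signal[::100] += 500.0
particles, dissolved, thresh = split_particle_events(signal)
```

The record's point is visible in this toy setup: diluting the dissolved fraction lowers the background mean and sigma, pulling the threshold down and making particle events easier to resolve.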

  8. Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates: Phase I Results

    NASA Technical Reports Server (NTRS)

    Wells, D. N.; Allen, P. A.

    2012-01-01

    An analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted with 15 participants. Experimental results from a surface crack tension test in 2219-T8 aluminum plate provided the basis for the inter-laboratory study (ILS). The study proceeded in a blind fashion given that the analysis methodology was not specified to the participants, and key experimental results were withheld. This approach allowed the ILS to serve as a current measure of the state of the art for elastic-plastic fracture mechanics analysis. The analytical results and the associated methodologies were collected for comparison, and sources of variability were studied and isolated. The results of the study revealed that the J-integral analysis methodology using the domain integral method is robust, providing reliable J-integral values without being overly sensitive to modeling details. General modeling choices such as analysis code, model size (mesh density), crack tip meshing, or boundary conditions, were not found to be sources of significant variability. For analyses controlled only by far-field boundary conditions, the greatest source of variability in the J-integral assessment is introduced through the constitutive model. This variability can be substantially reduced by using crack mouth opening displacements to anchor the assessment. Conclusions provide recommendations for analysis standardization.

  9. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, Lawrence M.

    1990-01-01

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte.

  10. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, L.M.

    1990-10-16

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte. 5 figs.

  11. Retail video analytics: an overview and survey

    NASA Astrophysics Data System (ADS)

    Connell, Jonathan; Fan, Quanfu; Gabbur, Prasad; Haas, Norman; Pankanti, Sharath; Trinh, Hoang

    2013-03-01

    Today retail video analytics has gone beyond the traditional domain of security and loss prevention by providing retailers insightful business intelligence such as store traffic statistics and queue data. Such information allows for enhanced customer experience, optimized store performance, reduced operational costs, and ultimately higher profitability. This paper gives an overview of various camera-based applications in retail as well as the state-of-the-art computer vision techniques behind them. It also presents some of the promising technical directions for exploration in retail video analytics.

  12. Demonstrating the use of web analytics and an online survey to understand user groups of a national network of river level data

    NASA Astrophysics Data System (ADS)

    Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene

    2016-04-01

    The number of local, national and international networks of online environmental sensors is rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data, as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors maintained by the Scottish Environmental Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package we assessed: whether the quality of data provided by Google Analytics' free service is good enough for research purposes; what sensors were being used, when and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile these users based on information from traffic sources. Web analytics data consists of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several different major user groups. To improve our understanding of how the use of this national

  13. On the Multilevel Nature of Meta-Analysis: A Tutorial, Comparison of Software Programs, and Discussion of Analytic Choices.

    PubMed

    Pastor, Dena A; Lazowski, Rory A

    2018-01-01

    The term "multilevel meta-analysis" is encountered not only in applied research studies, but also in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since any meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis, the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs, underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike, and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.
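The two-level structure the tutorial refers to can be made concrete with the standard DerSimonian-Laird random-effects estimator: level 1 is each study's sampling variance, level 2 is the between-study heterogeneity tau^2. This is a textbook sketch with made-up effect sizes, not the tutorial's example data or the metafor implementation.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    # Level 1: within-study sampling variances v_i.
    # Level 2: between-study heterogeneity tau^2 (method of moments).
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                            # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)       # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_re = 1.0 / (v + tau2)                # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se, tau2

# Four hypothetical study effects with equal sampling variances:
mu, se, tau2 = dersimonian_laird([0.1, 0.5, 0.9, 0.3], [0.01] * 4)
```

Because tau^2 enters every weight, a nonzero level-2 variance widens the standard error relative to the fixed-effect fit, which is exactly the multilevel point the tutorial makes.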

  14. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.

    PubMed

    Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco

    2017-07-24

    DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein, how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The effects of physical activity on sleep: a meta-analytic review.

    PubMed

    Kredlow, M Alexandra; Capozzoli, Michelle C; Hearon, Bridget A; Calkins, Amanda W; Otto, Michael W

    2015-06-01

    A significant body of research has investigated the effects of physical activity on sleep, yet this research has not been systematically aggregated in over a decade. As a result, the magnitude and moderators of these effects are unclear. This meta-analytical review examines the effects of acute and regular exercise on sleep, incorporating a range of outcome and moderator variables. PubMed and PsycINFO were used to identify 66 studies for inclusion in the analysis that were published through May 2013. Analyses reveal that acute exercise has small beneficial effects on total sleep time, sleep onset latency, sleep efficiency, stage 1 sleep, and slow wave sleep, a moderate beneficial effect on wake time after sleep onset, and a small effect on rapid eye movement sleep. Regular exercise has small beneficial effects on total sleep time and sleep efficiency, small-to-medium beneficial effects on sleep onset latency, and moderate beneficial effects on sleep quality. Effects were moderated by sex, age, baseline physical activity level of participants, as well as exercise type, time of day, duration, and adherence. Significant moderation was not found for exercise intensity, aerobic/anaerobic classification, or publication date. Results were discussed with regards to future avenues of research and clinical application to the treatment of insomnia.

  17. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte signal theory. The results have strong implications in the planning of multiway analytical experiments.

  18. Modeling Patterns of Activities using Activity Curves

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen

    2016-01-01

    Pervasive computing offers an unprecedented opportunity to unobtrusively monitor behavior and use the large amount of collected data to perform analysis of activity-based behavioral patterns. In this paper, we introduce the notion of an activity curve, which represents an abstraction of an individual’s normal daily routine based on automatically-recognized activities. We propose methods to detect changes in behavioral routines by comparing activity curves and use these changes to analyze the possibility of changes in cognitive or physical health. We demonstrate our model and evaluate our change detection approach using a longitudinal smart home sensor dataset collected from 18 smart homes with older adult residents. Finally, we demonstrate how big data-based pervasive analytics such as activity curve-based change detection can be used to perform functional health assessment. Our evaluation indicates that correlations do exist between behavior and health changes and that these changes can be automatically detected using smart homes, machine learning, and big data-based pervasive analytics. PMID:27346990

  19. Modeling Patterns of Activities using Activity Curves.

    PubMed

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2016-06-01

    Pervasive computing offers an unprecedented opportunity to unobtrusively monitor behavior and use the large amount of collected data to perform analysis of activity-based behavioral patterns. In this paper, we introduce the notion of an activity curve , which represents an abstraction of an individual's normal daily routine based on automatically-recognized activities. We propose methods to detect changes in behavioral routines by comparing activity curves and use these changes to analyze the possibility of changes in cognitive or physical health. We demonstrate our model and evaluate our change detection approach using a longitudinal smart home sensor dataset collected from 18 smart homes with older adult residents. Finally, we demonstrate how big data-based pervasive analytics such as activity curve-based change detection can be used to perform functional health assessment. Our evaluation indicates that correlations do exist between behavior and health changes and that these changes can be automatically detected using smart homes, machine learning, and big data-based pervasive analytics.

  20. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    PubMed

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close-geometry gamma spectroscopy setup. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test calculating the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" statuses in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Streaming Visual Analytics Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Burtner, Edwin R.; Kritzstein, Brian P.

    How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst’s rapid orientation and reorientation as situations change. It provides an environment to support the analyst’s critical thinking. It infers tasks and interests based on the analyst’s interactions. The system works as both an assistant and a devil’s advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.

  2. Experimental analysis of tablet properties for discrete element modeling of an active coating process.

    PubMed

    Just, Sarah; Toschkoff, Gregor; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes; Knop, Klaus; Kleinebudde, Peter

    2013-03-01

    Coating of solid dosage forms is an important unit operation in the pharmaceutical industry. In recent years, numerical simulations of drug manufacturing processes have been gaining interest as process analytical technology tools. The discrete element method (DEM) in particular is suitable to model tablet-coating processes. For the development of accurate simulations, information on the material properties of the tablets is required. In this study, the mechanical parameters Young's modulus, coefficient of restitution (CoR), and coefficients of friction (CoF) of gastrointestinal therapeutic systems (GITS) and of active-coated GITS were measured experimentally. The dynamic angle of repose of these tablets in a drum coater was investigated to revise the CoF. The resulting values were used as input data in DEM simulations to compare simulation and experiment. A mean value of Young's modulus of 31.9 MPa was determined by the uniaxial compression test. The CoR was found to be 0.78. For both tablet-steel and tablet-tablet friction, active-coated GITS showed a higher CoF compared with GITS. According to the values of the dynamic angle of repose, the CoF was adjusted to obtain consistent tablet motion in the simulation and in the experiment. On the basis of this experimental characterization, mechanical parameters are integrated into DEM simulation programs to perform numerical analysis of coating processes.
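One of the measured parameters has a simple kinematic interpretation worth noting (a generic free-fall relation, not necessarily the authors' measurement procedure): in a drop test the coefficient of restitution follows directly from drop and rebound heights, since the gravitational terms cancel. The heights below are hypothetical illustrations.

```python
import math

def restitution_from_heights(h_drop, h_rebound):
    # Free-fall kinematics: impact speed v = sqrt(2*g*h), so the ratio of
    # rebound to impact speed reduces to e = sqrt(h_rebound / h_drop).
    if not 0.0 < h_rebound <= h_drop:
        raise ValueError("expect 0 < h_rebound <= h_drop")
    return math.sqrt(h_rebound / h_drop)

# A tablet dropped from 100 mm rebounding to 61 mm (illustrative heights):
e = restitution_from_heights(0.100, 0.061)  # ~0.78, comparable to the CoR above
```

A CoR near 0.78, as reported for the GITS tablets, would correspond to a rebound of roughly 61% of the drop height under this idealised relation.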

  3. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are desirable over other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
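The fork-join difficulty the authors describe can be illustrated with a deliberately crude queueing sketch. This is not the paper's model: it treats each disk as an independent M/M/1 queue and scales the per-disk response time by the harmonic number H_k, which is the mean of the maximum of k unit-mean exponential random variables, to approximate waiting for all k sub-requests of a striped request:

```python
import math

def mm1_response_time(arrival_rate: float, service_time: float) -> float:
    """M/M/1 response time R = s / (1 - rho), with utilization rho = lambda * s."""
    rho = arrival_rate * service_time
    if rho >= 1.0:
        raise ValueError("system is saturated (utilization >= 1)")
    return service_time / (1.0 - rho)

def fork_join_response_time(arrival_rate: float, service_time: float, k: int) -> float:
    """Crude fork-join estimate: a request completes only when all k disk
    sub-requests finish, approximated by scaling with the harmonic number H_k."""
    h_k = sum(1.0 / i for i in range(1, k + 1))
    return h_k * mm1_response_time(arrival_rate, service_time)

# 50 requests/s per disk, 10 ms mean service, striped over 4 disks
print(fork_join_response_time(50.0, 0.01, 4))
```

Even this toy version shows the qualitative effect the paper models rigorously: striping a request over more disks increases the synchronization penalty, so the optimal striping unit balances parallelism against fork-join waiting.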

  4. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    2017-05-08

Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  5. Evaluation of the matrix effect on gas chromatography--mass spectrometry with carrier gas containing ethylene glycol as an analyte protectant.

    PubMed

    Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi

    2016-02-19

The consequences of matrix effects in GC are a major issue of concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on the continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated for the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting the GC-MS performance. Calibration curves for ethylene glycol in GC-MS systems with various degrees of pollution were compared, and similar response enhancements were observed. This result suggests a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement, in which a mixture of analyte protectants is added to both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  7. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data. PMID:20525257

  8. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  9. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  10. Decisions through data: analytics in healthcare.

    PubMed

    Wills, Mary J

    2014-01-01

    The amount of data in healthcare is increasing at an astonishing rate. However, in general, the industry has not deployed the level of data management and analysis necessary to make use of those data. As a result, healthcare executives face the risk of being overwhelmed by a flood of unusable data. In this essay I argue that, in order to extract actionable information, leaders must take advantage of the promise of data analytics. Small data, predictive modeling expansion, and real-time analytics are three forms of data analytics. On the basis of my analysis for this study, I recommend all three for adoption. Recognizing the uniqueness of each organization's situation, I also suggest that practices, hospitals, and healthcare systems examine small data and conduct real-time analytics and that large-scale organizations managing populations of patients adopt predictive modeling. I found that all three solutions assist in the collection, management, and analysis of raw data to improve the quality of care and decrease costs.

  11. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops ( Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  12. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. The perturbations, such as the non-spherical gravity of Earth and the third-body perturbations due to the Sun and Moon, are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without including the perturbations, and then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
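For intuition about the departure phase, a textbook unperturbed patched-conic calculation (not the authors' biased iterative technique) gives the burn needed to leave a circular parking orbit with a given hyperbolic excess speed. The parking-orbit radius and excess speed below are illustrative values, not from the paper:

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def departure_delta_v(v_infinity_km_s: float, periapsis_radius_km: float) -> float:
    """Patched-conic departure burn from a circular parking orbit:
    speed on the escape hyperbola at periapsis, v_p = sqrt(v_inf^2 + 2*mu/r_p),
    minus the circular orbit speed, sqrt(mu/r_p)."""
    v_hyperbolic = math.sqrt(v_infinity_km_s**2 + 2.0 * MU_EARTH / periapsis_radius_km)
    v_circular = math.sqrt(MU_EARTH / periapsis_radius_km)
    return v_hyperbolic - v_circular

# 300 km altitude parking orbit, v_inf = 3 km/s (C3 = 9 km^2/s^2, a Mars-class departure)
dv = departure_delta_v(3.0, 6678.0)
print(round(dv, 3))
```

The paper's contribution is precisely that this closed-form picture is then corrected analytically for J2 and lunisolar perturbations, rather than by numerical integration.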

  13. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified by using the time-stepping finite element method: the performances of the PMV machine computed by the finite element method are quantitatively compared with the analytical results, and the two agree well. Finally, experimental results are given to further show the validity of the analysis.

  14. Directivity analysis of meander-line-coil EMATs with a wholly analytical method.

    PubMed

    Xie, Yuedong; Liu, Zenghua; Yin, Liyuan; Wu, Jiande; Deng, Peng; Yin, Wuliang

    2017-01-01

This paper presents a simulation and experimental study of the radiation pattern of a meander-line-coil EMAT. A wholly analytical method, which couples two models, an analytical EM model and an analytical UT model, has been developed to build EMAT models and analyse the beam directivity of the Rayleigh waves. For a specific sensor configuration, Lorentz forces are calculated using the analytical EM method, which is adapted from the classic Deeds and Dodd solution. The calculated Lorentz force density is imported to an analytical ultrasonic model as driven point sources, which produce the Rayleigh waves within a layered medium. The effect of the length of the meander-line coil on the beam directivity of the Rayleigh waves is analysed quantitatively and verified experimentally. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.
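As a toy illustration of steering by user-supplied key terms (the paper's data-characterization process is more sophisticated), one can up-weight those terms in a bag-of-words vector before similarity ranking. The boost factor here is an arbitrary assumption:

```python
from collections import Counter
import math

def steer(doc_tokens, key_terms, boost=3.0):
    """Bag-of-words term weights with user-supplied key terms up-weighted,
    a simplified stand-in for steering the analytic view via context terms."""
    weights = Counter(doc_tokens)
    return {t: w * (boost if t in key_terms else 1.0) for t, w in weights.items()}

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dictionaries."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Steering a query vector toward, say, {"fraud"} pulls fraud-heavy documents up the ranking without removing other terms, which mirrors the paper's point that steering should reorient, not filter, the discovery process.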

  16. Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions

    ERIC Educational Resources Information Center

    Berge, Maria; Ingerman, Åke

    2017-01-01

    Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…

  17. A compact two-wave dichrometer of an optical biosensor analytical system for medicine

    NASA Astrophysics Data System (ADS)

    Chulkov, D. P.; Gusev, V. M.; Kompanets, O. N.; Vereschagin, F. V.; Skuridin, S. G.; Yevdokimov, Yu. M.

    2017-01-01

An experimental model of a compact two-wave dichrometer based on LEDs has been developed that is well suited to work with "liquid" DNA nanoconstructions as biosensing units. The mobile and inexpensive device is intended for use in a biosensor analytical system for the rapid determination of biologically active compounds in liquids, addressing practical problems of clinical medicine and pharmacology.

  18. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  19. A conflict of analysis: analytical chemistry and milk adulteration in Victorian Britain.

    PubMed

    Steere-Williams, Jacob

    2014-08-01

    This article centres on a particularly intense debate within British analytical chemistry in the late nineteenth century, between local public analysts and the government chemists of the Inland Revenue Service. The two groups differed in both practical methodologies and in the interpretation of analytical findings. The most striking debates in this period were related to milk analysis, highlighted especially in Victorian courtrooms. It was in protracted court cases, such as the well known Manchester Milk Case in 1883, that analytical chemistry was performed between local public analysts and the government chemists, who were often both used as expert witnesses. Victorian courtrooms were thus important sites in the context of the uneven professionalisation of chemistry. I use this tension to highlight what Christopher Hamlin has called the defining feature of Victorian public health, namely conflicts of professional jurisdiction, which adds nuance to histories of the struggle of professionalisation and public credibility in analytical chemistry.

  20. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  1. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  2. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be practically applicable because of the relevance of its results.
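The RGB-channel step can be sketched with NumPy. The synthetic "indigo" pixel values below are invented for illustration; the real assay photographs an actual indoxylacetate strip and relates channel intensities to BChE activity via a calibration:

```python
import numpy as np

def mean_rgb(image: np.ndarray) -> tuple:
    """Mean intensity of each color channel of an H x W x 3 RGB image,
    the per-channel values the assay reads from a photographed test strip."""
    if image.ndim != 3 or image.shape[2] != 3:
        raise ValueError("expected an H x W x 3 RGB array")
    r, g, b = image.reshape(-1, 3).mean(axis=0)
    return float(r), float(g), float(b)

# Synthetic 'strip' photo: indigo-like pixels (low red, low green, high blue)
strip = np.tile(np.array([40, 50, 180], dtype=np.uint8), (64, 64, 1))
print(mean_rgb(strip))  # (40.0, 50.0, 180.0)
```

Averaging over the whole strip region, as here, also suppresses pixel noise, one reason a simple camera can deliver usable quantitative readings.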

  3. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  4. Analytical procedure for the determination of Ethyl Lauroyl Arginate (LAE) to assess the kinetics and specific migration from a new antimicrobial active food packaging.

    PubMed

    Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina

    2012-10-01

Ethyl Lauroyl Arginate (LAE) is a cationic tensoactive compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after migration tests. It consists of the formation of an ionic pair between LAE and the inorganic complex Co(SCN)(4)(2-) in aqueous solution, followed by a liquid-liquid extraction into a suitable organic solvent and subsequent UV-Vis absorbance measurement. In order to evaluate possible interferences, the ionic pair was also analyzed by high performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg(-1), linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg(-1), respectively, accuracy better than 1% as relative error, and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, the thermal and chemical stability of LAE, and the migration kinetics of LAE from an experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
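Figures of merit like the limits above are commonly derived from a linear calibration curve; one widespread (ICH-style) convention uses LOD = 3.3·σ/slope and LOQ = 10·σ/slope, with σ the standard error of the regression. The concentration/absorbance pairs below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical calibration points (mg/kg vs. absorbance); a real curve would
# come from ionic-pair UV-Vis measurements of LAE standards.
conc = np.array([1.10, 5.0, 10.0, 15.0, 20.0, 25.0])
absorbance = np.array([0.021, 0.095, 0.190, 0.287, 0.382, 0.475])

slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = residuals.std(ddof=2)   # standard error of the regression (n - 2 dof)

lod = 3.3 * sigma / slope       # limit of detection
loq = 10.0 * sigma / slope      # limit of quantification
print(slope, lod, loq)
```

The 1.10 mg/kg lower end of the linear range matching the LOQ in the abstract is consistent with this kind of derivation, though the authors may have used a different convention.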

  5. Error-analysis and comparison to analytical models of numerical waveforms produced by the NRAR Collaboration

    NASA Astrophysics Data System (ADS)

    Hinder, Ian; Buonanno, Alessandra; Boyle, Michael; Etienne, Zachariah B.; Healy, James; Johnson-McDaniel, Nathan K.; Nagar, Alessandro; Nakano, Hiroyuki; Pan, Yi; Pfeiffer, Harald P.; Pürrer, Michael; Reisswig, Christian; Scheel, Mark A.; Schnetter, Erik; Sperhake, Ulrich; Szilágyi, Bela; Tichy, Wolfgang; Wardell, Barry; Zenginoğlu, Anıl; Alic, Daniela; Bernuzzi, Sebastiano; Bode, Tanja; Brügmann, Bernd; Buchman, Luisa T.; Campanelli, Manuela; Chu, Tony; Damour, Thibault; Grigsby, Jason D.; Hannam, Mark; Haas, Roland; Hemberger, Daniel A.; Husa, Sascha; Kidder, Lawrence E.; Laguna, Pablo; London, Lionel; Lovelace, Geoffrey; Lousto, Carlos O.; Marronetti, Pedro; Matzner, Richard A.; Mösta, Philipp; Mroué, Abdul; Müller, Doreen; Mundim, Bruno C.; Nerozzi, Andrea; Paschalidis, Vasileios; Pollney, Denis; Reifenberger, George; Rezzolla, Luciano; Shapiro, Stuart L.; Shoemaker, Deirdre; Taracchini, Andrea; Taylor, Nicholas W.; Teukolsky, Saul A.; Thierfelder, Marcus; Witek, Helvi; Zlochower, Yosef

    2013-01-01

    The Numerical-Relativity-Analytical-Relativity (NRAR) collaboration is a joint effort between members of the numerical relativity, analytical relativity and gravitational-wave data analysis communities. The goal of the NRAR collaboration is to produce numerical-relativity simulations of compact binaries and use them to develop accurate analytical templates for the LIGO/Virgo Collaboration to use in detecting gravitational-wave signals and extracting astrophysical information from them. We describe the results of the first stage of the NRAR project, which focused on producing an initial set of numerical waveforms from binary black holes with moderate mass ratios and spins, as well as one non-spinning binary configuration which has a mass ratio of 10. All of the numerical waveforms are analysed in a uniform and consistent manner, with numerical errors evaluated using an analysis code created by members of the NRAR collaboration. We compare previously-calibrated, non-precessing analytical waveforms, notably the effective-one-body (EOB) and phenomenological template families, to the newly-produced numerical waveforms. We find that when the binary's total mass is ˜100-200M⊙, current EOB and phenomenological models of spinning, non-precessing binary waveforms have overlaps above 99% (for advanced LIGO) with all of the non-precessing-binary numerical waveforms with mass ratios ⩽4, when maximizing over binary parameters. This implies that the loss of event rate due to modelling error is below 3%. Moreover, the non-spinning EOB waveforms previously calibrated to five non-spinning waveforms with mass ratio smaller than 6 have overlaps above 99.7% with the numerical waveform with a mass ratio of 10, without even maximizing on the binary parameters.
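The overlap (match) statistic quoted above can be illustrated in a simplified, flat-noise form as a normalized inner product maximized over relative time shifts; a real LIGO match weights the inner product by the detector noise spectrum and also maximizes over phase and other parameters. The chirp signal below is a synthetic stand-in, not an NRAR waveform:

```python
import numpy as np

def overlap(h1: np.ndarray, h2: np.ndarray) -> float:
    """Normalized inner product <h1,h2> / sqrt(<h1,h1><h2,h2>), maximized over
    circular time shifts via FFT cross-correlation (flat-noise simplification)."""
    corr = np.fft.irfft(np.fft.rfft(h1) * np.conj(np.fft.rfft(h2)), n=len(h1))
    norm = np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))
    return float(np.max(corr) / norm)

t = np.linspace(0.0, 1.0, 1024, endpoint=False)
chirp = np.sin(2 * np.pi * (30 * t + 40 * t**2))   # toy chirp-like signal
print(round(overlap(chirp, np.roll(chirp, 100)), 3))  # ≈ 1.0 for a pure shift
```

An overlap of 0.99 between template and numerical waveform, as in the abstract, corresponds to a fractional loss in signal-to-noise of about 1%, which is where the quoted "loss of event rate below 3%" comes from (event rate scales roughly as the cube of the accessible distance).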

  6. Analytical Kinematics and Coupled Vibrations Analysis of Mechanical System Operated by Solar Array Drive Assembly

    NASA Astrophysics Data System (ADS)

    Sattar, M.; Wei, C.; Jalali, A.; Sattar, R.

    2017-07-01

    To address the impact of solar array (SA) anomalies and vibrations on performance of precision space-based operations, it is important to complete its accurate jitter analysis. This work provides mathematical modelling scheme to approximate kinematics and coupled micro disturbance dynamics of rigid load supported and operated by solar array drive assembly (SADA). SADA employed in analysis provides a step wave excitation torque to activate the system. Analytical investigations into kinematics is accomplished by using generalized linear and Euler angle coordinates, applying multi-body dynamics concepts and transformations principles. Theoretical model is extended, to develop equations of motion (EoM), through energy method (Lagrange equation). The main emphasis is to research coupled frequency response by determining energies dissipated and observing dynamic behaviour of internal vibratory systems of SADA. The disturbance model captures discrete active harmonics of SADA, natural modes and vibration amplifications caused by interactions between active harmonics and structural modes of mechanical assembly. The proposed methodology can help to predict true micro disturbance nature of SADA operating rigid load. Moreover, performance outputs may be compared against actual mission requirements to assess precise spacecraft controller design to meet next space generation stringent accuracy goals.
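The single-degree-of-freedom core of such an equation of motion can be sketched as J·θ″ + c·θ′ + k·θ = T(t) under a step torque, loosely mimicking the SADA step-wave excitation; the parameter values are arbitrary illustrations, and the paper's multi-body coupled model is far richer:

```python
def step_response(inertia, stiffness, damping, torque, dt=1e-3, steps=5000):
    """Single-DOF torsional model J*theta'' + c*theta' + k*theta = T(t),
    integrated with semi-implicit Euler under a constant (step) torque T."""
    theta, omega = 0.0, 0.0
    history = []
    for _ in range(steps):
        alpha = (torque - damping * omega - stiffness * theta) / inertia
        omega += alpha * dt          # update angular rate first (semi-implicit)
        theta += omega * dt          # then the angle
        history.append(theta)
    return history

# The response rings at the natural frequency and settles near the static
# deflection T/k, the kind of transient that couples into spacecraft jitter.
hist = step_response(inertia=0.5, stiffness=200.0, damping=4.0, torque=1.0)
print(round(hist[-1], 4))
```

Even this toy model shows why a step-wave drive excites structural modes: the transient overshoot and ringing are exactly the micro-disturbances precision payloads must tolerate.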

  7. Initiating an Online Reputation Monitoring System with Open Source Analytics Tools

    NASA Astrophysics Data System (ADS)

    Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd

    2018-05-01

    Online reputation is an invaluable asset for modern organizations, as it can improve business performance, especially sales and profit. However, a reputation that is not monitored is difficult to maintain. Social media analytics can provide online reputation monitoring in various ways, such as sentiment analysis, and numerous large-scale organizations have accordingly implemented Online Reputation Monitoring (ORM) systems. This solution should not be exclusive to high-income organizations, as organizations of all sizes and types are now online. This research proposes an affordable and reliable ORM system built from a combination of open source analytics tools, aimed at both novice practitioners and academicians. We also evaluate its prediction accuracy and find that the system provides acceptable predictions (sixty percent accuracy) and that its majority-polarity prediction tallies with human annotation. The proposed system can support business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.
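    The two evaluation measures mentioned, per-item accuracy against human annotation and agreement on the overall majority polarity, reduce to a few lines. The labels below are invented illustration data, not the study's corpus:

```python
def accuracy(pred, gold):
    """Fraction of items where the predicted label matches the human label."""
    return sum(p == g for p, g in zip(pred, gold)) / len(gold)

def majority(labels):
    """Most frequent polarity label in a list."""
    return max(set(labels), key=labels.count)

# Hypothetical sentiment labels for ten tweets (not the paper's data).
pred = ["pos", "pos", "neg", "neu", "pos", "neg", "pos", "neg", "pos", "pos"]
gold = ["pos", "neg", "pos", "neu", "pos", "pos", "neg", "neg", "pos", "pos"]

print(accuracy(pred, gold))              # 0.6 -- the ballpark reported
print(majority(pred) == majority(gold))  # True: majority polarity tallies
```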

  8. ARPEFS as an analytic technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schach von Wittenau, A.E.

    1991-04-01

    Two modifications to the ARPEFS technique are introduced. These are studied using p(2×2)S/Cu(001) as a model system. The first modification is the collection of ARPEFS χ(k) curves at temperatures as low as our equipment will permit. While adding to the difficulty of the experiment, this modification is shown to almost double the signal-to-noise ratio of normal-emission p(2×2)S/Cu(001) χ(k) curves. This is shown by visual comparison of the raw data and by the improved precision of the extracted structural parameters. The second change is the replacement of manual fitting of the Fourier-filtered χ(k) curves by the use of the simplex algorithm for parameter determination. Again using p(2×2)S/Cu(001) data, this is shown to result in better agreement between experimental χ(k) curves and curves calculated from model structures. The improved ARPEFS is then applied to p(2×2)S/Ni(111) and (√3×√3)R30°S/Ni(111). For p(2×2)S/Cu(001) we find a S-Cu bond length of 2.26 Å, with the S adatom 1.31 Å above the fourfold hollow site. The second Cu layer appears to be corrugated. Analysis of the p(2×2)S/Ni(111) data indicates that the S adatom adsorbs onto the FCC threefold hollow site 1.53 Å above the Ni surface. The S-Ni bond length is determined to be 2.13 Å, indicating an outwards shift of the first-layer Ni atoms. We are unable to assign a unique structure to (√3×√3)R30°S/Ni(111). An analysis of the strengths and weaknesses of ARPEFS as an experimental and analytic technique is presented, along with a summary of problems still to be addressed.
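    The switch from manual fitting to the simplex (Nelder-Mead) algorithm can be illustrated on a toy single-frequency oscillation χ(k) = A·sin(2kd), which is only a caricature of a real ARPEFS curve; the minimizer below is a bare-bones downhill simplex written for the sketch, not the authors' code:

```python
import math

def chi_model(k, d, A):
    # toy ARPEFS-like oscillation: the path-length difference 2d sets the frequency
    return A * math.sin(2.0 * k * d)

def sse(params, ks, data):
    d, A = params
    return sum((chi_model(k, d, A) - y) ** 2 for k, y in zip(ks, data))

def nelder_mead(f, x0, step=0.03, iters=300):
    """Minimal downhill-simplex minimizer for a small parameter vector."""
    n = len(x0)
    simplex = [list(x0)] + [
        [x + (step if i == j else 0.0) for j, x in enumerate(x0)] for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * cen[i] - worst[i] for i in range(n)]
        if f(refl) < f(best):                       # reflection is best: try expanding
            exp = [3 * cen[i] - 2 * worst[i] for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(worst):
            simplex[-1] = refl
        else:                                       # contract toward the centroid
            contr = [(cen[i] + worst[i]) / 2 for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                   # shrink toward the best vertex
                simplex = [[(p[i] + best[i]) / 2 for i in range(n)] for p in simplex]
    return min(simplex, key=f)

# Synthetic noiseless chi(k) data generated with d = 2.26 (cf. the S-Cu distance above).
ks = [4.0 + 0.1 * i for i in range(60)]
data = [chi_model(k, 2.26, 0.5) for k in ks]
d_fit, A_fit = nelder_mead(lambda p: sse(p, ks, data), [2.2, 0.4])
print(round(d_fit, 3), round(A_fit, 3))
```

Because χ(k) is oscillatory, the sum-of-squares surface has many local minima in d; a simplex search only refines a starting guess that is already in the right basin, which is also true of the real analysis.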

  9. A MASSive Laboratory Tour. An Interactive Mass Spectrometry Outreach Activity for Children

    NASA Astrophysics Data System (ADS)

    Jungmann, Julia H.; Mascini, Nadine E.; Kiss, Andras; Smith, Donald F.; Klinkert, Ivo; Eijkel, Gert B.; Duursma, Marc C.; Cillero Pastor, Berta; Chughtai, Kamila; Chughtai, Sanaullah; Heeren, Ron M. A.

    2013-07-01

    It is imperative to fascinate young children at an early stage in their education for the analytical sciences. The exposure of the public to mass spectrometry presently increases rapidly through the common media. Outreach activities can take advantage of this exposure and employ mass spectrometry as an exquisite example of an analytical science in which children can be fascinated. The presented teaching modules introduce children to mass spectrometry and give them the opportunity to experience a modern research laboratory. The modules are highly adaptable and can be applied to young children from the age of 6 to 14 y. In an interactive tour, the students explore three major scientific concepts related to mass spectrometry: the building blocks of matter, charged particle manipulation by electrostatic fields, and analyte identification by mass analysis. Also, the students carry out a mass spectrometry experiment and learn to interpret the resulting mass spectra. The multistage, inquiry-based tour contains flexible methods, which teach the students current-day research techniques and possible applications to real research topics. Besides the scientific concepts, laboratory safety and hygiene are stressed, and the students' enthusiasm for the analytical sciences is kindled by participating in "hands-on" work. The presented modules have repeatedly been successfully employed during laboratory open days. They are also found to be extremely suitable for (early) high school science classes during laboratory visit-focused field trips.

  10. 2002 railroad employee fatalities : an analytical study

    DOT National Transportation Integrated Search

    2005-02-01

    2002 Railroad Employee Fatalities: An Analytical Study, which is designed to promote and : enhance awareness of many unsafe behaviors and conditions that typically contribute to : railroad employee fatalities. By furthering our understanding of...

  11. An analytical method for ¹⁴C in environmental water based on a wet-oxidation process.

    PubMed

    Huang, Yan-Jun; Guo, Gui-Yin; Wu, Lian-Sheng; Zhang, Bing; Chen, Chao-Feng; Zhang, Hai-Ying; Qin, Hong-Juan; Shang-Guan, Zhi-Hong

    2015-04-01

    An analytical method for ¹⁴C in environmental water based on a wet-oxidation process was developed. The method can be used to determine the activity concentrations of organic and inorganic ¹⁴C, or total ¹⁴C, in environmental water, including drinking water, surface water, rainwater and seawater. Wet-oxidation of the organic component converts organic carbon to an inorganic form, and the inorganic ¹⁴C can then be extracted by acidification and nitrogen purging. Environmental water with a volume of 20 L can be used for the wet-oxidation and extraction, and a detection limit of about 0.02 Bq/g(C) can be achieved for water with a carbon content above 15 mg(C)/L, clearly lower than the natural level of ¹⁴C in the environment. The collected carbon is sufficient for measurement with a low-level liquid scintillation counter (LSC) for typical samples. Extraction and recovery experiments for inorganic and organic carbon from typical materials, including the analytical reagents benzoquinone, sucrose, glutamic acid, nicotinic acid, humic acid, ethanediol, etc., were conducted with excellent results, based on measurements with a total organic carbon analyzer and LSC. The recovery rate for inorganic carbon ranged between 98.7% and 99.0% with a mean of 98.9(±0.1)%, while organic carbon recovery ranged between 93.8% and 100.0% with a mean of 97.1(±2.6)%. Verification and an uncertainty budget of the method are also presented for a representative environmental water. The method is appropriate for ¹⁴C analysis in environmental water, and can also be applied to the analysis of liquid effluent from nuclear facilities. Copyright © 2015 Elsevier Ltd. All rights reserved.
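    Recovery figures of the kind quoted (a mean with one sample standard deviation over replicate spikes) are computed as below; the six organic-carbon recoveries here are invented for illustration and are not the paper's measurements:

```python
from statistics import mean, stdev

# Hypothetical recovery percentages from six spiked organic-carbon standards.
organic = [93.8, 95.2, 96.4, 98.1, 99.0, 100.0]

m, s = mean(organic), stdev(organic)    # stdev is the sample (n - 1) deviation
print(f"organic recovery: {m:.1f} ± {s:.1f} %")   # organic recovery: 97.1 ± 2.4 %
```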

  12. Phase-recovery improvement using analytic wavelet transform analysis of a noisy interferogram cepstrum.

    PubMed

    Etchepareborda, Pablo; Vadnjal, Ana Laura; Federico, Alejandro; Kaufmann, Guillermo H

    2012-09-15

    We evaluate the extension of the exact nonlinear reconstruction technique developed for digital holography to the phase-recovery problems presented by other optical interferometric methods, which use carrier modulation. It is shown that the introduction of an analytic wavelet analysis in the ridge of the cepstrum transformation corresponding to the analyzed interferogram can be closely related to the well-known wavelet analysis of the interferometric intensity. Subsequently, the phase-recovery process is improved. The advantages and limitations of this framework are analyzed and discussed using numerical simulations in singular scalar light fields and in temporal speckle pattern interferometry.

  13. An Introduction to Social Network Data Analytics

    NASA Astrophysics Data System (ADS)

    Aggarwal, Charu C.

    The advent of online social networks has been one of the most exciting events in this decade. Many popular online social networks such as Twitter, LinkedIn, and Facebook have become increasingly popular. In addition, a number of multimedia networks such as Flickr have also seen an increasing level of popularity in recent years. Many such social networks are extremely rich in content, and they typically contain a tremendous amount of content and linkage data which can be leveraged for analysis. The linkage data is essentially the graph structure of the social network and the communications between entities; whereas the content data contains the text, images and other multimedia data in the network. The richness of this network provides unprecedented opportunities for data analytics in the context of social networks. This book provides a data-centric view of online social networks; a topic which has been missing from much of the literature. This chapter provides an overview of the key topics in this field, and their coverage in this book.

  14. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  15. Choice as an engine of analytic thought.

    PubMed

    Savani, Krishna; Stephens, Nicole M; Markus, Hazel Rose

    2017-09-01

    Choice is a behavioral act that has a variety of well-documented motivational consequences: it fosters independence by allowing people to simultaneously express themselves and influence the environment. Given the link between independence and analytic thinking, the current research tested whether choice also leads people to think in a more analytic rather than holistic manner. Four experiments demonstrate that making choices, recalling choices, and viewing others make choices leads people to think more analytically, as indicated by their attitudes, perceptual judgments, categorization, and patterns of attention allocation. People who made choices scored higher on a subjective self-report measure of analytic cognition compared to those who did not make a choice (pilot study). Using an objective task-based measure, people who recalled choices rather than actions were less influenced by changes in the background when making judgments about focal objects (Experiment 1). People who thought of others' behaviors as choices rather than actions were more likely to group objects based on categories rather than relationships (Experiment 2). People who recalled choices rather than actions subsequently allocated more visual attention to focal objects in a scene (Experiment 3). Together, these experiments demonstrate that choice has important yet previously unexamined consequences for basic psychological processes such as attention and cognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Flipping the Audience Script: An Activity That Integrates Research and Audience Analysis

    ERIC Educational Resources Information Center

    Lam, Chris; Hannah, Mark A.

    2016-01-01

    This article describes a flipped classroom activity that requires students to integrate research and audience analysis. The activity uses Twitter as a data source. In the activity, students identify a sample, collect customer tweets, and analyze the language of the tweets in an effort to construct knowledge about an audience's values, needs, and…

  17. 1998 railroad employee fatalities : an analytical study

    DOT National Transportation Integrated Search

    2003-11-01

    "1998 Railroad Employee Fatalities: An Analytical Study," is designed to promote and enhance awareness of many unsafe behaviors and conditions that typically contribute to railroad employee fatalities. By furthering our understanding of the causes of...

  18. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analyte within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as their relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in

  19. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy.

    PubMed

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-01-18

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analyte within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as their relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in
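    The two quantities behind the 2D scatter plot, a spectral-shift measure and the integrated emission intensity, can be extracted per region as below. The intensity-weighted centroid is used here as the shift measure, which is an assumption (a peak position or a band fit would also serve), and the spectra are synthetic:

```python
import math

def spectral_summary(wavelengths, intensities):
    """Integrated intensity and intensity-weighted centroid wavelength
    of one emission envelope."""
    total = sum(intensities)
    centroid = sum(w, ) if False else sum(w * i for w, i in zip(wavelengths, intensities)) / total
    return total, centroid

# Synthetic Gaussian emission bands from two subcellular regions: the
# analyte-rich one is blue-shifted (peak 510 nm vs 520 nm) and brighter.
wl = [480.0 + 0.5 * k for k in range(161)]          # 480-560 nm grid

def band(peak, amp, width=10.0):
    return [amp * math.exp(-((w - peak) / width) ** 2) for w in wl]

low_analyte = spectral_summary(wl, band(520.0, 1.0))
high_analyte = spectral_summary(wl, band(510.0, 1.8))

print(low_analyte[1], high_analyte[1])   # centroids near 520 nm vs 510 nm
print(high_analyte[0] > low_analyte[0])  # True: concurrent emission enhancement
```

Plotting one (shift, intensity) point per sub-microscopic region then reproduces the kind of scatter described in the abstract.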

  20. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt. 52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results demonstrate that the proposed method can effectively reconstruct three-dimensional scenes.
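    The core step, recovering an affine transform from corresponding triangle vertices, can be sketched in 2D, where three vertex pairs determine the transform exactly and a plain linear solve stands in for the pseudo-inverse of the paper's 3D formulation; all names and values here are illustrative:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

# Primitive triangle and an arbitrary target triangle (illustrative values).
prim = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
targ = [(2.0, 1.0), (4.0, 1.5), (2.5, 3.0)]

# Each output coordinate is an affine function a*x + b*y + c of the input point.
H = [[x, y, 1.0] for x, y in prim]
row_x = solve3(H, [tx for tx, _ in targ])
row_y = solve3(H, [ty for _, ty in targ])

def affine(p):
    x, y = p
    return (row_x[0] * x + row_x[1] * y + row_x[2],
            row_y[0] * x + row_y[1] * y + row_y[2])

print([affine(p) for p in prim])   # reproduces targ vertex by vertex
```

With more than three correspondences (or in 3D with possibly degenerate data) the system is over- or under-determined, which is where the pseudo-inverse used by the authors comes in.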

  1. Forced degradation and impurity profiling: recent trends in analytical perspectives.

    PubMed

    Jain, Deepti; Basniwal, Pawan Kumar

    2013-12-01

    This review provides a concise overview of recent trends in analytical perspectives on forced degradation and impurity profiling of pharmaceuticals, covering active pharmaceutical ingredients (API) as well as drug products, during 2008-2012. These trends are discussed by year of publication; column; matrix (API and dosage forms) and type of elution in chromatography (isocratic and gradient); and therapeutic category of the drugs analysed. The review focuses on a comprehensive update of the various analytical methods, including hyphenated techniques, for the identification and quantification of threshold levels of impurities and degradants in different pharmaceutical matrices. © 2013 Elsevier B.V. All rights reserved.

  2. Commentary on "Distributed Revisiting: An Analytic for Retention of Coherent Science Learning"

    ERIC Educational Resources Information Center

    Hewitt, Jim

    2015-01-01

    The article, "Distributed Revisiting: An Analytic for Retention of Coherent Science Learning" is an interesting study that operates at the intersection of learning theory and learning analytics. The authors observe that the relationship between learning theory and research in the learning analytics field is constrained by several…

  3. Development of an analytical method for analysis of flubendiamide, des-iodo flubendiamide and study of their residue persistence in tomato and soil.

    PubMed

    Mohapatra, S; Ahuja, A K; Deepa, M; Jagadish, G K; Rashmi, N; Sharma, D

    2011-01-01

    Flubendiamide is a new insecticide that has been found to give excellent control of lepidopterous pests of tomato. This study was undertaken to develop an improved method for the analysis of flubendiamide and its metabolite des-iodo flubendiamide and to determine residue retention in tomato and soil. The analytical method developed involved extraction of flubendiamide and des-iodo flubendiamide with acetonitrile, liquid-liquid partitioning into a hexane-ethyl acetate mixture (6:4, v v⁻¹) and cleanup with activated neutral alumina. Finally, the residues were dissolved in gradient HPLC-grade acetonitrile for analysis by high pressure liquid chromatography (HPLC). A mobile phase of acetonitrile-water (60:40, v v⁻¹) and a detection wavelength of 235 nm gave maximum peak resolution. Using the above method and HPLC parameters, nearly 100% recovery of both compounds was obtained. There was no matrix interference and the limit of quantification (LOQ) of the method was 0.01 mg kg⁻¹. Initial residue deposits of flubendiamide on field-treated tomato from treatments @ 48 and 96 g active ingredient hectare⁻¹ were 0.83 and 1.68 mg kg⁻¹, respectively. The residues of flubendiamide dissipated with half-lives of 3.9 and 4.4 days for the treatments @ 48 and 96 g a.i. ha⁻¹, respectively, and persisted for 15 days in both treatments. Des-iodo flubendiamide was not detected in tomato fruits at any time during the study period. Residues of flubendiamide and des-iodo flubendiamide in soil from treatments @ 48 and 96 g a.i. ha⁻¹ were below detectable level (BDL, < 0.01 mg kg⁻¹) after 20 days. Flubendiamide completely dissipated from tomato within 20 days when the 480 SC formulation was applied at doses recommended for protection against lepidopterous pests.
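    First-order dissipation kinetics connect the initial deposits and half-lives quoted: C(t) = C0·e^(−kt) with k = ln 2 / t₁/₂. A quick sketch using the reported numbers for the 48 g a.i. ha⁻¹ treatment (the first-order model itself is the standard assumption behind a residue half-life, not a detail stated in the abstract):

```python
import math

def residue(c0, half_life, t):
    """First-order dissipation: residue concentration after t days."""
    k = math.log(2) / half_life
    return c0 * math.exp(-k * t)

# Reported values: initial deposit 0.83 mg/kg, half-life 3.9 days.
c0, t_half = 0.83, 3.9
for t in (0, t_half, 15):
    print(f"day {t:>4}: {residue(c0, t_half, t):.3f} mg/kg")
```

At day 15 the predicted residue (roughly 0.06 mg/kg) is still above the 0.01 mg kg⁻¹ LOQ, consistent with the reported 15-day persistence.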

  4. Analytical solution for the wind-driven circulation in a lake containing an island

    NASA Technical Reports Server (NTRS)

    Goldstein, M. E.; Gedney, R. T.

    1971-01-01

    An analysis was carried out to determine analytically the effect of an island on the wind driven currents in a shallow lake (or sea). A general analysis is developed that can be applied to a large class of lake and island geometries and bottom topographies. Detailed numerical results are obtained for a circular island located eccentrically or concentrically in a circular lake with a logarithmic bottom topography. It is shown that an island can produce volume flow (vertically integrated velocities) gyres that are completely different from those produced by a normal basin without an island. These gyres in the neighborhood of the island will produce different velocity patterns, which include the acceleration of flow near the island shore.

  5. Dynamic cues for whisker-based object localization: An analytical solution to vibration during active whisker touch

    PubMed Central

    Vaxenburg, Roman; Wyche, Isis; Svoboda, Karel; Efros, Alexander L.

    2018-01-01

    Vibrations are important cues for tactile perception across species. Whisker-based sensation in mice is a powerful model system for investigating mechanisms of tactile perception. However, the role vibration plays in whisker-based sensation remains unsettled, in part due to difficulties in modeling the vibration of whiskers. Here, we develop an analytical approach to calculate the vibrations of whiskers striking objects. We use this approach to quantify vibration forces during active whisker touch at a range of locations along the whisker. The frequency and amplitude of vibrations evoked by contact are strongly dependent on the position of contact along the whisker. The magnitude of vibrational shear force and bending moment is comparable to quasi-static forces. The fundamental vibration frequencies are in a detectable range for mechanoreceptor properties and below the maximum spike rates of primary sensory afferents. These results suggest two dynamic cues exist that rodents can use for object localization: vibration frequency and comparison of vibrational to quasi-static force magnitude. These complement the use of quasi-static force angle as a distance cue, particularly for touches close to the follicle, where whiskers are stiff and force angles hardly change during touch. Our approach also provides a general solution to calculation of whisker vibrations in other sensing tasks. PMID:29584719

  6. Analytical methods for the determination of personal care products in human samples: an overview.

    PubMed

    Jiménez-Díaz, I; Zafra-Gómez, A; Ballesteros, O; Navalón, A

    2014-11-01

    Personal care products (PCPs) are organic chemicals widely used in everyday human life. Nowadays, preservatives, UV-filters, antimicrobials and musk fragrances are widely used PCPs. Different studies have shown that some of these compounds can cause adverse health effects, such as genotoxicity, which could even lead to mutagenic or carcinogenic effects, or estrogenicity because of their endocrine-disrupting activity. Due to the absence of official monitoring protocols, there is an increasing demand for analytical methods that allow the determination of these compounds in human samples, in order to obtain more information regarding their behavior and fate in the human body. The complexity of the biological matrices and the low concentration levels of these compounds make it necessary to use advanced sample treatment procedures that afford both sample clean-up, to remove potentially interfering matrix components, and concentration of the analytes. In the present work, a review of the more recent analytical methods published in the scientific literature for the determination of PCPs in human fluids and tissue samples is presented. The work focuses on sample preparation and the analytical techniques employed. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Using Configural Frequency Analysis as a Person-Centered Analytic Approach with Categorical Data

    ERIC Educational Resources Information Center

    Stemmler, Mark; Heine, Jörg-Henrik

    2017-01-01

    Configural frequency analysis and log-linear modeling are presented as person-centered analytic approaches for the analysis of categorical or categorized data in multi-way contingency tables. Person-centered developmental psychology, based on the holistic interactionistic perspective of the Stockholm working group around David Magnusson and Lars…

  8. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  9. Analytical analysis of the temporal asymmetry between seawater intrusion and retreat

    NASA Astrophysics Data System (ADS)

    Rathore, Saubhagya Singh; Zhao, Yue; Lu, Chunhui; Luo, Jian

    2018-01-01

    The quantification of timescales associated with the movement of the seawater-freshwater interface is useful for developing effective management strategies for controlling seawater intrusion (SWI). In this study, for the first time, we derive an explicit analytical solution for the timescales of SWI and seawater retreat (SWR) in a confined, homogeneous coastal aquifer system under the quasi-steady assumption, based on a classical sharp-interface solution for approximating freshwater outflow rates into the sea. The flow continuity and hydrostatic equilibrium across the interface are identified as two primary mechanisms governing timescales of the interface movement driven by an abrupt change in discharge rates or hydraulic heads at the inland boundary. Through theoretical analysis, we quantified the dependence of interface-movement timescales on porosity, hydraulic conductivity, aquifer thickness, aquifer length, density ratio, and boundary conditions. Predictions from the analytical solution closely agreed with those from numerical simulations. In addition, we define a temporal asymmetry index (the ratio of the SWI timescale to the SWR timescale) to represent the resilience of the coastal aquifer in response to SWI. The developed analytical solutions provide a simple tool for the quick assessment of SWI and SWR timescales and reveal that the temporal asymmetry between SWI and SWR mainly relies on the initial and final values of the freshwater flux at the inland boundary, and is weakly affected by aquifer parameters. Furthermore, we theoretically examined the log-linear relationship between the timescale and the freshwater flux at the inland boundary, and found that it may be approximated by two linear functions with slopes of -2 and -1 for large changes in the boundary flux for SWI and SWR, respectively.
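    The log-linearity claim can be checked numerically once a timescale-versus-flux curve is in hand. The power laws below are stand-ins carrying the slopes reported in the abstract, with arbitrary prefactors; they are not the paper's actual solution:

```python
import math

def loglog_slope(f, q1, q2):
    """Slope of log T versus log q between two boundary-flux values."""
    return (math.log(f(q2)) - math.log(f(q1))) / (math.log(q2) - math.log(q1))

# Hypothetical power-law timescales (arbitrary units): T_SWI ~ q**-2, T_SWR ~ q**-1.
t_swi = lambda q: 5.0 / q ** 2
t_swr = lambda q: 3.0 / q

print(loglog_slope(t_swi, 0.1, 1.0))   # close to -2
print(loglog_slope(t_swr, 0.1, 1.0))   # close to -1
# Temporal asymmetry index (SWI timescale over SWR timescale) at a given flux:
print(t_swi(0.5) / t_swr(0.5))
```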

  10. Digital forensics: an analytical crime scene procedure model (ACSPM).

    PubMed

    Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut

    2013-12-10

    In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner safeguarding the accuracy and reliability of the evidence, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models conforming to chain-of-custody requirements, which rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough Digital Forensics (DF) process depends on sequential DF phases, each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. There are numerous DF process models that define DF phases in the literature, but no DF model that defines phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than on the whole digital investigation process and phases that end up in a court. When reviewing the relevant literature and consulting law enforcement agencies, we found only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court. After analyzing the needs of law enforcement organizations and realizing the absence of a crime scene digital investigation procedure model, we decided to inspect the relevant literature in an analytical way. The outcome of this inspection is the model suggested here, which is intended to provide guidance for thorough and secure implementation of digital forensic procedures at a crime scene. In digital forensic

  11. An analytical platform for mass spectrometry-based identification and chemical analysis of RNA in ribonucleoprotein complexes.

    PubMed

    Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki

    2009-11-01

    We describe here a mass spectrometry (MS)-based analytical platform for RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC at a very low flow rate with volatile solvents and MS in the negative mode, we could estimate highly accurate mass values sufficient to predict the nucleotide composition of an approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at the sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complexes, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes.

  12. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone’s integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman’s assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). Given the relevance of the results, the assay is expected to be of practical applicability. PMID:26110404
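    The color-readout step above amounts to averaging each RGB channel over the photographed strip region and mapping one channel through a calibration line. A minimal Python sketch; the region-of-interest layout and the linear calibration constants are illustrative assumptions, not values from the paper:

```python
def mean_rgb(roi):
    """Mean intensity per color channel over a region of interest.

    `roi` is a list of pixel rows; each pixel is an (R, G, B) triple,
    as would be cropped from the phone-camera image of the strip."""
    pixels = [px for row in roi for px in row]
    n = len(pixels)
    return tuple(sum(px[c] for px in pixels) / n for c in range(3))

def activity_from_channel(channel_mean, slope, intercept):
    """Hypothetical linear calibration: indigo blue formation darkens
    the strip, so channel intensity falls as BChE activity rises."""
    return slope * channel_mean + intercept

# Uniform 2x2 patch standing in for the cropped strip area
patch = [[(100, 150, 200), (100, 150, 200)],
         [(100, 150, 200), (100, 150, 200)]]
r_mean, g_mean, b_mean = mean_rgb(patch)
```

    In practice, the channel best correlated with activity would be chosen during validation against the Ellman reference assay.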

  13. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples is evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  14. Verification of an Analytical Method for Measuring Crystal Nucleation Rates in Glasses from DTA Data

    NASA Technical Reports Server (NTRS)

    Ranasinghe, K. S.; Wei, P. F.; Kelton, K. F.; Ray, C. S.; Day, D. E.

    2004-01-01

    A recently proposed analytical (DTA) method for estimating the nucleation rates in glasses has been evaluated by comparing experimental data with numerically computed nucleation rates for a model lithium disilicate glass. The time and temperature dependent nucleation rates were predicted using the model and compared with those values from an analysis of numerically calculated DTA curves. The validity of the numerical approach was demonstrated earlier by a comparison with experimental data. The excellent agreement between the nucleation rates from the model calculations and from the computer generated DTA data demonstrates the validity of the proposed analytical DTA method.

  15. Analytical Chemistry Division. Annual progress report for period ending December 31, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1981-05-01

    This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)

  16. Physical Analytics: An emerging field with real-world applications and impact

    NASA Astrophysics Data System (ADS)

    Hamann, Hendrik

    2015-03-01

    In the past, most information on the internet was originated by humans or computers. With the emergence of cyber-physical systems, however, vast amounts of data are now being created by sensors on devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics community. In this presentation we use examples to highlight the opportunities in this new subject of ``Physical Analytics'' for highly interdisciplinary research (including physics, engineering and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows applying physical principles to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering have made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of ``configurable'' enabling technologies for Physical Analytics, including ultralow-power sensing and communication technologies, physical big-data management technologies, numerical modeling for physical systems, machine-learning-based physical model blending, and physical-analytics-based automation and control. We then discuss in detail several concrete applications of Physical Analytics, ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.

  17. Physical and Chemical Analytical Analysis: A key component of Bioforensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P

    The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular ''fingerprints'' of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensic science, closely aligned with other areas of ''microbial forensics''. This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g. between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mismatches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question.
Thus, apart from certain operational issues (such as how

  18. Development of an Analytical Method for Dibutyl Phthalate Determination Using Surrogate Analyte Approach

    PubMed Central

    Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad

    2017-01-01

    Dibutyl phthalate (DBP) is a phthalic acid ester that is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenges in DBP analyses are the contamination of even analytical-grade organic solvents with this compound and the lack of a true blank matrix with which to construct the calibration line. The standard addition method, or the use of artificial matrices, reduces the precision and accuracy of the results. In this study, a surrogate analyte approach, based on using a deuterium-labeled analyte (DBP-d4) to construct the calibration line, was applied to determine DBP in hexane samples. PMID:28496469

  19. An analytical prediction of the oscillation and extinction thresholds of a clarinet

    NASA Astrophysics Data System (ADS)

    Dalmont, Jean-Pierre; Gilbert, Joël; Kergomard, Jean; Ollivier, Sébastien

    2005-11-01

    This paper investigates the dynamic range of the clarinet from the oscillation threshold to the extinction at high pressure level. The use of an elementary model for the reed-mouthpiece valve effect combined with a simplified model of the pipe assuming frequency independent losses (Raman's model) allows an analytical calculation of the oscillations and their stability analysis. The different thresholds are shown to depend on parameters related to embouchure parameters and to the absorption coefficient in the pipe. Their values determine the dynamic range of the fundamental oscillations and the bifurcation scheme at the extinction.

  20. AN ANALYTICAL SOLUTION TO RICHARDS' EQUATIONS FOR A DRAINING SOIL PROFILE

    EPA Science Inventory

    Analytical solutions are developed for the Richards' equation following the analysis of Broadbridge and White. Included here is the solution for drainage and redistribution of a partially or deeply wetted profile. Additionally, infiltration for various initial conditions is exami...

  1. Analyte discrimination from chemiresistor response kinetics.

    PubMed

    Read, Douglas H; Martin, James E

    2010-08-15

    Chemiresistors are polymer-based sensors that transduce the sorption of a volatile organic compound into a resistance change. Like other polymer-based gas sensors that function through sorption, chemiresistors can be selective for analytes on the basis of the affinity of the analyte for the polymer. However, a single sensor cannot, in and of itself, discriminate between analytes, since a small concentration of an analyte that has a high affinity for the polymer might give the same response as a high concentration of another analyte with a low affinity. In this paper we use a field-structured chemiresistor to demonstrate that its response kinetics can be used to discriminate between analytes, even between those that have identical chemical affinities for the polymer phase of the sensor. The response kinetics is shown to be independent of the analyte concentration, and thus the magnitude of the sensor response, but is found to vary inversely with the analyte's saturation vapor pressure. Saturation vapor pressures often vary greatly from analyte to analyte, so analysis of the response kinetics offers a powerful method for obtaining analyte discrimination from a single sensor.
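    The concentration-independence of the response kinetics can be illustrated with a first-order sorption model (a common simplification, not necessarily the paper's model): the response rises as R(t) = R_inf(1 - exp(-t/tau)), where the plateau R_inf scales with concentration but the time constant tau does not. Estimating tau from the 63.2%-of-plateau crossing then discriminates analytes regardless of response magnitude:

```python
import math

def response(t, r_inf, tau):
    """First-order sorption response: the plateau r_inf scales with
    analyte concentration; the time constant tau does not."""
    return r_inf * (1.0 - math.exp(-t / tau))

def estimate_tau(times, values):
    """Estimate tau as the time to reach 63.2% of the final sample,
    taken as the plateau (requires sampling well past tau)."""
    target = values[-1] * (1.0 - math.exp(-1.0))
    for t, r in zip(times, values):
        if r >= target:
            return t
    return None

# Two concentrations of the same analyte: different plateaus, same tau
times = [i * 0.01 for i in range(2001)]                 # 0 .. 20 s
low = [response(t, r_inf=0.2, tau=2.0) for t in times]  # dilute sample
high = [response(t, r_inf=1.0, tau=2.0) for t in times] # concentrated
tau_low = estimate_tau(times, low)
tau_high = estimate_tau(times, high)
```

    Both traces yield the same tau estimate even though their magnitudes differ fivefold, which is the discrimination principle the abstract describes.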

  2. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this great volume of data. The specific outputs of the online analytical information system resulting from data warehouse processing with OLAP are chart and query reporting. The information in the form of chart reporting consists of the load distribution chart over time, the distribution chart by area, the substation region chart, and the electric load usage chart. The results of the OLAP process show the development of electric load distribution, provide analysis of electric power consumption load, and become an alternative way of presenting information related to peak load.
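    The chart and query reporting described above rests on OLAP-style roll-up queries over a load fact table. A minimal sketch with an in-memory SQLite database; the table and column names are illustrative assumptions, not the system's real schema:

```python
import sqlite3

# Illustrative star-schema fragment: a load fact table keyed by area and hour.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE load_fact (area TEXT, hour INTEGER, load_mw REAL)")
conn.executemany(
    "INSERT INTO load_fact VALUES (?, ?, ?)",
    [("North", 18, 120.0), ("North", 19, 150.0),
     ("South", 18, 90.0), ("South", 19, 110.0)],
)

# OLAP-style roll-up: total and peak load per area, the kind of aggregate
# behind the area-distribution and peak-load charts described above.
rows = conn.execute(
    "SELECT area, SUM(load_mw) AS total_mw, MAX(load_mw) AS peak_mw "
    "FROM load_fact GROUP BY area ORDER BY area"
).fetchall()
```

    Slicing by hour instead of area (GROUP BY hour) gives the time-repetition chart in the same way; each report is one roll-up over the fact table.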

  3. [Offered income, salary expectations, and the economic activity of married women: an analytic model].

    PubMed

    Lollivier, S

    1984-06-01

    This study uses data from tax declarations for 40,000 French households for 1975 to propose a model that permits quantification of the effects of certain significant factors on the economic activity of married women. The PROBIT model of analysis of variance was used to determine the specific effect of several variables, including age of the woman, number of children under 25 years of age in the household, age of the youngest child, husband's income and socioprofessional status, wife's level and type of education, size of community of residence, and region of residence. The principal factors influencing activity rates were found to be educational level, age, and number of children: activity rates of mothers of 1 child were close to those of childless women, but activity rates dropped by about 30% for mothers of 2 and even more for mothers of 3 or more children. The influence of the place of residence and the husband's income was associated with lesser disparities. The reasons for variations in female labor force participation can be viewed as analogous to a balance. Underlying factors can increase or decrease the income the woman hopes to earn (offered income) as well as the minimum income for which she will work (required salary). A TOBIT model was constructed in which offered income was a function of age, education, geographic location, and number of children, and required salary was a function of variables related to the husband, including income and socioprofessional status. For most of the effects considered, the observed variation in activity rates resulted from variations in offered income. The husband's income influences only the desired salary. The offered income decreases and the required salary increases when the number of children is 2 or more, reducing the rate of activity. More educated women have slightly greater salary expectations, but command much higher salaries, resulting in an increased rate of professional activity.
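    The "balance" described above has a standard probit form: a woman participates when offered income exceeds required (reservation) salary plus a normally distributed error. A small sketch of that reduction; the income figures and error scale are illustrative, not estimates from the study:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def participation_prob(offered_income, required_salary, sigma):
    """Probit form of the labor-supply balance: participation occurs
    when offered income exceeds the required salary plus a normal
    error of scale sigma, so P(work) = Phi((offered - required)/sigma)."""
    return norm_cdf((offered_income - required_salary) / sigma)

p_equal = participation_prob(1000.0, 1000.0, 200.0)   # balance point
p_higher = participation_prob(1200.0, 1000.0, 200.0)  # better offer
```

    Factors such as more children raise the required salary and lower the offered income, shifting the argument of the CDF down and hence the participation rate, exactly the mechanism the abstract describes.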

  4. (U) An Analytic Study of Piezoelectric Ejecta Mass Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tregillis, Ian Lee

    2017-02-16

    We consider the piezoelectric measurement of the areal mass of an ejecta cloud, for the specific case where ejecta are created by a single shock at the free surface and fly ballistically through vacuum to the sensor. To do so, we define time- and velocity-dependent ejecta “areal mass functions” at the source and sensor in terms of typically unknown distribution functions for the ejecta particles. Next, we derive an equation governing the relationship between the areal mass function at the source (which resides in the rest frame of the free surface) and at the sensor (which resides in the laboratory frame). We also derive expressions for the analytic (“true”) accumulated ejecta mass at the sensor and the measured (“inferred”) value obtained via the standard method for analyzing piezoelectric voltage traces. This approach enables us to derive an exact expression for the error imposed upon a piezoelectric ejecta mass measurement (in a perfect system) by the assumption of instantaneous creation. We verify that when the ejecta are created instantaneously (i.e., when the time dependence is a delta function), the piezoelectric inference method exactly reproduces the correct result. When creation is not instantaneous, the standard piezo analysis will always overestimate the true mass. However, the error is generally quite small (less than several percent) for most reasonable velocity and time dependences. In some cases, errors exceeding 10-15% may require velocity distributions or ejecta production timescales inconsistent with experimental observations. These results are demonstrated rigorously with numerous analytic test problems.
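    The role of the instantaneous-creation assumption can be sketched as follows (the notation here is illustrative, not the paper's). For ballistic flight over a standoff distance d, a particle arriving at time t after shock breakout is assigned the velocity v = d/t, so with a source areal mass density m(v) per unit velocity, the accumulated mass inferred at the sensor by time t is

```latex
M(t) \;=\; \int_{d/t}^{\infty} m(v)\,\mathrm{d}v .
```

    If creation instead occurs at some finite time t_0 > 0, arrivals shift to t = t_0 + d/v; assigning v = d/t then underestimates each particle's true velocity, and converting the measured momentum to mass with an underestimated velocity inflates the inferred mass, consistent with the overestimate described above.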

  5. Clinical laboratory analytics: Challenges and promise for an emerging discipline.

    PubMed

    Shirts, Brian H; Jackson, Brian R; Baird, Geoffrey S; Baron, Jason M; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R; Terrazas, Enrique; Brimhall, Brad

    2015-01-01

    The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with its annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to have an open forum of leaders who work with the "big data" clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.

  6. Export Odyssey: An Exposition and Analytical Review of Literature Concerning an Undergraduate Student Project in International Marketing on Key Teaching-Learning Dimensions.

    ERIC Educational Resources Information Center

    Williamson, Nicholas C.

    2001-01-01

    Describes Export Odyssey (EO), a structured, Internet-intensive, team-based undergraduate student project in international marketing. Presents an analytical review of articles in the literature that relate to three key teaching-learning dimensions of student projects (experiential versus non-experiential active learning, team-based versus…

  7. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into

  8. Validation of a multi-analyte panel with cell-bound complement activation products for systemic lupus erythematosus.

    PubMed

    Dervieux, Thierry; Conklin, John; Ligayon, Jo-Anne; Wolover, Leilani; O'Malley, Tyler; Alexander, Roberta Vezza; Weinstein, Arthur; Ibarra, Claudia A

    2017-07-01

    We describe the analytical validation of an assay panel intended to assist clinicians with the diagnosis of systemic lupus erythematosus (SLE). The multi-analyte panel includes quantitative assessment of complement activation and measurement of autoantibodies. The levels of the complement split product C4d bound to erythrocytes (EC4d) and B-lymphocytes (BC4d) (expressed as mean fluorescence intensity [MFI]) are measured by quantitative flow cytometry, while autoantibodies (inclusive of antinuclear and anti-double stranded DNA antibodies) are determined by immunoassays. Results of the multi-analyte panel are reported as positive or negative based on a 2-tiered index score. Post-phlebotomy stability of EC4d and BC4d in EDTA-anticoagulated blood is determined using specimens collected from patients with SLE and normal donors. Three-level C4 coated positive beads are run daily as controls. Analytical validity is reported using intra-day and inter-day coefficient of variation (CV). EC4d and BC4d are stable for 2 days at ambient temperature and for 4 days at 4°C post-phlebotomy. Median intra-day and inter-day CV range from 2.9% to 7.8% (n=30) and 7.3% to 12.4% (n=66), respectively. The 2-tiered index score is reproducible over 4 consecutive days upon storage of blood at 4°C. A total of 2,888 three-level quality control data were collected from 6 flow cytometers with an overall failure rate below 3%. Median EC4d level is 6 net MFI (Interquartile [IQ] range 4-9 net MFI) and median BC4d is 18 net MFI (IQ range 13-27 net MFI) among 86,852 specimens submitted for testing. The incidence of 2-tiered positive test results is 13.4%. We have established the analytical validity of a multi-analyte assay panel for SLE. Copyright © 2017 Elsevier B.V. All rights reserved.
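    The intra-day and inter-day precision figures quoted above are coefficients of variation over replicate measurements. A minimal sketch of the computation; the replicate MFI values below are made up for illustration:

```python
def cv_percent(values):
    """Coefficient of variation (%) of replicate measurements,
    using the sample standard deviation (n - 1 denominator)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 100.0 * var ** 0.5 / mean

# Illustrative replicate EC4d readings (net MFI) for one control level
replicates = [6.0, 6.4, 5.8, 6.2]
cv = cv_percent(replicates)
```

    Intra-day CV pools replicates run the same day; inter-day CV pools one result per day across days, which is why the inter-day figures quoted above are the larger of the two.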

  9. An analytical study of double bend achromat lattice.

    PubMed

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A D

    2015-03-01

    In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low-emittance synchrotron radiation sources. In the basic structure of the CG lattice, a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance under the achromatic condition. To satisfy the theoretical-minimum-beam-emittance parameters, achromats having two-, three-, and four-quadrupole structures are presented. In these structures, different arrangements of QF and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which the QF-QD-QF configuration in the achromat part has been adopted, is also presented.

  10. Toward an integrative theory of training motivation: a meta-analytic path analysis of 20 years of research.

    PubMed

    Colquitt, J A; LePine, J A; Noe, R A

    2000-10-01

    This article meta-analytically summarizes the literature on training motivation, its antecedents, and its relationships with training outcomes such as declarative knowledge, skill acquisition, and transfer. Significant predictors of training motivation and outcomes included individual characteristics (e.g., locus of control, conscientiousness, anxiety, age, cognitive ability, self-efficacy, valence, job involvement) and situational characteristics (e.g., climate). Moreover, training motivation explained incremental variance in training outcomes beyond the effects of cognitive ability. Meta-analytic path analyses further showed that the effects of personality, climate, and age on training outcomes were only partially mediated by self-efficacy, valence, and job involvement. These findings are discussed in terms of their practical significance and their implications for an integrative theory of training motivation.

  11. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by application of different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ between the three scenarios. A second run of rankings was done for scenarios that include only metrological, economic or environmental criteria, respectively, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This is an indication that green analytical chemistry can be brought into laboratories without analytical performance costs, and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
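    A PROMETHEE ranking of this kind reduces to weighted pairwise preferences and net outranking flows. A minimal PROMETHEE II sketch using the "usual" (strict) preference function; the criteria, weights, and scores below are invented for illustration, not the paper's data:

```python
def promethee_ii_net_flows(scores, weights, maximize):
    """Net outranking flow per alternative (higher = more preferable).

    scores[i][k]: performance of alternative i on criterion k
    weights[k]:   criterion weight (should sum to 1)
    maximize[k]:  True if higher is better on criterion k
    Uses the 'usual' preference function: P = 1 if strictly better."""
    n = len(scores)
    net = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            for k, w in enumerate(weights):
                d = scores[i][k] - scores[j][k]
                if not maximize[k]:
                    d = -d
                if d > 0:  # i strictly preferred to j on criterion k
                    net[i] += w / (n - 1)
                    net[j] -= w / (n - 1)
    return net

# Three hypothetical procedures scored on (accuracy, cost, waste volume)
scores = [[0.9, 5.0, 3.0],   # A: accurate but costly and wasteful
          [0.8, 2.0, 1.0],   # B: balanced
          [0.7, 1.0, 2.0]]   # C: cheap but least accurate
weights = [0.5, 0.25, 0.25]
maximize = [True, False, False]   # minimize cost and waste
flows = promethee_ii_net_flows(scores, weights, maximize)
```

    Under these weights procedure B ranks first: it trades a little accuracy for much lower cost and waste. Re-running with different weight vectors is exactly the scenario analysis described in the abstract.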

  12. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  13. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  14. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).

  15. A health analytics semantic ETL service for obesity surveillance.

    PubMed

    Poulymenopoulou, M; Papakonstantinou, D; Malamateniou, F; Vassilacopoulos, G

    2015-01-01

    The increasingly large amount of data produced in healthcare (e.g. collected through health information systems such as electronic medical records - EMRs or collected through novel data sources such as personal health records - PHRs, social media, web resources) enables the creation of detailed records about people's health, sentiments and activities (e.g. physical activity, diet, sleep quality) that can be used in the public health area among others. However, despite the transformative potential of big data in public health surveillance, there are several challenges in integrating big data. In this paper, the interoperability challenge is tackled and a semantic Extract Transform Load (ETL) service is proposed that seeks to semantically annotate big data, turning them into valuable data for analysis. This service is considered part of a health analytics engine on the cloud that interacts with existing healthcare information exchange networks, like the Integrating the Healthcare Enterprise (IHE), PHRs, sensors, mobile applications, and other web resources to retrieve patient health, behavioral and daily activity data. The semantic ETL service aims at semantically integrating big data for use by analytic mechanisms. An illustrative implementation of the service on big data potentially relevant to human obesity enables the use of appropriate analytic techniques (e.g. machine learning, text mining) that are expected to assist in identifying patterns and contributing factors (e.g. genetic background, social, environmental) for this social phenomenon and, hence, to drive health policy changes and promote healthy behaviors where residents live, work, learn, shop and play.
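
    The extract-transform-load flow with semantic annotation can be sketched as below. The toy ontology and field names are assumptions made for illustration only; a real service would map terms to codes from standard vocabularies (e.g. SNOMED CT) and operate on payloads exchanged via IHE networks.

```python
# Illustrative mini-ontology mapping raw PHR field names to semantic
# concepts; real systems would use standard terminology codes.
ONTOLOGY = {
    "bmi": "obesity-indicator",
    "steps": "physical-activity",
    "calories": "diet",
}

def extract(records):
    """Extract: yield raw key/value observations from source records."""
    for record in records:
        yield from record.items()

def transform(observations):
    """Transform: attach a semantic tag to each recognized observation."""
    for key, value in observations:
        tag = ONTOLOGY.get(key)
        if tag is not None:
            yield {"concept": tag, "field": key, "value": value}

def load(annotated, store):
    """Load: append annotated observations to the analytic store."""
    store.extend(annotated)

store = []
phr_records = [{"bmi": 31.2, "steps": 4200, "mood": "ok"}]
load(transform(extract(phr_records)), store)
print(store[0]["concept"])  # prints "obesity-indicator"
```

    Unrecognized fields (here, "mood") are dropped; a production ETL service would instead queue them for curation so the ontology can grow.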

  16. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    PubMed Central

    Shirts, Brian H.; Jackson, Brian R.; Baird, Geoffrey S.; Baron, Jason M.; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R.; Terrazas, Enrique; Brimhall, Brad

    2015-01-01

    The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and “meaningful use.” The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with its annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to have an open forum of leaders who work with the “big data” that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed. PMID:25774320

  17. Analytical formulation of selected activities of the remote manipulator system

    NASA Technical Reports Server (NTRS)

    Zimmerman, K. J.

    1977-01-01

    Existing analyses of Orbiter-RMS-payload kinematics were surveyed, including equations dealing with two-body kinematics in the presence of a massless RMS, and explicit analytical solutions were compared with numerical solutions. Numerical demonstration problems are provided for the following operational phases of the RMS: (1) payload capture; (2) payload stowage in and removal from the cargo bay; and (3) payload deployment. The equations of motion provided account for RMS control forces and torque moments and could be extended to RMS flexibility and control-loop simulation without increasing the degrees of freedom of the two-body system.

  18. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    PubMed

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  19. Discourse-Centric Learning Analytics: Mapping the Terrain

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    There is an increasing interest in developing learning analytic techniques for the analysis and support of high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and proposing a definition for the field moving forwards. It is our claim that DCLA…

  20. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  1. Approximate analytical solutions in the analysis of elastic structures of complex geometry

    NASA Astrophysics Data System (ADS)

    Goloskokov, Dmitriy P.; Matrosov, Alexander V.

    2018-05-01

    A method of analytical decomposition for the analysis of plane structures of complex configuration is presented. For each part of the structure in the form of a rectangle, all the components of the stress-strain state are constructed by the superposition method. The method is based on two solutions derived in the form of trigonometric series with unknown coefficients using the method of initial functions. The coefficients are determined from the system of linear algebraic equations obtained by satisfying the boundary conditions and the conditions for joining the structure parts. The components of the stress-strain state of a bent plate with holes are calculated using the analytical decomposition method.

  2. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-06

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica of various graftings and related column parameters, such as particle size, core-shell and monolithic formats, was studied. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore, the strategy of shortening analysis by increasing the flow rate caused a decrease in efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected, which showed the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, a concentration of 0.5 mg/mL of each statin was found to be the highest that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Analytical approximations to the dynamics of an array of coupled DC SQUIDs

    NASA Astrophysics Data System (ADS)

    Berggren, Susan; Palacios, Antonio

    2014-04-01

    Coupled dynamical systems that operate near the onset of a bifurcation can lead, under certain conditions, to strong signal amplification effects. Over the past years we have studied this generic feature on a wide range of systems, including: magnetic and electric field sensors, gyroscopic devices, and arrays of loops of superconducting quantum interference devices, also known as SQUIDs. In this work, we consider an array of SQUID loops connected in series as a case study to derive asymptotic analytical approximations to the exact solutions through perturbation analysis. Two approaches are considered. First, a straightforward expansion in which the non-linear parameter related to the inductance of the DC SQUID is treated as the small perturbation parameter. Second, a more accurate procedure that considers the SQUID phase dynamics as non-uniform motion on a circle. This second procedure is readily extended to the series array and it could serve as a mathematical framework to find approximate solutions to related complex systems with high dimensionality. To the best of our knowledge, approximate analytical solutions to an array of SQUIDs have not yet been reported in the literature.
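
    The first of the two approaches, the straightforward expansion in the small inductance parameter, has the schematic form below (notation is illustrative; the abstract does not reproduce the equations):

```latex
\varphi(\tau) \;=\; \varphi^{(0)}(\tau) \;+\; \beta\,\varphi^{(1)}(\tau)
\;+\; \beta^{2}\varphi^{(2)}(\tau) \;+\; \mathcal{O}(\beta^{3})
```

    Substituting this series into the SQUID phase equations and collecting powers of the small parameter \(\beta\) yields a linear problem for each correction \(\varphi^{(k)}\), which is what makes a closed-form approximation tractable.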

  4. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
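
    The core of the paradigm, an analytic that emits meaningful partial results a visualization layer can render incrementally, can be sketched with a Python generator. This is a schematic example, not the Progressive Insights implementation:

```python
# Progressive analytic sketch: yield a partial result after every
# chunk, so the UI can render it and the analyst can steer or stop.
def progressive_mean(stream, chunk_size=2):
    """Yield (items_seen, running_mean) after every processed chunk."""
    total, count = 0.0, 0
    chunk = []
    for x in stream:
        chunk.append(x)
        if len(chunk) == chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk.clear()
            yield count, total / count  # partial result
    if chunk:  # flush the final, possibly short, chunk
        total += sum(chunk)
        count += len(chunk)
        yield count, total / count

partials = list(progressive_mean([2, 4, 6, 8, 10]))
print(partials)  # [(2, 3.0), (4, 5.0), (5, 6.0)]
```

    Because the generator is lazy, the consumer can abandon it mid-run (re-prioritizing a subspace of interest) without paying for the full computation.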

  5. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    PubMed

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tend to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in the developing country, Saudi Arabia. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  6. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    PubMed

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performances, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.
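
    Quantitative PT rounds are typically scored with z-scores against an assigned value, using the conventional ISO 13528 cutoffs. The sketch below illustrates that convention; the exact scoring rules of the REQUASUD and IPH schemes are not stated in the abstract, and the example numbers are invented:

```python
# Conventional PT scoring (ISO 13528): |z| <= 2 satisfactory,
# 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.
def z_score(result, assigned, sigma_pt):
    """Standardized deviation of a lab result from the assigned value."""
    return (result - assigned) / sigma_pt

def rating(z):
    """Map a z-score to the conventional PT performance category."""
    if abs(z) <= 2:
        return "satisfactory"
    return "questionable" if abs(z) < 3 else "unsatisfactory"

# e.g. an enumeration round reported in log10 CFU/g
print(rating(z_score(5.9, 5.3, 0.25)))  # z = 2.4 -> "questionable"
```

    Detection (presence/absence) rounds, by contrast, are scored simply as correct or incorrect, which is what the 3.0% failure rate above refers to.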

  7. An explicit closed-form analytical solution for European options under the CGMY model

    NASA Astrophysics Data System (ADS)

    Chen, Wenting; Du, Meiyu; Xu, Xiang

    2017-01-01

    In this paper, we consider the analytical pricing of European path-independent options under the CGMY model, a particular type of pure-jump Lévy process that agrees well with many observed properties of real market data by allowing the diffusions and jumps to have both finite and infinite activity and variation. It is shown that, under this model, the option price is governed by a fractional partial differential equation (FPDE) with both left-side and right-side spatial-fractional derivatives. In comparison to derivatives of integer order, fractional derivatives at a point involve not only properties of the function at that particular point, but also information about the function in a certain subset of the entire domain of definition. This "globalness" of the fractional derivatives adds an additional degree of difficulty when either analytical methods or numerical solutions are attempted. Albeit difficult, we have managed to derive an explicit closed-form analytical solution for European options under the CGMY model. Based on our solution, the asymptotic behaviors of the option price and the put-call parity under the CGMY model are further discussed. Practically, a reliable numerical evaluation technique for the current formula is proposed. With the numerical results, some analyses of the impacts of the four key parameters of the CGMY model on European option prices are also provided.
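
    For reference, the CGMY process named above is defined by its Lévy density, a standard form from the literature (not restated in the abstract):

```latex
\nu(x) \;=\;
\begin{cases}
C\,\dfrac{e^{-G|x|}}{|x|^{1+Y}}, & x < 0,\\[6pt]
C\,\dfrac{e^{-Mx}}{x^{1+Y}}, & x > 0,
\end{cases}
```

    Here \(C > 0\) sets the overall level of activity, \(G, M \ge 0\) temper the left and right tails, and \(Y < 2\) controls the fine structure referred to in the abstract: \(Y < 0\) gives finite activity, \(0 \le Y < 1\) infinite activity with finite variation, and \(1 \le Y < 2\) infinite variation.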

  8. Micro-optics for microfluidic analytical applications.

    PubMed

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  9. Fitting It All In: Adapting a Green Chemistry Extraction Experiment for Inclusion in an Undergraduate Analytical Laboratory

    ERIC Educational Resources Information Center

    Buckley, Heather L.; Beck, Annelise R.; Mulvihill, Martin J.; Douskey, Michelle C.

    2013-01-01

    Several principles of green chemistry are introduced through this experiment designed for use in the undergraduate analytical chemistry laboratory. An established experiment of liquid CO2 extraction of D-limonene has been adapted to include a quantitative analysis by gas chromatography. This facilitates drop-in incorporation of an exciting…

  10. Supramolecular analytical chemistry.

    PubMed

    Anslyn, Eric V

    2007-02-02

    A large fraction of the field of supramolecular chemistry has focused in previous decades upon the study and use of synthetic receptors as a means of mimicking natural receptors. Recently, the demand for synthetic receptors is rapidly increasing within the analytical sciences. These classes of receptors are finding uses in simple indicator chemistry, cellular imaging, and enantiomeric excess analysis, while also being involved in various truly practical assays of bodily fluids. Moreover, one of the most promising areas for the use of synthetic receptors is in the arena of differential sensing. Although many synthetic receptors have been shown to yield exquisite selectivities, in general, this class of receptor suffers from cross-reactivities. Yet, cross-reactivity is an attribute that is crucial to the success of differential sensing schemes. Therefore, both selective and nonselective synthetic receptors are finding uses in analytical applications. Hence, a field of chemistry that herein is entitled "Supramolecular Analytical Chemistry" is emerging, and is predicted to undergo increasingly rapid growth in the near future.

  11. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  12. The analytical approach to optimization of active region structure of quantum dot laser

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Omelchenko, A. V.; Maximov, M. V.

    2014-10-01

    Using the analytical approach introduced in our previous papers, we analyse the possibilities of optimizing the size and structure of the active region of semiconductor quantum dot lasers emitting via ground-state optical transitions. It is shown that there are optimal values of the cavity length, dispersion and number of QD layers in the laser active region which allow one to obtain a lasing spectrum of a given width at a minimum injection current. The laser efficiency corresponding to the injection current optimized with respect to the cavity length is practically equal to its maximum value.

  13. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  14. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
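
    The capture-and-count principle described above leads to the standard thin-sample relation for the net peak count of a given gamma line (a textbook form, given here for orientation):

```latex
C \;=\; \frac{m\,N_A\,\theta}{M}\,\sigma_{\gamma}\,\Phi\,\varepsilon\,t
```

    where \(m\) is the sample mass, \(N_A\) Avogadro's number, \(\theta\) the isotopic abundance, \(M\) the molar mass, \(\sigma_{\gamma}\) the partial neutron-capture cross section for the gamma line, \(\Phi\) the neutron flux, \(\varepsilon\) the detection efficiency, and \(t\) the counting time. The linear dependence on \(\sigma_{\gamma}\) is why elements with high capture cross sections (B, Cd, Nd, Sm, Gd) are detectable at trace levels.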

  15. Semi Active Control of Civil Structures, Analytical and Numerical Studies

    NASA Astrophysics Data System (ADS)

    Kerboua, M.; Benguediab, M.; Megnounif, A.; Benrahou, K. H.; Kaoulala, F.

    Structural control for civil structures was born out of a need to provide safer and more efficient designs given the reality of limited resources. The purpose of structural control is to absorb and to reflect the energy introduced by dynamic loads such as winds, waves, earthquakes, and traffic. Today, the protection of civil structures from severe dynamic loading is typically achieved by allowing the structures to be damaged. Semi-active control devices, also called "smart" control devices, combine the positive aspects of both passive and active control devices. A semi-active control strategy is similar to an active control strategy; here, however, the control actuator does not directly apply force to the structure, but instead is used to control the properties of a passive energy device, a controllable passive damper. Semi-active control strategies can be used in many of the same civil applications as passive and active control. One method of operating smart cable dampers is in a purely passive capacity, supplying the dampers with a constant optimal voltage. The advantages of this strategy are the relative simplicity of implementation compared to a smart or active control strategy, and that the dampers are more easily optimally tuned in-place, eliminating the need to have passive dampers with unique optimal damping coefficients. This research investigated semi-active control of civil structures for natural hazard mitigation. The research has two components, the seismic protection of buildings and the mitigation of wind-induced vibration in structures. An idealized semi-active equation of motion for a composite beam, consisting of a cantilever beam bonded with a PZT patch, was derived using Hamilton's principle and Galerkin's method. Series R-L and parallel R-L shunt circuits are coupled into the equation of motion, respectively, by means of the constitutive relation of piezoelectric material and Kirchhoff's law to control the beam vibration.

  16. Study of dual radio frequency capacitively coupled plasma: an analytical treatment matched to an experiment

    NASA Astrophysics Data System (ADS)

    Saikia, P.; Bhuyan, H.; Escalona, M.; Favre, M.; Wyndham, E.; Maze, J.; Schulze, J.

    2018-01-01

    The behavior of a dual frequency capacitively coupled plasma (2f CCP) driven by 2.26 and 13.56 MHz radio frequency (rf) sources is investigated using an approach that integrates a theoretical model and experimental data. The basis of the theoretical analysis is a time-dependent dual frequency analytical sheath model that relates the instantaneous sheath potential to the plasma parameters. The parameters used in the model are obtained by operating the 2f CCP experiment (2.26 MHz + 13.56 MHz) in argon at a working pressure of 50 mTorr. Experimentally measured plasma parameters such as the electron density, electron temperature, as well as the rf current density ratios are the inputs of the theoretical model. Subsequently, a convenient analytical solution for the sheath potential and sheath thickness was derived. The present numerical results are compared with those obtained in another 2f CCP experiment conducted by Semmler et al (2007 Plasma Sources Sci. Technol. 16 839), and a good quantitative correspondence is obtained. The numerical solution shows the variation of the sheath potential with the low frequency and high frequency (HF) rf powers. In the low pressure plasma, the sheath potential is a qualitative measure of the DC self-bias, which in turn determines the ion energy. Thus, using this analytical model, the measured values of the DC self-bias as a function of the low and HF rf powers are explained in detail.

  17. Active Control of Fan Noise: Feasibility Study. Volume 6; Theoretical Analysis for Coupling of Active Noise Control Actuator Ring Sources to an Annular Duct with Flow

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.

    1996-01-01

    The objective of this effort is to develop an analytical model for the coupling of active noise control (ANC) piston-type actuators that are mounted flush to the inner and outer walls of an annular duct to the modes in the duct generated by the actuator motion. The analysis will be used to couple the ANC actuators to the modal analysis propagation computer program for the annular duct, to predict the effects of active suppression of fan-generated engine noise sources. This combined program will then be available to assist in the design or evaluation of ANC systems in fan engine annular exhaust ducts. An analysis has been developed to predict the modes generated in an annular duct due to the coupling of flush-mounted ring actuators on the inner and outer walls of the duct. The analysis has been combined with a previous analysis for the coupling of modes to a cylindrical duct in a FORTRAN computer program to perform the computations. The method includes the effects of uniform mean flow in the duct. The program can be used for design or evaluation purposes for active noise control hardware for turbofan engines. Predictions for some sample cases modeled after the geometry of the NASA Lewis ANC Fan indicate very efficient coupling in both the inlet and exhaust ducts for the m = 6 spinning mode at frequencies where only a single radial mode is cut-on. Radial mode content in higher order cut-off modes at the source plane and the required actuator displacement amplitude to achieve 110 dB SPL levels in the desired mode were predicted. Equivalent cases with and without flow were examined for the cylindrical and annular geometry, and little difference was found for a duct flow Mach number of 0.1. The actuator ring coupling program will be adapted as a subroutine to the cylindrical duct modal analysis and the exhaust duct modal analysis. This will allow the fan source to be defined in terms of characteristic modes at the fan source plane and predict the propagation to the

  18. Significant steps in the evolution of analytical chemistry--is the today's analytical chemistry only chemistry?

    PubMed

    Karayannis, Miltiades I; Efstathiou, Constantinos E

    2012-12-15

    In this review the history of chemistry, and specifically the history and significant steps in the evolution of analytical chemistry, are presented. In chronological time spans covering the ancient world, the middle ages, the 19th century, and the three evolutionary periods from the verge of the 19th century to contemporary times, information is given on the progress of chemistry and analytical chemistry. During this period, analytical chemistry moved gradually from its purely empirical nature to more rational scientific activities, transforming itself into an autonomous branch of chemistry and a separate discipline. It is also shown that analytical chemistry moved gradually from exclusively serving chemical science towards serving the environment, health, law, almost all areas of science and technology, and society overall. Some recommendations are also directed to analytical chemistry educators, concerning the indispensable nature of knowledge of classical analytical chemistry and the associated laboratory exercises, and to analysts in general, on why it is important to use chemical knowledge to make measurements on problems of everyday life. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Tank 241-AN-101, grab samples, 1AN-98-1, 1AN-98-2 and 1AN-98-3 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FULLER, R.K.

    1999-02-24

This document is the final report for tank 241-AN-101 grab samples. Three grab samples, 1AN-98-1, 1AN-98-2 and 1AN-98-3, were taken from riser 16 of tank 241-AN-101 on April 8, 1998 and received by the 222-S Laboratory on April 9, 1998. Analyses were performed in accordance with the "Compatibility Grab Sampling and Analysis Plan" (TSAP) and the "Data Quality Objectives for Tank Farms Waste Compatibility Program" (DQO). The analytical results are presented in the data summary report. No notification limits were exceeded.

  20. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    PubMed

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity

  1. Integrating Bio-Inorganic and Analytical Chemistry into an Undergraduate Biochemistry Laboratory

    ERIC Educational Resources Information Center

    Erasmus, Daniel J.; Brewer, Sharon E.; Cinel, Bruno

    2015-01-01

    Undergraduate laboratories expose students to a wide variety of topics and techniques in a limited amount of time. This can be a challenge and lead to less exposure to concepts and activities in bio-inorganic chemistry and analytical chemistry that are closely-related to biochemistry. To address this, we incorporated a new iron determination by…

  2. Frequency-independent radiation modes of interior sound radiation: An analytical study

    NASA Astrophysics Data System (ADS)

    Hesse, C.; Vivar Perez, J. M.; Sinapius, M.

    2017-03-01

    Global active control methods of sound radiation into acoustic cavities necessitate the formulation of the interior sound field in terms of the surrounding structural velocity. This paper proposes an efficient approach to do this by presenting an analytical method to describe the radiation modes of interior sound radiation. The method requires no knowledge of the structural modal properties, which are often difficult to obtain in control applications. The procedure is exemplified for two generic systems of fluid-structure interaction, namely a rectangular plate coupled to a cuboid cavity and a hollow cylinder with the fluid in its enclosed cavity. The radiation modes are described as a subset of the acoustic eigenvectors on the structural-acoustic interface. For the two studied systems, they are therefore independent of frequency.
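For the first of the two generic systems above, the acoustic eigenproblem has a closed form. A minimal sketch, under the standard rigid-wall assumption and with illustrative cavity dimensions (not taken from the paper), lists the lowest cuboid-cavity eigenfrequencies f_lmn = (c/2)·√((l/Lx)² + (m/Ly)² + (n/Lz)²):

```python
import math
from itertools import product

# Hedged sketch: rigid-wall acoustic eigenfrequencies of a cuboid cavity.
# Dimensions and sound speed are illustrative assumptions.
C = 343.0                   # speed of sound in air, m/s
LX, LY, LZ = 1.2, 0.8, 0.5  # cavity edge lengths in metres (hypothetical)

def cavity_modes(n_max=3):
    """Return eigenfrequencies f_lmn = (c/2)*sqrt(sum (n_i/L_i)^2), sorted."""
    modes = []
    for l, m, n in product(range(n_max + 1), repeat=3):
        if (l, m, n) == (0, 0, 0):
            continue  # skip the constant (zero-frequency) solution
        f = 0.5 * C * math.sqrt((l / LX) ** 2 + (m / LY) ** 2 + (n / LZ) ** 2)
        modes.append(((l, m, n), f))
    return sorted(modes, key=lambda t: t[1])

for (l, m, n), f in cavity_modes()[:5]:
    print(f"mode ({l},{m},{n}): {f:.1f} Hz")
```

The acoustic eigenvectors evaluated on the structural-acoustic interface, a subset of which forms the radiation modes in the paper's formulation, belong to these eigenfrequencies.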

  3. Task Analytic Models to Guide Analysis and Design: Use of the Operator Function Model to Represent Pilot-Autoflight System Mode Problems

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Mitchell, Christine M.; Chappell, Alan R.; Shafto, Mike (Technical Monitor)

    1995-01-01

Task-analytic models structure essential information about operator interaction with complex systems, in this case pilot interaction with the autoflight system. Such models serve two purposes: (1) they allow researchers and practitioners to understand pilots' actions; and (2) they provide a compact, computational representation needed to design 'intelligent' aids, e.g., displays, assistants, and training systems. This paper demonstrates the use of the operator function model to trace the process of mode engagements while a pilot is controlling an aircraft via the autoflight system. The operator function model is a normative and nondeterministic model of how a well-trained, well-motivated operator manages multiple concurrent activities for effective real-time control. For each function, the model links the pilot's actions with the required information. Using the operator function model, this paper describes several mode engagement scenarios. These scenarios were observed and documented during a field study that focused on mode engagements and mode transitions during normal line operations. Data including time, ATC clearances, altitude, system states, active modes and sub-modes, and mode engagements were recorded during sixty-six flights. Using these data, seven prototypical mode engagement scenarios were extracted. One scenario details the decision of the crew to disengage a fully automatic mode in favor of a semi-automatic mode, and the consequences of this action. Another describes a mode error involving updating aircraft speed following the engagement of a speed submode. Other scenarios detail mode confusion at various phases of the flight. This analysis uses the operator function model to identify three aspects of mode engagement: (1) the progress of pilot-aircraft-autoflight system interaction; (2) control/display information required to perform mode management activities; and (3) the potential cause(s) of mode confusion.

  4. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytically integrable dynamical systems. In more detail, we prove that any complete analytically integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having no eigenvalue of modulus 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. We also prove that any complete analytically integrable differential system x˙=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytically integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytically integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692 and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  5. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
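The Map-Reduce paradigm used as the running illustration above can be sketched in a few lines: a map phase emits intermediate key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The word-count example below is the canonical illustration, not anything specific to the paper.

```python
from collections import defaultdict

# Minimal sketch of the Map-Reduce paradigm: word count in three phases.
def map_phase(records):
    for record in records:
        for word in record.split():
            yield (word, 1)             # emit intermediate key-value pairs

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:            # group values by intermediate key
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

data = ["cloud data analytics", "data analytics in the cloud"]
counts = reduce_phase(shuffle(map_phase(data)))
print(counts)
```

In a real cluster the map and reduce phases run in parallel across workers, which is precisely what makes their placement and data locality the management problems the paper studies.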

  6. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
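The uncertainty bookkeeping in the abstract follows the usual quadrature rule: the total coefficient of variation combines the pre-analytical and analytical components, and the 95% interval is taken as ±2·CV_T. A hedged sketch, with component values chosen from the ranges quoted in the abstract:

```python
import math

# Hedged sketch: combining independent variation components in quadrature.
# The component CVs below are illustrative picks from the abstract's ranges
# (pre-analytical 26-69%, analytical 7-13%), not reported per-analyte values.
def total_cv(cv_preanalytical, cv_analytical):
    """Total CV assuming independent components: sqrt(sum of squares)."""
    return math.sqrt(cv_preanalytical**2 + cv_analytical**2)

cv_pre, cv_anal = 26.0, 13.0            # percent
cv_t = total_cv(cv_pre, cv_anal)
print(f"CV_T = {cv_t:.1f}%, 95% interval = +/-{2 * cv_t:.0f}%")
```

With these picks CV_T is about 29%, matching the lower end of the 29-70% total-variation range quoted in the abstract; the upper end follows similarly from (69%, 7%).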

  7. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects based on their data profile (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present study, data as provided by laboratories through interlaboratory comparisons or proficiency tests are used as an illustrative example. However, the presented scheme is likewise applicable for comparison of analytical methods, or simply as a tool for optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory, and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
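The ordinal comparison at the heart of partial order methodology can be sketched directly: one laboratory ranks at or below another only if every indicator in its data profile is at least as good; otherwise the two are incomparable. The profiles below (|bias from reference|, standard deviation, |skewness|, all "smaller is better") are hypothetical illustrations, not data from the paper.

```python
# Hedged sketch of the dominance relation underlying partial order analysis.
def dominates(a, b):
    """True if profile a is at least as good as b on every indicator."""
    return all(x <= y for x, y in zip(a, b)) and a != b

labs = {
    "Lab1": (0.1, 0.5, 0.2),
    "Lab2": (0.3, 0.9, 0.4),   # worse than Lab1 on every indicator
    "Lab3": (0.05, 1.2, 0.1),  # better bias than Lab1, worse sd: incomparable
}

order = [(a, b) for a in labs for b in labs if dominates(labs[a], labs[b])]
print("comparabilities:", order)
```

Only Lab1 and Lab2 are comparable here; Lab3 is incomparable to both, which is exactly the information a single linear score would destroy.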

  8. De-Identification in Learning Analytics

    ERIC Educational Resources Information Center

    Khalila, Mohammad; Ebner, Martin

    2016-01-01

    Learning analytics has reserved its position as an important field in the educational sector. However, the large-scale collection, processing, and analyzing of data has steered the wheel beyond the borders to face an abundance of ethical breaches and constraints. Revealing learners' personal information and attitudes, as well as their activities,…

  9. Development and Evaluation of an Analytical Method for the Determination of Total Atmospheric Mercury. Final Report.

    ERIC Educational Resources Information Center

    Chase, D. L.; And Others

Total mercury in ambient air can be collected in iodine monochloride, but the subsequent analysis is relatively complex and tedious, and contamination from reagents and containers is a problem. A silver wool collector, preceded by a catalytic pyrolysis furnace, gives good recovery of mercury and simplifies the analytical step. An instrumental…

  10. An analytical procedure for the determination of aluminum used in antiperspirants on human skin in Franz™ diffusion cell.

    PubMed

    Guillard, Olivier; Fauconneau, Bernard; Favreau, Frédéric; Marrauld, Annie; Pineau, Alain

    2012-04-01

A local case report of hyperaluminemia (aluminum concentration: 3.88 µmol/L) in a woman using an aluminum-containing antiperspirant for 4 years raises the question of possible transdermal uptake of aluminum salt as a future public health problem. Prior to studying the transdermal uptake of three commercial cosmetic formulas, an analytical assay of aluminum (Al) in chlorohydrate form (ACH) by Zeeman electrothermal atomic absorption spectrophotometry (ZEAAS) in a clean room was optimized and validated. This analysis was performed with different media on human skin using a Franz™ diffusion cell. The detection and quantification limits were set at ≤ 3 µg/L. Precision analysis as within-run (n = 12) and between-run (n = 15-68 days) yielded CV ≤ 6%. The high analytical sensitivity (2-3 µg/L) and low variability should allow an in vitro study of the transdermal uptake of ACH.

  11. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time-intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
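The principal component analysis step mentioned above can be sketched generically via the singular value decomposition of mean-centred spectra. The tiny synthetic "spectra" below are stand-ins, not DART-TOFMS data, and the example only illustrates the chemometric mechanics, not the paper's actual workflow.

```python
import numpy as np

# Hedged sketch: PCA via SVD on mean-centred rows (samples) x columns
# (mass-channel intensities). The 4x5 matrix is synthetic.
rng = np.random.default_rng(0)
base = np.array([1.0, 0.5, 0.2, 0.8, 0.1])
spectra = np.vstack([base * s + rng.normal(0, 0.01, 5) for s in (1, 2, 3, 4)])

X = spectra - spectra.mean(axis=0)      # mean-centre each variable
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                          # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)         # variance fraction per component
print("explained variance ratios:", np.round(explained, 3))
```

Because the synthetic samples are nearly collinear, the first component captures almost all the variance; with real paint spectra the score plot of the leading components is what supports sample discrimination.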

  12. An analytic model for limiting high density LH transition by the onset of the tertiary instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Raghvendra, E-mail: rsingh129@gmail.com; Jhang, Hogun; Kaang, Helen H.

    2016-07-15

We perform an analytic study of the tertiary instability driven by a strong excitation of zonal flows during high-density low-to-high (LH) mode transition. The drift resistive ballooning mode is assumed to be the dominant edge turbulence driver. The analysis reproduces the main qualitative features of early computational results [Rogers and Drake, Phys. Rev. Lett. 81, 4396 (1998); Guzdar et al., Phys. Plasmas 14, 020701 (2007)], as well as new characteristics of the maximum edge density due to the onset of the tertiary instability. An analytical scaling indicates that the density scaling of LH transition power may be determined by the onset condition of the tertiary instability when the operating density approaches the Greenwald density.

  13. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  14. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  15. Optimal design of an activated sludge plant: theoretical analysis

    NASA Astrophysics Data System (ADS)

    Islam, M. A.; Amin, M. S. A.; Hoinkis, J.

    2013-06-01

The design procedure of an activated sludge plant consisting of an activated sludge reactor and settling tank has been theoretically analyzed assuming that (1) the Monod equation completely describes the growth kinetics of the microorganisms degrading biodegradable pollutants and (2) the settling characteristics are fully described by a power law. For a given reactor height, the design parameter of the reactor (reactor volume) reduces to the reactor area. The sum total area of the reactor and the settling tank is then expressed as a function of the activated sludge concentration X and the recycle ratio α. A procedure has been developed to calculate X_opt, for which the total required area of the plant is minimal for a given microbiological system and recycle ratio. Mathematical relations have been derived to calculate the α-range in which X_opt meets the requirements of the F/M ratio. Results of the analysis have been illustrated for varying X and α. Mathematical formulae have been proposed to recalculate the recycle ratio in the event that the influent parameters differ from those assumed in the design.
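The existence of X_opt can be illustrated numerically: for a fixed reactor height the reactor's area contribution falls with the sludge concentration X while the settler's grows, so the sum has an interior minimum. The area functions and constants below are hypothetical placeholders, not the paper's actual Monod and power-law expressions.

```python
# Hedged sketch: a falling reactor-area term plus a rising settler-area term
# yield an interior optimum sludge concentration. All constants hypothetical.
K_REACTOR = 400.0   # reactor area ~ K_REACTOR / X (placeholder form)
K_SETTLER = 2.0     # settler area ~ K_SETTLER * X**N (placeholder power law)
N = 1.5

def total_area(x):
    return K_REACTOR / x + K_SETTLER * x**N

# crude 1-D scan for the minimiser; a real design would use the analytic
# optimum or a proper optimiser
xs = [0.1 + 0.001 * i for i in range(20000)]
x_opt = min(xs, key=total_area)
x_analytic = (K_REACTOR / (N * K_SETTLER)) ** (1.0 / (N + 1.0))
print(f"X_opt (scan) = {x_opt:.2f}, analytic = {x_analytic:.2f}")
```

Setting the derivative of the placeholder total area to zero gives X_opt = (K_reactor/(N·K_settler))^(1/(N+1)), which the scan reproduces.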

  16. Nonlinear analysis of switched semi-active controlled systems

    NASA Astrophysics Data System (ADS)

    Eslaminasab, Nima; Vahid A., Orang; Golnaraghi, Farid

    2011-02-01

Semi-active systems improve the suspension performance of vehicles more effectively than conventional passive systems by simultaneously improving ride comfort and road handling. Also, because of size, weight, price and performance advantages, they have gained more interest than active as well as passive systems. Probably the most neglected aspect of semi-active on-off control systems and strategies is the effect of the added nonlinearities of those systems, which are introduced and analysed in this paper. To do so, numerical techniques, the analytical method of averaging and experimental analysis are deployed. In this paper, a new method to analyse, calculate and compare the performance of semi-active controlled systems is proposed; further, a new controller based on observations of actual test data is proposed to eliminate the adverse effects of the added nonlinearities. The significance of the proposed system is the simplicity of the algorithm and its ease of implementation. In fact, this new semi-active control strategy could easily be adopted and used with most existing semi-active control systems.
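A classic on-off semi-active strategy of the kind analysed in such work is the skyhook switching law: the damper is set hard when it dissipates energy from the body motion and soft otherwise. This is a hedged illustration of the general strategy, not the paper's specific controller; the damping values are arbitrary assumptions. The hard switch between two damping states is exactly the kind of added nonlinearity the abstract warns about.

```python
# Hedged sketch of an on-off skyhook semi-active damping law.
C_HIGH, C_LOW = 1500.0, 300.0   # N*s/m, hypothetical damper settings

def skyhook_on_off(v_body, v_rel):
    """Return the damping coefficient: high when the damper force opposes
    the absolute body velocity (v_body * v_rel > 0), otherwise low."""
    return C_HIGH if v_body * v_rel > 0 else C_LOW

# the damper force then follows as F = -c * v_rel
print(skyhook_on_off(0.2, 0.1))    # dissipating body energy -> high damping
print(skyhook_on_off(0.2, -0.1))   # would feed energy in -> low damping
```

Each switch injects a force discontinuity; analysing its effect is what motivates the averaging and experimental study in the paper.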

  17. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education

    PubMed Central

    Nilsson, Gunnar; Zary, Nabil

    2014-01-01

Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data.

  18. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data.

  19. Monitoring by forward scatter radar techniques: an improved second-order analytical model

    NASA Astrophysics Data System (ADS)

    Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco

    2017-10-01

    In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. An improved second-order model is compared with a simplified one already proposed by the authors and based on a paraxial approximation. A phase error analysis is carried out to investigate benefits and limitations of the second-order modeling. The results are validated by developing full-wave numerical simulations implementing the relevant scattering problem on a commercial tool.
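The benefit of the second-order term can be illustrated with a generic Taylor expansion of a propagation distance: √(R² + x²) ≈ R + x²/(2R) at first (paraxial) order, with the next term −x⁴/(8R³) at second order. The geometry values below are illustrative, not taken from the paper's radar configuration.

```python
import math

# Hedged sketch: residual error of the paraxial vs. second-order expansion
# of the path length sqrt(R^2 + x^2). Values are hypothetical.
R = 100.0          # baseline-related distance, m
x = 30.0           # cross-range offset, m

exact = math.sqrt(R**2 + x**2)
first_order = R + x**2 / (2 * R)                  # paraxial approximation
second_order = first_order - x**4 / (8 * R**3)    # next Taylor term

err1 = abs(exact - first_order)
err2 = abs(exact - second_order)
print(f"paraxial error {err1:.4f} m, second-order error {err2:.6f} m")
```

Even at a moderate offset (x/R = 0.3) the second-order term shrinks the residual path-length error by more than an order of magnitude, which translates directly into a smaller phase error at radar wavelengths.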

  20. Integrating bioassays and analytical chemistry as an improved approach to support safety assessment of food contact materials.

    PubMed

    Veyrand, Julien; Marin-Kuan, Maricel; Bezencon, Claudine; Frank, Nancy; Guérin, Violaine; Koster, Sander; Latado, Hélia; Mollergues, Julie; Patin, Amaury; Piguet, Dominique; Serrant, Patrick; Varela, Jesus; Schilter, Benoît

    2017-10-01

    Food contact materials (FCM) contain chemicals which can migrate into food and result in human exposure. Although it is mandatory to ensure that migration does not endanger human health, there is still no consensus on how to pragmatically assess the safety of FCM since traditional approaches would require extensive toxicological and analytical testing which are expensive and time consuming. Recently, the combination of bioassays, analytical chemistry and risk assessment has been promoted as a new paradigm to identify toxicologically relevant molecules and address safety issues. However, there has been debate on the actual value of bioassays in that framework. In the present work, a FCM anticipated to release the endocrine active chemical 4-nonyphenol (4NP) was used as a model. In a migration study, the leaching of 4NP was confirmed by LC-MS/MS and GC-MS. This was correlated with an increase in both estrogenic and anti-androgenic activities as measured with bioassays. A standard risk assessment indicated that according to the food intake scenario applied, the level of 4NP measured was lower, close or slightly above the acceptable daily intake. Altogether these results show that bioassays could reveal the presence of an endocrine active chemical in a real-case FCM migration study. The levels reported were relevant for safety assessment. In addition, this work also highlighted that bioactivity measured in migrate does not necessarily represent a safety issue. In conclusion, together with analytics, bioassays contribute to identify toxicologically relevant molecules leaching from FCM and enable improved safety assessment.
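The "standard risk assessment" step compares an estimated daily intake (EDI) of the migrant against a health-based guidance value. A minimal sketch of that comparison follows; the concentration, intake scenario, and guidance value are hypothetical placeholders, not the study's figures.

```python
def estimated_daily_intake(conc_mg_per_kg_food, food_intake_kg, body_weight_kg):
    """EDI in mg per kg body weight per day for a migrant present
    in food at a given concentration (standard intake-scenario algebra)."""
    return conc_mg_per_kg_food * food_intake_kg / body_weight_kg

# Illustrative, assumed numbers: 0.05 mg/kg 4NP in food, a default scenario
# of 1 kg food per day for a 60 kg person; the guidance value is hypothetical.
edi = estimated_daily_intake(0.05, 1.0, 60.0)
guidance_value = 0.005  # mg/kg bw/day, assumed for illustration only
exceeds = edi > guidance_value
```

Whether the measured migration is "lower, close or slightly above" the acceptable intake thus hinges directly on the food-intake scenario chosen, which is why the abstract conditions its conclusion on that scenario.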

  1. INTEGRATING BIOANALYTICAL CAPABILITY IN AN ENVIRONMENTAL ANALYTICAL LABORATORY

    EPA Science Inventory

    The product is a book chapter which is an introductory and summary chapter for the reference work "Immunoassays and Other Bioanalytical Techniques" to be published by CRC Press, Taylor and Francis Books. The chapter provides analytical chemists information on new techni...

  2. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

    Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high-level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing the hypotheses and techniques of social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question of how influence and attitude (sometimes interpreted as intent) of adversarial actors can be formulated and computed algorithmically, as a higher-level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both the social science and computing fields, as well as the mathematical and computational formulation for the specific context of the situation to be analyzed. The study of 'influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or 'attitude' in social theory terminology, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so their 'impact' and 'threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. A preliminary set of algorithms are developed and tested using the Global Terrorism

  3. An analytic model for footprint dispersions and its application to mission design

    NASA Technical Reports Server (NTRS)

    Rao, J. R. Jagannatha; Chen, Yi-Chao

    1992-01-01

    This is the final report on our recent research activities that are complementary to those conducted by our colleagues, Professor Farrokh Mistree and students, in the context of the Taguchi method. We have studied the mathematical model that forms the basis of the Simulation and Optimization of Rocket Trajectories (SORT) program and developed an analytic method for determining mission reliability with a reduced number of flight simulations. This method can be incorporated in a design algorithm to mathematically optimize different performance measures of a mission, thus leading to a robust and easy-to-use methodology for mission planning and design.

  4. How we implemented an analytical support clinic to strengthen student research capacity in Zambia.

    PubMed

    Andrews, Ben; Musonda, Patrick; Simuyemba, Moses; Wilson, Craig M; Nzala, Selestine; Vermund, Sten H; Michelo, Charles

    2014-12-11

    Background: Research outputs in sub-Saharan Africa may be limited by a scarcity of clinical research expertise. In Zambia, clinical and biomedical postgraduate students are often delayed in graduation due to challenges in completing their research dissertations. We sought to strengthen institutional research capacity by supporting student and faculty researchers through weekly epidemiology and biostatistics clinics. Methods: We instituted a weekly Analytical Support Clinic at the University of Zambia, School of Medicine. A combination of biostatisticians, clinical researchers and epidemiologists meets weekly with clients to address questions of proposal development, data management and analysis. Clinic sign-in sheets were reviewed. Results: 109 students and faculty members accounted for 197 visits to the Clinic. Nearly all clients (107/109, 98.2%) were undergraduate or postgraduate students. Reasons for attending the Clinic were primarily proposal development (46.7%) and data management/analysis (42.1%). The most common specific reasons for seeking help were data analysis and interpretation (36.5%), development of study design and research questions (26.9%) and sample size calculation (21.8%). Conclusions: The Analytical Support Clinic is an important vehicle for strengthening postgraduate research through one-on-one and small-group demand-driven interactions. The clinic approach supplements mentorship from departmental supervisors, providing specific expertise and contextual teaching.

  5. An Analytical State Transition Matrix for Orbits Perturbed by an Oblate Spheroid

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical state transition matrix and its inverse, which include the short period and secular effects of the second zonal harmonic, were developed from the nonsingular PS satellite theory. The fact that the independent variable in the PS theory is not time is in no respect disadvantageous, since any explicit analytical solution must be expressed in the true or eccentric anomaly. This is shown to be the case for the simple conic matrix. The PS theory allows for a concise, accurate, and algorithmically simple state transition matrix. The improvement over the conic matrix ranges from 2 to 4 digits accuracy.

  6. Neutron activation and other analytical data for plutonic rocks from North America and Africa. National Uranium Resource Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, V.; Fay, W.M.; Cook, J.R.

    1982-09-01

    The objective of this report is to retrieve the elements of an analytical study of granites and other associated plutonic rocks which was begun as a part of the U.S. Department of Energy's National Uranium Resource Evaluation (NURE) program. A discussion of the Savannah River Laboratory (SRL) neutron activation analysis system is given so that a user will understand the limitations of the data. Enough information is given so that an experienced geochemist can clean up the data set to the extent required by any project. The data are generally good as they are presented. It is intended that the data be read from a magnetic tape written to accompany this report. Microfiche tables of the data follow the text. These tables were prepared from data on the tape, and programs which will read the tape are presented in the section THE DATA TAPE. It is our intent to write a later paper which will include a thoroughly scrubbed data set and a technical discussion of results of the study. 1 figure.

  7. Meta-analysis of workplace physical activity interventions.

    PubMed

    Conn, Vicki S; Hafdahl, Adam R; Cooper, Pamela S; Brown, Lori M; Lusk, Sally L

    2009-10-01

    Most adults do not achieve adequate physical activity levels. Despite the potential benefits of worksite health promotion, no previous comprehensive meta-analysis has summarized health and physical activity behavior outcomes from such programs. This comprehensive meta-analysis integrated the extant wide range of worksite physical activity intervention research. Extensive searching located published and unpublished intervention studies reported from 1969 through 2007. Results were coded from primary studies. Random-effects meta-analytic procedures, including moderator analyses, were completed in 2008. Effects on most variables were substantially heterogeneous because diverse studies were included. Standardized mean difference (d) effect sizes were synthesized across approximately 38,231 subjects. Significantly positive effects were observed for physical activity behavior (0.21); fitness (0.57); lipids (0.13); anthropometric measures (0.08); work attendance (0.19); and job stress (0.33). The significant effect size for diabetes risk (0.98) is less robust given small sample sizes. The mean effect size for fitness corresponds to a difference between treatment minus control subjects' means on VO2max of 3.5 mL/kg/min; for lipids, -0.2 on the ratio of total cholesterol to high-density lipoprotein; and for diabetes risk, -12.6 mg/dL on fasting glucose. These findings document that some workplace physical activity interventions can improve both health and important worksite outcomes. Effects were variable for most outcomes, reflecting the diversity of primary studies. Future primary research should compare interventions to confirm causal relationships and further explore heterogeneity.
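The standardized mean difference (d) used above can be reproduced from group summary statistics. In this sketch the group means and standard deviations are hypothetical, chosen so that a 3.5 mL/kg/min VO2max difference yields d ≈ 0.57, consistent with the correspondence stated in the abstract.

```python
import math

def standardized_mean_difference(m_t, sd_t, n_t, m_c, sd_c, n_c):
    """Cohen's d from treatment and control summary statistics,
    using the pooled standard deviation."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (m_t - m_c) / sd_pooled

# Hypothetical VO2max summaries (mL/kg/min): 3.5-unit raw difference,
# pooled SD near 6.1, giving d close to the reported fitness effect of 0.57.
d = standardized_mean_difference(38.5, 6.0, 50, 35.0, 6.2, 50)
```

In a random-effects synthesis, each study's d would then be weighted by the inverse of its within-study plus between-study variance; the sketch covers only the per-study effect size.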

  8. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    PubMed

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.

  9. Increasing Impact of Coursework Through Deep Analytics

    NASA Astrophysics Data System (ADS)

    Horodyskyj, L.; Schonstein, D.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2014-12-01

    Over the past few years, ASU has developed the online astrobiology lab course Habitable Worlds, which has been offered to over 1,500 students over seven semesters. The course is offered through Smart Sparrow's intelligent tutoring system, which records student answers, time on question, simulation setups, and additional data that we refer to as "analytics". As the development of the course has stabilized, we have been able to devote more time to analyzing these data, extracting patterns of student behavior and how they have changed as the course has developed. During the most recent two semesters, pre- and post-tests of content knowledge related to the greenhouse effect were administered to assess changes in students' knowledge. The results of the Fall 2013 content assessment and an analysis of each step of every activity using the course platform analytics were used to identify problematic concepts and lesson elements, which were redesigned for the following semester. We observed a statistically significant improvement from pre to post instruction in Spring 2014. Preliminary results seem to indicate that several interactive activities, which replaced written/spoken content, contributed to this positive outcome. Our study demonstrates the benefit of deep analytics for thorough analysis of student results and quick iteration, allowing for significantly improved exercises to be redeployed quickly. The misconceptions that students have and retain depend on the individual student, although certain patterns do emerge in the class as a whole. These patterns can be seen in student discussion board behavior, the types of answers they submit, and the patterns of mistakes they make. By interrogating this wealth of data, we seek to identify the patterns that outstanding, struggling, and failing students display and how early in the class these patterns can be detected. If these patterns can be identified and detected early in the semester, instructors can intervene earlier

  10. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    An analytical and experimental evaluation was conducted of line-focusing Fresnel lenses with application potential in the 200 to 370 °C range. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  11. Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.

  12. The Analytic Onion: Examining Training Issues from Different Levels of Analysis. Interim Technical Paper for Period July 1989-June 1991.

    ERIC Educational Resources Information Center

    Lamb, Theodore A.; Chin, Keric B. O.

    This paper proposes a conceptual framework based on different levels of analysis using the metaphor of the layers of an onion to help organize and structure thinking on research issues concerning training. It discusses the core of the "analytic onion," the biological level, and seven levels of analysis that surround that core: the individual, the…

  13. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  14. Method and apparatus for detecting an analyte

    DOEpatents

    Allendorf, Mark D [Pleasanton, CA; Hesketh, Peter J [Atlanta, GA

    2011-11-29

    We describe the use of coordination polymers (CP) as coatings on microcantilevers for the detection of chemical analytes. CP exhibit changes in unit cell parameters upon adsorption of analytes, which will induce a stress in a static microcantilever upon which a CP layer is deposited. We also describe fabrication methods for depositing CP layers on surfaces.

  15. An analytical solution to assess the SH seismoelectric response of the vadose zone

    NASA Astrophysics Data System (ADS)

    Monachesi, L. B.; Zyserman, F. I.; Jouniaux, L.

    2018-03-01

    We derive an analytical solution of the seismoelectric conversions generated in the vadose zone, when this region is crossed by a pure shear horizontal (SH) wave. Seismoelectric conversions are induced by electrokinetic effects linked to relative motions between fluid and porous media. The considered model assumes a one-dimensional soil constituted by a single layer on top of a half-space in contact at the water table, and a shearing force located at the earth's surface as the wave source. The water table is an interface expected to induce a seismoelectric interfacial response (IR). The top layer represents a porous rock whose pore space is partially saturated by water and air, while the half-space is completely saturated with water, representing the saturated zone. The analytical expressions for the coseismic fields and the interface responses, both electric and magnetic, are derived by solving Pride's equations with proper boundary conditions. An approximate analytical expression of the solution is also obtained, which is very simple and applicable in a fairly broad set of situations. Hypothetical scenarios are proposed to study and analyse the dependence of the electromagnetic fields on various parameters of the medium. An analysis of the approximate solution is also made together with a comparison to the exact solution. The main result of the present analysis is that the amplitude of the interface response generated at the water table is found to be proportional to the jump in the electric current density, which in turn depends on the saturation contrast, poro-mechanical and electrical properties of the medium and on the amplitude of the solid displacement produced by the source. This result is in agreement with the one numerically obtained by the authors, which has been published in a recent work. We also predict the existence of an interface response located at the surface, and that the electric interface response is several orders of magnitude bigger than

  16. An analytical solution to assess the SH seismoelectric response of the vadose zone

    NASA Astrophysics Data System (ADS)

    Monachesi, L. B.; Zyserman, F. I.; Jouniaux, L.

    2018-06-01

    We derive an analytical solution of the seismoelectric conversions generated in the vadose zone, when this region is crossed by a pure shear horizontal (SH) wave. Seismoelectric conversions are induced by electrokinetic effects linked to relative motions between fluid and porous media. The considered model assumes a 1D soil constituted by a single layer on top of a half-space in contact at the water table, and a shearing force located at the earth's surface as the wave source. The water table is an interface expected to induce a seismoelectric interfacial response (IR). The top layer represents a porous rock in which porous space is partially saturated by water and air, while the half-space is completely saturated with water, representing the saturated zone. The analytical expressions for the coseismic fields and the interface responses, both electric and magnetic, are derived by solving Pride's equations with proper boundary conditions. An approximate analytical expression of the solution is also obtained, which is very simple and applicable in a fairly broad set of situations. Hypothetical scenarios are proposed to study and analyse the dependence of the electromagnetic fields on various parameters of the medium. An analysis of the approximate solution is also made together with a comparison to the exact solution. The main result of the present analysis is that the amplitude of the interface response generated at the water table is found to be proportional to the jump in the electric current density, which in turn depends on the saturation contrast, poro-mechanical and electrical properties of the medium and on the amplitude of the solid displacement produced by the source. This result is in agreement with the one numerically obtained by the authors, which has been published in a recent work. We also predict the existence of an interface response located at the surface, and that the electric interface response is several orders of magnitude bigger than the

  17. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  18. An Analytical Solution for the Impact of Vegetation Changes on Hydrological Partitioning Within the Budyko Framework

    NASA Astrophysics Data System (ADS)

    Zhang, Shulei; Yang, Yuting; McVicar, Tim R.; Yang, Dawen

    2018-01-01

    Vegetation change is a critical factor that profoundly affects the terrestrial water cycle. Here we derive an analytical solution for the impact of vegetation changes on hydrological partitioning within the Budyko framework. This is achieved by deriving an analytical expression between leaf area index (LAI) change and the Budyko land surface parameter (n) change, through the combination of a steady state ecohydrological model with an analytical carbon cost-benefit model for plant rooting depth. Using China where vegetation coverage has experienced dramatic changes over the past two decades as a study case, we quantify the impact of LAI changes on the hydrological partitioning during 1982-2010 and predict the future influence of these changes for the 21st century using climate model projections. Results show that LAI change exhibits an increasing importance on altering hydrological partitioning as climate becomes drier. In semiarid and arid China, increased LAI has led to substantial streamflow reductions over the past three decades (on average -8.5% in 1990s and -11.7% in 2000s compared to the 1980s baseline), and this decreasing trend in streamflow is projected to continue toward the end of this century due to predicted LAI increases. Our result calls for caution regarding the large-scale revegetation activities currently being implemented in arid and semiarid China, which may result in serious future water scarcity issues here. The analytical model developed here is physically based and suitable for simultaneously assessing both vegetation changes and climate change induced changes to streamflow globally.
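The qualitative mechanism (a larger Budyko land-surface parameter n under denser vegetation leaves less water for streamflow) can be sketched with the Choudhury-Yang form of the Budyko curve. The precipitation, potential-evaporation, and n values below are illustrative assumptions, not values from the study.

```python
def budyko_evaporation(P, E0, n):
    """Choudhury-Yang form of the Budyko curve: mean annual evaporation E
    from precipitation P, potential evaporation E0, and land-surface
    parameter n (all water fluxes in mm/yr)."""
    return P * E0 / (P**n + E0**n) ** (1.0 / n)

def streamflow(P, E0, n):
    # Long-term water balance: runoff Q = P - E
    return P - budyko_evaporation(P, E0, n)

# Illustrative semiarid setting (assumed): P = 400 mm/yr, E0 = 1200 mm/yr.
Q_before = streamflow(400.0, 1200.0, 2.0)
Q_after = streamflow(400.0, 1200.0, 2.3)  # greening raises n (assumed shift)
```

Because streamflow in such a dry setting is a small residual of two large fluxes, even a modest increase in n removes a large fraction of the runoff, which is why LAI change matters most as climate becomes drier.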

  19. Different analytical approaches in assessing antibacterial activity and the purity of commercial lysozyme preparations for dairy application.

    PubMed

    Brasca, Milena; Morandi, Stefano; Silvetti, Tiziana; Rosi, Veronica; Cattaneo, Stefano; Pellegrino, Luisa

    2013-05-21

    Hen egg-white lysozyme (LSZ) is currently used in the food industry to limit spoilage by lactic acid bacteria in the production of wine and beer, and to inhibit butyric acid fermentation in hard and extra-hard cheeses (late blowing) caused by the outgrowth of clostridial spores. The aim of this work was to evaluate how the enzyme activity in commercial preparations correlates with the enzyme concentration and can be affected by the presence of process-related impurities. Different analytical approaches, including turbidimetric assay, SDS-PAGE and HPLC, were used to analyse 17 commercial preparations of LSZ marketed in different countries. The HPLC method adopted by ISO allowed the true LSZ concentration to be determined with accuracy. The turbidimetric assay was the most suitable method to evaluate LSZ activity, whereas SDS-PAGE allowed the presence of other egg proteins, which are potential allergens, to be detected. The analytical results showed that the purity of commercially available enzyme preparations can vary significantly, and evidenced the effectiveness of combining different analytical approaches in this type of control.

  20. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
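The degradation of the cosine similarity score with measurement noise, which motivates making the detection threshold a function of the noise level, can be demonstrated with a self-contained toy. This is not the authors' analytical distribution, only a numerical illustration with an assumed sinusoidal reference shape.

```python
import math
import random

def cosine_similarity(a, b):
    # Cosine of the angle between two equal-length signal vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def noisy(shape, sigma):
    # Observation model: reference shape plus additive white Gaussian noise
    return [s + random.gauss(0.0, sigma) for s in shape]

random.seed(0)
reference = [math.sin(2 * math.pi * i / 100) for i in range(100)]

low_noise = cosine_similarity(reference, noisy(reference, 0.1))
high_noise = cosine_similarity(reference, noisy(reference, 1.0))
```

A fixed threshold tuned at low noise would misclassify normal shapes at high noise; an adaptive scheme instead lowers the acceptance threshold as the estimated noise level grows.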

  1. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the bench mark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  2. Neutron Activation Analysis of the Rare Earth Elements (REE) - With Emphasis on Geological Materials

    NASA Astrophysics Data System (ADS)

    Stosch, Heinz-Günter

    2016-08-01

    Neutron activation analysis (NAA) was the analytical method of choice for rare earth element (REE) analysis from the early 1960s through the 1980s. At that time, irradiation facilities were widely available and fairly easily accessible. The development of high-resolution gamma-ray detectors in the mid-1960s eliminated, for many applications, the need for chemical separation of the REE from the matrix material, making NAA a reliable and effective analytical tool. While not as precise as isotope-dilution mass spectrometry, NAA was competitive by being sensitive for the analysis of about half of the rare earths (La, Ce, Nd, Sm, Eu, Tb, Yb, Lu). The development of inductively coupled plasma mass spectrometry since the 1980s, together with the decommissioning of research reactors and the lack of installation of new ones in Europe and North America, has led to the rapid decline of NAA.

  3. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of ¹⁴C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of ¹⁴C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10⁻¹⁵ ¹⁴C/¹²C ratios are obtained. Using a 15-W ¹⁴CO₂ laser, a linear calibration with samples from 10⁻¹⁵ to >1.5 × 10⁻¹² in ¹⁴C/¹²C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real-time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.

  4. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. This trend is attractive for analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimum for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems with independent services can result in a sub-optimal solution at the system level. This paper investigates a technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each of which may be formulated by a different department and solved by a modular analytical service. The results demonstrate that ATC is a promising method in that it offers system-level optimal solutions that scale by exploiting distributed, modular execution while allowing easier management of the problem formulations.
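
    The coordination idea can be sketched on a toy problem (entirely hypothetical; the paper's sub-problems and services are not specified here): a system level cascades targets to two sub-problems, each of which trades its own local preference against the cascaded target, and the consistency penalty weight is increased each pass, a common ATC update scheme.

```python
def subproblem(target, pref, w):
    # Closed-form argmin of (x - pref)^2 + w * (x - target)^2:
    # the sub-problem balances its local preference against the target.
    return (pref + w * target) / (1.0 + w)

def atc_coordinate(prefs, goal, iters=20):
    # Toy ATC loop: the system level cascades targets t to each
    # sub-problem, collects their responses r, and tightens the
    # consistency weight w so responses are driven toward both the
    # local preferences and the system-level goal (here: the
    # responses must sum to `goal`).
    t = [0.0] * len(prefs)
    w = 1.0
    r = list(prefs)
    for _ in range(iters):
        r = [subproblem(ti, ci, w) for ti, ci in zip(t, prefs)]
        gap = (goal - sum(r)) / len(r)
        t = [ri + gap for ri in r]   # re-target to close the system gap
        w *= 2.0                     # increase the consistency penalty
    return r
```

    With fixed penalty weights the loop settles on a compromise; the growing weight is what forces the sub-problem responses to agree with the system target in the limit.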

  5. An audit of the contribution to post-mortem examination diagnosis of individual analyte results obtained from biochemical analysis of the vitreous.

    PubMed

    Mitchell, Rebecca; Charlwood, Cheryl; Thomas, Sunethra Devika; Bellis, Maria; Langlois, Neil E I

    2013-12-01

    Biochemical analysis of the vitreous humor from the eye is an accepted accessory test for post-mortem investigation of cause of death. Modern biochemical analyzers allow testing of a range of analytes from a sample. However, it is not clear which analytes should be requested in order to prevent unnecessary testing (and expense). The mean and standard deviation of the values obtained from analysis of the vitreous humor for sodium, potassium, chloride, osmolality, glucose, ketones (β-hydroxybutyrate), creatinine, urea, calcium, lactate, and ammonia were calculated, from which the contribution of each analyte was reviewed in the context of post-mortem findings and final cause of death. For sodium, 32 cases were regarded as high (more than one standard deviation above the mean), of which 9 contributed to post-mortem diagnosis [drowning (4), heat-related death (2), diabetic hyperglycemia (2), and dehydration (1)], but 25 low values (more than one standard deviation below the mean) made no contribution. For chloride, 29 high values contributed to 4 cases (3 drowning and 1 heat-related), but these were all previously identified by a high sodium level. There were 29 high and 35 low potassium values, none of which contributed to determining the final cause of death. Of 22 high values of creatinine, 12 contributed to a diagnosis of renal failure. From 32 high values of urea, 18 contributed to 16 cases of renal failure (2 associated with diabetic hyperglycemia), 1 heat-related death, and 1 case of dehydration. Osmolarity contributed to 12 cases (5 heat-related, 4 diabetes, 2 renal failure, and 1 dehydration) from 36 high values. There was no contribution from 32 high values and 19 low values of calcium, and there was no contribution from 4 high and 2 low values of ammonia. There were 11 high values of glucose, which contributed to the diagnosis of 6 cases of diabetic hyperglycemia and 21 high ketone levels contributed to 8 cases: 4 diabetic ketosis, 3 hypothermia, 3
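
    The audit's flagging rule (a result more than one standard deviation above or below the mean is "high" or "low") can be sketched as follows; the values below are illustrative, not the study's data:

```python
import statistics

def flag_outliers(values):
    # Flag each result relative to the mean +/- one sample standard
    # deviation, the high/low rule described in the audit.
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return ["high" if v > mean + sd else "low" if v < mean - sd else "normal"
            for v in values]
```

    For example, in a batch of sodium-like values [135, 140, 138, 137, 139, 170], only the last value is flagged "high".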

  6. Analysis of an Air Conditioning Coolant Solution for Metal Contamination Using Atomic Absorption Spectroscopy: An Undergraduate Instrumental Analysis Exercise Simulating an Industrial Assignment

    ERIC Educational Resources Information Center

    Baird, Michael J.

    2004-01-01

    A real-life analytical assignment is presented to students, who examine an air conditioning coolant solution for metal contamination using atomic absorption spectroscopy (AAS). This hands-on access to a real problem exposed the undergraduate students to the mechanism of AAS and promoted participation in a simulated industrial activity.

  7. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  8. Determining passive cooling limits in CPV using an analytical thermal model

    NASA Astrophysics Data System (ADS)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original thermal analytical model aiming to predict the practical limits of passive cooling systems for high concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performances of flat plate cooling systems in natural convection are then derived and discussed.

  9. An analytical and experimental evaluation of shadow shields and their support members

    NASA Technical Reports Server (NTRS)

    Stochl, R. J.; Boyle, R. J.

    1972-01-01

    Experimental tests were performed on a model shadow shield thermal protection system to examine the effect of certain configuration variables. The experimental results were used to verify the ability of an analytical program to predict the shadow shield performance including the shield-support interaction. In general, the analysis (assuming diffuse surfaces) agreed well with the experimental support temperature profiles. The agreement for the shield profiles was not as good. The results demonstrated: (1) shadow shields can be effective in reducing the heat transfer into cryogenic propellant tanks, and (2) the conductive heat transfer through supports can be reduced by selective surface coatings.

  10. Physiogenomic analysis of localized FMRI brain activity in schizophrenia.

    PubMed

    Windemuth, Andreas; Calhoun, Vince D; Pearlson, Godfrey D; Kocherla, Mohan; Jagannathan, Kanchana; Ruaño, Gualberto

    2008-06-01

    The search for genetic factors associated with disease is complicated by the complexity of the biological pathways linking genotype and phenotype. This analytical complexity is particularly concerning in diseases historically lacking reliable diagnostic biological markers, such as schizophrenia and other mental disorders. We investigate the use of functional magnetic resonance imaging (fMRI) as an intermediate phenotype (endophenotype) to identify physiogenomic associations to schizophrenia. We screened 99 subjects: 30 diagnosed with schizophrenia, 13 unaffected relatives of schizophrenia patients, and 56 unrelated controls, for gene polymorphisms associated with fMRI activation patterns at two locations in temporal and frontal lobes previously implicated in schizophrenia. A total of 22 single nucleotide polymorphisms (SNPs) in 15 genes from the dopamine and serotonin neurotransmission pathways were genotyped in all subjects. We identified three SNPs in genes that are significantly associated with fMRI activity. SNPs of the dopamine beta-hydroxylase (DBH) gene and of the dopamine receptor D4 (DRD4) were associated with activity in the temporal and frontal lobes, respectively. One SNP of the serotonin-3A receptor (HTR3A) was associated with temporal lobe activity. The results of this study support the physiogenomic analysis of neuroimaging data to discover associations between genotype and disease-related phenotypes.

  11. Fusion Analytics: A Data Integration System for Public Health and Medical Disaster Response Decision Support

    PubMed Central

    Passman, Dina B.

    2013-01-01

    Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop shop for the web-based data visualizations of multiple real-time data sources within ASPR. The 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending.

  12. Characterization of Compton-scatter imaging with an analytical simulation method

    PubMed Central

    Jones, Kevin C; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V; Chu, James C H

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140–220 keV, and 40–50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min−1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. With the analytical method, real-time tumor tracking may be possible

  13. Characterization of Compton-scatter imaging with an analytical simulation method

    NASA Astrophysics Data System (ADS)

    Jones, Kevin C.; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V.; Chu, James C. H.

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140-220 keV, and 40-50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min-1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. With the analytical method, real-time tumor tracking may be possible

  14. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and of nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and of the unknown samples; the Euclidean distance between the net analyte signal of an unknown sample and those of the calibration samples is then calculated and used as a similarity index. According to this similarity index, a local calibration set is selected individually for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.
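
    As a rough sketch of the NAS-and-selection steps (not the authors' code; the NAS here is computed by a simple orthogonal projection against known interferent spectra, and a local PLS model would then be fit on the selected set):

```python
import numpy as np

def net_analyte_signal(X, interferents):
    # Project each spectrum (row of X) onto the orthogonal complement
    # of the interferent subspace; what remains is the net analyte signal.
    Q, _ = np.linalg.qr(interferents.T)      # orthonormal basis of interferent space
    P = np.eye(X.shape[1]) - Q @ Q.T         # projector onto its orthogonal complement
    return X @ P

def select_local_set(nas_cal, nas_query, k):
    # Euclidean distance in NAS space serves as the similarity index;
    # the k nearest calibration samples form the local calibration set.
    d = np.linalg.norm(nas_cal - nas_query, axis=1)
    return np.argsort(d)[:k]
```

    A local regression model (PLS in the paper) is then trained only on the rows returned by select_local_set for each unknown sample.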

  15. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  16. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Beaver, Justin M; Bogen II, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to its high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  17. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    PubMed

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.
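
    Saaty's eigenvector step, the core of AHP, can be sketched as follows (the pairwise-comparison matrix here is purely illustrative, not the study's actual course criteria):

```python
def ahp_weights(A, iters=100):
    # Saaty's eigenvector method via power iteration: the principal
    # eigenvector of the pairwise-comparison matrix A gives the
    # relative weights of the criteria.
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # Estimate lambda_max and the consistency index CI = (lmax - n)/(n - 1);
    # AHP practice rejects matrices whose consistency ratio is too high.
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lmax = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lmax - n) / (n - 1)
    return w, ci
```

    For instance, if one criterion is judged 3 times as important as another, A = [[1, 3], [1/3, 1]] yields weights (0.75, 0.25) with CI = 0 (2x2 matrices are always consistent).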

  18. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    PubMed

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. The electrochemical performance of graphene modified electrodes: an analytical perspective.

    PubMed

    Brownson, Dale A C; Foster, Christopher W; Banks, Craig E

    2012-04-21

    We explore the use of graphene modified electrodes towards the electroanalytical sensing of various analytes, namely dopamine hydrochloride, uric acid, acetaminophen and p-benzoquinone via cyclic voltammetry. In line with literature methodologies and to investigate the full implications of employing graphene in this electrochemical context, we modify electrode substrates that exhibit either fast or slow electron transfer kinetics (edge- or basal-plane pyrolytic graphite electrodes, respectively) with well characterised commercially available graphene that has not been chemically treated, is free from surfactants and as a result of its fabrication has an extremely low oxygen content, allowing the true electroanalytical applicability of graphene to be properly de-convoluted and determined. In comparison to the unmodified underlying electrode substrates (constructed from graphite), we find that graphene exhibits a reduced analytical performance in terms of sensitivity, linearity and observed detection limits towards each of the various analytes studied herein. Owing to graphene's structural composition, low proportion of edge plane sites and consequent slow heterogeneous electron transfer rates, there appear to be no advantages, for the analytes studied here, of employing graphene in this electroanalytical context.

  20. Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review

    PubMed Central

    Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef

    2014-01-01

    Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409

  1. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  2. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  3. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  4. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  5. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  6. Visual Analytics for the Food-Water-Energy Nexus in the Phoenix Active Management Area

    NASA Astrophysics Data System (ADS)

    Maciejewski, R.; Mascaro, G.; White, D. D.; Ruddell, B. L.; Aggarwal, R.; Sarjoughian, H.

    2016-12-01

    The Phoenix Active Management Area (AMA) is an administrative region of 14,500 km2 identified by the Arizona Department of Water Resources with the aim of reaching and maintaining the safe yield (i.e. balance between annual amount of groundwater withdrawn and recharged) by 2025. The AMA includes the Phoenix metropolitan area, which has experienced a dramatic population growth over the last decades with a progressive conversion of agricultural land into residential land. As a result of these changes, the water and energy demand as well as the food production in the region have significantly evolved over the last 30 years. Given the arid climate, a crucial role to support this growth has been the creation of a complex water supply system based on renewable and non-renewable resources, including the energy-intensive Central Arizona Project. In this talk, we present a preliminary characterization of the evolution in time of the feedbacks between food, water, and energy in the Phoenix AMA by analyzing secondary data (available from water and energy providers, irrigation districts, and municipalities), as well as satellite imagery and primary data collected by the authors. A preliminary visual analytics framework is also discussed describing current design practices and ideas for exploring networked components and cascading impacts within the FEW Nexus. This analysis and framework represent the first steps towards the development of an integrated modeling, visualization, and decision support infrastructure for comprehensive FEW systems decision making at decision-relevant temporal and spatial scales.

  7. Constructing core competency indicators for clinical teachers in Taiwan: a qualitative analysis and an analytic hierarchy process

    PubMed Central

    2014-01-01

    Background The objective of this study was to construct a framework of core competency indicators of medical doctors who teach in the clinical setting in Taiwan and to evaluate the relative importance of the indicators among these clinical teachers. Methods The preliminary framework of the indicators was developed from an in-depth interview conducted with 12 clinical teachers who had previously been recognized and awarded for their teaching excellence in university hospitals. The framework was categorized into 4 dimensions: 1) Expertise (i.e., professional knowledge and skill); 2) Teaching Ability; 3) Attitudes and Traits; and 4) Beliefs and Values. These areas were further divided into 11 sub-dimensions and 40 indicators. Subsequently, a questionnaire built upon this qualitative analysis was distributed to another group of 17 clinical teachers. Saaty’s eigenvector approach, or the so-called analytic hierarchy process (AHP), was applied to perform the pairwise comparisons between indicators and to determine the ranking and relative importance of the indicators. Results Fourteen questionnaires were deemed valid for AHP assessment due to completeness of data input. The relative contribution of the four main dimensions was 31% for Attitudes and Traits, 30% for Beliefs and Values, 22% for Expertise, and 17% for Teaching Ability. Specifically, 9 out of the 10 top-ranked indicators belonged to the “Attitudes and Traits” or “Beliefs and Values” dimensions, indicating that inner characteristics (i.e., attitudes, traits, beliefs, and values) were perceived as more important than surface ones (i.e., professional knowledge, skills, and teaching competency). Conclusion We performed a qualitative analysis and developed a questionnaire based upon an interview with experienced clinical teachers in Taiwan, and used this tool to construct the key features for the role model. The application has also demonstrated the relative importance in the dimensions of the core

  8. Constructing core competency indicators for clinical teachers in Taiwan: a qualitative analysis and an analytic hierarchy process.

    PubMed

    Li, Ai-Tzu; Lin, Jou-Wei

    2014-04-11

    The objective of this study was to construct a framework of core competency indicators of medical doctors who teach in the clinical setting in Taiwan and to evaluate the relative importance of the indicators among these clinical teachers. The preliminary framework of the indicators was developed from an in-depth interview conducted with 12 clinical teachers who had previously been recognized and awarded for their teaching excellence in university hospitals. The framework was categorized into 4 dimensions: 1) Expertise (i.e., professional knowledge and skill); 2) Teaching Ability; 3) Attitudes and Traits; and 4) Beliefs and Values. These areas were further divided into 11 sub-dimensions and 40 indicators. Subsequently, a questionnaire built upon this qualitative analysis was distributed to another group of 17 clinical teachers. Saaty's eigenvector approach, or the so-called analytic hierarchy process (AHP), was applied to perform the pairwise comparisons between indicators and to determine the ranking and relative importance of the indicators. Fourteen questionnaires were deemed valid for AHP assessment due to completeness of data input. The relative contribution of the four main dimensions was 31% for Attitudes and Traits, 30% for Beliefs and Values, 22% for Expertise, and 17% for Teaching Ability. Specifically, 9 out of the 10 top-ranked indicators belonged to the "Attitudes and Traits" or "Beliefs and Values" dimensions, indicating that inner characteristics (i.e., attitudes, traits, beliefs, and values) were perceived as more important than surface ones (i.e., professional knowledge, skills, and teaching competency). We performed a qualitative analysis and developed a questionnaire based upon an interview with experienced clinical teachers in Taiwan, and used this tool to construct the key features for the role model. The application has also demonstrated the relative importance in the dimensions of the core competencies for clinical teachers in Taiwan.

  9. Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.

    PubMed

    Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min

    2013-12-01

    Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical: videos may not be sufficiently semantically annotated, suitable training data may be lacking, and the search requirements of the user may frequently change across tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing time spent watching videos, and visualizing aggregated information of the search results. We demonstrate the system for searching spatiotemporal attributes in sports video to identify key instances of team and player performance.
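The active-learning interaction described here can be illustrated with a minimal uncertainty-sampling sketch: the system asks the user to label the candidate the current model is least sure about. The scorer outputs below are hypothetical stand-ins, not the paper's models.

```python
# Hedged sketch of pool-based active learning via uncertainty sampling.
# "scores" are a hypothetical model's P(match) for unlabeled candidates;
# the most informative candidate to show the user is the one whose
# predicted probability is closest to 0.5.

def most_uncertain(scores):
    """Index of the candidate with predicted probability nearest 0.5."""
    return min(range(len(scores)), key=lambda i: abs(scores[i] - 0.5))

# Hypothetical model outputs for five candidate video segments.
scores = [0.95, 0.12, 0.51, 0.80, 0.30]
query = most_uncertain(scores)  # index 2: score 0.51 is nearest to 0.5
```

After the user labels the queried candidate, the model is retrained and the loop repeats, which is what lets the search adapt without large annotated corpora.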

  10. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, though sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals that are very different from those in business and that require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  11. Introducing Text Analytics as a Graduate Business School Course

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2011-01-01

    Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…

  12. Fluorescence metrology used for analytics of high-quality optical materials

    NASA Astrophysics Data System (ADS)

    Engel, Axel; Haspel, Rainer; Rupertus, Volker

    2004-09-01

    Optical glasses, glass ceramics and crystals are used for various specialized applications in telecommunication, biomedical, optical, and microlithography technology. To qualify and control material quality during the research and production processes, several specialized ultra-trace analysis methods have to be applied at Schott Glas. One focus of these activities is the determination of impurities in the sub-ppb regime, because such impurity levels are required, e.g., for pure materials used in microlithography. Such impurities are determined using analytical methods like LA-ICP-MS or neutron activation analysis. On the other hand, direct and non-destructive optical analysis is attractive because it additionally reflects the requirements of the optical applications. Typical examples are absorption and laser-resistivity measurements of optical materials using precision spectral photometers, or in-situ transmission measurements by means of lamps or UV lasers. The chemical analytical methods have the drawback that they are time consuming and rather expensive, whereas the sensitivity of the absorption methods will not be sufficient to characterize future needs (absorption coefficients well below 10^-3 cm^-1). For non-destructive qualification against current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantages of this setup are its combination of highest sensitivity (more than one order of magnitude higher than state-of-the-art UV absorption spectroscopy) and fast measurement and evaluation cycles (several minutes, compared to the several hours necessary for chemical analysis). An overview is given of spectral characteristics measured with specified standards, which are necessary to establish the analytical system.

  13. An Integrated Gate Turnaround Management Concept Leveraging Big Data Analytics for NAS Performance Improvements

    NASA Technical Reports Server (NTRS)

    Chung, William W.; Ingram, Carla D.; Ahlquist, Douglas Kurt; Chachad, Girish H.

    2016-01-01

    unique airport attributes (e.g., runway, taxiway, terminal, and gate configurations and tenants), and combined statistics from past data and live data based on a specific set of ATM concept-of-operations (ConOps) and operational parameters via systems analysis using an analytic network learning model. The IGTM tool will then bound the uncertainties that arise under nominal and off-nominal operational conditions, directly assess the gate turnaround status and the impact of a given operational decision on NAS performance, and provide a set of recommended actions that allow stakeholders to mitigate uncertainty and reduce time deviation of planned operational events. An IGTM prototype was developed at the NASA Ames Simulation Laboratories (SimLabs) to demonstrate the benefits and applicability of the concept. In the prototype, a fast-time airport simulator was integrated with a simulated data warehouse for Big Data/Analytics (BAI), scheduled flight plans from Aeronautical Operational Control (AOC), the IGTM controller, and a graphical user interface via a System Wide Information Management (SWIM)-like data messaging network built on the ActiveMQ message service, illustrated in Figure 1, to demonstrate selected use cases showing the benefits of the IGTM concept on NAS performance.

  14. An analytical poroelastic model for ultrasound elastography imaging of tumors

    NASA Astrophysics Data System (ADS)

    Tauhidul Islam, Md; Chaudhry, Anuj; Unnikrishnan, Ginu; Reddy, J. N.; Righetti, Raffaella

    2018-01-01

    The mechanical behavior of biological tissues has been studied using a number of mechanical models. Due to their relatively high fluid content and mobility, many biological tissues have been modeled as poroelastic materials. Diseases such as cancers are known to alter the poroelastic response of a tissue. Tissue poroelastic properties such as compressibility, interstitial permeability and fluid pressure also play a key role in the assessment of cancer treatments and in improved therapies. At the present time, however, a limited number of poroelastic models for soft tissues are retrievable in the literature, and the ones available are not directly applicable to tumors as they typically refer to uniform tissues. In this paper, we report an analytical poroelastic model for a non-uniform tissue under stress relaxation. Displacement, strain and fluid pressure fields in a cylindrical poroelastic sample containing a cylindrical inclusion during stress relaxation are computed. Finite element simulations are then used to validate the proposed theoretical model. Statistical analysis demonstrates that the proposed analytical model matches the finite element results with less than 0.5% error. The availability of the analytical model and solutions presented in this paper may be useful to estimate diagnostically relevant poroelastic parameters such as interstitial permeability and fluid pressure, and, in general, for a better interpretation of clinically-relevant ultrasound elastography results.

  15. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. It was validated against the original CEA code to ensure accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature over a range of pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
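The validation of analytic derivatives against finite differences can be illustrated with a generic function; this is a minimal sketch under an invented test function, not the CEA/OpenMDAO code.

```python
# Sketch of analytic-vs-finite-difference derivative validation: for a
# smooth function, an exact analytic derivative and a central finite
# difference should agree to within the FD truncation/roundoff error.

import math

def f(x):
    return math.exp(x) * math.sin(x)

def df_analytic(x):
    # Product rule: d/dx [e^x sin x] = e^x (sin x + cos x)
    return math.exp(x) * (math.sin(x) + math.cos(x))

def df_fd(x, h=1e-6):
    # Central finite difference, O(h^2) truncation error
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 0.7
err = abs(df_analytic(x) - df_fd(x))  # tiny, but FD cost scales with inputs
```

Beyond accuracy, the speed advantage reported above comes from the fact that a finite-difference gradient needs one (or two) extra function evaluations per design variable, while adjoint analytic derivatives deliver the whole gradient at roughly the cost of one additional solve.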

  16. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    NASA Astrophysics Data System (ADS)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends and spotting anomalies. Although a number of open-source spatial analysis libraries like geopandas and shapely are available today, most of them are restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces the network overload: the complete data need not be replicated onto the user's local system, and only a subset of the entire dataset is fetched into memory at a time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4.
The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API namely - pypyodbc or jaydebeapi to establish the database connection via

  17. Applying an analytical method to study neutron behavior for dosimetry

    NASA Astrophysics Data System (ADS)

    Shirazi, S. A. Mousavi

    2016-12-01

    In this investigation, a new dosimetry process is studied by applying an analytical method. This novel process is associated with human liver tissue, which comprises water, glycogen and other constituents. In this study, the organic compound materials of the liver are decomposed into their constituent elements based upon the mass percentage and density of every element. The absorbed doses are computed by the analytical method for all constituent elements of liver tissue. This analytical method is introduced through mathematical equations based on neutron behavior and neutron collision rules. The results show that the absorbed doses converge for neutron energies below 15 MeV. This method can be applied to study the interaction of neutrons in other tissues and to estimate the absorbed dose for a wide range of neutron energies.

  18. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered in nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

    A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results and the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.
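As a rough illustration of why self-shielding matters in such scenarios, a textbook slab approximation (not the correction developed in the paper) gives the fraction of photons that escape a uniformly active body unattenuated; the attenuation coefficient below is an assumed, approximate value for Co-60 energies in steel.

```python
# Illustrative textbook self-shielding factor, not the paper's model:
# for a uniformly active slab of thickness t with linear attenuation
# coefficient mu, the fraction of photons emitted toward the detector
# face that escape unattenuated is (1 - exp(-mu*t)) / (mu*t).

import math

def slab_self_shielding(mu_cm, t_cm):
    """Escape fraction for a uniform slab source, normal emission."""
    x = mu_cm * t_cm
    if x < 1e-12:            # optically thin limit: no self-shielding
        return 1.0
    return (1.0 - math.exp(-x)) / x

# Assumed value: mu ~ 0.42 cm^-1 for ~1.25 MeV gammas in steel.
f = slab_self_shielding(0.42, 5.0)   # roughly 40% escape from a 5 cm slab
```

Without such a correction, activity inferred from an external γ measurement of a thick steel component would be underestimated by more than a factor of two in this toy case.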

  19. An analytical theory of planetary rotation rates

    NASA Technical Reports Server (NTRS)

    Harris, A. W.

    1977-01-01

    An approximate analytical theory is derived for the rate of rotation acquired by a planet as it grows from the solar nebula. This theory was motivated by a numerical study by Giuli, and yields fair agreement with his results. The periods of planetary rotation obtained are proportional to planetesimal encounter velocity, and appear to suggest lower values of this velocity than are commonly assumed to have existed during planetary formation.

  20. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.

  1. Meta-Analysis of Workplace Physical Activity Interventions

    PubMed Central

    Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Brown, Lori M.; Lusk, Sally L.

    2009-01-01

    Context Most adults do not achieve adequate physical activity. Despite the potential benefits of worksite health promotion, no previous comprehensive meta-analysis has summarized health and physical activity behavior outcomes from these programs. This comprehensive meta-analysis integrated the extant wide range of worksite physical activity intervention research. Evidence acquisition Extensive searching located published and unpublished intervention studies reported from 1969 through 2007. Results were coded from primary studies. Random-effects meta-analytic procedures, including moderator analyses, were completed in 2008. Evidence synthesis Effects on most variables were substantially heterogeneous because diverse studies were included. Standardized mean difference (d) effect sizes were synthesized across approximately 38,231 subjects. Significantly positive effects were observed for physical activity behavior (0.21), fitness (0.57), lipids (0.13), anthropometric measures (0.08), work attendance (0.19), and job stress (0.33). The significant effect size for diabetes risk (0.98) is more tentative given small sample sizes. Significant heterogeneity documents intervention effects varied across studies. The mean effect size for fitness corresponds to a difference between treatment minus control subjects' means on VO2max of 3.5 mL/kg/min; for lipids, −0.2 on total cholesterol:HDL; and for diabetes risk, −12.6 mg/dL on fasting glucose. Conclusions These findings document that some workplace physical activity interventions can improve both health and important worksite outcomes. Effects were variable for most outcomes, reflecting the diversity of primary studies. Future primary research should compare interventions to confirm causal relationships and further explore heterogeneity. PMID:19765506
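The standardized mean difference synthesis described above can be sketched as follows; the study numbers are invented for illustration, and the pooling shown is a simple inverse-weight mean rather than the authors' full random-effects procedure.

```python
# Hedged sketch of the meta-analytic building blocks: Cohen's d
# (standardized mean difference with pooled SD) per study, then a
# weighted average across studies. Numbers below are hypothetical.

import math

def cohens_d(m_t, m_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                   / (n_t + n_c - 2))
    return (m_t - m_c) / sp

def pooled(effects, weights):
    """Weighted mean of study effect sizes (fixed weights for brevity)."""
    return sum(d * w for d, w in zip(effects, weights)) / sum(weights)

# Hypothetical fitness studies: (treatment mean, control mean, SDs, ns).
d1 = cohens_d(38.5, 35.0, 6.0, 6.0, 50, 50)
d2 = cohens_d(37.0, 35.5, 7.0, 7.0, 80, 80)
overall = pooled([d1, d2], [100, 160])   # weight roughly by sample size
```

A real random-effects synthesis would instead weight each study by the inverse of its within-study variance plus an estimated between-study variance, which is what lets heterogeneous studies be combined as in the review.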

  2. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES. Even though the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable to the interior of the canopy, to attain an analytical model capable of describing the mean flow for the full canopy domain. As a result, the model is tested against an analytical model based on a linearization approach.

  3. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE PAGES

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik; ...

    2017-10-06

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES. Even though the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable to the interior of the canopy, to attain an analytical model capable of describing the mean flow for the full canopy domain. As a result, the model is tested against an analytical model based on a linearization approach.

  4. Microwave assisted solvent extraction and coupled-column reversed-phase liquid chromatography with UV detection use of an analytical restricted-access-medium column for the efficient multi-residue analysis of acidic pesticides in soils.

    PubMed

    Hogendoom, E A; Huls, R; Dijkman, E; Hoogerbrugge, R

    2001-12-14

    A screening method has been developed for the determination of acidic pesticides in various types of soils. The methodology is based on the use of microwave assisted solvent extraction (MASE) for fast and efficient extraction of the analytes from the soils, and coupled-column reversed-phase liquid chromatography (LC-LC) with UV detection at 228 nm for the instrumental analysis of uncleaned extracts. Four types of soils, including sand, clay and peat, with a range in organic matter content of 0.3-13%, and ten acidic pesticides of different chemical families (bentazone, bromoxynil, metsulfuron-methyl, 2,4-D, MCPA, MCPP, 2,4-DP, 2,4,5-T, 2,4-DB and MCPB) were selected as matrices and analytes, respectively. The method developed included the selection of suitable MASE and LC-LC conditions. The latter consisted of the selection of a 5-microm GFF-II internal surface reversed-phase (ISRP, Pinkerton) analytical column (50 x 4.6 mm, I.D.) as the first column in the RAM-C18 configuration, in combination with an optimised linear gradient elution including on-line cleanup of sample extracts and reconditioning of the columns. The method was validated with the analysis of freshly spiked samples and samples with aged residues (120 days). The four types of soils were spiked with the ten acidic pesticides at levels between 20 and 200 microg/kg. Weighted regression of the recovery data showed, for most analyte-matrix combinations, including freshly spiked samples and aged residues, that the method provides overall recoveries between 60 and 90% with relative standard deviations of the intra-laboratory reproducibilities between 5 and 25%; LODs were obtained between 5 and 50 microg/kg. Evaluation of the data set with principal component analysis revealed that (i) an increase of the organic matter content of the soil samples and (ii) aged residues negatively affect the recovery of the analytes.

  5. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  6. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  7. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  8. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...

  9. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  10. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  11. Explorative visual analytics on interval-based genomic data and their metadata.

    PubMed

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the spread of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians in making sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. The effective application and practical usefulness of GeMSE are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/ , and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.

  12. Analytic strategies to evaluate the association of time-varying exposures to HIV-related outcomes: Alcohol consumption as an example.

    PubMed

    Cook, Robert L; Kelso, Natalie E; Brumback, Babette A; Chen, Xinguang

    2016-01-01

    As persons with HIV are living longer, there is a growing need to investigate factors associated with chronic disease, rate of disease progression and survivorship. Many risk factors for this high-risk population change over time, such as participation in treatment, alcohol consumption and drug abuse. Longitudinal datasets are increasingly available, particularly clinical data that contain multiple observations of health exposures and outcomes over time. Several analytic options are available for the assessment of longitudinal data; however, it can be challenging to choose the appropriate analytic method for specific combinations of research questions and types of data. The purpose of this review is to help researchers choose the appropriate methods to analyze longitudinal data, using alcohol consumption as an example of a time-varying exposure variable. When selecting the optimal analytic method, one must consider aspects of the exposure (e.g. timing, pattern, and amount) and the outcome (fixed or time-varying), while also minimizing bias. In this article, we describe several analytic approaches for longitudinal data, including developmental trajectory analysis, generalized estimating equations, and mixed effect models. For each analytic strategy, we describe appropriate situations in which to use the method and provide an example that demonstrates its use. Clinical data related to alcohol consumption and HIV are used to illustrate these methods.

  13. Mercury and gold concentrations of highly polluted environmental samples determined using prompt gamma-ray analysis and instrument neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Osawa, Takahito; Hatsukawa, Yuichi; Appel, Peter W. U.; Matsue, Hideaki

    2011-04-01

    The authors have established a method of determining mercury and gold in severely polluted environmental samples using prompt gamma-ray analysis (PGA) and instrumental neutron activation analysis (INAA). Since large amounts of mercury are constantly being released into the environment by small-scale gold mining in many developing countries, the mercury concentration in tailings and water has to be determined to mitigate environmental pollution. Cold-vapor atomic absorption analysis, the most pervasive method of mercury analysis, is not suitable because tailings and water around mining facilities have extremely high mercury concentrations. On the other hand, PGA can determine high mercury concentrations in polluted samples as it has an appropriate level of sensitivity. Moreover, gold concentrations can be determined sequentially by using INAA after PGA. In conclusion, the analytical procedure established in this work using PGA and INAA is the best way to evaluate the degree of pollution and the tailing resource value. This method will significantly contribute to mitigating problems in the global environment.

  14. Electron-cyclotron absorption in high-temperature plasmas: quasi-exact analytical evaluation and comparative numerical analysis

    NASA Astrophysics Data System (ADS)

    Albajar, F.; Bertelli, N.; Bornatici, M.; Engelmann, F.

    2007-01-01

    On the basis of the electromagnetic energy balance equation, a quasi-exact analytical evaluation of the electron-cyclotron (EC) absorption coefficient is performed for arbitrary propagation (with respect to the magnetic field) in a (Maxwellian) magneto-plasma for the temperature range of interest for fusion reactors (in which EC radiation losses tend to be important in the plasma power balance). The calculation makes use of Bateman's expansion for the product of two Bessel functions, retaining the lowest-order contribution. The integration over electron momentum can then be carried out analytically, fully accounting for finite Larmor radius effects in this approximation. On the basis of the analytical expressions for the EC absorption coefficients of both the extraordinary and ordinary modes thus obtained, (i) for the case of perpendicular propagation simple formulae are derived for both modes and (ii) a numerical analysis of the angular distribution of EC absorption is carried out. An assessment of the accuracy of asymptotic expressions that have been given earlier is also performed, showing that these approximations can be usefully applied for calculating EC power losses from reactor-grade plasmas. Presented in part at the 14th Joint Workshop on Electron Cyclotron Emission and Electron Cyclotron Resonance Heating, Santorini, Greece, 9-12 May 2006.

  15. Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma

    ERIC Educational Resources Information Center

    Brooks, Lara; Whitacre, Brian E.

    2011-01-01

    Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…

  16. ANALYTICAL CHEMISTRY IN NUCLEAR REACTOR TECHNOLOGY. Analysis of Reactor Fuels, Fission-Product Mixtures and Related Materials: Analytical Chemistry of Plutonium and the Transplutonic Elements. Third Conference, Gatlinburg, Tennessee, October 26-29, 1959

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1960-01-01

    Thirty-one papers and 10 summaries of papers presented at the Third Conference on Analytical Chemistry in Nuclear Reactor Technology held at Gatlinburg, Tennessee, October 26 to 29, 1959, are given. The papers are grouped into four sections: general, analytical chemistry of fuels, analytical chemistry of plutonium and the transplutonic elements, and the analysis of fission-product mixtures. Twenty-seven of the papers are covered by separate abstracts. Four were previously abstracted for NSA. (M.C.G.)

  17. Adult active transport in the Netherlands: an analysis of its contribution to physical activity requirements.

    PubMed

    Fishman, Elliot; Böcker, Lars; Helbich, Marco

    2015-01-01

    Modern, urban lifestyles have engineered physical activity out of everyday life and this presents a major threat to human health. The Netherlands is a world leader in active travel, particularly cycling, but little research has sought to quantify the cumulative amount of physical activity through everyday walking and cycling. Using data collected as part of the Dutch National Travel Survey (2010-2012), this paper determines the degree to which Dutch walking and cycling contribute to meeting the minimum recommended level of physical activity of 150 minutes of moderate-intensity aerobic activity per week. The sample includes 74,465 individuals who recorded at least some travel on the day surveyed. As physical activity benefits are cumulative, all walking and cycling trips are analysed, including those to and from public transport. These trips are then converted into an established measure of physical activity intensity, known as metabolic equivalents of tasks. Multivariate Tobit regression models were performed on a range of socio-demographic, transport resources, urban form and meteorological characteristics. The results reveal that Dutch men and women participate in 24 and 28 minutes of daily physical activity through walking and cycling, which is 41% and 55% more than the minimum recommended level. It should be noted however that some 57% of the entire sample failed to record any walking or cycling, and an investigation of this particular group serves as an important topic of future research. Active transport was positively related with age, income, bicycle ownership, urban density and air temperature. Car ownership had a strong negative relationship with physically active travel. The results of this analysis demonstrate the significance of active transport to counter the emerging issue of sedentary lifestyle disease. The Dutch experience provides other countries with a highly relevant case study in the creation of environments and cultures that support healthy
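
    The MET-based accounting described in this record can be sketched in a few lines. The MET values and the guideline conversion below are illustrative, compendium-style assumptions, not the paper's exact coefficients:

```python
# Hedged sketch: convert daily active-travel minutes into weekly
# MET-minutes. WALK_MET and CYCLE_MET are assumed intensity values,
# not figures taken from the study.
WALK_MET = 3.5   # assumed METs for everyday walking
CYCLE_MET = 6.0  # assumed METs for everyday cycling

def weekly_met_minutes(walk_min_per_day, cycle_min_per_day):
    """Weekly MET-minutes accumulated through walking and cycling."""
    daily = walk_min_per_day * WALK_MET + cycle_min_per_day * CYCLE_MET
    return daily * 7

def meets_guideline(walk_min_per_day, cycle_min_per_day):
    # 150 min/week of moderate activity (~3 METs) ~= 450 MET-minutes
    return weekly_met_minutes(walk_min_per_day, cycle_min_per_day) >= 450
```

    Under these assumptions, the reported 24-28 minutes of daily walking and cycling comfortably clears the weekly threshold.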

  18. Two-Dimensional Model for Reactive-Sorption Columns of Cylindrical Geometry: Analytical Solutions and Moment Analysis.

    PubMed

    Khan, Farman U; Qamar, Shamsul

    2017-05-01

    A set of analytical solutions is presented for a model describing the transport of a solute in a fixed-bed reactor of cylindrical geometry subjected to the first (Dirichlet) and third (Danckwerts) type inlet boundary conditions. A linear sorption kinetic process and first-order decay are considered. Cylindrical geometry allows the use of large columns to investigate dispersion, adsorption/desorption and reaction kinetic mechanisms. The finite Hankel and Laplace transform techniques are adopted to solve the model equations. For further analysis, statistical temporal moments are derived from the Laplace-transformed solutions. The developed analytical solutions are compared with the numerical solutions of a high-resolution finite-volume scheme. Different case studies are presented and discussed for a series of numerical values corresponding to a wide range of mass transfer and reaction kinetics. Good agreement was observed between the analytical and numerical concentration profiles and moments. The developed solutions are efficient tools for analyzing numerical algorithms, for sensitivity analysis, and for the simultaneous determination of the longitudinal and transverse dispersion coefficients from a laboratory-scale radial column experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
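
    The temporal moments mentioned above have a simple numerical counterpart. The sketch below evaluates normalized moments of an outlet concentration profile directly from their integral definitions (the paper instead derives them analytically from the Laplace-domain solutions); the Gaussian test curve is purely illustrative:

```python
import numpy as np

def _trapz(y, x):
    # plain trapezoidal rule (avoids NumPy version differences)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def temporal_moments(t, c):
    """First normalized temporal moment and central variance of a
    breakthrough curve: mu_n = int t^n c(t) dt / int c(t) dt."""
    m0 = _trapz(c, t)
    mu1 = _trapz(t * c, t) / m0
    mu2 = _trapz(t**2 * c, t) / m0
    return mu1, mu2 - mu1**2   # mean residence time, spread

# Illustrative Gaussian breakthrough curve centered at t = 5, sigma = 0.5
t = np.linspace(0.0, 10.0, 2001)
c = np.exp(-(t - 5.0)**2 / (2 * 0.5**2))
mean_t, var_t = temporal_moments(t, c)
```

    For this synthetic curve the first moment recovers the mean residence time (5) and the central second moment recovers the variance (0.25), which is how moment analysis extracts dispersion parameters from measured profiles.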

  19. Analytical and Numerical Results for an Adhesively Bonded Joint Subjected to Pure Bending

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Lundgren, Eric

    2006-01-01

    A one-dimensional, semi-analytical methodology that was previously developed for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives that exhibit inelastic material behavior is further verified in the present paper. A summary of the first-order differential equations and applied joint loading used to determine the adhesive response from the methodology are also presented. The method was previously verified against a variety of single-lap joint configurations from the literature that subjected the joints to cases of axial tension and pure bending. Using the same joint configuration and applied bending load presented in a study by Yang, the finite element analysis software ABAQUS was used to further verify the semi-analytical method. Linear static ABAQUS results are presented for two models, one with a coarse and one with a fine element meshing, that were used to verify convergence of the finite element analyses. Close agreement between the finite element results and the semi-analytical methodology was found for both the shear and normal stress responses of the adhesive bondline. Thus, the semi-analytical methodology was successfully verified using the ABAQUS finite element software and a single-lap joint configuration subjected to pure bending.

  20. 37Cl/35Cl isotope ratio analysis in perchlorate by ion chromatography/multicollector-ICPMS: Analytical performance and implication for biodegradation studies.

    PubMed

    Zakon, Yevgeni; Ronen, Zeev; Halicz, Ludwik; Gelman, Faina

    2017-10-01

    In the present study we propose a new analytical method for 37Cl/35Cl analysis in perchlorate by ion chromatography (IC) coupled to multicollector inductively coupled plasma mass spectrometry (MC-ICPMS). The accuracy of the analytical method was validated by analysis of the international perchlorate standard materials USGS-37 and USGS-38; analytical precision better than ±0.4‰ was achieved. 37Cl/35Cl isotope ratio analysis in perchlorate during a laboratory biodegradation experiment with microbial cultures enriched from contaminated soil in Israel resulted in an isotope enrichment factor ε37Cl = -13.3 ± 1‰, which falls in the range reported previously for perchlorate biodegradation by pure microbial cultures. The proposed analytical method may significantly simplify the procedure for isotope analysis of perchlorate which is currently applied in environmental studies. Copyright © 2017. Published by Elsevier Ltd.
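
    The quoted enrichment factor is conventionally estimated from a Rayleigh plot. Below is a generic Rayleigh regression sketch, not the authors' own code; the synthetic data points are fabricated only to check the arithmetic:

```python
import numpy as np

def enrichment_factor(f, delta, delta0):
    """Estimate epsilon (in permil) from Rayleigh fractionation:
    ln((delta + 1000)/(delta0 + 1000)) = (epsilon/1000) * ln(f),
    where f is the fraction of substrate remaining."""
    x = np.log(np.asarray(f, dtype=float))
    y = np.log((np.asarray(delta, dtype=float) + 1000.0) / (delta0 + 1000.0))
    slope = np.polyfit(x, y, 1)[0]   # slope = epsilon / 1000
    return slope * 1000.0

# Synthetic check: points generated with epsilon = -13.3 permil
f = np.array([1.0, 0.8, 0.5, 0.2, 0.1])
delta = 1000.0 * f ** (-13.3 / 1000.0) - 1000.0   # delta0 = 0
eps = enrichment_factor(f, delta, delta0=0.0)
```

    With real measurements, the slope's regression uncertainty supplies the ±1‰ figure reported above.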

  1. Analytical Eco-Scale for Assessing the Greenness of a Developed RP-HPLC Method Used for Simultaneous Analysis of Combined Antihypertensive Medications.

    PubMed

    Mohamed, Heba M; Lamie, Nesrine T

    2016-09-01

    In the past few decades the analytical community has been focused on eliminating or reducing the usage of hazardous chemicals and solvents, in different analytical methodologies, that have been ascertained to be extremely dangerous to human health and environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment.
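
    The Eco-Scale assessment used here reduces to a simple score: start from 100 and subtract penalty points for reagents, energy, occupational hazard and waste. The penalty values in the example are illustrative placeholders, not those assigned in this study:

```python
def eco_scale_score(penalty_points):
    """Analytical Eco-Scale: 100 minus the summed penalty points.
    A score above 75 is conventionally read as an excellent green
    analysis; above 50 as acceptable."""
    return 100 - sum(penalty_points)

# Illustrative penalties: reagents, instrument energy, hazard, waste
score = eco_scale_score([6, 2, 1, 3])
```

    Comparing two methods then amounts to comparing their scores, which is how the proposed RP-HPLC method is ranked against the reported conventional ones.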

  2. Direct analysis of ethylene glycol in human serum on the basis of analyte adduct formation and liquid chromatography-tandem mass spectrometry.

    PubMed

    Dziadosz, Marek

    2018-01-01

    The aim of this work was to develop a fast, cost-effective and time-saving liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical method for the analysis of ethylene glycol (EG) in human serum. For these purposes, the formation/fragmentation of an EG adduct ion with sodium and sodium acetate was applied in the positive electrospray mode for signal detection. Adduct identification was performed with appropriate infusion experiments based on analyte solutions prepared in different concentrations. Corresponding analyte adduct ions and adduct ion fragments could be identified both for EG and the deuterated internal standard (EG-D4). Protein precipitation was used as sample preparation. The analysis of the supernatant was performed with a Luna 5 μm C18(2) 100 Å, 150 mm × 2 mm analytical column and a mobile phase consisting of 95% A (H2O/methanol = 95/5, v/v) and 5% B (H2O/methanol = 3/97, v/v), both with 10 mmol/L ammonium acetate and 0.1% acetic acid. Method linearity was examined in the range of 100-4000 μg/mL and the calculated limit of detection/quantification was 35/98 μg/mL. However, on the basis of the signal-to-noise ratio, quantification was recommended at a limit of 300 μg/mL. Additionally, the examined precision, accuracy, stability, selectivity and matrix effect demonstrated that the method is a practicable alternative for EG quantification in human serum. In comparison to other methods based on liquid chromatography, the strategy presented made EG analysis possible for the first time without analyte derivatisation. Copyright © 2017 Elsevier B.V. All rights reserved.
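
    Detection and quantification limits of this kind typically come from the calibration line (LOD = 3.3σ/S, LOQ = 10σ/S). A generic ICH-style sketch with made-up calibration data, not the authors' validation code:

```python
import numpy as np

def lod_loq(conc, response):
    """Calibration-based limits: sigma is the residual standard
    deviation of the linear fit, S its slope."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    resid = response - (slope * conc + intercept)
    sigma = np.std(resid, ddof=2)  # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Fabricated calibration over the validated 100-4000 ug/mL range
conc = [100, 500, 1000, 2000, 4000]
resp = [205, 1010, 1990, 4020, 7985]   # roughly linear, slope ~2
lod, loq = lod_loq(conc, resp)
```

    The fixed 10/3.3 ratio between LOQ and LOD explains why the reported pair (35/98 μg/mL) sits close to a factor of three apart.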

  3. Faculty Workload: An Analytical Approach

    ERIC Educational Resources Information Center

    Dennison, George M.

    2012-01-01

    Recent discussions of practices in higher education have tended toward muck-raking and self-styled exposure of cynical self-indulgence by faculty and administrators at the expense of students and their families, as usually occurs during periods of economic duress, rather than toward analytical studies designed to foster understanding. This article…

  4. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework that offers other developers the flexibility to plug in additional statistical algorithms.

  5. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that there exists a high degree of predictability. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor which displays some fluorescence modulation when analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays in an attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of this data can identify and quantify the analytes present.

  6. An analytic model for buoyancy resonances in protoplanetary disks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lubow, Stephen H.; Zhu, Zhaohuan, E-mail: lubow@stsci.edu, E-mail: zhzhu@astro.princeton.edu

    2014-04-10

    Zhu et al. found in three-dimensional shearing box simulations a new form of planet-disk interaction that they attributed to a vertical buoyancy resonance in the disk. We describe an analytic linear model for this interaction. We adopt a simplified model involving azimuthal forcing that produces the resonance and permits an analytic description of its structure. We derive an analytic expression for the buoyancy torque and show that the vertical torque distribution agrees well with the results of the Athena simulations and a Fourier method for linear numerical calculations carried out with the same forcing. The buoyancy resonance differs from the classic Lindblad and corotation resonances in that the resonance lies along tilted planes. Its width depends on damping effects and is independent of the gas sound speed. The resonance does not excite propagating waves. At a given large azimuthal wavenumber k{sub y} > h {sup –1} (for disk thickness h), the buoyancy resonance exerts a torque over a region that lies radially closer to the corotation radius than the Lindblad resonance. Because the torque is localized to the region of excitation, it is potentially subject to the effects of nonlinear saturation. In addition, the torque can be reduced by the effects of radiative heat transfer between the resonant region and its surroundings. For each azimuthal wavenumber, the resonance establishes a large scale density wave pattern in a plane within the disk.

  7. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes

    PubMed Central

    Qiu, Chenchen; Li, Yande

    2017-01-01

    China is a country with vast territory, but economic development and population growth have reduced the usable land resources in recent years. Therefore, reclamation by pumping and filling is carried out in eastern coastal regions of China in order to meet the needs of urbanization. However, large areas of reclaimed land need rapid drainage consolidation treatment. Based on past research on how to improve the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. An analytical solution for this two-dimensional plane model, however, had not previously been derived, so existing analytical solutions could neither support a thorough theoretical analysis of practical engineering nor give relevant guidance. Considering the smearing effect and the rectangular arrangement pattern, an analytical solution is derived to describe the behavior of pore water and the consolidation process using EKG (electro-kinetic geosynthetics) materials, which provide drainage, electric conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of the vacuum preloading and electro-osmosis. The trends of the mean measured and mean analytical values are comparable. Therefore, the consolidation model can accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis. PMID:28771496

  8. Analytic Analysis of Convergent Shocks to Multi-Gigabar Conditions

    NASA Astrophysics Data System (ADS)

    Ruby, J. J.; Rygg, J. R.; Collins, G. W.; Bachmann, B.; Doeppner, T.; Ping, Y.; Gaffney, J.; Lazicki, A.; Kritcher, A. L.; Swift, D.; Nilsen, J.; Landen, O. L.; Hatarik, R.; Masters, N.; Nagel, S.; Sterne, P.; Pardini, T.; Khan, S.; Celliers, P. M.; Patel, P.; Gericke, D.; Falcone, R.

    2017-10-01

    The gigabar experimental platform at the National Ignition Facility is designed to increase understanding of the physical states and processes that dominate in hydrogen at pressures from several hundred Mbar to tens of Gbar. Recent experiments using a solid CD2 ball reached temperatures and densities of order 10⁷ K and several tens of g/cm³, respectively. These conditions lead to the production of D-D fusion neutrons and x-ray bremsstrahlung photons, which allow us to place constraints on the thermodynamic states at peak compression. We use an analytic model to connect the neutron and x-ray emission with the state variables at peak compression. This analytic model is based on the self-similar Guderley solution of an imploding shock wave and the self-similar solution of the point explosion with heat conduction from Reinicke. Work is also being done to create a fully self-similar solution of an imploding shock wave coupled with heat conduction and radiation transport using a general equation of state. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  9. An analytical model of iceberg drift

    NASA Astrophysics Data System (ADS)

    Eisenman, I.; Wagner, T. J. W.; Dell, R.

    2017-12-01

    Icebergs transport freshwater from glaciers and ice shelves, releasing the freshwater into the upper ocean thousands of kilometers from the source. This influences ocean circulation through its effect on seawater density. A standard empirical rule-of-thumb for estimating iceberg trajectories is that they drift at the ocean surface current velocity plus 2% of the atmospheric surface wind velocity. This relationship has been observed in empirical studies for decades, but it has never previously been physically derived or justified. In this presentation, we consider the momentum balance for an individual iceberg, which includes nonlinear drag terms. Applying a series of approximations, we derive an analytical solution for the iceberg velocity as a function of time. In order to validate the model, we force it with surface velocity and temperature data from an observational state estimate and compare the results with iceberg observations in both hemispheres. We show that the analytical solution reduces to the empirical 2% relationship in the asymptotic limit of small icebergs (or strong winds), which approximately applies for typical Arctic icebergs. We find that the 2% value arises due to a term involving the drag coefficients for water and air and the densities of the iceberg, ocean, and air. In the opposite limit of large icebergs (or weak winds), which approximately applies for typical Antarctic icebergs with horizontal length scales greater than about 12 km, we find that the 2% relationship is not applicable and that icebergs instead move with the ocean current, unaffected by the wind. The two asymptotic regimes can be understood by considering how iceberg size influences the relative importance of the wind and ocean current drag terms compared with the Coriolis and pressure gradient force terms in the iceberg momentum balance.
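
    The rule-of-thumb and its two asymptotic limits can be captured in a toy function. This is a sketch of the empirical relation discussed above, not the paper's full momentum-balance solution, and the velocities are invented:

```python
def iceberg_velocity(ocean_uv, wind_uv, wind_fraction=0.02):
    """Empirical drift rule: surface current plus a fraction of the
    surface wind. wind_fraction ~= 0.02 in the small-iceberg (Arctic)
    limit; it tends to 0 for large (Antarctic) icebergs, which simply
    follow the current."""
    return tuple(o + wind_fraction * w for o, w in zip(ocean_uv, wind_uv))

# Small iceberg: 0.2 m/s eastward current, 10 m/s eastward wind
small = iceberg_velocity((0.2, 0.0), (10.0, 0.0))
# Large iceberg: same forcing, but the wind contribution vanishes
large = iceberg_velocity((0.2, 0.0), (10.0, 0.0), wind_fraction=0.0)
```

    The paper's contribution is deriving why wind_fraction comes out near 0.02 (drag coefficients and densities of iceberg, ocean and air) and where the crossover between the two limits lies.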

  10. An approximate analytical solution for interlaminar stresses in angle-ply laminates

    NASA Technical Reports Server (NTRS)

    Rose, Cheryl A.; Herakovich, Carl T.

    1991-01-01

    An improved approximate analytical solution for interlaminar stresses in finite width, symmetric, angle-ply laminated coupons subjected to axial loading is presented. The solution is based upon statically admissible stress fields which take into consideration local property mismatch effects and global equilibrium requirements. Unknown constants in the admissible stress states are determined through minimization of the complementary energy. Typical results are presented for through-the-thickness and interlaminar stress distributions for angle-ply laminates. It is shown that the results represent an improved approximate analytical solution for interlaminar stresses.

  11. An analytic survey of signing inventory procedures in Virginia.

    DOT National Transportation Integrated Search

    1972-01-01

    An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...

  12. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  13. Analytic theory of high-order-harmonic generation by an intense few-cycle laser pulse

    NASA Astrophysics Data System (ADS)

    Frolov, M. V.; Manakov, N. L.; Popov, A. M.; Tikhonova, O. V.; Volkova, E. A.; Silaev, A. A.; Vvedenskii, N. V.; Starace, Anthony F.

    2012-03-01

    We present a theoretical model for describing the interaction of an electron, weakly bound in a short-range potential, with an intense, few-cycle laser pulse. General definitions for the differential probability of above-threshold ionization and for the yield of high-order-harmonic generation (HHG) are presented. For HHG we then derive detailed analytic expressions for the spectral density of generated radiation in terms of the key laser parameters, including the number N of optical cycles in the pulse and the carrier-envelope phase (CEP). In particular, in the tunneling approximation, we provide detailed derivations of the closed-form formulas presented briefly by M. V. Frolov et al. [Phys. Rev. A 83, 021405(R) (2011)], which were used to describe key features of HHG by both H and Xe atom targets in an intense, few-cycle laser pulse. We then provide a complete analysis of the dependence of the HHG spectrum on both N and the CEP φ of an N-cycle laser pulse. Most importantly, we show analytically that the structure of the HHG spectrum stems from interference between electron wave packets originating from electron ionization from neighboring half-cycles near the peak of the intensity envelope of the few-cycle laser pulse. Such interference is shown to be very sensitive to the CEP. The usual HHG spectrum for a monochromatic driving laser field (comprising harmonic peaks at odd multiples of the carrier frequency and spaced by twice the carrier frequency) is shown analytically to occur only in the limit of very large N, and begins to form, as N increases, in the energy region beyond the HHG plateau cutoff.

  14. An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.

    PubMed

    Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A

    2000-05-01

    The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps and the MRI scan time several-fold. The original approach utilized numerical gradient-descent fitting with the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transforms is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to a large number of detectors. The method is demonstrated on a phantom and humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.

  15. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
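
    In the first-stage case of this framework, the moderated X→M path has slope a1 + a3·W, so the indirect effect of X on Y through M at a moderator value w is (a1 + a3·w)·b. A minimal sketch with invented coefficients:

```python
def conditional_indirect_effect(a1, a3, b, w):
    """Indirect effect of X on Y through M when the first-stage path
    is moderated: (a1 + a3*w) * b. In practice the coefficients come
    from the fitted moderated-regression and path equations."""
    return (a1 + a3 * w) * b

# Illustrative coefficients; evaluate at -1 SD and +1 SD of W
low  = conditional_indirect_effect(a1=0.40, a3=0.25, b=0.50, w=-1.0)
high = conditional_indirect_effect(a1=0.40, a3=0.25, b=0.50, w=1.0)
```

    Comparing the effect at low versus high moderator values is exactly the kind of interpretation the framework is meant to make transparent (significance would then be assessed, e.g., by bootstrapping).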

  16. EXAMPLES OF THE ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  17. An improved 3D MoF method based on analytical partial derivatives

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Zhang, Xiong

    2016-12-01

    The MoF (Moment of Fluid) method is one of the most accurate approaches among various surface reconstruction algorithms. Like other second-order methods, the MoF method needs to solve an implicit optimization problem to obtain the optimal approximate surface. Therefore, the partial derivatives of the objective function have to be involved during the iteration for efficiency and accuracy. However, to the best of our knowledge, the derivatives are currently estimated numerically by finite-difference approximation because it is very difficult to obtain the analytical derivatives of the objective function for an implicit optimization problem. Employing numerical derivatives in an iteration not only increases the computational cost, but also deteriorates the convergence rate and robustness of the iteration due to their numerical error. In this paper, the analytical first-order partial derivatives of the objective function are deduced for 3D problems. The analytical derivatives can be calculated accurately, so they are incorporated into the MoF method to improve its accuracy, efficiency and robustness. Numerical studies show that by using the analytical derivatives the iterations converge in all mixed cells with an efficiency improvement of 3 to 4 times.
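
    The accuracy issue with finite-difference derivatives that motivates this paper is easy to demonstrate on a generic function (not the MoF objective itself): a central difference carries an O(h²) truncation error, and shrinking h too far trades it for round-off, whereas an analytic derivative has neither problem:

```python
import math

def central_diff(f, x, h):
    """Central finite-difference derivative, the kind of numerical
    approximation that analytical derivatives replace."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.0
exact = math.cos(x)                       # analytic derivative of sin
err_coarse = abs(central_diff(math.sin, x, 1e-1) - exact)    # truncation-dominated
err_good   = abs(central_diff(math.sin, x, 1e-5) - exact)    # near-optimal h
err_tiny   = abs(central_diff(math.sin, x, 1e-13) - exact)   # round-off-dominated
```

    Even at the best step size the error never vanishes, and it grows again once cancellation dominates; an analytic derivative is exact at every iteration, which is what improves the MoF convergence rate and robustness.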

  18. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  19. Developing an Emergency Physician Productivity Index Using Descriptive Health Analytics.

    PubMed

    Khalifa, Mohamed

    2015-01-01

    Emergency department (ED) crowding has become a major barrier to receiving timely emergency care. At King Faisal Specialist Hospital and Research Center, Saudi Arabia, we identified variables and factors affecting crowding and performance in order to develop indicators that support evaluation and improvement. To measure the efficiency and activity of throughput processes, it was important to develop an ED physician productivity index. Data on all ED patient encounters over the last six months of 2014 were retrieved, and descriptive health analytics methods were used. Three variables were identified for their influence on productivity and performance: Number of Treated Patients per Physician, Patient Acuity Level and Treatment Time. The study suggested a formula that calculates the productivity index of each physician by dividing the Number of Treated Patients by Patient Acuity Level squared and Treatment Time, in order to identify physicians with a low productivity index and investigate causes and factors.
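
    The suggested formula can be sketched in Python. The function name, the sample values and the reading of the formula (patients divided by acuity squared times treatment time) are illustrative assumptions, not data or code from the paper:

```python
def productivity_index(treated_patients, acuity_level, treatment_time_hours):
    """One plausible reading of the study's formula:
    index = treated patients / (acuity level^2 * treatment time)."""
    return treated_patients / (acuity_level ** 2 * treatment_time_hours)

# Illustrative physicians: (treated patients, mean acuity level, mean treatment hours)
physicians = {"A": (40, 2.0, 1.5), "B": (25, 3.0, 2.0)}
scores = {name: productivity_index(*vals) for name, vals in physicians.items()}
low = min(scores, key=scores.get)   # flag the lowest index for follow-up
print(scores, low)
```

    Squaring the acuity level weights complex patients more heavily, so a physician treating fewer but sicker patients is not automatically ranked as less productive.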

  20. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882
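
    The two-dimensional rotation fact invoked above is a standard identity, stated here for reference:

```latex
\text{If } \nabla\cdot v = \partial_x v_1 + \partial_y v_2 = 0
\text{ on a simply connected 2D domain, then } w = (-v_2,\, v_1) \text{ satisfies}
\nabla\times w = \partial_x w_2 - \partial_y w_1
             = \partial_x v_1 + \partial_y v_2 = 0,
\text{so } w \text{ is curl-free and } w = \nabla\varphi
\text{ for some scalar potential } \varphi.
```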

  1. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552

  2. Physiogenomic Analysis of Localized fMRI Brain Activity in Schizophrenia

    PubMed Central

    Windemuth, Andreas; Calhoun, Vince D.; Pearlson, Godfrey D.; Kocherla, Mohan; Jagannathan, Kanchana; Ruaño, Gualberto

    2009-01-01

    The search for genetic factors associated with disease is complicated by the complexity of the biological pathways linking genotype and phenotype. This analytical complexity is particularly concerning in diseases historically lacking reliable diagnostic biological markers, such as schizophrenia and other mental disorders. We investigate the use of functional magnetic resonance imaging (fMRI) as an intermediate phenotype (endophenotype) to identify physiogenomic associations to schizophrenia. We screened 99 subjects, 30 subjects diagnosed with schizophrenia, 13 unaffected relatives of schizophrenia patients, and 56 unrelated controls, for gene polymorphisms associated with fMRI activation patterns at two locations in temporal and frontal lobes previously implied in schizophrenia. A total of 22 single nucleotide polymorphisms (SNPs) in 15 genes from the dopamine and serotonin neurotransmission pathways were genotyped in all subjects. We identified three SNPs in genes that are significantly associated with fMRI activity. SNPs of the dopamine beta-hydroxylase (DBH) gene and of the dopamine receptor D4 (DRD4) were associated with activity in the temporal and frontal lobes, respectively. One SNP of serotonin-3A receptor (HTR3A) was associated with temporal lobe activity. The results of this study support the physiogenomic analysis of neuroimaging data to discover associations between genotype and disease-related phenotypes. PMID:18330705

  3. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based, Graduate-Level Analytical Chemistry Course

    NASA Astrophysics Data System (ADS)

    Toh, Chee-Seng

    2007-04-01

    A research-focused approach is described for a nonlaboratory-based, graduate-level module on analytical chemistry. The approach utilizes activities commonly practiced in active research laboratories, in particular activities involving the logging of ideas and thoughts, journal clubs, proposal writing, classroom participation and discussions, and laboratory tours. This approach was adopted without compromising the course content, and the results suggest possible adaptation and implementation in other graduate-level courses.

  4. [An evaluation of costs in nephrology by means of analytical accounting system].

    PubMed

    Hernández-Jaras, J; García Pérez, H; Pons, R; Calvo, C

    2005-01-01

    Analytical accounting is an accounting technique directed to the evaluation, by means of pre-established distribution criteria, of the internal economy of the hospital, in order to know the effectiveness and efficiency of its clinical units. The aim of this study was to analyze the activity and costs of the Nephrology Department of the General Hospital of Castellón. Hospitalization and ambulatory care activity during 2003 was analysed. Hospitalization discharges were grouped into DRGs and the costs per DRG were determined. Total costs of hospitalization and ambulatory care were 560,434.9 and 146,317.8 Euros, respectively, and the costs of one stay, one first outpatient visit and one maintenance visit were 200, 63 and 31.6 Euros, respectively. Eighty percent of the discharges were grouped into 9 DRGs, and DRG number 316 (Renal Failure) represented 30% of the total productivity. The cost of DRG 316 was 3,178.2 Euros, of which 16% represented laboratory costs and the costs of diagnostic or therapeutic procedures. With the introduction of analytical accounting and the DRG system, Nephrology Departments can acquire fuller information on the results and costs of treatment. These techniques permit improvement of financial and economic performance.

  5. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    PubMed

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting

  7. An analytical and experimental investigation of the response of the curved, composite frame/skin specimens

    NASA Technical Reports Server (NTRS)

    Moas, Eduardo; Boitnott, Richard L.; Griffin, O. Hayden, Jr.

    1994-01-01

    Six-foot diameter, semicircular graphite/epoxy specimens representative of generic aircraft frames were loaded quasi-statically to determine their load response and failure mechanisms for the large deflections that occur in airplane crashes. These frame/skin specimens consisted of a cylindrical skin section co-cured with a semicircular I-frame. The skin provided the necessary lateral stiffness to keep deformations in the plane of the frame, in order to realistically represent deformations as they occur in actual fuselage structures. Various frame laminate stacking sequences and geometries were evaluated by statically loading the specimens until multiple failures occurred. Two analytical methods were compared for modeling the frame/skin specimens: a two-dimensional shell finite element analysis and a one-dimensional, closed-form, curved beam solution derived using an energy method. Flange effectivities were included in the beam analysis to account for the curling phenomenon that occurs in the thin flanges of curved beams. Good correlation was obtained between experimental results and the analytical predictions of the linear response of the frames prior to initial failure. The specimens were found to be useful for evaluating composite frame designs.

  8. An analytical approach of thermodynamic behavior in a gas target system on a medical cyclotron.

    PubMed

    Jahangiri, Pouyan; Zacchia, Nicholas A; Buckley, Ken; Bénard, François; Schaffer, Paul; Martinez, D Mark; Hoehr, Cornelia

    2016-01-01

    An analytical model has been developed to study the thermo-mechanical behavior of gas targets used to produce medical isotopes, assuming that the system reaches steady-state. It is based on an integral analysis of the mass and energy balance of the gas-target system, the ideal gas law, and the deformation of the foil. The heat transfer coefficients for different target bodies and gases have been calculated. Excellent agreement is observed between experiments performed at TRIUMF's 13 MeV cyclotron and the model. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Collisional evolution - an analytical study for the non steady-state mass distribution.

    NASA Astrophysics Data System (ADS)

    Vieira Martins, R.

    1999-05-01

    To study the collisional evolution of asteroidal groups one can use an analytical solution for self-similar collision cascades. This solution is suitable for studying the steady-state mass distribution of collisional fragmentation. However, outside steady-state conditions, this solution is not satisfactory for some values of the collisional parameters. In fact, for some values of the exponent of the mass distribution power law of an asteroidal group and its relation to the exponent of the function which describes "how rocks break", the author arrives at singular points of the equation which describes the collisional evolution. These singularities appear because some approximations are usually made in the laborious evaluation of the many integrals that appear in the analytical calculations; they concern the cutoff for the smallest and the largest bodies. These singularities set some restrictions on the study of the analytical solution of the collisional equation. To overcome them, the author performed an algebraic computation considering the smallest and the largest bodies, and obtained analytical expressions for the integrals that describe the collisional evolution without restriction on the parameters. However, the new distribution is more sensitive to the values of the collisional parameters. In particular, the steady-state solution for the differential mass distribution has exponents slightly different from 11/6 for the usual parameters in the asteroid belt. The sensitivity of this distribution with respect to the parameters is analyzed for the usual values in asteroidal groups. With an expression for the mass distribution without singularities, one can also evaluate its time evolution. The author arrives at an analytical expression given by a power series of terms constituted by a small parameter multiplied by the mass to an exponent which depends on the initial power law distribution. This expression is a formal solution for the

  10. An analytical and experimental investigation of resistojet plumes

    NASA Technical Reports Server (NTRS)

    Zana, L. M.; Hoffman, D. J.; Breyley, L. R.; Serafini, J. S.

    1987-01-01

    As a part of the electrothermal propulsion plume research program at the NASA Lewis Research Center, efforts have been initiated to analytically and experimentally investigate the plumes of resistojet thrusters. The method of G.A. Simons for the prediction of rocket exhaust plumes is developed for the resistojet. Modifications are made to the source flow equations to account for the increased effects of the relatively large nozzle boundary layer. Additionally, preliminary mass flux measurements of a laboratory resistojet using CO2 propellant at 298 K have been obtained with a cryogenically cooled quartz crystal microbalance (QCM). There is qualitative agreement between analysis and experiment, at least in terms of the overall number density shape functions in the forward flux region.

  11. Development and application of accurate analytical models for single active electron potentials

    NASA Astrophysics Data System (ADS)

    Miller, Michelle; Jaron-Becker, Agnieszka; Becker, Andreas

    2015-05-01

    The single active electron (SAE) approximation is a theoretical model frequently employed to study scenarios in which inner-shell electrons may productively be treated as frozen spectators to a physical process of interest, and accurate analytical approximations for these potentials are sought as a useful simulation tool. Density functional theory is often used to construct an SAE potential, requiring that a further approximation for the exchange-correlation functional be enacted. In this study, we employ the Krieger, Li, and Iafrate (KLI) modification to the optimized-effective-potential (OEP) method to reduce the complexity of the problem to the straightforward solution of a system of linear equations, through simple arguments regarding the behavior of the exchange-correlation potential in regions where a single orbital dominates. We employ this method for the solution of atomic and molecular potentials, and use the resultant curves to devise a systematic construction of highly accurate and useful analytical approximations for several systems. Supported by the U.S. Department of Energy (Grant No. DE-FG02-09ER16103), and the U.S. National Science Foundation (Graduate Research Fellowship, Grants No. PHY-1125844 and No. PHY-1068706).

  12. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the objectives set are being fulfilled and, in the case of errors, to apply corrective actions and ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment at six-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically described as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions.
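
    The statistics named above (mean, standard deviation, coefficient of variation, and systematic, random and total error) can be sketched as follows. The total-error convention TE = |bias%| + 1.65 × CV% and the sample values are assumptions for illustration, not taken from the article:

```python
import statistics

def qc_stats(measurements, target):
    """Summary statistics for one internal QC control level.
    Total error uses the common convention TE = |bias%| + 1.65 * CV%
    (an assumption; the article does not state its exact formula)."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    cv_pct = 100.0 * sd / mean                    # random error (imprecision)
    bias_pct = 100.0 * (mean - target) / target   # systematic error (bias)
    te_pct = abs(bias_pct) + 1.65 * cv_pct        # total analytical error
    return mean, cv_pct, bias_pct, te_pct

# Illustrative glucose control values (mg/dL) against a 100 mg/dL target value
mean, cv, bias, te = qc_stats([98, 101, 99, 102, 100, 97], 100.0)
print(round(cv, 2), round(bias, 2), round(te, 2))
```

    Comparing the computed total error against the laboratory's quality specification is what triggers the corrective actions the abstract describes.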

  13. Absolute activity quantitation from projections using an analytical approach: comparison with iterative methods in Tc-99m and I-123 brain SPECT

    NASA Astrophysics Data System (ADS)

    Fakhri, G. El; Kijewski, M. F.; Moore, S. C.

    2001-06-01

    Estimates of SPECT activity within certain deep brain structures could be useful for clinical tasks such as early prediction of Alzheimer's disease with Tc-99m or Parkinson's disease with I-123; however, such estimates are biased by poor spatial resolution and inaccurate scatter and attenuation corrections. We compared an analytical approach (AA) to more accurate quantitation with a slower iterative approach (IA). Monte Carlo simulated projections of 12 normal and 12 pathologic Tc-99m perfusion studies, as well as 12 normal and 12 pathologic I-123 neurotransmission studies, were generated using a digital brain phantom and corrected for scatter by a multispectral fitting procedure. The AA included attenuation correction by a modified Metz-Fan algorithm and activity estimation by a technique that incorporated Metz filtering to compensate for variable collimator response (VCR); the IA modeled attenuation and VCR in the projector/backprojector of an ordered subsets-expectation maximization (OSEM) algorithm. Bias and standard deviation over the 12 normal and 12 pathologic patients were calculated with respect to the reference values in the corpus callosum, caudate nucleus, and putamen. The IA and AA yielded similar quantitation results in both Tc-99m and I-123 studies in all brain structures considered, in both normal and pathologic patients. The bias with respect to the reference activity distributions was less than 7% for Tc-99m studies, but greater than 30% for I-123 studies, due to the partial volume effect in the striata. Our results were validated using I-123 physical acquisitions of an anthropomorphic brain phantom. The AA yielded quantitation accuracy comparable to that obtained with the IA, while requiring much less processing time. However, in most conditions, the IA yielded lower noise for the same bias than did the AA.

  14. An analytically solvable three-body break-up model problem in hyperspherical coordinates

    NASA Astrophysics Data System (ADS)

    Ancarani, L. U.; Gasaneo, G.; Mitnik, D. M.

    2012-10-01

    An analytically solvable S-wave model for three-particle break-up processes is presented. The scattering process is represented by a non-homogeneous Coulombic Schrödinger equation in which the driven term is given by a Coulomb-like interaction multiplied by the product of a continuum wave function and a bound state in the particles' coordinates. The closed-form solution is derived in hyperspherical coordinates, leading to an analytic expression for the associated scattering transition amplitude. The proposed scattering model contains most of the difficulties encountered in real three-body scattering problems, e.g., non-separability in the electrons' spherical coordinates and Coulombic asymptotic behavior. Since the coordinate coupling is completely different, the model provides an alternative test to that given by the Temkin-Poet model. Knowledge of the analytic solution provides an interesting benchmark for testing numerical methods dealing with the double continuum, in particular in the asymptotic regions. A hyperspherical Sturmian approach recently developed for three-body collisional problems is used to reproduce the analytical results to high accuracy. In addition, we generalized the model, generating an approximate wave function possessing the correct radial asymptotic behavior corresponding to an S-wave three-body Coulomb problem. The model allows us to explore the typical structure of the solution of a three-body driven equation, to identify three regions (the driven, the Coulombic and the asymptotic), and to analyze how far one has to go to extract the transition amplitude.

  15. Hemispheric specialization and creative thinking: a meta-analytic review of lateralization of creativity.

    PubMed

    Mihov, Konstantin M; Denzler, Markus; Förster, Jens

    2010-04-01

    In the last two decades, research on the neurophysiological processes of creativity has produced contradictory results. Whereas most research suggests right-hemisphere dominance in creative thinking, left-hemisphere dominance has also been reported. The present research is a meta-analytic review of the literature to establish how creative thinking relates to relative hemispheric dominance. The analysis was performed on the basis of a non-parametric vote-counting approach, and effect-size calculations using Cramér's phi suggest relative dominance of the right hemisphere during creative thinking. Moderator analyses revealed no difference in the predominant right-hemispheric activation for verbal vs. figural tasks, holistic vs. analytical tasks, or context-dependent vs. context-independent tasks. Suggestions for further investigations with meta-analytic and neuroscience methodologies, to answer the question of left-hemispheric activation and further moderation of the effects, are discussed. Copyright 2009 Elsevier Inc. All rights reserved.
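
    As a rough illustration of vote counting with Cramér's phi, the sketch below tests a tally of study "votes" against a 50/50 null via a chi-square goodness-of-fit statistic. The counts and the exact procedure are illustrative assumptions, not the meta-analysis's data or method:

```python
import math

def cramers_phi(right_votes, left_votes):
    """Effect size for a two-category vote count against a 50/50 null:
    chi-square goodness of fit, then phi = sqrt(chi2 / n)."""
    n = right_votes + left_votes
    expected = n / 2.0
    chi2 = ((right_votes - expected) ** 2 / expected
            + (left_votes - expected) ** 2 / expected)
    return math.sqrt(chi2 / n)

# Illustrative tally: 18 studies favoring the right hemisphere, 6 the left
print(round(cramers_phi(18, 6), 2))  # 0.5
```

    A phi of 0 means the votes split evenly; values toward 1 indicate the studies overwhelmingly favor one hemisphere.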

  16. Characteristics, Properties and Analytical Methods of Amoxicillin: A Review with Green Approach.

    PubMed

    de Marco, Bianca Aparecida; Natori, Jéssica Sayuri Hisano; Fanelli, Stefany; Tótoli, Eliane Gandolpho; Salgado, Hérida Regina Nunes

    2017-05-04

    Bacterial infections are the second leading cause of global mortality. Considering this fact, it is extremely important to study antimicrobial agents. Amoxicillin is an antimicrobial agent that belongs to the class of penicillins; it has bactericidal activity and is widely used in the Brazilian health system. In the literature, several analytical methods are found for the identification and quantification of this penicillin, which are essential for its quality control, ensuring that product characteristics, therapeutic efficacy and patient safety are maintained. Thus, this study presents a brief literature review on amoxicillin and the analytical methods developed for the analysis of this drug in official and scientific papers. The major analytical methods found were high-performance liquid chromatography (HPLC), ultra-performance liquid chromatography (U-HPLC), capillary electrophoresis, iodometry and diffuse reflectance infrared Fourier transform. It is essential to note that most of the developed methods used toxic and hazardous solvents, which makes it necessary for industries and researchers to choose to develop environmentally friendly techniques that provide enhanced benefits to the environment and staff.

  17. An approximate analytical solution for describing surface runoff and sediment transport over hillslope

    NASA Astrophysics Data System (ADS)

    Tao, Wanghai; Wang, Quanjiu; Lin, Henry

    2018-03-01

    Soil and water loss from farmland causes land degradation and water pollution, so continued efforts are needed to establish mathematical models for quantitative analysis of the relevant processes and mechanisms. In this study, an approximate analytical solution has been developed for an overland flow model and a sediment transport model, offering a simple and effective means of predicting overland flow and erosion under natural rainfall conditions. In the overland flow model, the flow regime was considered to be transitional, with the value of the parameter β (in the kinematic wave model) approximately two. The rate of change of unit discharge with distance was assumed to be constant and equal to the runoff rate at the outlet of the plane. The excess rainfall was considered to be constant under uniform rainfall conditions. The overland flow model can be further applied to natural rainfall conditions by treating the excess rainfall intensity as constant over each small time interval. For the sediment model, recommended values of the runoff erosion calibration constant (cr) and the splash erosion calibration constant (cf), 0.15 and 0.12 respectively, are given in this study so that the model is easier to use. Comparisons with observed results were carried out to validate the proposed analytical solution. The results showed that the approximate analytical solution closely matches the observed data, thus providing an alternative method of predicting runoff generation and sediment yield, and a more convenient means of analyzing the quantitative relationships between variables. Furthermore, the model can be used as a theoretical basis for developing runoff and erosion control methods.

  18. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to quantitative analytical chemistry, the so-called "Analytical Chemistry II" course, especially as related to essential oil analysis. The learning outcomes of this course include understanding of lectures, the skill of applying course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, ability and attitude. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.

  19. Solution-based analysis of multiple analytes by a sensor array: toward the development of an electronic tongue

    NASA Astrophysics Data System (ADS)

    Savoy, Steven M.; Lavigne, John J.; Yoo, J. S.; Wright, John; Rodriguez, Marc; Goodey, Adrian; McDoniel, Bridget; McDevitt, John T.; Anslyn, Eric V.; Shear, Jason B.; Ellington, Andrew D.; Neikirk, Dean P.

    1998-12-01

    A micromachined sensor array has been developed for the rapid characterization of multi-component mixtures in aqueous media. The sensor functions in a manner analogous to that of the mammalian tongue, using an array composed of individually immobilized polystyrene-polyethylene glycol composite microspheres selectively arranged in micromachined etch cavities localized on silicon wafers. Sensing occurs via colorimetric or fluorometric changes to indicator molecules that are covalently bound to amine termination sites on the polymeric microspheres. The hybrid micromachined structure has been interfaced directly to a charge-coupled device that is used for the simultaneous acquisition of optical data from the individually addressable `taste bud' elements. With the miniature sensor array, data streams composed of red, green, and blue color patterns distinctive for the analytes in solution are rapidly acquired. The unique combination of carefully chosen reporter molecules with water-permeable microspheres allows the simultaneous detection and quantification of a variety of analytes. The fabrication of the sensor structures and the initial colorimetric and fluorescent responses for pH, Ca2+, Ce3+, and sugar are reported. Interfacing to microfluidic components should also be possible, producing a complete sampling/sensing system.

  20. Behavior Analytic Contributions to the Study of Creativity

    ERIC Educational Resources Information Center

    Kubina, Richard M., Jr.; Morrison, Rebecca S.; Lee, David L.

    2006-01-01

    As researchers continue to study creativity, a behavior analytic perspective may provide new vistas by offering an additional perspective. Contemporary behavior analysis began with B. F. Skinner and offers a selectionist approach to the scientific investigation of creativity. Behavior analysis contributes to the study of creativity by…

  1. An analytic study of nonsteady two-phase laminar boundary layer around an airfoil

    NASA Technical Reports Server (NTRS)

    Hsu, Yu-Kao

    1989-01-01

    Recently, NASA, the FAA, and other organizations have focused their attention on the possible effects of rain on airfoil performance. Rhode carried out early experiments and concluded that rain impacting the aircraft increased the drag. Bergrum made numerical calculations of rain effects on airfoils. Luers and Haines did an analytic investigation and found that heavy rain induces severe aerodynamic penalties, including both a momentum penalty due to the impact of the rain and drag and lift penalties due to rain roughening of the airfoil and fuselage. More recently, Hansman and Barsotti performed experiments and concluded that performance degradation of an airfoil in heavy rain is due to the effective roughening of the surface by the water layer. Hansman and Craig did further experimental research at low Reynolds number. Dunham made a critical review of the potential influence of rain on airfoil performance. Dunham et al. carried out experiments on a transport-type airfoil and concluded that there is a reduction of maximum lift capability with an increase in drag. Published analytic research on the two-phase boundary layer around an airfoil is scarce; the present work develops such an analytic treatment. The following assumptions are made: the fluid flow is non-steady, viscous, and incompressible; the airfoil is represented by a two-dimensional flat plate; and there is only a laminar boundary layer throughout the flow region. The boundary layer approximation is solved and discussed.

  2. Pure-rotational spectrometry: a vintage analytical method applied to modern breath analysis.

    PubMed

    Hrubesh, Lawrence W; Droege, Michael W

    2013-09-01

    Pure-rotational spectrometry (PRS) is an established method, typically used to study structures and properties of polar gas-phase molecules, including isotopic and isomeric varieties. PRS has also been used as an analytical tool where it is particularly well suited for detecting or monitoring low-molecular-weight species that are found in exhaled breath. PRS is principally notable for its ultra-high spectral resolution which leads to exceptional specificity to identify molecular compounds in complex mixtures. Recent developments using carbon aerogel for pre-concentrating polar molecules from air samples have extended the sensitivity of PRS into the part-per-billion range. In this paper we describe the principles of PRS and show how it may be configured in several different modes for breath analysis. We discuss the pre-concentration concept and demonstrate its use with the PRS analyzer for alcohols and ammonia sampled directly from the breath.

  3. THE IMPORTANCE OF PROPER INTENSITY CALIBRATION FOR RAMAN ANALYSIS OF LOW-LEVEL ANALYTES IN WATER

    EPA Science Inventory

    Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...

  4. An analytic cosmology solution of Poincaré gauge gravity

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Chee, Guoying

    2016-06-01

    A cosmology based on Poincaré gauge theory is developed and an analytic solution is obtained. The calculated results agree with observational data and can be compared with the ΛCDM model. The cosmological constant puzzle, i.e., the coincidence and fine-tuning problems, is solved naturally. The cosmological constant turns out to be the intrinsic torsion and curvature of the vacuum universe, and is derived from the theory naturally rather than added artificially. The dark energy originates from geometry; it includes the cosmological constant but differs from it. Analytic expressions for the equation of state of the dark energy and the density parameters of the matter and the geometric dark energy are derived. The full equations of linear cosmological perturbations and their solutions are obtained.

  5. Chemical clocks, oscillations, and other temporal effects in analytical chemistry: oddity or viable approach?

    PubMed

    Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L

    2018-05-31

    Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, and somewhat unusual, analytical strategy, which relies on measuring the time interval between the occurrences of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction, in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, have been disclosed in the past few years. They are very attractive from a fundamental point of view, but so far only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.
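
    As an illustration of the chrono-analytical idea, a timed readout can be converted to concentration through an ordinary calibration curve; the standards below are invented and a linear period-concentration relation is assumed (real oscillating reactions are generally nonlinear, so this is only a sketch):

```python
import numpy as np

# Hypothetical calibration data for a chemo-chronometric assay:
# oscillation period (s) measured at known analyte concentrations (mM).
conc_std = np.array([0.5, 1.0, 2.0, 4.0])         # standards, mM
period_std = np.array([120.0, 95.0, 70.0, 45.0])  # observed periods, s

# Fit a linear calibration period = a*conc + b (adequate only over the
# range where the period-concentration relation is close to linear).
a, b = np.polyfit(conc_std, period_std, 1)

def conc_from_period(period):
    """Invert the calibration: estimate concentration from a timed period."""
    return (period - b) / a

# A sample whose oscillation period was measured as 80 s:
print(round(conc_from_period(80.0), 2))  # -> 2.0 mM
```

    The measurement itself is a time interval, so a stopwatch or a frame counter replaces the photometer as the primary readout.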

  6. Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory

    ERIC Educational Resources Information Center

    Jensen, Mark B.

    2015-01-01

    An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…

  7. Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates, Phase II Results

    NASA Technical Reports Server (NTRS)

    Allen, P. A.; Wells, D. N.

    2017-01-01

    The second phase of an analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted under the auspices of ASTM Interlaboratory Study 732. The interlaboratory study (ILS) had 10 participants with a broad range of expertise and experience, and experimental results from a surface crack tension test in 4142 steel plate loaded well into the elastic-plastic regime provided the basis for the study. The participants were asked to evaluate a surface crack tension test according to the version of the surface crack initiation toughness testing standard published at the time of the ILS, E2899-13. Data were provided to each participant representing the fundamental information that a mechanical test laboratory would supply prior to evaluating the test result. Overall, the participants' test analysis results were in good agreement, and constructive feedback was received that has resulted in an improved published version of the standard, E2899-15.

  8. Development of an analytical guidance algorithm for lunar descent

    NASA Astrophysics Data System (ADS)

    Chomel, Christina Tvrdik

    In recent years, NASA has indicated a desire to return humans to the moon. With NASA planning manned missions within the next couple of decades, concept development for these lunar vehicles has begun. The guidance, navigation, and control (GN&C) computer programs that will perform the function of safely landing a spacecraft on the moon are part of that development. The lunar descent guidance algorithm takes the horizontally oriented spacecraft from orbital speeds hundreds of kilometers from the desired landing point to the landing point at an almost vertical orientation and very low speed. Existing lunar descent GN&C algorithms date back to the Apollo era, with little work available for implementation since then. Though these algorithms met the criteria of the 1960s, they are cumbersome today. At the basis of the lunar descent phase are two elements: the targeting, which generates a reference trajectory, and the real-time guidance, which forces the spacecraft to fly that trajectory. The Apollo algorithm utilizes a complex, iterative, numerical optimization scheme for developing the reference trajectory. The real-time guidance utilizes this reference trajectory in the form of a quartic rather than a more general format to force the real-time trajectory errors to converge to zero; however, there exist no guarantees under any conditions for this convergence. The proposed algorithm implements a purely analytical targeting algorithm used to generate two-dimensional trajectories "on-the-fly" or to retarget the spacecraft to another landing site altogether. It is based on the analytical solutions to the equations for speed, downrange, and altitude as a function of flight path angle and assumes two constant thrust acceleration curves. The proposed real-time guidance algorithm has at its basis the three-dimensional non-linear equations of motion and a control law that is proven through Lyapunov analysis to converge to a reference trajectory under certain conditions.
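
    The planar point-mass equations of motion that such a targeting scheme builds on can be sketched numerically; the toy propagation below (flat-moon model, thrust opposing the velocity vector, assumed thrust level and initial conditions) is only illustrative and is not the author's algorithm:

```python
import math

# Hedged sketch: planar flat-moon point-mass descent with constant
# thrust acceleration a_T directed opposite the velocity vector.
# Values are assumed for illustration, not taken from the dissertation.
g_moon = 1.62   # lunar surface gravity, m/s^2
a_T = 3.0       # constant thrust acceleration, m/s^2 (assumed)

def step(v, gamma, h, x, dt=0.1):
    """One explicit Euler step of the planar equations of motion.
    v: speed (m/s), gamma: flight-path angle (rad, negative = descending),
    h: altitude (m), x: downrange (m)."""
    dv = -a_T - g_moon * math.sin(gamma)
    dgamma = -g_moon * math.cos(gamma) / v
    dh = v * math.sin(gamma)
    dx = v * math.cos(gamma)
    return v + dv * dt, gamma + dgamma * dt, h + dh * dt, x + dx * dt

# Start near orbital speed, shallow descent, 15 km altitude.
v, gamma, h, x = 1700.0, math.radians(-2.0), 15000.0, 0.0
while v > 50.0 and h > 0.0:
    v, gamma, h, x = step(v, gamma, h, x)
print(f"end of braking: v={v:.0f} m/s, h={h:.0f} m, x={x/1000:.1f} km")
```

    An analytical targeting algorithm replaces this numerical march with closed-form expressions for speed, downrange, and altitude versus flight path angle, which is what makes on-the-fly retargeting cheap.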

  9. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. A virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  10. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, Data Analytics and, more broadly, the Data Scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth Science Data Analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of Data Analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics Data Science student internship opportunities.

  11. Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.

    2010-06-07

    Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.

  12. A meta-analytic review of the effects of mindfulness meditation on telomerase activity.

    PubMed

    Schutte, Nicola S; Malouff, John M

    2014-04-01

    The enzyme telomerase, through its influence on telomere length, is associated with health and mortality. Four pioneering randomized controlled trials, including a total of 190 participants, provided information on the effect of mindfulness meditation on telomerase. A meta-analytic effect size of d=0.46 indicated that mindfulness meditation leads to increased telomerase activity in peripheral blood mononuclear cells. These results suggest the need for further large-scale trials investigating optimal implementation of mindfulness meditation to facilitate telomerase functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.
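
    A pooled effect size like the d=0.46 reported above is typically an inverse-variance weighted average of per-study effects. The sketch below uses invented study values (NOT the four trials in the abstract) with a fixed-effect model and the standard large-sample variance approximation for Cohen's d:

```python
import math

# Illustrative fixed-effect pooling of Cohen's d values.
# Study data are hypothetical, not the trials cited in the abstract.
studies = [  # (d, n_treatment, n_control)
    (0.55, 30, 30),
    (0.40, 25, 24),
    (0.35, 20, 21),
]

def var_d(d, n1, n2):
    """Approximate sampling variance of Cohen's d."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

weights = [1.0 / var_d(d, n1, n2) for d, n1, n2 in studies]
pooled = sum(w * s[0] for w, s in zip(weights, studies)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled d
print(f"pooled d = {pooled:.2f} +/- {1.96 * se:.2f} (95% CI half-width)")
```

    A random-effects model would add a between-study variance term to each weight; with only a handful of trials, as here, that estimate is itself quite uncertain.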

  13. Linear models of activation cascades: analytical solutions and coarse-graining of delayed signal transduction

    PubMed Central

    Desikan, Radhika

    2016-01-01

    Cellular signal transduction usually involves activation cascades, the sequential activation of a series of proteins following the reception of an input signal. Here, we study the classic model of weakly activated cascades and obtain analytical solutions for a variety of inputs. We show that in the special but important case of optimal gain cascades (i.e. when the deactivation rates are identical) the downstream output of the cascade can be represented exactly as a lumped nonlinear module containing an incomplete gamma function with real parameters that depend on the rates and length of the cascade, as well as parameters of the input signal. The expressions obtained can be applied to the non-identical case when the deactivation rates are random to capture the variability in the cascade outputs. We also show that cascades can be rearranged so that blocks with similar rates can be lumped and represented through our nonlinear modules. Our results can be used both to represent cascades in computational models of differential equations and to fit data efficiently, by reducing the number of equations and parameters involved. In particular, the length of the cascade appears as a real-valued parameter and can thus be fitted in the same manner as Hill coefficients. Finally, we show how the obtained nonlinear modules can be used instead of delay differential equations to model delays in signal transduction. PMID:27581482
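
    The lumped incomplete-gamma representation described above can be checked numerically for the identical-rate case. The sketch below assumes a unit step input, an activation rate a, and a common deactivation rate d (parameter values are arbitrary); for a cascade of length n the output is then x_n(t) = (a/d)^n P(n, d t), with P the regularized lower incomplete gamma function:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import gammainc  # regularized lower incomplete gamma P(n, x)

# Weakly activated cascade with a unit step input and identical
# deactivation rates: dx_i/dt = a*x_{i-1} - d*x_i, with x_0(t) = 1.
a, d, n = 2.0, 1.5, 5

def rhs(t, x):
    upstream = np.concatenate(([1.0], x[:-1]))  # x_0 is the step input
    return a * upstream - d * x

t_end = 4.0
sol = solve_ivp(rhs, (0.0, t_end), np.zeros(n), rtol=1e-10, atol=1e-12)

numeric = sol.y[-1, -1]                         # x_n(t_end) by integration
analytic = (a / d) ** n * gammainc(n, d * t_end)  # lumped closed form
print(abs(numeric - analytic) < 1e-5)  # -> True
```

    Because n enters only through the gamma-function parameter, it can be treated as a real-valued fitting parameter, which is the point the authors exploit when lumping blocks of a cascade.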

  14. Analytic Energy Gradients for Variational Two-Electron Reduced-Density-Matrix-Driven Complete Active Space Self-Consistent Field Theory.

    PubMed

    Maradzike, Elvis; Gidofalvi, Gergely; Turney, Justin M; Schaefer, Henry F; DePrince, A Eugene

    2017-09-12

    Analytic energy gradients are presented for a variational two-electron reduced-density-matrix (2-RDM)-driven complete active space self-consistent field (CASSCF) method. The active-space 2-RDM is determined using a semidefinite programming (SDP) algorithm built upon an augmented Lagrangian formalism. Expressions for analytic gradients are simplified by the fact that the Lagrangian is stationary with respect to variations in both the primal and the dual solutions to the SDP problem. Orbital response contributions to the gradient are identical to those that arise in conventional CASSCF methods in which the electronic structure of the active space is described by a full configuration interaction (CI) wave function. We explore the relative performance of variational 2-RDM (v2RDM)- and CI-driven CASSCF for the equilibrium geometries of 20 small molecules. When enforcing two-particle N-representability conditions, full-valence v2RDM-CASSCF-optimized bond lengths display a mean unsigned error of 0.0060 Å and a maximum unsigned error of 0.0265 Å, relative to those obtained from full-valence CI-CASSCF. When enforcing partial three-particle N-representability conditions, the mean and maximum unsigned errors are reduced to only 0.0006 and 0.0054 Å, respectively. For these same molecules, full-valence v2RDM-CASSCF bond lengths computed in the cc-pVQZ basis set deviate from experimentally determined ones on average by 0.017 and 0.011 Å when enforcing two- and three-particle conditions, respectively, whereas CI-CASSCF displays an average deviation of 0.010 Å. The v2RDM-CASSCF approach with two-particle conditions is also applied to the equilibrium geometry of pentacene; optimized bond lengths deviate from those derived from experiment, on average, by 0.015 Å when using a cc-pVDZ basis set and a (22e,22o) active space.

  15. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  16. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  17. Analytic analysis of auxetic metamaterials through analogy with rigid link systems

    NASA Astrophysics Data System (ADS)

    Rayneau-Kirkhope, Daniel; Zhang, Chengzhao; Theran, Louis; Dias, Marcelo A.

    2018-02-01

    In recent years, many structural motifs have been designed with the aim of creating auxetic metamaterials. One area of particular interest in this subject is the creation of auxetic material properties through elastic instability. Such metamaterials switch from conventional behaviour to an auxetic response for loads greater than some threshold value. This paper develops a novel methodology in the analysis of auxetic metamaterials which exhibit elastic instability through analogy with rigid link lattice systems. The results of our analytic approach are confirmed by finite-element simulations for both the onset of elastic instability and post-buckling behaviour including Poisson's ratio. The method gives insight into the relationships between mechanisms within lattices and their mechanical behaviour; as such, it has the potential to allow existing knowledge of rigid link lattices with auxetic paths to be used in the design of future buckling-induced auxetic metamaterials.

  18. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can be easily extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). This would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
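
    The concentration calculation such software automates is, in the commonly used relative method, a ratio of peak areas against a co-irradiated standard of known concentration. The sketch below uses invented numbers and ignores flux and decay-timing corrections; it shows only the core arithmetic and a simple quadrature propagation of the peak-area uncertainties:

```python
import math

# Relative-method sketch for activation analysis (hypothetical values;
# flux, decay, and efficiency corrections are omitted for brevity).
def concentration(area_u, mass_u, area_s, mass_s, conc_s,
                  rel_unc_area_u, rel_unc_area_s):
    """Return (concentration, absolute uncertainty) of the unknown sample."""
    conc_u = conc_s * (area_u / mass_u) / (area_s / mass_s)
    # Relative uncertainties of the two peak areas add in quadrature.
    rel_unc = math.hypot(rel_unc_area_u, rel_unc_area_s)
    return conc_u, conc_u * rel_unc

c, u = concentration(area_u=5.0e4, mass_u=1.2, area_s=2.0e4, mass_s=1.0,
                     conc_s=10.0, rel_unc_area_u=0.02, rel_unc_area_s=0.03)
print(f"{c:.2f} ppm +/- {u:.2f}")  # -> 20.83 ppm +/- 0.75
```

    A production code would add decay correction to a common reference time and uncertainty contributions from the standard's certified concentration and the sample masses.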

  19. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; and the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.

  20. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    PubMed

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite material theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF. They also demonstrate that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs and represents distinctive groundwork able to sustain future refinements. This paper suggests important features that may be included to improve model realism.
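
    One classical ingredient of such a model, beam-on-elastic-foundation theory, admits a closed-form deflection for an infinite beam on a Winkler foundation under a point load (Hetenyi's solution). The parameter values below are arbitrary placeholders, not the tissue properties used in the paper:

```python
import math

# Infinite beam on a Winkler elastic foundation, point load P at x = 0:
# w(x) = (P*lam / (2*k)) * exp(-lam*x) * (cos(lam*x) + sin(lam*x)),
# with characteristic wavenumber lam = (k / (4*E*I))**0.25.
E = 5.0e6      # elastic modulus, Pa        (placeholder value)
I = 1.0e-12    # second moment of area, m^4 (placeholder value)
k = 2.0e3      # foundation modulus, N/m^2  (placeholder value)
P = 0.5        # point load, N              (placeholder value)

lam = (k / (4.0 * E * I)) ** 0.25  # characteristic wavenumber, 1/m

def deflection(x):
    """Downward deflection at distance x >= 0 from the load."""
    return (P * lam / (2.0 * k)) * math.exp(-lam * x) * (
        math.cos(lam * x) + math.sin(lam * x))

print(f"peak deflection under the load: {deflection(0.0):.6e} m")
```

    The deflection decays over a length scale 1/lam, which is the feature a multishell model exploits when coupling adjacent lamellae.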

  1. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is

  2. Exploration of Tensions in a Mobile-Technology Supported Fieldtrip: An Activity Theory Perspective

    ERIC Educational Resources Information Center

    Lai, Chih-Hung; Chen, Fei-Ching; Yang, Jie-Chi

    2014-01-01

    The purpose of this study was to analyze how mobile technologies were incorporated and implemented in an outdoor learning activity. Two classes of primary school students participated in the experiment. Using activity theory as an analytical framework, it is found that underlying tensions provided rich insights into system dynamics and that…

  3. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

    The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil-gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the cross-references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  5. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus together with several aspects regarding the analysis and terminology used in the determination of this element are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely; segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA) is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the detection instrumental technique used with the aim to facilitate their study and obtain an overall scope. Finally, the analytical characteristics of numerous flow-methods reported in the literature are provided in the form of a table and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms are tabulated.

  6. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    During the analysis of surface-spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effect of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, was studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage also led to increased analyte oxidation, possibly as a result of oxidative species formed electrochemically at the emitter electrode or in the gas phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for analysis of deposited analyte having a relatively low surface concentration. Increasing the spray solvent flow rate and adding hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that minimize and maximize analyte oxidation were identified, and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.

  7. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve detection sensitivity, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and to enhance their detection, respectively. Fragmentation techniques available in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.

  8. Conversion of multiple analyte cation types to a single analyte anion type via ion/ion charge inversion.

    PubMed

    Hassell, Kerry M; LeBlanc, Yves; McLuckey, Scott A

    2009-11-01

    Charge inversion ion/ion reactions can convert several cation types associated with a single analyte molecule to a single anion type for subsequent mass analysis. Specifically, analyte ions present with one of a variety of cationizing agents, such as an excess proton, excess sodium ion, or excess potassium ion, can all be converted to the deprotonated molecule, provided that a stable anion can be generated for the analyte. Multiply deprotonated species that are capable of exchanging a proton for a metal ion serve as the reagent anions for the reaction. This process is demonstrated here for warfarin and for a glutathione conjugate. Examples for several other glutathione conjugates are provided as supplementary material to demonstrate the generality of the reaction. In the case of glutathione conjugates, multiple metal ions can be associated with the singly-charged analyte due to the presence of two carboxylate groups. The charge inversion reaction involves the removal of the excess cationizing agent, as well as any metal ions associated with anionic groups to yield a singly deprotonated analyte molecule. The ability to convert multiple cation types to a single anion type is analytically desirable in cases in which the analyte signal is distributed among several cation types, as is common in the electrospray ionization of solutions with relatively high salt contents. For analyte species that undergo efficient charge inversion, such as glutathione conjugates, there is the additional potential advantage for significantly improved signal-to-noise ratios when species that give rise to 'chemical noise' in the positive ion spectrum do not undergo efficient charge inversion.
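    The appeal of collapsing several cation types onto one anion type is easiest to see as m/z bookkeeping. The sketch below uses warfarin's standard monoisotopic mass and standard adduct masses; these values are general chemistry constants, not numbers taken from the paper.

```python
# Sketch: m/z bookkeeping for charge inversion of several cation types
# to a single anion type. Warfarin monoisotopic mass is a standard value.

M = 308.1049  # warfarin, neutral monoisotopic mass (Da)

# Adduct mass shifts for singly charged cations (electron mass included).
ADDUCTS_POS = {"[M+H]+": 1.007276, "[M+Na]+": 22.989218, "[M+K]+": 38.963158}

def mz_positive(m, adduct_mass):
    """m/z of a singly charged cationized molecule."""
    return m + adduct_mass

def mz_deprotonated(m):
    """m/z of the singly deprotonated anion [M-H]-."""
    return m - 1.007276

# Several precursor cation types in the positive-ion spectrum...
cations = {name: mz_positive(M, dm) for name, dm in ADDUCTS_POS.items()}
# ...all collapse onto one product anion type after charge inversion,
# concentrating signal that was distributed among adducts.
anion = mz_deprotonated(M)
```

The analytical benefit described in the abstract is exactly this many-to-one mapping: three distinct precursor peaks become a single product peak.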

  9. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact, all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed-form solutions and generate better physical understanding of the conditions under which simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for (i) thermal resistance of hot and cold shoes, (ii) variable material properties with temperature, and (iii) lateral heat transfer through leg sides.
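    For orientation, the classic constant-property result that such analyses generalize reduces to two textbook formulas: the dimensionless figure of merit ZT and the maximum conversion efficiency at the optimal load. The material values below are illustrative placeholders, not data from the presentation.

```python
# Sketch: classic constant-property thermoelectric optimum (textbook result).
import math

def figure_of_merit(S, sigma, kappa, T):
    """ZT = S^2 * sigma * T / kappa  (S in V/K, sigma in S/m, kappa in W/m/K)."""
    return S**2 * sigma * T / kappa

def max_efficiency(Th, Tc, ZT):
    """Carnot factor times the ZT-dependent reduction at the optimal load."""
    carnot = (Th - Tc) / Th
    m = math.sqrt(1.0 + ZT)
    return carnot * (m - 1.0) / (m + Tc / Th)

# Placeholder material: S = 200 uV/K, sigma = 1e5 S/m, kappa = 1.5 W/m/K,
# evaluated at the mean junction temperature.
Tbar = (800.0 + 400.0) / 2.0
ZT = figure_of_merit(200e-6, 1e5, 1.5, Tbar)
eta = max_efficiency(800.0, 400.0, ZT)
```

The asymptotic-expansion approach in the abstract relaxes the constant-property and fixed-shoe-temperature assumptions baked into these formulas.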

  10. A comparative study of neutron activation analysis and proton-induced X-ray emission analysis for the determination of heavy metals in estuarine sediments

    NASA Astrophysics Data System (ADS)

    Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.

    1993-06-01

    Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique, problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects, we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface; an obvious choice was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparative method. For INAA, the rest of each sample was irradiated with thermal neutrons and element concentrations were determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.
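    The comparative (comparator) method mentioned for INAA reduces to ratio arithmetic once net peak areas are decay-corrected to a common reference time. The sketch below shows that arithmetic with invented numbers; the half-life, delays, and concentrations are illustrative, not values from the study.

```python
# Sketch: comparator-method INAA concentration from decay-corrected peak areas.
import math

def decay_correct(counts, half_life_s, delay_s):
    """Correct a net gamma peak area back to end-of-irradiation."""
    lam = math.log(2) / half_life_s
    return counts * math.exp(lam * delay_s)

def concentration(c_std, area_sample, area_std, m_std, m_sample):
    """C_sample = C_std * (A_sample / A_std) * (m_std / m_sample)."""
    return c_std * (area_sample / area_std) * (m_std / m_sample)

# Sample and co-irradiated standard counted after the same cooling delay,
# so the decay corrections cancel in the ratio.
A_s = decay_correct(12000, half_life_s=8960, delay_s=3600)
A_r = decay_correct(15000, half_life_s=8960, delay_s=3600)
c = concentration(c_std=50.0, area_sample=A_s, area_std=A_r,
                  m_std=0.100, m_sample=0.080)  # result in mg/kg
```

Because the bulk sample is counted, surface roughness and inhomogeneity, the PIXE-specific concerns raised in the abstract, do not enter this calculation.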

  11. Means of introducing an analyte into liquid sampling atmospheric pressure glow discharge

    DOEpatents

    Marcus, R. Kenneth; Quarles, Jr., Charles Derrick; Russo, Richard E.; Koppenaal, David W.; Barinaga, Charles J.; Carado, Anthony J.

    2017-01-03

    A liquid sampling, atmospheric pressure, glow discharge (LS-APGD) device as well as systems that incorporate the device and methods for using the device and systems are described. The LS-APGD includes a hollow capillary for delivering an electrolyte solution to a glow discharge space. The device also includes a counter electrode in the form of a second hollow capillary that can deliver the analyte into the glow discharge space. A voltage across the electrolyte solution and the counter electrode creates the microplasma within the glow discharge space that interacts with the analyte to move it to a higher energy state (vaporization, excitation, and/or ionization of the analyte).

  12. An Analytical Study of Icing Similitude for Aircraft Engine Testing. Revision

    DTIC Science & Technology

    1987-02-01

    An Analytical Study of Icing Similitude for Aircraft Engine Testing, by C. Scott Bartlett, Sverdrup Technology, Inc. (report numbers DOT/FAA/CT-86/35 and AEDC-TR-86-26). A modeling-geometry table pairs engine components (cowl, spinner, fan blade, fan stator exit vane, probe) with approximating geometries, including the NACA 0012 airfoil and a sphere.

  13. The analytical and numerical approaches to the theory of the Moon's librations: Modern analysis and results

    NASA Astrophysics Data System (ADS)

    Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.

    2017-11-01

    Observing the physical librations of celestial bodies, the Moon among them, represents one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying lunar physical libration (LPhL) data. In this article, LPhL simulation methods of assessing viscoelastic and dissipative properties of the lunar body and lunar core parameters, whose existence has recently been confirmed during reprocessing of seismic data from the "Apollo" space missions, are described. Much attention is paid to the physical interpretation of the free libration phenomenon and the methods for its determination. The practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. An efficiency analysis of the two approaches to LPhL theory, numerical and analytical, is conducted. It has been shown that in lunar investigations the approaches complement each other in various respects: the numerical approach provides the high accuracy required for the proper processing of modern observations, while the analytical approach makes it possible to comprehend the essence of the phenomena in the lunar rotation and to predict and interpret new effects in observations of lunar body and lunar core parameters.

  14. Toxicologic evaluation of analytes from Tank 241-C-103

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company (WHC) requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison with established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of headspace vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  15. Analysis of thin-walled cylindrical composite shell structures subject to axial and bending loads: Concept development, analytical modeling and experimental verification

    NASA Astrophysics Data System (ADS)

    Mahadev, Sthanu

    Continued research and development efforts in recent years have generated novel avenues toward the advancement of efficient, slender, laminated fiber-reinforced composite members. Numerous studies have focused on the modeling and response characterization of composite structures, with particular relevance to thin-walled cylindrical composite shells. This class of shell configurations is being actively explored to fully determine its mechanical efficacy for primary aerospace structural members. The proposed research is targeted toward formulating a composite-shell-theory-based prognosis methodology that entails an elaborate analysis and investigation of thin-walled cylindrical, laminated composite configurations, which are highly desirable in an increasing number of mechanical and aerospace applications. The prime motivation for adopting this theory arises from its superior ability to generate a simple yet viable closed-form analytical solution procedure for numerous geometrically intense, inherently curved composite structures. This analytical evaluative routine offers first-hand insight into the primary mechanical characteristics that govern the behavior of slender composite shells under typical static loading conditions. The current work demonstrates the robustness of this mathematical framework by predicting structural properties such as axial stiffness and bending stiffness. Longitudinal ply-stress computations are investigated upon deriving the global stiffness matrix model for composite cylindrical tubes with circular cross-sections. Additionally, this work employs a finite-element-based numerical technique to substantiate the analytical results reported for cylindrical circular composite tubes. Furthermore, this concept development is extended to the study of thin-walled, open cross-sectioned, curved laminated shells that are geometrically
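    To give a flavor of the closed-form stiffness results such a shell theory targets, the generic thin-wall approximations for a circular tube follow directly from an effective axial modulus of the layup. The sketch below is a standard thin-wall approximation with placeholder values, not the paper's derivation; the effective modulus E_x would in practice come from classical lamination theory.

```python
# Sketch: thin-wall axial and bending stiffness of a circular composite tube.
import math

def tube_stiffness(E_x, R, t):
    """Thin-wall approximations: EA = E_x*2*pi*R*t and EI = E_x*pi*R^3*t,
    valid for t << R, with E_x the layup's effective axial modulus."""
    EA = E_x * 2.0 * math.pi * R * t   # axial stiffness (N)
    EI = E_x * math.pi * R**3 * t      # bending stiffness (N*m^2)
    return EA, EI

# Placeholder values: E_x = 70 GPa, mid-surface radius 50 mm, wall 2 mm.
EA, EI = tube_stiffness(E_x=70e9, R=0.05, t=0.002)
```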

  16. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  17. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
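    The objective-to-goal-to-technique inferencing described above can be caricatured as a lookup chain. The mappings below are hypothetical stand-ins invented for illustration; the actual DM³ ontology classes and relations are not reproduced here.

```python
# Sketch: objective -> analytical goal -> candidate techniques, as a lookup
# chain standing in for ontology inferencing. All mappings are hypothetical.

OBJECTIVE_TO_GOAL = {
    "characterize HAP exposure shift": "multilevel modeling",
    "rank neighborhoods by SED risk": "classification",
}

GOAL_TO_TECHNIQUES = {
    "multilevel modeling": ["hierarchical linear model", "mixed-effects regression"],
    "classification": ["decision tree", "random forest"],
}

def infer_techniques(objective):
    """Map an ERM objective to candidate analytical techniques, if known."""
    goal = OBJECTIVE_TO_GOAL.get(objective)
    return GOAL_TO_TECHNIQUES.get(goal, [])

techs = infer_techniques("characterize HAP exposure shift")
```

A real ontology adds what this dictionary chain lacks: class hierarchies, transitive relations, and the ability to infer mappings that were never stated explicitly.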

  18. An Analysis of Rocket Propulsion Testing Costs

    NASA Technical Reports Server (NTRS)

    Ramirez-Pagan, Carmen P.; Rahman, Shamim A.

    2009-01-01

    The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA, DOD, and commercial programs. Resources in place to perform on-site testing include civil servants and contractor personnel, hardware and software (including data acquisition and control), and 6 test stands with a total of 14 test positions/cells. For several business reasons there is a need to improve understanding of the test costs for the various types of test campaigns. Historical propulsion test data were evaluated and analyzed in many different ways with the intent of finding any correlations or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timelines, and test cost envelopes. Further, the analytical effort included examining test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis; others yielded promising results and are candidates for further development and focused study. Information was organized into its elements: a Project Profile, a Test Cost Timeline, and a Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month in which there was test activity. The Test Cost Envelope shows a range of cost for a given number of tests. The supporting information upon which this study was performed came from diverse sources and thus it was necessary to
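    Two of the simple statistics named in the abstract, average cost per test and cost per test second, plus the cumulative average behind the Test Cost Timeline, reduce to a few lines of arithmetic. The campaign data below are invented for illustration, not SSC figures.

```python
# Sketch: campaign-level cost statistics of the kind the study computed.
tests = [
    {"cost": 120_000.0, "duration_s": 300.0},
    {"cost":  95_000.0, "duration_s": 150.0},
    {"cost": 140_000.0, "duration_s": 450.0},
]

total_cost = sum(t["cost"] for t in tests)
avg_cost_per_test = total_cost / len(tests)
cost_per_test_second = total_cost / sum(t["duration_s"] for t in tests)

# Cumulative average over the campaign, as plotted in a Test Cost Timeline:
# after each test, the running total divided by the number of tests so far.
cum_avg = []
running = 0.0
for i, t in enumerate(tests, start=1):
    running += t["cost"]
    cum_avg.append(running / i)
```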

  19. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    NASA Technical Reports Server (NTRS)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, lidar analysis based on the assumption that multiple scattering can be neglected is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1), and hence it excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements of a realistic model for lidar measurements which includes multiple scattering and which can be applied to practical situations are as follows. (1) What is required is not merely a correction term or a rough approximation describing the results of a particular experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation which can be applied in the case of a realistic aerosol is required. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in the case of a numerical approach, are due to the large number of events that have to be taken into account in the presence of large optical depth and/or strong experimental noise.
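    The single-scattering lidar equation that such a theory generalizes can be sketched for a uniform atmosphere, where the one-way optical depth grows linearly with range. The system constant, backscatter, and extinction values below are illustrative placeholders; only the tau ≤ 0.1 threshold comes from the abstract.

```python
# Sketch: single-scattering lidar equation P(R) = K * beta / R^2 * exp(-2*tau),
# for a uniform atmosphere so that tau(R) = alpha * R. Values are illustrative.
import math

def lidar_return(R, K=1.0, beta=1e-6, alpha=1e-4):
    """Single-scattering return power at range R (uniform beta, alpha)."""
    tau = alpha * R                      # one-way optical depth out to R
    return K * beta / R**2 * math.exp(-2.0 * tau)

# Range at which tau reaches 0.1 for alpha = 1e-4 m^-1: beyond this,
# the abstract argues, multiple scattering can no longer be neglected.
R_limit = 0.1 / 1e-4
p = lidar_return(1000.0)
```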

  20. New, small, fast acting blood glucose meters--an analytical laboratory evaluation.

    PubMed

    Weitgasser, Raimund; Hofmann, Manuela; Gappmayer, Brigitta; Garstenauer, Christa

    2007-09-22

    Patients and medical personnel are eager to use blood glucose meters that are easy to handle and fast acting. We questioned whether the accuracy and precision of these new, small, and lightweight devices would meet analytical laboratory standards, and we tested four meters under the above-mentioned conditions. Approximately 300 capillary blood samples were collected and tested using two devices of each brand and two different types of glucose test strips. Blood from the same samples was used for the comparative method. Results were evaluated using maximum deviations of 5% and 10% from the comparative method, error grid analysis, the overall deviation of the devices, linear regression analysis, and the CVs for measurement in series. Of all 1196 measurements, deviations of less than 5% and 10%, respectively, from the reference method were found for the FreeStyle (FS) meter in 69.5% and 96% of cases, the Glucocard X Meter (GX) in 44% and 75%, the One Touch Ultra (OT) in 29% and 60%, and the Wellion True Track (WT) in 28.5% and 58%. Error grid analysis placed 99.7% of values for FS, 99% for GX, 98% for OT, and 97% for WT in zone A; the remainder of the values lay within zone B. Linear regression analysis resembled these results. CVs for measurement in series showed higher deviations for OT and WT compared with FS and GX. The four new, small, fast-acting glucose meters fulfil clinically relevant analytical laboratory requirements, making them appropriate for use by medical personnel. However, with regard to the tight and restrictive limits of the ADA recommendations, the devices are still in need of improvement. This should be taken into account when the devices are used by primarily inexperienced persons and is relevant for further industrial development of such devices.
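    The headline statistics of such an evaluation, the fraction of readings within 5% and 10% of the reference and the CV of a measurement series, can be sketched as follows. The readings below are invented for illustration, not data from the study.

```python
# Sketch: percent-within-tolerance and CV statistics for a glucose meter.
import statistics

meter = [98, 104, 112, 95, 130, 88]        # mg/dL, hypothetical meter readings
reference = [100, 100, 105, 100, 120, 95]  # matched laboratory reference values

def fraction_within(meter, reference, tol):
    """Fraction of readings whose relative deviation from reference <= tol."""
    hits = sum(abs(m - r) / r <= tol for m, r in zip(meter, reference))
    return hits / len(meter)

within_5 = fraction_within(meter, reference, 0.05)
within_10 = fraction_within(meter, reference, 0.10)

# CV for measurement in series: repeated reads of one sample on one device.
series = [101, 99, 103, 98, 100]
cv_percent = statistics.stdev(series) / statistics.mean(series) * 100
```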