Sample records for analytic receiver analysis

  1. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

Multimedia analysis has focused on images, video, and to some extent audio, and has made progress in single channels excluding text. Visual analytics has focused on user interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is combining multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  2. Analytic model for ultrasound energy receivers and their optimal electric loads II: Experimental validation

    NASA Astrophysics Data System (ADS)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2017-10-01

    In this paper, we verify the two optimal electric load concepts based on the zero reflection condition and on the power maximization approach for ultrasound energy receivers. We test a high loss 1-3 composite transducer, and find that the measurements agree very well with the predictions of the analytic model for plate transducers that we have developed previously. Additionally, we also confirm that the power maximization and zero reflection loads are very different when the losses in the receiver are high. Finally, we compare the optimal load predictions by the KLM and the analytic models with frequency dependent attenuation to evaluate the influence of the viscosity.

  3. Receive-Noise Analysis of Capacitive Micromachined Ultrasonic Transducers.

    PubMed

    Bozkurt, Ayhan; Yaralioglu, G Goksenin

    2016-11-01

This paper presents an analysis of thermal (Johnson) noise received from the radiation medium by otherwise noiseless capacitive micromachined ultrasonic transducer (CMUT) membranes operating in their fundamental resonance mode. Determination of the thermal noise received by multiple transducers, or a transducer array, requires assessment of cross-coupling through the radiation medium, as well as the self-radiation impedance of the individual transducer. We show that the total thermal noise received by the cells of a CMUT has insignificant correlation and is independent of the radiation impedance; it is determined only by the mass of each membrane and the electromechanical transformer ratio. The proof is based on analytical derivations for a simple transducer with two cells, and is extended to transducers with numerous cells using circuit simulators. We used a first-order model, which incorporates the fundamental resonance of the CMUT. Noise power is calculated by integrating over the entire spectrum; hence, the presented figures are an upper bound on the noise. The presented analyses are valid for a transimpedance amplifier in the receive path. We use the analysis results to calculate the minimum detectable pressure of a CMUT. We also provide an analysis based on experimental data to show that the output noise power is limited by, and comparable to, the theoretical upper limit.
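The quoted result, that the integrated noise depends only on membrane mass and transformer ratio, is essentially an equipartition statement and can be checked numerically. The lumped mass-spring-damper model and every parameter value below are illustrative assumptions for the sketch, not the paper's circuit:

```python
import numpy as np

# Check: for a membrane modeled as a mass-spring-damper driven by the
# Johnson force noise S_F = 4*k_B*T*R of its radiation resistance R,
# the integrated mean-square output current i = n*v tends to
# n^2 * k_B * T / m, independent of R (equipartition: m<v^2> = k_B*T).
k_B, T = 1.380649e-23, 300.0

def total_noise_current_sq(m, k_spring, R, n, n_pts=400_000):
    f0 = np.sqrt(k_spring / m) / (2 * np.pi)       # resonance frequency
    f = np.linspace(f0 / n_pts, 400 * f0, n_pts)   # integration grid
    w = 2 * np.pi * f
    Y_sq = 1.0 / (R**2 + (w * m - k_spring / w) ** 2)  # |v/F|^2
    df = f[1] - f[0]
    return n**2 * 4 * k_B * T * R * np.sum(Y_sq) * df

m, k_spring, n = 1e-10, 1e5, 1e-3   # kg, N/m, A/(m/s); illustrative
for R in (1e-4, 1e-2):              # radiation resistance varies 100x
    ratio = total_noise_current_sq(m, k_spring, R, n) / (n**2 * k_B * T / m)
    print(f"R={R:g}: integral / (n^2 kT/m) = {ratio:.3f}")
```

The printed ratio stays close to 1 for both damping values, illustrating why the integrated noise is insensitive to the radiation impedance.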

  4. Analytic model for ultrasound energy receivers and their optimal electric loads

    NASA Astrophysics Data System (ADS)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2017-08-01

In this paper, we present an analytic model for thickness-resonating plate ultrasound energy receivers, which we have derived from the piezoelectric and wave equations and in which we have included dielectric, viscosity and acoustic attenuation losses. Afterwards, we explore the optimal electric load predictions of the zero reflection and power maximization approaches present in the literature with different acoustic boundary conditions, and discuss their limitations. To validate our model, we compared our expressions with the KLM model solved numerically, finding very good agreement. Finally, we discuss the differences between the zero reflection and power maximization optimal electric loads, which start to differ as losses in the receiver increase.
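The gap between the two load concepts can be illustrated in a simple lumped-circuit analogy (this is not the paper's plate model; the Thevenin impedances below are hypothetical): the power-maximization load is the conjugate match, the zero-reflection load matches a fixed reference impedance, and the two diverge as series loss grows.

```python
# Lumped-circuit illustration (hypothetical values, not the plate model):
# compare power delivered to the conjugate-matched load (power
# maximization) with the zero-reflection load (matched to a fixed
# reference impedance) as series loss in the source grows.
def delivered_power(v_s, z_s, z_l):
    """Average power into load z_l from a Thevenin source (v_s, z_s)."""
    i = v_s / (z_s + z_l)
    return 0.5 * abs(i) ** 2 * z_l.real

v_s = 1.0
z_ref = 50 + 0j                      # reference "line" impedance
for loss in (1.0, 40.0):             # extra series resistance from losses
    z_s = (50 + loss) - 25j          # hypothetical lossy source impedance
    p_match = delivered_power(v_s, z_s, z_s.conjugate())  # conj. match
    p_zero = delivered_power(v_s, z_s, z_ref)             # zero reflection
    print(f"loss={loss:g}: P_match/P_zero = {p_match / p_zero:.3f}")
```

With small loss the two loads deliver nearly the same power; with large loss the conjugate match wins by a growing margin, mirroring the trend the abstract describes.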

  5. Analytical estimation of laser phase noise induced BER floor in coherent receiver with digital signal processing.

    PubMed

    Vanin, Evgeny; Jacobsen, Gunnar

    2010-03-01

The Bit-Error-Ratio (BER) floor caused by laser phase noise in an optical fiber communication system with differential quadrature phase shift keying (DQPSK) and coherent detection followed by digital signal processing (DSP) is analytically evaluated. An in-phase and quadrature (I&Q) receiver with carrier phase recovery using DSP is considered. The carrier phase recovery is based on phase estimation of a finite sum (block) of the signal samples raised to the fourth power, with phase unwrapping at transitions between blocks. It is demonstrated that errors generated at block transitions make the dominant contribution to the system BER floor when the impact of additive noise is negligibly small in comparison with the effect of laser phase noise. The BER floor for the case in which phase unwrapping is omitted is also derived analytically and used to emphasize the crucial importance of this signal-processing operation. The analytical results are verified by full Monte Carlo simulations. The BER for another type of DQPSK receiver, based on differential phase detection, is also obtained in analytical form using the principle of conditional probability. The principle of conditional probability is justified in the case of differential phase detection due to the statistical independence of the laser-phase-noise-induced signal phase error and the additive noise contributions. Based on these analytical results, the laser linewidth tolerance is calculated for different system cases.
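The block-based fourth-power estimator with unwrapping at block transitions can be sketched as follows. This is a minimal simulation with laser phase noise only; the block size, phase-noise step, and seed are arbitrary assumptions:

```python
import numpy as np

# Minimal sketch of block-based fourth-power carrier phase recovery for
# QPSK, with unwrapping at block transitions. Only laser phase noise is
# simulated; all parameters are illustrative.
rng = np.random.default_rng(0)
n_sym, block = 4000, 40
data = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, n_sym))  # QPSK symbols
phase = np.cumsum(rng.normal(0.0, 0.005, n_sym))             # laser phase walk
rx = data * np.exp(1j * phase)

est = np.zeros(n_sym)
prev = 0.0
for k in range(0, n_sym, block):
    # Raising to the 4th power strips the QPSK modulation (data**4 == 1).
    phi = np.angle(np.sum(rx[k:k + block] ** 4)) / 4.0
    # Unwrap at the block transition: of the pi/2-ambiguous candidates,
    # keep the one closest to the previous block's estimate. Failures of
    # this step ("cycle slips") are what drive the BER floor.
    phi += np.round((prev - phi) / (np.pi / 2)) * (np.pi / 2)
    est[k:k + block] = phi
    prev = phi

resid = np.angle(np.exp(1j * (phase - est)))   # wrapped tracking error
print(f"max phase tracking error: {np.max(np.abs(resid)):.3f} rad")
```

With no additive noise the estimator tracks the phase walk closely; adding amplifier noise would occasionally flip the unwrapping decision, producing the block-transition errors the abstract analyzes.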

  6. FFT analysis of sensible-heat solar-dynamic receivers

    NASA Astrophysics Data System (ADS)

    Lund, Kurt O.

    The use of solar dynamic receivers with sensible energy storage in single-phase materials is considered. The feasibility of single-phase designs with weight and thermal performance comparable to existing two-phase designs is addressed. Linearized heat transfer equations are formulated for the receiver heat storage, representing the periodic input solar flux as the sum of steady and oscillating distributions. The steady component is solved analytically to produce the desired receiver steady outlet gas temperature, and the FFT algorithm is applied to the oscillating components to obtain the amplitudes and mode shapes of the oscillating solid and gas temperatures. The results indicate that sensible-heat receiver designs with performance comparable to state-of-the-art two-phase receivers are available.
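The decomposition into steady plus oscillating flux components can be sketched with an FFT. The 60% sun / 40% eclipse square profile below is a made-up illustration, not the paper's orbit data:

```python
import numpy as np

# Represent a periodic input solar flux as a steady (mean) component
# plus oscillating Fourier modes, as in the analysis described above.
n = 128
t = np.arange(n) / n                      # one orbit period, normalized
flux = np.where(t < 0.6, 1.0, 0.0)        # insolation on the receiver

spec = np.fft.rfft(flux) / n
steady = spec[0].real                     # steady component (mean flux)
osc_amps = 2.0 * np.abs(spec[1:])         # oscillating mode amplitudes
print(f"steady = {steady:.3f}, first harmonic = {osc_amps[0]:.3f}")
```

The steady term drives the receiver's mean outlet gas temperature; the oscillating amplitudes are what the paper's FFT-based treatment propagates through the linearized heat-transfer equations.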

  7. Analytic process and dreaming about analysis.

    PubMed

    Sirois, François

    2016-12-01

    Dreams about the analytic session feature a manifest content in which the analytic setting is subject to distortion while the analyst appears undisguised. Such dreams are a consistent yet infrequent occurrence in most analyses. Their specificity consists in never reproducing the material conditions of the analysis as such. This paper puts forward the following hypothesis: dreams about the session relate to some aspects of the analyst's activity. In this sense, such dreams are indicative of the transference neurosis, prefiguring transference resistances to the analytic elaboration of key conflicts. The parts taken by the patient and by the analyst are discussed in terms of their ability to signal a deepening of the analysis. Copyright © 2016 Institute of Psychoanalysis.

  8. Exploratory Analysis in Learning Analytics

    ERIC Educational Resources Information Center

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  9. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  10. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

Topics include: unit cost as a function of learning and rate (Womer); learning with forgetting, in which learning depreciates over time (Benkard); and discretionary... Analytical Tools for Affordability Analysis, David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882.

  11. The Relation of Perceived and Received Social Support to Mental Health among First Responders: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Prati, Gabriele; Pietrantoni, Luca

    2010-01-01

    There are plenty of theories that may support the protective role of social support in the aftermath of potentially traumatic events. This meta-analytic review examined the role of received and perceived social support in promoting mental health among first responders (e.g., firefighters, police officers, and paramedics or emergency medical…

  12. Received optical power calculations for optical communications link performance analysis

    NASA Technical Reports Server (NTRS)

    Marshall, W. K.; Burk, B. D.

    1986-01-01

The factors affecting optical communication link performance differ substantially from those at microwave frequencies, due to the drastically differing technologies, modulation formats, and effects of quantum noise in optical communications. In addition, detailed design control table calculations for optical systems are less well developed than the corresponding microwave system techniques, reflecting the relatively less mature state of development of optical communications. Described below are detailed calculations of received optical signal and background power in optical communication systems, with emphasis on analytic models for accurately predicting transmitter and receiver system losses.
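A received-power calculation of this general kind is a chain of gain and loss terms. A minimal dB-domain sketch follows; every number in it is an illustrative assumption, not a value from the report:

```python
import math

# Illustrative dB-domain link budget for received optical power.
P_tx_dBW = 0.0        # 1 W laser transmitter
G_tx_dB = 110.0       # transmit telescope gain
L_tx_dB = -3.0        # transmitter optics losses
G_rx_dB = 100.0       # receive telescope gain
L_rx_dB = -3.0        # receiver optics losses

wavelength = 1.06e-6  # m
range_m = 4.0e8       # roughly Earth-Moon distance, m

# Free-space loss for an optical link: L_s = (lambda / (4*pi*R))^2.
L_space_dB = 20.0 * math.log10(wavelength / (4.0 * math.pi * range_m))

P_rx_dBW = P_tx_dBW + G_tx_dB + L_tx_dB + L_space_dB + G_rx_dB + L_rx_dB
print(round(P_rx_dBW, 1))  # -109.5 with these illustrative numbers
```

A real design control table would break the loss terms down much further (pointing loss, atmospheric and background terms, detector quantum efficiency), but the additive dB structure is the same.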

  13. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs), and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice, with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  14. Focused analyte spray emission apparatus and process for mass spectrometric analysis

    DOEpatents

    Roach, Patrick J [Kennewick, WA; Laskin, Julia [Richland, WA; Laskin, Alexander [Richland, WA

    2012-01-17

    An apparatus and process are disclosed that deliver an analyte deposited on a substrate to a mass spectrometer that provides for trace analysis of complex organic analytes. Analytes are probed using a small droplet of solvent that is formed at the junction between two capillaries. A supply capillary maintains the droplet of solvent on the substrate; a collection capillary collects analyte desorbed from the surface and emits analyte ions as a focused spray to the inlet of a mass spectrometer for analysis. The invention enables efficient separation of desorption and ionization events, providing enhanced control over transport and ionization of the analyte.

  15. An overview on forensic analysis devoted to analytical chemists.

    PubMed

    Castillo-Peinado, L S; Luque de Castro, M D

    2017-05-15

The present article has as its main aim to show analytical chemists interested in forensic analysis the world they will face if they decide to become forensic analytical chemists. With this purpose, the most outstanding aspects of forensic analysis in dealing with sampling (involving both bodily and non-bodily samples), sample preparation, and the analytical equipment used in the detection, identification and quantitation of key sample components are critically discussed. The role of the great omics in forensic analysis, and the growing role of the youngest of the great omics (metabolomics), are also discussed. The foreseeable role of integrative omics is also outlined. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

The automated analysis of indirect immunofluorescence (IIF) images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of

  17. Analysis of Optimum Heterodyne Receivers for Coherent Lidar Applications

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin

    2002-01-01

A full analysis of the combined effects of all the noise sources of an optical heterodyne receiver, and of the interaction between the competing control parameters of the receiver detector and pre-amplifier, will be presented. This analysis provides the means for true optimization of the coherent lidar receiver. The significance of optimizing the heterodyne receiver is shown for a 2-micron coherent lidar.

  18. Heat-Energy Analysis for Solar Receivers

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1982-01-01

The heat-energy analysis program (HEAP) solves general heat-transfer problems, with some specific features that are "custom made" for analyzing solar receivers. It can be utilized not only to predict receiver performance under varying solar flux, ambient temperature, and local heat-transfer rates, but also to detect locations of hot spots and metallurgical difficulties and to predict the performance sensitivity of neighboring component parameters.

  19. NAVSTAR GPS Marine Receiver Performance Analysis

    DOT National Transportation Integrated Search

    1984-09-01

    This report is an analysis and comparison of the capability of several NAVSTAR GPS receiver configurations to provide accurate position data to the civil marine user. The NAVSTAR GPS system itself has the potential to provide civil marine users with ...

  20. Receiver operating characteristic analysis of age-related changes in lineup performance.

    PubMed

    Humphries, Joyce E; Flowe, Heather D

    2015-04-01

In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operating characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults compared with young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.
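The ROC approach referred to above separates sensitivity from response bias by sweeping the decision criterion across confidence levels. A toy sketch with hypothetical lineup counts:

```python
# Sketch of ROC construction from hypothetical lineup data: counts of
# suspect identifications at each confidence level (high -> low) for
# target-present and target-absent lineups. All numbers are invented.
hits_by_conf = [30, 20, 10]   # correct IDs, target-present lineups
fas_by_conf = [2, 6, 12]      # innocent-suspect IDs, target-absent
n_present, n_absent = 100, 100

def roc_points(hits, fas, n_p, n_a):
    """Cumulative (false-alarm rate, hit rate) pairs, most
    conservative criterion first."""
    pts, h, f = [(0.0, 0.0)], 0, 0
    for hc, fc in zip(hits, fas):
        h += hc
        f += fc
        pts.append((f / n_a, h / n_p))
    return pts

pts = roc_points(hits_by_conf, fas_by_conf, n_present, n_absent)

# Partial area under the curve (trapezoid rule): a criterion-free
# summary of memory sensitivity.
auc = sum((x2 - x1) * (y1 + y2) / 2
          for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
print(pts, round(auc, 4))
```

Comparing such partial AUCs between age groups is what lets the reanalysis separate genuine sensitivity differences from mere shifts in willingness to choose.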

  1. Calibration-free concentration analysis for an analyte prone to self-association.

    PubMed

    Imamura, Hiroshi; Honda, Shinya

    2017-01-01

Calibration-free concentration analysis (CFCA) based on surface plasmon resonance uses the diffusion coefficient of an analyte to determine the concentration of that analyte in a bulk solution. In general, CFCA is avoided when investigating analytes prone to self-association, as the heterogeneous diffusion coefficient results in a loss of precision. The derivation for a self-associating analyte is presented here. By using the diffusion coefficient for the monomeric state, CFCA provides the lowest possible concentration even when the analyte is self-associated. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Receiver function analysis applied to refraction survey data

    NASA Astrophysics Data System (ADS)

    Subaru, T.; Kyosuke, O.; Hitoshi, M.

    2008-12-01

For the estimation of the thickness of the oceanic crust or petrophysical investigation of subsurface material, refraction or reflection seismic exploration is one of the methods frequently practiced. These explorations use four-component (x, y, z acceleration and pressure) seismometers, but only the compressional wave, or the vertical component of the seismometers, tends to be used in the analyses. Hence, the shear wave, or the lateral components of the seismograms, should be used for a more precise estimation of the thickness of the oceanic crust. A receiver function is a function at a site that can be used to estimate the depth of velocity interfaces from teleseismic signals, including shear waves. Receiver function analysis uses both the vertical and horizontal components of seismograms and deconvolves the horizontal with the vertical to estimate the spectral difference of P-S converted waves arriving after the direct P wave. Once the phase information of the receiver function is obtained, one can estimate the depth of the velocity interface. This analysis has the advantage of estimating the depth of velocity interfaces, including the Mohorovicic discontinuity, from two components of seismograms when P-to-S converted waves are generated at the interface. Our study presents results of a preliminary study using synthetic seismograms. First, we use three types of geological models, composed of a single sediment layer, a crust layer, and a sloped Moho, respectively, with underground sources. The receiver function can estimate the depth and shape of the Moho interface precisely for all three models. Second, we applied this method to synthetic refraction survey data generated not by earthquakes but by artificial sources on the ground or sea surface. Compressional seismic waves propagate beneath the velocity interface and radiate converted shear waves there, as well as at the other deep underground layer interfaces. However, the receiver function analysis applied to the
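The deconvolution step described above (horizontal spectrum divided by vertical spectrum) can be sketched as a water-level-regularized spectral division. The delta-pulse "seismograms" and the water-level value below are synthetic stand-ins, not survey data:

```python
import numpy as np

# Water-level spectral division: deconvolve the horizontal component
# with the vertical to isolate P-to-S conversions.
def receiver_function(horizontal, vertical, water_level=0.01):
    H = np.fft.rfft(horizontal)
    V = np.fft.rfft(vertical)
    denom = (V * np.conj(V)).real
    denom = np.maximum(denom, water_level * denom.max())  # stabilize
    return np.fft.irfft(H * np.conj(V) / denom, n=len(horizontal))

n = 256
vertical = np.zeros(n)
vertical[10] = 1.0          # direct P pulse on the vertical component
horizontal = np.zeros(n)
horizontal[10] = 0.3        # direct P leakage onto the horizontal
horizontal[40] = 0.15       # delayed, smaller Ps conversion
rf = receiver_function(horizontal, vertical)
# The Ps conversion appears at lag 40 - 10 = 30 samples.
print(round(float(rf[30]), 3))
```

The lag of the Ps peak, together with an assumed velocity model, is what converts to interface depth in the analysis.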

  3. Composable Analytic Systems for next-generation intelligence analysis

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  4. Stability analysis of magnetized neutron stars - a semi-analytic approach

    NASA Astrophysics Data System (ADS)

    Herbrik, Marlene; Kokkotas, Kostas D.

    2017-04-01

We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about the stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme to polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields, testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies whose spectrum of model systems we extend by lifting former simplifications.

  5. Analysis of the Optimum Receiver Design Problem Using Interactive Computer Graphics.

    DTIC Science & Technology

    1981-12-01

AD-A115 498. Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering. Analysis of the Optimum Receiver Design Problem Using Interactive Computer Graphics. Thesis, AFIT/GE/EE/81D-39, Michael R. Mazzuechi, Cpt, USA. Approved for public release; distribution unlimited.

  6. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, the ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  7. Finite Element Analysis of the LOLA Receiver Telescope Lens

    NASA Technical Reports Server (NTRS)

    Matzinger, Elizabeth

    2007-01-01

This paper presents the finite element stress and distortion analysis completed on the Receiver Telescope lens of the Lunar Orbiter Laser Altimeter (LOLA). LOLA is one of six instruments on the Lunar Reconnaissance Orbiter (LRO), scheduled to launch in 2008. LOLA's main objective is to produce a high-resolution global lunar topographic model to aid in safe landings and enhance surface mobility in future exploration missions. The Receiver Telescope captures the laser pulses transmitted through a diffractive optical element (DOE) and reflected off the lunar surface. The largest lens of the Receiver Telescope, Lens 1, is a 150 mm diameter aspheric lens originally designed to be made of BK7 glass. The finite element model of the Receiver Telescope Lens 1 is composed of solid elements and constrained in a manner consistent with the behavior of the mounting configuration of the Receiver Telescope tube. Twenty-one temperature load cases were mapped to the nodes based on thermal analysis completed by LOLA's lead thermal analyst, and loads were applied to simulate the preload applied from the ring flexure. The thermal environment of the baseline design (uncoated BK7 lens with no baffle) produces large radial and axial gradients in the lens. These large gradients create internal stresses that may lead to part failure, as well as significant bending that degrades optical performance. The high stresses and large distortions shown in the analysis precipitated a design change from BK7 glass to sapphire.

  8. VAST Challenge 2016: Streaming Visual Analytics

    DTIC Science & Technology

    2016-10-25

understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time... received. Mini-Challenge 1: Design Challenge. Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia... Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries for what constitutes a visual analytics system, and to

  9. Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates, Phase II Results

    NASA Technical Reports Server (NTRS)

    Allen, P. A.; Wells, D. N.

    2017-01-01

The second phase of an analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted under the auspices of ASTM Interlaboratory Study 732. The interlaboratory study (ILS) had 10 participants with a broad range of expertise and experience, and experimental results from a surface crack tension test in 4142 steel plate loaded well into the elastic-plastic regime provided the basis for the study. The participants were asked to evaluate a surface crack tension test according to the version of the surface crack initiation toughness testing standard published at the time of the ILS, E2899-13. Data were provided to each participant representing the fundamental information that would be provided by a mechanical test laboratory prior to evaluating the test result. Overall, the participants' test analysis results were in good agreement, and constructive feedback was received that has resulted in an improved published version of the standard, E2899-15.

  10. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted using this probabilistic modification. With this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
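The proposed modification can be sketched numerically (the failure modes and frequencies below are invented for illustration): occurrence and detection become relative frequencies, severity stays categorical, and the frequency of undetected failures falls out directly.

```python
# Hypothetical failure modes: occurrence and detection expressed as
# estimated relative frequencies instead of 1-10 category scores;
# severity keeps its categorical score, as in the proposal above.
modes = [
    # (name, p_occurrence, p_detected_given_occurrence, severity)
    ("sample mix-up",     0.002, 0.90, 9),
    ("spectral mismatch", 0.010, 0.99, 5),
]

# Frequency of an undetected failure for each mode:
# p_undetected = p_occurrence * (1 - p_detected).
undetected = {name: p_occ * (1.0 - p_det) for name, p_occ, p_det, _ in modes}

# For the full procedure, approximate the chance of at least one
# undetected failure, assuming independent failure modes.
p_any = 1.0
for p in undetected.values():
    p_any *= 1.0 - p
p_any = 1.0 - p_any
print(undetected, round(p_any, 6))
```

Unlike an RPN, these per-mode and whole-procedure frequencies are directly interpretable probabilities, which is the point of the modification.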

  11. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), covering technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were quantified by Risk Priority Numbers: RPN = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions, and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices of up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
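    The RPN ranking described above is simple arithmetic and can be sketched directly; the failure modes and 1-10 scores below are hypothetical examples, not the study's actual modes.

```python
# FMEA ranking sketch: RPN = O * D * S selects failure modes for correction.
# All failure modes and scores are illustrative assumptions.

def rpn(o, d, s):
    """Risk Priority Number for one failure mode (each score on a 1-10 scale)."""
    return o * d * s

failure_modes = {
    "wrong reference spectrum loaded": (4, 7, 8),  # hypothetical human error
    "sample holder misaligned": (3, 5, 6),
    "instrument drift": (2, 4, 5),
}

# Highest-RPN modes come first and are the candidates for corrective action.
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, (o, d, s) in ranked:
    print(f"{name}: RPN = {rpn(o, d, s)}")
```

    Repeating the FMEA after corrective action and taking the ratio of old to new RPN gives the improvement index the abstract mentions.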

  12. Visual Analytics for Power Grid Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Huang, Zhenyu; Chen, Yousu

    2014-01-20

    Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.

  13. Sensitivity analysis of heliostat aiming strategies and receiver size on annual thermal production of a molten salt external receiver

    NASA Astrophysics Data System (ADS)

    Servert, Jorge; González, Ana; Gil, Javier; López, Diego; Funes, Jose Felix; Jurado, Alfonso

    2017-06-01

    Even though receiver size and aiming strategy must be analyzed jointly to optimize the thermal energy that can be extracted from a solar tower receiver, they have customarily been studied as separate problems. The main reason is the high level of detail required to define aiming strategies, which are often simplified in annual simulation models. Aiming strategies are usually focused on obtaining a homogeneous heat flux on the central receiver, with the goal of minimizing the maximum heat flux value that may otherwise damage it. Some recent studies have addressed the effect of different aiming strategies on different receiver types, but they have focused only on the optical efficiency. Receiver size is an additional parameter to consider: larger receivers provide a larger aiming surface and reduce spillage losses, but require higher investment while penalizing the thermal performance of the receiver due to greater external convection losses. The present paper presents a sensitivity analysis of both factors for a predefined solar field at a fixed location, using a central receiver and molten salts as HTF. The analysis includes design-point values and annual energy outputs, comparing the effect on optical performance (measured using a spillage factor) and on thermal energy production.

  14. FASP, an analytic resource appraisal program for petroleum play analysis

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.

  15. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis, and ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  16. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is, however, important for their continued growth and study. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw helped participants analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  17. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated inputs often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
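    The point about input correlations can be made concrete with the simplest analytic case, a linear model of two inputs: the covariance cross term can change the output variance substantially, so ignoring it mis-states both the total uncertainty and each input's share. The coefficients and moments below are illustrative, not from the paper.

```python
# Analytic variance of Y = a1*X1 + a2*X2 with correlated inputs:
#   Var(Y) = a1^2 Var(X1) + a2^2 Var(X2) + 2*a1*a2*Cov(X1, X2)
# All numbers below are assumed for illustration.

a1, a2 = 2.0, -1.0
var1, var2 = 1.0, 4.0
rho = 0.8                                   # correlation between X1 and X2
cov12 = rho * (var1 ** 0.5) * (var2 ** 0.5)

var_independent = a1**2 * var1 + a2**2 * var2           # correlation ignored
var_correlated = var_independent + 2 * a1 * a2 * cov12  # full analytic variance

print(var_independent)  # contributions 4.0 + 4.0
print(var_correlated)   # cross term 2*2*(-1)*1.6 = -6.4 shrinks the variance
```

    Here the negative cross term cancels most of the apparent uncertainty, which is exactly the situation where deciding whether correlations "should be considered in practice" matters.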

  18. Application of nomographs for analysis and prediction of receiver spurious response EMI

    NASA Astrophysics Data System (ADS)

    Heather, F. W.

    1985-07-01

    Spurious response EMI in the front end of a superheterodyne receiver follows a simple mathematical formula; however, applying the formula to predict test frequencies produces more data than can be evaluated. An analysis technique has been developed to graphically depict all receiver spurious responses using a nomograph and to permit selection of optimum test frequencies. The discussion includes the math model used to simulate a superheterodyne receiver, the implementation of the model in the computer program, the approach to test frequency selection, interpretation of the nomographs, analysis and prediction of receiver spurious response EMI from the nomographs, and application of the nomographs. In addition, figures of sample applications are provided. This EMI analysis and prediction technique greatly improves the Electromagnetic Compatibility (EMC) test engineer's ability to visualize the scope of receiver spurious response EMI testing and to optimize test frequency selection.

  19. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also want to introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  20. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82 to 84%.

  1. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. One type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Cost-Effectiveness Analysis of Probiotic Use to Prevent Clostridium difficile Infection in Hospitalized Adults Receiving Antibiotics.

    PubMed

    Shen, Nicole T; Leff, Jared A; Schneider, Yecheskel; Crawford, Carl V; Maw, Anna; Bosworth, Brian; Simon, Matthew S

    2017-01-01

    Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost effectiveness is unknown. We sought to evaluate the cost effectiveness of probiotic use for prevention of CDI versus no probiotic use in the United States. We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost effective. Probiotic use dominated (more effective and less costly) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults receiving antibiotics who are aged 65 or older, or when the baseline risk of CDI exceeds 1.6%.
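    The decision rule behind the abstract's terms (dominance, ICER, willingness-to-pay threshold) can be sketched directly. The cost and QALY figures below are invented to illustrate the mechanics; they are not the study's results.

```python
# Cost-effectiveness comparison sketch: dominance check, then ICER ($/QALY)
# against a willingness-to-pay (WTP) threshold. All numbers are hypothetical.

def compare(cost_new, qaly_new, cost_old, qaly_old, wtp=100_000):
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "new strategy dominates"   # cheaper and at least as effective
    if d_cost >= 0 and d_qaly <= 0:
        return "new strategy dominated"   # costlier and no more effective
    icer = d_cost / d_qaly                # incremental $ per QALY gained
    return "cost effective" if icer < wtp else "not cost effective"

# Hypothetical cohort: probiotics cost a little up front but avert costly CDI
# cases, yielding lower net cost and slightly more QALYs -> dominance.
print(compare(cost_new=11_800, qaly_new=0.842, cost_old=12_100, qaly_old=0.838))
```

    The one-way sensitivity thresholds the abstract reports (e.g. CDI baseline risk >1.6%) correspond to sweeping one input of such a model until the returned verdict flips.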

  3. Comparative analysis of methods for real-time analytical control of chemotherapies preparations.

    PubMed

    Bazin, Christophe; Cassard, Bruno; Caudron, Eric; Prognon, Patrice; Havard, Laurent

    2015-10-15

    Control of chemotherapy preparations is now an obligation in France, though analytical control is not compulsory. Several methods are available and none of them can be presumed ideal. We wanted to compare them to determine which one could be the best choice. We compared non-analytical (visual and video-assisted, gravimetric) and analytical (HPLC/FIA, UV/FT-IR, UV/Raman, Raman) methods on the basis of our experience and a SWOT analysis. The results of the analysis show great differences between the techniques, but as expected none of them is without defects. However, they can probably be used in synergy. Overall, for the pharmacist willing to get involved, the implementation of control for chemotherapy preparations must be anticipated well in advance, with the listing of every parameter, and remains, in our opinion, an analyst's job. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Rapid B-rep model preprocessing for immersogeometric analysis using analytic surfaces

    PubMed Central

    Wang, Chenglong; Xu, Fei; Hsu, Ming-Chen; Krishnamurthy, Adarsh

    2017-01-01

    Computational fluid dynamics (CFD) simulations of flow over complex objects have traditionally been performed using fluid-domain meshes that conform to the shape of the object. However, creating shape-conforming meshes for complicated geometries like automobiles requires extensive geometry preprocessing. This process is usually tedious and requires modifying the geometry, including specialized operations such as defeaturing and filling of small gaps. Hsu et al. (2016) developed a novel immersogeometric fluid-flow method that does not require the generation of a boundary-fitted mesh for the fluid domain. However, their method used the NURBS parameterization of the surfaces for generating the surface quadrature points to enforce the boundary conditions, which required the B-rep model to be converted completely to NURBS before analysis could be performed. This conversion usually leads to poorly parameterized NURBS surfaces and can lead to poorly trimmed or missing surface features. In addition, converting simple geometries such as cylinders to NURBS imposes a performance penalty since these geometries have to be dealt with as rational splines. As a result, the geometry has to be inspected again after conversion to ensure analysis compatibility, which can increase the computational cost. In this work, we have extended the immersogeometric method to generate surface quadrature points directly from analytic surfaces. We have developed quadrature rules for all four kinds of analytic surfaces: planes, cones, spheres, and toroids. We have also developed methods for performing adaptive quadrature on trimmed analytic surfaces. Since analytic surfaces have frequently been used for constructing solid models, this method also generates quadrature points on real-world geometries faster than using only NURBS surfaces.
    To assess the accuracy of the proposed method, we perform simulations of a benchmark problem of flow over a torpedo shape made of analytic surfaces and compare those

  5. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  6. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    ERIC Educational Resources Information Center

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  7. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  8. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method

  9. UV Lidar Receiver Analysis for Tropospheric Sensing of Ozone

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; DeYoung, Russell J.

    2013-01-01

    A simulation of a ground-based Ultra-Violet Differential Absorption Lidar (UV-DIAL) receiver system was performed under realistic daytime conditions to understand how range and lidar performance can be improved for a given UV pulse laser energy. Calculations were also performed for an aerosol channel transmitting at 3 W. The lidar receiver simulation studies were optimized for tropospheric ozone measurements. The transmitted lidar UV wavelengths ranged from 285 to 295 nm, and the aerosol channel was at 527 nm. The calculations are based on atmospheric transmission given by the HITRAN database and the Modern Era Retrospective Analysis for Research and Applications (MERRA) meteorological data. The aerosol attenuation is estimated using both the BACKSCAT 4.0 code and data collected during the CALIPSO mission. The lidar performance is estimated both for diffuse-irradiance-free cases corresponding to nighttime operation and for the daytime diffuse scattered radiation component based on previously reported experimental data. This analysis presents calculations of the UV-DIAL receiver ozone and aerosol measurement range as a function of sky irradiance, filter bandwidth, and transmitted UV and 527-nm laser energy.

  10. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  11. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with that of other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison served to cross-check the analysis results and to overcome the limitations of the three methods. Ca results obtained for food by EDXRF and AAS were not significantly different (p = 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. The method was also validated using SRM NIST 1548a Typical Diet. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
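    The cross-method check described above boils down to comparing paired concentrations from two techniques with the Pearson correlation coefficient, where values near 1 (as with the reported r = 0.9871 for Ca) indicate agreement. The paired data below are made up for illustration.

```python
# Pearson correlation sketch for cross-checking two analytical methods.
# The paired Ca concentrations (mg/kg) are hypothetical, not the study's data.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

ca_edxrf = [812, 640, 1050, 930, 470]  # hypothetical results by EDXRF
ca_aas   = [805, 655, 1032, 948, 462]  # same samples by AAS

print(f"Pearson r = {pearson(ca_edxrf, ca_aas):.4f}")
```

    A high r establishes linear agreement between methods; the significance tests the abstract cites (the p-values) additionally check that the mean results do not differ.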

  12. Structural analysis of a reflux pool-boiler solar receiver

    NASA Astrophysics Data System (ADS)

    Hoffman, E. L.; Stone, C. M.

    1991-06-01

    Coupled thermal-structural finite element calculations of a reflux pool-boiler solar receiver were performed to characterize the operating stresses and to address issues affecting the service life of the receiver. Analyses performed using shell elements provided information for receiver material selection and design optimization. Calculations based on linear elastic fracture mechanics principles were performed using continuum elements to assess the vulnerability of a seam-weld to fatigue crack growth. All calculations were performed using ABAQUS, a general purpose finite element code, and elements specifically formulated for coupled thermal-structural analysis. Two materials were evaluated: 316L SS and Haynes 230 alloys. The receiver response was simulated for a combination of structural and thermal loads that represent the startup and operating conditions of the receiver. For both materials, maximum stresses in the receiver developed shortly after startup due to uneven temperature distribution across the receiver surface. The largest effective stress was near yield in the 316L SS receiver and below 39 percent of yield in the Haynes 230 receiver. The calculations demonstrated that stress reductions of over 25 percent could be obtained by reducing the aft dome thickness to one closer to the absorber. The fatigue calculations demonstrated that the stress distribution near the seam-weld notch depends primarily on the structural load created by internal pressurization of the receiver rather than on the thermal load, indicating that the thermal loads can be neglected when assessing the stress intensity near the seam-weld notch. The stress intensity factor, computed using the J-integral method and crack opening-displacement field equations, was significantly below the fatigue threshold for most steels. The calculations indicated that the weld notch was always loaded in compression, a condition which is not conducive to fatigue crack growth.

  13. An analytic data analysis method for oscillatory slug tests.

    PubMed

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extremities, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
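    The core idea of extreme-point analysis can be sketched on synthetic data: for an underdamped response, the spacing of the extremes gives the oscillation period and the logarithm of successive extreme displacements gives the decay rate. These two quantities are what van der Kamp-style formulas then map to hydraulic conductivity; that mapping is omitted here, and the data below are synthetic, not from Butler et al.

```python
# Extreme-point sketch for an oscillatory slug test (synthetic data).
# Extremes of an exponentially damped cosine with period 4 s and decay
# rate 0.1 1/s occur every half period; we recover both parameters from
# the extreme points alone.

import math

times = [0.0, 2.0, 4.0, 6.0, 8.0]                  # extreme-point times (s)
disps = [1.0 * math.exp(-0.1 * t) for t in times]  # |displacement| at each (m)

# Logarithmic decrement between successive extremes (half a period apart):
decrements = [math.log(disps[i] / disps[i + 1]) for i in range(len(disps) - 1)]
half_period = times[1] - times[0]
decay_rate = sum(decrements) / len(decrements) / half_period

print(f"period = {2 * half_period} s, decay rate = {decay_rate:.3f} 1/s")
```

    With noisy field data, averaging the decrement over several extreme pairs (as done above) stabilizes the estimate, which is why the slowly damped data set with seven extremities is the easier case.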

  14. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study which gives an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability due to recession of the gold electrode. By an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach which takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of PAD response drop leading to an improved data normalization.
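    A one-phase (single-exponential) decay of detector response can be fitted and used for normalization roughly as follows. This is a schematic sketch with hypothetical numbers, not the analyte-specific models fitted in the paper:

```python
import math

def fit_one_phase_decay(times, signals, plateau):
    """Fit S(t) = plateau + (S0 - plateau) * exp(-k * t) by linear
    regression of log(S - plateau) against t (plateau assumed known).
    Returns (S0, k)."""
    ys = [math.log(s - plateau) for s in signals]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    s0 = plateau + math.exp(my - slope * mx)
    return s0, -slope

def correct_response(raw, t, s0, k, plateau):
    """Rescale a response measured at time t back to its t = 0 sensitivity
    using the fitted decay curve."""
    decay = (plateau + (s0 - plateau) * math.exp(-k * t)) / s0
    return raw / decay

# Synthetic drift: S0 = 100, plateau = 40, k = 0.05 per run (hypothetical)
ts = list(range(0, 50, 5))
ss = [40 + 60 * math.exp(-0.05 * t) for t in ts]
s0, k = fit_one_phase_decay(ts, ss, plateau=40.0)
```

    Dividing each measured response by the fitted decay factor, as in `correct_response`, is one way such a model can improve normalization against an internal standard.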

  15. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  16. Modeling convection-diffusion-reaction systems for microfluidic molecular communications with surface-based receivers in Internet of Bio-Nano Things

    PubMed Central

    Kuscu, Murat; Akan, Ozgur B.

    2018-01-01

    We consider a microfluidic molecular communication (MC) system, where the concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for the information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow, which results in a parabolic velocity profile, and by the finite number of ligand receptors, which leads to receiver saturation. The model also captures the effects of the reactive surface depletion layer resulting from the mass transport limitations and the moving reaction boundary originated from the passage of a finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed-form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to the numerical results obtained by solving the exact system model with COMSOL Multiphysics. PMID:29415019
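    The receiver-saturation nonlinearity can be illustrated with a bare-bones Langmuir-type kinetics sketch. This assumes a constant ligand concentration at the surface, which the paper's full convection-diffusion-reaction model does not; parameters are dimensionless and illustrative:

```python
def bound_receptor_time_course(c_ligand, b_max, k_on, k_off, dt, steps):
    """Forward-Euler integration of saturating ligand-receptor kinetics
    dB/dt = k_on * C * (B_max - B) - k_off * B, the source of the
    receiver-saturation nonlinearity: bound concentration B cannot
    exceed the finite receptor density B_max."""
    b = 0.0
    series = [b]
    for _ in range(steps):
        b += dt * (k_on * c_ligand * (b_max - b) - k_off * b)
        series.append(b)
    return series

# Illustrative, dimensionless parameters; equilibrium is
# B_eq = B_max * k_on*C / (k_on*C + k_off) = 0.8 here.
series = bound_receptor_time_course(c_ligand=1.0, b_max=1.0,
                                    k_on=2.0, k_off=0.5,
                                    dt=0.01, steps=1000)
```

    The expected time course of bound receptors saturates toward B_eq rather than growing linearly with ligand concentration, which is the nonlinearity the analytical model approximates.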
  18. Cost-Effectiveness Analysis of Probiotic Use to Prevent Clostridium difficile Infection in Hospitalized Adults Receiving Antibiotics

    PubMed Central

    Leff, Jared A; Schneider, Yecheskel; Crawford, Carl V; Maw, Anna; Bosworth, Brian; Simon, Matthew S

    2017-01-01

    Background: Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost effectiveness is unknown. We sought to evaluate the cost effectiveness of probiotic use for prevention of CDI versus no probiotic use in the United States. Methods: We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost-effective. Results: Probiotic use dominated (more effective and less costly) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Conclusions: Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults receiving antibiotics age 65 or older or when the baseline risk of CDI exceeds 1.6%. PMID:29230429
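    The ICER and dominance logic used in such analyses can be sketched as follows (the cost and QALY numbers are hypothetical, not the study's):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in $/QALY. A strategy that is
    both cheaper and more effective 'dominates' the comparator (no ratio
    is reported); one that is costlier and less effective is 'dominated'."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost < 0 and d_qaly > 0:
        return "dominant"
    if d_cost > 0 and d_qaly < 0:
        return "dominated"
    return d_cost / d_qaly  # compare against willingness-to-pay threshold

# Hypothetical: probiotics cost less and add QALYs -> "dominant",
# matching the qualitative result reported above.
verdict = icer(cost_new=9800, qaly_new=0.71, cost_old=10000, qaly_old=0.70)
```

    When neither strategy dominates, the returned ratio is compared against a willingness-to-pay threshold such as the $100,000/QALY used in the study.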

  19. High dynamic GPS receiver validation demonstration

    NASA Technical Reports Server (NTRS)

    Hurd, W. J.; Statman, J. I.; Vilnrotter, V. A.

    1985-01-01

    The Validation Demonstration establishes that the high dynamic Global Positioning System (GPS) receiver concept developed at JPL meets the dynamic tracking requirements for range instrumentation of missiles and drones. It was demonstrated that the receiver can track the pseudorange and pseudorange rate of vehicles with acceleration in excess of 100 g and jerk in excess of 100 g/s, dynamics ten times more severe than specified for conventional High Dynamic GPS receivers. These results and analytic extensions to a complete system configuration establish that all range instrumentation requirements can be met. The receiver can be implemented in the 100 cu in volume required by all missiles and drones, and is ideally suited for transdigitizer or translator applications.

  20. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
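    The product-term approach the paper moves beyond can be made concrete with the usual simple-slopes reading of an interaction model (coefficients below are illustrative, not from the study):

```python
def simple_slope(b1, b3, m):
    """In the interaction model y = b0 + b1*x + b2*m + b3*x*m, the effect
    of the predictor x at a given moderator value m is dy/dx = b1 + b3*m
    (the 'simple slope'). The conventional moderation test criticized
    above is exactly a significance test of b3 = 0."""
    return b1 + b3 * m

# Hypothetical coefficients: predictor effect 0.5, interaction 0.3.
effect_at_mean = simple_slope(0.5, 0.3, m=0.0)   # effect when m = 0
effect_above = simple_slope(0.5, 0.3, m=1.0)     # effect one unit higher
```

    The proposed framework instead asks how the whole predictor-outcome relationship changes with the moderator, rather than reducing moderation to the single coefficient b3.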

  1. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
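    The difference between analytic and finite-difference derivatives can be sketched with a toy comparison (a generic example, not PyCycle code):

```python
import math

def central_diff(f, x, h=1e-6):
    """Second-order central finite-difference approximation of f'(x),
    the kind of derivative estimate that analytic derivatives replace.
    Its accuracy is limited by truncation and round-off error, and each
    evaluation costs two extra model runs per design variable."""
    return (f(x + h) - f(x - h)) / (2 * h)

dfdx_analytic = math.cos(1.0)                 # exact analytic derivative of sin
dfdx_fd = central_diff(math.sin, 1.0)         # finite-difference estimate
err = abs(dfdx_fd - dfdx_analytic)            # small but nonzero
```

    For a full cycle model the analytic route also avoids the step-size tuning that finite differencing requires, which is one reason gradient-based optimization becomes more stable.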

  2. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, which plays an important role in our everyday lives. Volatile compounds are, for example, characteristics of food where some of them can be perceivable by humans because of their aroma. They have a great influence on the decision making of consumers when they choose to use a product or not. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product leading to developing new products, which are more acceptable by consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Such data gathered based on different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also of the use of different analytical instruments in this field, highlighting the research needed for future focus.

  3. Optimizing piezoelectric receivers for acoustic power transfer applications

    NASA Astrophysics Data System (ADS)

    Gorostiaga, M.; Wapler, M. C.; Wallrabe, U.

    2018-07-01

    In this paper, we aim to optimize piezoelectric plate receivers for acoustic power transfer applications by analyzing the influence of the losses and of the acoustic boundary conditions. We derive the analytic expressions of the efficiency of the receiver with the optimal electric loads attached, and analyze the maximum efficiency value and its frequency with different loss and acoustic boundary conditions. To validate the analytical expressions that we have derived, we perform experiments in water with composite transducers of different filling fractions, and see that a lower acoustic impedance mismatch can compensate the influence of large dielectric and acoustic losses to achieve a good performance. Finally, we briefly compare the advantages and drawbacks of composite transducers and pure PZT (lead zirconate titanate) plates as acoustic power receivers, and conclude that 1–3 composites can achieve similar efficiency values in low power applications due to their adjustable acoustic impedance.

  4. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  5. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  6. SOMFlow: Guided Exploratory Cluster Analysis with Self-Organizing Maps and Analytic Provenance.

    PubMed

    Sacha, Dominik; Kraus, Matthias; Bernard, Jurgen; Behrisch, Michael; Schreck, Tobias; Asano, Yuki; Keim, Daniel A

    2018-01-01

    Clustering is a core building block for data analysis, aiming to extract otherwise hidden structures and relations from raw datasets, such as particular groups that can be effectively related, compared, and interpreted. A plethora of visual-interactive cluster analysis techniques has been proposed to date, however, arriving at useful clusterings often requires several rounds of user interactions to fine-tune the data preprocessing and algorithms. We present a multi-stage Visual Analytics (VA) approach for iterative cluster refinement together with an implementation (SOMFlow) that uses Self-Organizing Maps (SOM) to analyze time series data. It supports exploration by offering the analyst a visual platform to analyze intermediate results, adapt the underlying computations, iteratively partition the data, and to reflect previous analytical activities. The history of previous decisions is explicitly visualized within a flow graph, allowing to compare earlier cluster refinements and to explore relations. We further leverage quality and interestingness measures to guide the analyst in the discovery of useful patterns, relations, and data partitions. We conducted two pair analytics experiments together with a subject matter expert in speech intonation research to demonstrate that the approach is effective for interactive data analysis, supporting enhanced understanding of clustering results as well as the interactive process itself.

  7. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338
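    One common mean-plus-k-sigma formulation of a noise-based analytical threshold can be sketched as follows (the paper develops its own, more detailed definition for PCR-MPS; the read counts below are hypothetical):

```python
import statistics

def analytical_threshold(noise_reads, k=3.0):
    """A simple noise-based analytical threshold: mean background level
    plus k standard deviations. Observations above the threshold are
    treated as signal; those below are treated as noise."""
    return statistics.mean(noise_reads) + k * statistics.pstdev(noise_reads)

# Hypothetical background read counts at non-allelic positions
noise = [3, 5, 2, 4, 6, 3, 4, 5, 2, 6]
thr = analytical_threshold(noise)
```

    A defensible threshold of this kind defines the method detection limit that the paper argues is needed for forensic acceptance of PCR-MPS.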
  9. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method, and the machine performance it predicts is quantitatively compared with the finite element results; the two agree well. Finally, experimental results are given to further show the validity of the analysis.

  10. A conflict of analysis: analytical chemistry and milk adulteration in Victorian Britain.

    PubMed

    Steere-Williams, Jacob

    2014-08-01

    This article centres on a particularly intense debate within British analytical chemistry in the late nineteenth century, between local public analysts and the government chemists of the Inland Revenue Service. The two groups differed in both practical methodologies and in the interpretation of analytical findings. The most striking debates in this period were related to milk analysis, highlighted especially in Victorian courtrooms. It was in protracted court cases, such as the well known Manchester Milk Case in 1883, that analytical chemistry was performed between local public analysts and the government chemists, who were often both used as expert witnesses. Victorian courtrooms were thus important sites in the context of the uneven professionalisation of chemistry. I use this tension to highlight what Christopher Hamlin has called the defining feature of Victorian public health, namely conflicts of professional jurisdiction, which adds nuance to histories of the struggle of professionalisation and public credibility in analytical chemistry.

  11. Study of image reconstruction for terahertz indirect holography with quasi-optics receiver.

    PubMed

    Gao, Xiang; Li, Chao; Fang, Guangyou

    2013-06-01

    In this paper, an indirect holographic image reconstruction algorithm was studied for terahertz imaging with a quasi-optics receiver. Based on the combination of the reciprocity principle and modified quasi-optics theory, analytical expressions of the received spatial power distribution and its spectrum are obtained for the interference pattern of target wave and reference wave. These results clearly give the quantitative relationship between imaging quality and the parameters of a Gaussian beam, which provides a good criterion for terahertz quasi-optics transceivers design in terahertz off-axis holographic imagers. To validate the effectiveness of the proposed analysis method, some imaging results with a 0.3 THz prototype system are shown based on electromagnetic simulation.

  12. Gravity field error analysis: Applications of GPS receivers and gradiometers on low orbiting platforms

    NASA Technical Reports Server (NTRS)

    Schrama, E.

    1990-01-01

    The concept of a Global Positioning System (GPS) receiver as a tracking facility and a gradiometer as a separate instrument on a low orbiting platform offers a unique tool to map the Earth's gravitational field with unprecedented accuracies. The former technique allows determination of the spacecraft's ephemeris at any epoch to within 3 to 10 cm; the latter permits the measurement of the tensor of second order derivatives of the gravity field to within 0.01 to 0.0001 Eotvos units, depending on the type of gradiometer. First, a variety of error sources in gradiometry is described, with emphasis placed on the rotational problem, pursuing both a static and a dynamic approach. Next, an analytical technique is described and applied for an error analysis of gravity field parameters from gradiometer and GPS observation types. Results are discussed for various configurations proposed on Topex/Poseidon, Gravity Probe-B, and Aristoteles, indicating that GPS-only solutions may be computed up to degree and order 35, 55, and 85, respectively, whereas a combined GPS/gradiometer experiment on Aristoteles may result in an acceptable solution up to degree and order 240.

  13. Bragg-cell receiver study

    NASA Technical Reports Server (NTRS)

    Wilson, Lonnie A.

    1987-01-01

    Bragg-cell receivers are employed in specialized Electronic Warfare (EW) applications for the measurement of frequency. Their characteristics are fully characterized for simple RF emitter signals. This receiver is early in its development cycle when compared to the IFM receiver. Functional mathematical models for the Bragg-cell receiver are derived and presented in this report. Theoretical analysis and digital computer signal processing results are presented for the Bragg-cell receiver. Probability density function analyses are performed for the output frequency. The probability density distributions are observed to depart from the assumed distributions for wideband and complex RF signals. This analysis is significant for high-resolution, fine-grain EW Bragg-cell receiver systems.

  14. Student Receivables Management: Opportunities for Improved Practices.

    ERIC Educational Resources Information Center

    Jacquin, Jules C.; Goyal, Anil K.

    1995-01-01

    The college or university's business office can help reduce problems with student receivables through procedural review of the tuition revenue process, application of analytical methods, and improved operating practices. Admissions, financial aid, and billing offices must all be involved. (MSE)

  15. Integrated sudomotor axon reflex sweat stimulation for continuous sweat analyte analysis with individuals at rest.

    PubMed

    Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason

    2017-07-25

    Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants & sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those interested in on-skin sensing integrated with drug-delivery.

  16. Analysis and Simulation of Disadvantaged Receivers for Multiple-Input Multiple-Output Communications Systems

    DTIC Science & Technology

    2010-09-01

    Master's Thesis by Tracy A. Martin, September 2010: Analysis and Simulation of Disadvantaged Receivers for Multiple-Input Multiple-Output Communications Systems. Abstract excerpt: ... Channel State Information at the Transmitter (CSIT). A disadvantaged receiver is subsequently introduced to the system lacking the optimization enjoyed ...

  17. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on trustworthiness of visual analytic systems.

  18. Clustering in analytical chemistry.

    PubMed

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

    Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, principles of data clustering are presented with a direct focus on clustering of analytical data. The role of the clustering process in the analytical workflow is underlined, and its potential impact on the analytical workflow is emphasized.
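The clustering workflow sketched in this abstract can be illustrated in miniature. The following is a hedged sketch: a bare-bones Lloyd's k-means in NumPy applied to hypothetical two-element concentration measurements, not any method from the paper itself.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Bare-bones Lloyd's algorithm: assign samples to the nearest centroid,
    then update each centroid to the mean of its members."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Euclidean distance of every sample to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # keep the old centroid if a cluster temporarily goes empty
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

# hypothetical two-element concentration data forming two well-separated groups
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([1.0, 1.0], 0.1, (20, 2)),
               rng.normal([4.0, 4.0], 0.1, (20, 2))])
labels, _ = kmeans(X, 2)
```

With well-separated groups like these, the recovered labels partition the samples exactly along the two simulated clusters.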

  19. Using Configural Frequency Analysis as a Person-Centered Analytic Approach with Categorical Data

    ERIC Educational Resources Information Center

    Stemmler, Mark; Heine, Jörg-Henrik

    2017-01-01

    Configural frequency analysis and log-linear modeling are presented as person-centered analytic approaches for the analysis of categorical or categorized data in multi-way contingency tables. Person-centered developmental psychology, based on the holistic interactionistic perspective of the Stockholm working group around David Magnusson and Lars…

  20. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  1. Performance outlook of the SCRAP receiver

    NASA Astrophysics Data System (ADS)

    Lubkoll, Matti; von Backström, Theodor W.; Harms, Thomas M.

    2016-05-01

    A combined cycle (CC) concentrating solar power (CSP) plant provides significant potential to achieve an efficiency increase and an electricity cost reduction compared to current single-cycle plants. A CC CSP system requires a receiver technology capable of effectively transferring heat from concentrated solar irradiation to the pressurized air stream of a gas turbine. The small number of pressurized air receivers demonstrated to date have practical limitations when operating at high temperatures and pressures; a robust, scalable and efficient system has yet to be developed and commercialized. A novel receiver system, the Spiky Central Receiver Air Pre-heater (SCRAP), has been proposed to comply with these requirements. The SCRAP system is conceived as an efficient and robust pressurized air receiver that could be implemented in CC CSP concepts or in standalone solar Brayton cycles without a bottoming Rankine cycle. The presented work expands on previous publications on the thermal modeling of the receiver system. Based on the analysis of a single heat transfer element (spike), predictions of its thermal performance can be made. To this end, the existing thermal model was improved with heat transfer characteristics for the jet impingement region of the spike tip as well as heat transfer models simulating the interaction with the ambient environment. While the jet impingement cooling effect was simulated employing a commercial CFD code, the ambient heat transfer model was based on simplifying assumptions in order to employ empirical and analytical equations. The thermal efficiency of a spike under design conditions (flux 1.0 MW/m2, air outlet temperature just below 800 °C) was calculated at approximately 80 %, where convective heat losses account for 16.2 % of the absorbed radiation and radiative heat losses for a lower 2.9 %. This effect is due to peak surface temperatures occurring at the root of the spikes. It can thus be concluded that the geometric

  2. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
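The benefit of analytic derivatives over finite-difference approximation can be shown in miniature. This is a generic SciPy sketch on the Rosenbrock test function, not the PyCycle/OpenMDAO API: supplying an analytic gradient via `jac` avoids the extra function evaluations spent on differencing.

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosen_grad(x):
    # analytic partial derivatives of the Rosenbrock function
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

x0 = np.array([-1.2, 1.0])
res_fd = minimize(rosen, x0, method="BFGS")                  # finite-difference gradient
res_an = minimize(rosen, x0, jac=rosen_grad, method="BFGS")  # analytic gradient
```

Both runs reach the optimum at (1, 1); the analytic-gradient run requires fewer function evaluations because no calls are consumed by differencing steps.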

  3. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  4. The Dolinar Receiver in an Information Theoretic Framework

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Birnbaum, Kevin M.; Moision, Bruce E.; Dolinar, Samuel J.

    2011-01-01

    Optical communication at the quantum limit requires that measurements on the optical field be maximally informative, but devising physical measurements that accomplish this objective has proven challenging. The Dolinar receiver exemplifies a rare instance of success in distinguishing between two coherent states: an adaptive local oscillator is mixed with the signal prior to photodetection, which yields an error probability that meets the Helstrom lower bound with equality. Here we apply the same local-oscillator-based architecture with an information-theoretic optimization criterion. We begin with analysis of this receiver in a general framework for an arbitrary coherent-state modulation alphabet, and then we concentrate on two relevant examples. First, we study a binary antipodal alphabet and show that the Dolinar receiver's feedback function not only minimizes the probability of error, but also maximizes the mutual information. Next, we study ternary modulation consisting of antipodal coherent states and the vacuum state. We derive an analytic expression for a near-optimal local oscillator feedback function, and, via simulation, we determine its photon information efficiency (PIE). We provide the PIE versus dimensional information efficiency (DIE) trade-off curve and show that this modulation and receiver combination performs universally better than (generalized) on-off keying plus photon counting, although the advantage asymptotically vanishes as the bits-per-photon diverges towards infinity.
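For the binary antipodal alphabet, the Helstrom bound that the Dolinar receiver attains has a simple closed form. The following sketch compares it with the shot-noise-limited homodyne error probability; both are standard textbook formulas, not taken from this paper.

```python
import numpy as np
from scipy.special import erfc

def helstrom_error(nbar):
    """Helstrom bound for discriminating |alpha> vs |-alpha>,
    with nbar = |alpha|^2 the mean photon number."""
    return 0.5 * (1.0 - np.sqrt(1.0 - np.exp(-4.0 * nbar)))

def homodyne_error(nbar):
    """Shot-noise-limited homodyne error probability for the same state pair."""
    return 0.5 * erfc(np.sqrt(2.0 * nbar))

p_h = helstrom_error(1.0)  # the Dolinar receiver attains this exactly
p_x = homodyne_error(1.0)  # conventional homodyne does strictly worse
```

At one photon per symbol the Helstrom bound (≈ 0.46%) is roughly a factor of five below the homodyne limit (≈ 2.3%), which is the gap the adaptive local oscillator closes.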

  5. Jackknife variance of the partial area under the empirical receiver operating characteristic curve.

    PubMed

    Bandos, Andriy I; Guo, Ben; Gur, David

    2017-04-01

    Receiver operating characteristic analysis provides an important methodology for assessing traditional (e.g., imaging technologies and clinical practices) and new (e.g., genomic studies, biomarker development) diagnostic problems. The area under the clinically/practically relevant part of the receiver operating characteristic curve (partial area or partial area under the receiver operating characteristic curve) is an important performance index summarizing diagnostic accuracy at multiple operating points (decision thresholds) that are relevant to actual clinical practice. A robust estimate of the partial area under the receiver operating characteristic curve is provided by the area under the corresponding part of the empirical receiver operating characteristic curve. We derive a closed-form expression for the jackknife variance of the partial area under the empirical receiver operating characteristic curve. Using the derived analytical expression, we investigate the differences between the jackknife variance and a conventional variance estimator. The relative properties in finite samples are demonstrated in a simulation study. The developed formula enables an easy way to estimate the variance of the empirical partial area under the receiver operating characteristic curve, thereby substantially reducing the computation burden, and provides important insight into the structure of the variability. We demonstrate that when compared with the conventional approach, the jackknife variance has substantially smaller bias, and leads to a more appropriate type I error rate of the Wald-type test. The use of the jackknife variance is illustrated in the analysis of a data set from a diagnostic imaging study.
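The quantities involved can be sketched generically: an empirical partial AUC obtained by trapezoidal integration of the empirical ROC over a restricted FPR range, and a plain leave-one-out jackknife variance. This is an illustrative sketch, not the authors' closed-form expression.

```python
import numpy as np

def partial_auc(neg, pos, fpr_max=0.3):
    """Area under the empirical ROC curve restricted to FPR in [0, fpr_max]."""
    thr = np.sort(np.concatenate([neg, pos]))[::-1]
    fpr = np.array([0.0] + [(neg >= t).mean() for t in thr] + [1.0])
    tpr = np.array([0.0] + [(pos >= t).mean() for t in thr] + [1.0])
    grid = np.linspace(0.0, fpr_max, 200)
    return np.trapz(np.interp(grid, fpr, tpr), grid)

def jackknife_variance(neg, pos, fpr_max=0.3):
    """Leave-one-out jackknife variance of the empirical partial AUC."""
    stats = [partial_auc(np.delete(neg, i), pos, fpr_max) for i in range(len(neg))]
    stats += [partial_auc(neg, np.delete(pos, i), fpr_max) for i in range(len(pos))]
    stats = np.asarray(stats)
    n = len(stats)
    return (n - 1) / n * np.sum((stats - stats.mean()) ** 2)

rng = np.random.default_rng(0)
neg, pos = rng.normal(0, 1, 50), rng.normal(1, 1, 50)  # simulated scores
pauc = partial_auc(neg, pos)
var = jackknife_variance(neg, pos)
```

The closed-form expression derived in the paper avoids exactly this brute-force recomputation over every left-out case, which is the computational burden it eliminates.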

  6. Predictive factors in patients with hepatocellular carcinoma receiving sorafenib therapy using time-dependent receiver operating characteristic analysis.

    PubMed

    Nishikawa, Hiroki; Nishijima, Norihiro; Enomoto, Hirayuki; Sakamoto, Azusa; Nasu, Akihiro; Komekado, Hideyuki; Nishimura, Takashi; Kita, Ryuichi; Kimura, Toru; Iijima, Hiroko; Nishiguchi, Shuhei; Osaki, Yukio

    2017-01-01

    To investigate the impact of variables assessed before sorafenib therapy on clinical outcomes in hepatocellular carcinoma (HCC) patients receiving sorafenib, and to further assess and compare the predictive performance of continuous parameters using time-dependent receiver operating characteristic (ROC) analysis. A total of 225 HCC patients were analyzed. We retrospectively examined factors related to overall survival (OS) and progression-free survival (PFS) using univariate and multivariate analyses. Subsequently, we performed time-dependent ROC analysis of the continuous parameters that were significant in the multivariate analysis for OS and PFS. The total sum of the area under the ROC over all time points (defined as the TAAT score) was calculated for each case. Our cohort included 175 male and 50 female patients (median age, 72 years), comprising 158 Child-Pugh A and 67 Child-Pugh B patients. The median OS time was 0.68 years, while the median PFS time was 0.24 years. On multivariate analysis, gender, body mass index (BMI), Child-Pugh classification, extrahepatic metastases, tumor burden, aspartate aminotransferase (AST) and alpha-fetoprotein (AFP) were identified as significant predictors of OS, and ECOG performance status, Child-Pugh classification and extrahepatic metastases were identified as significant predictors of PFS. Among the three continuous variables (i.e., BMI, AST and AFP), AFP had the highest TAAT score for the entire cohort. In subgroup analyses, AFP had the highest TAAT score of the three continuous variables except in the Child-Pugh B and female subgroups. Among the continuous variables, AFP could thus have the highest predictive accuracy for survival in HCC patients undergoing sorafenib therapy.

  7. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    NASA Astrophysics Data System (ADS)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main approaches to gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method called the assist-stress intensity factor (assist-SIF) gear contact model is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of tooth cracking is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE approach is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method performs effectively and feasibly for tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  8. Two-stage solar power tower cavity-receiver design and thermal performance analysis

    NASA Astrophysics Data System (ADS)

    Pang, Liping; Wang, Ting; Li, Ruihua; Yang, Yongping

    2017-06-01

    A new type of two-stage solar power tower cavity receiver is designed, and a calculation procedure for radiation, convection and flow under a Gaussian heat flux is established to determine the piping layout and geometries of receivers I and II; the heat flux distribution at different positions is obtained. The main thermal performance parameters (water/steam temperature, steam quality, wall temperature along the typical tubes, and pressure drop) are then specified according to the heat transfer and flow characteristics of two-phase flow. Meanwhile, a systematic design process is proposed and an analysis of the thermal performance of the two receivers is conducted. Results show that this type of two-stage cavity receiver can minimize the size and reduce the mean temperature of receiver I while raising the average heat flux, thus increasing the thermal efficiency of the two receivers; in addition, the multiple serpentine tubes from the header produce a more uniform distribution of the outlet parameters, preventing wall overheating.

  9. The limited relevance of analytical ethics to the problems of bioethics.

    PubMed

    Holmes, R L

    1990-04-01

    Philosophical ethics comprises metaethics, normative ethics and applied ethics. These have characteristically received analytic treatment by twentieth-century Anglo-American philosophy. But there has been disagreement over their interrelationship to one another and the relationship of analytical ethics to substantive morality--the making of moral judgments. I contend that the expertise philosophers have in either theoretical or applied ethics does not equip them to make sounder moral judgments on the problems of bioethics than nonphilosophers. One cannot "apply" theories like Kantianism or consequentialism to get solutions to practical moral problems unless one knows which theory is correct, and that is a metaethical question over which there is no consensus. On the other hand, to presume to be able to reach solutions through neutral analysis of problems is unavoidably to beg controversial theoretical issues in the process. Thus, while analytical ethics can play an important clarificatory role in bioethics, it can neither provide, nor substitute for, moral wisdom.

  10. Dynamic analysis of a flexible spacecraft with rotating components. Volume 1: Analytical developments

    NASA Technical Reports Server (NTRS)

    Bodley, C. S.; Devers, A. D.; Park, A. C.

    1975-01-01

    Analytical procedures and digital computer code are presented for the dynamic analysis of a flexible spacecraft with rotating components. Topics considered include: (1) nonlinear response in the time domain, and (2) linear response in the frequency domain. The spacecraft is assumed to consist of an assembly of connected rigid or flexible subassemblies. The total system is not restricted to a topological connection arrangement and may be acting under the influence of passive or active control systems and external environments. The analytics and associated digital code provide the user with the capability to establish the spacecraft system's nonlinear total response for specified initial conditions, linear perturbation response about a calculated or specified nominal motion, general frequency response and graphical display, and spacecraft system stability analysis.

  11. Cavity radiation model for solar central receivers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipps, F.W.

    1981-01-01

    The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (Cavity Radiation Exchange Analysis Model) for application to solar central receiver systems. The zone generating capability of CREAM has been used in several solar repowering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method. A formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and to values from Sparrow's method. Sparrow's method is based on a double contour integral and its reduction to a single integral, which is approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.
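The configuration factors such a generator computes can be checked against closed-form results. The sketch below estimates the view factor between coaxial parallel disks (a standard textbook geometry) with a cosine-weighted Monte Carlo ray trace and compares it with the analytic formula; this is an illustrative check, not the CREAM/NUSSELT code.

```python
import numpy as np

def viewfactor_disks_analytic(r1, r2, h):
    """Closed-form view factor from disk 1 to a coaxial parallel disk 2."""
    R1, R2 = r1 / h, r2 / h
    S = 1 + (1 + R2**2) / R1**2
    return 0.5 * (S - np.sqrt(S**2 - 4 * (R2 / R1)**2))

def viewfactor_disks_mc(r1, r2, h, n=200_000, seed=0):
    """Monte Carlo estimate: cosine-weighted rays from disk 1, hit test on disk 2."""
    rng = np.random.default_rng(seed)
    # uniform points on the emitting disk
    r = r1 * np.sqrt(rng.random(n))
    phi = 2 * np.pi * rng.random(n)
    x0, y0 = r * np.cos(phi), r * np.sin(phi)
    # cosine-weighted directions: sin^2(theta) uniform on [0, 1]
    st = np.sqrt(rng.random(n))
    ct = np.sqrt(1.0 - st**2)
    psi = 2 * np.pi * rng.random(n)
    # intersect each ray with the receiving plane z = h
    t = h / ct
    xh, yh = x0 + t * st * np.cos(psi), y0 + t * st * np.sin(psi)
    return np.mean(xh**2 + yh**2 <= r2**2)

f_an = viewfactor_disks_analytic(1.0, 1.0, 1.0)  # (3 - sqrt(5))/2 ≈ 0.382
f_mc = viewfactor_disks_mc(1.0, 1.0, 1.0)
```

With 200,000 rays the stochastic estimate agrees with the closed form to about three decimal places, the kind of verification used for view-factor subroutines.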

  12. Meta-analysis as Statistical and Analytical Method of Journal's Content Scientific Evaluation.

    PubMed

    Masic, Izet; Begic, Edin

    2015-02-01

    A meta-analysis is a statistical and analytical method which combines and synthesizes different independent studies and integrates their results into one common result. We analyzed the journals "Medical Archives", "Materia Socio Medica" and "Acta Informatica Medica", which are indexed in the most eminent databases of the biomedical field. The study was retrospective and descriptive in character and covered the calendar year 2014. It included six issues of each of the three journals (18 issues in total). In this period a total of 291 articles were published (110 in "Medical Archives", 97 in "Materia Socio Medica", and 84 in "Acta Informatica Medica"). The largest number were original articles; smaller numbers were published as professional papers, review articles and case reports. Clinical topics were most common in the first two journals, while articles in "Acta Informatica Medica" belonged to the field of medical informatics, a pre-clinical medical discipline. Articles usually required a period of fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe. The authors were most often from Bosnia and Herzegovina, followed by Iran, Kosovo and Macedonia. The number of articles published each year is increasing, with greater participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broader spectrum of topics and a growing number of original articles. Greater support from the wider scientific community is needed for the further development of all three journals.

  13. Designing a Marketing Analytics Course for the Digital Age

    ERIC Educational Resources Information Center

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  14. [Analysis and experimental verification of sensitivity and SNR of laser warning receiver].

    PubMed

    Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue

    2009-01-01

    To counter the increasingly serious threat from hostile lasers in modern warfare, research on laser warning technology and systems is urgent; the sensitivity and signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on signal statistical detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. Firstly, the probability distributions of the laser signal and receiver noise were analyzed. Secondly, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation was established by introducing a detection probability factor and a false alarm rate factor; then, mathematical expressions for the sensitivity and SNR were deduced. Finally, using this method, the sensitivity and SNR of the sinusoidal grating laser warning receiver developed by our group were analyzed, and the theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in the performance analysis of LWRs.

  15. Analytical and multibody modeling for the power analysis of standing jumps.

    PubMed

    Palmieri, G; Callegari, M; Fioretti, S

    2015-01-01

    Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity at three specified instants of the jump. The second method is based on a multibody model that simulates the jumps by processing the data obtained from a three-dimensional (3D) motion capture system and the dynamometric measurements obtained from the force platforms. The multibody model is developed with OpenSim, an open-source software package which provides tools for the kinematic and dynamic analyses of 3D human body models. The study is focused on two of the typical tests used to evaluate the muscular activity of the lower limbs: the counter movement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of the total mechanical work and mean power exerted in standing jumps.
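A plausible reconstruction of such a three-instant analytical formulation for a vertical jump can be sketched as follows. This is a hypothetical illustration (the take-off speed is recovered from the flight phase, and work is potential plus kinetic energy at take-off), not the authors' exact equations; all numeric inputs are made up.

```python
import math

def jump_work_power(m, h_start, h_toff, h_apex, t_push, g=9.81):
    """Hypothetical sketch: mechanical work and mean power of a vertical jump
    from the CoG height at push-off start, take-off and flight apex.
    m [kg], heights [m], t_push = push-off duration [s]."""
    # take-off speed from the ballistic flight phase: v^2 = 2 g (h_apex - h_toff)
    v_toff = math.sqrt(2 * g * (h_apex - h_toff))
    # work = CoG lift during push-off + kinetic energy at take-off
    work = m * g * (h_toff - h_start) + 0.5 * m * v_toff**2
    return work, work / t_push  # total work [J], mean power [W]

# made-up example: 70 kg subject, 0.3 m CoG lift, 0.4 m flight rise, 0.3 s push-off
W, P = jump_work_power(m=70, h_start=0.9, h_toff=1.2, h_apex=1.6, t_push=0.3)
```

The appeal of such a formulation is that only three CoG positions are needed, whereas the multibody route requires full motion-capture and force-platform data.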

  16. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  17. Design requirements, challenges, and solutions for high-temperature falling particle receivers

    NASA Astrophysics Data System (ADS)

    Christian, Joshua; Ho, Clifford

    2016-05-01

    Falling particle receivers (FPR) utilize small particles as a heat collecting medium within a cavity receiver structure. Previous analyses of FPR systems include computational fluid dynamics (CFD), analytical evaluations, and experiments to determine the feasibility and achievability of this CSP technology. Sandia National Laboratories has fabricated and tested a 1 MWth FPR that consists of a cavity receiver, top hopper, bottom hopper, support structure, particle elevator, flux target, and instrumentation. Design requirements and inherent challenges were addressed to enable continuous operation of flowing particles under high-flux conditions and particle temperatures over 700 °C. Challenges include withstanding extremely high temperatures (up to 1200 °C on the walls of the cavity), maintaining particle flow and conveyance, measuring temperatures and mass flow rates, filtering out debris, protecting components from direct flux spillage, and measuring irradiance in the cavity. Each of the major components of the system is discussed in terms of design requirements, associated challenges and corresponding solutions. The intent is to provide industry and researchers with lessons learned to avoid pitfalls and technical problems encountered during the development of Sandia's prototype particle receiver system at the National Solar Thermal Test Facility (NSTTF).

  18. Approximate analytical solutions in the analysis of elastic structures of complex geometry

    NASA Astrophysics Data System (ADS)

    Goloskokov, Dmitriy P.; Matrosov, Alexander V.

    2018-05-01

    A method of analytical decomposition for the analysis of plane structures of complex configuration is presented. For each part of the structure, in the form of a rectangle, all the components of the stress-strain state are constructed by the superposition method. The method is based on two solutions derived in the form of trigonometric series with unknown coefficients using the method of initial functions. The coefficients are determined from the system of linear algebraic equations obtained by satisfying the boundary conditions and the conditions for joining the structure's parts. The components of the stress-strain state of a bent plate with holes are calculated using the analytical decomposition method.

  19. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    PubMed Central

    Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon

    2010-01-01

    Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed for different numbers of antenna elements in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the attainable BER. PMID:22163510
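The benefit of receive diversity can be quantified with the classic closed-form average BER of BPSK under L-branch maximal-ratio combining in i.i.d. Rayleigh fading (a standard textbook result, e.g. Proakis; not the GBSBE channel model used in the paper):

```python
import math

def ber_mrc_rayleigh(snr_db, L):
    """Average BPSK BER with L-branch maximal-ratio combining over
    i.i.d. Rayleigh fading branches (classic closed form)."""
    g = 10 ** (snr_db / 10)          # average SNR per branch, linear scale
    mu = math.sqrt(g / (1 + g))
    return ((1 - mu) / 2) ** L * sum(
        math.comb(L - 1 + k, k) * ((1 + mu) / 2) ** k for k in range(L)
    )

# more receive antennas -> steeper diversity slope, lower BER at fixed SNR
bers = [ber_mrc_rayleigh(10, L) for L in (1, 2, 4)]
```

At 10 dB per-branch SNR the single-antenna BER is about 2.3%, and each doubling of the array size reduces it by more than an order of magnitude, mirroring the trend reported in the paper.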

  20. Wavelet and receiver operating characteristic analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    McCaffery, G.; Griffith, T. M.; Naka, K.; Frennaux, M. P.; Matthai, C. C.

    2002-02-01

    Multiresolution wavelet analysis has been used to study the heart rate variability in two classes of patients with different pathological conditions. The scale-dependent measure of Thurner et al. was found to be statistically significant in discriminating patients suffering from hypertrophic cardiomyopathy from a control set of normal subjects. We have performed Receiver Operating Characteristic (ROC) analysis and found the ROC area to be a useful measure by which to label the significance of the discrimination, as well as to describe the severity of heart dysfunction.
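The ROC area used here to label the significance of a discrimination is equivalent to the Mann-Whitney probability that a randomly chosen patient score exceeds a randomly chosen control score. A minimal sketch with generic scores (not the heart-rate data of the paper):

```python
import numpy as np

def roc_area(control, patient):
    """ROC area via the Mann-Whitney statistic: P(patient score > control
    score), counting ties as one half."""
    c = np.asarray(control, dtype=float)[:, None]
    p = np.asarray(patient, dtype=float)[None, :]
    return (p > c).mean() + 0.5 * (p == c).mean()

auc_sep = roc_area([1, 2, 3], [4, 5, 6])   # perfect separation -> 1.0
auc_null = roc_area([1, 2, 3], [1, 2, 3])  # indistinguishable -> ≈ 0.5
```

An area of 0.5 corresponds to chance-level discrimination, 1.0 to perfect separation, which is why the ROC area works as a severity/significance index.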

  1. Two-port network analysis and modeling of a balanced armature receiver.

    PubMed

    Kim, Noori; Allen, Jont B

    2013-07-01

    Models for acoustic transducers, such as loudspeakers, mastoid bone-drivers, hearing-aid receivers, etc., are critical elements in many acoustic applications. Acoustic transducers employ two-port models to convert between acoustic and electromagnetic signals. This study analyzes a widely used commercial hearing-aid receiver, the ED series manufactured by Knowles Electronics, Inc. Electromagnetic transducer modeling must consider two key elements: a semi-inductor and a gyrator. The semi-inductor accounts for electromagnetic eddy currents, the 'skin effect' of a conductor (Vanderkooy, 1989), while the gyrator (McMillan, 1946; Tellegen, 1948) accounts for the anti-reciprocity characteristic [Lenz's law (Hunt, 1954, p. 113)]. Aside from Hunt (1954), no publications we know of have included the gyrator element in their electromagnetic transducer models. The most prevalent method of transducer modeling invokes the mobility method: an ideal transformer instead of a gyrator, followed by the dual of the mechanical circuit (Beranek, 1954). The mobility approach greatly complicates the analysis. The present study proposes a novel, simplified and rigorous receiver model. Hunt's two-port parameters, the electrical impedance Ze(s), acoustic impedance Za(s) and electro-acoustic transduction coefficient Ta(s), are calculated using ABCD and impedance matrix methods (Van Valkenburg, 1964). The results from electrical input impedance measurements Zin(s), which vary with given acoustical loads, are used in the calculation (Weece and Allen, 2010). The hearing-aid receiver transducer model is designed based on the energy transformation flow [electric → mechanic → acoustic]. The model has been verified with electrical input impedance, diaphragm velocity in vacuo, and output pressure measurements. This receiver model is suitable for designing most electromagnetic transducers and it can ultimately improve the design of hearing-aid devices by providing a simplified yet accurate, physically
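The ABCD and impedance-matrix bookkeeping involved can be sketched generically. The example below cascades a series-impedance and a shunt-admittance section (hypothetical element values, not the receiver's parameters) and converts the result to a Z-matrix; note that for such reciprocal networks Z12 = Z21, whereas a gyrator is anti-reciprocal (Z12 = -Z21), which is exactly why it cannot be replaced by an ideal transformer without complicating the analysis.

```python
import numpy as np

def cascade(*abcd):
    """Chain two-port ABCD (transmission) matrices by matrix multiplication."""
    out = np.eye(2)
    for m in abcd:
        out = out @ m
    return out

def abcd_to_z(m):
    """Convert an ABCD matrix to the impedance (Z) matrix:
    Z11 = A/C, Z12 = (AD - BC)/C, Z21 = 1/C, Z22 = D/C."""
    (A, B), (C, D) = m
    return np.array([[A / C, (A * D - B * C) / C],
                     [1 / C, D / C]])

# hypothetical series impedance Zs followed by shunt admittance Yp
Zs, Yp = 2.0, 0.1
series = np.array([[1.0, Zs], [0.0, 1.0]])
shunt = np.array([[1.0, 0.0], [Yp, 1.0]])
Z = abcd_to_z(cascade(series, shunt))
```

For this cascade AD - BC = 1, so Z12 = Z21 (reciprocity); inserting a gyrator stage would break that symmetry, which is the anti-reciprocity the paper models.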

  2. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of the 2015 Wepal proficiency test are presented. Only elements with radioactive isotopes of medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal based on Z-score distributions showed that most results obtained |Z-scores| ≤ 3.
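The Z-score used in such proficiency assessments is simply the deviation of the laboratory result from the assigned value in units of the standard deviation for proficiency assessment. A minimal sketch (the numeric values are hypothetical, and the |z| ≤ 2 / 2 < |z| < 3 interpretation bands are the common convention, not necessarily Wepal's exact scheme):

```python
def z_score(measured, assigned, sigma_p):
    """Proficiency-test z-score: (lab result - assigned value) / sigma_p.
    Common reading: |z| <= 2 satisfactory, 2 < |z| < 3 questionable,
    |z| >= 3 unsatisfactory."""
    return (measured - assigned) / sigma_p

# hypothetical Fe result in a soil sample, mg/kg
z = z_score(measured=41200.0, assigned=40000.0, sigma_p=2000.0)
```

Here z = 0.6, comfortably inside the |z| ≤ 3 band reported for most of the laboratory's results.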

  3. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in each of two different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths, the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
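    A sketch of the path-analytic estimate and its Monte Carlo confidence interval on simulated within-participant data. The difference-score regressions with the mean-centered mediator sum follow the Judd et al. setup as summarized above; the data, coefficients, and sample size are synthetic illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated two-condition within-participant data (illustrative, not the paper's example)
n = 60
M1 = rng.normal(0.0, 1.0, n)
M2 = M1 + 0.5 + rng.normal(0.0, 1.0, n)        # the condition shifts the mediator
Y1 = 0.4 * M1 + rng.normal(0.0, 1.0, n)
Y2 = 0.4 * M2 + 0.2 + rng.normal(0.0, 1.0, n)  # the mediator carries the effect onto Y

dM, dY = M2 - M1, Y2 - Y1
sumM_c = (M2 + M1) - (M2 + M1).mean()          # mean-centered mediator sum (covariate)

# Path a: effect of the manipulation on the mediator (mean difference)
a = dM.mean()
se_a = dM.std(ddof=1) / np.sqrt(n)

# Path b: regress the Y difference on the M difference (plus the centered sum)
X = np.column_stack([np.ones(n), dM, sumM_c])
beta, res, *_ = np.linalg.lstsq(X, dY, rcond=None)
sigma2 = res[0] / (n - X.shape[1])
covb = sigma2 * np.linalg.inv(X.T @ X)
b, se_b = beta[1], np.sqrt(covb[1, 1])

# Monte Carlo confidence interval for the indirect effect a*b
draws = rng.normal(a, se_a, 20000) * rng.normal(b, se_b, 20000)
ci_lo, ci_hi = np.percentile(draws, [2.5, 97.5])
```

The indirect effect is the product a·b; mediation is supported when the interval excludes zero, with no separate component-wise tests required.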

  4. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    NASA Astrophysics Data System (ADS)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  5. Using Microwave Sample Decomposition in Undergraduate Analytical Chemistry

    NASA Astrophysics Data System (ADS)

    Griff Freeman, R.; McCurdy, David L.

    1998-08-01

    A shortcoming of many undergraduate classes in analytical chemistry is that students receive little exposure to sample preparation in chemical analysis. This paper reports the progress made in introducing microwave sample decomposition into several quantitative analysis experiments at Truman State University. Two experiments being performed in our current laboratory rotation include closed-vessel microwave decomposition applied to the classical gravimetric determination of nickel and the determination of sodium in snack foods by flame atomic emission spectrometry. A third lab, using open-vessel microwave decomposition for the Kjeldahl nitrogen determination, is now ready for student trial. Microwave decomposition reduces the time needed to complete these experiments and significantly increases the student awareness of the importance of sample preparation in quantitative chemical analyses, providing greater breadth and realism in the experiments.

  6. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Design of coherent receiver optical front end for unamplified applications.

    PubMed

    Zhang, Bo; Malouin, Christian; Schmidt, Theodore J

    2012-01-30

    Advanced modulation schemes together with coherent detection and digital signal processing have enabled the next generation of high-bandwidth optical communication systems. One of the key advantages of coherent detection is its superior receiver sensitivity compared to direct detection receivers, due to the gain provided by the local oscillator (LO). In unamplified applications, such as metro and edge networks, the ultimate receiver sensitivity is dictated by the amount of shot noise, thermal noise, and the residual beating of the local oscillator with relative intensity noise (LO-RIN). We show that the best sensitivity is achieved when the thermal noise is balanced with the residual LO-RIN beat noise, which results in an optimum LO power. The impact of thermal noise from the transimpedance amplifier (TIA), the RIN from the LO, and the common mode rejection ratio (CMRR) of a balanced photodiode are individually analyzed via analytical models and compared to numerical simulations. The analytical model results match well with those of the numerical simulations, providing a simplified method to quantify the impact of receiver design tradeoffs. For a practical 100 Gb/s integrated coherent receiver with 7% FEC overhead, we show that an optimum receiver sensitivity of -33 dBm can be achieved at the GFEC cliff of 8.55E-5 if the LO power is optimized at 11 dBm. We also discuss a potential method to monitor the imperfections of a balanced and integrated coherent receiver.
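    The noise-balance argument can be sketched numerically. The signal term scales linearly with LO power, shot noise linearly, thermal noise not at all, and the residual LO-RIN beat quadratically, so an interior optimum appears where the thermal and RIN-beat terms balance. All parameter values below (responsivity, bandwidth, TIA noise density, RIN, CMRR leakage) are illustrative assumptions, not the paper's exact design values:

```python
import numpy as np

# Illustrative receiver parameters (assumptions, not the paper's values)
R = 0.8                         # photodiode responsivity [A/W]
B = 32e9                        # noise bandwidth [Hz]
q = 1.602e-19                   # electron charge [C]
i_th = 20e-12                   # TIA input-referred noise density [A/sqrt(Hz)]
rin = 10 ** (-145 / 10)         # LO relative intensity noise, -145 dB/Hz
cmrr = 10 ** (-20 / 10)         # residual LO-RIN leakage after balancing (-20 dB)
Ps = 10 ** (-33 / 10) * 1e-3    # received signal power, -33 dBm

P_lo = np.logspace(-4, -1, 300)                 # LO power sweep: 0.1 mW .. 100 mW

sig2 = (2 * R * np.sqrt(Ps * P_lo)) ** 2        # balanced-detector signal power
shot = 2 * q * R * P_lo * B                     # LO-dominated shot noise
thermal = i_th ** 2 * B                         # TIA thermal noise (LO-independent)
rin_beat = (R * P_lo) ** 2 * rin * cmrr * B     # residual LO-RIN beat noise

snr = sig2 / (shot + thermal + rin_beat)
P_lo_opt = P_lo[np.argmax(snr)]                 # thermal and RIN-beat terms balance here
```

Analytically the shot term drops out of the optimization, giving P_lo_opt = i_th / (R·sqrt(RIN·CMRR)); with these illustrative numbers that is a few milliwatts.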

  8. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range of 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
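    The variance decomposition described above can be sketched with the abstract's own round figures (total CV 29%, analytical CV 13%); since independent variance components combine in quadrature, the pre-analytical CV is obtained by subtraction:

```python
import math

def pre_analytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV obtained by subtracting the analytical variance
    component from the total variance (CVs combine in quadrature)."""
    return math.sqrt(cv_total**2 - cv_analytical**2)

# Duplicate hair bundles: total CV 29%, analytical CV 13% (figures from the abstract)
cv_pre = pre_analytical_cv(0.29, 0.13)   # close to the reported 26%
# 95%-uncertainty interval half-width, as used in the study: 2 * CV_total
half_width = 2 * 0.29
```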

  9. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, and storage of EBC), analytical (sensitivity of applied methods, standardization), and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical, and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  10. A cost-performance model for ground-based optical communications receiving telescopes

    NASA Technical Reports Server (NTRS)

    Lesh, J. R.; Robinson, D. L.

    1986-01-01

    An analytical cost-performance model for a ground-based optical communications receiving telescope is presented. The model considers costs of existing telescopes as a function of diameter and field of view. This, coupled with communication performance as a function of receiver diameter and field of view, yields the appropriate telescope cost versus communication performance curve.
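    A sketch of the kind of trade-off such a model captures, assuming a hypothetical power-law cost exponent and aperture-area link gain; the constants and the exponent are illustrative assumptions, not the values fitted in the paper:

```python
import numpy as np

# Illustrative cost-performance trade (constants are assumptions, not the paper's fit)
k_cost, x = 1.0e5, 2.5           # cost = k * D**x  [$, m]
D = np.linspace(1.0, 10.0, 100)  # receiver diameter sweep [m]

cost = k_cost * D ** x
# Photon-collection gain grows with aperture area: +20*log10(D/D_ref) dB
gain_db = 20.0 * np.log10(D / D[0])
# Marginal cost of each extra dB of link margin along the curve
dollars_per_db = np.gradient(cost) / np.gradient(gain_db)
```

Because cost grows as a power of diameter while link gain grows only logarithmically, each additional dB of margin becomes progressively more expensive, which is the basic shape a cost-performance curve of this kind exhibits.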

  11. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  12. Stability Analysis of Receiver ISB for BDS/GPS

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Hao, J. M.; Tian, Y. G.; Yu, H. L.; Zhou, Y. L.

    2017-07-01

    Stability analysis of receiver ISB (Inter-System Bias) is essential for understanding the behavior of ISB as well as for ISB modeling and prediction. In order to analyze the long-term stability of ISB, three weeks of data from MGEX (Multi-GNSS Experiment), one week each from 2014, 2015, and 2016, are processed with the precise satellite clock and orbit products provided by Wuhan University and GeoForschungsZentrum (GFZ). Using the ISB calculated by BDS (BeiDou Navigation Satellite System)/GPS (Global Positioning System) combined PPP (Precise Point Positioning), the daily and weekly stability of ISB are investigated. The experimental results show that the diurnal variation of ISB is stable, and the average daily standard deviation is about 0.5 ns. The weekly averages and standard deviations of ISB vary greatly between years. The weekly averages of ISB depend on receiver type. There is a systematic bias between the ISB calculated from the precise products of Wuhan University and that from GFZ. In addition, this systematic bias in the weekly average ISB is consistent across stations.
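    The daily and weekly stability metrics can be sketched on a simulated ISB series; the ~20 ns receiver bias and 0.5 ns epoch scatter below are illustrative values chosen only to mirror the magnitude of the reported daily standard deviation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated epoch-wise ISB estimates from daily PPP runs (illustrative:
# a receiver-dependent bias of ~20 ns with ~0.5 ns epoch scatter)
n_days, epochs_per_day = 7, 2880     # 30 s epochs over 24 h
isb = 20.0 + 0.5 * rng.standard_normal((n_days, epochs_per_day))   # [ns]

daily_mean = isb.mean(axis=1)
daily_std = isb.std(axis=1, ddof=1)  # daily stability metric
weekly_mean = daily_mean.mean()
weekly_std = daily_mean.std(ddof=1)  # week-level stability of the daily means
```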

  13. Comprehensive Reactive Receiver Modeling for Diffusive Molecular Communication Systems: Reversible Binding, Molecule Degradation, and Finite Number of Receptors.

    PubMed

    Ahmadzadeh, Arman; Arjmandi, Hamidreza; Burkovski, Andreas; Schober, Robert

    2016-10-01

    This paper studies the problem of receiver modeling in molecular communication systems. We consider the diffusive molecular communication channel between a transmitter nano-machine and a receiver nano-machine in a fluid environment. The information molecules released by the transmitter nano-machine into the environment can degrade in the channel via a first-order degradation reaction and those that reach the receiver nano-machine can participate in a reversible bimolecular reaction with receiver receptor proteins. Thereby, we distinguish between two scenarios. In the first scenario, we assume that the entire surface of the receiver is covered by receptor molecules. We derive a closed-form analytical expression for the expected received signal at the receiver, i.e., the expected number of activated receptors on the surface of the receiver. Then, in the second scenario, we consider the case where the number of receptor molecules is finite and the uniformly distributed receptor molecules cover the receiver surface only partially. We show that the expected received signal for this scenario can be accurately approximated by the expected received signal for the first scenario after appropriately modifying the forward reaction rate constant. The accuracy of the derived analytical results is verified by Brownian motion particle-based simulations of the considered environment, where we also show the impact of the effect of receptor occupancy on the derived analytical results.
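    A sketch of the reversible-binding dynamics and the effective-rate idea described above, using an explicit Euler integration of the mean-field rate equation with illustrative rate constants (these are not the paper's closed-form analytical expressions):

```python
import numpy as np

# Reversible receptor binding at the receiver surface (illustrative rates):
#   dB/dt = kf_eff * c(t) * (N - B) - kb * B
# Partial receptor coverage is folded into an effective forward rate,
# mirroring the approximation of scenario 2 by a modified scenario 1.
kf, kb = 1.0e-3, 0.5        # forward and backward rate constants (illustrative)
N = 1000                    # number of receptor molecules
coverage = 0.3              # fraction of the surface covered by receptors
kf_eff = kf * coverage      # modified forward rate constant (illustrative scaling)

dt, steps = 1e-3, 20000
B = 0.0
history = np.empty(steps)
for i in range(steps):
    c = 500.0 * np.exp(-0.1 * i * dt)      # decaying molecule concentration at the receiver
    dB = kf_eff * c * (N - B) - kb * B     # receptors activated minus released
    B = min(max(B + dt * dB, 0.0), N)      # explicit Euler step, clipped to [0, N]
    history[i] = B
```

The expected received signal is the number of activated receptors B(t); it rises while molecules arrive and relaxes back as the concentration decays and bound molecules unbind.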

  14. Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.

  15. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  16. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  17. Climate Analytics as a Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.

    2014-01-01

    Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
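    The MapReduce pattern that MERRA/AS builds on can be sketched in miniature: each data chunk is mapped independently to a partial result, and the partials are combined associatively in the reduce step. The chunked values and the averaging task below are purely illustrative; this is not the MERRA/AS API:

```python
from functools import reduce

# MapReduce-style average over chunked climate data (schematic, not MERRA/AS)
chunks = [
    [288.1, 289.4, 287.9],   # e.g. per-granule surface temperatures [K]
    [290.2, 288.7],
    [289.0, 288.3, 289.9, 290.5],
]

# map: each chunk independently emits a partial (sum, count) pair
mapped = [(sum(chunk), len(chunk)) for chunk in chunks]

# reduce: combine the partials associatively, so the work can be distributed
total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)
mean_temperature = total / count
```

Because the reduce operation is associative, the map step can run data-proximal and in parallel over arbitrarily many granules, which is what makes the pattern scale to reanalysis-sized collections.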

  18. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  19. Determination of Thermal State of Charge in Solar Heat Receivers

    NASA Technical Reports Server (NTRS)

    Glakpe, E. K.; Cannon, J. N.; Hall, C. A., III; Grimmett, I. W.

    1996-01-01

    The research project at Howard University seeks to develop analytical and numerical capabilities to study heat transfer and fluid flow characteristics, and to predict the performance of solar heat receivers for space applications. Specifically, the study seeks to elucidate the effects of internal and external thermal radiation, geometry, and applicable dimensionless parameters on the overall heat transfer in space solar heat receivers. Over the last year, a procedure for the characterization of the state-of-charge (SOC) in solar heat receivers for space applications has been developed. By identifying the various factors that affect the SOC, a dimensional analysis is performed, resulting in a number of dimensionless groups of parameters. Although not accomplished during the first phase of the research, data generated from a thermal simulation program can be used to determine values of the dimensionless parameters and the state-of-charge, and thereby obtain a correlation for the SOC. The simulation program selected for this purpose is HOTTube, a thermal numerical computer code based on a transient, time-explicit, axisymmetric model of the total solar heat receiver. Simulation results obtained with the computer program are presented for the minimum and maximum insolation orbits. In the absence of any validation of the code with experimental data, results from HOTTube appear qualitatively reasonable in representing the physical situations modeled.

  20. Analysis of Multi-Antenna GNSS Receiver Performance under Jamming Attacks.

    PubMed

    Vagle, Niranjana; Broumandan, Ali; Lachapelle, Gérard

    2016-11-17

    Although antenna array-based Global Navigation Satellite System (GNSS) receivers can be used to mitigate both narrowband and wideband electronic interference sources, measurement distortions induced by array processing methods are not suitable for high precision applications. The measurement distortions have an adverse effect on the carrier phase ambiguity resolution, affecting the navigation solution. Depending on the array attitude information availability and calibration parameters, different spatial processing methods can be implemented although they distort carrier phase measurements in some cases. This paper provides a detailed investigation of the effect of different array processing techniques on array-based GNSS receiver measurements and navigation performance. The main novelty of the paper is to provide a thorough analysis of array-based GNSS receivers employing different beamforming techniques from tracking to navigation solution. Two beamforming techniques, namely Power Minimization (PM) and Minimum Power Distortionless Response (MPDR), are being investigated. In the tracking domain, the carrier Doppler, Phase Lock Indicator (PLI), and Carrier-to-Noise Ratio (C/N₀) are analyzed. Pseudorange and carrier phase measurement distortions and carrier phase position performance are also evaluated. Performance analyses results from simulated GNSS signals and field tests are provided.
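    A sketch of the MPDR beamformer studied in the paper, on a simulated uniform linear array with one strong jammer; the array geometry, powers, and directions are illustrative, and real GNSS processing would of course operate on much weaker spread-spectrum signals:

```python
import numpy as np

rng = np.random.default_rng(2)

# Uniform linear array, half-wavelength spacing (illustrative setup)
M = 7
def steering(theta_deg):
    n = np.arange(M)
    return np.exp(1j * np.pi * n * np.sin(np.deg2rad(theta_deg)))

a_sat = steering(10.0)    # desired GNSS satellite direction
a_jam = steering(-35.0)   # jammer direction

# Received-data covariance: strong jammer plus a unit noise floor
snapshots = 2000
jam = (rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)) * np.sqrt(500)
noise = (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots))) / np.sqrt(2)
X = np.outer(a_jam, jam) + noise
R = X @ X.conj().T / snapshots

# MPDR: minimize output power subject to a distortionless response toward the satellite
Rinv_a = np.linalg.solve(R, a_sat)
w_mpdr = Rinv_a / (a_sat.conj() @ Rinv_a)

# Jammer suppression relative to the (unit-gain) satellite direction
jammer_gain_db = 20 * np.log10(abs(w_mpdr.conj() @ a_jam) / abs(w_mpdr.conj() @ a_sat))
```

MPDR needs the satellite steering vector (hence array attitude and calibration), whereas power minimization only constrains a reference element; that difference is exactly what drives the measurement distortions the paper analyzes.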

  1. Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates: Phase I Results

    NASA Technical Reports Server (NTRS)

    Wells, D. N.; Allen, P. A.

    2012-01-01

    An analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted with 15 participants. Experimental results from a surface crack tension test in 2219-T8 aluminum plate provided the basis for the inter-laboratory study (ILS). The study proceeded in a blind fashion given that the analysis methodology was not specified to the participants, and key experimental results were withheld. This approach allowed the ILS to serve as a current measure of the state of the art for elastic-plastic fracture mechanics analysis. The analytical results and the associated methodologies were collected for comparison, and sources of variability were studied and isolated. The results of the study revealed that the J-integral analysis methodology using the domain integral method is robust, providing reliable J-integral values without being overly sensitive to modeling details. General modeling choices such as analysis code, model size (mesh density), crack tip meshing, or boundary conditions, were not found to be sources of significant variability. For analyses controlled only by far-field boundary conditions, the greatest source of variability in the J-integral assessment is introduced through the constitutive model. This variability can be substantially reduced by using crack mouth opening displacements to anchor the assessment. Conclusions provide recommendations for analysis standardization.

  2. On the Multilevel Nature of Meta-Analysis: A Tutorial, Comparison of Software Programs, and Discussion of Analytic Choices.

    PubMed

    Pastor, Dena A; Lazowski, Rory A

    2018-01-01

    The term "multilevel meta-analysis" is encountered not only in applied research studies, but also in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis, the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs, underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike, and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.
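    One of the standard meta-analytic models the tutorial fits across programs, the random-effects model with a method-of-moments (DerSimonian-Laird) between-study variance estimator, can be sketched directly; the effect sizes and sampling variances below are made up:

```python
import numpy as np

def dersimonian_laird(yi, vi):
    """Random-effects meta-analysis with the DerSimonian-Laird estimator,
    one of the standard estimators available in metafor/SPSS/SAS/HLM."""
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    wi = 1.0 / vi                              # fixed-effect weights
    ybar = np.sum(wi * yi) / np.sum(wi)
    Q = np.sum(wi * (yi - ybar) ** 2)          # Cochran's heterogeneity statistic
    k = len(yi)
    c = np.sum(wi) - np.sum(wi**2) / np.sum(wi)
    tau2 = max(0.0, (Q - (k - 1)) / c)         # between-study variance estimate
    wi_star = 1.0 / (vi + tau2)                # random-effects weights
    mu = np.sum(wi_star * yi) / np.sum(wi_star)
    se = np.sqrt(1.0 / np.sum(wi_star))
    return mu, se, tau2

# Hypothetical effect sizes and sampling variances (made-up data)
yi = [0.30, 0.10, 0.55, 0.20, 0.42]
vi = [0.01, 0.02, 0.015, 0.025, 0.012]
mu, se, tau2 = dersimonian_laird(yi, vi)
```

Viewed as a multilevel model, tau2 is the level-2 (between-study) variance and vi are the known level-1 variances, which is exactly why the same estimates fall out of both meta-analytic and multilevel software.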

  3. Fiber-based free-space optical coherent receiver with vibration compensation mechanism.

    PubMed

    Zhang, Ruochi; Wang, Jianmin; Zhao, Guang; Lv, Junyi

    2013-07-29

    We propose a novel fiber-based free-space optical (FSO) coherent receiver for inter-satellite communication. The receiver takes advantage of established fiber-optic components and utilizes the fine-pointing subsystem installed in FSO terminals to minimize the influence of satellite platform vibrations. The received beam is coupled to a single-mode fiber, and the coupling efficiency of the system is investigated both analytically and experimentally. A receiving sensitivity of -38 dBm is obtained at the forward error correction limit with a transmission rate of 22.4 Gbit/s. The proposed receiver is shown to be a promising component for inter-satellite optical communication.

  4. Analytical method for nitroaromatic explosives in radiologically contaminated soil for ISO/IEC 17025 accreditation

    DOE PAGES

    Boggess, Andrew; Crump, Stephen; Gregory, Clint; ...

    2017-12-06

    Here, unique hazards are presented in the analysis of radiologically contaminated samples. Strenuous safety and security precautions must be in place to protect the analyst, laboratory, and instrumentation used to perform analyses. A validated method has been optimized for the analysis of select nitroaromatic explosives and degradative products using gas chromatography/mass spectrometry via sonication extraction of radiologically contaminated soils, for samples requiring ISO/IEC 17025 laboratory conformance. Target analytes included 2-nitrotoluene, 4-nitrotoluene, 2,6-dinitrotoluene, and 2,4,6-trinitrotoluene, as well as the degradative product 4-amino-2,6-dinitrotoluene. Analytes were extracted from soil in methylene chloride by sonication. Administrative and engineering controls, as well as instrument automation and quality control measures, were utilized to minimize potential human exposure to radiation at all times and at all stages of analysis, from receiving through disposition. Though thermal instability increased uncertainties of these selected compounds, a mean lower quantitative limit of 2.37 µg/mL and mean accuracy of 2.3% relative error and 3.1% relative standard deviation were achieved. Quadratic regression was found to be optimal for calibration of all analytes, with compounds of lower hydrophobicity displaying greater parabolic curvature. Blind proficiency testing (PT) of spiked soil samples demonstrated a mean relative error of 9.8%. Matrix spiked analyses of PT samples demonstrated that 99% recovery of target analytes was achieved. To the knowledge of the authors, this represents the first safe, accurate, and reproducible quantitative method for nitroaromatic explosives in soil for specific use on radiologically contaminated samples within the constraints of a nuclear analytical lab.
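    The quadratic calibration the authors found optimal can be sketched as follows; the standard concentrations, detector responses, and the +2% response bias below are made-up illustrations, not the paper's data:

```python
import numpy as np

# Quadratic calibration (illustrative standards and responses, not the paper's data)
conc = np.array([2.5, 5.0, 10.0, 20.0, 40.0])       # standard concentrations [ug/mL]
area = np.array([1180, 2300, 4420, 8300, 14900.0])  # detector response (made up)

coeffs = np.polyfit(conc, area, 2)                  # area = a*c**2 + b*c + d

def quantify(sample_area):
    """Invert the quadratic calibration for an unknown sample."""
    a, b, d = coeffs
    roots = np.roots([a, b, d - sample_area])
    real = roots[np.isreal(roots)].real
    return real[real > 0].min()                     # physically meaningful root

# Quantify a hypothetical QC sample and report relative error vs its known value
known = 15.0
measured = quantify(np.polyval(coeffs, known) * 1.02)  # simulate +2% response bias
rel_error = (measured - known) / known
```

Inverting the fitted parabola (rather than assuming linearity) is what accommodates the curvature that the lower-hydrophobicity compounds showed.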

  6. GOCE SSTI GNSS Receiver Re-Entry Phase Analysis

    NASA Astrophysics Data System (ADS)

    Zin, A.; Zago, S.; Scaciga, L.; Marradi, L.; Floberghagen, R.; Fehringer, M.; Bigazzi, A.; Piccolo, A.; Luini, L.

    2015-03-01

    The Gravity field and steady-state Ocean Circulation Explorer (GOCE) was an ESA Earth Explorer mission dedicated to measuring the Earth's gravity field. The spacecraft was launched in 2009 and re-entered the atmosphere at the end of 2013 [1]. The mean orbit altitude was set to 260 km to maximize the gravity signal sensed by the ultra-sensitive accelerometers on board. GOCE was equipped with two main payloads: the Electrostatic Gravity Gradiometer (EGG), a set of six 3-axis accelerometers able to measure the gravity field with unrivalled precision and thus to produce the most accurate shape of the ‘geoid’, and two GPS receivers (nominal and redundant), used as a Satellite-to-Satellite Tracking Instrument (SSTI) to geolocate the gradiometer measurements and to measure the long-wavelength components of the gravity field with an accuracy never reached before. Previous analyses have shown that the Precise Orbit Determination (POD) of the GOCE satellite, derived by processing the dual-frequency SSTI data (carrier phases and pseudoranges), is at the state of the art of GPS-based POD: the average daily 3D RMS of the kinematic orbits is 2.06 cm [2]. In most cases the overall accuracy is better than 2 cm 3D RMS. Moreover, the “almost continuous” [2] 1 Hz data availability from the SSTI receiver is unique and allows for a time series of kinematic positions with only 0.5% of missing epochs [2]. In October 2013 the GOCE mission was concluded, and in November the spacecraft re-entered the atmosphere. During the re-entry phase the two SSTI receivers were switched on simultaneously in order to maximize data availability. In summer 2013, the SSTI firmware had been tailored to sustain the additional dynamic error (tracking-loop robustness) expected during the re-entry phase. The software was uploaded on SSTI-B (and purposely not on SSTI-A). This was therefore a unique opportunity to compare a “standard” receiver behaviour (SSTI-A) with an improved one (SSTI-B) in the challenging re-entry phase.

  7. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and of nonlinearity between the property and spectra of samples in quantitative spectral analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and of the unknown samples; the Euclidean distance between the net analyte signal of an unknown sample and those of the calibration samples is then calculated and used as a similarity index. According to this similarity index, a local calibration set is selected individually for each unknown sample. Finally, a local PLS regression model is built on the local calibration set of each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.
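The neighbour-selection step described in this record can be sketched as follows. This is a minimal illustration with hypothetical data: ordinary least squares stands in for the paper's PLS step, and raw signal vectors stand in for net analyte signals.

```python
import numpy as np

def select_local_set(x_unknown, X_cal, k):
    """Indices of the k calibration samples closest to the unknown,
    by Euclidean distance in the signal space (the similarity index)."""
    d = np.linalg.norm(X_cal - x_unknown, axis=1)
    return np.argsort(d)[:k]

def local_predict(x_unknown, X_cal, y_cal, k=10):
    """Fit a local model on the selected neighbours and predict the
    property of the unknown sample.  (The paper fits PLS on net
    analyte signals; plain least squares stands in here.)"""
    idx = select_local_set(x_unknown, X_cal, k)
    design = np.column_stack([np.ones(k), X_cal[idx]])  # intercept + signals
    coef, *_ = np.linalg.lstsq(design, y_cal[idx], rcond=None)
    return np.concatenate(([1.0], x_unknown)) @ coef
```

Because a separate local model is fitted per unknown sample, the scheme adapts to sample-to-sample differences at the cost of one small regression per prediction.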

  8. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Beaver, Justin M; Bogen II, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to its high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  9. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  10. Analytical Problems and Suggestions in the Analysis of Behavioral Economic Demand Curves.

    PubMed

    Yu, Jihnhee; Liu, Liu; Collins, R Lorraine; Vincent, Paula C; Epstein, Leonard H

    2014-01-01

    Behavioral economic demand curves (Hursh, Raslear, Shurtleff, Bauman, & Simmons, 1988) are innovative approaches to characterizing the relationship between consumption of a substance and its price. In this article, we investigate common analytical issues in the use of behavioral economic demand curves, which can cause inconsistent interpretations of demand curves, and then we provide methodological suggestions to address those analytical issues. We first demonstrate that log transformation with different added values for handling zeros changes model parameter estimates dramatically. Second, demand curves are often analyzed using an overparameterized model that results in an inefficient use of the available data and a lack of assessment of the variability among individuals. To address these issues, we apply a nonlinear mixed effects model based on multivariate error structures that has not previously been used in the literature to analyze behavioral economic demand curves. We also propose analytical formulas for the relevant standard errors of derived values such as Pmax, Omax, and elasticity. The proposed model stabilizes the derived values regardless of the added increment used and provides substantially smaller standard errors. We illustrate the data analysis procedure using data from a relative reinforcement efficacy study of simulated marijuana purchasing.
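The sensitivity of the log transform to the added constant, the first issue raised in this record, can be shown numerically. This is a sketch on a hypothetical consumption series containing a zero; the slope of a simple log-log fit stands in for the full demand-curve parameters.

```python
import numpy as np

# Hypothetical consumption data with a zero at the highest price,
# the situation that forces an added constant before log transform.
price = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
consumption = np.array([10.0, 9.0, 7.0, 4.0, 1.0, 0.0])

def log_demand_slope(added):
    """Slope of log10(consumption + added) vs log10(price) -- a crude
    stand-in for a fitted demand-curve parameter."""
    y = np.log10(consumption + added)
    x = np.log10(price)
    return np.polyfit(x, y, 1)[0]

s_small = log_demand_slope(0.01)  # tiny added value
s_large = log_demand_slope(1.0)   # larger added value
```

With the tiny increment, the single zero maps to log10(0.01) = -2 and dominates the fit, so the two choices yield markedly different slopes even though the data are identical; this is the instability the proposed mixed effects model is designed to avoid.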

  11. Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review

    PubMed Central

    Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef

    2014-01-01

    Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409

  12. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  13. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  14. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  15. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  16. 7 CFR 94.303 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  17. Measuring diagnostic and predictive accuracy in disease management: an introduction to receiver operating characteristic (ROC) analysis.

    PubMed

    Linden, Ariel

    2006-04-01

    Diagnostic or predictive accuracy concerns are common in all phases of a disease management (DM) programme, and ultimately play an influential role in the assessment of programme effectiveness. Areas such as the identification of diseased patients, predictive modelling of future health status and costs, and risk stratification are just a few of the domains in which assessment of accuracy is beneficial, if not critical. The most commonly used analytical model for this purpose is the standard 2 x 2 table method in which sensitivity and specificity are calculated. However, there are several limitations to this approach, including the reliance on a single defined criterion or cut-off for determining a true-positive result, use of non-standardized measurement instruments and sensitivity to outcome prevalence. This paper introduces receiver operating characteristic (ROC) analysis as a more appropriate and useful technique for assessing diagnostic and predictive accuracy in DM. Its advantages include: testing accuracy across the entire range of scores, thereby not requiring a predetermined cut-off point; easily examined visual and statistical comparisons across tests or scores; and independence from outcome prevalence. Therefore, the implementation of ROC as an evaluation tool should be strongly considered in the various phases of a DM programme.
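The key property this record highlights, assessing accuracy across the entire range of scores rather than at one predetermined cut-off, can be sketched as a minimal ROC/AUC computation. Tied scores are not handled specially here, so this is an illustration rather than a production implementation.

```python
import numpy as np

def roc_auc(scores, labels):
    """Trace the ROC curve over every threshold implied by the scores
    and return the area under it (AUC)."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    labels = np.asarray(labels, dtype=float)[order]   # sorted by score, desc
    P = labels.sum()                                  # positives
    N = labels.size - P                               # negatives
    # Each step down the ranking lowers the threshold by one case.
    tpr = np.concatenate(([0.0], np.cumsum(labels) / P))
    fpr = np.concatenate(([0.0], np.cumsum(1.0 - labels) / N))
    # Trapezoidal area under the (fpr, tpr) curve.
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))
```

A perfectly separating score list gives AUC = 1.0, a random one about 0.5, independent of how prevalent the positive outcome is, which is exactly the prevalence-independence advantage cited above.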

  18. Loran-Based Buoy Position Auditing Systems - Analytical Evaluation

    DOT National Transportation Integrated Search

    1980-02-01

    An analytic evaluation and comparison of the following candidate Buoy Position Auditing System (BPAS) configurations is presented in this report: transmission of digital Time Difference (TD) data from a Loran-C receiver on the buoy, retransmission of...

  19. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  20. Meta-analysis as Statistical and Analytical Method of Journal’s Content Scientific Evaluation

    PubMed Central

    Masic, Izet; Begic, Edin

    2015-01-01

    Introduction: A meta-analysis is a statistical and analytical method which combines and synthesizes different independent studies and integrates their results into one common result. Goal: Analysis of the journals “Medical Archives”, “Materia Socio Medica” and “Acta Informatica Medica”, which are indexed in the most eminent databases of the biomedical milieu. Material and methods: The study has a retrospective and descriptive character, and covered the calendar year 2014. The study included six issues of all three journals (a total of 18 issues). Results: In this period a total of 291 articles were published (110 in the “Medical Archives”, 97 in “Materia Socio Medica”, and 84 in “Acta Informatica Medica”). The largest number of articles were original articles. Smaller numbers were published as professional papers, review articles and case reports. Clinical topics were most common in the first two journals, while articles in the journal “Acta Informatica Medica” mostly belonged to the field of medical informatics, as part of the pre-clinical medical disciplines. Articles usually required a period of fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe. The authors were most often from the territory of Bosnia and Herzegovina, then Iran, Kosovo and Macedonia. Conclusion: The number of articles published each year is increasing, with greater participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broader spectrum of topics and a growing number of original articles. Greater support from the wider scientific community is needed for further development of all three of the aforementioned journals. PMID:25870484

  1. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte signal theory. The results have strong implications for the planning of multiway analytical experiments.

  2. New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.

    PubMed

    Shaaban, Heba

    2016-10-01

    Greening the analytical methods used for analysis of pharmaceuticals has been receiving great interest aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss in chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and waste disposal motivated the analytical community to search for alternatives to replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview on different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical Abstract Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.

  3. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  4. Selected accounts receivable performance statistics for radiology practices: an analysis of the adjusted collection percentage and days charges in accounts receivable.

    PubMed

    Cergnul, John J; Russell, Philip J; Sunshine, Jonathan H

    2005-12-01

    To provide comparative data and analysis with respect to accounts receivable management performance criteria. Data from 3 sources were analyzed: the Radiology Business Management Association's (RBMA) 2003 Accounts Receivable Performance Survey; the RBMA's 2003 Accounts Receivable Survey; and Hogan and Sunshine's 2004 Radiology article "Financial Ratios in Diagnostic Radiology Practices: Variability and Trends," the data for which were drawn primarily from the ACR's 1999 Survey of Practices. The RBMA surveyed (via e-mail and postal mail) only its members, with response rates of 15% and 9%, respectively. The ACR's survey response rate was 66%, via postal mail, and was distributed without regard to the RBMA membership status of the practice manager or even whether the practice employed a practice manager. Comparison among the survey results provided information on trends. Median practice professional component adjusted collection percentage (ACP) deteriorated from 87.3% to 85.1% between the RBMA surveys. Practices limited to global fee billing fared much better when performing their billing in house, as opposed to using a billing service, with mean ACPs of 91.2% and 79.4%, respectively. The 2004 mean results for days charges in accounts receivable for professional component billing and global fee billing were nearly identical, at 56.11 and 55.54 days, respectively. The 2003 RBMA survey reported 63.74 days for professional component billing and 77.33 days for global fee billing. The improvement from 2003 to 2004 was highly significant for both professional component billing and global fee billing. The 2004 RBMA survey also reflected a rather dramatic improvement in days charges in accounts receivable compared with Hogan and Sunshine's results, which showed a mean of 69 days charges in accounts receivable. The conflicting trends between ACP performance and days charges in accounts receivable performance may be explained by the increasing sophistication of accounts receivable

  5. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  6. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  7. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  8. 7 CFR 98.4 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...

  9. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  10. 7 CFR 93.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...

  11. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  12. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  13. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data. PMID:20525257

  14. ANALYTICAL CHEMISTRY IN NUCLEAR REACTOR TECHNOLOGY. Analysis of Reactor Fuels, Fission-Product Mixtures and Related Materials: Analytical Chemistry of Plutonium and the Transplutonic Elements. Third Conference, Gatlinburg, Tennessee, October 26-29, 1959

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1960-01-01

    Thirty-one papers and 10 summaries of papers presented at the Third Conference on Analytical Chemistry in Nuclear Reactor Technology held at Gatlinburg, Tennessee, October 26 to 29, 1959, are given. The papers are grouped into four sections: general, analytical chemistry of fuels, analytical chemistry of plutonium and the transplutonic elements, and the analysis of fission-product mixtures. Twenty-seven of the papers are covered by separate abstracts. Four were previously abstracted for NSA. (M.C.G.)

  15. Two-Dimensional Model for Reactive-Sorption Columns of Cylindrical Geometry: Analytical Solutions and Moment Analysis.

    PubMed

    Khan, Farman U; Qamar, Shamsul

    2017-05-01

    A set of analytical solutions are presented for a model describing the transport of a solute in a fixed-bed reactor of cylindrical geometry subjected to the first (Dirichlet) and third (Danckwerts) type inlet boundary conditions. Linear sorption kinetic process and first-order decay are considered. Cylindrical geometry allows the use of large columns to investigate dispersion, adsorption/desorption and reaction kinetic mechanisms. The finite Hankel and Laplace transform techniques are adopted to solve the model equations. For further analysis, statistical temporal moments are derived from the Laplace-transformed solutions. The developed analytical solutions are compared with the numerical solutions of high-resolution finite volume scheme. Different case studies are presented and discussed for a series of numerical values corresponding to a wide range of mass transfer and reaction kinetics. A good agreement was observed in the analytical and numerical concentration profiles and moments. The developed solutions are efficient tools for analyzing numerical algorithms, sensitivity analysis and simultaneous determination of the longitudinal and transverse dispersion coefficients from a laboratory-scale radial column experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
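The statistical temporal moments this record derives in the Laplace domain can also be computed numerically from an elution profile. The sketch below uses trapezoidal integration on a synthetic Gaussian curve (hypothetical data, not the paper's Hankel/Laplace formulas) to recover the zeroth moment and the mean residence time.

```python
import numpy as np

# Synthetic breakthrough curve: Gaussian pulse centred at t = 5.0
# with standard deviation 0.8 (illustrative values only).
t = np.linspace(0.0, 20.0, 2001)
c = np.exp(-0.5 * ((t - 5.0) / 0.8) ** 2)

def temporal_moments(t, c):
    """Zeroth moment (peak area) and first normalized moment (mean
    residence time) of a concentration-time profile, by trapezoidal
    integration."""
    dt = np.diff(t)
    trap = lambda y: float(np.sum(dt * (y[1:] + y[:-1]) / 2.0))
    m0 = trap(c)            # area under the curve
    m1 = trap(t * c) / m0   # normalized first moment
    return m0, m1

m0, m1 = temporal_moments(t, c)
```

Matching such numerically computed moments against the analytical moment expressions is the kind of consistency check the paper uses to validate its solutions and to extract dispersion coefficients from experimental profiles.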

  16. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    2017-05-08

    Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  17. 37Cl/35Cl isotope ratio analysis in perchlorate by ion chromatography/multi-collector ICPMS: Analytical performance and implication for biodegradation studies.

    PubMed

    Zakon, Yevgeni; Ronen, Zeev; Halicz, Ludwik; Gelman, Faina

    2017-10-01

    In the present study we propose a new analytical method for 37Cl/35Cl analysis in perchlorate by Ion Chromatography (IC) coupled to Multicollector Inductively Coupled Plasma Mass Spectrometry (MC-ICPMS). The accuracy of the analytical method was validated by analysis of the international perchlorate standard materials USGS-37 and USGS-38; analytical precision better than ±0.4‰ was achieved. 37Cl/35Cl isotope ratio analysis in perchlorate during a laboratory biodegradation experiment with microbial cultures enriched from contaminated soil in Israel resulted in an isotope enrichment factor ε37Cl = -13.3 ± 1‰, which falls in the range reported previously for perchlorate biodegradation by pure microbial cultures. The proposed analytical method may significantly simplify the procedure for isotope analysis of perchlorate currently applied in environmental studies. Copyright © 2017. Published by Elsevier Ltd.
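    Enrichment factors like the one reported above are conventionally obtained from a Rayleigh-model fit of isotope ratios against the remaining substrate fraction. The sketch below shows the idea on synthetic data; all fractions and δ-values are hypothetical, not from the study.

    ```python
    import math

    def rayleigh_epsilon(fractions, delta, delta0):
        # Rayleigh model in delta notation:
        #   ln((delta + 1000)/(delta0 + 1000)) = (eps/1000) * ln(f)
        # Estimate eps (in permil) as the least-squares slope through the origin.
        xs = [math.log(f) for f in fractions]
        ys = [math.log((d + 1000.0) / (delta0 + 1000.0)) for d in delta]
        slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
        return 1000.0 * slope

    # synthetic biodegradation series generated with eps = -13.3 permil
    eps_true, d0 = -13.3, 0.0
    f = [0.9, 0.7, 0.5, 0.3, 0.1]            # remaining perchlorate fraction
    d = [(d0 + 1000.0) * fi ** (eps_true / 1000.0) - 1000.0 for fi in f]
    eps_est = rayleigh_epsilon(f, d, d0)     # recovers ~ -13.3 permil
    ```
    
    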

  18. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  19. Decisions through data: analytics in healthcare.

    PubMed

    Wills, Mary J

    2014-01-01

    The amount of data in healthcare is increasing at an astonishing rate. However, in general, the industry has not deployed the level of data management and analysis necessary to make use of those data. As a result, healthcare executives face the risk of being overwhelmed by a flood of unusable data. In this essay I argue that, in order to extract actionable information, leaders must take advantage of the promise of data analytics. Small data, predictive modeling expansion, and real-time analytics are three forms of data analytics. On the basis of my analysis for this study, I recommend all three for adoption. Recognizing the uniqueness of each organization's situation, I also suggest that practices, hospitals, and healthcare systems examine small data and conduct real-time analytics and that large-scale organizations managing populations of patients adopt predictive modeling. I found that all three solutions assist in the collection, management, and analysis of raw data to improve the quality of care and decrease costs.

  20. An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers

    PubMed Central

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-01-01

    In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner–Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration of the auto-terms of the signal components. The proposed interference detection method is evaluated in experiments on GPS L1 signals in jamming scenarios and compared to state-of-the-art interference detection approaches. The analysis results show that the proposed technique effectively overcomes the cross-terms problem while preserving good TF localization properties, enhancing the interference detection performance of GNSS receivers, particularly in jamming environments. PMID:25905704

  1. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, corona/lens-based collection provides a new path to high sensitivity. An active-matrix-based analyte collection approach, referred to as an "airborne analyte memory chip/recorder," is demonstrated, which takes and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Autonomous Integrated Receive System (AIRS) requirements definition. Volume 3: Performance and simulation

    NASA Technical Reports Server (NTRS)

    Chie, C. M.; Su, Y. T.; Lindsey, W. C.; Koukos, J.

    1984-01-01

    The autonomous and integrated aspects of the operation of the AIRS (Autonomous Integrated Receive System) are discussed from a system operation point of view. The advantages of AIRS compared to the existing SSA receive chain equipment are highlighted. The three modes of AIRS operation are addressed in detail. The configurations of the AIRS are defined as a function of the operating modes and the user signal characteristics. Each AIRS configuration selection is made up of three components: the hardware, the software algorithms, and the parameters used by these algorithms. A comparison between AIRS and the wide dynamics demodulation (WDD) is provided. The organization of the AIRS analytical/simulation software is described. The modeling and analysis for simulating the performance of the PN subsystem is documented. The frequency acquisition technique using a frequency-locked loop is also documented. Doppler compensation implementation is described. The technological aspects of employing CCDs for PN acquisition are addressed.

  3. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate signal while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass, and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and to improve the quality and consistency of data generated from this relatively new analytical tool.
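    A sketch of the arithmetic behind a dilution choice in single-particle ICP-MS: keeping the expected number of particle events per dwell window below a coincidence threshold. All parameter values (flow rate, dwell time, transport efficiency, threshold) are illustrative assumptions, not the authors' protocol.

    ```python
    def choose_dilution(particle_conc_per_ml, flow_ul_per_min, dwell_ms,
                        transport_eff=0.05, max_events_per_dwell=0.1):
        # Expected particle events in one dwell window:
        #   events = conc [1/mL] * flow [mL/s] * dwell [s] * transport efficiency
        flow_ml_per_s = flow_ul_per_min / 1000.0 / 60.0
        events = (particle_conc_per_ml * flow_ml_per_s
                  * (dwell_ms / 1000.0) * transport_eff)
        if events <= max_events_per_dwell:
            return 1.0                       # no dilution needed
        return events / max_events_per_dwell  # fold-dilution to hit the target

    # hypothetical sample: 1e7 particles/mL, 50 uL/min uptake, 10 ms dwell
    factor = choose_dilution(1e7, 50.0, 10.0)
    ```

    Keeping events per dwell well below one is what makes individual particle spikes resolvable above the dissolved background.
    
    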

  4. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies where known patterns can be mined for. With a human in the loop, analysts can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive

  5. Promising Ideas for Collective Advancement of Communal Knowledge Using Temporal Analytics and Cluster Analysis

    ERIC Educational Resources Information Center

    Lee, Alwyn Vwen Yen; Tan, Seng Chee

    2017-01-01

    Understanding ideas in a discourse is challenging, especially in textual discourse analysis. We propose using temporal analytics with unsupervised machine learning techniques to investigate promising ideas for the collective advancement of communal knowledge in an online knowledge building discourse. A discourse unit network was constructed and…

  6. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
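    For first-stage moderated mediation, the framework's central quantity, the conditional indirect effect, can be sketched directly from path coefficients; the coefficient values below are hypothetical, chosen only to show the computation.

    ```python
    def conditional_indirect(a0, a1, b, z):
        # First-stage moderation: the X -> M path is (a0 + a1*z),
        # the M -> Y path is b.  Indirect effect of X on Y at moderator
        # level z is their product.
        return (a0 + a1 * z) * b

    def conditional_total(a0, a1, b, c_direct, z):
        # Total effect = direct effect + conditional indirect effect.
        return c_direct + conditional_indirect(a0, a1, b, z)

    # hypothetical path estimates: a0=0.40, a1=0.25, b=0.50, direct c'=0.10
    low  = conditional_indirect(0.40, 0.25, 0.50, -1.0)  # moderator 1 SD below mean
    high = conditional_indirect(0.40, 0.25, 0.50, +1.0)  # moderator 1 SD above mean
    ```

    Probing the indirect effect at low and high moderator levels is the "simple effects" analysis the framework formalizes; standard errors would come from bootstrapping, which is omitted here.
    
    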

  7. Analytical study of seismic effects of a solar receiver mounted on concrete towers with different fundamental periods

    NASA Astrophysics Data System (ADS)

    Deng, Lin

    2016-05-01

    This paper examines the seismic effects experienced by a solar receiver mounted on concrete towers with different fundamental periods. Ten concrete towers are modeled with the empty solar receiver structure and loaded solar receiver structure to examine the tower seismic effects on the solar receiver. The fundamental periods of the towers range from 0.22 seconds to 4.58 seconds, with heights ranging from 40.5 meters to 200 meters. Thirty earthquake ground motion records are used to investigate the responses of each of the combined receiver-on-tower models as well as the receiver-on-ground models by the STAAD Pro software using time history analyses. The earthquake ground motion records are chosen based on the ratio of the peak ground acceleration to the peak ground velocity, ranging from 0.29 g/m/s to 4.88 g/m/s. For each of the combined models, the base shear at the interface between the receiver and the concrete tower is compared with the base shear of the receiver-on-ground model, and the ratio of the two base shears represents the structure amplification factor. It is found that the peak mean plus one standard deviation value of the structure amplification factor matches well with equation 13.3-1 in ASCE 7-10 for the empty solar receiver structure. However, when the solar receiver structure is loaded with dead loads, the peak value is greatly suppressed, and using equation 13.3-1 in ASCE 7-10 will be overly conservative.

  8. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  9. On the analytical modeling of the nonlinear vibrations of pretensioned space structures

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Belvin, W. K.

    1983-01-01

    Pretensioned structures are receiving considerable attention as candidate large space structures. A typical example is a hoop-column antenna. The large number of preloaded members requires efficient analytical methods for concept validation and design. Validation through analysis is especially important since ground testing may be limited by gravity effects and structural size. The objective of the present investigation is to examine the analytical modeling of pretensioned members undergoing nonlinear vibrations. Two approximate nonlinear analyses are developed to model general structural arrangements that include beam-columns and pretensioned cables attached to a common nucleus, such as may occur at a joint of a pretensioned structure. Attention is given to structures undergoing nonlinear steady-state oscillations due to sinusoidal excitation forces. Three analyses (linear, quasi-linear, and nonlinear) are conducted and applied to study the response of a relatively simple cable-stiffened structure.

  10. DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.

    PubMed

    Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco

    2017-07-24

    DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein, how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
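    Independent of the hyperpolarization step, the diffusion coefficients that DOSY resolves are commonly fit from Stejskal-Tanner signal attenuation, I = I0·exp(-D·b). A sketch on synthetic data (D and the b-value list are illustrative, not from the study):

    ```python
    import math

    def fit_diffusion(b_values, intensities):
        # Stejskal-Tanner attenuation: I = I0 * exp(-D * b)
        # so ln I = ln I0 - D*b; estimate D as minus the least-squares slope.
        n = len(b_values)
        xbar = sum(b_values) / n
        ys = [math.log(i) for i in intensities]
        ybar = sum(ys) / n
        num = sum((x - xbar) * (y - ybar) for x, y in zip(b_values, ys))
        den = sum((x - xbar) ** 2 for x in b_values)
        return -num / den

    # synthetic decay for D = 5e-10 m^2/s (typical small molecule in water)
    D_true = 5e-10
    b = [0.0, 1e9, 2e9, 3e9, 4e9]            # gradient b-values, s/m^2
    I = [100.0 * math.exp(-D_true * bi) for bi in b]
    D_est = fit_diffusion(b, I)              # recovers ~ 5e-10 m^2/s
    ```
    
    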

  11. Receiver design, performance analysis, and evaluation for space-borne laser altimeters and space-to-space laser ranging systems

    NASA Technical Reports Server (NTRS)

    Davidson, Frederic M.; Sun, Xiaoli; Field, Christopher T.

    1994-01-01

    Accomplishments in the following areas of research are presented: receiver performance study of spaceborne laser altimeters and cloud and aerosol lidars; receiver performance analysis for space-to-space laser ranging systems; and receiver performance study for the Mars Environmental Survey (MESUR).

  12. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to quantitative analytical chemistry, the "Analytical Chemistry II" course, especially as related to essential oil analysis. The learning outcomes of this course include understanding of the lecture material, the skills to apply course materials, and the ability to identify, formulate, and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve these problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, ability, and attitude. Students are not only skilled at solving problems in analytical chemistry, especially essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.

  13. THE IMPORTANCE OF PROPER INTENSITY CALIBRATION FOR RAMAN ANALYSIS OF LOW-LEVEL ANALYTES IN WATER

    EPA Science Inventory

    Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...

  14. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  15. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, Lawrence M.

    1990-01-01

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte.

  16. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, L.M.

    1990-10-16

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte. 5 figs.

  17. Mathematical modeling and statistical analysis of SPE-OCDMA systems utilizing second harmonic generation effect in thick crystal receivers

    NASA Astrophysics Data System (ADS)

    Matinfar, Mehdi D.; Salehi, Jawad A.

    2009-11-01

    In this paper we analytically study and evaluate the performance of a spectral-phase-encoded optical CDMA (SPE-OCDMA) system for different parameters, such as the user's code length and the number of users in the network. In this system, an advanced receiver structure is employed in which the second harmonic generation (SHG) effect in a thick crystal acts as the nonlinear pre-processor prior to the conventional low-speed photodetector. We consider the ASE noise of the optical amplifiers, significant in low-power conditions, in addition to the multiple-access interference (MAI) noise, which is the dominant noise source in any OCDMA communication system. We use the results of previous work, in which we analyzed the statistical behavior of thick crystals in an optically amplified digital lightwave communication system, to evaluate the performance of the SPE-OCDMA system with a thick-crystal receiver structure. The error probability is evaluated using the saddle-point approximation, and the approximation is verified by Monte Carlo simulation.
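    The saddle-point approximation used for the error probability can be illustrated on a case with a known answer, the Gaussian tail, where the cumulant generating function gives a closed-form saddle point. This is an illustration of the method only, not the paper's SPE-OCDMA statistics.

    ```python
    import math

    def saddlepoint_tail_gaussian(x):
        # Saddle-point approximation of P(X >= x) from the cumulant
        # generating function K(s).  For a standard Gaussian, K(s) = s^2/2,
        # so the saddle point K'(s*) = x gives s* = x, and
        #   P(X >= x) ~ exp(K(s*) - s*x) / (s* * sqrt(2*pi*K''(s*)))
        #             = exp(-x^2/2) / (x * sqrt(2*pi))
        s = x                        # saddle point
        K, K2 = 0.5 * s * s, 1.0     # K(s*) and K''(s*)
        return math.exp(K - s * x) / (s * math.sqrt(2.0 * math.pi * K2))

    x = 4.0
    approx = saddlepoint_tail_gaussian(x)
    exact = 0.5 * math.erfc(x / math.sqrt(2.0))  # exact Gaussian tail Q(4)
    ```

    Even this crudest form is within a few percent of the exact tail at x = 4, which is why the technique is attractive for error probabilities that lack closed forms.
    
    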

  18. Receive Mode Analysis and Design of Microstrip Reflectarrays

    NASA Technical Reports Server (NTRS)

    Rengarajan, Sembiam

    2011-01-01

    Traditionally microstrip or printed reflectarrays are designed using the transmit mode technique. In this method, the size of each printed element is chosen so as to provide the required value of the reflection phase such that a collimated beam results along a given direction. The reflection phase of each printed element is approximated using an infinite array model. The infinite array model is an excellent engineering approximation for a large microstrip array since the size or orientation of elements exhibits a slow spatial variation. In this model, the reflection phase from a given printed element is approximated by that of an infinite array of elements of the same size and orientation when illuminated by a local plane wave. Thus the reflection phase is a function of the size (or orientation) of the element, the elevation and azimuth angles of incidence of a local plane wave, and polarization. Typically, one computes the reflection phase of the infinite array as a function of several parameters such as size/orientation, elevation and azimuth angles of incidence, and in some cases for vertical and horizontal polarization. The design requires the selection of the size/orientation of the printed element to realize the required phase by interpolating or curve fitting all the computed data. This is a substantially complicated problem, especially in applications requiring a computationally intensive commercial code to determine the reflection phase. In dual polarization applications requiring rectangular patches, one needs to determine the reflection phase as a function of five parameters (dimensions of the rectangular patch, elevation and azimuth angles of incidence, and polarization). This is an extremely complex problem. The new method employs the reciprocity principle and reaction concept, two well-known concepts in electromagnetics to derive the receive mode analysis and design techniques. 
In the "receive mode design" technique, the reflection phase is computed

  19. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of

  20. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed
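    The reliability idea above, how consistently a location falls inside a significant cluster as the scaling parameter varies, can be sketched as a fraction over scan runs; the membership data below is hypothetical.

    ```python
    def reliability(membership):
        # membership[location] = one boolean per SaTScan run (each run uses a
        # different maximum-cluster-size parameter): was the location inside a
        # statistically significant cluster in that run?
        return {loc: sum(runs) / len(runs) for loc, runs in membership.items()}

    runs_per_county = {
        "A": [True, True, True, True],    # stable: clustered at every scale
        "B": [True, False, True, False],  # unstable across scaling parameters
        "C": [False, False, False, False],
    }
    scores = reliability(runs_per_county)
    ```

    Locations with reliability near 1.0 belong to clusters that are stable across analysis scales; intermediate scores flag the parameter-sensitive results the authors caution about.
    
    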

  1. Simulation of a navigator algorithm for a low-cost GPS receiver

    NASA Technical Reports Server (NTRS)

    Hodge, W. F.

    1980-01-01

    The analytical structure of an existing navigator algorithm for a low cost global positioning system receiver is described in detail to facilitate its implementation on in-house digital computers and real-time simulators. The material presented includes a simulation of GPS pseudorange measurements, based on a two-body representation of the NAVSTAR spacecraft orbits, and a four component model of the receiver bias errors. A simpler test for loss of pseudorange measurements due to spacecraft shielding is also noted.
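    A minimal sketch of the pseudorange simulation idea: geometric range plus a receiver clock bias expressed in meters. The geometry and bias values are illustrative, not the report's NAVSTAR two-body model.

    ```python
    import math

    C = 299792458.0  # speed of light, m/s

    def pseudorange(sat_pos, rcv_pos, clock_bias_s, noise_m=0.0):
        # Simulated GPS pseudorange: geometric range plus the receiver clock
        # bias (converted from seconds to meters) plus measurement noise.
        rng = math.dist(sat_pos, rcv_pos)
        return rng + clock_bias_s * C + noise_m

    # hypothetical geometry: satellite ~20,200 km away, receiver at origin,
    # 3 microsecond receiver clock bias (~900 m of range error)
    sat = (20.2e6, 0.0, 0.0)
    rcv = (0.0, 0.0, 0.0)
    pr = pseudorange(sat, rcv, 3e-6)
    ```

    A navigator algorithm solves the inverse problem: given four or more such pseudoranges, estimate the receiver position and the common clock bias.
    
    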

  2. Analytical Psychology: A Review of a Theoretical Approach and Its Application to Counseling.

    ERIC Educational Resources Information Center

    Ziff, Katherine K.

    Analytical psychology is a field supported by training centers, specially trained analysts, and a growing body of literature. While it receives much recognition, it remains mostly outside the mainstream of counseling and counselor education. This document presents a brief history of analytical psychology and how it has been revisited and renamed…

  3. Towards full band colorless reception with coherent balanced receivers.

    PubMed

    Zhang, Bo; Malouin, Christian; Schmidt, Theodore J

    2012-04-23

In addition to linear compensation of fiber channel impairments, coherent receivers also provide colorless selection of any desired data channel within a multitude of incident wavelengths, without the need for a channel-selecting filter. In this paper, we investigate the design requirements for colorless reception using a coherent balanced receiver, considering both the optical front end (OFE) and the transimpedance amplifier (TIA). We develop analytical models to predict the system performance as a function of receiver design parameters and show good agreement with numerical simulations. At low input signal power, an optimum local oscillator (LO) power is shown to exist where the thermal noise is balanced with the residual LO-RIN beat noise. At high input signal power, we show the dominant noise effect is the residual self-beat noise from the out-of-band (OOB) channels, which scales not only with the number of OOB channels and the common mode rejection ratio (CMRR) of the OFE, but also depends on the link residual chromatic dispersion (CD) and the orientation of the polarization tributaries relative to the receiver. This residual self-beat noise from OOB channels sets the lower bound for the LO power. We also investigate the limitations imposed by overload in the TIA, showing analytically that the DC current scales only with the number of OOB channels, while the differential AC current scales only with the link residual CD, which induces a high peak-to-average power ratio (PAPR). Both DC and AC currents at the input to the TIA set upper bounds for the LO power. Considering both the OFE noise limit and the TIA overload limit, we show that the receiver operating range is notably narrowed for dispersion-unmanaged links, as compared to dispersion-managed links. © 2012 Optical Society of America
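    The existence of an optimum LO power can be seen in a toy noise model (an assumption for illustration, not the paper's full analysis): the coherent signal term grows linearly with LO power while the residual LO-RIN beat noise grows quadratically on top of a fixed thermal floor, so the SNR peaks exactly where the two noise contributions balance.

```python
import math

def snr(p_lo, n_thermal, k_rin):
    """Toy model: signal proportional to LO power p_lo; noise is a fixed
    thermal floor plus a LO-RIN beat term growing as k_rin * p_lo**2."""
    return p_lo / (n_thermal + k_rin * p_lo ** 2)

def optimal_lo_power(n_thermal, k_rin):
    """Maximizing snr() gives p* = sqrt(n_thermal / k_rin): the LO power
    at which the thermal and LO-RIN noise contributions are equal."""
    return math.sqrt(n_thermal / k_rin)

p_opt = optimal_lo_power(n_thermal=1.0e-6, k_rin=4.0e-4)  # hypothetical values
```

    Setting d(SNR)/dP = 0 for SNR(P) = P / (N_th + k P^2) yields N_th = k P^2, i.e. the balance condition stated in the abstract.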

  4. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most of the existing vibration transfer path analysis techniques are empirical, which makes them suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.

  5. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is

  6. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil-gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the use of cross-references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  8. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-06

A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica with various grafting and related column parameters such as particle size, core-shell and monolith formats was studied. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of keeping efficiency within a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore the strategy of shortening analysis by increasing the flow rate induced a decrease in efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected, which showed the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40 °C. To reduce solvent consumption during sample preparation, a 0.5 mg/mL concentration of each statin was found to be the highest that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% of ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration with the advantages of rapid and easy passage of drugs. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, giving examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
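    The equivalence-testing idea mentioned above can be sketched as a two one-sided tests (TOST) style procedure: the transfer is accepted when the confidence interval for the mean difference between sending and receiving unit lies entirely within a pre-defined acceptance limit. The data, acceptance limit, and critical t-value below are hypothetical illustrations, not values from the article.

```python
import math
import statistics

def tost_equivalence(sending, receiving, limit, t_crit):
    """TOST-style sketch: accept the transfer if the confidence interval
    for the mean paired difference lies entirely within +/- limit.
    t_crit must be supplied for the chosen alpha and degrees of freedom
    (hypothetical inputs)."""
    diffs = [s - r for s, r in zip(sending, receiving)]
    mean = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(len(diffs))
    lo, hi = mean - t_crit * se, mean + t_crit * se
    return -limit < lo and hi < limit

# Hypothetical assay results (% of label claim) at the two sites.
sending = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0]
receiving = [99.9, 100.0, 100.2, 99.8, 100.1, 100.1]
# 2% acceptance limit; t_crit ~ 2.015 for one-sided alpha=0.05, df=5.
passes = tost_equivalence(sending, receiving, limit=2.0, t_crit=2.015)
```

    Note how this inverts the significance-test logic criticized in the abstract: the acceptance limit expresses analytical relevance, and the statistical risk is controlled directly by the confidence level.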

  10. The effect of medical malpractice liability on rate of referrals received by specialist physicians.

    PubMed

    Xu, Xiao; Spurr, Stephen J; Nan, Bin; Fendrick, A Mark

    2013-10-01

    Using nationally representative data from the United States, this paper analyzed the effect of a state’s medical malpractice environment on referral visits received by specialist physicians. The analytic sample included 12,839 ambulatory visits to specialist care doctors in office-based settings in the United States during 2003–2007. Whether the patient was referred for the visit was examined for its association with the state’s malpractice environment, assessed by the frequency and severity of paid medical malpractice claims, medical malpractice insurance premiums and an indicator for whether the state had a cap on non-economic damages. After accounting for potential confounders such as economic or professional incentives within practices, the analysis showed that statutory caps on non-economic damages of $250,000 were significantly associated with lower likelihood of a specialist receiving referrals, suggesting a potential impact of a state’s medical malpractice environment on physicians’ referral behavior.

  11. The Effect of Medical Malpractice Liability on Rate of Referrals Received by Specialist Physicians

    PubMed Central

    Xu, Xiao; Spurr, Stephen J.; Nan, Bin; Fendrick, A. Mark

    2013-01-01

    Using nationally representative data from the U.S., this paper analyzed the effect of a state’s medical malpractice environment on referral visits received by specialist physicians. The analytic sample included 12,839 ambulatory visits to specialist care doctors in office-based settings in the U.S. during 2003–2007. Whether the patient was referred for the visit was examined for its association with the state’s malpractice environment, assessed by the frequency and severity of paid medical malpractice claims, medical malpractice insurance premiums, and an indicator for whether the state had a cap on noneconomic damages. After accounting for potential confounders such as economic or professional incentives within practices, the analysis showed that statutory caps on noneconomic damages of $250,000 were significantly associated with lower likelihood of a specialist receiving referrals, suggesting a potential impact of a state’s medical malpractice environment on physicians’ referral behavior. PMID:23527533

  12. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

During the analysis of surface-spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effect of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, was studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage also led to increased analyte oxidation, possibly as a result of oxidative species formed electrochemically at the emitter electrode or in the gas phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for analysis of deposited analyte having a relatively low surface concentration. Increasing the spray solvent flow rate and adding hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that both minimize and maximize analyte oxidation were identified and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.

  13. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  14. Analyte discrimination from chemiresistor response kinetics.

    PubMed

    Read, Douglas H; Martin, James E

    2010-08-15

    Chemiresistors are polymer-based sensors that transduce the sorption of a volatile organic compound into a resistance change. Like other polymer-based gas sensors that function through sorption, chemiresistors can be selective for analytes on the basis of the affinity of the analyte for the polymer. However, a single sensor cannot, in and of itself, discriminate between analytes, since a small concentration of an analyte that has a high affinity for the polymer might give the same response as a high concentration of another analyte with a low affinity. In this paper we use a field-structured chemiresistor to demonstrate that its response kinetics can be used to discriminate between analytes, even between those that have identical chemical affinities for the polymer phase of the sensor. The response kinetics is shown to be independent of the analyte concentration, and thus the magnitude of the sensor response, but is found to vary inversely with the analyte's saturation vapor pressure. Saturation vapor pressures often vary greatly from analyte to analyte, so analysis of the response kinetics offers a powerful method for obtaining analyte discrimination from a single sensor.
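    The kinetics-based discrimination described above can be sketched with a first-order sorption model (an assumed functional form for illustration): the response approaches its plateau with a time constant that, in this model, depends on the analyte (inversely on its saturation vapor pressure) but not on its concentration, so the same time constant is recovered regardless of the response magnitude.

```python
import math

def response(t, r_inf, tau):
    """First-order sorption response (illustrative): the resistance change
    approaches r_inf with time constant tau.  r_inf scales with analyte
    concentration; tau is a property of the analyte alone in this model."""
    return r_inf * (1.0 - math.exp(-t / tau))

def estimate_tau(t, r_t, r_inf):
    """Invert the first-order model at a single time point to recover tau."""
    return -t / math.log(1.0 - r_t / r_inf)

# Two hypothetical exposures of the same analyte at very different
# concentrations (different r_inf): the recovered tau is identical,
# which is what makes the kinetics concentration-independent.
tau_a = estimate_tau(2.0, response(2.0, r_inf=5.0, tau=3.0), r_inf=5.0)
tau_b = estimate_tau(2.0, response(2.0, r_inf=0.5, tau=3.0), r_inf=0.5)
```

    Comparing recovered time constants against a library of per-analyte values (scaling inversely with saturation vapor pressure) is then what allows a single sensor to discriminate analytes.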

  15. Physical and Chemical Analytical Analysis: A key component of Bioforensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P

The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular "fingerprints" of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensic science, closely aligned with other areas of "microbial forensics". This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g., between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mismatches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question.
Thus, apart from certain operational issues (such as how

  16. Thermodynamic Analysis of Beam down Solar Gas Turbine Power Plant equipped with Concentrating Receiver System

    NASA Astrophysics Data System (ADS)

    Azharuddin; Santarelli, Massimo

    2016-09-01

A thermodynamic analysis of a closed-cycle, solar-powered Brayton gas turbine power plant with a concentrating receiver system has been carried out. A Brayton cycle is simpler than a Rankine cycle and has an advantage where water is scarce. Together with the Brayton cycle, a concentrating receiver system, whose performance depends on the heliostat field density and the optical system, has been analysed. This study presents a method for optimizing design parameters such as the receiver working temperature and the heliostat density. The method aims at maximizing the overall efficiency of the three major subsystems that constitute the entire plant, namely the heliostat field and tower, the receiver, and the power block. The results of the optimization process are shown and analysed.

  17. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translating data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  18. Error-analysis and comparison to analytical models of numerical waveforms produced by the NRAR Collaboration

    NASA Astrophysics Data System (ADS)

    Hinder, Ian; Buonanno, Alessandra; Boyle, Michael; Etienne, Zachariah B.; Healy, James; Johnson-McDaniel, Nathan K.; Nagar, Alessandro; Nakano, Hiroyuki; Pan, Yi; Pfeiffer, Harald P.; Pürrer, Michael; Reisswig, Christian; Scheel, Mark A.; Schnetter, Erik; Sperhake, Ulrich; Szilágyi, Bela; Tichy, Wolfgang; Wardell, Barry; Zenginoğlu, Anıl; Alic, Daniela; Bernuzzi, Sebastiano; Bode, Tanja; Brügmann, Bernd; Buchman, Luisa T.; Campanelli, Manuela; Chu, Tony; Damour, Thibault; Grigsby, Jason D.; Hannam, Mark; Haas, Roland; Hemberger, Daniel A.; Husa, Sascha; Kidder, Lawrence E.; Laguna, Pablo; London, Lionel; Lovelace, Geoffrey; Lousto, Carlos O.; Marronetti, Pedro; Matzner, Richard A.; Mösta, Philipp; Mroué, Abdul; Müller, Doreen; Mundim, Bruno C.; Nerozzi, Andrea; Paschalidis, Vasileios; Pollney, Denis; Reifenberger, George; Rezzolla, Luciano; Shapiro, Stuart L.; Shoemaker, Deirdre; Taracchini, Andrea; Taylor, Nicholas W.; Teukolsky, Saul A.; Thierfelder, Marcus; Witek, Helvi; Zlochower, Yosef

    2013-01-01

    The Numerical-Relativity-Analytical-Relativity (NRAR) collaboration is a joint effort between members of the numerical relativity, analytical relativity and gravitational-wave data analysis communities. The goal of the NRAR collaboration is to produce numerical-relativity simulations of compact binaries and use them to develop accurate analytical templates for the LIGO/Virgo Collaboration to use in detecting gravitational-wave signals and extracting astrophysical information from them. We describe the results of the first stage of the NRAR project, which focused on producing an initial set of numerical waveforms from binary black holes with moderate mass ratios and spins, as well as one non-spinning binary configuration which has a mass ratio of 10. All of the numerical waveforms are analysed in a uniform and consistent manner, with numerical errors evaluated using an analysis code created by members of the NRAR collaboration. We compare previously-calibrated, non-precessing analytical waveforms, notably the effective-one-body (EOB) and phenomenological template families, to the newly-produced numerical waveforms. We find that when the binary's total mass is ˜100-200M⊙, current EOB and phenomenological models of spinning, non-precessing binary waveforms have overlaps above 99% (for advanced LIGO) with all of the non-precessing-binary numerical waveforms with mass ratios ⩽4, when maximizing over binary parameters. This implies that the loss of event rate due to modelling error is below 3%. Moreover, the non-spinning EOB waveforms previously calibrated to five non-spinning waveforms with mass ratio smaller than 6 have overlaps above 99.7% with the numerical waveform with a mass ratio of 10, without even maximizing on the binary parameters.
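    The overlap figures quoted above come from a noise-weighted inner product between waveforms. A stripped-down sketch (assuming white noise, zero relative time and phase shift, and discretely sampled real waveforms; a full LIGO-style match would weight by the detector noise power spectral density and maximize over shifts and binary parameters) is:

```python
import math

def overlap(h1, h2):
    """Normalized zero-lag overlap between two sampled waveforms:
    <h1|h2> / sqrt(<h1|h1> <h2|h2>), with a flat (white) noise weighting.
    Equals 1 for identical waveforms; modelling error lowers it."""
    ip = sum(a * b for a, b in zip(h1, h2))
    n1 = math.sqrt(sum(a * a for a in h1))
    n2 = math.sqrt(sum(b * b for b in h2))
    return ip / (n1 * n2)

# Toy example: a sinusoidal "numerical" waveform and a slightly
# perturbed "analytical model" of it (hypothetical signals).
n = 256
h = [math.sin(2 * math.pi * 8 * i / n) for i in range(n)]
hp = [x + 0.01 * math.cos(2 * math.pi * 3 * i / n) for i, x in enumerate(h)]
m = overlap(h, hp)
```

    An overlap above 0.99, as reported for the EOB and phenomenological models, corresponds to a fractional loss in detectable event rate of roughly 1 - m^3, i.e. below 3%.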

  19. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  20. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

The distribution network is the part of the power grid closest to the customer, operated by electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a great deal for power grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this large volume of data. The specific outputs of the online analytics information system resulting from data-warehouse processing with OLAP are chart and query reports. The chart reports consist of the load distribution chart over time, the distribution chart by area, the substation region chart, and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption, and offer an alternative way of presenting information related to peak load.
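    A minimal sketch of the OLAP-style roll-up behind such chart reports (record fields and values are hypothetical): aggregate load measurements into a cube keyed by (area, hour), then slice along one dimension to produce a per-area report.

```python
from collections import defaultdict

# Hypothetical load records as they might arrive from substations.
records = [
    {"area": "North", "hour": 18, "load_mw": 120.0},
    {"area": "North", "hour": 18, "load_mw": 35.5},
    {"area": "North", "hour": 19, "load_mw": 98.0},
    {"area": "South", "hour": 18, "load_mw": 76.5},
]

# Roll up into a two-dimensional cube: total load per (area, hour).
cube = defaultdict(float)
for r in records:
    cube[(r["area"], r["hour"])] += r["load_mw"]

# Slice along the area dimension: total load per area, as a chart
# or query layer would consume it.
per_area = defaultdict(float)
for (area, _hour), mw in cube.items():
    per_area[area] += mw
```

    Real OLAP engines precompute such aggregates across many dimensions at once; the point here is only the cube-then-slice pattern that the chart and query reports rely on.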

  1. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process.
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  2. Streaming Visual Analytics Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Burtner, Edwin R.; Kritzstein, Brian P.

    How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and a set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst’s rapid orientation and reorientation as situations change. It provides an environment to support the analyst’s critical thinking. It infers tasks and interests based on the analyst’s interactions. The system works as both an assistant and a devil’s advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.

  3. Phase-recovery improvement using analytic wavelet transform analysis of a noisy interferogram cepstrum.

    PubMed

    Etchepareborda, Pablo; Vadnjal, Ana Laura; Federico, Alejandro; Kaufmann, Guillermo H

    2012-09-15

    We evaluate the extension of the exact nonlinear reconstruction technique developed for digital holography to the phase-recovery problems presented by other optical interferometric methods, which use carrier modulation. It is shown that the introduction of an analytic wavelet analysis in the ridge of the cepstrum transformation corresponding to the analyzed interferogram can be closely related to the well-known wavelet analysis of the interferometric intensity. Subsequently, the phase-recovery process is improved. The advantages and limitations of this framework are analyzed and discussed using numerical simulations in singular scalar light fields and in temporal speckle pattern interferometry.

  4. An experimental analysis of a doped lithium fluoride direct absorption solar receiver

    NASA Technical Reports Server (NTRS)

    Kesseli, James; Pollak, Tom; Lacy, Dovie

    1988-01-01

    An experimental analysis of two key elements of a direct absorption solar receiver for use with Brayton solar dynamic systems was conducted. Experimental data are presented on LiF crystals doped with dysprosium, samarium, and cobalt fluorides. In addition, a simulation of the cavity/window environment was performed and a posttest inspection was conducted to evaluate chemical reactivity, transmissivity, and condensation rate.

  5. Analytical quality assurance in veterinary drug residue analysis methods: matrix effects determination and monitoring for sulfonamides analysis.

    PubMed

    Hoff, Rodrigo Barcellos; Rübensam, Gabriel; Jank, Louise; Barreto, Fabiano; Peralba, Maria do Carmo Ruaro; Pizzolato, Tânia Mara; Silvia Díaz-Cruz, M; Barceló, Damià

    2015-01-01

    In residue analysis of veterinary drugs in foodstuff, matrix effects are one of the most critical points. This work presents a discussion of approaches used to estimate, minimize, and monitor matrix effects in bioanalytical methods. Qualitative and quantitative methods for the estimation of matrix effects, such as post-column infusion, slope-ratio analysis, calibration curves (mathematical and statistical analysis), and control chart monitoring, are discussed using real data. Matrix effects varied over a wide range depending on the analyte and the sample preparation method: pressurized liquid extraction of liver samples showed matrix effects from 15.5 to 59.2%, while ultrasound-assisted extraction gave values from 21.7 to 64.3%. The matrix influence was also evaluated: for sulfamethazine analysis, signal losses ranged from -37% to -96% for fish and eggs, respectively. Advantages and drawbacks are also discussed in the context of a proposed workflow for matrix effect assessment, applied to real data from sulfonamide residue analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
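    The slope-ratio estimate of matrix effects mentioned in this record is commonly computed as the relative deviation between a matrix-matched calibration slope and a solvent calibration slope. A minimal sketch follows; the slope values are illustrative, chosen only to mirror the order of the -37% signal loss reported for fish, not taken from the study's data.

```python
def matrix_effect_pct(slope_matrix, slope_solvent):
    """Signed matrix effect from calibration slopes:
    negative values indicate signal suppression, positive values enhancement."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# Illustrative slopes: matrix-matched slope 37% below the solvent slope
me_fish = matrix_effect_pct(0.63, 1.0)   # about -37% (suppression)
me_enh = matrix_effect_pct(1.2, 1.0)     # positive: enhancement
```

    Values near 0% indicate a negligible matrix effect; in control-chart monitoring this percentage would be tracked per analyte over time.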

  6. The Influence of Judgment Calls on Meta-Analytic Findings.

    PubMed

    Tarrahi, Farid; Eisend, Martin

    2016-01-01

    Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings and question their robustness. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.

  7. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    PubMed

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.

  8. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  9. Human Factors in Field Experimentation Design and Analysis of Analytical Suppression Model

    DTIC Science & Technology

    1978-09-01

    ...men in man-machine systems supports the development of new doctrines, the design of weapon systems, as well as training programs for troops. ...Master's thesis: Human Factors in Field Experimentation Design and Analysis of an Analytical Suppression Model ...influences to suppression. Techniques are examined for including the suppressive effects of weapon systems in Lanchester-type combat models, which may be

  10. Experimental Investigations of Non-Stationary Properties In Radiometer Receivers Using Measurements of Multiple Calibration References

    NASA Technical Reports Server (NTRS)

    Racette, Paul; Lang, Roger; Zhang, Zhao-Nan; Zacharias, David; Krebs, Carolyn A. (Technical Monitor)

    2002-01-01

    Radiometers must be periodically calibrated because the receiver response fluctuates. Many techniques exist to correct for the time-varying response of a radiometer receiver. An analytical technique has been developed that uses generalized least squares regression (LSR) to predict the performance of a wide variety of calibration algorithms. The total measurement uncertainty, including the uncertainty of the calibration, can be computed using LSR. The uncertainties of the calibration samples used in the regression are based upon treating the receiver fluctuations as non-stationary processes. Signals originating from the different sources of emission are treated as simultaneously existing random processes. Thus, the radiometer output is a series of samples obtained from these random processes. The samples are treated as random variables, but because the underlying processes are non-stationary, the statistics of the samples are treated as non-stationary. The statistics of the calibration samples depend upon the time for which the samples are to be applied. The statistics of the random variables are equated to the mean statistics of the non-stationary processes over the interval defined by the time of the calibration sample and when it is applied. This analysis opens the opportunity for experimental investigation into the underlying properties of receiver non-stationarity through the use of multiple calibration references. In this presentation we will discuss the application of LSR to the analysis of various calibration algorithms, requirements for experimental verification of the theory, and preliminary results from analyzing experimental measurements.
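    As a simplified illustration of regression-based calibration (ordinary rather than generalized least squares, and with made-up count/temperature pairs rather than real reference measurements), a receiver gain and offset can be fit to calibration samples like so:

```python
def fit_calibration(counts, temps):
    """Ordinary least-squares fit of T = gain * counts + offset
    from calibration reference samples."""
    n = len(counts)
    mc = sum(counts) / n
    mt = sum(temps) / n
    gain = (sum((c - mc) * (t - mt) for c, t in zip(counts, temps))
            / sum((c - mc) ** 2 for c in counts))
    offset = mt - gain * mc
    return gain, offset

# Illustrative calibration: three reference looks (raw counts vs. brightness temperature in K)
gain, offset = fit_calibration([100.0, 200.0, 300.0], [80.0, 180.0, 280.0])
```

    The generalized LSR of the record additionally weights the samples by their (non-stationary) covariances; this sketch only shows the basic regression step that maps receiver output to calibrated temperature.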

  11. An Analysis of the Effects of RFID Tags on Narrowband Navigation and Communication Receivers

    NASA Technical Reports Server (NTRS)

    LaBerge, E. F. Charles

    2007-01-01

    The simulated effects of Radio Frequency Identification (RFID) tag emissions on ILS Localizer and ILS Glide Slope functions match the analytical models developed in support of DO-294B, provided that the measured peak power levels are adjusted for 1) peak-to-average power ratio, 2) effective duty cycle, and 3) spectrum analyzer measurement bandwidth. When these adjustments are made, simulated and theoretical results are in extraordinarily good agreement. The relationships hold over a large range of potential interference-to-desired signal power ratios, provided that the adjusted interference power is significantly higher than the sum of the receiver noise floor and the noise-like contributions of all other interference sources. When the duty-factor-adjusted power spectral densities are applied in the evaluation process described in Section 6 of DO-294B, the performance parameters of most narrowband guidance and communication radios are unaffected by moderate levels of RFID interference. Specific conclusions and recommendations are provided.
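    One plausible reading of the three power adjustments listed in this record can be sketched in decibel arithmetic: subtract the peak-to-average ratio, add the duty-cycle factor, and rescale between the analyzer and victim-receiver bandwidths. The function name and the numbers below are hypothetical, not drawn from DO-294B.

```python
import math

def adjusted_average_power_dbm(peak_dbm, papr_db, duty_cycle, meas_bw_hz, ref_bw_hz):
    """Convert a spectrum-analyzer peak reading (dBm) to an average-power
    estimate in a reference bandwidth: remove the peak-to-average ratio,
    account for the emission's duty cycle, and rescale the bandwidth."""
    return (peak_dbm
            - papr_db
            + 10.0 * math.log10(duty_cycle)
            + 10.0 * math.log10(ref_bw_hz / meas_bw_hz))

# Hypothetical numbers: -30 dBm peak, 10 dB PAPR, 10% duty cycle, equal bandwidths
adj = adjusted_average_power_dbm(-30.0, 10.0, 0.1, 1.0e6, 1.0e6)
```

    With these inputs the -30 dBm peak reading becomes a -50 dBm average-power estimate, the kind of adjusted level that would then be compared against the receiver noise floor.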

  12. Analysis of solar receiver flux distributions for US/Russian solar dynamic system demonstration on the MIR Space Station

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Fincannon, James

    1995-01-01

    The United States and Russia have agreed to jointly develop a solar dynamic (SD) system for flight demonstration on the Russian MIR space station starting in late 1997. Two important components of this SD system are the solar concentrator and heat receiver provided by Russia and the U.S., respectively. This paper describes optical analysis of the concentrator and solar flux predictions on target receiver surfaces. The optical analysis is performed using the code CIRCE2. These analyses account for finite sun size with limb darkening, concentrator surface slope and position errors, concentrator petal thermal deformation, gaps between petals, and the shading effect of the receiver support struts. The receiver spatial flux distributions are then combined with concentrator shadowing predictions. Geometric shadowing patterns are traced from the concentrator to the target receiver surfaces. These patterns vary with time depending on the chosen MIR flight attitude and orbital mechanics of the MIR spacecraft. The resulting predictions provide spatial and temporal receiver flux distributions for any specified mission profile. The impact these flux distributions have on receiver design and control of the Brayton engine are discussed.

  13. Request Pattern, Pre-Analytical and Analytical Conditions of Urinalysis in Primary Care: Lessons from a One-Year Large-Scale Multicenter Study.

    PubMed

    Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos

    2018-06-01

    To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.

  14. Analytical analysis of the temporal asymmetry between seawater intrusion and retreat

    NASA Astrophysics Data System (ADS)

    Rathore, Saubhagya Singh; Zhao, Yue; Lu, Chunhui; Luo, Jian

    2018-01-01

    The quantification of timescales associated with the movement of the seawater-freshwater interface is useful for developing effective management strategies for controlling seawater intrusion (SWI). In this study, for the first time, we derive an explicit analytical solution for the timescales of SWI and seawater retreat (SWR) in a confined, homogeneous coastal aquifer system under the quasi-steady assumption, based on a classical sharp-interface solution for approximating freshwater outflow rates into the sea. The flow continuity and hydrostatic equilibrium across the interface are identified as two primary mechanisms governing timescales of the interface movement driven by an abrupt change in discharge rates or hydraulic heads at the inland boundary. Through theoretical analysis, we quantified the dependence of interface-movement timescales on porosity, hydraulic conductivity, aquifer thickness, aquifer length, density ratio, and boundary conditions. Predictions from the analytical solution closely agreed with those from numerical simulations. In addition, we define a temporal asymmetry index (the ratio of the SWI timescale to the SWR timescale) to represent the resilience of the coastal aquifer in response to SWI. The developed analytical solutions provide a simple tool for the quick assessment of SWI and SWR timescales and reveal that the temporal asymmetry between SWI and SWR mainly relies on the initial and final values of the freshwater flux at the inland boundary, and is weakly affected by aquifer parameters. Furthermore, we theoretically examined the log-linearity relationship between the timescale and the freshwater flux at the inland boundary, and found that the relationship may be approximated by two linear functions with a slope of -2 and -1 for large changes at the boundary flux for SWI and SWR, respectively.

  15. Data analytics using canonical correlation analysis and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles

    2017-07-01

    A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying a relatively few combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte Carlo-based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural descriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
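    The Monte Carlo idea described here, randomly searching for non-linear transforms of a variable that strengthen its correlation with an output, can be sketched in miniature. This toy version samples power-law transforms only and uses invented data with a quadratic dependence; the paper's method is more general than this single-variable sketch.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def mc_best_transform(x, y, trials=200, seed=0):
    """Randomly sample power-law transforms x -> x**p and keep the exponent
    giving the strongest absolute correlation with y (p = 1 is the baseline)."""
    rng = random.Random(seed)
    best_p, best_r = 1.0, abs(pearson(x, y))
    for _ in range(trials):
        p = rng.uniform(0.2, 3.0)
        r = abs(pearson([a ** p for a in x], y))
        if r > best_r:
            best_p, best_r = p, r
    return best_p, best_r

# Invented data with a quadratic dependence; the search should favor exponents near 2
x = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
y = [a * a for a in x]
p, r = mc_best_transform(x, y)
```

    By construction the selected transform never correlates worse than the untransformed (linear) baseline, which is the sense in which the Monte Carlo search "enhances" the correlations that a purely linear analysis would report.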

  16. Social Cognitive Predictors of College Students' Academic Performance and Persistence: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.

    2008-01-01

    This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…

  17. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  18. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  19. Proteomic analysis of serum and sputum analytes distinguishes controlled and poorly controlled asthmatics.

    PubMed

    Kasaian, M T; Lee, J; Brennan, A; Danto, S I; Black, K E; Fitz, L; Dixon, A E

    2018-04-17

    A major goal of asthma therapy is to achieve disease control, with maintenance of lung function, reduced need for rescue medication, and prevention of exacerbation. Despite current standard of care, up to 70% of patients with asthma remain poorly controlled. Analysis of serum and sputum biomarkers could offer insights into parameters associated with poor asthma control. To identify signatures as determinants of asthma disease control, we performed proteomics using Olink proximity extension analysis. Up to 3 longitudinal serum samples were collected from 23 controlled and 25 poorly controlled asthmatics. Nine of the controlled and 8 of the poorly controlled subjects also provided 2 longitudinal sputum samples. The study included an additional cohort of 9 subjects whose serum was collected within 48 hours of asthma exacerbation. Two separate pre-defined Proseek Multiplex panels (INF and CVDIII) were run to quantify 181 separate protein analytes in serum and sputum. Panels consisting of 9 markers in serum (CCL19, CCL25, CDCP1, CCL11, FGF21, FGF23, Flt3L, IL-10Rβ, IL-6) and 16 markers in sputum (tPA, KLK6, RETN, ADA, MMP9, Chit1, GRN, PGLYRP1, MPO, HGF, PRTN3, DNER, PI3, Chi3L1, AZU1, and OPG) distinguished controlled and poorly controlled asthmatics. The sputum analytes were consistent with a pattern of neutrophil activation associated with poor asthma control. The serum analyte profile of the exacerbation cohort resembled that of the controlled group rather than that of the poorly controlled asthmatics, possibly reflecting a therapeutic response to systemic corticosteroids. Proteomic profiles in serum and sputum distinguished controlled and poorly controlled asthmatics, and were maintained over time. Findings support a link between sputum neutrophil markers and loss of asthma control. © 2018 John Wiley & Sons Ltd.

  20. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals that are very different from those in business and that require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  1. Determining an Effective Intervention within a Brief Experimental Analysis for Reading: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Burns, Matthew K.; Wagner, Dana

    2008-01-01

    The current study applied meta-analytic procedures to brief experimental analysis research of reading fluency interventions to better inform practice and suggest areas for future research. Thirteen studies were examined to determine what magnitude of effect was needed to identify an intervention as the most effective within a brief experimental…

  2. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches with databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. 
This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods

  3. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by application of different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between the scenarios. A second run of rankings was done for scenarios that include only metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This is an implication that green analytical chemistry can be brought into laboratories without analytical performance costs, and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
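    A minimal sketch of PROMETHEE II-style ranking with the simplest ("usual", strict-better) preference function, of the kind used here to rank procedures under different weighting scenarios. The alternatives, criteria, and weights below are hypothetical stand-ins, not the study's data; full PROMETHEE also supports richer preference functions with thresholds.

```python
def promethee_ii(scores, weights, maximize):
    """Net outranking flows with the 'usual' (strict-better) preference function."""
    n = len(scores)

    def pi(i, j):
        # aggregated weighted preference of alternative i over alternative j
        p = 0.0
        for k, w in enumerate(weights):
            a, b = scores[i][k], scores[j][k]
            if (a > b) if maximize[k] else (a < b):
                p += w
        return p

    net = []
    for i in range(n):
        plus = sum(pi(i, j) for j in range(n) if j != i) / (n - 1)    # how much i outranks others
        minus = sum(pi(j, i) for j in range(n) if j != i) / (n - 1)   # how much others outrank i
        net.append(plus - minus)
    return net

# Hypothetical procedures scored on (recovery %, cost, solvent use in mL);
# higher recovery is better, lower cost and solvent use are better.
procedures = [[95, 2, 5], [90, 3, 20], [92, 4, 10]]
net = promethee_ii(procedures, [0.5, 0.25, 0.25], [True, False, False])
```

    The alternative with the highest net flow ranks first; re-running with different weight vectors reproduces the scenario comparison described in the record.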

  4. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  5. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  6. CFD analysis of supercritical CO2 used as HTF in a solar tower receiver

    NASA Astrophysics Data System (ADS)

    Roldán, M. I.; Fernández-Reche, J.

    2016-05-01

    The relative cost of a solar receiver can be minimized by the selection of an appropriate heat transfer fluid capable of achieving high receiver efficiencies. In a conventional central receiver system, the concentrated solar energy is transferred from the receiver tube walls to the heat transfer fluid (HTF), which passes through a heat exchanger to generate steam for a Rankine cycle. Thus, a higher working fluid temperature is associated with greater efficiency in both the receiver and the power cycle. Emerging receiver designs that can enable higher efficiencies using advanced power cycles, such as supercritical CO2 (s-CO2) closed-loop Brayton cycles, include direct heating of s-CO2 in tubular receiver designs capable of withstanding high internal fluid pressures (around 20 MPa) and temperatures (900 K). Due to the high pressures required and the presence of moving components installed in pipelines (ball-joints and/or flexible connections), the use of s-CO2 presents many technical challenges related to the compatibility of seal materials and fluid leakage at the moving connections. These problems are avoided in solar tower systems because the receiver is fixed. In this regard, a preliminary analysis of a tubular receiver with s-CO2 as HTF has been developed using the design of a molten-salt receiver which was previously tested at Plataforma Solar de Almería (PSA). A simplified CFD model has therefore been developed in this study in order to analyze the feasibility of s-CO2 as HTF in solar towers. Simulation results showed that the heat gained by s-CO2 was around 75% greater than that captured by molten salts (fluid inlet temperature of 715 K), but at a pressure range of 7.5-9.7 MPa. Thus, the use of s-CO2 as HTF in solar tower receivers appears to be a promising alternative, taking into account both the operating conditions required and their maintenance cost.
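
    The heat-gain comparison above reduces to a sensible-heat balance, Q = m_dot * cp * (T_out - T_in). The sketch below illustrates that arithmetic with hypothetical operating points (mass flow, cp values and outlet temperatures are invented, not the paper's data):

```python
def heat_gain(m_dot, cp, t_in, t_out):
    """Sensible heat gained by the HTF, in watts.

    m_dot: mass flow rate (kg/s), cp: specific heat (J/(kg*K)),
    t_in / t_out: fluid inlet / outlet temperatures (K).
    """
    return m_dot * cp * (t_out - t_in)

# Hypothetical operating points for illustration only; constant-cp values
# are a simplification (the real cp of s-CO2 varies strongly near the
# critical point).
q_salt = heat_gain(m_dot=5.0, cp=1520.0, t_in=715.0, t_out=840.0)  # molten salt
q_sco2 = heat_gain(m_dot=5.0, cp=1230.0, t_in=715.0, t_out=900.0)  # s-CO2

relative_gain = (q_sco2 - q_salt) / q_salt  # fractional advantage of s-CO2
```

    With these made-up numbers the s-CO2 case gains roughly 20% more heat; the 75% figure in the abstract comes from the actual CFD conditions, which are not reproduced here.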

  7. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  8. Monitoring by forward scatter radar techniques: an improved second-order analytical model

    NASA Astrophysics Data System (ADS)

    Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco

    2017-10-01

    In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. The improved second-order model is compared with a simplified one already proposed by the authors, based on a paraxial approximation. A phase error analysis is carried out to investigate the benefits and limitations of the second-order modeling. The results are validated against full-wave numerical simulations of the relevant scattering problem carried out in a commercial tool.
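
    The paraxial and second-order models differ in how far the Taylor expansion of the exact path length sqrt(R^2 + x^2) is carried. A minimal sketch of the resulting phase errors (carrier frequency and geometry are illustrative assumptions, not the paper's parameters):

```python
import math

def exact_path(R, x):
    """Exact distance for a point offset x transverse to a range R."""
    return math.hypot(R, x)

def paraxial(R, x):
    """First-order (paraxial) approximation: R + x^2/(2R)."""
    return R + x**2 / (2 * R)

def second_order(R, x):
    """One more Taylor term retained: R + x^2/(2R) - x^4/(8R^3)."""
    return R + x**2 / (2 * R) - x**4 / (8 * R**3)

wavelength = 0.03                    # ~10 GHz carrier, illustrative
k = 2 * math.pi / wavelength         # wavenumber (rad/m)
R, x = 100.0, 10.0                   # geometry in metres, illustrative

phase_err_first = k * abs(exact_path(R, x) - paraxial(R, x))
phase_err_second = k * abs(exact_path(R, x) - second_order(R, x))
```

    Even at this modest offset the second-order phase error is two orders of magnitude below the paraxial one, which is the benefit the phase error analysis quantifies.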

  9. What are families most grateful for after receiving palliative care? Content analysis of written documents received: a chance to improve the quality of care.

    PubMed

    Aparicio, María; Centeno, Carlos; Carrasco, José Miguel; Barbosa, Antonio; Arantzamendi, María

    2017-09-06

    Family members are involved in the care of palliative patients at home and should therefore be viewed as important sources of information to help clinicians better understand the quality of the palliative care service patients receive. The objective of the study was to analyse what is valued most by bereaved family carers of a palliative care home service in order to identify factors of quality of care. Qualitative exploratory study based on documentary analysis. Content analysis of 77 gratitude documents received over 8 years by a palliative home service in Odivelas, near Lisbon (Portugal), was undertaken through an inductive approach and using investigator triangulation. The frequency of distinct categories was quantitatively defined. Three content categories emerged from the analysis: a) Recognition of the care received and the value of particular aspects of care within recognised difficult situations, including kindness, listening, attention to the family, empathy, closeness, affection and the therapeutic relationships established (63/77 documents); b) Family recognition of the achievements of the palliative care team (29/77), indicated as relief from suffering for the patient and family, the opportunity of dying at home, help in facing difficult situations, improvement in quality of life and wellbeing, and a feeling of serenity during bereavement; c) Messages of support (45/77), related to the need for the resources provided. The relational component emerges as an underlying key aspect of family carers' experience with a palliative care home service. Family carers show spontaneous gratitude for the professionalism and humanity found in palliative care. The relational component of care emerges as key to achieving a high-quality experience of a palliative care home service, and could be one indicator of the quality of palliative care.

  10. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  11. Analysis of the WindSat Receiver Frequency Passbands

    DTIC Science & Technology

    2014-09-12

    water vapor (PWV) calculated for each atmospheric profile. The differences for the 18.7 and 23.8 GHz bands vary with PWV. Modeled Tb's for receiver...precipitable water vapor (PWV). [Figure residue omitted: histogram of REU temperature (°C) versus percent of occurrence.]

  12. Receiver design, performance analysis, and evaluation for space-borne laser altimeters and space-to-space laser ranging systems

    NASA Technical Reports Server (NTRS)

    Davidson, Frederic M.; Field, Christopher T.; Sun, Xiaoli

    1996-01-01

    We report here the design and the performance measurements of the breadboard receiver of the Geoscience Laser Altimeter System (GLAS). The measured ranging accuracy was better than 2 cm and 10 cm for 5 ns and 30 ns wide received laser pulses under the expected received signal level, which agreed well with the theoretical analysis. The measured receiver sensitivity or the link margin was also consistent with the theory. The effects of the waveform digitizer sample rate and resolution were also measured.
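
    Ranging with a digitized return pulse typically reduces to estimating the pulse time-of-arrival and converting time to distance via range = c*t/2. A toy centroid-timing sketch (the pulse shape, sample rate and timing below are invented for illustration, not GLAS breadboard parameters):

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def centroid_toa(t, samples):
    """Centroid (first-moment) estimate of the pulse arrival time."""
    w = samples - samples.min()          # crude baseline removal
    return float(np.sum(t * w) / np.sum(w))

fs = 1e9                                 # 1 GS/s digitizer, illustrative
t = np.arange(0, 200e-9, 1 / fs)         # 200 ns record
sigma = 5e-9                             # ~5 ns-wide received pulse
pulse = np.exp(-0.5 * ((t - 100e-9) / sigma) ** 2)

toa = centroid_toa(t, pulse)             # recovers ~100 ns here
one_way = C * toa / 2                    # treating toa as a round-trip time
```

    In practice the achievable accuracy depends on the digitizer sample rate, amplitude resolution and detector noise, which is exactly the trade the abstract reports measuring.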

  13. Monte Carlo Analysis of Molecule Absorption Probabilities in Diffusion-Based Nanoscale Communication Systems with Multiple Receivers.

    PubMed

    Arifler, Dogu; Arifler, Dizem

    2017-04-01

    For biomedical applications of nanonetworks, employing molecular communication for information transport is advantageous over nano-electromagnetic communication: molecular communication is potentially biocompatible and inherently energy-efficient. Recently, several studies have modeled receivers in diffusion-based molecular communication systems as "perfectly monitoring" or "perfectly absorbing" spheres based on idealized descriptions of chemoreception. In this paper, we focus on perfectly absorbing receivers and present methods to improve the accuracy of simulation procedures that are used to analyze these receivers. We employ schemes available from the chemical physics and biophysics literature and outline a Monte Carlo simulation algorithm that accounts for the possibility of molecule absorption during discrete time steps, leading to a more accurate analysis of absorption probabilities. Unlike most existing studies that consider a single receiver, this paper analyzes absorption probabilities for multiple receivers deterministically or randomly deployed in a region. For random deployments, the ultimate absorption probabilities as a function of transmitter-receiver distance are shown to fit well to power laws; the exponents derived become more negative as the number of receivers increases up to a limit beyond which no additional receivers can be "packed" in the deployment region. This paper is expected to impact the design of molecular nanonetworks with multiple absorbing receivers.
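
    A minimal Monte Carlo sketch of the setup with a perfectly absorbing spherical receiver. It uses the naive end-of-step membership test; the refinement the abstract describes additionally accounts for the probability of absorption *during* each discrete step. All physical scales here are illustrative assumptions:

```python
import math
import random

def absorbed_fraction(d, r_rx, D=1e-9, dt=1e-4, steps=600, n_mol=300, seed=1):
    """Fraction of molecules (released at the origin) absorbed by a perfectly
    absorbing sphere of radius r_rx centred a distance d away.

    Naive scheme: a molecule is absorbed only if it ends a step inside the
    sphere; within-step absorption (the paper's correction) is ignored.
    """
    rng = random.Random(seed)
    step_sigma = math.sqrt(2 * D * dt)   # per-axis displacement std per step
    absorbed = 0
    for _ in range(n_mol):
        x = y = z = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, step_sigma)
            y += rng.gauss(0.0, step_sigma)
            z += rng.gauss(0.0, step_sigma)
            if math.dist((x, y, z), (d, 0.0, 0.0)) <= r_rx:
                absorbed += 1
                break
    return absorbed / n_mol

p_near = absorbed_fraction(d=2e-6, r_rx=1e-6)   # receiver close to the source
p_far = absorbed_fraction(d=4e-6, r_rx=1e-6)    # receiver twice as far
```

    The absorption probability falls off with transmitter-receiver distance, which is the dependence the paper fits with power laws for multi-receiver deployments.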

  14. An analysis of fosaprepitant-induced venous toxicity in patients receiving highly emetogenic chemotherapy

    PubMed Central

    Leal, Alexis D.; Grendahl, Darryl C.; Seisler, Drew K.; Sorgatz, Kristine M.; Anderson, Kari J.; Hilger, Crystal R.; Loprinzi, Charles L.

    2015-01-01

    Purpose Fosaprepitant is an antiemetic used for chemotherapy-induced nausea and vomiting. We recently reported increased infusion site adverse events (ISAE) in a cohort of breast cancer patients receiving chemotherapy with doxorubicin and cyclophosphamide (AC). In this current study, we evaluated the venous toxicity of fosaprepitant use with non-anthracycline platinum-based antineoplastic regimens. Methods A retrospective review was conducted of the first 81 patients initiated on fosaprepitant among patients receiving highly emetogenic chemotherapy, on or after January 1, 2011 at Mayo Clinic Rochester. None of these regimens included an anthracycline. Data collected included baseline demographics, chemotherapy regimen, type of intravenous access, and type and severity of ISAE. Data from these patients were compared to previously collected data from patients who had received AC. Statistical analysis using χ2 and univariate logistic regression was used to evaluate the association between treatment regimen, fosaprepitant, and risk of ISAE. Results Among these 81 patients, the incidence of ISAE was 7.4% in the non-anthracycline platinum group. The most commonly reported ISAE were swelling (3%), extravasation (3%), and phlebitis (3%). When stratified by regimen, fosaprepitant was associated with a statistically significant increased risk of ISAE in the anthracycline group (OR 8.1; 95% CI 2.0–31.9) compared to the platinum group. Conclusions Fosaprepitant antiemetic therapy causes significant ISAE at rates appreciably higher than previously reported. Patients receiving platinum-based chemotherapy appear to have less significant ISAE than do patients who receive anthracycline-based regimens. PMID:24964876
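
    The reported association (OR 8.1; 95% CI 2.0-31.9) has the form of a standard 2x2 odds ratio with a Wald confidence interval. The sketch below shows the mechanics with hypothetical counts, not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_point = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lower = math.exp(math.log(or_point) - z * se_log)
    upper = math.exp(math.log(or_point) + z * se_log)
    return or_point, lower, upper

# Hypothetical counts for illustration only:
or_point, lower, upper = odds_ratio_ci(a=23, b=75, c=6, d=158)
```

    A lower confidence bound above 1 is what makes an association like this statistically significant at the 5% level.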

  15. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt.52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
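
    The core step the abstract describes, recovering an affine transformation between a primitive triangle and an arbitrary one from their vertex vectors and a pseudo-inverse, can be sketched directly (the vertex coordinates below are invented for illustration):

```python
import numpy as np

def affine_from_triangles(src, dst):
    """3x4 affine matrix M with M @ [x, y, z, 1]^T mapping each vertex of the
    primitive triangle `src` onto the arbitrary triangle `dst` (rows = vertices).

    Three vertices underdetermine a 3D affine map, so the Moore-Penrose
    pseudo-inverse picks the minimum-norm solution.
    """
    src_h = np.hstack([src, np.ones((3, 1))])   # homogeneous coordinates, 3x4
    return dst.T @ np.linalg.pinv(src_h.T)      # 3x4 affine matrix

primitive = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
arbitrary = np.array([[1.0, 2.0, 0.5],
                      [3.0, 2.5, 0.7],
                      [1.5, 4.0, 0.9]])

M = affine_from_triangles(primitive, arbitrary)
mapped = (M @ np.hstack([primitive, np.ones((3, 1))]).T).T   # equals `arbitrary`
```

    In the paper this matrix then acts in the frequency domain, so the arbitrary triangle's spectrum follows from the primitive spectrum without angle-dependent recomputation.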

  16. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  17. Analytical and numerical analysis of charge carriers extracted by linearly increasing voltage in a metal-insulator-semiconductor structure relevant to bulk heterojunction organic solar cells

    NASA Astrophysics Data System (ADS)

    Yumnam, Nivedita; Hirwa, Hippolyte; Wagner, Veit

    2017-12-01

    Analysis of charge extraction by linearly increasing voltage is conducted on metal-insulator-semiconductor capacitors in a structure relevant to organic solar cells. For this analysis, an analytical model is developed and used to determine the conductivity of the active layer. Numerical simulations of the transient current were performed to confirm the applicability of our analytical model and of other analytical models in the literature. Our analysis is applied to poly(3-hexylthiophene) (P3HT) : phenyl-C61-butyric acid methyl ester (PCBM), which allows the electron and hole mobilities to be determined independently. A combination of experimental data analysis and numerical simulations reveals the effect of trap states on the transient current and shows where this contribution is crucial for data analysis.
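
    For context, the widely used textbook CELIV relation (an assumption here; not necessarily the analytical model developed in this paper) extracts mobility from the film thickness, the voltage ramp rate and the time of the extraction-current peak:

```python
def celiv_mobility(d, A, t_max, dj_over_j0=0.0):
    """Textbook CELIV mobility estimate:

        mu = 2 * d^2 / (3 * A * t_max^2 * (1 + 0.36 * dj/j0))

    d: active-layer thickness (m), A: voltage ramp rate (V/s),
    t_max: time of the extraction-current maximum (s),
    dj_over_j0: peak extraction current relative to the capacitive step.
    """
    return 2 * d**2 / (3 * A * t_max**2 * (1 + 0.36 * dj_over_j0))

# Illustrative numbers for a ~100 nm organic film:
mu = celiv_mobility(d=100e-9, A=1e5, t_max=2e-6, dj_over_j0=0.5)  # m^2/(V*s)
```

    The result here is of order 1e-8 m^2/(V*s), i.e. 1e-4 cm^2/(V*s), a plausible magnitude for P3HT:PCBM blends.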

  18. Supramolecular analytical chemistry.

    PubMed

    Anslyn, Eric V

    2007-02-02

    A large fraction of the field of supramolecular chemistry has focused in previous decades upon the study and use of synthetic receptors as a means of mimicking natural receptors. Recently, the demand for synthetic receptors is rapidly increasing within the analytical sciences. These classes of receptors are finding uses in simple indicator chemistry, cellular imaging, and enantiomeric excess analysis, while also being involved in various truly practical assays of bodily fluids. Moreover, one of the most promising areas for the use of synthetic receptors is in the arena of differential sensing. Although many synthetic receptors have been shown to yield exquisite selectivities, in general, this class of receptor suffers from cross-reactivities. Yet, cross-reactivity is an attribute that is crucial to the success of differential sensing schemes. Therefore, both selective and nonselective synthetic receptors are finding uses in analytical applications. Hence, a field of chemistry that herein is entitled "Supramolecular Analytical Chemistry" is emerging, and is predicted to undergo increasingly rapid growth in the near future.

  19. In situ analytical techniques for battery interface analysis.

    PubMed

    Tripathi, Alok M; Su, Wei-Nien; Hwang, Bing Joe

    2018-02-05

    Lithium-ion batteries, simply known as lithium batteries, are distinct among high energy density charge-storage devices. The power delivery of batteries depends upon the electrochemical performances and the stability of the electrode, electrolytes and their interface. Interfacial phenomena of the electrode/electrolyte involve lithium dendrite formation, electrolyte degradation and gas evolution, and a semi-solid protective layer formation at the electrode-electrolyte interface, also known as the solid-electrolyte interface (SEI). The SEI protects electrodes from further exfoliation or corrosion and suppresses lithium dendrite formation, which are crucial needs for enhancing the cell performance. This review covers the compositional, structural and morphological aspects of SEI, both artificially and naturally formed, and metallic dendrites using in situ/in operando cells and various in situ analytical tools. Critical challenges and the historical legacy in the development of in situ/in operando electrochemical cells with some reports on state-of-the-art progress are particularly highlighted. The present compilation pinpoints the emerging research opportunities in advancing this field and concludes on the future directions and strategies for in situ/in operando analysis.

  20. Labyrinth Seal Analysis. Volume 3. Analytical and Experimental Development of a Design Model for Labyrinth Seals

    DTIC Science & Technology

    1986-01-01

    the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an...labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an *analysis* model and an improved empirical *design* model. Supporting rig tests

  1. An analytics of electricity consumption characteristics based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Feng, Junshu

    2018-02-01

    More detailed analysis of electricity consumption characteristics can make demand side management (DSM) much more targeted. In this paper, an analytics of electricity consumption characteristics based on principal component analysis (PCA) is presented, in which the PCA method is used to extract the main typical characteristics of electricity consumers. An electricity consumption characteristics matrix is then designed, which enables comparison of typical electricity consumption characteristics across different types of consumers, such as industrial consumers, commercial consumers and residential consumers. In our case study, electricity consumption is divided into four main characteristics: extreme peak usage, peak usage, peak-shifting usage and others. Moreover, it was found that industrial consumers often shift their peak load, while commercial and residential consumers have more peak-time consumption. The conclusions can provide decision support for DSM for the government and power providers.
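
    A minimal sketch of the PCA step on synthetic daily load profiles (the consumer groups, profile shapes and noise level are invented; the paper works with real consumption data):

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)

# Two invented consumer archetypes: a smooth daily rhythm vs. an evening peak.
base = np.sin(2 * np.pi * hours / 24)
peak = (hours == 19).astype(float)

X = np.vstack([
    1.0 + 0.5 * base + 0.05 * rng.normal(size=(30, 24)),   # "industrial-like"
    1.0 + 2.0 * peak + 0.05 * rng.normal(size=(30, 24)),   # "residential-like"
])

# PCA via SVD of the mean-centred profile matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance ratio per principal component
scores = Xc @ Vt[:2].T            # each consumer projected on the first 2 PCs
```

    The leading components then serve as the "typical characteristics" against which consumer types can be compared.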

  2. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method with regression by partial least squares is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. Adequate selection of the spectral region proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. Recovery of the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
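
    The paper's PLS calibration needs measured calibration spectra; as a simpler stand-in, the sketch below resolves two heavily overlapped synthetic bands by least squares against known pure-component spectra. Band positions, widths and concentrations are invented for illustration:

```python
import numpy as np

wavelengths = np.linspace(250, 290, 81)   # nm, the region selected in the abstract

def band(center, width):
    """Gaussian absorption band (a stand-in for a measured pure spectrum)."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Hypothetical, strongly overlapped pure-component spectra:
S = np.column_stack([band(268.0, 8.0), band(274.0, 8.0)])

true_conc = np.array([0.6, 1.1])
noise = np.random.default_rng(3).normal(0.0, 0.002, wavelengths.size)
mixture = S @ true_conc + noise          # simulated two-component spectrum

est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
recovery = 100.0 * est / true_conc       # percent recovery per component
```

    Even with severe spectral overlap, using the full wavelength vector makes the two components separable, which is the same leverage PLS exploits with real calibration data.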

  3. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

    Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectrometry.

  4. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
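
    One common statement of the bivalent analyte model's coupled non-linear ODEs (parameterizations vary between software packages; the rate constants below are illustrative assumptions) can be integrated with a simple Euler scheme:

```python
def bivalent_sensorgram(C, ka1, kd1, ka2, kd2, Rmax, t_end, dt=0.01):
    """Euler integration of one common form of the bivalent analyte model:

        L       = Rmax - AB - 2*AB2        (free ligand sites)
        dAB/dt  = ka1*C*L - kd1*AB - ka2*AB*L + kd2*AB2
        dAB2/dt = ka2*AB*L - kd2*AB2

    Returns the association-phase response R(t) = AB + AB2.
    """
    AB = AB2 = 0.0
    response = []
    for _ in range(round(t_end / dt)):
        L = Rmax - AB - 2.0 * AB2
        dAB = ka1 * C * L - kd1 * AB - ka2 * AB * L + kd2 * AB2
        dAB2 = ka2 * AB * L - kd2 * AB2
        AB += dAB * dt
        AB2 += dAB2 * dt
        response.append(AB + AB2)
    return response

# Illustrative constants only (C in M; ka2 in 1/(RU*s) by convention):
R = bivalent_sensorgram(C=100e-9, ka1=1e5, kd1=1e-2, ka2=1e-4, kd2=1e-3,
                        Rmax=100.0, t_end=300.0)
```

    The second binding step is what produces the biphasic shape whose unique signature the note proposes for distinguishing this mechanism from other biphasic models.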

  5. Directivity analysis of meander-line-coil EMATs with a wholly analytical method.

    PubMed

    Xie, Yuedong; Liu, Zenghua; Yin, Liyuan; Wu, Jiande; Deng, Peng; Yin, Wuliang

    2017-01-01

    This paper presents a simulation and experimental study of the radiation pattern of a meander-line-coil EMAT. A wholly analytical method, which couples two models (an analytical EM model and an analytical UT model), has been developed to build EMAT models and analyse the Rayleigh waves' beam directivity. For a specific sensor configuration, Lorentz forces are calculated using the EM analytical method, which is adapted from the classic Deeds and Dodd solution. The calculated Lorentz force density is imported into an analytical ultrasonic model as a set of driving point sources, which produce the Rayleigh waves within a layered medium. The effect of the length of the meander-line coil on the Rayleigh waves' beam directivity is analysed quantitatively and verified experimentally. Copyright © 2016 Elsevier B.V. All rights reserved.
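
    As a much-reduced stand-in for the paper's coupled EM-UT model, a meander coil's directivity can be approximated by the array factor of parallel line sources with alternating polarity; with wire spacing equal to half the Rayleigh wavelength, the sources add in phase along the surface. All parameters below are illustrative:

```python
import math

def meander_array_factor(n_wires, spacing, wavelength, angle_deg):
    """|Array factor| of n parallel line sources with alternating sign
    (a simplification of a meander-line coil; the paper instead couples a
    full analytical EM model to an analytical UT model)."""
    k = 2.0 * math.pi / wavelength
    theta = math.radians(angle_deg)        # 90 deg = along the surface
    re = im = 0.0
    for n in range(n_wires):
        phase = k * n * spacing * math.sin(theta) + math.pi * n  # alternating sign
        re += math.cos(phase)
        im += math.sin(phase)
    return math.hypot(re, im)

wl = 2.0e-3                                 # Rayleigh wavelength (m), illustrative
main_lobe = meander_array_factor(8, wl / 2, wl, 90.0)   # all wires in phase
off_axis = meander_array_factor(8, wl / 2, wl, 30.0)    # contributions cancel
```

    Adding wire pairs (a longer coil) sharpens the main lobe, which is the length dependence the paper analyses quantitatively.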

  6. Receiver design, performance analysis, and evaluation for space-borne laser altimeters and space-to-space laser ranging systems

    NASA Technical Reports Server (NTRS)

    Davidson, Frederic M.; Sun, Xiaoli

    1993-01-01

    This interim report consists of four separate reports from our research on the receivers of NASA's Gravity And Magnetic Experiment Satellite (GAMES). The first report is entitled 'Analysis of phase estimation bias of GAMES receiver due to Doppler shift.' The second report is 'Background radiation on GAMES fine ranging detector from the moon, the planets, and the stars.' The third report is 'Background radiation on GAMES receivers from the ocean sun glitter and the direct sun.' The fourth report is 'GAMES receiver performance versus background radiation power on the detectors.'

  7. 76 FR 56193 - KAP Analytics, LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-4440-000] KAP Analytics... 204 Authorization This is a supplemental notice in the above-referenced proceeding of KAP Analytics... eSubscription link on the Web site that enables subscribers to receive e-mail notification when a...

  8. United States Department of Energy solar receiver technology development

    NASA Astrophysics Data System (ADS)

    Klimas, P. C.; Diver, R. B.; Chavez, J. M.

    The United States Department of Energy (DOE), through Sandia National Laboratories, has been conducting a Solar Thermal Receiver Technology Development Program, which maintains a balance between analytical modeling, bench and small scale testing, and experimentation conducted at scales representative of commercially-sized equipment. Central receiver activities emphasize molten salt-based systems on large scales and volumetric devices in the modeling and small scale testing. These receivers are expected to be utilized in solar power plants rated between 100 and 200 MW. Distributed receiver research focuses on liquid metal refluxing devices. These are intended to mate parabolic dish concentrators with Stirling cycle engines in the 5 to 25 kWe power range. The effort in the area of volumetric receivers is less intensive and highly cooperative in nature. A ceramic foam absorber of Sandia design was successfully tested on the 200 kWt test bed at Plataforma Solar during 1989. Material integrity during the approximately 90-test series was excellent. Significant progress has been made with parabolic dish concentrator-mounted receivers using liquid metals (sodium or a potassium/sodium mixture) as heat transport media. Sandia has successfully solar-tested a pool boiling reflux receiver sized to power a 25 kW Stirling engine. Boiling stability and transient operation were both excellent. This document describes these activities in detail and outlines plans for future development.

  9. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  10. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  11. Industrial Technology Modernization Program. Project 28. Automation of Receiving, Receiving Inspection and Stores

    DTIC Science & Technology

    1987-06-15

    GENERAL DYNAMICS FORT WORTH DIVISION, INDUSTRIAL TECHNOLOGY MODERNIZATION PROGRAM, Phase 2 Final Project Report, PROJECT 28: AUTOMATION OF RECEIVING, RECEIVING INSPECTION AND STORES. [Snippet retains contents-page residue: Project Assumptions; Preliminary/Final Design and Findings; System/Equipment/Machining Specifications; Vendor/Industry Analysis.]

  12. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed, paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.
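
    The pre-analytic factors listed above act as gating criteria applied before sequencing. A toy triage check along those lines (the thresholds are hypothetical illustrations; each laboratory validates its own cut-offs against its assay):

```python
def preanalytic_triage(dna_input_ng, tumor_fraction, acid_decalcified):
    """Toy pre-analytic adequacy check for an amplicon-based NGS assay.

    The thresholds are hypothetical, not validated clinical cut-offs.
    """
    reasons = []
    if dna_input_ng < 10.0:
        reasons.append("insufficient DNA input")
    if tumor_fraction < 0.20:
        reasons.append("tumor fraction below assay limit of detection")
    if acid_decalcified:
        reasons.append("acid decalcification degrades DNA")
    return len(reasons) == 0, reasons

ok, _ = preanalytic_triage(dna_input_ng=25.0, tumor_fraction=0.40,
                           acid_decalcified=False)
adequate, reasons = preanalytic_triage(dna_input_ng=4.0, tumor_fraction=0.10,
                                       acid_decalcified=True)
```

    Recording the failure reasons, rather than a bare pass/fail, mirrors how such specimens are flagged for repeat collection or alternative testing.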

  13. A Field Study Program in Analytical Chemistry for College Seniors.

    ERIC Educational Resources Information Center

    Langhus, D. L.; Flinchbaugh, D. A.

    1986-01-01

    Describes an elective field study program at Moravian College (Pennsylvania) in which seniors in analytical chemistry obtain first-hand experience at Bethlehem Steel Corporation. Discusses the program's planning phase, some method development projects done by students, experiences received in laboratory operations, and the evaluation of student…

  14. Analysis, development and testing of a fixed tilt solar collector employing reversible Vee-Trough reflectors and vacuum tube receivers

    NASA Technical Reports Server (NTRS)

    Selcuk, M. K.

    1979-01-01

    The Vee-Trough/Vacuum Tube Collector (VTVTC) aimed to improve the efficiency and reduce the cost of collectors assembled from evacuated tube receivers. The VTVTC was analyzed rigorously and a mathematical model was developed to calculate the optical performance of the vee-trough concentrator and the thermal performance of the evacuated tube receiver. A test bed was constructed to verify the mathematical analyses and compare reflectors made out of glass, Alzak and aluminized FEP Teflon. Tests were run at temperatures ranging from 95 to 180 C during the months of April, May, June, July and August 1977. Vee-trough collector efficiencies of 35-40 per cent were observed at an operating temperature of about 175 C. Test results compared well with the calculated values. Test data covering a complete day are presented for selected dates throughout the test season. Predicted daily useful heat collection and efficiency values are presented for a year's duration at operating temperatures ranging from 65 to 230 C. Estimated collector costs and resulting thermal energy costs are presented. Analytical and experimental results are discussed along with an economic evaluation.
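
    The collector efficiencies reported above are, in essence, the ratio of useful heat gain to solar radiation incident on the aperture. A minimal sketch of that calculation, with all numbers invented for illustration (not measurements from the cited tests):

```python
# Instantaneous collector efficiency from test data: eta = Q_useful / (A * G).
# All inputs below are illustrative, not data from the VTVTC test bed.

def collector_efficiency(m_dot, cp, t_out, t_in, aperture_area, irradiance):
    """Ratio of useful heat gain to solar radiation incident on the aperture.

    m_dot in kg/s, cp in J/(kg K), temperatures in any consistent unit,
    aperture_area in m^2, irradiance in W/m^2.
    """
    q_useful = m_dot * cp * (t_out - t_in)          # W
    return q_useful / (aperture_area * irradiance)  # dimensionless

eta = collector_efficiency(m_dot=0.02, cp=4186.0, t_out=178.0, t_in=172.0,
                           aperture_area=1.4, irradiance=900.0)
print(f"{eta:.2f}")  # 0.40, i.e. in the 35-40 per cent range reported above
```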

  15. Light-emitting diodes for analytical chemistry.

    PubMed

    Macka, Mirek; Piasecki, Tomasz; Dasgupta, Purnendu K

    2014-01-01

    Light-emitting diodes (LEDs) are playing increasingly important roles in analytical chemistry, from the final analysis stage to photoreactors for analyte conversion to actual fabrication of and incorporation in microdevices for analytical use. The extremely fast turn-on/off rates of LEDs have made possible simple approaches to fluorescence lifetime measurement. Although they are increasingly being used as detectors, their wavelength selectivity as detectors has rarely been exploited. From their first proposed use for absorbance measurement in 1970, LEDs have been used in analytical chemistry in too many ways to make a comprehensive review possible. Hence, we critically review here the more recent literature on their use in optical detection and measurement systems. Cloudy as our crystal ball may be, we express our views on the future applications of LEDs in analytical chemistry: The horizon will certainly become wider as LEDs in the deep UV with sufficient intensity become available.

  16. AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent

    Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy, and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.

  17. Anisotropic structure of the mantle wedge beneath the Ryukyu arc from teleseismic receiver function analysis

    NASA Astrophysics Data System (ADS)

    McCormack, K. A.; Wirth, E. A.; Long, M. D.

    2011-12-01

    The recycling of oceanic plates back into the mantle through subduction is an important process taking place within our planet. However, many fundamental aspects of subduction systems, such as the dynamics of mantle flow, have yet to be completely understood. Subducting slabs transport water down into the mantle, but how and where that water is released, as well as how it affects mantle flow, is still an open question. In this study, we focus on the Ryukyu subduction zone in southwestern Japan and use anisotropic receiver function analysis to characterize the structure of the mantle wedge. We compute radial and transverse P-to-S receiver functions for eight stations of the broadband F-net array using a multitaper receiver function estimator. We observe coherent P-to-SV converted energy in the radial receiver functions at ~6 sec for most of the stations analyzed, consistent with conversions originating at the top of the slab. We also observe conversions on the transverse receiver functions that are consistent with the presence of multiple anisotropic and/or dipping layers. The character of the transverse receiver functions varies significantly along strike, with the northernmost three stations exhibiting markedly different behavior than stations located in the center of the Ryukyu arc. We compute synthetic receiver functions using a forward modeling scheme that can handle dipping interfaces and anisotropic layers to create models for the depths, thicknesses, and strengths of anisotropic layers in the mantle wedge beneath Ryukyu.
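
    The ~6 s P-to-SV arrival can be related to the depth of the converting interface through the standard Ps delay-time relation t_Ps = H*(sqrt(Vs^-2 - p^2) - sqrt(Vp^-2 - p^2)). A sketch with illustrative velocities and ray parameter (assumed values, not numbers from the study):

```python
import math

# Depth of a P-to-S conversion interface from its delay time on the radial
# receiver function. Velocities and ray parameter are illustrative assumptions.

def conversion_depth(t_ps, vp=8.0, vs=4.5, p=0.06):
    """Invert t_Ps = H*(sqrt(vs^-2 - p^2) - sqrt(vp^-2 - p^2)) for depth H.

    t_ps in s, vp and vs in km/s, ray parameter p in s/km; returns H in km.
    """
    q_s = math.sqrt(vs**-2 - p**2)  # vertical S slowness
    q_p = math.sqrt(vp**-2 - p**2)  # vertical P slowness
    return t_ps / (q_s - q_p)

print(f"{conversion_depth(6.0):.0f} km")  # rough interface depth for a ~6 s arrival
```

    With these assumed velocities a 6 s delay maps to a conversion depth of several tens of kilometers, of the order expected for the slab top beneath an arc.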

  18. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    PubMed

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed, and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8671. For this glycoprotein, we confidently identified 35 N-linked glycans, and all three major classes (high mannose, complex, and hybrid) were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  19. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Highfill, J. H., III; Tzeng, C. P. J.; Koleyni, G.

    1978-01-01

    Reduced-order (suboptimal) receiver analysis in multipath environments is presented. The origin and objectives of the MLS are described briefly. Signal modeling in the MLS and the optimum receiver are also covered, and a computer-oriented technique used in the simulation study of the suboptimal receiver is described. Results and conclusions obtained from the suboptimal receiver research are reported.

  20. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  1. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  2. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  3. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  4. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program which is essentially based on modal synthesis combined with a central finite-difference numerical integration scheme. The methodology leads to a modular, or building-block, technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.

  5. Thermal State-of-Charge in Solar Heat Receivers

    NASA Technical Reports Server (NTRS)

    Hall, Carsie A., Jr.; Glakpe, Emmanuel K.; Cannon, Joseph N.; Kerslake, Thomas W.

    1998-01-01

    A theoretical framework is developed to determine the so-called thermal state-of-charge (SOC) in solar heat receivers employing encapsulated phase change materials (PCMs) that undergo cyclic melting and freezing. The present problem is relevant to space solar dynamic power systems that would typically operate in low-Earth-orbit (LEO). The solar heat receiver is integrated into a closed-cycle Brayton engine that produces electric power during sunlight and eclipse periods of the orbit cycle. The concepts of available power and virtual source temperature, both on a finite-time basis, are used as the basis for determining the SOC. Analytic expressions for the available power crossing the aperture plane of the receiver, available power stored in the receiver, and available power delivered to the working fluid are derived, all of which are related to the SOC through measurable parameters. Lower and upper bounds on the SOC are proposed in order to delineate absolute limiting cases for a range of input parameters (orbital, geometric, etc.). SOC characterization is also performed in the subcooled, two-phase, and superheat regimes. Finally, a previously developed physical and numerical model of the solar heat receiver component of NASA Lewis Research Center's Ground Test Demonstration (GTD) system is used in order to predict the SOC as a function of measurable parameters.

  6. A new numerical method for calculating extrema of received power for polarimetric SAR

    USGS Publications Warehouse

    Zhang, Y.; Zhang, Jiahua; Lu, Z.; Gong, W.

    2009-01-01

    A numerical method called cross-step iteration is proposed to calculate the maximal/minimal received power for polarized imagery based on a target's Kennaugh matrix. This method is much more efficient than the systematic method, which searches for the extrema of received power by varying the polarization ellipse angles of receiving and transmitting polarizations. It is also more advantageous than the Schuler method, which has been adopted by the PolSARPro package, because the cross-step iteration method requires less computation time and can derive both the maximal and minimal received powers, whereas the Schuler method is designed to work out only the maximal received power. The analytical model of received-power optimization indicates that the first eigenvalue of the Kennaugh matrix is the supremum of the maximal received power. The difference between these two parameters reflects the depolarization effect of the target's backscattering, which might be useful for target discrimination. © 2009 IEEE.
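
    A sketch of the alternating optimization the abstract describes, assuming a symmetric (monostatic) Kennaugh matrix; the example matrix below is arbitrary. Each step picks the fully polarized receive (or transmit) Stokes vector that maximizes received power with the other polarization held fixed:

```python
import numpy as np

def best_stokes(v):
    """Fully polarized Stokes vector g = [1, h], |h| = 1, maximizing g . v."""
    h = v[1:]
    n = np.linalg.norm(h)
    return np.concatenate(([1.0], h / n)) if n > 0 else np.array([1.0, 1.0, 0.0, 0.0])

def cross_step_max_power(K, n_iter=30):
    """Alternate receive/transmit optimization of P = 0.5 * g_r^T K g_t."""
    g_t = np.array([1.0, 1.0, 0.0, 0.0])   # initial transmit polarization
    for _ in range(n_iter):
        g_r = best_stokes(K @ g_t)         # optimize receive, transmit fixed
        g_t = best_stokes(K.T @ g_r)       # optimize transmit, receive fixed
    return 0.5 * g_r @ K @ g_t

# Arbitrary symmetric illustrative Kennaugh-like matrix.
K = np.array([[1.0, 0.2, 0.1, 0.0],
              [0.2, 0.5, 0.0, 0.0],
              [0.1, 0.0, 0.3, 0.0],
              [0.0, 0.0, 0.0, -0.2]])
p_max = cross_step_max_power(K)
print(f"max received power ~ {p_max:.3f}, "
      f"first-eigenvalue supremum {np.linalg.eigvalsh(K).max():.3f}")
```

    Each half-step can only increase the power, so the iterates converge; the converged value stays below the first eigenvalue of K, consistent with the supremum noted in the abstract.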

  7. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalizations and the normal forms of finite-dimensional complete analytically integrable dynamical systems. More precisely, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0), with B having no eigenvalues of modulus 1 and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. We also prove that any complete analytic integrable differential system x˙=Ax+f(x) in (C^n,0), with A having nonzero eigenvalues and f(x)=O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents a concrete expression of the normal form in a restricted case.

  8. Analytical performance of benchtop total reflection X-ray fluorescence instrumentation for multielemental analysis of wine samples

    NASA Astrophysics Data System (ADS)

    Dalipi, Rogerta; Marguí, Eva; Borgese, Laura; Bilo, Fabjola; Depero, Laura E.

    2016-06-01

    Recent technological improvements have led to a widespread adoption of benchtop total reflection X-ray fluorescence (TXRF) systems for the analysis of liquid samples. However, benchtop TXRF systems usually present limited sensitivity compared with high-scale instrumentation, which can restrict their application in some fields. The aim of the present work was to evaluate and compare the analytical capabilities of two TXRF systems, equipped with low-power Mo and W target X-ray tubes, for multielemental analysis of wine samples. Using the Mo-TXRF system, the detection limits for most elements were one order of magnitude lower than those attained using the W-TXRF system. For the detection of high-Z elements like Cd and Ag, however, W-TXRF remains a very good option due to the possibility of K-line detection. Accuracy and precision of the obtained results were evaluated by analyzing spiked real wine samples and comparing the TXRF results with those obtained by inductively coupled plasma optical emission spectroscopy (ICP-OES). In general, good agreement was obtained between ICP-OES and TXRF results for the analysis of both red and white wine samples, except for light elements (e.g., K), for which TXRF concentrations were underestimated. The analytical quality of TXRF results can be further improved if wine analysis is performed after dilution of the sample with de-ionized water.

  9. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  10. Design and evaluation of a high temperature/pressure supercritical carbon dioxide direct tubular receiver for concentrating solar power applications

    NASA Astrophysics Data System (ADS)

    Ortega, Jesus Daniel

    This work focuses on the development of a solar thermal receiver for a supercritical carbon dioxide (sCO2) Brayton power cycle producing ~1 MWe. Closed-loop sCO2 Brayton cycles are being evaluated in combination with concentrating solar power to provide higher thermal-to-electric conversion efficiencies relative to conventional steam Rankine cycles. High temperatures (923-973 K) and pressures (20-25 MPa) are required in the solar receiver to achieve thermal efficiencies of ~50%, making concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. In this study, the CSP receiver is required to achieve an outlet temperature of 923 K at 25 MPa or 973 K at 20 MPa to meet the operating needs. To select a compatible receiver tube material, an extensive material review was performed based on the ASME Boiler and Pressure Vessel Code and the ASME B31.1 and ASME B31.3 codes. Subsequently, a thermal-structural model was developed using commercial computational fluid dynamics (CFD) and structural mechanics software for designing and analyzing the tubular receiver that could provide the heat input for a ~2 MWth plant. These results were used to perform an analytical cumulative damage creep-fatigue analysis to estimate the working life of the tubes. Next, an optical-thermal-fluid model was developed to evaluate the resulting thermal efficiency of the tubular receiver under the NSTTF heliostat field. The ray-tracing tool SolTrace was used to obtain the heat-flux distribution on the surfaces of the receiver. The K-ω SST turbulence model and P-1 radiation model used in Fluent were coupled with SolTrace to provide the heat flux distribution on the receiver surface. The creep-fatigue analysis displays the damage accumulated due to the cycling and the permanent deformation of the tubes; nonetheless, the tubes are able to meet the required lifetime. The receiver surface temperatures were found to be within the safe
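
    A cumulative damage creep-fatigue assessment of the kind mentioned above is commonly bookkept with a Miner/Robinson-style linear damage sum. The sketch below uses that generic rule; all cycle counts, allowable lives, and rupture times are invented for illustration, not data from the dissertation:

```python
# Miner/Robinson-style cumulative creep-fatigue damage bookkeeping.
# All numbers below are invented for illustration.

def creep_fatigue_damage(cycles, hold_times):
    """Total damage D = sum(n_i / N_i) + sum(t_j / T_rj).

    cycles:     list of (n_applied, N_allowable) pairs for fatigue
    hold_times: list of (t_hours, T_rupture_hours) pairs for creep
    In this simplified rule the design is acceptable while D < 1.
    """
    fatigue = sum(n / N for n, N in cycles)
    creep = sum(t / T for t, T in hold_times)
    return fatigue + creep

D = creep_fatigue_damage(cycles=[(10_000, 50_000)],
                         hold_times=[(30_000, 100_000)])
print(D)  # 0.5 -> within the simplified D < 1 limit
```

    Code-based assessments (e.g. ASME's creep-fatigue interaction diagrams) use a more conservative bilinear envelope rather than a plain D < 1 criterion; the linear sum here is only the simplest form of the idea.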

  11. Discourse-Centric Learning Analytics: Mapping the Terrain

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    There is increasing interest in developing learning analytics techniques for the analysis and support of high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and proposing a definition for the field moving forward. It is our claim that DCLA…

  12. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  13. Direct analysis of ethylene glycol in human serum on the basis of analyte adduct formation and liquid chromatography-tandem mass spectrometry.

    PubMed

    Dziadosz, Marek

    2018-01-01

    The aim of this work was to develop a fast, cost-effective and time-saving liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical method for the analysis of ethylene glycol (EG) in human serum. For this purpose, the formation/fragmentation of an EG adduct ion with sodium and sodium acetate was applied in the positive electrospray mode for signal detection. Adduct identification was performed with appropriate infusion experiments based on analyte solutions prepared at different concentrations. Corresponding analyte adduct ions and adduct ion fragments could be identified both for EG and the deuterated internal standard (EG-D4). Protein precipitation was used as sample preparation. The analysis of the supernatant was performed with a Luna 5 μm C18 (2) 100 A, 150 mm × 2 mm analytical column and a mobile phase consisting of 95% A (H2O/methanol = 95/5, v/v) and 5% B (H2O/methanol = 3/97, v/v), both with 10 mmol L-1 ammonium acetate and 0.1% acetic acid. Method linearity was examined in the range of 100-4000 μg/mL and the calculated limit of detection/quantification was 35/98 μg/mL. However, on the basis of the signal-to-noise ratio, quantification was recommended at a limit of 300 μg/mL. Additionally, the examined precision, accuracy, stability, selectivity and matrix effect demonstrated that the method is a practicable alternative for EG quantification in human serum. In comparison to other methods based on liquid chromatography, the strategy presented made EG analysis without analyte derivatisation possible for the first time. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Laboratory Analytical Procedures | Bioenergy | NREL

    Science.gov Websites

    analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing

  15. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    PubMed

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Quality of care received and patient-reported regret in prostate cancer: Analysis of a population-based prospective cohort.

    PubMed

    Holmes, Jordan A; Bensen, Jeannette T; Mohler, James L; Song, Lixin; Mishel, Merle H; Chen, Ronald C

    2017-01-01

    Meeting quality of care standards in oncology is recognized as important by physicians, professional organizations, and payers. Data from a population-based cohort of patients with prostate cancer were used to examine whether receipt of care was consistent with published consensus metrics and whether receiving high-quality care was associated with less patient-reported treatment decisional regret. Patients with incident prostate cancer were enrolled in collaboration with the North Carolina Central Cancer Registry, with an oversampling of minority patients. Medical record abstraction was used to determine whether participants received high-quality care based on 5 standards: 1) discussion of all treatment options; 2) complete workup (prostate-specific antigen, Gleason grade, and clinical stage); 3) low-risk participants did not undergo a bone scan; 4) high-risk participants treated with radiotherapy (RT) received androgen deprivation therapy; and 5) participants treated with RT received conformal or intensity-modulated RT. Treatment decisional regret was assessed using a validated instrument. A total of 804 participants were analyzed. Overall, 66% of African American and 73% of white participants received care that met all standards (P = .03); this racial difference was confirmed by multivariable analysis. Care that included "discussion of all treatment options" was found to be associated with less patient-reported regret on univariable analysis (P = .03) and multivariable analysis (odds ratio, 0.59; 95% confidence interval, 0.37-0.95). The majority of participants received high-quality care, but racial disparity existed. Participants who discussed all treatment options appeared to have less treatment decisional regret. To the authors' knowledge, this is the first study to demonstrate an association between a quality of care metric and patient-reported outcome. Cancer 2017;138-143. © 2016 American Cancer Society.

  17. An Examination of Advisor Concerns in the Era of Academic Analytics

    ERIC Educational Resources Information Center

    Daughtry, Jeremy J.

    2017-01-01

    Performance-based funding models are increasingly becoming the norm for many institutions of higher learning. Such models place greater emphasis on student retention and success metrics, for example, as requirements for receiving state appropriations. To stay competitive, universities have adopted academic analytics technologies capable of…

  18. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  19. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation affords a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded from this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.

  20. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.

  1. Utility of the summation chromatographic peak integration function to avoid manual reintegrations in the analysis of targeted analytes

    USDA-ARS?s Scientific Manuscript database

    As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...

  2. Analysis of Solar Receiver Flux Distributions for US/Russian Solar Dynamic System Demonstration on the MIR Space Station

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Analyses have been performed at the NASA Lewis Research Center's Power Systems Project Office to support the design and development of the joint U.S./Russian Solar Dynamic Flight Demonstration Project. The optical analysis of the concentrator and solar flux predictions on target receiver surfaces have an important influence on receiver design and control of the Brayton engine.

  3. A Model for Developing Clinical Analytics Capacity: Closing the Loops on Outcomes to Optimize Quality.

    PubMed

    Eggert, Corinne; Moselle, Kenneth; Protti, Denis; Sanders, Dale

    2017-01-01

    Closed Loop Analytics© is receiving growing interest in healthcare as a term referring to information technology, local data and clinical analytics working together to generate evidence for improvement. The Closed Loop Analytics model consists of three loops corresponding to the decision-making levels of an organization and the associated data within each loop - Patients, Protocols, and Populations. The authors propose that each of these levels should utilize the same ecosystem of electronic health record (EHR) and enterprise data warehouse (EDW) enabled data, in a closed-loop fashion, with that data being repackaged and delivered to suit the analytic and decision support needs of each level, in support of better outcomes.

  4. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    suitable to visual analytics. A history of analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques, for readers from different disciplines. Understanding these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently, the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, their outcomes will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term “users” in almost all situations, there is a wide variety of users who all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and the researchers and developers.

  5. Analytical Characterization on Pulse Propagation in a Semiconductor Optical Amplifier Based on Homotopy Analysis Method

    NASA Astrophysics Data System (ADS)

    Jia, Xiaofei

    2018-06-01

    Starting from the basic equations describing the evolution of the carriers and photons inside a semiconductor optical amplifier (SOA), the equation governing pulse propagation in the SOA is derived. By employing homotopy analysis method (HAM), a series solution for the output pulse by the SOA is obtained, which can effectively characterize the temporal features of the nonlinear process during the pulse propagation inside the SOA. Moreover, the analytical solution is compared with numerical simulations with a good agreement. The theoretical results will benefit the future analysis of other problems related to the pulse propagation in the SOA.

  6. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]
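
    The MapReduce approach to storage-based computation mentioned above can be illustrated with a toy example. The records, variable names, and reducer below are hypothetical stand-ins for illustration only, not the MERRA/AS or vCDS implementation.

```python
from functools import reduce
from collections import defaultdict

# Toy (variable, value) records standing in for reanalysis data points.
records = [("T2M", 288.1), ("T2M", 290.4), ("PS", 1012.0), ("PS", 1009.5)]

def mapper(record):
    var, value = record
    return (var, (value, 1))            # emit (key, (partial sum, count))

def reducer(acc, kv):
    key, (s, c) = kv
    total, count = acc[key]
    acc[key] = (total + s, count + c)   # combine partial sums per key
    return acc

partials = map(mapper, records)
combined = reduce(reducer, partials, defaultdict(lambda: (0.0, 0)))
means = {k: s / c for k, (s, c) in combined.items()}   # per-variable mean
```

    In a real storage-based deployment the map phase runs where the data lives and only the small (sum, count) pairs move across the network, which is what makes the pattern attractive for large reanalysis collections.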

  7. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  8. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  9. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenyl ethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  10. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  11. Electron-cyclotron absorption in high-temperature plasmas: quasi-exact analytical evaluation and comparative numerical analysis

    NASA Astrophysics Data System (ADS)

    Albajar, F.; Bertelli, N.; Bornatici, M.; Engelmann, F.

    2007-01-01

    On the basis of the electromagnetic energy balance equation, a quasi-exact analytical evaluation of the electron-cyclotron (EC) absorption coefficient is performed for arbitrary propagation (with respect to the magnetic field) in a (Maxwellian) magneto-plasma for the temperature range of interest for fusion reactors (in which EC radiation losses tend to be important in the plasma power balance). The calculation makes use of Bateman's expansion for the product of two Bessel functions, retaining the lowest-order contribution. The integration over electron momentum can then be carried out analytically, fully accounting for finite Larmor radius effects in this approximation. On the basis of the analytical expressions for the EC absorption coefficients of both the extraordinary and ordinary modes thus obtained, (i) for the case of perpendicular propagation simple formulae are derived for both modes and (ii) a numerical analysis of the angular distribution of EC absorption is carried out. An assessment of the accuracy of asymptotic expressions that have been given earlier is also performed, showing that these approximations can be usefully applied for calculating EC power losses from reactor-grade plasmas. Presented in part at the 14th Joint Workshop on Electron Cyclotron Emission and Electron Cyclotron Resonance Heating, Santorini, Greece, 9-12 May 2006.

  12. Receivers

    NASA Technical Reports Server (NTRS)

    Donnelly, H.

    1983-01-01

    Before discussing Deep Space Network receivers, a brief description of the functions of receivers and how they interface with other elements of the Network is presented. Different types of receivers are used in the Network for various purposes. The principal receiver type is used for telemetry and tracking. This receiver provides the capability, with other elements of the Network, to track the space probe utilizing Doppler and range measurements, and to receive telemetry, including both scientific data from the onboard experiments and engineering data pertaining to the health of the probe. Another type of receiver is used for radio science applications. This receiver measures phase perturbations on the carrier signal to obtain information on the composition of solar and planetary atmospheres and interplanetary space. A third type of receiver utilizes very long baseline interferometry (VLBI) techniques for both radio science and spacecraft navigation data. Only the telemetry receiver is described in detail in this document. The integration of the Receiver-Exciter subsystem with other portions of the Deep Space Network is described.

  13. 40 CFR 141.25 - Analytical methods for radioactivity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 24 2012-07-01 2012-07-01 false Analytical methods for radioactivity... § 141.25 Analytical methods for radioactivity. (a) Analysis for the following contaminants shall be conducted to determine compliance with § 141.66 (radioactivity) in accordance with the methods in the...

  14. Patient perspectives on care received at community acupuncture clinics: a qualitative thematic analysis.

    PubMed

    Tippens, Kimberly M; Chao, Maria T; Connelly, Erin; Locke, Adrianna

    2013-10-29

    Community acupuncture is a recent innovation in acupuncture service delivery in the U.S. that aims to improve access to care through low-cost treatments in group-based settings. Patients at community acupuncture clinics represent a broader socioeconomic spectrum and receive more frequent treatments compared to acupuncture users nationwide. As a relatively new model of acupuncture in the U.S., little is known about the experiences of patients at community acupuncture clinics and whether quality of care is compromised through this high-volume model. The aim of this study was to assess patients' perspectives on the care received through community acupuncture clinics. The investigators conducted qualitative, thematic analysis of written comments from an observational, cross-sectional survey of clients of the Working Class Acupuncture clinics in Portland, Oregon. The survey included an open-ended question for respondents to share comments about their experiences with community acupuncture. Comments were received from 265 community acupuncture patients. Qualitative analysis of written comments identified two primary themes that elucidate patients' perspectives on quality of care: 1) aspects of health care delivery unique to community acupuncture, and 2) patient engagement in health care. Patients identified unique aspects of community acupuncture, including structures that facilitate access, processes that make treatments more comfortable and effective and holistic outcomes including physical improvements, enhanced quality of life, and empowerment. The group setting, community-based locations, and low cost were highlighted as aspects of this model that allow patients to access acupuncture. Patients' perspectives on the values and experiences unique to community acupuncture offer insights on the quality of care received in these settings. The group setting, community-based locations, and low cost of this model potentially reduce access barriers for those who might not

  15. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements on analytical testing in early clinical drug development using a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  16. Receiver function HV ratio: a new measurement for reducing non-uniqueness of receiver function waveform inversion

    NASA Astrophysics Data System (ADS)

    Chong, Jiajun; Chu, Risheng; Ni, Sidao; Meng, Qingjun; Guo, Aizhi

    2018-02-01

    It is known that a receiver function places relatively weak constraints on absolute seismic-wave velocity, and that joint inversion of the receiver function with surface wave dispersion has been widely applied to reduce the trade-off of velocity with interface depth. However, some studies indicate that the receiver function itself is capable of determining the absolute shear-wave velocity. In this study, we propose to measure the receiver function HV ratio, which takes advantage of the amplitude information of the receiver function to constrain the shear-wave velocity. Numerical analysis indicates that the receiver function HV ratio is sensitive to the average shear-wave velocity in the depth range it samples, and can help to reduce the non-uniqueness of receiver function waveform inversion. A joint inversion scheme has been developed, and both synthetic tests and a real-data application demonstrated the feasibility of the joint inversion.
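
    As a rough illustration of an amplitude-ratio measurement of this kind, the sketch below computes an RMS radial-to-vertical ratio over a window around a synthetic direct arrival. The traces, window bounds, and ratio definition are illustrative assumptions, not the authors' exact HV-ratio measurement.

```python
import math

# Synthetic two-component record: a Gaussian pulse standing in for the
# direct P arrival, with a fraction of its energy on the radial trace.
dt = 0.1                                   # sample interval (s), hypothetical
t = [i * dt for i in range(300)]           # 30 s of samples

def gauss_pulse(x, centre, width):
    return math.exp(-0.5 * ((x - centre) / width) ** 2)

vertical = [gauss_pulse(x, 5.0, 0.8) for x in t]
radial = [0.4 * v for v in vertical]       # weaker radial-component energy

# Indices of a window around the direct arrival (3-7 s).
window = [i for i, x in enumerate(t) if 3.0 <= x <= 7.0]

def rms(trace, idx):
    return math.sqrt(sum(trace[i] ** 2 for i in idx) / len(idx))

hv_ratio = rms(radial, window) / rms(vertical, window)
```

    Because the ratio uses amplitudes rather than arrival times alone, it carries information about the near-receiver velocity structure that travel-time-based measurements discard.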

  17. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  18. Design and Analysis of the Aperture Shield Assembly for a Space Solar Receiver

    NASA Technical Reports Server (NTRS)

    Strumpf, Hal J.; Trinh, Tuan; Westelaken, William; Krystkowiak, Christopher; Avanessian, Vahe; Kerslake, Thomas W.

    1997-01-01

    A joint U.S./Russia program has been conducted to design, develop, fabricate, launch, and operate the world's first space solar dynamic power system on the Russian Space Station Mir. The goal of the program was to demonstrate and confirm that solar dynamic power systems are viable for future space applications such as the International Space Station (ISS). The major components of the system include a solar receiver, a closed Brayton cycle power conversion unit, a power conditioning and control unit, a solar concentrator, a radiator, a thermal control system, and a Space Shuttle carrier. Unfortunately, the mission was demanifested from the ISS Phase 1 Space Shuttle Program in 1996. However, NASA Lewis is proposing to use the fabricated flight hardware as part of an all-American flight demonstration on the ISS in 2002. The present paper concerns the design and analysis of the solar receiver aperture shield assembly. The aperture shield assembly comprises the front face of the cylindrical receiver and is located at the focal plane of the solar concentrator. The aperture shield assembly is a critical component that protects the solar receiver structure from highly concentrated solar fluxes during concentrator off-pointing events. A full-size aperture shield assembly was fabricated. This unit was essentially identical to the flight configuration, with the exception of materials substitution. In addition, a thermal shock test aperture shield assembly was fabricated. This test article utilized the flight materials and was used for high-flux testing in the solar simulator test rig at NASA Lewis. This testing is described in a companion paper.

  19. Identification and analysis of factors affecting thermal shock resistance of ceramic materials in solar receivers

    NASA Technical Reports Server (NTRS)

    Hasselman, D. P. H.; Singh, J. P.; Satyamurthy, K.

    1980-01-01

    An analysis was conducted of the possible modes of thermal stress failure of brittle ceramics for potential use in point-focussing solar receivers. The pertinent materials properties which control thermal stress resistance were identified for conditions of steady-state and transient heat flow, convective and radiative heat transfer, thermal buckling and thermal fatigue as well as catastrophic crack propagation. Selection rules for materials with optimum thermal stress resistance for a particular thermal environment were identified. Recommendations for materials for particular components were made. The general requirements for a thermal shock testing program quantitatively meaningful for point-focussing solar receivers were outlined. Recommendations for follow-on theoretical analyses were made.

  20. Environmental Stewardship: A Conceptual Review and Analytical Framework.

    PubMed

    Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  1. Environmental Stewardship: A Conceptual Review and Analytical Framework

    NASA Astrophysics Data System (ADS)

    Bennett, Nathan J.; Whitty, Tara S.; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H.

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  2. Preliminary Analysis of Fluctuations in the Received Uplink-Beacon-Power Data Obtained From the GOLD Experiments

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Wilson, K. E.; Lesh, J. R.

    1996-01-01

    Uplink data from recent free-space optical communication experiments carried out between the Table Mountain Facility and the Japanese Engineering Test Satellite are used to study fluctuations caused by beam propagation through the atmosphere. The influence of atmospheric scintillation, beam wander and jitter, and multiple uplink beams on the statistics of power received by the satellite is analyzed and compared to experimental data. Preliminary analysis indicates the received signal obeys an approximate lognormal distribution, as predicted by the weak-turbulence model, but further characterization of other sources of fluctuations is necessary for accurate link predictions.
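
    The weak-turbulence lognormal model referred to above can be sanity-checked numerically: if the log-amplitude fluctuation is Gaussian, the received power is lognormal, so its logarithm should again look Gaussian. The parameter values below are illustrative, not fitted to the GOLD data.

```python
import math
import random
import statistics

# Weak-turbulence sketch: log-amplitude chi ~ N(0, sigma_chi), so the
# received power I = I0 * exp(2*chi) is lognormally distributed.
random.seed(42)
I0 = 1.0               # mean-free received power, hypothetical units
sigma_chi = 0.1        # log-amplitude standard deviation (weak turbulence)

samples = [I0 * math.exp(2.0 * random.gauss(0.0, sigma_chi))
           for _ in range(20000)]

# If I is lognormal, ln(I) should be Gaussian with stdev 2*sigma_chi.
log_I = [math.log(s) for s in samples]
mu_hat = statistics.fmean(log_I)
sigma_hat = statistics.stdev(log_I)
```

    Fitting the empirical distribution of ln(I) against a normal with these estimated parameters is the usual way such "approximate lognormal" claims are checked against experimental power records.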

  3. Preliminary analysis of fluctuations in the received uplink-beacon-power data obtained from the GOLD experiments

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Wilson, K. E.; Lesh, J. R.

    1996-01-01

    Uplink data from recent free-space optical communication experiments carried out between the Table Mountain Facility and the Japanese Engineering Test Satellite are used to study fluctuations caused by beam propagation through the atmosphere. The influence of atmospheric scintillation, beam wander and jitter, and multiple uplink beams on the statistics of power received by the satellite is analyzed and compared to experimental data. Preliminary analysis indicates the received signal obeys an approximate lognormal distribution, as predicted by the weak-turbulence model, but further characterization of other sources of fluctuations is necessary for accurate link predictions.

  4. Telemetry Tests Of The Advanced Receiver II

    NASA Technical Reports Server (NTRS)

    Hinedi, Sami M.; Bevan, Roland P.; Marina, Miguel

    1993-01-01

    Report describes telemetry tests of the Advanced Receiver II (ARX-II): a digital radio receiving subsystem that operates on the intermediate-frequency output of another receiving subsystem called the "multimission receiver" (MMR), detects carrier, subcarrier, and data-symbol signals transmitted by spacecraft, and extracts Doppler information from those signals. Analysis of the data shows the performance of the MMR/ARX-II system to be comparable, and sometimes superior, to that of the Blk-III/BPA and Blk-III/SDA/SSA systems.

  5. Effect of Pointing Error on the BER Performance of an Optical CDMA FSO Link with SIK Receiver

    NASA Astrophysics Data System (ADS)

    Nazrul Islam, A. K. M.; Majumder, S. P.

    2017-12-01

    An analytical approach is presented for an optical code division multiple access (OCDMA) system over a free space optical (FSO) channel considering the effect of pointing error between the transmitter and the receiver. Analysis is carried out with an optical sequence inverse keying (SIK) correlator receiver with intensity modulation and direct detection (IM/DD) to find the bit error rate (BER) with pointing error. The results are evaluated numerically in terms of signal-to-noise plus multi-access interference (MAI) ratio, BER and power penalty due to pointing error. It is noticed that the OCDMA FSO system is highly affected by pointing error, with significant power penalty at a BER of 10^-6 and 10^-9. For example, the penalty at a BER of 10^-9 is found to be 9 dB corresponding to a normalized pointing error of 1.4 for 16 users with a processing gain of 256 and is reduced to 6.9 dB when the processing gain is increased to 1,024.
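
    The power-penalty arithmetic in such link analyses follows from the Q-function relationship between BER and SNR. The sketch below uses the generic antipodal-signalling form BER = Q(sqrt(SNR)) as an assumed stand-in; the paper's OCDMA/SIK expression with MAI and pointing error is more involved.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def required_snr_db(target_ber):
    # Invert BER = Q(x) by bisection (Q is monotonically decreasing),
    # then convert SNR = x^2 to decibels.
    lo, hi = 0.0, 20.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if q_func(mid) > target_ber:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    return 10.0 * math.log10(x * x)

# Extra SNR needed to move from a 10^-6 to a 10^-9 BER target.
penalty_db = required_snr_db(1e-9) - required_snr_db(1e-6)
```

    A pointing-error penalty is computed the same way: the required SNR at the target BER with impairment, minus the required SNR without it, expressed in dB.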

  6. Sensitivity, Specificity, and Receiver Operating Characteristics: A Primer for Neuroscience Nurses.

    PubMed

    McNett, Molly; Amato, Shelly; Olson, DaiWai M

    2017-04-01

    It is important for neuroscience nurses to have a solid understanding of the instruments they use in clinical practice. Specifically, when reviewing reports of research instruments, nurses should be knowledgeable of analytical terms when determining the applicability of instruments for use in clinical practice. The purpose of this article is to review 3 such analytical terms: sensitivity, specificity, and receiver operating characteristic curves. Examples of how these terms are used in the neuroscience literature highlight the relevance of these terms to neuroscience nursing practice. As the role of the nurse continues to expand, it is important not to simply accept all instruments as valid but to be able to critically evaluate their properties for applicability to nursing practice and evidence-based care of our patients.
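    The three terms reviewed here are easy to make concrete with a toy calculation (the instrument scores and labels below are invented for illustration):

    ```python
    # Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
    def sens_spec(y_true, y_pred):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        return tp / (tp + fn), tn / (tn + fp)

    # An ROC curve sweeps the decision threshold and plots
    # (1 - specificity, sensitivity) at each cut-off.
    scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
    labels = [1,   1,   0,   1,   0,   1,   0,   0]
    for thr in (0.5, 0.25):
        pred = [1 if s >= thr else 0 for s in scores]
        se, sp = sens_spec(labels, pred)
        print(thr, se, sp)   # 0.5 → 0.75 0.75; 0.25 → 1.0 0.5
    ```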

  7. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  8. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

    A new era is dawning where vast amount of data is subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good performance at run-time to data analytics workload is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.

  9. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    Limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive and time consuming but also require maintenance and replacement of damaged essential parts, which is a serious concern. Moreover, for field studies and instant detection, installation of these instruments at every site is not practical. Therefore, a technique such as pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, inexpensive, and time efficient; it is easy to carry (a 10 g - 20 g vial) to the experimental field/site, as has been demonstrated.

  10. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.

  11. Learning Analytics: Potential for Enhancing School Library Programs

    ERIC Educational Resources Information Center

    Boulden, Danielle Cadieux

    2015-01-01

    Learning analytics has been defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. The potential use of data and learning analytics in educational contexts has caught the attention of educators and…

  12. Analytical formulation of directly modulated OOFDM signals transmitted over an IM/DD dispersive link.

    PubMed

    Sánchez, C; Ortega, B; Wei, J L; Tang, J; Capmany, J

    2013-03-25

    We provide an analytical study on the propagation effects of a directly modulated OOFDM signal through a dispersive fiber and subsequent photo-detection. The analysis includes the effects of the laser operation point and the interplay between chromatic dispersion and laser chirp. The final expression allows us to understand the physics behind the transmission of a multi-carrier signal in the presence of residual frequency modulation, and the description of the induced intermodulation distortion gives us a detailed insight into the different intermodulation products which impair the recovered signal at the receiver end. Numerical comparisons between transmission simulation results and those provided by evaluating the obtained expression are carried out for different laser operation points. Results obtained by changing the fiber length, laser parameters and using single mode fiber with negative and positive dispersion are calculated in order to demonstrate the validity and versatility of the theory provided in this paper. Therefore, a novel analytical formulation is presented as a versatile tool for the description and study of IM/DD OOFDM systems with variable design parameters.

  13. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as Pubmed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  14. Micro-optics for microfluidic analytical applications.

    PubMed

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  15. Compensation for matrix effects in the gas chromatography-mass spectrometry analysis of 186 pesticides in tea matrices using analyte protectants.

    PubMed

    Li, Yan; Chen, Xi; Fan, Chunlin; Pang, Guofang

    2012-11-30

    A gas chromatography-mass spectrometry (GC-MS) analytical method was developed for simultaneously determining 186 pesticides in tea matrices using analyte protectants to counteract the matrix-induced effect. The matrix effects were evaluated for green, oolong and black tea, representing unfermented, partially fermented and completely fermented teas respectively; depending on the type of tea, 72%, 94% and 94% of the pesticides presented a strong response enhancement effect. Several analyte protectants as well as certain combinations of these protectants were evaluated to check their compensation effects. A mixture of triglycerol and d-ribonic acid-γ-lactone (both at 2 mg/mL in the injected samples) was found to be the most effective in improving the chromatographic behavior of the 186 pesticides. More than 96% of the 186 pesticides achieved recoveries within the range of 70-120% when using the selected mixture of analyte protectants. The simple addition of analyte protectants offers a more convenient solution to overcome matrix effects, results in fewer active sites compared to matrix-matched standardization and can be an effective approach to compensate for matrix effects in the GC-MS analysis of pesticide residues. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Boundary and analytic attitude: reflections on a summer holiday break.

    PubMed

    Wright, Susanna

    2016-06-01

    The effect of a boundary in analytic work at the summer holiday break is discussed in relation to archetypal experiences of exclusion, loss and limitation. Some attempts by patients to mitigate an analyst's act of separation are reviewed as enactments, and in particular the meanings of a gift made by one patient. Analytic attitude towards enactment from within different schools of practice is sketched, with reference to the effect on the analyst of departing from the received practice of their own allegiance. A theory is adumbrated that the discomfort of 'contravening the rules' has a useful effect in sparking the analyst into consciousness, with greater attention to salient features in an individual case. Interpretation as an enactment is briefly considered, along with the possible effects of containing the discomfort of a patient's enactment in contrast to confronting it with interpretation. © 2016, The Society of Analytical Psychology.

  17. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Irwin, S. H.; NELSON; Roleyni, G.

    1977-01-01

    Optimal design studies of MLS angle-receivers and a theoretical design-study of MLS DME-receivers are reported. The angle-receiver results include an integration of the scan data processor and tracking filter components of the optimal receiver into a unified structure. An extensive simulation study comparing the performance of the optimal and threshold receivers in a wide variety of representative dynamical interference environments was made. The optimal receiver was generally superior. A simulation of the performance of the threshold and delay-and-compare receivers in various signal environments was performed. An analysis of combined errors due to lateral reflections from vertical structures with small differential path delays, specular ground reflections with negligible differential path delays, and thermal noise in the receivers is provided.

  18. Tank 241-T-204, core 188 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L.

    TANK 241-T-204, CORE 188, ANALYTICAL RESULTS FOR THE FINAL REPORT. This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  19. Design of radar receivers

    NASA Astrophysics Data System (ADS)

    Sokolov, M. A.

    This handbook treats the design and analysis of pulsed radar receivers, with emphasis on elements (especially IC elements) that implement optimal and suboptimal algorithms. The design methodology is developed from the viewpoint of statistical communications theory. Particular consideration is given to the synthesis of single-channel and multichannel detectors, the design of analog and digital signal-processing devices, and the analysis of IF amplifiers.

  20. The Case for Adopting Server-side Analytics

    NASA Astrophysics Data System (ADS)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be much lower in volume, easier for the user to transport and store locally, and easier to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is often required with these data sets and that drives much of the throughput challenge. NASA's Big Data Task Force studied this issue. This paper will present the results of this study including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for

  1. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  2. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  3. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  4. Conversion of multiple analyte cation types to a single analyte anion type via ion/ion charge inversion.

    PubMed

    Hassell, Kerry M; LeBlanc, Yves; McLuckey, Scott A

    2009-11-01

    Charge inversion ion/ion reactions can convert several cation types associated with a single analyte molecule to a single anion type for subsequent mass analysis. Specifically, analyte ions present with one of a variety of cationizing agents, such as an excess proton, excess sodium ion, or excess potassium ion, can all be converted to the deprotonated molecule, provided that a stable anion can be generated for the analyte. Multiply deprotonated species that are capable of exchanging a proton for a metal ion serve as the reagent anions for the reaction. This process is demonstrated here for warfarin and for a glutathione conjugate. Examples for several other glutathione conjugates are provided as supplementary material to demonstrate the generality of the reaction. In the case of glutathione conjugates, multiple metal ions can be associated with the singly-charged analyte due to the presence of two carboxylate groups. The charge inversion reaction involves the removal of the excess cationizing agent, as well as any metal ions associated with anionic groups to yield a singly deprotonated analyte molecule. The ability to convert multiple cation types to a single anion type is analytically desirable in cases in which the analyte signal is distributed among several cation types, as is common in the electrospray ionization of solutions with relatively high salt contents. For analyte species that undergo efficient charge inversion, such as glutathione conjugates, there is the additional potential advantage for significantly improved signal-to-noise ratios when species that give rise to 'chemical noise' in the positive ion spectrum do not undergo efficient charge inversion.

  5. The Analytic Onion: Examining Training Issues from Different Levels of Analysis. Interim Technical Paper for Period July 1989-June 1991.

    ERIC Educational Resources Information Center

    Lamb, Theodore A.; Chin, Keric B. O.

    This paper proposes a conceptual framework based on different levels of analysis using the metaphor of the layers of an onion to help organize and structure thinking on research issues concerning training. It discusses the core of the "analytic onion," the biological level, and seven levels of analysis that surround that core: the individual, the…

  6. Psychotropic Polypharmacy Among Youths With Serious Emotional and Behavioral Disorders Receiving Coordinated Care Services.

    PubMed

    Wu, Benjamin; Bruns, Eric J; Tai, Ming-Hui; Lee, Bethany R; Raghavan, Ramesh; dosReis, Susan

    2018-06-01

    The study examined differences in psychotropic polypharmacy among youths with serious emotional and behavioral disorders who received coordinated care services (CCS) that used a wraparound model and a matched sample of youths who received traditional services. A quasi-experimental design compared psychotropic polypharmacy one year before and one year after discharge from CCS. The cohort was youths with serious emotional and behavioral disorders who were enrolled in CCS from December 2009 through May 2014. The comparison group was youths with serious emotional and behavioral disorders who received outpatient mental health services during the same time. Administrative data from Medicaid, child welfare, and juvenile justice services were used. A difference-in-difference analysis with propensity score matching evaluated the CCS intervention by time effect on psychotropic polypharmacy. In both groups, most youths were male, black, and 10-18 years old, with attention-deficit hyperactivity disorder (54%-55%), mood disorder (39%-42%), depression (26%-27%), and bipolar disorder (25%-26%). About half of each group was taking an antipsychotic. The percentage reduction in polypharmacy from one year before CCS enrollment to one year after discharge was 28% for the CCS group and 29% for the non-CCS group, a nonsignificant difference. CCS youths excluded from the analysis had more complex mental health needs and a greater change in polypharmacy than the CCS youths who were included in the analytic sample. Mental health care coordination had limited impact in reducing psychotropic polypharmacy for youths with less complex mental health needs. Further research is needed to evaluate the effect on psychotropic polypharmacy among youths with the greatest mental health needs.
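    The difference-in-difference logic itself is simple enough to sketch with invented rates (the numbers below are illustrative and are not the study's data):

    ```python
    # Hypothetical fractions of youths on psychotropic polypharmacy,
    # one year before enrollment and one year after discharge.
    pre_ccs, post_ccs = 0.50, 0.36   # CCS (wraparound) group
    pre_cmp, post_cmp = 0.52, 0.37   # matched comparison group

    # DiD = (change in treated group) - (change in comparison group);
    # a value near zero mirrors a nonsignificant intervention effect.
    did = (post_ccs - pre_ccs) - (post_cmp - pre_cmp)
    print(round(did, 2))  # → 0.01
    ```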

  7. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).

  8. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  9. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    NASA Astrophysics Data System (ADS)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular elastic constant-stress elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  10. The analytical and numerical approaches to the theory of the Moon's librations: Modern analysis and results

    NASA Astrophysics Data System (ADS)

    Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.

    2017-11-01

    Observing physical librations of celestial bodies and the Moon represents one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying the lunar physical librations (LPhL) data. In this article LPhL simulation methods of assessing viscoelastic and dissipative properties of the lunar body and lunar core parameters, whose existence has been recently confirmed during the seismic data reprocessing of the "Apollo" space mission, are described. Much attention is paid to physical interpretation of the free librations phenomenon and the methods for its determination. In the paper the practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. In the paper an efficiency analysis of two approaches to LPhL theory is conducted: the numerical and the analytical ones. It has been shown that in lunar investigation both approaches complement each other in various aspects: the numerical approach provides high accuracy of the theory, which is required for the proper processing of modern observations, while the analytical approach allows one to comprehend the essence of the phenomena in the lunar rotation, and to predict and interpret new effects in the observations of lunar body and lunar core parameters.

  11. Tank 241-AP-106, Grab samples, 6AP-98-1, 6AP-98-2 and 6AP-98-3 Analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FULLER, R.K.

    1999-02-23

    This document is the final report for tank 241-AP-106 grab samples. Three grab samples, 6AP-98-1, 6AP-98-2, and 6AP-98-3, were taken from riser 1 of tank 241-AP-106 on May 28, 1998 and received by the 222-S Laboratory on May 28, 1998. Analyses were performed in accordance with the "Compatibility Grab Sampling and Analysis Plan" (TSAP) (Sasaki, 1998) and the "Data Quality Objectives for Tank Farms Waste Compatibility Program" (DQO). The analytical results are presented in the data summary report. No notification limits were exceeded. The request for sample analysis received for AP-106 indicated that the samples were polychlorinated biphenyl (PCB) suspects. The results of this analysis indicated that no PCBs were present at the Toxic Substances Control Act (TSCA) regulated limit of 50 ppm. The results and raw data for the PCB analysis are included in this document.

  12. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  13. Analytical Chemistry Division annual progress report for period ending November 30, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1978-03-01

    Activities for the year are summarized in sections on analytical methodology, mass and mass emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)

  14. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

    Background Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, making it impossible to compare them and reliably identify the paint binder. Results This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of

  15. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    PubMed

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, making it impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that only contain a minor saccharide fraction. 
This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  16. Analytically solvable chaotic oscillator based on a first-order filter.

    PubMed

    Corron, Ned J; Cooper, Roy M; Blakely, Jonathan N

    2016-02-01

    A chaotic hybrid dynamical system is introduced and its analytic solution is derived. The system is described as an unstable first-order filter subject to occasional switching of a set point according to a feedback rule. The system qualitatively differs from other recently studied solvable chaotic hybrid systems in that the timing of the switching is regulated by an external clock. The chaotic analytic solution is an optimal waveform for communications in noise when a resistor-capacitor-integrate-and-dump filter is used as a receiver. As such, these results provide evidence in support of a recent conjecture that the optimal communication waveform for any stable infinite-impulse response filter is chaotic.
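    The clocked switching described above can be illustrated with a minimal numerical sketch. This is an assumed system of the same qualitative type as in the abstract, not the authors' implementation: between clock ticks the unstable filter obeys du/dt = β(u − s) with β > 0, and at each tick the set point s is reset to the sign of the state, which gives an exactly iterable return map.

    ```python
    import math

    def clocked_filter_map(u0, beta=0.5, n_steps=1000):
        """Iterate the clock-sampled state of an unstable first-order filter.

        Between ticks the filter obeys du/dt = beta * (u - s), so one clock
        period maps u to s + (u - s) * exp(beta).  At each tick the set point
        s switches to sign(u).  (Illustrative sketch with assumed parameters.)
        """
        a = math.exp(beta)                  # per-tick growth factor, 1 < a < 2
        u = u0
        orbit = [u]
        for _ in range(n_steps):
            s = 1.0 if u >= 0.0 else -1.0   # feedback rule: set point follows sign
            u = s + (u - s) * a             # exact solution over one clock period
            orbit.append(u)
        return orbit

    orbit = clocked_filter_map(1.0 / math.pi)
    # Despite the unstable filter (beta > 0), switching keeps the orbit in [-1, 1]:
    print(max(abs(u) for u in orbit))
    ```

    With growth factor e^β ≈ 1.65 the piecewise-linear return map is expanding, so nearby orbits separate, while the switching rule folds every orbit back into the interval [−1, 1].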

  17. Analytically solvable chaotic oscillator based on a first-order filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corron, Ned J.; Cooper, Roy M.; Blakely, Jonathan N.

    2016-02-15

    A chaotic hybrid dynamical system is introduced and its analytic solution is derived. The system is described as an unstable first-order filter subject to occasional switching of a set point according to a feedback rule. The system qualitatively differs from other recently studied solvable chaotic hybrid systems in that the timing of the switching is regulated by an external clock. The chaotic analytic solution is an optimal waveform for communications in noise when a resistor-capacitor-integrate-and-dump filter is used as a receiver. As such, these results provide evidence in support of a recent conjecture that the optimal communication waveform for any stable infinite-impulse response filter is chaotic.

  18. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects according to their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present paper, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data in order to evaluate the analytical performance taking into account all indicators simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently, (2) a "distance" to the reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
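    The ordinal comparison described above can be sketched in a few lines (a minimal illustration with invented laboratory profiles, not the authors' data): each laboratory is reduced to an indicator profile, here |mean − reference|, standard deviation, and |skewness|, and one laboratory dominates another only if it is at least as good on every indicator and strictly better on at least one; otherwise the two are incomparable.

    ```python
    def dominates(p, q):
        """True if profile p is at least as good as q (smaller is better) on
        every indicator and strictly better on at least one."""
        return all(a <= b for a, b in zip(p, q)) and \
               any(a < b for a, b in zip(p, q))

    # Hypothetical profiles: (|mean - reference|, standard deviation, |skewness|)
    labs = {
        "Lab1": (0.10, 0.05, 0.2),
        "Lab2": (0.30, 0.08, 0.5),
        "Lab3": (0.05, 0.09, 0.1),
    }

    # All strict dominance relations; pairs not listed are incomparable.
    order = [(a, b) for a in labs for b in labs
             if a != b and dominates(labs[a], labs[b])]
    print(order)   # Lab1 dominates Lab2; Lab1 and Lab3 are incomparable
    ```

    Note that no single linear score is computed: Lab1 and Lab3 remain incomparable because each is better on a different indicator, which is exactly the information a linear scale would hide.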

  19. Microplasmas for chemical analysis: analytical tools or research toys?

    NASA Astrophysics Data System (ADS)

    Karanassios, Vassili

    2004-07-01

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between "liquid" electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular, into a microplasma device (MPD), battery operation of a MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state-of-the-art of analytical microplasma research is provided.

  20. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  1. An Item Gains and Losses Analysis of False Memories Suggests Critical Items Receive More Item-Specific Processing than List Items

    ERIC Educational Resources Information Center

    Burns, Daniel J.; Martens, Nicholas J.; Bertoni, Alicia A.; Sweeney, Emily J.; Lividini, Michelle D.

    2006-01-01

    In a repeated testing paradigm, list items receiving item-specific processing are more likely to be recovered across successive tests (item gains), whereas items receiving relational processing are likely to be forgotten progressively less on successive tests. Moreover, analysis of cumulative-recall curves has shown that item-specific processing…

  2. Receiver subsystem analysis report (RADL Item 4-1). The 10-MWe solar thermal central-receiver pilot plant: Solar-facilities design integration

    NASA Astrophysics Data System (ADS)

    1982-04-01

    The results of the thermal, hydraulic, and stress analyses required to demonstrate that the receiver design for the Barstow Solar Pilot Plant satisfies the general design and performance requirements over the plant's design life are presented. Recommendations are made for receiver operation. The analyses are limited to the receiver subsystem's major structural parts (primary tower, receiver unit core support structure), pressure parts (absorber panels; feedwater, condensate, and steam piping/components; flash tank; and steam manifold), and shielding.

  3. Development of Analytical Algorithm for the Performance Analysis of Power Train System of an Electric Vehicle

    NASA Astrophysics Data System (ADS)

    Kim, Chul-Ho; Lee, Kee-Man; Lee, Sang-Heon

    Power train system design is one of the key R&D areas in the development of a new automobile, because the system design yields an optimum engine size and an adaptable power transmission that can meet the design requirements of the new vehicle. Especially for electric vehicle design, a very reliable design algorithm for the power train system is required for energy efficiency. In this study, an analytical simulation algorithm is developed to estimate the driving performance of a designed power train system of an electric vehicle. The principal theory of the simulation algorithm is conservation of energy, combined with several analytical and experimental inputs such as rolling resistance, aerodynamic drag, and the mechanical efficiency of the power transmission. From the analytical calculation results, the running resistance of a designed vehicle is obtained as the operating conditions of the vehicle (inclined angle of the road and vehicle speed) change. The tractive performance of the model vehicle with a given power train system is also calculated at each gear ratio of the transmission. Through analysis of these two calculation results, running resistance and tractive performance, the driving performance of a designed electric vehicle is estimated, and it is used to evaluate the adaptability of the designed power train system for the vehicle.
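    The running-resistance calculation described above (rolling resistance, grade resistance, and aerodynamic drag combined via conservation of energy) can be sketched as follows; all vehicle parameters are hypothetical placeholders, not values from the study.

    ```python
    import math

    def running_resistance(v, grade_rad=0.0, mass=1500.0, c_r=0.012,
                           rho=1.225, c_d=0.30, area=2.2, g=9.81):
        """Total road load [N] at speed v [m/s] on a slope of grade_rad [rad]."""
        rolling = mass * g * c_r * math.cos(grade_rad)   # rolling resistance
        grade   = mass * g * math.sin(grade_rad)         # climbing resistance
        aero    = 0.5 * rho * c_d * area * v ** 2        # aerodynamic drag
        return rolling + grade + aero

    def tractive_power(v, eta_drivetrain=0.90, **kw):
        """Battery power [W] needed to hold speed v, given drivetrain efficiency."""
        return running_resistance(v, **kw) * v / eta_drivetrain

    print(round(tractive_power(100 / 3.6) / 1000, 1), "kW at 100 km/h, flat road")
    ```

    Tractive-performance curves at each gear ratio would then be compared against this road load to estimate gradeability and top speed, mirroring the two-step evaluation in the abstract.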

  4. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, which can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882
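    The 90° rotation fact invoked above is standard two-dimensional vector calculus and can be stated compactly (added here for the reader):

    ```latex
    % If div u = 0 in a simply connected 2-D domain, the 90-degree rotation
    % v = R_perp u is curl-free and hence a gradient:
    \nabla\cdot\mathbf{u}=\partial_1 u_1+\partial_2 u_2=0,\qquad
    \mathbf{v}=R_\perp\mathbf{u}=(-u_2,\;u_1),
    \]
    \[
    \nabla\times\mathbf{v}=\partial_1 v_2-\partial_2 v_1
      =\partial_1 u_1+\partial_2 u_2=0
    \;\Longrightarrow\;\mathbf{v}=\nabla\varphi .
    ```

    Simple connectedness is what guarantees that the curl-free field v integrates to a single-valued potential φ.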

  5. Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sam Alessi; Dennis Keiser

    2012-10-01

    This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue, and financial metrics for user-customizable scenarios, dairy types, and digester types. The model provides results for three anaerobic digester types (Covered Lagoons, Modified Plug Flow, and Complete Mix) and three main energy production technologies: electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types, and backend treatment types, as well as numerous production and economic parameters. DANA's goal is to extend the National Market Value of Anaerobic Digester Products analysis (Informa Economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model, and view a variety of reports, charts, and tables that are automatically produced and delivered over the web interface. DANA is based on the INL's analysis architecture entitled Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the ability to construct highly scalable, web-delivered, user-oriented decision tools. DANA's approach uses server-based data processing and web-based user interfaces, rather than a client-based spreadsheet approach. This offers a number of benefits over the client-based approach. Server processing and storage can scale up to handle a very large number of scenarios, so that analyses at the county, or even field, level across the whole U.S. can be performed. Server-based databases allow dairy and
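    A back-of-the-envelope sketch of the kind of digester calculation DANA automates is shown below. Every coefficient here is an illustrative placeholder of the author's choosing, not a DANA parameter, and the chain manure → volatile solids → biogas → methane → electricity is a simplification of the model's scenario logic.

    ```python
    def digester_electricity_kwh_per_day(n_cows,
                                         manure_kg_per_cow=60.0,    # wet manure/day
                                         vs_fraction=0.10,          # volatile solids
                                         biogas_m3_per_kg_vs=0.30,  # biogas yield
                                         ch4_fraction=0.60,         # methane share
                                         kwh_per_m3_ch4=10.0,       # energy content
                                         eta_genset=0.35):          # electrical eff.
        """Rough daily electricity output of a dairy digester (illustrative only)."""
        vs = n_cows * manure_kg_per_cow * vs_fraction      # kg volatile solids/day
        biogas = vs * biogas_m3_per_kg_vs                  # m3 biogas/day
        methane = biogas * ch4_fraction                    # m3 CH4/day
        return methane * kwh_per_m3_ch4 * eta_genset       # kWh electricity/day

    print(round(digester_electricity_kwh_per_day(1000)), "kWh/day, 1000-cow dairy")
    ```

    A scenario tool like DANA layers digester type, bedding, backend treatment, and economics on top of this physical chain; the sketch only shows why output scales linearly with herd size for fixed coefficients.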

  6. Robust analysis of the hydrophobic basic analytes loratadine and desloratadine in pharmaceutical preparations and biological fluids by sweeping-cyclodextrin-modified micellar electrokinetic chromatography.

    PubMed

    El-Awady, Mohamed; Belal, Fathalla; Pyell, Ute

    2013-09-27

    The analysis of hydrophobic basic analytes by micellar electrokinetic chromatography (MEKC) is usually challenging because of the tendency of these analytes to adsorb onto the inner capillary wall, in addition to the difficulty of separating these compounds given their extremely high retention factors. A robust and reliable method for the simultaneous determination of loratadine (LOR) and its major metabolite desloratadine (DSL) is developed based on cyclodextrin-modified micellar electrokinetic chromatography (CD-MEKC) with an acidic sample matrix and a basic background electrolyte (BGE). The influence of the sample matrix on the reachable focusing efficiency is studied. It is shown that the application of a low-pH sample solution mitigates problems associated with the low solubility of the hydrophobic basic analytes in aqueous solution while having advantages with regard to on-line focusing. Moreover, the use of a basic BGE reduces the adsorption of these analytes in the separation compartment. The separation of the studied analytes is achieved in less than 7 min using a BGE consisting of 10 mmol L(-1) disodium tetraborate buffer, pH 9.30, containing 40 mmol L(-1) SDS and 20 mmol L(-1) hydroxypropyl-β-CD, while the sample solution is composed of 10 mmol L(-1) phosphoric acid, pH 2.15. A full validation study of the developed method based on pharmacopeial guidelines is performed. The method is successfully applied to the analysis of the studied drugs in tablets without interference from tablet additives, as well as to the analysis of spiked human urine without any sample pretreatment. Furthermore, DSL can be detected as an impurity in LOR bulk powder at the stated pharmacopeial limit (0.1%, w/w). The selectivity of the developed method allows the analysis of LOR and DSL in combination with the co-formulated drug pseudoephedrine. It is shown that in CD-MEKC with a basic BGE, solute-wall interactions are effectively suppressed, allowing the development of efficient and precise

  7. Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.

    PubMed

    Chen, Tiffany J; Kotecha, Nikesh

    2014-01-01

    Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.

  8. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
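    The control-referenced quantitation described above reduces, in the simplest case, to the standard back-exchange-corrected uptake calculation: the observed mass shift is scaled by the shift of a fully deuterated control. A minimal sketch with hypothetical peptide masses:

    ```python
    def deuterium_uptake(m_t, m_undeut, m_fully_deut, n_exchangeable):
        """Back-exchange-corrected deuterium uptake: the measured mass shift
        relative to the undeuterated control, normalized by the shift of a
        fully deuterated control, times the number of exchangeable sites."""
        return (m_t - m_undeut) / (m_fully_deut - m_undeut) * n_exchangeable

    # Hypothetical peptide: 1000.0 Da undeuterated, 1008.0 Da fully deuterated
    # control, 1003.2 Da measured at time t, 10 exchangeable amide hydrogens.
    d = deuterium_uptake(1003.2, 1000.0, 1008.0, 10)
    print(round(d, 2), "deuterons incorporated at time t")
    ```

    The value is bounded between zero and the number of exchangeable sites, matching the abstract's point that the measurement is a mass change relative to undeuterated and deuterated controls rather than a peak area.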

  9. Path integral analysis of Jarzynski's equality: Analytical results

    NASA Astrophysics Data System (ADS)

    Minh, David D. L.; Adib, Artur B.

    2009-02-01

    We apply path integrals to study nonequilibrium work theorems in the context of Brownian dynamics, deriving in particular the equations of motion governing the most typical and most dominant trajectories. For the analytically soluble cases of a moving harmonic potential and a harmonic oscillator with a time-dependent natural frequency, we find such trajectories, evaluate the work-weighted propagators, and validate Jarzynski’s equality.
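    For reference, the dynamics and work theorem underlying the analysis can be summarized with standard definitions (added for the reader):

    ```latex
    % Overdamped Brownian dynamics in a time-dependent potential V(x,t):
    \gamma\,\dot{x} = -\partial_x V(x,t) + \xi(t),\qquad
    \langle \xi(t)\,\xi(t')\rangle = 2\gamma k_{B}T\,\delta(t-t').
    \]
    % Onsager-Machlup path weight used in the path-integral treatment:
    \[
    P[x(t)] \propto
      \exp\!\left[-\int_0^{\tau}
      \frac{\bigl(\gamma\dot{x}+\partial_x V\bigr)^{2}}{4\gamma k_{B}T}\,dt\right].
    \]
    % Jarzynski's equality relating nonequilibrium work to free energy:
    \[
    \bigl\langle e^{-W/k_{B}T}\bigr\rangle = e^{-\Delta F/k_{B}T}.
    ```

    The "most typical" and "most dominant" trajectories in the abstract are the extrema of this path weight, without and with the work-weighting factor respectively.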

  10. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. Presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.
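    The classic constant-property result that such dimensionless analyses generalize is the maximum conversion efficiency of a couple in terms of the figure of merit ZT evaluated at the mean temperature (standard textbook formula; the operating temperatures below are illustrative):

    ```python
    import math

    def max_efficiency(t_hot, t_cold, zT_mean):
        """Maximum thermoelectric conversion efficiency for dimensionless
        figure of merit Z*T_mean (classic constant-property result; shoe
        thermal resistance and lateral heat loss neglected)."""
        carnot = (t_hot - t_cold) / t_hot
        m = math.sqrt(1.0 + zT_mean)
        return carnot * (m - 1.0) / (m + t_cold / t_hot)

    eta = max_efficiency(500.0, 300.0, 1.0)
    print(round(eta * 100, 1), "% vs Carnot limit",
          round((500.0 - 300.0) / 500.0 * 100, 1), "%")
    ```

    The corrections listed in the abstract (shoe resistance, temperature-dependent properties, lateral heat transfer) all reduce the achievable efficiency below this closed-form bound, which is why the dimensionless parameters delimit when the simple formula is valid.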

  11. Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective

    ERIC Educational Resources Information Center

    Dietz-Uhler, Beth; Hurn, Janet E.

    2013-01-01

    Learning analytics is receiving increased attention, in part because it offers to assist educational institutions in increasing student retention, improving student success, and easing the burden of accountability. Although these large-scale issues are worthy of consideration, faculty might also be interested in how they can use learning analytics…

  12. Analytical and Radiochemistry for Nuclear Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott

    Information about nonproliferation nuclear forensics, forensics activities at Los Alamos National Laboratory, radioanalytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.

  13. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems: the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in these studies were designed. More research and refinement is needed by the visual analytics community to provide additional evaluation guidelines for different types of visual analytic environments.

  14. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  15. A harmonic analysis approach to joint inversion of P-receiver functions and wave dispersion data in high dense seismic profiles

    NASA Astrophysics Data System (ADS)

    Molina-Aguilera, A.; Mancilla, F. D. L.; Julià, J.; Morales, J.

    2017-12-01

    Joint inversion techniques of P-receiver functions and wave dispersion data implicitly assume an isotropic, radially stratified earth. The conventional approach inverts stacked radial-component receiver functions from different back-azimuths to obtain a laterally homogeneous single-velocity model. However, in the presence of strong lateral heterogeneities, such as anisotropic layers and/or dipping interfaces, receiver functions are considerably perturbed and both the radial and transverse components exhibit back-azimuthal dependences. Harmonic analysis methods exploit these azimuthal periodicities to separate the effects due to the isotropic flat-layered structure from those caused by lateral heterogeneities. We implement a harmonic analysis method based on the radial and transverse receiver function components and carry out a synthetic study to illuminate the capabilities of the method in isolating the isotropic flat-layered part of receiver functions and constraining the geometry and strength of lateral heterogeneities. The back-azimuth-independent P-receiver functions are then jointly inverted with phase and group dispersion curves using a linearized inversion procedure. We apply this approach to highly dense seismic profiles (about 2 km inter-station distance) located in the central Betics (western Mediterranean region), a region which has experienced complex geodynamic processes and exhibits strong variations in Moho topography. The technique presented here is robust and can be applied systematically to construct a 3-D model of the crust and uppermost mantle across large networks.

  16. Direct analysis in real time mass spectrometry and multivariate data analysis: a novel approach to rapid identification of analytical markers for quality control of traditional Chinese medicine preparation.

    PubMed

    Zeng, Shanshan; Wang, Lu; Chen, Teng; Wang, Yuefei; Mo, Huanbiao; Qu, Haibin

    2012-07-06

    The paper presents a novel strategy to identify analytical markers of traditional Chinese medicine preparation (TCMP) rapidly via direct analysis in real time mass spectrometry (DART-MS). A commonly used TCMP, Danshen injection, was employed as a model. The optimal analysis conditions were achieved by measuring the contribution of various experimental parameters to the mass spectra. Salvianolic acids and saccharides were simultaneously determined within a single 1-min DART-MS run. Furthermore, spectra of Danshen injections supplied by five manufacturers were processed with principal component analysis (PCA). Obvious clustering was observed in the PCA score plot, and candidate markers were recognized from the contribution plots of PCA. The suitability of potential markers was then confirmed by contrasting with the results of traditional analysis methods. Using this strategy, fructose, glucose, sucrose, protocatechuic aldehyde and salvianolic acid A were rapidly identified as the markers of Danshen injections. The combination of DART-MS with PCA provides a reliable approach to the identification of analytical markers for quality control of TCMP. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. A Systematic Mapping on the Learning Analytics Field and Its Analysis in the Massive Open Online Courses Context

    ERIC Educational Resources Information Center

    Moissa, Barbara; Gasparini, Isabela; Kemczinski, Avanilde

    2015-01-01

    Learning Analytics (LA) is a field that aims to optimize learning through the study of dynamical processes occurring in the students' context. It covers the measurement, collection, analysis and reporting of data about students and their contexts. This study aims at surveying existing research on LA to identify approaches, topics, and needs for…

  18. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    NASA Technical Reports Server (NTRS)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, lidar analysis based on the assumption that multiple scattering can be neglected is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1) and hence excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements for a realistic model of lidar measurements which includes multiple scattering and can be applied to practical situations are as follows. (1) What is required is not merely a correction term or a rough approximation describing the results of a particular experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation is required which can be applied in the case of a realistic aerosol. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in the case of a numerical approach, are due to the large number of events that have to be taken into account in the presence of large optical depth and/or strong experimental noise.

  19. Building pit dewatering: application of transient analytic elements.

    PubMed

    Zaadnoordijk, Willem J

    2006-01-01

    Analytic elements are well suited for the design of building pit dewatering. Wells and drains can be modeled accurately by analytic elements, both nearby to determine the pumping level and at some distance to verify the targeted drawdown at the building site and to estimate the consequences in the vicinity. The ability to shift locations of wells or drains easily makes the design process very flexible. The temporary pumping has transient effects, for which transient analytic elements may be used. This is illustrated using the free, open-source, object-oriented analytic element simulator Tim(SL) for the design of a building pit dewatering near a canal. Steady calculations are complemented with transient calculations. Finally, the bandwidths of the results are estimated using linear variance analysis.
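    Transient analytic elements for pumping wells are typically built on the Theis solution. As an illustrative sketch (not code from the paper; all parameter values are invented), the transient drawdown around a single dewatering well can be computed from the exponential-integral well function:

```python
import math

def theis_well_function(u, terms=60):
    """Well function W(u) = E1(u) via its series expansion (accurate for small u)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    series = sum(((-1) ** (n + 1)) * u ** n / (n * math.factorial(n))
                 for n in range(1, terms + 1))
    return -gamma - math.log(u) + series

def drawdown(Q, T, S, r, t):
    """Theis drawdown s(r, t) for a well pumping at constant rate Q
    in an aquifer with transmissivity T and storativity S."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * theis_well_function(u)

# Illustrative case: 100 m3/d well, T = 200 m2/d, S = 1e-4,
# drawdown 50 m from the building pit after 1 day of pumping.
s = drawdown(Q=100.0, T=200.0, S=1e-4, r=50.0, t=1.0)
```

A transient analytic element simulator superimposes many such solutions (wells, drains, line sinks), which is what makes shifting well locations during design so cheap.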

  20. Analytic analysis of auxetic metamaterials through analogy with rigid link systems

    NASA Astrophysics Data System (ADS)

    Rayneau-Kirkhope, Daniel; Zhang, Chengzhao; Theran, Louis; Dias, Marcelo A.

    2018-02-01

    In recent years, many structural motifs have been designed with the aim of creating auxetic metamaterials. One area of particular interest in this subject is the creation of auxetic material properties through elastic instability. Such metamaterials switch from conventional behaviour to an auxetic response for loads greater than some threshold value. This paper develops a novel methodology in the analysis of auxetic metamaterials which exhibit elastic instability through analogy with rigid link lattice systems. The results of our analytic approach are confirmed by finite-element simulations for both the onset of elastic instability and post-buckling behaviour including Poisson's ratio. The method gives insight into the relationships between mechanisms within lattices and their mechanical behaviour; as such, it has the potential to allow existing knowledge of rigid link lattices with auxetic paths to be used in the design of future buckling-induced auxetic metamaterials.

  1. Introducing Text Analytics as a Graduate Business School Course

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2011-01-01

    Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…

  2. Results of the U. S. Geological Survey's analytical evaluation program for standard reference samples distributed in April 2001

    USGS Publications Warehouse

    Woodworth, M.T.; Connor, B.F.

    2001-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-165 (trace constituents), M-158 (major constituents), N-69 (nutrient constituents), N-70 (nutrient constituents), P-36 (low ionic-strength constituents), and Hg-32 (mercury) -- that were distributed in April 2001 to laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data received from 73 laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.
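    The report does not state which nonparametric estimator defines the most probable value; a common robust choice, shown here purely as an assumption with invented data, is the median of the interlaboratory results together with a median-absolute-deviation spread:

```python
import statistics

def most_probable_value(results):
    """Nonparametric consensus estimate: the median of the interlaboratory results."""
    return statistics.median(results)

def mad(results):
    """Median absolute deviation about the median, a robust spread estimate
    insensitive to a single outlying laboratory."""
    m = statistics.median(results)
    return statistics.median(abs(x - m) for x in results)

# Hypothetical results for one analyte from five laboratories (one outlier).
labs = [4.9, 5.0, 5.1, 5.0, 7.2]
mpv = most_probable_value(labs)
```

Unlike the mean, the median here is unaffected by the outlying 7.2 result, which is why nonparametric statistics suit interlaboratory comparisons.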

  3. Results of the U. S. Geological Survey's Analytical Evaluation Program for Standard Reference Samples Distributed in March 2002

    USGS Publications Warehouse

    Woodworth, M.T.; Conner, B.F.

    2002-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-169 (trace constituents), M-162 (major constituents), N-73 (nutrient constituents), N-74 (nutrient constituents), P-38 (low ionic-strength constituents), and Hg-34 (mercury) -- that were distributed in March 2002 to laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data received from 93 laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  4. Results of the U.S. Geological Survey's analytical evaluation program for standard reference samples distributed in September 2002

    USGS Publications Warehouse

    Woodworth, Mark T.; Connor, Brooke F.

    2003-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-171 (trace constituents), M-164 (major constituents), N-75 (nutrient constituents), N-76 (nutrient constituents), P-39 (low ionic-strength constituents), and Hg-35 (mercury) -- that were distributed in September 2002 to laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data received from 102 laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  5. Results of the U.S. Geological Survey's analytical evaluation program for standard reference samples distributed in September 2001

    USGS Publications Warehouse

    Woodworth, Mark T.; Connor, Brooke F.

    2002-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-167 (trace constituents), M-160 (major constituents), N-71 (nutrient constituents), N-72 (nutrient constituents), P-37 (low ionic-strength constituents), and Hg-33 (mercury) -- that were distributed in September 2001 to laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data received from 98 laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  6. Results of the U.S. Geological Survey's analytical evaluation program for standard reference samples distributed in March 2003

    USGS Publications Warehouse

    Woodworth, Mark T.; Connor, Brooke F.

    2003-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-173 (trace constituents), M-166 (major constituents), N-77 (nutrient constituents), N-78 (nutrient constituents), P-40 (low ionic-strength constituents), and Hg-36 (mercury) -- that were distributed in March 2003 to laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data received from 110 laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  7. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
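    As a minimal illustration of the underlying computation (not the authors' code, and with naive single mean imputation standing in for the multiple-imputation machinery studied in the paper), the ROC area can be obtained from the Mann-Whitney statistic after filling in missing biomarker values:

```python
def auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney statistic: P(positive score > negative score),
    counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def mean_impute(values):
    """Naive single imputation: replace missing entries (None) with the observed mean."""
    observed = [v for v in values if v is not None]
    m = sum(observed) / len(observed)
    return [m if v is None else v for v in values]

# Hypothetical biomarker scores with missing observations in each group.
diseased = mean_impute([2.1, 3.4, None, 2.8])
healthy = mean_impute([1.0, 1.9, 1.2, None])
area = auc(diseased, healthy)
```

Proper MI would instead draw several plausible completions, compute the ROC summary on each, and pool the results; the point of the paper is precisely that the imputation model must cohere with the data-generation mechanism.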

  8. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  9. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.

  10. Ultrasonic power transfer from a spherical acoustic wave source to a free-free piezoelectric receiver: Modeling and experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahab, S.; Gray, M.; Erturk, A., E-mail: alper.erturk@me.gatech.edu

    2015-03-14

    Contactless powering of small electronic components has lately received growing attention for wireless applications in which battery replacement or tethered charging is undesired or simply impossible, and ambient energy harvesting is not a viable solution. As an alternative to well-studied methods of contactless energy transfer, such as the inductive coupling method, the use of ultrasonic waves transmitted and received by piezoelectric devices enables larger power transmission distances, which is critical especially for deep-implanted electronic devices. Moreover, energy transfer by means of acoustic waves is well suited in situations where no electromagnetic fields are allowed. The limited literature of ultrasonic acoustic energy transfer is mainly centered on proof-of-concept experiments demonstrating the feasibility of this method, lacking experimentally validated modeling efforts for the resulting multiphysics problem that couples the source and receiver dynamics with domain acoustics. In this work, we present fully coupled analytical, numerical, and experimental multiphysics investigations for ultrasonic acoustic energy transfer from a spherical wave source to a piezoelectric receiver bar that operates in the 33-mode of piezoelectricity. The fluid-loaded piezoelectric receiver under free-free mechanical boundary conditions is shunted to an electrical load for quantifying the electrical power output for a given acoustic source strength of the transmitter. The analytical acoustic-piezoelectric structure interaction modeling framework is validated experimentally, and the effects of system parameters are reported along with optimal electrical loading and frequency conditions of the receiver.

  11. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
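    The progressive workflow described above can be sketched, assuming a simple streaming mean as the "analytic" (all names are illustrative, not from Progressive Insights), as a generator that yields a refining partial result after each batch so a UI can render it immediately:

```python
import random

def progressive_mean(stream, batch_size=100):
    """Yield a refining mean estimate after each batch of the input stream.
    Each yield point is where an analyst could inspect, steer, or cancel."""
    total, count = 0.0, 0
    batch = []
    for x in stream:
        batch.append(x)
        if len(batch) == batch_size:
            total += sum(batch)
            count += len(batch)
            batch.clear()
            yield total / count  # meaningful partial result
    if batch:  # flush any final partial batch
        total += sum(batch)
        count += len(batch)
        yield total / count

# Illustrative run: the estimate converges long before the stream is exhausted.
random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(1000)]
estimates = list(progressive_mean(data, batch_size=250))
```

The design point is that each partial result is cheap to produce and monotonically refines, so the visualization can update without overwhelming the analyst.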

  12. Application of Dynamic Analysis in Semi-Analytical Finite Element Method

    PubMed Central

    Oeser, Markus

    2017-01-01

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm to apply the dynamic analysis to SAFEM was introduced in detail. Asphalt pavement models under moving loads were built in the SAFEM and commercial finite element software ABAQUS to verify the accuracy and efficiency of the SAFEM. The verification shows that the computational accuracy of SAFEM is high enough and its computational time is much shorter than ABAQUS. Moreover, experimental verification was carried out and the prediction derived from SAFEM is consistent with the measurement. Therefore, the SAFEM is feasible to reliably predict the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administration in assessing the pavement’s state. PMID:28867813

  13. Application of Dynamic Analysis in Semi-Analytical Finite Element Method.

    PubMed

    Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus

    2017-08-30

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm to apply the dynamic analysis to SAFEM was introduced in detail. Asphalt pavement models under moving loads were built in the SAFEM and commercial finite element software ABAQUS to verify the accuracy and efficiency of the SAFEM. The verification shows that the computational accuracy of SAFEM is high enough and its computational time is much shorter than ABAQUS. Moreover, experimental verification was carried out and the prediction derived from SAFEM is consistent with the measurement. Therefore, the SAFEM is feasible to reliably predict the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administration in assessing the pavement's state.
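    The semi-analytical idea, a 2-D discretization combined with a Fourier series in the third dimension, rests on expanding the longitudinal load into sine terms, with one 2-D problem solved per term. A hedged sketch of that expansion (all names and values are illustrative, not taken from SAFEM):

```python
import math

def sine_series_coeffs(load, length, n_terms, samples=2000):
    """Project a longitudinal load distribution f(z) on [0, L] onto sin(m*pi*z/L)
    by numerical (Riemann-sum) integration."""
    coeffs = []
    dz = length / samples
    for m in range(1, n_terms + 1):
        integral = sum(load(i * dz) * math.sin(m * math.pi * i * dz / length) * dz
                       for i in range(samples))
        coeffs.append(2.0 / length * integral)
    return coeffs

def reconstruct(coeffs, length, z):
    """Evaluate the truncated sine series at longitudinal position z."""
    return sum(c * math.sin((m + 1) * math.pi * z / length)
               for m, c in enumerate(coeffs))

# Example: a wheel-patch load centred at mid-span of a 10 m pavement strip.
load = lambda z: 1.0 if 4.5 <= z <= 5.5 else 0.0
coeffs = sine_series_coeffs(load, length=10.0, n_terms=50)
```

Because each Fourier term decouples, the 3-D response is assembled from independent 2-D solutions, which is the source of the speed advantage over a full 3-D mesh.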

  14. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors

    PubMed Central

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-01-01

    Objectives Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. Methods This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. Results After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P <0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Conclusion Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors. PMID:29062553

  15. Origin of Analyte-Induced Porous Silicon Photoluminescence Quenching.

    PubMed

    Reynard, Justin M; Van Gorder, Nathan S; Bright, Frank V

    2017-09-01

    We report on gaseous analyte-induced photoluminescence (PL) quenching of porous silicon, as-prepared (ap-pSi) and oxidized (ox-pSi). By using steady-state and emission wavelength-dependent time-resolved intensity luminescence measurements in concert with a global analysis scheme, we find that the analyte-induced quenching is best described by a three-component static quenching model. In the model, there are blue, green, and red emitters (associated with the nanocrystallite core and surface trap states) that each exhibit unique analyte-emitter association constants and these association constants are a consequence of differences in the pSi surface chemistries.
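    A three-component static quenching model of the kind described can be written as a weighted sum of static Stern-Volmer terms, one per emitter population; the fractions and association constants below are invented purely for illustration:

```python
def static_quench_intensity(conc, fractions, k_assoc):
    """Relative PL intensity I/I0 for a multi-component static quenching model:
    each emitter population i contributes f_i / (1 + K_i * [analyte])."""
    return sum(f / (1.0 + k * conc) for f, k in zip(fractions, k_assoc))

# Illustrative blue/green/red emitter fractions and analyte-emitter
# association constants (arbitrary units, not fitted values from the paper).
fractions = (0.3, 0.4, 0.3)
k_assoc = (50.0, 10.0, 2.0)
i_rel = static_quench_intensity(0.1, fractions, k_assoc)
```

Fitting such a model globally across emission wavelengths is what lets the blue, green, and red populations be assigned distinct association constants.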

  16. Demodulation of messages received with low signal to noise ratio

    NASA Astrophysics Data System (ADS)

    Marguinaud, A.; Quignon, T.; Romann, B.

    The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-locked loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computation savings compared to conventional realizations. Nominal operation has been verified down to an energy signal-to-noise ratio of -3 dB on a QPSK demodulator.
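    For QPSK, the minimum-distance hypothesis test on the phase signal reduces to a quadrant (sign) decision. A minimal sketch of that standard formulation, with invented noise parameters (this is not the paper's demodulator, which must also estimate carrier phase and timing):

```python
import math
import random

def qpsk_mod(bits):
    """Map a bit pair to a unit-energy QPSK symbol."""
    i = 1.0 if bits[0] == 0 else -1.0
    q = 1.0 if bits[1] == 0 else -1.0
    return complex(i, q) / math.sqrt(2)

def qpsk_decide(sample):
    """Minimum-distance QPSK decision: for this constellation it reduces to a
    sign test on the in-phase/quadrature parts, i.e. a test on the received phase."""
    return (0 if sample.real >= 0 else 1, 0 if sample.imag >= 0 else 1)

# Illustrative use: one symbol through additive Gaussian noise.
random.seed(1)
tx = (0, 1)
noisy = qpsk_mod(tx) + complex(random.gauss(0.0, 0.1), random.gauss(0.0, 0.1))
rx = qpsk_decide(noisy)
```

The simplicity of this per-symbol test is what the abstract alludes to: once the signal is represented in phase, the likelihood-ratio test collapses to comparing against fixed phase boundaries.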

  17. The Moho discontinuity beneath Taiwan orogenic zone inferred from receiver function analysis

    NASA Astrophysics Data System (ADS)

    Chang, H.; Chen, C.; Liang, W.

    2013-12-01

    We determine the depth variations of the Moho discontinuity beneath Taiwan from receiver function analysis. Taiwan is a young (~6.5 Ma) orogenic zone formed as a consequence of oblique collision between the Philippine Sea Plate and the Eurasian Plate. In northeastern Taiwan, the Philippine Sea Plate subducts northwestward under the Eurasian Plate along the Ryukyu Trench; in southern Taiwan, the Eurasian Plate subducts eastward beneath the Philippine Sea Plate along the Manila Trench. Recent tomographic models of Taiwan reveal P-wave velocity variations of the lithospheric structure that provide important constraints on the orogenic processes in this region. However, the depth variations of the Moho discontinuity, a key observation for better understanding crustal deformation, remain elusive. In this study, we aim to delineate the Moho depth variations by analyzing seismic converted phases indicative of the presence of discontinuity structure. We analyze waveform data from teleseismic events recorded at the Broadband Array in Taiwan for Seismology (BATS). Preliminary results of receiver functions beneath BATS stations in eastern Taiwan show that more than one converted phase (P-to-S) is likely present at crustal depths, suggesting possible multiple crustal layering, which may complicate the detection of the Moho. We further carry out synthetic experiments to explore possible crustal structures that reconcile our observations.

  18. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    PubMed

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software package for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD, which performs radiochemical analysis, is described. The analytical platform interfaces with an Arduino®-based device that triggers multiple detectors, providing a flexible, fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and for receiving confirmation or error responses. The AutoRAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.

  19. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  20. Nonlinear analyte concentration gradients for one-step kinetic analysis employing optical microring resonators.

    PubMed

    Marty, Michael T; Sloan, Courtney D Kuhnline; Bailey, Ryan C; Sligar, Stephen G

    2012-07-03

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes, and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics.
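    The single-step gradient idea can be illustrated with a pseudo-first-order binding model under a linear analyte gradient; the rate constants and gradient slope below are invented, and simple forward-Euler integration stands in for the authors' fitting procedure:

```python
def simulate_binding(ka, kd, rmax, gradient_slope, t_end, dt=0.01):
    """Forward-Euler integration of dR/dt = ka*C(t)*(Rmax - R) - kd*R,
    where the analyte concentration follows a linear gradient C(t) = slope * t."""
    r, t = 0.0, 0.0
    trace = []
    while t < t_end:
        c = gradient_slope * t          # analyte concentration at time t
        r += dt * (ka * c * (rmax - r) - kd * r)
        t += dt
        trace.append((t, r))
    return trace

# Illustrative parameters (not from the paper): ka in 1/(nM*s), kd in 1/s,
# rmax in sensor response units, gradient slope in nM/s.
trace = simulate_binding(ka=1e-3, kd=1e-2, rmax=100.0,
                         gradient_slope=0.5, t_end=200.0)
```

Fitting ka and kd to a single such gradient trace replaces the conventional multi-concentration titration; nonlinear gradients add curvature as a further free experimental handle.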

  1. Nonlinear Analyte Concentration Gradients for One-Step Kinetic Analysis Employing Optical Microring Resonators

    PubMed Central

    Marty, Michael T.; Kuhnline Sloan, Courtney D.; Bailey, Ryan C.; Sligar, Stephen G.

    2012-01-01

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics. PMID:22686186
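
    As a sketch of the one-step idea described above, a 1:1 binding model can be integrated under a time-varying analyte concentration; the rate constants, surface capacity, and gradient shape below are illustrative assumptions, not values from the paper:

```python
# Sketch of 1:1 biosensor binding kinetics under an analyte
# concentration gradient C(t): dB/dt = ka*C(t)*(Bmax - B) - kd*B.
# All parameter values are illustrative, not from the paper.

def simulate_binding(conc, ka=1e5, kd=1e-3, bmax=1.0, t_end=600.0, dt=0.01):
    """Forward-Euler integration of the bound-complex signal B(t)."""
    steps = int(t_end / dt)
    b = 0.0
    trace = []
    for i in range(steps):
        t = i * dt
        b += dt * (ka * conc(t) * (bmax - b) - kd * b)
        trace.append(b)
    return trace

# Nonlinear (quadratic) concentration ramp from 0 to 100 nM over 600 s.
ramp = lambda t: 100e-9 * (t / 600.0) ** 2

signal = simulate_binding(ramp)
# The response grows as the gradient rises and never exceeds the
# surface capacity Bmax.
assert all(0.0 <= b <= 1.0 for b in signal)
```

    Fitting ka and kd to such a trace in one experiment is the gradient method's advantage over running a separate association/dissociation cycle per concentration.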

  2. Assessing the Classification Accuracy of Early Numeracy Curriculum-Based Measures Using Receiver Operating Characteristic Curve Analysis

    ERIC Educational Resources Information Center

    Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.

    2016-01-01

    Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…
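
    The area under the curve (AUC) that such ROC analyses report can be computed directly from the rank-sum identity AUC = P(score of a random positive > score of a random negative), ties counted as one half; the scores below are invented toy data, not the study's EN-CBM measures:

```python
# Minimal ROC AUC via the Mann-Whitney rank-sum identity.
# Toy data: preschool screening scores, invented for illustration.

def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5          # ties count as half a win
    return wins / (len(pos_scores) * len(neg_scores))

at_risk     = [3, 5, 6, 8]           # children later below the cutoff
not_at_risk = [7, 9, 10, 12]         # children later above the cutoff

print(auc(not_at_risk, at_risk))     # → 0.9375
```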

  3. Analytical Eco-Scale for Assessing the Greenness of a Developed RP-HPLC Method Used for Simultaneous Analysis of Combined Antihypertensive Medications.

    PubMed

    Mohamed, Heba M; Lamie, Nesrine T

    2016-09-01

    In the past few decades the analytical community has been focused on eliminating or reducing the usage of hazardous chemicals and solvents, in different analytical methodologies, that have been ascertained to be extremely dangerous to human health and environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and a short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment.

  4. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
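
    The effect of the logical combination rules can be seen in a toy sketch (invented binary indices, not the AD study's data): "AND" trades sensitivity for specificity, while "OR" does the opposite:

```python
# Combining two binary indices with logical rules, as in the
# multiV-ROC idea. Truth labels and index outputs are invented.

def sens_spec(pred, truth):
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    sens = tp / sum(truth)
    spec = tn / (len(truth) - sum(truth))
    return sens, spec

truth  = [1, 1, 1, 1, 0, 0, 0, 0]
index1 = [1, 1, 1, 0, 1, 0, 0, 0]   # alone: 0.75 sens, 0.75 spec
index2 = [1, 1, 0, 1, 0, 1, 0, 0]   # alone: 0.75 sens, 0.75 spec

and_rule = [a and b for a, b in zip(index1, index2)]
or_rule  = [a or b  for a, b in zip(index1, index2)]

print(sens_spec(and_rule, truth))   # AND: (0.5, 1.0) — more specific
print(sens_spec(or_rule,  truth))   # OR:  (1.0, 0.5) — more sensitive
```

    Sweeping thresholds on each index before combining traces out the multivariate ROC surface described in the abstract.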

  5. Putting the "receive" in accounts receivable.

    PubMed

    McDaniel, John W; Baum, Neil

    2006-01-01

    There isn't a practice in the United States that doesn't have a concern about accounts receivable. The financial success of any practice depends on the care and feeding of the accounts receivable. This is not an area of practice management that can be taken lightly or delegated to someone who is not attentive to detail and doggedly persistent. In this article, we will discuss how to identify problematic accounts receivable and what can be done to bring the accounts receivable under control. We will provide you with a plan of action that can be adopted by any practice regardless of size, number of physicians, or whether the practice uses in-house billing or outsources its billing arrangements.

  6. Analytical Chemistry (edited by R. Kellner, J.- M. Mermet, M. Otto, and H. M. Widmer)

    NASA Astrophysics Data System (ADS)

    Thompson, Reviewed By Robert Q.

    2000-04-01

    (I have done my share of reshuffling over the years), some timing is sacrosanct. For example, I suspect that most first courses in analytical chemistry include basic statistics early on, yet this topic is found under Chemometrics in Part IV. Another example is the separation of the discussions of acid-base equilibria (Chapter 4) and acid-base titrations (Chapter 7), with chromatography and kinetics interspersed. Simple UV-vis spectrometry and Beer's law are discussed after topics such as thermal analysis and biosensors. Information on monochromators is buried in the chapter on atomic emission spectroscopy. The editors have organized the material in a reasonably logical yet unfamiliar order. I would guess that those who adopt this text will need to skip from chapter to chapter or restructure their courses in a major way. Some topics receive more or less attention than I believe is justified. Let me provide a few examples of this uneven treatment. The editors include in Part I a 5-page description of the regulatory aspects of QA & QC, a topic of little interest to undergraduates. In the liquid chromatography section there are 3.5 pages on thin-layer chromatography and 6.5 pages on field flow fractionation, but only 2.5 pages on capillary electrophoresis, a burgeoning area of analysis. While biamperometric and conductometric titrations are discussed, common redox titrations employing an indicator (e.g. iodometric titrations with starch endpoint) are ignored. Likewise, electrochemical stripping analysis, important in trace analysis, is given short shrift (half a page). The editors set a useful chapter template, but it is not followed in all cases. At the top of the first page of each chapter is a grayed box of general learning objectives, sort of a chapter overview, and most chapters begin with a very brief, often interesting historical overview. Worked numerical examples, though scarce, are found in grayed areas throughout the text. 
Specific and general references for

  7. Numerical modeling of reflux solar receivers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, R.E. Jr.

    1993-05-01

    Using reflux solar receivers to collect solar energy for dish-Stirling electric power generation systems is presently being investigated by several organizations, including Sandia National Laboratories, Albuquerque, N. Mex. In support of this program, Sandia has developed two numerical models describing the thermal performance of pool-boiler and heat-pipe reflux receivers. Both models are applicable to axisymmetric geometries, and both consider the radiative and convective energy transfer within the receiver cavity, the conductive and convective energy transfer from the receiver housing, and the energy transfer to the receiver working fluid. The primary difference between the models is the level of detail in modeling the heat conduction through the receiver walls. The more detailed model uses a two-dimensional finite control volume method, whereas the simpler model uses a one-dimensional thermal resistance approach. The numerical modeling concepts presented are applicable to conventional tube-type solar receivers, as well as to reflux receivers. Good agreement between the two models is demonstrated by comparing the predicted and measured performance of a pool-boiler reflux receiver being tested at Sandia. For design operating conditions, the receiver thermal efficiencies agree within 1 percent and the average receiver cavity temperatures within 1.3 percent. The thermal efficiency and receiver temperatures predicted by the simpler thermal resistance model agree well with experimental data from on-sun tests of the Sandia reflux pool-boiler receiver. An analysis of these comparisons identifies several plausible explanations for the differences between the predicted results and the experimental data.
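
    The simpler one-dimensional thermal-resistance approach can be caricatured as a lumped energy balance; all numbers below are illustrative assumptions, not Sandia test data:

```python
# Lumped thermal-resistance caricature of a receiver energy balance:
# efficiency = (absorbed - lost) / incident, with cavity-to-ambient
# losses through a single lumped resistance r_loss. Illustrative only.

def receiver_efficiency(q_incident, alpha, t_cavity, t_ambient, r_loss):
    """q in watts, temperatures in kelvin, r_loss in K/W."""
    q_absorbed = alpha * q_incident
    q_lost = (t_cavity - t_ambient) / r_loss
    return (q_absorbed - q_lost) / q_incident

eta = receiver_efficiency(q_incident=75e3,   # 75 kW of concentrated sunlight
                          alpha=0.95,        # effective cavity absorptance
                          t_cavity=1023.0,   # roughly 750 C
                          t_ambient=300.0,
                          r_loss=0.1)        # assumed lumped resistance
print(round(eta, 4))  # → 0.8536
```

    The two-dimensional finite-control-volume model refines only the conduction term; the surrounding balance has the same form.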

  8. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA) the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other things, parallel data analysis, metadata management, virtual file system tasks, maps generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private

  9. A novel approach to piecewise analytic agricultural machinery path reconstruction

    NASA Astrophysics Data System (ADS)

    Wörz, Sascha; Mederle, Michael; Heizinger, Valentin; Bernhardt, Heinz

    2017-12-01

    Before machinery operation in fields can be analysed, one has to cope with the problem that the GPS signals of receivers located on the machines contain measurement noise and are time-discrete, and that the underlying physical system describing the positions, axial and absolute velocities, angular rates and angular orientation of the operating machines during the whole working time is unknown. This research work presents a new three-dimensional mathematical approach using kinematic relations based on control variables such as Euler angular velocities and angles, and a discrete target control problem, such that the state control function is given by the sum of squared residuals involving the state and control variables. The resulting physical system yields a noise-free and piecewise analytic representation of the positions, velocities, angular rates and angular orientation, and can be used for a further detailed study and analysis of the problem of why agricultural vehicles operate in practice as they do.
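
    A much-simplified analogue of minimizing a sum of squared residuals over noisy, time-discrete GPS data is Tikhonov smoothing of a one-dimensional track; this is an illustrative stand-in, not the authors' three-dimensional kinematic formulation:

```python
import numpy as np

# Tikhonov smoothing: minimize ||x - y||^2 + lam*||D2 x||^2, where D2
# is the second-difference operator. The closed-form minimizer is
# x = (I + lam*D2'D2)^(-1) y. Track and noise level are invented.

def smooth(y, lam=50.0):
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]   # discrete second derivative
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

t = np.linspace(0, 10, 200)
truth = np.sin(t)                          # "true" vehicle path
rng = np.random.default_rng(0)
noisy = truth + rng.normal(0, 0.2, t.size) # simulated GPS noise
recovered = smooth(noisy)

# The smoothed track is closer to the true path than the raw signal.
print(np.linalg.norm(recovered - truth) < np.linalg.norm(noisy - truth))
```

    The paper's approach additionally constrains the smoothed states through kinematic equations, so the reconstruction is physically consistent, not merely smooth.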

  10. The application of multiple analyte adduct formation in the LC-MS3 analysis of valproic acid in human serum.

    PubMed

    Dziadosz, Marek

    2017-01-01

    LC-MS using electrospray ionisation (negative ion mode) and low-energy collision-induced dissociation tandem mass spectrometric (CID-MS/MS) analysis, together with multiple analyte adduct formation with the components of the mobile phase, were applied to analyse valproic acid in human serum by LC-MS3. The CID fragmentation of the precursor analyte adduct [M+2CH3COONa−H]− was applied in the method validation (307.1/225.1/143.0). Chromatographic separation was performed with a Luna 5 μm C18(2) 100 Å, 150 mm × 2 mm column, and elution with a mobile phase consisting of A (H2O/methanol = 95/5, v/v) and B (H2O/methanol = 3/97, v/v), both with 10 mM ammonium acetate and 0.1% acetic acid. A binary-flow pumping mode with a total flow rate of 0.400 mL/min was used. The calculated limit of detection/quantification of the method, calibrated in the range of 10-200 μg/mL, was 0.31/1.0 μg/mL. The sample preparation was based on protein precipitation with 1 mL of H2O/methanol solution (3/97, v/v) with 10 mM sodium acetate and 100 mM acetic acid. On the basis of the experiments performed, it could be demonstrated that multiple analyte adduct formation can be applied for MS3 quantitation of analytes with problematic fragmentation. The presented new strategy makes possible the LC-MS3 analysis of small drugs that do not produce any stable product ions at all. Copyright © 2016 Elsevier B.V. All rights reserved.
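
    One common way to obtain LOD/LOQ figures like those quoted is the ICH-style estimate LOD = 3.3σ/S and LOQ = 10σ/S from a calibration line (σ = residual standard deviation, S = slope); the calibration points below are invented, and this is not necessarily the paper's exact calculation:

```python
# ICH-style LOD/LOQ estimate from an ordinary least-squares calibration
# line. Calibration data are invented toy values.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    resid_sd = (sum((yi - (slope * xi + intercept)) ** 2
                    for xi, yi in zip(x, y)) / (n - 2)) ** 0.5
    return slope, intercept, resid_sd

conc   = [10, 25, 50, 100, 150, 200]        # ug/mL, toy calibration
signal = [102, 248, 497, 1005, 1490, 2010]  # detector response

slope, intercept, sd = fit_line(conc, signal)
lod = 3.3 * sd / slope
loq = 10.0 * sd / slope
print(lod < loq)  # → True; the LOQ/LOD ratio is fixed at 10/3.3
```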

  11. General analytical solutions for DC/AC circuit-network analysis

    NASA Astrophysics Data System (ADS)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2017-06-01

    In this work, we present novel general analytical solutions for the currents that are developed in the edges of network-like circuits when some nodes of the network act as sources/sinks of DC or AC current. We assume that Ohm's law is valid at every edge and that charge at every node is conserved (with the exception of the source/sink nodes). The resistive, capacitive, and/or inductive properties of the lines in the circuit define a complex network structure with given impedances for each edge. Our solution for the currents at each edge is derived in terms of the eigenvalues and eigenvectors of the Laplacian matrix of the network defined from the impedances. This derivation also allows us to compute the equivalent impedance between any two nodes of the circuit and relate it to currents in a closed circuit which has a single voltage generator instead of many input/output source/sink nodes. This simplifies the treatment that could be done via Thévenin's theorem. Contrary to solving Kirchhoff's equations, our derivation allows to easily calculate the redistribution of currents that occurs when the location of sources and sinks changes within the network. Finally, we show that our solutions are identical to the ones found from Circuit Theory nodal analysis.
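
    The Laplacian solution described above can be sketched numerically: with edge conductances w_ij, the node potentials for a unit current injected at node s and extracted at node t are V = L⁺(e_s − e_t), and the two-point equivalent resistance is R_st = (e_s − e_t)ᵀ L⁺ (e_s − e_t). The three-node resistive circuit below is an invented example:

```python
import numpy as np

# Build the weighted graph Laplacian from edge conductances and use its
# pseudoinverse to get two-point equivalent resistance. Circuit invented.

def laplacian(n, edges):
    L = np.zeros((n, n))
    for i, j, g in edges:          # g = conductance = 1/resistance
        L[i, i] += g; L[j, j] += g
        L[i, j] -= g; L[j, i] -= g
    return L

# A 1-ohm and a 2-ohm resistor in parallel between nodes 0 and 1,
# plus a dangling 1-ohm resistor from node 1 to node 2.
edges = [(0, 1, 1.0), (0, 1, 0.5), (1, 2, 1.0)]
Lp = np.linalg.pinv(laplacian(3, edges))

e = np.zeros(3); e[0], e[1] = 1.0, -1.0   # unit current: in at 0, out at 1
R01 = e @ Lp @ e
print(round(R01, 6))  # → 0.666667  (1 ohm parallel 2 ohm = 2/3 ohm)
```

    Moving the source/sink only changes the vector e, not the factorized Laplacian, which is exactly the redistribution property the abstract highlights.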

  12. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  13. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

    Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data scientific fields together with their data type-specific and generic challenges. Firstly, common data science terms used in different data scientific fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets is briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Analysis of structural dynamic data from Skylab. Volume 2: Skylab analytical and test model data

    NASA Technical Reports Server (NTRS)

    Demchak, L.; Harcrow, H.

    1976-01-01

    The orbital configuration test modal data, analytical test correlation modal data, and analytical flight configuration modal data are presented. Tables showing the generalized mass contributions (GMCs) for each of the thirty test modes are given, along with the two-dimensional mode shape plots and tables of GMCs for the test-correlated analytical modes. The two-dimensional mode shape plots for the analytical modes and the uncoupled and coupled modes of the orbital flight configuration at three development phases of the model are included.

  15. Analytical-HZETRN Model for Rapid Assessment of Active Magnetic Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Washburn, S. A.; Blattnig, S. R.; Singleterry, R. C.; Westover, S. C.

    2014-01-01

    The use of active radiation shielding designs has the potential to reduce the radiation exposure received by astronauts on deep-space missions at a significantly lower mass penalty than designs utilizing only passive shielding. Unfortunately, the determination of the radiation exposure inside these shielded environments often involves lengthy and computationally intensive Monte Carlo analysis. In order to evaluate the large trade space of design parameters associated with a magnetic radiation shield design, an analytical model was developed for the determination of flux inside a solenoid magnetic field due to the Galactic Cosmic Radiation (GCR) radiation environment. This analytical model was then coupled with NASA's radiation transport code, HZETRN, to account for the effects of passive/structural shielding mass. The resulting model can rapidly obtain results for a given configuration and can therefore be used to analyze an entire trade space of potential variables in less time than is required for even a single Monte Carlo run. Analyzing this trade space for a solenoid magnetic shield design indicates that active shield bending powers greater than 15 Tm and passive/structural shielding thicknesses greater than 40 g/cm2 have a limited impact on reducing dose equivalent values. Also, it is shown that higher magnetic field strengths are more effective than thicker magnetic fields at reducing dose equivalent.
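
    Bending power is usefully compared against particle magnetic rigidity R = pc/(Ze), since a particle whose gyroradius r = R/(cB) exceeds the field region is barely deflected. A hedged numeric sketch using standard relativistic kinematics (not the paper's transport calculation):

```python
import math

# Magnetic rigidity of a charged particle from its kinetic energy,
# using pc = sqrt(T*(T + 2*m*c^2)). Values in GeV; output in gigavolts.
# Example energies are illustrative GCR-like protons.

def rigidity_GV(kinetic_GeV, mass_GeV=0.938272, charge=1):
    """Magnetic rigidity R = pc/(Ze) in gigavolts."""
    pc = math.sqrt(kinetic_GeV * (kinetic_GeV + 2.0 * mass_GeV))
    return pc / charge

print(round(rigidity_GV(1.0), 3))   # 1 GeV proton → 1.696 GV
print(round(rigidity_GV(10.0), 2))  # 10 GeV proton is far more rigid
```

    The diminishing returns beyond ~15 Tm reflect the fact that the GCR spectrum extends to rigidities no practical field can turn away.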

  16. NaK pool-boiler bench-scale receiver durability test: Test results and materials analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andraka, C.E.; Goods, S.H.; Bradshaw, R.W.

    1994-06-01

    Pool-boiler reflux receivers have been considered as an alternative to heat pipes for the input of concentrated solar energy to Stirling-cycle engines in dish-Stirling electric generation systems. Pool boilers offer simplicity in design and fabrication. The operation of a full-scale pool-boiler receiver has been demonstrated for short periods of time. However, to generate cost-effective electricity, the receiver must operate without significant maintenance for the entire system life, as much as 20 to 30 years. Long-term liquid-metal boiling stability and materials compatibility with refluxing NaK-78 is not known and must be determined for the pool-boiler receiver. No boiling system has been demonstrated for a significant duration with the current porous boiling enhancement surface and materials. Therefore, it is necessary to simulate the full-scale pool-boiler design as much as possible, including flux levels, materials, and operating cycles. On-sun testing is impractical because of the limited test time available. A test vessel was constructed with a porous boiling enhancement surface. The boiling surface consisted of a brazed stainless steel powder with about 50% porosity. The vessel was heated with a quartz lamp array providing about 90 W/cm2 peak incident thermal flux. The vessel was charged with NaK-78. This allows the elimination of costly electric preheating, both in this test and in full-scale receivers. The vessel was fabricated from Haynes 230 alloy. The vessel operated at 750°C around the clock, with a 1/2-hour shutdown cycle to ambient every 8 hours. The test completed 7500 hours of lamp-on operation time, and over 1000 startups from ambient. The test was terminated when a small leak in an Inconel 600 thermowell was detected. The test design and data are presented here. Metallurgical analysis of virgin and tested materials has begun, and initial results are also presented.

  17. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified; the journal is the most significant new initiative of SoLAR.

  18. Results of the U.S. Geological Survey's Analytical Evaluation Program for Standard Reference Samples Distributed in March 2000

    USGS Publications Warehouse

    Farrar, Jerry W.; Copen, Ashley M.

    2000-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-161 (trace constituents), M-154 (major constituents), N-65 (nutrient constituents), N-66 nutrient constituents), P-34 (low ionic strength constituents), and Hg-30 (mercury) -- that were distributed in March 2000 to 144 laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data that were received from 132 of the laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.
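
    A "most probable value" from an interlaboratory round is typically a robust location estimate; the sketch below uses the median with a MAD-based outlier screen as one plausible nonparametric recipe (the reported values are invented, and the exact USGS procedure may differ):

```python
import statistics

# Robust "most probable value" for one analyte across many labs:
# median location, with labs more than k scaled-MADs away excluded.
# Reported concentrations below are invented, not USGS data.

def most_probable_value(results, k=3.0):
    med = statistics.median(results)
    mad = statistics.median(abs(r - med) for r in results)
    kept = [r for r in results
            if mad == 0 or abs(r - med) <= k * 1.4826 * mad]
    return statistics.median(kept), kept

labs = [10.1, 9.9, 10.0, 10.2, 10.1, 9.8, 15.7]   # one outlying lab
mpv, kept = most_probable_value(labs)
print(round(mpv, 2), len(kept))  # → 10.05 6
```

    The median resists the outlying laboratory entirely, which is why nonparametric statistics suit interlaboratory data better than the mean.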

  19. Results of the U.S. Geological Survey's analytical evaluation program for standard reference samples distributed in October 1999

    USGS Publications Warehouse

    Farrar, T.W.

    2000-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-159 (trace constituents), M-152 (major constituents), N-63 (nutrient constituents), N-64 (nutrient constituents), P-33 (low ionic strength constituents), and Hg-29 (mercury) -- that were distributed in October 1999 to 149 laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data that were received from 131 of the laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  20. Results of the U. S. Geological Survey's analytical evaluation program for standard reference samples distributed in October 2000

    USGS Publications Warehouse

    Connor, B.F.; Currier, J.P.; Woodworth, M.T.

    2001-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for six standard reference samples -- T-163 (trace constituents), M-156 (major constituents), N-67 (nutrient constituents), N-68 (nutrient constituents), P-35 (low ionic strength constituents), and Hg-31 (mercury) -- that were distributed in October 2000 to 126 laboratories enrolled in the U.S. Geological Survey sponsored interlaboratory testing program. Analytical data that were received from 122 of the laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the six reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the six standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.

  1. Calibrated Noise Measurements with Induced Receiver Gain Fluctuations

    NASA Technical Reports Server (NTRS)

    Racette, Paul; Walker, David; Gu, Dazhen; Rajola, Marco; Spevacek, Ashly

    2011-01-01

    The lack of well-developed techniques for modeling changing statistical moments in our observations has stymied the application of stochastic process theory in science and engineering. These limitations were encountered when modeling the performance of radiometer calibration architectures and algorithms in the presence of nonstationary receiver fluctuations. Analyses of measured signals have traditionally been limited to a single measurement series, whereas in a radiometer that samples a set of noise references, the data collection can be treated as an ensemble set of measurements of the receiver state. Noise Assisted Data Analysis (NADA) is a growing field of study with significant potential for aiding the understanding and modeling of nonstationary processes. Typically, NADA entails adding noise to a signal to produce an ensemble set on which statistical analysis is performed. Alternatively, as in radiometric measurements, mixing a signal with calibrated noise provides, through the calibration process, the means to detect deviations from the stationarity assumption and thereby a measurement tool to characterize the signal's nonstationary properties. Data sets composed of calibrated noise measurements have been limited to those collected with naturally occurring fluctuations in the radiometer receiver. To examine the application of NADA using calibrated noise, a Receiver Gain Modulation Circuit (RGMC) was designed and built to modulate the gain of a radiometer receiver using an external signal. In 2010, an RGMC was installed and operated at the National Institute of Standards and Technology (NIST) using their Noise Figure Radiometer (NFRad) and national standard noise references. The data collected is the first known set of calibrated noise measurements from a receiver with an externally modulated gain.
As an initial step, sinusoidal and step-function signals were used to modulate the receiver gain, to evaluate the circuit characteristics and to study the performance of
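
    The reason calibrated noise references make gain fluctuations observable can be seen in an idealized two-point calibration sketch (illustrative temperatures and a noiseless square-law detector assumed):

```python
import math

# Two-point radiometer calibration: each cycle solves
#   T = T_cold + (v - v_cold)/(v_hot - v_cold)*(T_hot - T_cold),
# so a gain factor g(t) common to all three measurements cancels.
# Temperatures and the gain waveform are illustrative.

T_cold, T_hot, T_scene = 80.0, 300.0, 150.0   # reference/scene temps, K

def voltage(T, g):
    """Idealized noiseless detector output, proportional to gain."""
    return g * T

recovered = []
for k in range(100):
    g = 1.0 + 0.3 * math.sin(0.2 * k)         # sinusoidally modulated gain
    v_c, v_h, v_s = voltage(T_cold, g), voltage(T_hot, g), voltage(T_scene, g)
    recovered.append(T_cold + (v_s - v_c) / (v_h - v_c) * (T_hot - T_cold))

# Despite the fluctuating gain, every calibrated sample recovers T_scene.
print(max(abs(r - T_scene) for r in recovered) < 1e-9)  # → True
```

    With receiver noise added, the residuals of this calibration become the measurement of the gain's nonstationary statistics that the RGMC experiments exploit.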

  2. Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children

    PubMed Central

    Lee, Hye Ryun; Roh, Eun Youn; Chang, Ju Young

    2016-01-01

    Background: Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. Methods: A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. Results: As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. Conclusions: We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age. PMID:27374715
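
    The two interval estimators mentioned (parametric mean ± 2 SD versus the nonparametric 2.5th-97.5th percentile) can be compared on a synthetic sample; the data below are simulated, not the Korean cohort:

```python
import random
import statistics

# Compare parametric and nonparametric reference-interval estimates on
# a synthetic analyte. Sample size mirrors the study's n=534; the
# "hemoglobin" values themselves are simulated.

random.seed(1)
hb = [random.gauss(12.0, 1.0) for _ in range(534)]  # toy values, g/dL

# Parametric interval: mean +/- 2 standard deviations.
mean, sd = statistics.mean(hb), statistics.stdev(hb)
parametric = (mean - 2 * sd, mean + 2 * sd)

# Nonparametric interval: central 95% of the ranked sample.
ranked = sorted(hb)
lo = ranked[int(0.025 * len(ranked))]
hi = ranked[int(0.975 * len(ranked)) - 1]
nonparametric = (lo, hi)

print(parametric, nonparametric)  # both intervals land near (10, 14)
```

    The two estimates agree here because the simulated data are Gaussian; for skewed analytes such as ferritin, the nonparametric interval is the safer choice.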

  3. Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children.

    PubMed

    Lee, Hye Ryun; Shin, Sue; Yoon, Jong Hyun; Roh, Eun Youn; Chang, Ju Young

    2016-09-01

    Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age.

  4. ANALYTICAL METHODS AND QUALITY ASSURANCE CRITERIA FOR LC/ES/MS DETERMINATION OF PFOS IN FISH

    EPA Science Inventory

    PFOS, perfluorooctanesulfonate, has recently received much attention from environmental researchers. Previous analytical methods were based upon complexing with a strong ion-pairing reagent and extraction into MTBE. Detection was done on a concentrate using negative ion LC/ES/MS/...

  5. Patients Receiving Prebiotics and Probiotics Before Liver Transplantation Develop Fewer Infections Than Controls: A Systematic Review and Meta-Analysis.

    PubMed

    Sawas, Tarek; Al Halabi, Shadi; Hernaez, Ruben; Carey, William D; Cho, Won Kyoo

    2015-09-01

    Among patients who have received liver transplants, infections increase morbidity and mortality and prolong hospital stays. Administration of antibiotics and surgical trauma create intestinal barrier dysfunction and microbial imbalances that allow enteric bacteria to translocate to the blood. Probiotics are believed to prevent bacterial translocation by stabilizing the intestinal barrier and stimulating proliferation of the intestinal epithelium, mucus secretion, and motility. We performed a meta-analysis to determine the effects of probiotics on infections in patients receiving liver transplants. We searched PubMed and EMBASE for controlled trials that evaluated the effects of prebiotics and probiotics on infections in patients who underwent liver transplantation. Heterogeneity was analyzed by the Cochran Q statistic. Pooled Mantel-Haenszel relative risks were calculated with a fixed-effects model. We identified 4 controlled studies, comprising 246 participants (123 received probiotics, 123 served as controls), for inclusion in the meta-analysis. In these studies, the intervention groups received enteric nutrition and fiber (prebiotics) with probiotics, and the control groups received only enteric nutrition and fiber without probiotics. The infection rate was 7% in groups that received probiotics vs 35% in control groups (relative risk [RR], 0.21; 95% confidence interval [CI], 0.11-0.41; P = .001). The number needed to treat to prevent 1 infection was 3.6. In subgroup analyses, only 2% of subjects in the probiotic groups developed urinary tract infections, compared with 16% of controls (RR, 0.14; 95% CI, 0.04-0.47; P < .001); only 2% of subjects in the probiotic groups developed intra-abdominal infections, compared with 11% of controls (RR, 0.27; 95% CI, 0.09-0.78; P = .02). Subjects receiving probiotics also had shorter stays in the hospital than controls (mean difference, 1.41 d; P < .001), as well as in the intensive care unit (mean difference, 1.41 d; P
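The pooled relative risk and number needed to treat quoted above can be reproduced arithmetically. The sketch below uses a single hypothetical 2×2 table whose counts approximately match the pooled totals (about 7% of 123 vs 35% of 123); the actual meta-analysis pooled four study-level tables:

```python
def mantel_haenszel_rr(tables):
    """Fixed-effect pooled relative risk (Mantel-Haenszel).
    tables: list of (events_trt, n_trt, events_ctl, n_ctl) per study."""
    num = sum(a * n_c / (n_t + n_c) for a, n_t, c, n_c in tables)
    den = sum(c * n_t / (n_t + n_c) for a, n_t, c, n_c in tables)
    return num / den

# Hypothetical counts chosen to match the abstract's pooled rates,
# not the real per-study data.
tables = [(9, 123, 43, 123)]
rr = mantel_haenszel_rr(tables)      # ~0.21, as reported

# Number needed to treat = 1 / absolute risk reduction
nnt = 1 / (43 / 123 - 9 / 123)       # ~3.6, as reported
```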

  6. An Overview of Learning Analytics

    ERIC Educational Resources Information Center

    Clow, Doug

    2013-01-01

    Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical…

  7. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
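A minimal numerical sketch of a joint partition function with two moment orders, applied to aligned binomial cascades; the function names and normalization here are illustrative and may differ from the paper's formulation:

```python
def binomial_measure(p, levels):
    """Deterministic binomial cascade measure on 2**levels dyadic cells."""
    m = [1.0]
    for _ in range(levels):
        m = [x * w for x in m for w in (p, 1.0 - p)]
    return m

def joint_partition(mu1, mu2, q1, q2, box):
    """Joint partition function: sum over boxes of m1(box)**q1 * m2(box)**q2."""
    total = 0.0
    for i in range(0, len(mu1), box):
        m1 = sum(mu1[i:i + box])
        m2 = sum(mu2[i:i + box])
        total += m1 ** q1 * m2 ** q2
    return total

mu1 = binomial_measure(0.3, 10)   # two aligned cascades, hence
mu2 = binomial_measure(0.4, 10)   # multifractally cross-correlated

# For an exact multifractal the partition function scales as a power law
# in box size, so the ratio between successive dyadic scales is constant;
# the joint mass exponent is the slope of log(chi) vs log(box size).
chis = [joint_partition(mu1, mu2, 2.0, 2.0, s) for s in (1, 2, 4, 8)]
```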

  8. Study on bending behaviour of nickel–titanium rotary endodontic instruments by analytical and numerical analyses

    PubMed Central

    Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S

    2013-01-01

    Aim To develop analytical models and analyse the stress distribution and flexibility of nickel–titanium (NiTi) instruments subject to bending forces. Methodology The analytical method was used to analyse the behaviours of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived using Euler–Bernoulli nonlinear differential equations that took into account the screw pitch variation of these NiTi instruments. In addition, nonlinear deformation analyses were carried out using both the analytical model and a nonlinear finite element model. Results According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. Results of the finite element analysis revealed that the position of maximum von Mises stress was near the instrument tip. Therefore, the proposed analytical model can be used to predict the position of maximum curvature in the instrument, where fracture may occur. Finally, the results of the analytical and numerical models were compatible. Conclusion The proposed analytical model was validated by numerical results in analysing the bending deformation of NiTi instruments. The analytical model is useful in the design and analysis of instruments and is effective in studying their flexibility. Compared with the finite element method, the analytical model deals conveniently and effectively with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762

  9. The Promise and Peril of Predictive Analytics in Higher Education: A Landscape Analysis

    ERIC Educational Resources Information Center

    Ekowo, Manuela; Palmer, Iris

    2016-01-01

    Predictive analytics in higher education is a hot-button topic among educators and administrators as institutions strive to better serve students by becoming more data-informed. In this paper, the authors describe how predictive analytics are used in higher education to identify students who need extra support, steer students in courses they will…

  10. Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Samatova, Nagiza; Wu, Kesheng

    This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high-performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprising a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.

  11. Tunable lasers and their application in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.

    1975-01-01

    The impact that laser techniques might have in chemical analysis is examined. Absorption, scattering, and heterodyne detection are considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing, and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investment in instrument manufacturing.

  12. The Receiver Operational Characteristic for Binary Classification with Multiple Indices and Its Application to the Neuroimaging Study of Alzheimer’s Disease

    PubMed Central

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2014-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis. PMID:23702553
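The logical combination rules described above ("AND", "OR", "at least n") can be illustrated on toy thresholded indices. The data and names below are hypothetical, not from the AD study:

```python
def sens_spec(pred, truth):
    """Sensitivity and specificity of binary calls against ground truth."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    tn = sum(1 for p, t in zip(pred, truth) if not p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if not p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    return tp / (tp + fn), tn / (tn + fp)

def combine(calls, n):
    """'At least n' rule over per-index binary calls:
    n = len(calls) gives AND, n = 1 gives OR."""
    return [sum(col) >= n for col in zip(*calls)]

# Toy data: two already-thresholded indices, 4 patients and 4 controls.
truth = [1, 1, 1, 1, 0, 0, 0, 0]
idx_a = [1, 1, 1, 0, 1, 0, 0, 0]
idx_b = [1, 1, 0, 1, 0, 1, 0, 0]

sens_and, spec_and = sens_spec(combine([idx_a, idx_b], 2), truth)  # AND
sens_or, spec_or = sens_spec(combine([idx_a, idx_b], 1), truth)    # OR
```

On this toy data the AND rule trades sensitivity for specificity and the OR rule does the reverse, which is the trade-off the multiV-ROC curve sweeps over.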

  13. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    PubMed

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine representative modes of ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased the detection sensitivity. We believe that these advances will promote the routine use of ambient mass spectrometry. Graphical abstract Scheme of sampling strategies for ambient mass spectrometry.

  14. Reevaluation of analytical methods for photogenerated singlet oxygen

    PubMed Central

    Nakamura, Keisuke; Ishiyama, Kirika; Ikai, Hiroyo; Kanno, Taro; Sasaki, Keiichi; Niwano, Yoshimi; Kohno, Masahiro

    2011-01-01

    The aim of the present study is to compare different analytical methods for singlet oxygen and to discuss an appropriate way to evaluate the yield of singlet oxygen photogenerated from photosensitizers. Singlet oxygen photogenerated from rose bengal was evaluated by electron spin resonance analysis using sterically hindered amines, spectrophotometric analysis of 1,3-diphenylisobenzofuran oxidation, and analysis with a fluorescent probe (Singlet Oxygen Sensor Green®). All of the analytical methods could evaluate the relative yield of singlet oxygen. The sensitivity of the analytical methods was 1,3-diphenylisobenzofuran < electron spin resonance < Singlet Oxygen Sensor Green®. However, Singlet Oxygen Sensor Green® could be used only when the concentration of rose bengal was very low (<1 µM). In addition, since the absorption spectrum of 1,3-diphenylisobenzofuran changes considerably under irradiation with a 405 nm laser, photosensitizers that are excited by light with a wavelength of around 400 nm, such as hematoporphyrin, cannot be used in the 1,3-diphenylisobenzofuran oxidation method. On the other hand, electron spin resonance analysis using a sterically hindered amine, especially 2,2,6,6-tetramethyl-4-piperidinol and 2,2,5,5-tetramethyl-3-pyrroline-3-carboxamide, had proper sensitivity and a wide detectable range for the yield of photogenerated singlet oxygen. Therefore, in photodynamic therapy, it is suggested that the relative yield of singlet oxygen generated by various photosensitizers can be evaluated properly by electron spin resonance analysis. PMID:21980223

  15. Analytical relationships for prediction of the mechanical properties of additively manufactured porous biomaterials

    PubMed Central

    Hedayati, Reza

    2016-01-01

    Recent developments in additive manufacturing techniques have motivated an increasing number of researchers to study regular porous biomaterials that are based on repeating unit cells. The physical and mechanical properties of such porous biomaterials have therefore received increasing attention during recent years. One area that has been revived is the analytical study of the mechanical behavior of regular porous biomaterials, with the aim of deriving analytical relationships that can predict the relative density and mechanical properties of porous biomaterials, given the design and dimensions of their repeating unit cells. In this article, we review the analytical relationships that have been presented in the literature for predicting the relative density, elastic modulus, Poisson's ratio, yield stress, and buckling limit of regular porous structures based on various types of unit cells. The reviewed analytical relationships are used to compare the mechanical properties of porous biomaterials based on different types of unit cells. The major areas where the analytical relationships have improved during recent years are discussed, and suggestions are made for future research directions. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 3164–3174, 2016. PMID:27502358
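As one classical instance of the kind of closed-form relationship such reviews cover, the Gibson–Ashby scaling law ties elastic modulus to relative density for bending-dominated unit cells. The coefficients below are the commonly quoted textbook values, not necessarily those derived in this review:

```python
def gibson_ashby_modulus(E_solid, rel_density, C=1.0, n=2.0):
    """Gibson-Ashby scaling for open-cell, bending-dominated lattices:
    E / E_s = C * (rho / rho_s)**n, with C ~ 1 and n ~ 2."""
    return E_solid * C * rel_density ** n

# E.g. a Ti-6Al-4V-like solid (E_s ~ 110 GPa) at 20% relative density
E = gibson_ashby_modulus(110e9, 0.20)   # a few GPa, bone-like stiffness
```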

  16. Image analysis driven single-cell analytics for systems microbiology.

    PubMed

    Balomenos, Athanasios D; Tsakanikas, Panagiotis; Aspridou, Zafiro; Tampakaki, Anastasia P; Koutsoumanis, Konstantinos P; Manolakos, Elias S

    2017-04-04

    Time-lapse microscopy is an essential tool for capturing and correlating bacterial morphology and gene expression dynamics at single-cell resolution. However, state-of-the-art computational methods are limited in the complexity of the cell movies they can analyze and in their degree of automation. The proposed Bacterial image analysis driven Single Cell Analytics (BaSCA) computational pipeline addresses these limitations, thus enabling high-throughput systems microbiology. BaSCA can segment and track multiple bacterial colonies and single cells as they grow and divide over time (cell segmentation and lineage tree construction) to give rise to dense communities with thousands of interacting cells in the field of view. It combines advanced image processing and machine learning methods to deliver very accurate bacterial cell segmentation and tracking (F-measure over 95%), even when processing images of imperfect quality with several overcrowded colonies in the field of view. In addition, BaSCA extracts on the fly a plethora of single-cell properties, which are organized into a database summarizing the analysis of the cell movie. We present alternative ways to analyze and visually explore the spatiotemporal evolution of single-cell properties in order to understand trends and epigenetic effects across cell generations. The robustness of BaSCA is demonstrated across different imaging modalities and microscopy types. BaSCA can be used to analyze cell movies accurately and efficiently, both at high resolution (single-cell level) and at large scale (communities with many dense colonies), as needed to shed light on, e.g., how bacterial community effects and epigenetic information transfer play a role in phenomena important for human health, such as biofilm formation and the emergence of persisters. Moreover, it enables studying the role of single-cell stochasticity without losing sight of the community effects that may drive it.
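The quoted segmentation and tracking accuracy is an F-measure, i.e. the harmonic mean of precision and recall. For reference, with hypothetical counts:

```python
def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall over detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 970 correctly segmented cells,
# 20 false detections, 15 missed cells.
f = f_measure(970, 20, 15)   # comfortably over the 95% threshold
```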

  17. A novel magnet focusing plate for matrix-assisted laser desorption/ionization analysis of magnetic bead-bound analytes.

    PubMed

    Gode, David; Volmer, Dietrich A

    2013-05-15

    Magnetic beads are often used for serum profiling of peptide and protein biomarkers. In these assays, the bead-bound analytes are eluted from the beads prior to mass spectrometric analysis. This study describes a novel matrix-assisted laser desorption/ionization (MALDI) technique for direct application and focusing of magnetic beads to MALDI plates by means of dedicated micro-magnets as sample spots. Custom-made MALDI plates with magnetic focusing spots were made using small nickel-coated neodymium micro-magnets integrated into a stainless steel plate in a 16 × 24 (384) pattern. For demonstrating the proof-of-concept, commercial C-18 magnetic beads were used for the extraction of a test compound (reserpine) from aqueous solution. Experiments were conducted to study focusing abilities, the required laser energies, the influence of a matrix compound, dispensing techniques, solvent choice and the amount of magnetic beads. Dispensing the magnetic beads onto the micro-magnet sample spots resulted in immediate and strong binding to the magnetic surface. Light microscope images illustrated the homogeneous distribution of beads across the surfaces of the magnets, when the entire sample volume containing the beads was pipetted onto the surface. Subsequent MALDI analysis of the bead-bound analyte demonstrated excellent and reproducible ionization yields. The surface-assisted laser desorption/ionization (SALDI) properties of the strongly light-absorbing γ-Fe2O3-based beads resulted in similar ionization efficiencies to those obtained from experiments with an additional MALDI matrix compound. This feasibility study successfully demonstrated the magnetic focusing abilities for magnetic bead-bound analytes on a novel MALDI plate containing small micro-magnets as sample spots. One of the key advantages of this integrated approach is that no elution steps from magnetic beads were required during analyses compared with conventional bead experiments.

  18. Performance Analysis of Low-Cost Single-Frequency GPS Receivers in Hydrographic Surveying

    NASA Astrophysics Data System (ADS)

    Elsobeiey, M.

    2017-10-01

    The International Hydrographic Organization (IHO) has issued standards that specify the minimum requirements for executing different types of hydrographic surveys to collect data used to compile navigational charts. Such standards are updated from time to time to reflect new survey techniques and practices, and they must be met to assure both surface navigation safety and marine environment protection. Hydrographic surveys can be classified into four orders: special order, order 1a, order 1b, and order 2. The order of hydrographic survey to use should be determined in accordance with the importance to the safety of navigation in the surveyed area. Typically, geodetic-grade dual-frequency GPS receivers are utilized for position determination during data collection in hydrographic surveys. However, with the evolution of high-sensitivity, low-cost single-frequency receivers, it is very important to evaluate the performance of such receivers. This paper investigates the performance of low-cost single-frequency GPS receivers in hydrographic surveying applications. The main objective is to examine whether low-cost single-frequency receivers fulfil the IHO standards for hydrographic surveys. It is shown that the low-cost single-frequency receivers meet the IHO horizontal accuracy requirements for all hydrographic survey orders at any depth. However, for vertical accuracy, the single-frequency receivers meet only order 2 requirements, and only at depths of 100 m or greater.
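A sketch of the IHO vertical-accuracy check: the allowable total vertical uncertainty (TVU) grows with depth, which is why a fixed receiver error can pass order 2 only beyond some depth. The coefficients below are those published in IHO S-44 (5th edition); verify them against the current edition before relying on them:

```python
import math

# TVU at 95% confidence: TVU = sqrt(a**2 + (b * d)**2), with (a, b)
# depending on the survey order (values from IHO S-44, 5th edition).
ORDERS = {
    "special": (0.25, 0.0075),
    "1a": (0.5, 0.013),
    "1b": (0.5, 0.013),
    "2": (1.0, 0.023),
}

def max_tvu(order, depth_m):
    """Maximum allowable vertical uncertainty (m) at a given depth."""
    a, b = ORDERS[order]
    return math.sqrt(a ** 2 + (b * depth_m) ** 2)

# At 100 m depth, order 2 tolerates ~2.5 m of vertical uncertainty,
# while order 1a tolerates only ~1.4 m.
tol_2 = max_tvu("2", 100.0)
tol_1a = max_tvu("1a", 100.0)
```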

  19. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    PubMed

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Service Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting

  20. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned

    PubMed Central

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    Background As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation’s largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. Methods To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Service Administration.1 In addition, interviews with health center leadership and staff were completed to understand the context for the findings. Results The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population

  1. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  2. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  3. Applications of reversible covalent chemistry in analytical sample preparation.

    PubMed

    Siegel, David

    2012-12-07

    Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecular imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.

  4. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    PubMed

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse
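The analytical formulation itself is not reproduced in the abstract, but the range-energy dependence underlying any Bragg peak model can be sketched with the Bragg-Kleeman rule. The fit coefficients below are commonly quoted approximations for protons in water, an assumption rather than the paper's parameterization (the TPS in the study uses experimental range values):

```python
def proton_range_water_cm(energy_mev, alpha=0.0022, p=1.77):
    """Bragg-Kleeman range-energy rule R = alpha * E**p for protons in
    water; alpha and p are commonly quoted fit values, not the paper's."""
    return alpha * energy_mev ** p

# A 24-MeV beam (as in the mouse example) has a range of roughly 6 mm
# in water, i.e. on the scale of a millimetric tumor target.
r = proton_range_water_cm(24.0)
```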

  5. Results of the U.S. Geological Survey's Analytical Evaluation Program for standard reference samples distributed in March 1999

    USGS Publications Warehouse

    Farrar, Jerry W.; Chleboun, Kimberly M.

    1999-01-01

    This report presents the results of the U.S. Geological Survey's analytical evaluation program for eight standard reference samples -- T-157 (trace constituents), M-150 (major constituents), N-61 (nutrient constituents), N-62 (nutrient constituents), P-32 (low ionic strength constituents), GWT-5 (ground-water trace constituents), GWM-4 (ground-water major constituents), and Hg-28 (mercury) -- that were distributed in March 1999 to 120 laboratories enrolled in the U.S. Geological Survey-sponsored interlaboratory testing program. Analytical data received from 111 of the laboratories were evaluated with respect to overall laboratory performance and relative laboratory performance for each analyte in the eight reference samples. Results of these evaluations are presented in tabular form. Also presented are tables and graphs summarizing the analytical data provided by each laboratory for each analyte in the eight standard reference samples. The most probable value for each analyte was determined using nonparametric statistics.
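
    The "most probable value determined using nonparametric statistics" step can be illustrated with a median-based sketch. The rating scheme below is a simplified stand-in, not the USGS's actual procedure, and the reported values are hypothetical.

```python
import statistics

def most_probable_value(results):
    """Nonparametric estimate of an analyte's most probable value (MPV)
    from interlaboratory results: the median is robust to the outlying
    values that individual laboratories occasionally report."""
    return statistics.median(results)

def rate_laboratory(value, results):
    """Score one lab's result by its deviation from the MPV in units of
    the interquartile range (a simplified stand-in for the survey's
    actual rating scheme)."""
    q1, _, q3 = statistics.quantiles(results, n=4)
    iqr = q3 - q1
    return abs(value - most_probable_value(results)) / iqr if iqr else 0.0

reported = [10.1, 9.8, 10.0, 10.3, 9.9, 14.2]   # hypothetical mg/L results
mpv = most_probable_value(reported)              # median ignores the outlier
```

The median-based MPV is unaffected by the single outlying result, which is exactly why a nonparametric statistic is preferred over the mean for interlaboratory data.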

  6. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., Gaithersburg, MD 20877-2417. (d) Manual of Analytical Methods for the Analysis of Pesticide Residues in Human... Splittstoesser (Editors), American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC 20005. (b... Examination of Dairy Products, American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC...

  7. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Gaithersburg, MD 20877-2417. (d) Manual of Analytical Methods for the Analysis of Pesticide Residues in Human... Splittstoesser (Editors), American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC 20005. (b... Examination of Dairy Products, American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC...

  8. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., Gaithersburg, MD 20877-2417. (d) Manual of Analytical Methods for the Analysis of Pesticide Residues in Human... Splittstoesser (Editors), American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC 20005. (b... Examination of Dairy Products, American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC...

  9. 7 CFR 94.4 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., Gaithersburg, MD 20877-2417. (d) Manual of Analytical Methods for the Analysis of Pesticide Residues in Human... Splittstoesser (Editors), American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC 20005. (b... Examination of Dairy Products, American Public Health Association, 1015 Fifteenth Street, NW, Washington, DC...

  10. Microemulsification: an approach for analytical determinations.

    PubMed

    Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L

    2014-09-16

    We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance, taking into account parameters such as precision, linearity, robustness, and accuracy. The approach relies on the effect of the analyte content on the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). This phenomenon was expressed by the minimum volume fraction of amphiphile required to form a microemulsion (Φ(ME)), which served as the analytical signal of the method. The measurements can thus be taken by visually monitoring the transition of the dispersions from cloudy to transparent during microemulsification, like a titration, bypassing the need for electric energy. The studies performed were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameters of the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The microemulsification procedures were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of a micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation, based on visual analysis of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of the ME, showing the flexibility of the developed method. The linear range was fairly broad, with limits of linearity up to 70.00% water in ethanol. For monoethylene glycol in
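
    How Φ(ME) serves as the analytical signal can be sketched as a linear calibration that is then inverted for an unknown sample. All numbers below are invented for illustration; the real slope and linear range come from the paper's calibration experiments.

```python
import numpy as np

# Hypothetical calibration: minimum ethanol (amphiphile) volume fraction
# needed to reach the cloudy-to-transparent transition, versus the water
# content of the fuel sample (both in %). A linear response is assumed,
# as reported for the method's broad linear range.
water_pct = np.array([0.0, 10.0, 20.0, 40.0, 70.0])
phi_me    = np.array([20.0, 26.1, 31.9, 44.2, 62.0])

slope, intercept = np.polyfit(water_pct, phi_me, 1)

def water_content(phi_me_measured):
    """Invert the calibration line to quantify water in an unknown."""
    return (phi_me_measured - intercept) / slope
```

An unknown sample titrated to its transition point yields one Φ(ME) value, which the inverted line converts directly to a concentration, with no instrument beyond a micropipette and a thermostatic bath.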

  11. An analysis of beam parameters on proton-acoustic waves through an analytic approach.

    PubMed

    Kipergil, Esra Aytac; Erkol, Hakan; Kaya, Serhat; Gulsen, Gultekin; Unlu, Mehmet Burcin

    2017-06-21

    It has been reported that acoustic waves are generated when a high-energy pulsed proton beam is deposited in a small volume within tissue. One possible application of proton-induced acoustics is to obtain real-time feedback for intra-treatment adjustments by monitoring such acoustic waves. A high spatial resolution in ultrasound imaging may reduce proton range uncertainty. Thus, it is crucial to understand the dependence of the acoustic waves on the proton beam characteristics. In this manuscript, an analytic solution for the proton-induced acoustic wave is first presented to reveal the dependence of the signal on the beam parameters; it is then combined with an analytic approximation of the Bragg curve. The influence of beam energy, pulse duration and beam diameter on the acoustic waveform is investigated. Further analysis is performed on the Fourier decomposition of the proton-acoustic signals. Our results show that a smaller spill time of the proton beam increases the amplitude of the acoustic wave for a constant number of protons, which is hence beneficial for dose monitoring. An increase in the energy of each individual proton in the beam leads to spatial broadening of the Bragg curve, which also yields acoustic waves of greater amplitude. The pulse duration and the beam width of the proton beam do not affect the central frequency of the acoustic wave, but they change the amplitude of the spectral components.
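
    The reported spill-time dependence follows from a leading-order scaling argument, sketched below. This is only the first-order scaling for a fixed number of protons, not the manuscript's full analytic solution.

```python
def acoustic_amplitude(total_energy, spill_time):
    """Relative proton-acoustic amplitude for a fixed number of protons:
    proportional to the mean energy deposition rate E/tau. The central
    frequency, by contrast, is set by the spatial extent of the Bragg
    peak, which this scaling does not touch."""
    return total_energy / spill_time

a_long  = acoustic_amplitude(1.0, 10e-6)   # 10-microsecond spill
a_short = acoustic_amplitude(1.0, 5e-6)    # 5-microsecond spill
# halving the spill time doubles the amplitude for the same proton count
```

This is why short spills are beneficial for dose monitoring: the same delivered dose produces a proportionally larger acoustic signal.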

  12. Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP)

    EPA Pesticide Factsheets

    The Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) provides guidance for the planning, implementation and assessment phases of projects that require laboratory analysis of radionuclides.

  13. Targeted analyte deconvolution and identification by four-way parallel factor analysis using three-dimensional gas chromatography with mass spectrometry data.

    PubMed

    Watson, Nathanial E; Prebihalo, Sarah E; Synovec, Robert E

    2017-08-29

    Comprehensive three-dimensional gas chromatography with time-of-flight mass spectrometry (GC³-TOFMS) creates an opportunity to explore a new paradigm in chemometric analysis. Using this newly described instrument and the well-understood Parallel Factor Analysis (PARAFAC) model, we present one option for utilizing the novel GC³-TOFMS data structure. We present a method that builds upon previous work in both GC³ and targeted analysis using PARAFAC to simplify some of the implementation challenges previously discovered. Conceptualizing the GC³-TOFMS instead as a one-dimensional gas chromatograph with GC × GC-TOFMS detection, we allow the instrument to create the PARAFAC target window natively. Each first-dimension modulation thus creates a full GC × GC-TOFMS chromatogram fully amenable to PARAFAC. A simple mixture of 115 compounds and a diesel sample were interrogated with this methodology. All test analyte targets were successfully identified in both mixtures. In addition, mass spectral matching of the PARAFAC loadings to library spectra yielded match values greater than 900 in 40 of 42 test analyte cases; twenty-nine of these produced match values greater than 950. Copyright © 2017 Elsevier B.V. All rights reserved.
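
    PARAFAC decomposes each multiway data block into trilinear factors. Below is a minimal alternating-least-squares implementation for a 3-way array, an illustrative sketch rather than the authors' software; a production chemometric analysis would use a dedicated package with nonnegativity constraints and convergence diagnostics.

```python
import numpy as np

def unfold(t, mode):
    """Matricize a 3-way array along one mode (C-order flattening)."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def khatri_rao(a, b):
    """Column-wise Kronecker product of two factor matrices."""
    return (a[:, None, :] * b[None, :, :]).reshape(-1, a.shape[1])

def parafac(x, rank, n_iter=500, seed=0):
    """Minimal alternating-least-squares PARAFAC for a 3-way array;
    returns the factor matrices for modes 0, 1 and 2."""
    rng = np.random.default_rng(seed)
    a, b, c = (rng.standard_normal((s, rank)) for s in x.shape)
    for _ in range(n_iter):
        a = unfold(x, 0) @ np.linalg.pinv(khatri_rao(b, c).T)
        b = unfold(x, 1) @ np.linalg.pinv(khatri_rao(a, c).T)
        c = unfold(x, 2) @ np.linalg.pinv(khatri_rao(a, b).T)
    return a, b, c

# synthetic rank-2 block standing in for a (retention x retention x m/z)
# target window; the factors play the role of elution and spectral profiles
rng = np.random.default_rng(1)
true = [rng.standard_normal((s, 2)) for s in (6, 5, 4)]
x = np.einsum('ir,jr,kr->ijk', *true)
a, b, c = parafac(x, rank=2)
x_hat = np.einsum('ir,jr,kr->ijk', a, b, c)
```

For exact trilinear data the reconstruction converges to the original block; on real chromatograms the recovered mode-2 loadings are the deconvolved mass spectra that get matched against the library.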

  14. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  15. Analytic Analysis of Convergent Shocks to Multi-Gigabar Conditions

    NASA Astrophysics Data System (ADS)

    Ruby, J. J.; Rygg, J. R.; Collins, G. W.; Bachmann, B.; Doeppner, T.; Ping, Y.; Gaffney, J.; Lazicki, A.; Kritcher, A. L.; Swift, D.; Nilsen, J.; Landen, O. L.; Hatarik, R.; Masters, N.; Nagel, S.; Sterne, P.; Pardini, T.; Khan, S.; Celliers, P. M.; Patel, P.; Gericke, D.; Falcone, R.

    2017-10-01

    The gigabar experimental platform at the National Ignition Facility is designed to increase understanding of the physical states and processes that dominate in hydrogen at pressures from several hundred Mbar to tens of Gbar. Recent experiments using a solid CD2 ball reached temperatures and densities of order 10⁷ K and several tens of g/cm³, respectively. These conditions lead to the production of D-D fusion neutrons and x-ray bremsstrahlung photons, which allow us to place constraints on the thermodynamic states at peak compression. We use an analytic model to connect the neutron and x-ray emission with the state variables at peak compression. This analytic model is based on the self-similar Guderley solution of an imploding shock wave and the self-similar solution of the point explosion with heat conduction from Reinicke. Work is also being done to create a fully self-similar solution of an imploding shock wave coupled with heat conduction and radiation transport using a general equation of state. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  16. Behavior Analytic Contributions to the Study of Creativity

    ERIC Educational Resources Information Center

    Kubina, Richard M., Jr.; Morrison, Rebecca S.; Lee, David L.

    2006-01-01

    As researchers continue to study creativity, a behavior analytic approach may provide new vistas by offering an additional perspective. Contemporary behavior analysis began with B. F. Skinner and offers a selectionist approach to the scientific investigation of creativity. Behavior analysis contributes to the study of creativity by…

  17. The magnetic particle in a box: Analytic and micromagnetic analysis of probe-localized spin wave modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adur, Rohan, E-mail: adur@physics.osu.edu; Du, Chunhui; Manuilov, Sergei A.

    2015-05-07

    The dipole field from a probe magnet can be used to localize a discrete spectrum of standing spin wave modes in a continuous ferromagnetic thin film without lithographic modification to the film. Obtaining the resonance field for a localized mode is not trivial due to the effect of the confined and inhomogeneous magnetization precession. We compare the results of micromagnetic and analytic methods to find the resonance field of localized modes in a ferromagnetic thin film, and investigate the accuracy of these methods by comparing with a numerical minimization technique that assumes Bessel-function modes with pinned boundary conditions. We find that the micromagnetic technique, while computationally more intensive, reveals that the true magnetization profiles of localized modes are similar to Bessel functions with gradually decaying dynamic magnetization at the mode edges. We also find that an analytic solution, which is simple to implement and computationally much faster than the other methods, accurately describes the resonance field of localized modes when exchange fields are negligible, demonstrating the accessibility of localized-mode analysis.

  18. Multiplexed analysis combining distinctly-sized CdTe-MPA quantum dots and chemometrics for multiple mutually interfering analyte determination.

    PubMed

    Bittar, Dayana B; Ribeiro, David S M; Páscoa, Ricardo N M J; Soares, José X; Rodrigues, S Sofia M; Castro, Rafael C; Pezza, Leonardo; Pezza, Helena R; Santos, João L M

    2017-11-01

    Semiconductor quantum dots (QDs) have demonstrated great potential as fluorescent probes for heavy-metal monitoring. However, their high reactivity, whose tunability can be difficult to attain, can impair selectivity and yield analytical results with poor accuracy. In this work, the combination of multiple QDs in the same analysis, each with a particular ability to interact with the analyte, assured a multi-point detection that was exploited not only for more precise analyte discrimination but also for the simultaneous discrimination of multiple mutually interfering species in the same sample. Three different MPA-CdTe QDs (2.5, 3.0 and 3.8 nm) with a good size distribution, confirmed by FWHM values of 48.6, 55.4 and 80.8 nm, respectively, were used. Principal component analysis (PCA) and partial least squares regression (PLS) were used for fluorescence data analysis. Mixtures of two MPA-CdTe QDs emitting at different wavelengths, namely 549/566, 549/634 and 566/634 nm, were assayed. The 549/634 nm emitting QDs mixture provided the best results for the discrimination of distinct ions in binary and ternary mixtures. The RMSECV and R²CV values obtained for the binary mixture were good, ranging from 0.01 to 0.08 mg L⁻¹ and from 0.74 to 0.89, respectively. Regarding the ternary mixture, the RMSECV and R²CV values were good for Hg(II) (0.06 mg L⁻¹ and 0.73, respectively) and Pb(II) (0.08 mg L⁻¹ and 0.87, respectively) and acceptable for Cu(II) (0.02 mg L⁻¹ and 0.51, respectively). In conclusion, the results showed that the developed approach is capable of resolving binary and ternary mixtures of Pb(II), Hg(II) and Cu(II), providing accurate information about lead(II) and mercury(II) concentrations and signaling the occurrence of Cu(II). Copyright © 2017 Elsevier B.V. All rights reserved.
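
    The idea behind the multi-point detection can be shown with a simplified stand-in for the chemometric step: if each ion's effect on the two-QD fluorescence spectrum is known, a measured binary-mixture spectrum can be resolved by least squares. Band positions, response spectra and concentrations below are synthetic; the paper calibrates PCA/PLS models on measured standards instead.

```python
import numpy as np

wavelengths = np.linspace(500, 700, 201)

def band(center, width):
    """Gaussian emission band on the wavelength grid."""
    return np.exp(-((wavelengths - center) / width) ** 2)

# hypothetical per-ion response spectra of the 549/634 nm QD pair:
# each ion quenches the two emitters to a different degree
S = np.column_stack([
    band(549, 20) + 0.8 * band(634, 25),   # "Hg(II)" response
    0.6 * band(549, 20) + band(634, 25),   # "Pb(II)" response
])
true_conc = np.array([0.4, 0.7])           # mg/L, hypothetical
measured = S @ true_conc                   # noiseless mixture spectrum

# resolve the mutually interfering ions from the single spectrum
est, *_ = np.linalg.lstsq(S, measured, rcond=None)
```

Because the two ions affect the two emission bands in different ratios, the linear system is well conditioned and both concentrations are recovered from one spectrum; PLS plays the analogous role for real, noisy calibration data.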

  19. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that a high degree of prediction exists. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor, which displays some fluorescence modulation when analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays that attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized in micromachined wells on a silicon chip, thereby creating our "taste buds." Exposure of the resin to analyte causes a change in the transmittance of the bead; this change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.

  20. Influence of Desorption Conditions on Analyte Sensitivity and Internal Energy in Discrete Tissue or Whole Body Imaging by IR-MALDESI

    NASA Astrophysics Data System (ADS)

    Rosen, Elias P.; Bokhart, Mark T.; Ghashghaei, H. Troy; Muddiman, David C.

    2015-06-01

    Analyte signal in a laser desorption/postionization scheme such as infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) is strongly coupled to the degree of overlap between the desorbed plume of neutral material from a sample and an orthogonal electrospray. In this work, we systematically examine the effect of desorption conditions on IR-MALDESI response to pharmaceutical drugs and endogenous lipids in biological tissue using a design-of-experiments approach. Optimized desorption conditions have then been used to conduct an untargeted lipidomic analysis of whole-body sagittal sections of neonate mouse. IR-MALDESI response to a wide range of lipid classes has been demonstrated, with enhanced lipid coverage achieved by varying the laser wavelength used for mass spectrometry imaging (MSI). Targeted MS2 imaging (MS2I) of an analyte, cocaine, deposited beneath whole-body sections allowed determination of tissue-specific ion response factors, and CID fragments of cocaine were monitored to comment on wavelength-dependent internal energy deposition based on the "survival yield" method.

  1. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented.
Such workflows will be distributed across multiple sites - according to the

  2. Pure-rotational spectrometry: a vintage analytical method applied to modern breath analysis.

    PubMed

    Hrubesh, Lawrence W; Droege, Michael W

    2013-09-01

    Pure-rotational spectrometry (PRS) is an established method, typically used to study structures and properties of polar gas-phase molecules, including isotopic and isomeric varieties. PRS has also been used as an analytical tool where it is particularly well suited for detecting or monitoring low-molecular-weight species that are found in exhaled breath. PRS is principally notable for its ultra-high spectral resolution which leads to exceptional specificity to identify molecular compounds in complex mixtures. Recent developments using carbon aerogel for pre-concentrating polar molecules from air samples have extended the sensitivity of PRS into the part-per-billion range. In this paper we describe the principles of PRS and show how it may be configured in several different modes for breath analysis. We discuss the pre-concentration concept and demonstrate its use with the PRS analyzer for alcohols and ammonia sampled directly from the breath.
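
    The specificity of PRS comes from the rotational constant B, which is fixed by the molecular geometry and isotopic masses. A hedged rigid-rotor sketch follows; real spectra also include centrifugal distortion and, for non-linear molecules, additional constants. B below is the approximate literature value for ¹²C¹⁶O.

```python
# Rigid-rotor model of a linear molecule: pure-rotational lines fall at
# nu = 2B(J+1), so the set of line positions fingerprints the species
# (and even its isotopologues, since B depends on the reduced mass).
B_CO = 57.636  # GHz, approximate rotational constant of 12C16O

def line_positions_ghz(B, j_max):
    """Frequencies of the J -> J+1 transitions for J = 0..j_max."""
    return [2 * B * (j + 1) for j in range(j_max + 1)]

lines = line_positions_ghz(B_CO, 2)
# the J=0->1 line lies near 115.27 GHz; successive lines are spaced by 2B
```

Because every polar species has its own B, an ultra-high-resolution sweep resolves each compound's comb of lines even in a complex breath mixture.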

  3. Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010

    USGS Publications Warehouse

    Martin, Jeffrey D.; Eberle, Michael

    2011-01-01

    Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
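
    Once a recovery model supplies the expected recovery for a sample's matrix and date, the adjustment described above reduces to a one-line correction. The values below are hypothetical.

```python
def adjust_to_full_recovery(measured_conc, modeled_recovery_pct):
    """Adjust a measured pesticide concentration to 100 % recovery,
    using the recovery modeled (e.g., by a lowess smooth of spiked QC
    samples) for the sample's matrix and analysis date."""
    return measured_conc * 100.0 / modeled_recovery_pct

# a sample measured at 0.85 ug/L in a period when the method
# recovered only 85 % of spiked analyte adjusts to about 1.0 ug/L
adjusted = adjust_to_full_recovery(0.85, 85.0)
```

Applying the same correction across years removes the apparent concentration trend that a drifting method bias would otherwise impose on the time series.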

  4. [Analysis on willingness to receive human papillomavirus vaccination among risk males and related factors].

    PubMed

    Meng, Xiaojun; Jia, Tianjian; Zhang, Xuan; Zhu, Chen; Chen, Xin; Zou, Huachun

    2015-10-01

    To understand the willingness to receive human papillomavirus (HPV) vaccination among men who have sex with men (MSM) and male clients of sexually transmitted disease (STD) clinics in China, and related factors, MSM were enrolled from the community through snowball sampling and male clients of STD clinics were enrolled from a sexual health clinic through convenience sampling in Wuxi, China. A questionnaire survey on the subjects' socio-demographic characteristics and their awareness of HPV was conducted. A total of 186 MSM and 182 STD clinic clients were recruited. The awareness rates of HPV were 18.4% and 23.1%, respectively, and the awareness rates of HPV vaccination were 10.2% and 15.4%, respectively. STD clinic clients (70.9%) were more willing to receive HPV vaccination than MSM (34.9%) (χ² = 47.651, P<0.01). Only 26.2% of MSM and 20.2% of STD clinic clients were willing to receive free HPV vaccination before the age of 20 years. Multivariate logistic regression analysis showed that MSM who had passive anal sex (OR=2.831, 95% CI: 1.703-13.526), MSM who never used condoms in anal sex in the past 6 months (OR=3.435, 95% CI: 1.416-20.108), MSM who had been diagnosed with STDs (OR=1.968, 95% CI: 1.201-8.312), STD clinic clients who had commercial sex with females in the past 3 months (OR=1.748, 95% CI: 1.207-8.539), STD clinic clients who never used condoms in commercial sex in the past 3 months (OR=1.926, 95% CI: 1.343-5.819) and STD clinic clients who had been diagnosed with STDs in the past 12 months (OR=2.017, 95% CI: 1.671-7.264) were more likely to accept free HPV vaccination. Sexually active MSM and male clients of STD clinics in China had low awareness of HPV-related knowledge. Their willingness to receive HPV vaccination was influenced by behavioral factors. It is necessary to strengthen health education about HPV and improve people's awareness of HPV vaccination.
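
    The reported odds ratios and confidence intervals relate to the underlying logistic-regression coefficients as sketched below. The coefficient and standard error here are illustrative, chosen only to reproduce roughly the first reported OR of 2.831; they are not the study's fitted values.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and approximate 95 % confidence interval from a
    logistic-regression coefficient and its standard error:
    OR = exp(beta), CI = exp(beta +/- z * se)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

orr, (lo, hi) = odds_ratio_ci(1.0408, 0.4)   # beta ~ ln(2.831), se assumed
```

A confidence interval that excludes 1 (as all those reported do) corresponds to a coefficient significantly different from zero.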

  5. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  6. Real-time GPS seismology using a single receiver: method comparison, error analysis and precision validation

    NASA Astrophysics Data System (ADS)

    Li, Xingxing

    2014-05-01

    Earthquake monitoring and early warning systems for hazard assessment and mitigation have traditionally been based on seismic instruments. However, for large seismic events, it is difficult for traditional seismic instruments to produce accurate and reliable displacements because of the saturation of broadband seismometers and the problematic integration of strong-motion data. Compared with traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making it particularly valuable in case of large earthquakes and tsunamis. The GPS relative positioning approach is usually adopted to estimate seismic displacements, since centimeter-level accuracy can be achieved in real time by processing double-differenced carrier-phase observables. However, the relative positioning method requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Meanwhile, the relative/network approach is time-consuming, making the simultaneous, real-time analysis of GPS data from hundreds or thousands of ground stations particularly difficult. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference station problem of the relative positioning approach, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP), which relies on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs; displacements are then obtained by a single integration of the delta positions. This approach does not suffer from a convergence process, but the single integration from delta positions to
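
    The single integration of delta positions can be sketched as follows. The numbers are synthetic; a real implementation estimates the per-epoch deltas from the carrier-phase observations themselves.

```python
import numpy as np

dt = 1.0                                        # s, epoch interval
t = np.arange(0, 60, dt)
true_disp = 0.05 * np.sin(2 * np.pi * t / 20)   # m, synthetic ground motion
deltas = np.diff(true_disp, prepend=0.0)        # per-epoch delta positions

# variometric reconstruction: one cumulative sum recovers displacement
disp = np.cumsum(deltas)

# a small constant bias in the deltas integrates into a linear drift,
# the known weakness of the single-integration step noted above
drift_after_60s = np.cumsum(deltas + 1e-4)[-1] - true_disp[-1]  # ~6 mm
```

In the noise-free case the cumulative sum reproduces the displacement exactly; with a tiny per-epoch bias, the reconstructed series drifts linearly, which is why variometric displacements are typically detrended over short windows.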

  7. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for the development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  8. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  9. Visual Analytics for MOOC Data.

    PubMed

    Qu, Huamin; Chen, Qing

    2015-01-01

    With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.

  10. Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.

    PubMed

    Valcárcel, Miguel

    2017-11-07

    This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.

  11. Analytical methods for determination of mycotoxins: a review.

    PubMed

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods used, of which many are lab-based, but to our knowledge there seems to be no single technique that stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), and solid phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE); and (c) others such as ELISA. Current trends, advantages and disadvantages, and future prospects of these methods are also discussed.

  12. Sensitive glow discharge ion source for aerosol and gas analysis

    DOEpatents

    Reilly, Peter T. A. [Knoxville, TN

    2007-08-14

    A high sensitivity glow discharge ion source system for analyzing particles includes an aerodynamic lens having a plurality of constrictions for receiving an aerosol including at least one analyte particle in a carrier gas and focusing the analyte particles into a collimated particle beam. A separator separates the carrier gas from the analyte particle beam, wherein the analyte particle beam or vapors derived from the analyte particle beam are selectively transmitted out of the separator. A glow discharge ionization source includes a discharge chamber having an entrance orifice for receiving the analyte particle beam or analyte vapors, and a target electrode and discharge electrode therein. An electric field applied between the target electrode and discharge electrode generates an analyte ion stream from the analyte vapors, which is directed out of the discharge chamber through an exit orifice, such as to a mass spectrometer. High analyte sensitivity is obtained by pumping the discharge chamber exclusively through the exit orifice and the entrance orifice.

  13. Signal-processing theory for the TurboRogue receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1995-01-01

    Signal-processing theory for the TurboRogue receiver is presented. The signal form is traced from its formation at the GPS satellite, to the receiver antenna, and then through the various stages of the receiver, including extraction of phase and delay. The analysis treats the effects of ionosphere, troposphere, signal quantization, receiver components, and system noise, covering processing in both the 'code mode' when the P code is not encrypted and in the 'P-codeless mode' when the P code is encrypted. As a possible future improvement to the current analog front end, an example of a highly digital front end is analyzed.

  14. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
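    The reorganization step the benchmark targets can be illustrated with a minimal, hypothetical map/reduce sketch: the long time series is sharded into contiguous time ranges so each node can compute a partial result independently, and the partials are then combined. Plain Python lists stand in here for a cloud filesystem or database such as Cassandra or SciDB; all names are illustrative.

    ```python
    # Hypothetical sketch: shard a long time series by time range so each
    # "node" (here, a chunk) computes a partial result in parallel, then
    # combine the partials -- the fine-grained parallelism the text describes.

    def shard_by_time(series, n_shards):
        """Split [(t, value), ...] into n_shards contiguous time ranges."""
        chunk = -(-len(series) // n_shards)  # ceiling division
        return [series[i:i + chunk] for i in range(0, len(series), chunk)]

    def partial_stats(shard):
        """Per-shard partial sum and count (map step, one per node)."""
        return sum(v for _, v in shard), len(shard)

    def global_mean(series, n_shards=4):
        """Combine per-shard partials into a global mean (reduce step)."""
        partials = [partial_stats(s) for s in shard_by_time(series, n_shards)]
        total = sum(p[0] for p in partials)
        count = sum(p[1] for p in partials)
        return total / count

    series = [(t, float(t % 5)) for t in range(20)]  # toy observation series
    print(global_mean(series))  # identical to a single-node mean over the series
    ```

    The point of the sharding is that each `partial_stats` call touches only its own time range, which is what makes long-time-series analysis tractable once the one-time-step-per-file layout has been reorganized.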

  15. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  16. Cost differential by site of service for cancer patients receiving chemotherapy.

    PubMed

    Hayes, Jad; Hoverman, Russell J; Brow, Matthew E; Dilbeck, Dana C; Verrilli, Diana K; Garey, Jody; Espirito, Janet L; Cardona, Jorge; Beveridge, Roy

    2015-03-01

    To compare the costs of: 1) chemotherapy treatment across clinical, demographic, and geographic variables; and 2) various cancer care-related cost categories between patients receiving chemotherapy in a community oncology versus a hospital outpatient setting. Data from the calendar years 2008 to 2010 from the Truven Health Analytics MarketScan Commercial Claims and Encounters Database were analyzed. During 2010, the data set contained approximately 45 million unique commercially insured patients with 70,984 cancer patients receiving chemotherapy. These patients were assigned to cohorts depending on whether they received chemotherapy at a community oncology or hospital outpatient setting. Cost data for 9 common cancer types were extracted from the database and analyzed on a per member per month basis to normalize costs; costs included amounts paid by the payer and patient payment. Community oncology and hospital outpatient setting chemotherapy treatment costs were categorized and examined according to cancer diagnosis, patient demographics, and geographic location. Patients receiving chemotherapy treatment in the community oncology clinic had a 20% to 39% lower mean per member per month cost of care, depending on diagnosis, compared with those receiving chemotherapy in the hospital outpatient setting. This cost differential was consistent across cancer type, geographic location, patient age, and number of chemotherapy sessions. Various cost categories examined were also higher for those treated in the hospital outpatient setting. The cost of care for patients receiving chemotherapy was consistently lower in the community oncology clinic compared with the hospital outpatient setting, controlling for the clinical, demographic, and geographic variables analyzed.
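    The study's normalization can be made concrete with a small sketch of the per-member-per-month (PMPM) calculation it uses to compare sites of service. The figures below are illustrative only, not values from the MarketScan analysis.

    ```python
    # Hypothetical sketch of per-member-per-month (PMPM) cost normalization.
    # Costs include amounts paid by the payer plus patient payments, divided
    # by member-months of enrollment. All numbers are illustrative.

    def pmpm(total_paid, member_months):
        """Total allowed amount (payer + patient) per member per month."""
        return total_paid / member_months

    # Two illustrative cohorts with equal enrollment time:
    community = pmpm(total_paid=1_200_000, member_months=600)
    hospital = pmpm(total_paid=1_500_000, member_months=600)

    print(f"community PMPM={community:.0f}, hospital PMPM={hospital:.0f}")
    print(f"community is {(hospital - community) / hospital:.0%} lower")
    ```

    With these illustrative inputs the community setting comes out 20% lower, the bottom of the 20% to 39% range the abstract reports across diagnoses.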

  17. Analytical applications of aptamers

    NASA Astrophysics Data System (ADS)

    Tombelli, S.; Minunni, M.; Mascini, M.

    2007-05-01

    Aptamers are single-stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed as alternatives to antibodies as biorecognition elements in analytical devices with ever increasing frequency, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.

  18. Estimation and analysis of the short-term variations of multi-GNSS receiver differential code biases using global ionosphere maps

    NASA Astrophysics Data System (ADS)

    Li, Min; Yuan, Yunbin; Wang, Ningbo; Liu, Teng; Chen, Yongchang

    2017-12-01

    Care should be taken to minimize the adverse impact of differential code biases (DCBs) on global navigation satellite system (GNSS)-derived ionospheric information. For the sake of convenience, satellite and receiver DCB products provided by the International GNSS Service (IGS) are treated as constants over a period of 24 h (Li et al. (2014)). However, if DCB estimates show remarkable intra-day variability, DCBs estimated as constants over a 1-day period will partially account for ionospheric modeling error; in this case DCBs will need to be estimated over shorter time periods. Therefore, it is important to gain further insight into the short-term variation characteristics of receiver DCBs. In this contribution, the IGS combined global ionospheric maps and the German Aerospace Center (DLR)-provided satellite DCBs are used in the improved method to determine multi-GNSS receiver DCBs with an hourly time resolution. The intra-day stability of the receiver DCBs is thereby analyzed in detail. Based on 1 month of data collected within the multi-GNSS experiment of the IGS, good agreement is found between the resulting receiver DCB estimates and the multi-GNSS DCB products from the DLR, at a level of 0.24 ns for GPS, 0.28 ns for GLONASS, 0.28 ns for BDS, and 0.30 ns for Galileo. Although most of the receiver DCBs are relatively stable over a 1-day period, large fluctuations (more than 9 ns between two consecutive hours) within the receiver DCBs can be found. We also demonstrate the impact of significant short-term variations in receiver DCBs on the extraction of ionospheric total electron content (TEC), at a level of 12.96 TECu (TEC unit). Compared to daily receiver DCB estimates, the hourly DCB estimates obtained in this study reflect the short-term variations of the DCBs in more detail. The main conclusion is that preliminary analysis of characteristics of receiver DCB variations over short

  19. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess the vulnerability to coastal risks such as coastal erosion or submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping considers multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The result shows that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach relies on the efficiency of Geographic Information System tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
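    The criteria-weighting step at the core of FAHP can be sketched with the crisp AHP weight derivation it builds on: priorities are extracted from a reciprocal pairwise comparison matrix. The 3x3 matrix below is illustrative, not taken from the Mohammedia study, and the row geometric-mean approximation stands in for the full fuzzy extent analysis.

    ```python
    # Hedged sketch of the (crisp) AHP weight step underlying FAHP:
    # derive criterion weights from a reciprocal pairwise comparison
    # matrix via the row geometric-mean approximation. Matrix values
    # are illustrative judgments on Saaty's 1-9 scale.

    import math

    def ahp_weights(pairwise):
        """Approximate priority weights: normalized row geometric means."""
        gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
        total = sum(gm)
        return [g / total for g in gm]

    # Hypothetical judgments for three causative factors, e.g.
    # coastal erosion vs. sea level rise vs. distance to urban area.
    pairwise = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 3.0],
        [1 / 5, 1 / 3, 1.0],
    ]
    w = ahp_weights(pairwise)
    print([round(x, 3) for x in w])  # weights sum to 1; first criterion dominates
    ```

    In the GIS step, each raster layer would then be multiplied by its weight and the layers summed into the vulnerability index.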

  20. Analytical relationships for prediction of the mechanical properties of additively manufactured porous biomaterials.

    PubMed

    Zadpoor, Amir Abbas; Hedayati, Reza

    2016-12-01

    Recent developments in additive manufacturing techniques have motivated an increasing number of researchers to study regular porous biomaterials that are based on repeating unit cells. The physical and mechanical properties of such porous biomaterials have therefore received increasing attention during recent years. One of the areas that have revived is analytical study of the mechanical behavior of regular porous biomaterials with the aim of deriving analytical relationships that could predict the relative density and mechanical properties of porous biomaterials, given the design and dimensions of their repeating unit cells. In this article, we review the analytical relationships that have been presented in the literature for predicting the relative density, elastic modulus, Poisson's ratio, yield stress, and buckling limit of regular porous structures based on various types of unit cells. The reviewed analytical relationships are used to compare the mechanical properties of porous biomaterials based on different types of unit cells. The major areas where the analytical relationships have improved during the recent years are discussed and suggestions are made for future research directions. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 3164-3174, 2016. © 2016 The Authors Journal of Biomedical Materials Research Part A Published by Wiley Periodicals, Inc.
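    Many of the reviewed analytical relationships reduce to Gibson-Ashby-type power laws in relative density. The sketch below uses the classic open-cell (bending-dominated) estimates purely for illustration; the prefactors and exponents differ for each unit cell type, which is exactly what the reviewed relationships capture.

    ```python
    # Hedged sketch of Gibson-Ashby-type scaling laws for regular porous
    # structures. C_E, C_s and the exponents n, m depend on the unit cell;
    # the defaults here are the textbook open-cell foam estimates, used
    # only as an illustration of the functional form.

    def relative_modulus(rel_density, C_E=1.0, n=2.0):
        """E/E_s ~ C_E * (rho/rho_s)^n for bending-dominated cells."""
        return C_E * rel_density ** n

    def relative_yield(rel_density, C_s=0.3, m=1.5):
        """sigma_y/sigma_ys ~ C_s * (rho/rho_s)^m."""
        return C_s * rel_density ** m

    rho = 0.2  # 20% relative density, a plausible value for porous implants
    print(f"E/E_s  ~ {relative_modulus(rho):.3f}")
    print(f"sy/sys ~ {relative_yield(rho):.4f}")
    ```

    The strong (roughly quadratic) drop of stiffness with density is why unit-cell choice and dimensions dominate the mechanical design space of these biomaterials.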

  1. Utilization of hypofractionated whole-breast radiation therapy in patients receiving chemotherapy: a National Cancer Database analysis.

    PubMed

    Diwanji, Tejan P; Molitoris, Jason K; Chhabra, Arpit M; Snider, James W; Bentzen, Soren M; Tkaczuk, Katherine H; Rosenblatt, Paula Y; Kesmodel, Susan B; Bellavance, Emily C; Cohen, Randi J; Cheston, Sally B; Nichols, Elizabeth M; Feigenberg, Steven J

    2017-09-01

    Results from four major hypofractionated whole-breast radiotherapy (HF-WBRT) trials have demonstrated equivalence in select patients with early-stage breast cancer when compared with conventionally fractionated WBRT (CF-WBRT). Because relatively little data were available on patients receiving neoadjuvant or adjuvant chemotherapy, consensus guidelines published in 2011 did not endorse the use of HF-WBRT in this population. Our goal is to evaluate trends in utilization of HF-WBRT in patients receiving chemotherapy. We retrospectively analyzed data from 2004 to 2013 in the National Cancer DataBase on breast cancer patients treated with HF-WBRT who met the clinical criteria proposed by consensus guidelines (i.e., age >0 years, T1-2N0, and breast-conserving surgery), regardless of receipt of chemotherapy. We employed logistic regression to delineate and compare clinical and demographic factors associated with utilization of HF-WBRT and CF-WBRT. A total of 56,836 women were treated with chemotherapy and WBRT (without regional nodal irradiation) from 2004 to 2013; 9.0% (n = 5093) were treated with HF-WBRT. Utilization of HF-WBRT increased from 4.6% in 2004 to 18.2% in 2013 (odds ratio [OR] 1.21/year; P < 0.001). Among patients receiving chemotherapy, factors most dramatically associated with increased odds of receiving HF-WBRT on multivariate analysis were academic facilities (OR 2.07; P < 0.001), age >80 (OR 2.58; P < 0.001), west region (OR 1.91; P < 0.001), and distance >50 miles from cancer reporting facility (OR 1.43; P < 0.001). Factors associated with decreased odds of receiving HF-WBRT included white race, income <$48,000, lack of private insurance, T2 versus T1, and higher grade (all P < 0.02). Despite the absence of consensus guideline recommendations, the use of HF-WBRT in patients receiving chemotherapy has increased fourfold (absolute = 13.6%) over the last decade. Increased utilization of HF-WBRT should result in institutional reports

  2. Service line analytics in the new era.

    PubMed

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  3. Big Data Analytics in Medicine and Healthcare.

    PubMed

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics of value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Finally, some directions on the use of suitable and promising open-source distributed data processing software platforms are given.

  4. Analytical and numerical analysis of frictional damage in quasi brittle materials

    NASA Astrophysics Data System (ADS)

    Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.

    2016-07-01

    Frictional sliding and crack growth are two main dissipation processes in quasi brittle materials. The frictional sliding along closed cracks is the origin of macroscopic plastic deformation while the crack growth induces a material damage. The main difficulty of modeling is to consider the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed. But there are so far no analytical solutions even for simple loading paths for the validation of such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are exploited. The first one involves a coupled friction/damage correction scheme, which is consistent with the coupling nature of the constitutive model. The second one contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupling correction scheme is efficient to guarantee a systematic numerical convergence.

  5. Analytical Analysis and Case Study of Transient Behavior of Inrush Current in Power Transformer for Designing of Efficient Circuit Breakers

    NASA Astrophysics Data System (ADS)

    Harmanpreet, Singh, Sukhwinder; Kumar, Ashok; Kaur, Parneet

    2010-11-01

    Stability and security are key aspects of electrical power systems, and transformer protection is a major concern in system operation. Many mal-trip cases in transformer protection are caused by inrush current problems. The phenomenon of transformer inrush current has been discussed in many papers since 1958. In this paper, an analytical analysis of the inrush current in a transformer switched onto dc and ac supplies has been carried out. This analysis will help in the design of circuit breakers for better performance.
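    The standard single-phase energization analysis behind the dc/ac switching cases discussed above is worth stating (this is the textbook derivation, not reproduced verbatim from the paper): integrating the applied voltage from the switching instant gives the core flux, and the worst case drives the core deep into saturation.

    ```latex
    % Flux build-up on energization at switching angle \theta_0 with
    % residual flux \phi_r (textbook single-phase result):
    \phi(t) = \phi_r + \phi_m\cos\theta_0 - \phi_m\cos(\omega t + \theta_0)
    % Worst case (\theta_0 = 0, i.e. switching at voltage zero, with
    % additive residual flux), the first peak approaches
    \phi_{\max} \approx 2\phi_m + \phi_r
    ```

    Since the magnetizing current grows steeply once the flux exceeds the saturation knee, this near-doubling of peak flux is what produces inrush currents far above rated current, the condition the circuit-breaker design must ride through without tripping.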

  6. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    PubMed

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  7. Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes

    USGS Publications Warehouse

    Nord, G.L.

    1982-01-01

    Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of x-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ±1.5 mol% endmember. © 1982.
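    The peak-ratio technique described is the Cliff-Lorimer method, in which concentration ratios follow intensity ratios through a calibrated sensitivity factor, valid only under the thin-foil criterion discussed above. The k-factor and count values below are illustrative, not the USGS calibration values.

    ```python
    # Sketch of Cliff-Lorimer peak-ratio quantification for a two-element
    # thin-foil analysis: C_A/C_B = k_AB * (I_A/I_B), with the weight
    # fractions normalized to sum to 1. The k-factor and the intensities
    # below are illustrative values, not instrument calibrations.

    def cliff_lorimer(I_A, I_B, k_AB):
        """Return the weight-fraction pair (C_A, C_B), C_A + C_B = 1."""
        ratio = k_AB * I_A / I_B      # C_A / C_B
        C_B = 1.0 / (1.0 + ratio)
        return 1.0 - C_B, C_B

    # Illustrative: Mg and Si K-alpha counts from a thin exsolution lamella.
    C_Mg, C_Si = cliff_lorimer(I_A=4200, I_B=9800, k_AB=1.1)
    print(f"C_Mg={C_Mg:.3f}, C_Si={C_Si:.3f}")  # the two fractions sum to 1
    ```

    The "maximum thickness" contours in the abstract mark where this simple ratio breaks down and absorption corrections would be needed.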

  8. Dioxins, Furans, PCBs, and Congeners Analytical Service within the Superfund Contract Laboratory Program

    EPA Pesticide Factsheets

    This page contains information about the DLM02.2 analytical service for the analysis of dioxins and furans at hazardous waste sites. The SOW contains the analytical method and contractual requirements for laboratories.

  9. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
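    Of the three analytic scopes named above, the rolling-window case is the one that forces incremental state into the architecture. A minimal, hypothetical sketch of such an analytic (names are illustrative; a real component would sit behind the paper's pluggable analytics interface):

    ```python
    # Minimal sketch of a rolling-window streaming analytic: an incremental
    # event counter that evicts events older than the window as new ones
    # arrive. Class and parameter names are hypothetical.

    from collections import deque

    class RollingWindowCount:
        def __init__(self, window_seconds):
            self.window = window_seconds
            self.events = deque()  # event timestamps, oldest first

        def add(self, ts):
            """Ingest one event, then evict everything outside the window."""
            self.events.append(ts)
            while self.events and self.events[0] <= ts - self.window:
                self.events.popleft()

        def count(self):
            return len(self.events)

    w = RollingWindowCount(window_seconds=60)
    for ts in [0, 10, 30, 65, 70]:  # monotone stream of event times
        w.add(ts)
    print(w.count())  # only the events within the last 60 s remain
    ```

    Incremental analytics like this keep per-update cost constant regardless of stream volume, which is what lets the web tier push fresh results to coordinated visualizations at stream rate.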

  10. CALUTRON RECEIVER

    DOEpatents

    York, H.F.

    1959-07-01

    A receiver construction is presented for calutrons having two or more ion sources and an individual receiver unit for each source. Design requirements dictate that the face plate defining the receiver entrance slots be placed at an angle to the approaching beam, which means that ions striking the face plate are likely to be scattered into the entrance slots of other receivers. According to the present invention, the face plate has a surface provided with parallel ridges so disposed that only one side of each ridge is exposed directly to the ion beam. The scattered ions are directed away from adjacent receivers by the ridges on the face plate.

  11. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  12. Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling

    PubMed Central

    Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola

    2017-01-01

    Objective To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. Methods A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. Results In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. Conclusion In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. PMID:28601866

  13. Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling.

    PubMed

    Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola

    2017-01-01

    To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. © 2017 The Author(s) Published by S. Karger GmbH, Freiburg.
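    The headline figure follows directly from the incremental cost-effectiveness ratio (ICER) definition; a quick sanity check on the abstract's rounded 10-year numbers (the paper's EUR 2,412/QALY comes from unrounded model outputs, so the rounded inputs land only nearby):

    ```python
    # ICER sanity check using the abstract's rounded 10-year figures:
    # incremental cost EUR 2,661 and incremental effectiveness 1.1 QALYs.

    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return delta_cost / delta_qaly

    print(round(icer(2661, 1.1)))  # ~EUR 2,419/QALY from the rounded inputs
    ```

    "Dominant" over a lifetime means the ICER is not even needed there: surgery is both cheaper (EUR 8,649 saved) and more effective (3.2 QALYs gained) than conservative management.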

  14. Error analysis of analytic solutions for self-excited near-symmetric rigid bodies - A numerical study

    NASA Technical Reports Server (NTRS)

    Kia, T.; Longuski, J. M.

    1984-01-01

    Analytic error bounds are presented for the solutions of approximate models for self-excited near-symmetric rigid bodies. The error bounds are developed for analytic solutions to Euler's equations of motion. The results are applied to obtain a simplified analytic solution for Eulerian rates and angles. The results of a sample application of the range and error bound expressions for the case of the Galileo spacecraft experiencing transverse torques demonstrate the use of the bounds in analyses of rigid body spin change maneuvers.
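The exact dynamics that such analytic solutions approximate are Euler's equations for a rigid body under body-fixed torques. A minimal numerical sketch, with illustrative inertia and torque values (not the Galileo parameters), integrating a near-symmetric body under a constant transverse torque:

```python
# Euler's equations for a rigid body in body axes:
#   Ix*wx' = Mx - (Iz - Iy)*wy*wz, and cyclic permutations.
# Integrated with fixed-step RK4; all numbers are illustrative assumptions.

def euler_rates(w, I, M):
    wx, wy, wz = w
    Ix, Iy, Iz = I
    Mx, My, Mz = M
    return ((Mx - (Iz - Iy) * wy * wz) / Ix,
            (My - (Ix - Iz) * wz * wx) / Iy,
            (Mz - (Iy - Ix) * wx * wy) / Iz)

def rk4_step(w, I, M, dt):
    def add(a, b, s):
        return tuple(ai + s * bi for ai, bi in zip(a, b))
    k1 = euler_rates(w, I, M)
    k2 = euler_rates(add(w, k1, dt / 2), I, M)
    k3 = euler_rates(add(w, k2, dt / 2), I, M)
    k4 = euler_rates(add(w, k3, dt), I, M)
    return tuple(wi + dt / 6 * (a + 2 * b + 2 * c + d)
                 for wi, a, b, c, d in zip(w, k1, k2, k3, k4))

I = (100.0, 102.0, 150.0)   # near-symmetric: Ix close to Iy (kg m^2)
M = (0.5, 0.0, 0.0)         # constant transverse ("self-excited") torque, N m
w = (0.0, 0.0, 1.0)         # initial spin about the symmetry axis, rad/s
for _ in range(1000):       # 10 s of motion at dt = 0.01 s
    w = rk4_step(w, I, M, 0.01)
print(w)  # small transverse rates appear; spin rate stays near 1 rad/s
```

Comparing such a reference integration against the closed-form approximate solution is the basic mechanism behind the numerical error-bound study described above.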

  15. A response surface analysis of expected and received support for smoking cessation: Expectancy violations predict greater relapse.

    PubMed

    Derrick, Jaye L; Britton, Maggie; Baker, Zachary G; Haddad, Sana

    2018-08-01

People attempting to stop smoking cigarettes (quitters) hold expectations about the extent to which their partner will provide helpful support during a quit attempt. However, these expectations may not align with their perceptions of the helpfulness of the support they receive. We examine expected and received helpful support during a quit attempt. We hypothesized that receiving less helpful support than expected (i.e., creating an expectancy violation) would be associated with the greatest return to smoking. Sixty-two quitters completed a 21-day ecological momentary assessment (EMA) study. They reported expected support at baseline and support receipt and smoking during the EMA phase. At follow-up, they completed an expelled-breath carbon monoxide test. Analyses using polynomial generalized linear models with response surface analysis indicated that smoking outcomes depended on the joint influence of expected and received helpful support. As hypothesized, when quitters expected more helpful support than they received, they were more likely to smoke in the first 24 h and the last seven days of the EMA, and they provided higher carbon monoxide readings at follow-up. These results are consistent with an expectancy violation explanation: quitters are more likely to smoke when they perceive their partner has failed to provide support that is as helpful as expected. Given the importance of support for smoking cessation, many researchers have attempted to experimentally increase provision of support. The current findings suggest that partner support interventions might backfire if the quitter is led to expect more helpful support than the partner is able to provide. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Analytical Strategies to Disclose Repeated Consumption of New Psychoactive Substances by Hair Analysis.

    PubMed

    Rotolo, Maria C; Klein, Julia; Pacifici, Roberta; Busardo, Francesco Paolo; Pichini, Simona; Marchei, Emilia

    2017-01-01

New psychoactive substances (NPS) are a heterogeneous group of substances with different chemical structures and psychotropic effects. Many pharmacotoxicological laboratories performing drug testing in conventional and nonconventional biological matrices for clinical and forensic purposes do not include screening procedures for NPS in their routine protocols. This is mainly due to the continued entry into the market of newly synthesized products, the low availability of reference standards, in particular of their metabolites, the low availability of immunochemical kits, etc. Moreover, many of the new compounds are very potent, and the low doses ingested lead to low concentrations in biological matrices, especially in hair. Hair analysis has become a powerful tool for detecting chronic drug use and a routine technique in forensic toxicology laboratories. The aim of this study was to set up analytical strategies to identify repeated consumption of NPS by hair analysis. Although UHPLC-MS/MS may represent the elective technique for studying NPS, a combination of both GC-MS and UHPLC-MS/MS techniques is useful in creating a complete toxicological picture. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  17. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  18. Analytical modeling and experimental validation of a magnetorheological mount

    NASA Astrophysics Data System (ADS)

    Nguyen, The; Ciocanel, Constantin; Elahinia, Mohammad

    2009-03-01

Magnetorheological (MR) fluid has been increasingly researched and applied in vibration isolation devices. To date, the suspension systems of several high-performance vehicles have been equipped with MR fluid based dampers, and research is ongoing to develop MR fluid based mounts for engine and powertrain isolation. MR fluid based devices have received attention due to the MR fluid's capability to change its properties in the presence of a magnetic field. This characteristic places MR mounts in the class of semiactive isolators, making them a desirable substitute for passive hydraulic mounts. In this research, an analytical model of a mixed-mode MR mount was constructed. The magnetorheological mount employs flow (valve) mode and squeeze mode. Each mode is powered by an independent electromagnet, so one mode does not affect the operation of the other. The analytical model was used to predict the performance of the MR mount with different sets of parameters. Furthermore, in order to produce the actual prototype, the analytical model was used to identify the optimal geometry of the mount. The experimental phase of this research was carried out by fabricating and testing the actual MR mount. The manufactured mount was tested to evaluate the effectiveness of each mode individually and in combination. The experimental results were also used to validate the ability of the analytical model to predict the response of the MR mount. Based on the observed response of the mount, a suitable controller can be designed for it. However, the control scheme is not addressed in this study.

  19. Quality Measures in Pre-Analytical Phase of Tissue Processing: Understanding Its Value in Histopathology.

    PubMed

    Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran

    2016-01-01

Quality monitoring in a histopathology unit is categorized into three phases, pre-analytical, analytical and post-analytical, to cover the various steps in the entire test cycle. Review of the literature on quality evaluation studies pertaining to histopathology revealed that earlier reports were mainly focused on analytical aspects, with limited studies on assessment of the pre-analytical phase. The pre-analytical phase encompasses several processing steps and handling of the specimen/sample by multiple individuals, thus allowing enough scope for errors. Due to its critical nature and the limited studies in the past to assess quality in the pre-analytical phase, it deserves more attention. This study was undertaken to analyse and assess the quality parameters in the pre-analytical phase in a histopathology laboratory. This was a retrospective study of pre-analytical parameters in the histopathology laboratory of a tertiary care centre on 18,626 tissue specimens received over 34 months. Registers and records were checked for efficiency and errors for the pre-analytical quality variables: specimen identification, specimen in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in an inter-laboratory quality assessment program (External Quality Assurance Program, EQAS) and evaluation of internal non-conformities (NC) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% of cases in 2007, 2008 and 2009 respectively. About 0.04%, 0.07% and 0.18% of specimens were not sent in fixatives in 2007, 2008 and 2009 respectively. There was no incidence of lost specimens. A total of 113 non-conformities were identified, of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from the normal standard which may generate an error and compromise quality standards) identified was wrong labelling of slides. Performance in EQAS for the pre-analytical phase was satisfactory in 6 of 9 cycles.
A low incidence

  20. Dynamic Graph Analytic Framework (DYGRAF): greater situation awareness through layered multi-modal network analysis

    NASA Astrophysics Data System (ADS)

    Margitus, Michael R.; Tagliaferri, William A., Jr.; Sudit, Moises; LaMonica, Peter M.

    2012-06-01

Understanding the structure and dynamics of networks is of vital importance to winning the global war on terror. To fully comprehend the network environment, analysts must be able to investigate the interconnected relationships of many diverse network types simultaneously as they evolve both spatially and temporally. To remove from the analyst the burden of making mental correlations of observations and conclusions from multiple domains, we introduce the Dynamic Graph Analytic Framework (DYGRAF). DYGRAF provides the infrastructure which facilitates a layered multi-modal network analysis (LMMNA) approach, enabling analysts to assemble previously disconnected, yet related, networks in a common battle space picture. In doing so, DYGRAF provides the analyst with timely situation awareness, understanding and anticipation of threats, and support for effective decision-making in diverse environments.
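The layered multi-modal idea described above can be sketched as per-layer edge sets over a shared node space, merged into a single view. A minimal sketch; the layer names and entities are invented placeholders, not DYGRAF's actual data model:

```python
# Each layer (mode) is a set of undirected edges over shared entities.
from collections import defaultdict

layers = {
    "communication": {("A", "B"), ("B", "C")},
    "financial":     {("A", "C")},
    "social":        {("B", "C"), ("C", "D")},
}

def merge_layers(layers):
    """Union the layers, remembering which modes support each edge."""
    merged = defaultdict(set)
    for name, edges in layers.items():
        for u, v in edges:
            merged[frozenset((u, v))].add(name)
    return merged

merged = merge_layers(layers)
# Edges observed in more than one mode may warrant analyst attention.
multi_modal = {tuple(sorted(e)) for e, modes in merged.items() if len(modes) > 1}
print(multi_modal)  # {('B', 'C')}
```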

  1. Application of surface analytical methods in thin film analysis

    NASA Astrophysics Data System (ADS)

    Wen, Xingu

Self-assembly and the sol-gel process are two promising methods for the preparation of novel materials and thin films. In this research, these two methods were utilized to prepare two types of thin films: self-assembled monolayers of peptides on gold and SiO2 sol-gel thin films modified with Ru(II) complexes. The properties of the resulting thin films were investigated by several analytical techniques in order to explore their potential applications in biomaterials, chemical sensors, nonlinear optics and catalysis. Among the analytical techniques employed in the study, surface analytical techniques, such as X-ray photoelectron spectroscopy (XPS) and grazing angle reflection absorption Fourier transform infrared spectroscopy (RA-FTIR), are particularly useful in providing information regarding the compositions and structures of the thin films. In the preparation of peptide thin films, monodisperse peptides were self-assembled on a gold substrate via the N-terminus-coupled lipoic acid. The film compositions were investigated by XPS and agreed well with the theoretical values. XPS results also revealed that the surface coverage of the self-assembled films was significantly larger than that of the physisorbed films and that the chemisorption between the peptides and the gold surface was stable in solvent. Studies by angle-dependent XPS (ADXPS) and grazing angle RA-FTIR indicated that the peptides were on average oriented at a small angle from the surface normal. By using a model orientation distribution function, both the peptide tilt angle and the film thickness could be calculated. Ru(II) complex doped SiO2 sol-gel thin films were prepared by a low temperature sol-gel process. The ability of XPS coupled with Ar+ ion sputtering to provide both chemical and compositional depth profile information for these sol-gel films was evaluated.
This technique, together with UV-VIS and electrochemical measurements, was used to investigate the stability of Ru complexes in the composite

  2. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    PubMed

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, which showed good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The results of the radioactivity measurement with both detectors using the advanced analytical procedure received the "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.

    PubMed

    Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H

    2015-09-01

Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes in the following groups: 51 from all three institutions, and 15 from 1 or 2 institutions, for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97% of analytes. Analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900-1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin, and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase, and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installation, test results have been well received by clinicians at all three institutions. Copyright © 2015. Published by Elsevier Inc.
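The precision criterion used above is the coefficient of variation, CV = SD / mean × 100%, compared against a 10% threshold. A minimal sketch with made-up replicate values (not data from the study):

```python
# Flag analytes whose replicate CV exceeds the 10% acceptance threshold.
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation of replicate results, in percent."""
    return stdev(replicates) / mean(replicates) * 100.0

glucose = [5.1, 5.0, 5.2, 5.1, 5.0]   # hypothetical tight replicates
digoxin = [1.0, 1.3, 0.8, 1.2, 0.7]   # hypothetical wide spread, like the CV>10% analytes

print(cv_percent(glucose) < 10.0)   # True: acceptable precision
print(cv_percent(digoxin) < 10.0)   # False: CV above 10%
```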

  4. Psychological therapy for inpatients receiving acute mental health care: A systematic review and meta-analysis of controlled trials.

    PubMed

    Paterson, Charlotte; Karatzias, Thanos; Dickson, Adele; Harper, Sean; Dougall, Nadine; Hutton, Paul

    2018-04-16

    The effectiveness of psychological therapies for those receiving acute adult mental health inpatient care remains unclear, partly because of the difficulty in conducting randomized controlled trials (RCTs) in this setting. The aim of this meta-analysis was to synthesize evidence from all controlled trials of psychological therapy carried out with this group, to estimate its effects on a number of important outcomes and examine whether the presence of randomization and rater blinding moderated these estimates. A systematic review and meta-analysis of all controlled trials of psychological therapy delivered in acute inpatient settings was conducted, with a focus on psychotic symptoms, readmissions or emotional distress (anxiety and depression). Studies were identified through ASSIA, EMBASE, CINAHL, Cochrane, MEDLINE, and PsycINFO using a combination of the key terms 'inpatient', 'psychological therapy', and 'acute'. No restriction was placed on diagnosis. The moderating effect of the use of assessor-blind RCT methodology was examined via subgroup and sensitivity analyses. Overall, psychological therapy was associated with small-to-moderate improvements in psychotic symptoms at end of therapy but the effect was smaller and not significant at follow-up. Psychological therapy was also associated with reduced readmissions, depression, and anxiety. The use of single-blind randomized controlled trial methodology was associated with significantly reduced benefits on psychotic symptoms and was also associated with reduced benefits on readmission and depression; however, these reductions were not statistically significant. The provision of psychological therapy to acute psychiatric inpatients is associated with improvements; however, the use of single-blind RCT methodology was associated with reduced therapy-attributable improvements. Whether this is a consequence of increased internal validity or reduced external validity is unclear. Trials with both high internal and

  5. Effect of different analyte diffusion/adsorption protocols on SERS signals

    NASA Astrophysics Data System (ADS)

    Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju

    2018-07-01

The effect of different analyte diffusion/adsorption protocols, often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were examined: the highly concentrated dilution (HCD) protocol, the half-half dilution (HHD) protocol and the layered adsorption (LA) protocol; the SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified by polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes will cause different SERS signal enhancements. The HHD protocol produced the most uniform and reproducible samples, and the corresponding signal intensity of the analyte was the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
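The diffusion-equation modelling mentioned above can be sketched with a 1D explicit finite-difference solution of dC/dt = D d²C/dx². All parameters (diffusion coefficient, grid, initial profile) are illustrative assumptions, not values from the paper:

```python
# Explicit finite-difference update for 1D diffusion:
#   C_i(t+dt) = C_i + r*(C_{i+1} - 2*C_i + C_{i-1}),  r = D*dt/dx^2.
# Boundary cells are held fixed (constant-concentration boundaries).

def diffuse(c, D, dx, dt, steps):
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability requires D*dt/dx^2 <= 0.5"
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = new
    return c

# Initial condition: analyte concentrated in the upper half of the cell.
c0 = [1.0] * 10 + [0.0] * 10
c = diffuse(c0, D=1e-9, dx=1e-5, dt=0.04, steps=500)
# The sharp step spreads out toward the bare (substrate) side over time.
```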

  6. Optimization in the design of a 12 gigahertz low cost ground receiving system for broadcast satellites. Volume 1: System design, performance, and cost analysis

    NASA Technical Reports Server (NTRS)

    Ohkubo, K.; Han, C. C.; Albernaz, J.; Janky, J. M.; Lusignan, B. B.

    1972-01-01

The technical and economic feasibility of using the 12 GHz band for broadcasting from satellites was examined. Among the assigned frequency bands for broadcast satellites, the 12 GHz band system offers the most channels. It also has the least interference on and from the terrestrial communication links. The system design and analysis are carried out on the basis of a decision analysis model. Technical difficulties in achieving low-cost 12 GHz ground receivers are solved by making use of die-cast aluminum packaging, a hybrid integrated circuit mixer, a cavity-stabilized Gunn oscillator, and other state-of-the-art microwave technologies for the receiver front-end. A working model was designed and tested, which used frequency modulation. A final design for the 2.6 GHz system ground receiver is also presented. The cost of the ground terminal was analyzed and minimized for a given figure of merit (the ratio of receiving antenna gain to receiver system noise temperature). The results were used to analyze the performance and cost of the whole satellite system.
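The figure of merit used in the cost minimization above is G/T, the receiving antenna gain over the receiver system noise temperature, conventionally quoted in dB/K. A minimal sketch with illustrative numbers (not values from the study):

```python
# G/T in dB/K = antenna gain in dBi minus 10*log10(T_sys / 1 K).
import math

def g_over_t_db(gain_dbi, system_noise_temp_k):
    return gain_dbi - 10.0 * math.log10(system_noise_temp_k)

# e.g. a small 12 GHz dish with ~39 dBi gain (assumed) and a
# 1000 K receiver system noise temperature (assumed):
print(round(g_over_t_db(39.0, 1000.0), 1))  # 9.0 dB/K
```

Minimizing terminal cost at a fixed G/T is then a trade between antenna size (gain) and front-end quality (noise temperature).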

  7. Combining oxytocin administration and positive emotion inductions: Examining social perception and analytical performance.

    PubMed

    Human, Lauren J; Thorson, Katherine R; Woolley, Joshua D; Mendes, Wendy Berry

    2017-04-01

Intranasal administration of the hypothalamic neuropeptide oxytocin (OT) has, in some studies, been associated with positive effects on social perception and cognition. Similarly, positive emotion inductions can improve a range of perceptual and performance-based behaviors. In this exploratory study, we examined how OT administration and positive emotion inductions interact in their associations with social and analytical performance. Participants (N=124) were randomly assigned to receive an intranasal spray of OT (40 IU) or placebo and then viewed one of three videos designed to engender one of the following emotion states: social warmth, pride, or an affectively neutral state. Following the emotion induction, participants completed social perception and analytical tasks. There were no significant main effects of OT condition on social perception tasks, failing to replicate prior research, or on analytical performance. Further, OT condition and positive emotion inductions did not interact with each other in their associations with social perception performance. However, OT condition and positive emotion manipulations did significantly interact in their associations with analytical performance. Specifically, combining positive emotion inductions with OT administration was associated with worse analytical performance, with the pride induction no longer benefiting performance and the warmth induction resulting in worse performance. In sum, we found little evidence for main or interactive effects of OT on social perception but preliminary evidence that OT administration may impair analytical performance when paired with positive emotion inductions. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using either direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.

  9. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  10. CALUTRON RECEIVER

    DOEpatents

    Brunk, W.O.

    1959-09-29

A description is given of an improved calutron receiver having a face plate lying at an angle to the direction of the entering ion beams but having an opening, the plane of which is substantially perpendicular to that of the entering ion beams. By so positioning the opening in the receiver, the effective area through which the desired material may enter the receiver is increased, and at the same time the effective area through which contaminating material may enter the receiver is reduced.

  11. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    USGS Publications Warehouse

    Arbogast, Belinda F.

    1996-01-01

The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668, published in 1990. Newer methods may be used which have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; it can also assist laboratory users in evaluating the data they receive. The analytical methods are written in a step-by-step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use with brief listings of: the element(s)/species determined, the method of determination, the reference to cite, the contact person, a summary of the technique, and the analyte concentration range. For a copy please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  12. Inverse Thermal Analysis of Ti-6Al-4V Friction Stir Welds Using Numerical-Analytical Basis Functions with Pseudo-Advection

    NASA Astrophysics Data System (ADS)

    Lambrakos, S. G.

    2018-04-01

Inverse thermal analysis of Ti-6Al-4V friction stir welds is presented that demonstrates application of a methodology using numerical-analytical basis functions and temperature-field constraint conditions. This analysis provides parametric representation of friction-stir-weld temperature histories that can be adopted as input data to computational procedures for prediction of solid-state phase transformations and mechanical response. These parameterized temperature histories can be used for inverse thermal analysis of friction stir welds having process conditions similar to those considered here. Case studies are presented for inverse thermal analysis of friction stir welds that use three-dimensional constraint conditions on calculated temperature fields, which are associated with experimentally measured transformation boundaries and weld-stir-zone cross sections.

  13. Comparing Perceived Adequacy of Help Received Among Different Classes of Individuals with Severe Mental Disorders at Five-Year Follow-Up: A Longitudinal Cluster Analysis.

    PubMed

    Fleury, Marie-Josée; Grenier, Guy; Bamvita, Jean-Marie

    2017-11-13

This study developed a typology describing change in the perceived adequacy of help received among 204 individuals with severe mental disorders, 5 years after transfer to the community following a major mental health reform in Quebec (Canada). Participant typologies were constructed using a two-step cluster analysis. There were significant differences between T0 and T2 for perceived adequacy of help received and other independent variables, including seriousness of needs, help from services or relatives, and care continuity. Five classes emerged from the analysis. Perceived adequacy of help received at T2 increased for Class 1, mainly comprised of older women with mood disorders. Overall, greater care continuity and higher levels of help from services and relatives were related to higher perceived adequacy of help received. Changes in perceived adequacy of help received resulting from several combinations of associated variables indicate that mental health service delivery should respond to specific profiles and determinants.

  14. Benefits of Software GPS Receivers for Enhanced Signal Processing

    DTIC Science & Technology

    2000-01-01

Published in GPS Solutions 4(1), Summer 2000, pages 56-66. Alison Brown. In this paper the architecture of a software GPS receiver is described, and an analysis is included of the performance of a software GPS receiver when tracking the GPS signals in challenging environments. Results are

  15. Fast transient analysis and first-stage collision-induced dissociation with the flowing atmospheric-pressure afterglow ionization source to improve analyte detection and identification.

    PubMed

    Shelley, Jacob T; Hieftje, Gary M

    2010-04-01

    The recent development of ambient desorption/ionization mass spectrometry (ADI-MS) has enabled fast, simple analysis of many different sample types. The ADI-MS sources have numerous advantages, including little or no required sample pre-treatment, simple mass spectra, and direct analysis of solids and liquids. However, problems of competitive ionization and limited fragmentation require sample-constituent separation, high mass accuracy, and/or tandem mass spectrometry (MS/MS) to detect, identify, and quantify unknown analytes. To maintain the inherent high throughput of ADI-MS, it is essential for the ion source/mass analyzer combination to measure fast transient signals and provide structural information. In the current study, the flowing atmospheric-pressure afterglow (FAPA) ionization source is coupled with a time-of-flight mass spectrometer (TOF-MS) to analyze fast transient signals (<500 ms FWHM). It was found that gas chromatography (GC) coupled with the FAPA source resulted in a reproducible (<5% RSD) and sensitive (detection limits of <6 fmol for a mixture of herbicides) system with analysis times of ca. 5 min. Introducing analytes to the FAPA in a transient was also shown to significantly reduce matrix effects caused by competitive ionization by minimizing the number and amount of constituents introduced into the ionization source. Additionally, MS/MS with FAPA-TOF-MS, enabling analyte identification, was performed via first-stage collision-induced dissociation (CID). Lastly, molecular and structural information was obtained across a fast transient peak by modulating the conditions that caused the first-stage CID.

  16. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples.

    PubMed

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart; Larsen, Pia Bükmann

    2017-12-01

    Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, the initial concentrations of the analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used to determine the stability of each analyte. Additionally, evaporation from the decapped blood collection tubes and the residual platelet count in the plasma after centrifugation were quantified. We report a post-analytical stability of most routine analytes of ≥8 h and therefore suggest - with few exceptions - a standard 8-hour time limit for reordering and reanalysis of analytes in incurred samples. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
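
    The evaluation described above, normalizing each re-analysis to the duplicate t=0 measurements and checking the bias against an acceptance criterion, can be sketched as follows. The concentrations and the ±1% limit are illustrative assumptions, not the study's actual data or criteria:

```python
# Hypothetical sketch of the post-analytical stability check: normalize each
# re-analysis to the mean of the duplicate t=0 measurements and call the
# analyte stable while the percent bias stays inside the acceptance limit.
def percent_bias(c_t, c0_duplicates):
    c0 = sum(c0_duplicates) / len(c0_duplicates)   # baseline from duplicates
    return 100.0 * (c_t - c0) / c0

def is_stable(biases, limit_pct):
    return all(abs(b) <= limit_pct for b in biases)

# illustrative sodium-like values: duplicate baselines 140/141 mmol/L,
# re-analyses at 2, 4, 6 and 8 h
biases = [percent_bias(c, (140.0, 141.0)) for c in (140.2, 140.8, 141.5, 141.8)]
print(is_stable(biases, limit_pct=1.0))  # True: every bias within ±1 %
```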

  17. Analytical determination of flavonoids aimed to analysis of natural samples and active packaging applications.

    PubMed

    Castro-López, María del Mar; López-Vilariño, José Manuel; González-Rodríguez, María Victoria

    2014-05-01

    Several HPLC and UHPLC methods were developed and compared to analyse the natural antioxidants catechins and quercetin used in active packaging and functional foods. A photodiode-array detector coupled with a fluorescence detector was used and compared with LTQ-Orbitrap-MS. UHPLC was investigated as a quick alternative that shortened analysis time up to 6-fold without compromising the separation. The feasibility of the four developed methods was compared. Linearity up to 0.9995, low detection limits (between 0.02 and 0.7 for HPLC-PDA, 2- to 7-fold lower for HPLC-LTQ-Orbitrap-MS, and from 0.2 to 2 mg L(-1) for UHPLC-PDA) and good precision (RSD lower than 0.06%) were obtained. All methods were successfully applied to natural samples. LTQ-Orbitrap-MS also allowed identification of other analytes of interest. Good feasibility of the methods was also concluded from the analysis of catechin and quercetin release from new polypropylene-based active packaging materials containing catechins and green tea. Copyright © 2013 Elsevier Ltd. All rights reserved.
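
    Calibration figures like the linearity and detection limits quoted above are commonly derived from a regression line; one widespread convention estimates the detection limit as 3.3·σ/slope, with σ the residual standard deviation of the fit. The data and the convention below are assumptions for illustration, not the paper's method or values:

```python
import numpy as np

# Assumed calibration standards and responses (illustrative, not the paper's data)
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # mg/L
area = np.array([52.0, 101.0, 198.0, 503.0, 999.0])  # detector response

slope, intercept = np.polyfit(conc, area, 1)         # least-squares line
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                        # n - 2 degrees of freedom
lod = 3.3 * sigma / slope                            # 3.3*sigma/slope convention
r = np.corrcoef(conc, area)[0, 1]                    # linearity
print(f"r = {r:.4f}, LOD = {lod:.2f} mg/L")
```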

  18. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Definitions of data science sound very similar to those of data analytics, which leads to much of the confusion between the two. But the skills needed for both - co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise - although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself), providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) calls for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science.

  19. Frequency Estimator Performance for a Software-Based Beacon Receiver

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for use in a Q/V-band beacon receiver, an analysis of six frequency estimators was conducted to characterize their effectiveness as it relates to beacon receiver design.
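
    One common fine-resolution estimator of the kind such a comparison would include is parabolic interpolation of the log-magnitude FFT peak, which refines the coarse bin estimate to a small fraction of a bin. The sketch below is illustrative and not taken from the paper; the sample rate and tone are assumed:

```python
import numpy as np

def estimate_frequency(x, fs):
    """Coarse FFT peak search refined by parabolic interpolation of the
    log-magnitude spectrum around the peak bin."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = int(np.argmax(spec[1:-1])) + 1               # coarse peak, edges excluded
    a, b, c = np.log(spec[k - 1 : k + 2])
    delta = 0.5 * (a - c) / (a - 2.0 * b + c)        # fractional-bin offset
    return (k + delta) * fs / n

fs = 10_000.0                                        # assumed sample rate
t = np.arange(4096) / fs
tone = np.cos(2 * np.pi * 1234.5 * t)                # simulated beacon tone
# the raw FFT bin spacing is fs/n ≈ 2.44 Hz; the refined estimate is far finer
print(round(estimate_frequency(tone, fs), 1))        # ≈ 1234.5
```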
