Sample records for "set includes analytical"

  1. RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.

    PubMed

    Brown, Lawrence J

    2015-10-01

    This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects in either the analyst or patient or both: affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, object relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.

  2. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  3. Organic materials able to detect analytes

    NASA Technical Reports Server (NTRS)

    Swager, Timothy M. (Inventor); Zhu, Zhengguo (Inventor); Bulovic, Vladimir (Inventor); Rose, Aimee (Inventor); Madigan, Conor Francis (Inventor)

    2012-01-01

    The present invention generally relates to polymers with lasing characteristics that allow the polymers to be useful in detecting analytes. In one aspect, the polymer, upon an interaction with an analyte, may exhibit a change in a lasing characteristic that can be determined in some fashion. For example, interaction of an analyte with the polymer may affect the ability of the polymer to reach an excited state that allows stimulated emission of photons to occur, which may be determined, thereby determining the analyte. In another aspect, the polymer, upon interaction with an analyte, may exhibit a change in stimulated emission that is at least 10 times greater with respect to a change in the spontaneous emission of the polymer upon interaction with the analyte. The polymer may be a conjugated polymer in some cases. In one set of embodiments, the polymer includes one or more hydrocarbon side chains, which may be parallel to the polymer backbone in some instances. In another set of embodiments, the polymer may include one or more pendant aromatic rings. In yet another set of embodiments, the polymer may be substantially encapsulated in a hydrocarbon. In still another set of embodiments, the polymer may be substantially resistant to photobleaching. In certain aspects, the polymer may be useful in the detection of explosive agents, such as 2,4,6-trinitrotoluene (TNT) and 2,4-dinitrotoluene (DNT).

  4. Expanding the analyte set of the JPL Electronic Nose to include inorganic compounds

    NASA Technical Reports Server (NTRS)

    Ryan, M. A.; Homer, M. L.; Zhou, H.; Mannat, K.; Manfreda, A.; Kisor, A.; Shevade, A.; Yen, S. P. S.

    2005-01-01

    An array-based sensing system based on 32 polymer/carbon composite conductometric sensors is under development at JPL. Until the present phase of development, the analyte set has focused on organic compounds and a few selected inorganic compounds, notably ammonia and hydrazine.

  5. Niosh analytical methods for Set G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-12-01

    Industrial Hygiene sampling and analytical monitoring methods validated under the joint NIOSH/OSHA Standards Completion Program for Set G are contained herein. Monitoring methods for the following compounds are included: butadiene, heptane, ketene, methyl cyclohexane, octachloronaphthalene, pentachloronaphthalene, petroleum distillates, propylene dichloride, turpentine, dioxane, hexane, LPG, naphtha (coal tar), octane, pentane, propane, and Stoddard solvent.

  6. NHEXAS PHASE I REGION 5 STUDY--METALS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 1,906 dust samples. Dust samples were collected to assess potential residential sources of dermal and inhalation exposures and to examine relationships between analyte levels in dust and in personal and bioma...

  7. Spectral multivariate calibration without laboratory prepared or determined reference analyte values.

    PubMed

    Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H

    2013-02-05

    An essential part of calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.
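
    A minimal Python sketch may help make the stated trade-off concrete: the regression vector is asked to give unit response to the analyte pure-component spectrum, to be orthogonal to the nonanalyte spectra, and to stay small (shrinkage). The stacked least-squares formulation, the weights, and the synthetic spectra below are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def pctr(s, N, lam, eta):
          """s: (p,) analyte pure-component spectrum; N: (m, p) nonanalyte spectra;
          lam: shrinkage weight; eta: weight on the unit-response condition."""
          p = s.size
          # Stack three conditions: eta*(s @ b) = eta (unit analyte response),
          # N @ b = 0 (orthogonality to nonanalyte content), sqrt(lam)*b = 0 (shrinkage).
          A = np.vstack([eta * s[None, :], N, np.sqrt(lam) * np.eye(p)])
          y = np.concatenate([[eta], np.zeros(N.shape[0] + p)])
          b, *_ = np.linalg.lstsq(A, y, rcond=None)
          return b  # predict with x @ b for a measured spectrum x

      # Toy usage with synthetic Gaussian bands
      grid = np.arange(100)
      s = np.exp(-0.5 * ((grid - 50) / 5.0) ** 2)                  # analyte band
      rng = np.random.default_rng(1)
      N = rng.normal(0, 0.05, (10, 100)) + np.exp(-0.5 * ((grid - 30) / 8.0) ** 2)
      b = pctr(s, N, lam=1e-3, eta=10.0)
      print(float(s @ b))                                          # approx. 1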

  8. The Case for Adopting Server-side Analytics

    NASA Astrophysics Data System (ADS)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm as an alternative to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier to transport to and store locally by the user, and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is often required with these data sets and that drives much of the throughput challenge. NASA's Big Data Task Force studied this issue. This paper will present the results of this study, including examples of server-side analytics systems that are being developed and demonstrated, and suggestions for architectures that might be developed for future applications.
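
    The paradigm is easy to sketch: computation runs next to the data store and only the reduced result crosses the network. The toy HTTP service below (Python standard library plus NumPy, with an invented in-memory "store" and invented query parameters) is a hypothetical illustration of the pattern, not any of the systems surveyed by the Task Force.

      import json
      import numpy as np
      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      # Hypothetical "data store" living next to the compute: one gridded variable.
      DATA = {"temperature": np.random.default_rng(0).normal(288.0, 5.0, (12, 90, 180))}

      class AnalyticsHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              q = parse_qs(urlparse(self.path).query)
              var = q.get("var", ["temperature"])[0]
              op = q.get("op", ["mean"])[0]             # e.g. mean, max, min, std
              result = getattr(np, op)(DATA[var])       # reduce near the data
              body = json.dumps({"var": var, "op": op, "value": float(result)}).encode()
              self.send_response(200)
              self.send_header("Content-Type", "application/json")
              self.end_headers()
              self.wfile.write(body)                    # only the small answer ships

      if __name__ == "__main__":
          # e.g. curl "http://localhost:8000/?var=temperature&op=mean"
          HTTPServer(("localhost", 8000), AnalyticsHandler).serve_forever()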

  9. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKES

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 49 field control samples (spikes). Measurements were made for up to 11 metals in samples of water, blood, and urine. Field controls were used to assess recovery of target analytes from a sample media during s...

  10. NHEXAS PHASE I REGION 5 STUDY--METALS IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 165 blood samples. These samples were collected to examine the relationships between personal exposure measurements, environmental measurements, and body burden. Venous blood samples were collected by venipun...

  11. NHEXAS PHASE I REGION 5 STUDY--VOCS IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    This data set includes analytical results for measurements of VOCs (volatile organic compounds) in 145 blood samples. These samples were collected to examine the relationships between personal exposure measurements, environmental measurements, and body burden. Venous blood sample...

  12. The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2017-12-01

    NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations, which are exposed through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables, a near real-time capability that enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following (two of the listed operations are sketched in code after this record):

    - A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
    - A Docker-ready RES application to deploy across platforms
    - Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages
    - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly)
    - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55, and NOAA/ESRL 20CR…
    - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
    - Supporting analytic services for NASA GMAO Forward Processing datasets
    - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
    - The ability to compute and visualize multiple reanalyses for ease of intercomparison
    - Automated tools to retrieve and prepare data collections for analytic processing
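
    Two of the operations named above, area averaging and anomaly extraction, are simple enough to sketch over a (time, lat, lon) array such as a reanalysis monthly-means variable. The cosine-latitude weighting and the array layout are conventional assumptions for illustration; this is not CDSlib code.

      import numpy as np

      def area_average(field, lats):
          """Cosine-latitude weighted spatial mean: one value per time step."""
          w = np.cos(np.deg2rad(lats))[None, :, None]              # shape (1, lat, 1)
          return (field * w).sum(axis=(1, 2)) / (w.sum() * field.shape[2])

      def anomaly(field):
          """Deviation from the time-mean climatology at each grid point."""
          return field - field.mean(axis=0, keepdims=True)

      # Toy usage on a synthetic 12-month, 1-degree temperature field
      lats = np.linspace(-89.5, 89.5, 180)
      t2m = 288.0 + np.random.default_rng(0).normal(0.0, 2.0, (12, 180, 360))
      print(area_average(t2m, lats).shape)                         # (12,)
      print(anomaly(t2m).shape)                                    # (12, 180, 360)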

  13. The analytic setting today: using the couch or the chair?

    PubMed

    Wiener, Jan

    2015-09-01

    This paper re-visits Murray Jackson's 1961 paper in the Journal of Analytical Psychology, 'Chair, couch and countertransference', with the aim of exploring the role of the couch for Jungian analysts in clinical practice today. Within the Society of Analytical Psychology (SAP) and some other London-based societies, there has been an evolution of practice from face-to-face sessions with the patient in the chair, as was Jung's preference, to a mode of practice where patients use the couch with the analyst sitting to the side rather than behind, as has been the tradition in psychoanalysis. Fordham was the founding member of the SAP and it was because of his liaison with psychoanalysis and psychoanalysts that this cultural shift came about. Using clinical examples, the author explores the couch/chair question in terms of her own practice and the internal setting as a structure in her mind. With reference to Bleger's (2013) paper 'Psychoanalysis of the psychoanalytic setting', the author discusses how the analytic setting, including use of the couch or the chair, can act as a silent container for the most primitive aspects of the patient's psyche which will only emerge in analysis when the setting changes or is breached. © 2015, The Society of Analytical Psychology.

  14. Extended analytical formulas for the perturbed Keplerian motion under a constant control acceleration

    NASA Astrophysics Data System (ADS)

    Zuiani, Federico; Vasile, Massimiliano

    2015-03-01

    This paper presents a set of analytical formulae for the perturbed Keplerian motion of a spacecraft under the effect of a constant control acceleration. The proposed set of formulae can treat control accelerations that are fixed in either a rotating or inertial reference frame. Moreover, the contribution of the zonal harmonic is included in the analytical formulae. It will be shown that the proposed analytical theory allows for the fast computation of long, multi-revolution spirals while maintaining good accuracy. The combined effect of different perturbations and of the shadow regions due to solar eclipse is also included. Furthermore, a simplified control parameterisation is introduced to optimise thrusting patterns with two thrust arcs and two coast arcs per revolution. This simple parameterisation is shown to ensure enough flexibility to describe complex low thrust spirals. The accuracy and speed of the proposed analytical formulae are compared against a full numerical integration with different integration schemes. An averaging technique is then proposed as an application of the analytical formulae. Finally, the paper presents an example of design of an optimal low-thrust spiral to transfer a spacecraft from an elliptical to a circular orbit around the Earth.

  15. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  16. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    USGS Publications Warehouse

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
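
    The conversion model itself is ordinary least squares between paired sigma-PCB values. The sketch below reproduces that shape of analysis on synthetic data; the 0.93 capture fraction echoes the 93% figure quoted above, and everything else is invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      sum_209 = rng.lognormal(mean=3.0, sigma=0.8, size=60)            # full-congener sigma-PCB
      sum_119 = 0.93 * sum_209 * rng.normal(1.0, 0.05, size=60)        # subset captures ~93%

      # Fit the conversion model: predict full-method sigma-PCB from the subset method
      slope, intercept = np.polyfit(sum_119, sum_209, 1)
      predicted = slope * sum_119 + intercept

      # Mean relative percent difference between predicted and measured values
      rpd = 100.0 * np.abs(predicted - sum_209) / ((predicted + sum_209) / 2.0)
      print(f"slope={slope:.3f} intercept={intercept:.2f} mean RPD={rpd.mean():.1f}%")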

  17. Quantum Dot and Polymer Composite Cross-Reactive Array for Chemical Vapor Detection.

    PubMed

    Bright, Collin J; Nallon, Eric C; Polcha, Michael P; Schnee, Vincent P

    2015-12-15

    A cross-reactive chemical sensing array was made from CdSe Quantum Dots (QDs) and five different organic polymers by inkjet printing to create segmented fluorescent composite regions on quartz substrates. The sensor array was challenged with exposures from two sets of analytes, including one set of 14 different functionalized benzenes and one set of 14 compounds related to security concerns, including the explosives trinitrotoluene (TNT) and ammonium nitrate. The array was broadly responsive to analytes with different chemical functionalities due to the multiple sensing mechanisms that altered the QDs' fluorescence. The sensor array displayed excellent discrimination between members within both sets. Classification accuracy of more than 93% was achieved, including the complete discrimination of very similar dinitrobenzene isomers and three halogenated, substituted benzene compounds. The simple fabrication, broad responsivity, and high discrimination capacity of this type of cross-reactive array are ideal qualities for the development of sensors with excellent sensitivity to chemical and explosive threats while maintaining low false alarm rates.

  18. NHEXAS PHASE I ARIZONA STUDY--METALS IN SOIL ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Soil data set contains analytical results for measurements of up to 11 metals in 551 soil samples over 392 households. Samples were taken by collecting surface soil in the yard and next to the foundation from each residence. The primary metals of interest include ...

  19. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR VOCS IN BLANKS

    EPA Science Inventory

    This data set includes analytical results for measurements of VOCs in 88 blank samples. Measurements were made for up to 23 VOCs in blank samples of air, water, and blood. Blank samples were used to assess the potential for sample contamination during collection, storage, shipmen...

  20. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR VOCS IN REPLICATES

    EPA Science Inventory

    This data set includes analytical results for measurements of VOCs in 204 duplicate (replicate) samples. Measurements were made for up to 23 VOCs in samples of air, water, and blood. Duplicate samples (samples collected along with or next to the original samples) were collected t...

  21. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN REPLICATES

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 490 duplicate (replicate) samples and for particles in 130 duplicate samples. Measurements were made for up to 11 metals in samples of air, dust, water, blood, and urine. Duplicate samples (samples collected ...

  22. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE PAGES

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...

    2018-03-28

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  23. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  24. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN SOIL ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Soil data set contains analytical results for measurements of up to 11 metals in 91 soil samples over 91 households. Samples were taken by collecting surface soil in the yard of each residence. The primary metals of interest include lead (CAS# 7439-92-1), arsenic ...

  25. NHEXAS PHASE I REGION 5 STUDY--QA ANALYTICAL RESULTS FOR METALS IN BLANKS

    EPA Science Inventory

    This data set includes analytical results for measurements of metals in 205 blank samples and for particles in 64 blank samples. Measurements were made for up to 12 metals in blank samples of air, dust, soil, water, food and beverages, blood, hair, and urine. Blank samples were u...

  26. Application of the correlation constrained multivariate curve resolution alternating least-squares method for analyte quantitation in the presence of unexpected interferences using first-order instrumental data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C; Tauler, Romà

    2010-03-01

    Correlation constrained multivariate curve resolution-alternating least-squares is shown to be a feasible method for processing first-order instrumental data and achieving analyte quantitation in the presence of unexpected interferences. Both for simulated and experimental data sets, the proposed method could correctly retrieve the analyte and interference spectral profiles and perform accurate estimations of analyte concentrations in test samples. Since no information concerning the interferences was present in calibration samples, the proposed multivariate calibration approach including the correlation constraint facilitates the achievement of the so-called second-order advantage for the analyte of interest, which is otherwise associated with more complex, richer, higher-order instrumental data. The proposed method is tested using a simulated data set and two experimental data systems, one for the determination of ascorbic acid in powder juices using UV-visible absorption spectral data, and another for the determination of tetracycline in serum samples using fluorescence emission spectroscopy.
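
    A toy version of the alternating least-squares loop with a correlation constraint can be written in a few lines: after each concentration update, the analyte's scores are regressed onto the known calibration concentrations so predictions stay on the reference scale. Nonnegativity and the other constraints used in practice are omitted, and all names and details below are illustrative assumptions, not the authors' code.

      import numpy as np

      def mcr_als_corr(D, S0, y_cal, cal_rows, analyte=0, n_iter=50):
          """D: (samples, wavelengths) data; S0: (wavelengths, components) initial
          spectra; y_cal: reference analyte concentrations for the rows cal_rows."""
          S = S0.copy()
          for _ in range(n_iter):
              C = D @ np.linalg.pinv(S).T                   # concentration update
              # Correlation constraint: map analyte scores onto the reference scale
              a, b = np.polyfit(C[cal_rows, analyte], y_cal, 1)
              C[:, analyte] = a * C[:, analyte] + b
              S = (np.linalg.pinv(C) @ D).T                 # spectral profile update
          return C, S          # C[:, analyte] holds predicted analyte concentrations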

  27. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
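
    A minimal particle swarm optimization sketch shows the calibration idea: particles explore parameter space and move toward their own and the swarm's best objective values. The objective below is a stand-in for the badness-of-fit of a semi-analytic model against observables; hyperparameters and bounds are illustrative.

      import numpy as np

      def pso(objective, bounds, n_particles=30, n_steps=200, w=0.7, c1=1.5, c2=1.5):
          """Minimize objective over a box; bounds is an (n_dim, 2) array."""
          rng = np.random.default_rng(0)
          lo, hi = bounds[:, 0], bounds[:, 1]
          x = rng.uniform(lo, hi, (n_particles, lo.size))       # positions
          v = np.zeros_like(x)                                  # velocities
          pbest = x.copy()
          pbest_f = np.array([objective(p) for p in x])
          g = pbest[pbest_f.argmin()].copy()                    # swarm best
          for _ in range(n_steps):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)
              f = np.array([objective(p) for p in x])
              improved = f < pbest_f
              pbest[improved], pbest_f[improved] = x[improved], f[improved]
              g = pbest[pbest_f.argmin()].copy()
          return g, pbest_f.min()

      # Stand-in objective (e.g., misfit of a SAM against a stellar mass function)
      best, fbest = pso(lambda p: ((p - np.array([0.3, 2.0])) ** 2).sum(),
                        bounds=np.array([[0.0, 1.0], [0.0, 5.0]]))
      print(best, fbest)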

  28. betaFIT: A computer program to fit pointwise potentials to selected analytic functions

    NASA Astrophysics Data System (ADS)

    Le Roy, Robert J.; Pashov, Asen

    2017-01-01

    This paper describes program betaFIT, which performs least-squares fits of sets of one-dimensional (or radial) potential function values to four different types of sophisticated analytic potential energy functional forms. These families of potential energy functions are: the Expanded Morse Oscillator (EMO) potential [J Mol Spectrosc 1999;194:197], the Morse/Long-Range (MLR) potential [Mol Phys 2007;105:663], the Double Exponential/Long-Range (DELR) potential [J Chem Phys 2003;119:7398], and the "Generalized Potential Energy Function (GPEF)" form introduced by Šurkus et al. [Chem Phys Lett 1984;105:291], which includes a wide variety of polynomial potentials, such as the Dunham [Phys Rev 1932;41:713], Simons-Parr-Finlan [J Chem Phys 1973;59:3229], and Ogilvie-Tipping [Proc R Soc A 1991;378:287] polynomials, as special cases. This code will be useful for providing the realistic sets of potential function shape parameters that are required to initiate direct fits of selected analytic potential functions to experimental data, and for providing better analytical representations of sets of ab initio results.
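
    The core task, fitting pointwise potential values to an analytic form by least squares, can be sketched for the simplest member of the EMO family, where the exponent coefficient is constant and the form reduces to a Morse curve; betaFIT itself fits radially varying exponent expansions and the other families listed above. The synthetic data and starting values below are invented for illustration, using SciPy's curve_fit.

      import numpy as np
      from scipy.optimize import curve_fit

      def emo0(r, De, re, beta):
          """EMO potential with a constant exponent coefficient (a Morse curve)."""
          return De * (1.0 - np.exp(-beta * (r - re))) ** 2

      # Synthetic "ab initio" potential points with small noise
      rng = np.random.default_rng(3)
      r = np.linspace(1.5, 6.0, 40)
      V = emo0(r, De=4.6, re=2.1, beta=1.8) + rng.normal(0.0, 0.01, r.size)

      popt, pcov = curve_fit(emo0, r, V, p0=[4.0, 2.0, 1.0])
      print(popt)        # recovered De, re, beta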

  29. On trying something new: effort and practice in psychoanalytic change.

    PubMed

    Power, D G

    2000-07-01

    This paper describes one of the ingredients of successful psychoanalytic change: the necessity for the analysand to actively attempt altered patterns of thinking, behaving, feeling, and relating outside of the analytic relationship. When successful, such self-initiated attempts at change are founded on insight and experience gained in the transference and constitute a crucial step in the consolidation and transfer of therapeutic gains. The analytic literature related to this aspect of therapeutic action is reviewed, including the work of Freud, Bader, Rangell, Renik, Valenstein, and Wheelis. Recent interest in the complex and complementary relationship between action and increased self-understanding as it unfolds in the analytic setting is extended beyond the consulting room to include the analysand's extra-analytic attempts to initiate change. Contemporary views of the relationship between praxis and self-knowledge are discussed and offered as theoretical support for broadening analytic technique to include greater attention to the analysand's efforts at implementing therapeutic gains. Case vignettes are presented.

  30. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, Robert E.; Yueng, Edward S.

    1989-10-17

    A method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal.

  31. Method for improving the limit of detection in a data signal

    DOEpatents

    Synovec, R.E.; Yueng, E.S.

    1989-10-17

    Disclosed is a method for improving the limit of detection for a data set in which experimental noise is uncorrelated along a given abscissa and an analytical signal is correlated to the abscissa, the steps comprising collecting the data set, converting the data set into a data signal including an analytical portion and the experimental noise portion, designating and adjusting a baseline of the data signal to center the experimental noise numerically about a zero reference, and integrating the data signal preserving the corresponding information for each point of the data signal. The steps of the method produce an enhanced integrated data signal which improves the limit of detection of the data signal. 8 figs.
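
    A numerical sketch of the steps these two records describe: center the baseline noise about zero, then integrate point by point with a running sum, so that uncorrelated noise partially cancels while the correlated analytical signal accumulates. The signal shapes and the comparison printed below are invented for illustration, not taken from the patent.

      import numpy as np

      rng = np.random.default_rng(7)
      t = np.arange(1000)
      noise = rng.normal(0.0, 1.0, t.size)                       # uncorrelated noise
      peak = 0.5 * np.exp(-0.5 * ((t - 500) / 25.0) ** 2)        # weak correlated signal
      signal = peak + noise + 5.0                                # 5.0 = raw baseline offset

      centered = signal - np.median(signal)                      # noise centered about zero
      integrated = np.cumsum(centered)                           # point-preserving integral

      step = integrated[650] - integrated[350]                   # accumulation across the peak
      walk = np.std([rng.normal(0.0, 1.0, 300).sum() for _ in range(500)])
      print(f"raw peak SNR ~ {peak.max() / noise.std():.2f}")
      print(f"integral step {step:.1f} vs noise random-walk scale {walk:.1f}")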

  32. A ricin forensic profiling approach based on a complex set of biomarkers.

    PubMed

    Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister

    2018-08-15

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4) ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data was collected by use of a range of analytical methods and robust orthogonal partial least squares-discriminant analysis- models (OPLS-DA) were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved. Copyright © 2018 Elsevier B.V. All rights reserved.

  33. Capabilities for Intercultural Dialogue

    ERIC Educational Resources Information Center

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  34. Behavioral Analytic Approach to Placement of Patients in Community Settings.

    ERIC Educational Resources Information Center

    Glickman, Henry S.; And Others

    Twenty adult psychiatric outpatients were assessed by their primary therapists on the Current Behavior Inventory prior to placing them in community settings. The diagnoses included schizophrenia, major affective disorder, dysthymic disorder, and atypical paranoid disorder. The inventory assessed behaviors in four areas: independent community…

  35. Engage Families for Anywhere, Anytime Learning

    ERIC Educational Resources Information Center

    Weiss, Heather B.; Lopez, M. Elena

    2015-01-01

    As society expects children and youth today to explore content-area topics in depth and to develop critical-thinking, problem-solving, and analytical skills, out-of-school settings are becoming increasingly important to individual learning. These settings, which include libraries, museums, digital media, and after-school programs, are evolving…

  36. Analytic cognitive style predicts religious and paranormal belief.

    PubMed

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J; Fugelsang, Jonathan A

    2012-06-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined associations of God beliefs, religious engagement (attendance at religious services, praying, etc.), conventional religious beliefs (heaven, miracles, etc.) and paranormal beliefs (extrasensory perception, levitation, etc.) with performance measures of cognitive ability and analytic cognitive style. An analytic cognitive style negatively predicted both religious and paranormal beliefs when controlling for cognitive ability as well as religious engagement, sex, age, political ideology, and education. Participants more willing to engage in analytic reasoning were less likely to endorse supernatural beliefs. Further, an association between analytic cognitive style and religious engagement was mediated by religious beliefs, suggesting that an analytic cognitive style negatively affects religious engagement via lower acceptance of conventional religious beliefs. Results for types of God belief indicate that the association between an analytic cognitive style and God beliefs is more nuanced than mere acceptance and rejection, but also includes adopting less conventional God beliefs, such as Pantheism or Deism. Our data are consistent with the idea that two people who share the same cognitive ability, education, political ideology, sex, age and level of religious engagement can acquire very different sets of beliefs about the world if they differ in their propensity to think analytically. Copyright © 2012 Elsevier B.V. All rights reserved.

  37. Analytical applications of microbial fuel cells. Part I: Biochemical oxygen demand.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells (MFCs) are bio-electrochemical devices, where usually the anode (but sometimes the cathode, or both) contains microorganisms able to generate and sustain an electrochemical gradient, which is typically used to generate electrical power. In the more studied set-up, the anode contains heterotrophic bacteria in anaerobic conditions, capable of oxidizing organic molecules and releasing protons and electrons, as well as other by-products. Released protons reach the cathode (through a membrane or not) whereas electrons travel across an external circuit, originating an easily measurable direct current flow. MFCs have been proposed fundamentally as electric power producing devices or, more recently, as hydrogen producing devices. Here we review the still incipient development of analytical uses of MFCs or related devices or set-ups, in the light of a non-restrictive MFC definition, as promising tools to assess water quality or other measurable parameters. An introduction to biologically based analytical methods, including bioassays and biosensors, as well as MFC designs and operating principles, is also included. Besides, the use of MFCs as biochemical oxygen demand sensors (perhaps the main analytical application of MFCs) is discussed. In a companion review (Part 2), other new analytical applications are reviewed, including toxicity sensors, metabolic sensors, life detectors, and other proposed applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  38. Solving Differential Equations Analytically. Elementary Differential Equations. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Unit 335.

    ERIC Educational Resources Information Center

    Goldston, J. W.

    This unit introduces analytic solutions of ordinary differential equations. The objective is to enable the student to decide whether a given function solves a given differential equation. Examples of problems from biology and chemistry are covered. Problem sets, quizzes, and a model exam are included, and answers to all items are provided. The…

  39. Nonlinear core deflection in injection molding

    NASA Astrophysics Data System (ADS)

    Poungthong, P.; Giacomin, A. J.; Saengow, C.; Kolitawong, C.; Liao, H.-C.; Tseng, S.-C.

    2018-05-01

    Injection molding of thin slender parts is often complicated by core deflection. This deflection is caused by molten plastics race tracking through the slit between the core and the rigid cavity wall. The pressure of this liquid exerts a lateral force on the slender core, causing the core to bend, and this bending is governed by a nonlinear fifth-order ordinary differential equation for the deflection as a function of position along the core, which cannot be solved directly. Here we subject this differential equation to 6 sets of boundary conditions, corresponding to 6 commercial core constraints. For each such set of boundary conditions, we develop an explicit approximate analytical solution, including both a linear term and a nonlinear term. By comparison with finite difference solutions, we find our new analytical solutions to be accurate. We then use these solutions to derive explicit analytical approximations for maximum deflections and for the core position of these maximum deflections. Our experiments on the base-gated free-tip boundary condition agree closely with our new explicit approximate analytical solution.

  40. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  41. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
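
    In the spirit of the approach described (setting an objective threshold from characterized method background noise), a minimal sketch: tabulate noise read counts at known-negative positions and place the threshold a chosen number of standard deviations above their mean. The counts and the k*sigma convention below are illustrative assumptions, not the authors' exact estimator.

      import numpy as np

      # Hypothetical background (noise) read counts at known-negative positions
      noise_counts = np.array([3, 0, 5, 2, 1, 4, 7, 2, 0, 3, 6, 1, 2, 4, 3])

      mu = noise_counts.mean()
      sd = noise_counts.std(ddof=1)
      k = 3.0                                    # coverage factor, a common choice
      threshold = mu + k * sd
      print(f"analytical threshold: {threshold:.1f} reads")   # lower counts treated as noise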

  42. Tests of a Semi-Analytical Case 1 and Gelbstoff Case 2 SeaWiFS Algorithm with a Global Data Set

    NASA Technical Reports Server (NTRS)

    Carder, Kendall L.; Hawes, Steve K.; Lee, Zhongping

    1997-01-01

    A semi-analytical algorithm was tested with a total of 733 points of either unpackaged or packaged-pigment data, with corresponding algorithm parameters for each data type. The 'unpackaged' type consisted of data sets that were generally consistent with the Case 1 CZCS algorithm and other well calibrated data sets. The 'packaged' type consisted of data sets apparently containing somewhat more packaged pigments, requiring modification of the absorption parameters of the model consistent with the CalCOFI study area. This resulted in two equally divided data sets. A more thorough scrutiny of these and other data sets using a semianalytical model requires improved knowledge of the phytoplankton and gelbstoff of the specific environment studied. Since the semi-analytical algorithm is dependent upon 4 spectral channels including the 412 nm channel, while most other algorithms are not, a means of testing data sets for consistency was sought. A numerical filter was developed to classify data sets into the above classes. The filter uses reflectance ratios, which can be determined from space. The sensitivity of such numerical filters to measurement errors resulting from atmospheric correction and sensor noise requires further study. The semi-analytical algorithm performed superbly on each of the data sets after classification, resulting in RMS1 errors of 0.107 and 0.121, respectively, for the unpackaged and packaged data-set classes, with little bias and slopes near 1.0. In combination, the RMS1 performance was 0.114. While these numbers appear rather sterling, one must bear in mind what mis-classification does to the results. Using an average or compromise parameterization on the modified global data set yielded an RMS1 error of 0.171, while using the unpackaged parameterization on the global evaluation data set yielded an RMS1 error of 0.284. So, without classification, the algorithm performs better globally using the average parameters than it does using the unpackaged parameters. Finally, the effects of even more extreme pigment packaging must be examined in order to improve algorithm performance at high latitudes. Note, however, that the North Sea and Mississippi River plume studies contributed data to the packaged and unpackaged classes, respectively, with little effect on algorithm performance. This suggests that gelbstoff-rich Case 2 waters do not seriously degrade performance of the semi-analytical algorithm.

  43. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with other laboratories through external quality control. In this way, the laboratory has a tool to verify that the set objectives are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals, to determine compliance with pre-determined specifications (Stockholm Consensus (1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random, and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
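
    The arithmetic behind such a protocol is compact: from repeated control measurements, compute the mean, standard deviation, CV%, systematic error (bias against the assigned value), and total error, then compare with the quality specification. The numbers and the TE = bias + 1.65*CV convention below are illustrative; laboratories apply their own specifications and rules.

      import numpy as np

      control = np.array([5.1, 4.9, 5.3, 5.0, 5.2, 4.8, 5.1, 5.0])  # mmol/L, hypothetical
      target = 5.0                                                   # assigned control value
      spec_te = 10.0                                                 # allowed total error, %

      mean, sd = control.mean(), control.std(ddof=1)
      cv = 100.0 * sd / mean                       # random error (CV%)
      bias = 100.0 * abs(mean - target) / target   # systematic error (%)
      total_error = bias + 1.65 * cv               # one common TE convention
      print(f"CV={cv:.1f}%  bias={bias:.1f}%  TE={total_error:.1f}%  pass={total_error <= spec_te}")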

  44. Method Development in Forensic Toxicology.

    PubMed

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high quality analytical methods is a thorough method development. The presented article will provide an overview on the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems as well as establishing a versatile sample preparation. Method development is concluded by an optimization process after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  45. Systems and Methods for Correcting Optical Reflectance Measurements

    NASA Technical Reports Server (NTRS)

    Yang, Ye (Inventor); Shear, Michael A. (Inventor); Soller, Babs R. (Inventor); Soyemi, Olusola O. (Inventor)

    2014-01-01

    We disclose measurement systems and methods for measuring analytes in target regions of samples that also include features overlying the target regions. The systems include: (a) a light source; (b) a detection system; (c) a set of at least first, second, and third light ports which transmit light from the light source to a sample and receive and direct light reflected from the sample to the detection system, generating a first set of data including information corresponding to both an internal target within the sample and features overlying the internal target, and a second set of data including information corresponding to features overlying the internal target; and (d) a processor configured to remove information characteristic of the overlying features from the first set of data using the first and second sets of data to produce corrected information representing the internal target.

  46. Systems and methods for correcting optical reflectance measurements

    NASA Technical Reports Server (NTRS)

    Yang, Ye (Inventor); Soller, Babs R. (Inventor); Soyemi, Olusola O. (Inventor); Shear, Michael A. (Inventor)

    2009-01-01

    We disclose measurement systems and methods for measuring analytes in target regions of samples that also include features overlying the target regions. The systems include: (a) a light source; (b) a detection system; (c) a set of at least first, second, and third light ports which transmit light from the light source to a sample and receive and direct light reflected from the sample to the detection system, generating a first set of data including information corresponding to both an internal target within the sample and features overlying the internal target, and a second set of data including information corresponding to features overlying the internal target; and (d) a processor configured to remove information characteristic of the overlying features from the first set of data using the first and second sets of data to produce corrected information representing the internal target.
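
    The correction idea in this patent family can be sketched as removing the least-squares projection of a shallow-feature spectrum from the combined measurement, so that mostly the internal-target signature remains. The single-basis subtraction below is a simplification of the multi-port processing described, with all spectra invented for illustration.

      import numpy as np

      rng = np.random.default_rng(5)
      wl = np.linspace(700, 1000, 150)                       # wavelengths, nm
      target = np.exp(-0.5 * ((wl - 840) / 30.0) ** 2)       # internal-target signature
      overlying = 0.8 * np.exp(-0.5 * ((wl - 920) / 40.0) ** 2)

      combined = target + overlying + rng.normal(0, 0.01, wl.size)   # deep measurement
      shallow = overlying + rng.normal(0, 0.01, wl.size)             # shallow-port measurement

      alpha = (combined @ shallow) / (shallow @ shallow)     # least-squares scale factor
      corrected = combined - alpha * shallow                 # mostly the target remains
      print(float(np.corrcoef(corrected, target)[0, 1]))     # close to 1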

  47. Direction-Setting School Leadership Practices: A Meta-Analytical Review of Evidence about Their Influence

    ERIC Educational Resources Information Center

    Sun, Jingping; Leithwood, Kenneth

    2015-01-01

    This study reviews evidence about the overall influence of direction-setting leadership practices (DSLPs), 1 of 4 major categories of practices included in a widely known conception of effective leadership (e.g., Leithwood & Louis, 2011) and a focus of many other such conceptions, as well. This study also inquires about how direction-setting…

  48. Complete set of homogeneous isotropic analytic solutions in scalar-tensor cosmology with radiation and curvature

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-10-01

    We study a model of a scalar field minimally coupled to gravity, with a specific potential energy for the scalar field, and include curvature and radiation as two additional parameters. Our goal is to obtain analytically the complete set of configurations of a homogeneous and isotropic universe as a function of time. This leads to a geodesically complete description of the Universe, including the passage through the cosmological singularities, at the classical level. We give all the solutions analytically without any restrictions on the parameter space of the model or initial values of the fields. We find that for generic solutions the Universe goes through a singular (zero-size) bounce by entering a period of antigravity at each big crunch and exiting from it at the following big bang. This happens cyclically again and again without violating the null-energy condition. There is a special subset of geodesically complete nongeneric solutions which perform zero-size bounces without ever entering the antigravity regime in all cycles. For these, initial values of the fields are synchronized and quantized but the parameters of the model are not restricted. There is also a subset of spatial curvature-induced solutions that have finite-size bounces in the gravity regime and never enter the antigravity phase. These exist only within a small continuous domain of parameter space without fine-tuning the initial conditions. To obtain these results, we identified 25 regions of a 6-parameter space in which the complete set of analytic solutions are explicitly obtained.
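
    For orientation, the standard spatially curved Friedmann background for a minimally coupled scalar field with radiation, written here in reduced Planck units, is the system whose solution space the paper classifies; the specific potential and the Weyl-invariant variables that expose the antigravity regime are not reproduced in this sketch.

      \begin{align}
        H^{2} &= \frac{1}{3}\left(\tfrac{1}{2}\dot{\phi}^{2} + V(\phi) + \frac{\rho_{r}}{a^{4}}\right) - \frac{k}{a^{2}}, \\
        \ddot{\phi} &+ 3H\dot{\phi} + V'(\phi) = 0 .
      \end{align}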

  49. Modeling and analysis of energy quantization effects on single electron inverter performance

    NASA Astrophysics Data System (ADS)

    Dan, Surya Shankar; Mahapatra, Santanu

    2009-08-01

    In this paper, for the first time, the effects of energy quantization on single electron transistor (SET) inverter performance are analyzed through analytical modeling and Monte Carlo simulations. It is shown that energy quantization mainly changes the Coulomb blockade region and drain current of SET devices and thus affects the noise margin, power dissipation, and propagation delay of the SET inverter. A new analytical model for the noise margin of the SET inverter is proposed which includes the energy quantization effects. Using the noise margin as a metric, the robustness of the SET inverter is studied against the effects of energy quantization. A compact expression is developed for a novel parameter, the quantization threshold, which is introduced for the first time in this paper. The quantization threshold explicitly defines the maximum energy quantization that an SET inverter logic circuit can withstand before its noise margin falls below a specified tolerance level. It is found that an SET inverter designed with CT:CG = 1/3 (where CT and CG are the tunnel junction and gate capacitances, respectively) offers maximum robustness against energy quantization.

  50. Synthesized airfoil data method for prediction of dynamic stall and unsteady airloads

    NASA Technical Reports Server (NTRS)

    Gangwani, S. T.

    1983-01-01

    A detailed analysis of dynamic stall experiments has led to a set of relatively compact analytical expressions, called synthesized unsteady airfoil data, which accurately describe in the time-domain the unsteady aerodynamic characteristics of stalled airfoils. An analytical research program was conducted to expand and improve this synthesized unsteady airfoil data method using additional available sets of unsteady airfoil data. The primary objectives were to reduce these data to synthesized form for use in rotor airload prediction analyses and to generalize the results. Unsteady drag data were synthesized which provided the basis for successful expansion of the formulation to include computation of the unsteady pressure drag of airfoils and rotor blades. Also, an improved prediction model for airfoil flow reattachment was incorporated in the method. Application of this improved unsteady aerodynamics model has resulted in an improved correlation between analytic predictions and measured full scale helicopter blade loads and stress data.

  11. Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2016-12-01

    We are extending climate analytics-as-a-service, including: (1) a high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files; (2) a Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections, accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; and (3) an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following:
    - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services
    - A new reanalysis collections reference model to enable operator design and implementation
    - An enhanced library of sample queries to demonstrate and develop use-case scenarios
    - Extended operators that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, and trend, climatology, and anomaly computations
    - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses
    - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use-case scenarios and personalized project management
    - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products
    - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extraction of trends, climatologies, and anomalies
    - The ability to compute and visualize multiple-reanalysis intercomparisons
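    As an illustration of the kind of area-average and anomaly operation such a service exposes, the sketch below is written against the generic xarray library rather than CDSlib (whose actual API is not shown in this record); the file name, variable name, and coordinate names are hypothetical.

```python
# Minimal sketch of an anomaly + area-average computation of the kind the
# Reanalysis Ensemble Service exposes, written against xarray rather than
# CDSlib; file, variable, and coordinate names are hypothetical.
import numpy as np
import xarray as xr

ds = xr.open_dataset("merra2_t2m_monthly.nc")        # hypothetical MERRA-2 subset
t2m = ds["T2M"]                                      # 2-m air temperature

climatology = t2m.groupby("time.month").mean("time")  # monthly climatology
anomaly = t2m.groupby("time.month") - climatology     # anomaly per time step

# Area average over a lat/lon box, weighted by cos(latitude)
box = anomaly.sel(lat=slice(-30, 30), lon=slice(0, 90))
weights = np.cos(np.deg2rad(box["lat"]))
area_avg = box.weighted(weights).mean(("lat", "lon"))
print(area_avg.isel(time=-1).values)
```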

  12. hydropower biological evaluation tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software is a set of analytical tools to evaluate the physical and biological performance of existing, refurbished, or newly installed conventional hydro-turbines nationwide where fish passage is a regulatory concern. The current version is based on information collected by the Sensor Fish. Future versions will include other technologies. The tool set includes data acquisition, data processing, and biological response tools with applications to various turbine designs and other passage alternatives. The associated database is centralized and can be accessed remotely. We have demonstrated its use for various applications, including both turbines and spillways.

  13. CTEPP NC DATA QA/QC RESULTS

    EPA Science Inventory

    This data set contains the method performance results. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.

    The Children’s Total Exposure to Persistent Pesticides and Other Persistent Pollutant (...

  14. A Population-Level Data Analytics Portal for Self-Administered Lifestyle and Mental Health Screening.

    PubMed

    Zhang, Xindi; Warren, Jim; Corter, Arden; Goodyear-Smith, Felicity

    2016-01-01

    This paper describes development of a prototype data analytics portal for analysis of accumulated screening results from eCHAT (electronic Case-finding and Help Assessment Tool). eCHAT allows individuals to conduct a self-administered lifestyle and mental health screening assessment, with usage to date chiefly in the context of primary care waiting rooms. The intention is for wide roll-out to primary care clinics, including secondary school based clinics, resulting in the accumulation of population-level data. Data from a field trial of eCHAT with sexual health questions tailored to youth were used to support design of a data analytics portal for population-level data. The design process included user personas and scenarios, screen prototyping and a simulator for generating large-scale data sets. The prototype demonstrates the promise of wide-scale self-administered screening data to support a range of users including practice managers, clinical directors and health policy analysts.

  15. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    PubMed

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

    Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate over the years (P = 0.90). Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.
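    The phase-specific rates can be recomputed directly from the parenthetical counts quoted above; a minimal sketch, taking the counts and the denominator exactly as reported:

```python
# Recompute the phase-specific error rates from the counts quoted above;
# counts and denominator are taken verbatim from the abstract.
errors = {"pre-analytical": 2210, "analytical": 108, "post-analytical": 512}
denominator = 58_950

for phase, count in errors.items():
    print(f"{phase}: {100 * count / denominator:.1f}%")
```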

  16. The legal and ethical concerns that arise from using complex predictive analytics in health care.

    PubMed

    Cohen, I Glenn; Amarasingham, Ruben; Shah, Anand; Xie, Bin; Lo, Bernard

    2014-07-01

    Predictive analytics, or the use of electronic algorithms to forecast future events in real time, makes it possible to harness the power of big data to improve the health of patients and lower the cost of health care. However, this opportunity raises policy, ethical, and legal challenges. In this article we analyze the major challenges to implementing predictive analytics in health care settings and make broad recommendations for overcoming challenges raised in the four phases of the life cycle of a predictive analytics model: acquiring data to build the model, building and validating it, testing it in real-world settings, and disseminating and using it more broadly. For instance, we recommend that model developers implement governance structures that include patients and other stakeholders starting in the earliest phases of development. In addition, developers should be allowed to use already collected patient data without explicit consent, provided that they comply with federal regulations regarding research on human subjects and the privacy of health information. Project HOPE—The People-to-People Health Foundation, Inc.

  17. SATURN (Situational Awareness Tool for Urban Responder Networks)

    DTIC Science & Technology

    2012-07-01

    timeline. SATURN is applicable to a broad set of law enforcement, security, and counterterrorism missions typically addressed by urban responders... Keywords: video analytics; sensor fusion; video; urban responders. I. INTRODUCTION: Urban authorities have a broad set of missions. Duties vary in... both the frequency of occurrence and in the complexity of execution. They include everyday public safety missions such as traffic enforcement as

  18. Quality Assurance of RNA Expression Profiling in Clinical Laboratories

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L.

    2012-01-01

    RNA expression profiles are increasingly used to diagnose and classify disease, based on expression patterns of as many as several thousand RNAs. To ensure quality of expression profiling services in clinical settings, a standard operating procedure incorporates multiple quality indicators and controls, beginning with preanalytic specimen preparation and proceeding through analysis, interpretation, and reporting. Before testing, histopathological examination of each cellular specimen, along with optional cell enrichment procedures, ensures adequacy of the input tissue. Other tactics include endogenous controls to evaluate adequacy of RNA and exogenous or spiked controls to evaluate run- and patient-specific performance of the test system, respectively. Unique aspects of quality assurance for array-based tests include controls for the pertinent outcome signatures that often supersede controls for each individual analyte, built-in redundancy for critical analytes or biochemical pathways, and software-supported scrutiny of abundant data by a laboratory physician who interprets the findings in a manner facilitating appropriate medical intervention. Access to high-quality reagents, instruments, and software from commercial sources promotes standardization and adoption in clinical settings, once an assay is vetted in validation studies as being analytically sound and clinically useful. Careful attention to the well-honed principles of laboratory medicine, along with guidance from government and professional groups on strategies to preserve RNA and manage large data sets, promotes clinical-grade assay performance. PMID:22020152

  19. CTEPP-OH DATA QA/QC RESULTS

    EPA Science Inventory

    This data set contains the method performance results for CTEPP-OH. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.

    The Children’s Total Exposure to Persistent Pesticides and Other Persisten...

  20. Data Analytics of Hydraulic Fracturing Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jovan Yang; Viswanathan, Hari; Hyman, Jeffery

    This is a set of slides on the data analytics of hydraulic fracturing data. The conclusions from this research are the following: the authors proposed permeability evolution as a new mechanism to explain hydraulic fracturing trends; they created a model that includes this mechanism, and it showed promising results; the paper from this research is ready for submission; and they devised a way to identify and sort refractures in order to study their effects, with that paper currently being written.

  1. Analytical and Clinical Performance of Blood Glucose Monitors

    PubMed Central

    Boren, Suzanne Austin; Clarke, William L.

    2010-01-01

    Background The objective of this study was to understand the level of performance of blood glucose monitors as assessed in the published literature. Methods Medline from January 2000 to October 2009 and reference lists of included articles were searched to identify eligible studies. Key information was abstracted from eligible studies: blood glucose meters tested, blood sample, meter operators, setting, sample of people (number, diabetes type, age, sex, and race), duration of diabetes, years using a glucose meter, insulin use, recommendations followed, performance evaluation measures, and specific factors affecting the accuracy evaluation of blood glucose monitors. Results Thirty-one articles were included in this review. Articles were categorized as review articles of blood glucose accuracy (6 articles), original studies that reported the performance of blood glucose meters in laboratory settings (14 articles) or clinical settings (9 articles), and simulation studies (2 articles). A variety of performance evaluation measures were used in the studies. The authors did not identify any studies that demonstrated a difference in clinical outcomes. Examples of analytical tools used in the description of accuracy (e.g., correlation coefficient, linear regression equations, and International Organization for Standardization standards) and how these traditional measures can complicate the achievement of target blood glucose levels for the patient were presented. The benefits of using error grid analysis to quantify the clinical accuracy of patient-determined blood glucose values were discussed. Conclusions When examining blood glucose monitor performance in the real world, it is important to consider if an improvement in analytical accuracy would lead to improved clinical outcomes for patients. There are several examples of how analytical tools used in the description of self-monitoring of blood glucose accuracy could be irrelevant to treatment decisions. PMID:20167171

  2. Quality of Big Data in health care.

    PubMed

    Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K

    2015-01-01

    The current trend in Big Data analytics, and in particular in health information technology, is toward building sophisticated models, methods, and tools for business, operational, and clinical intelligence. However, the critical issue of the data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health privacy protection guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry, and processing technologies is to be viewed as a double-edged sword: at one level automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality. Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map that is better suited to the dimensions of Big Data and fits the different stages of the analytical workflow.

  3. Analytic Results for e+e- --> tt-bar and gammagamma --> tt-bar Observables near the Threshold up to the Next-to-Next-to-Leading Order of NRQCD

    NASA Astrophysics Data System (ADS)

    Penin, A. A.; Pivovarov, A. A.

    2001-02-01

    We present an analytical description of top-antitop pair production near the threshold in $e^+e^-$ annihilation and $\gamma\gamma$ collisions. The set of basic observables considered includes the total cross sections, the forward-backward asymmetry, and the top quark polarization. The threshold effects relevant for the basic observables are described by three universal functions related to S-wave production, P-wave production, and S-P interference. These functions are computed analytically up to the next-to-next-to-leading order of NRQCD. The total $e^+e^- \to t\bar t$ cross section near the threshold is obtained in the next-to-next-to-leading order in closed form, including the contribution due to the axial coupling of the top quark mediated by the Z boson. The effects of the running of the strong coupling constant and of the finite top quark width are taken into account analytically for the P-wave production and the S-P wave interference.

  4. Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience

    PubMed Central

    Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK

    2015-01-01

    Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. Materials and Methods: We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate over the years (P = 0.90). Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified. PMID:25745569

  5. Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results

    NASA Technical Reports Server (NTRS)

    Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.

    2006-01-01

    A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.

  6. Transportation Life Cycle Assessment (LCA) Synthesis, Phase II

    DOT National Transportation Integrated Search

    2018-04-24

    The Transportation Life Cycle Assessment (LCA) Synthesis includes an LCA Learning Module Series, case studies, and analytics on the use of the modules. The module series is a set of narrated slideshows on topics related to environmental LCA. Phase I ...

  7. CTEPP NC DATA SUPPLEMENTAL INFORMATION ON FIELD AND LABORATORY SAMPLES

    EPA Science Inventory

    This data set contains supplemental data related to the final core analytical results table. This includes sample collection data, for example, sample weight, air volume, creatinine, specific gravity, etc.

    The Children’s Total Exposure to Persistent Pesticides and Other Persistent...

  8. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA-2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop Distributed File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  9. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, about 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are acknowledged to be essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  10. Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.

    PubMed

    Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H

    2015-09-01

    Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes, in the following groups: 51 from all three institutions and 15 from one or two institutions, for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97% of analytes; analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900 and 1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin, and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase, and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installation, test results have been well received by clinicians at all three institutions. Copyright © 2015. Published by Elsevier Inc.

  11. CTEPP-OH DATA SUPPLEMENTAL INFORMATION ON FIELD AND LABORATORY SAMPLES

    EPA Science Inventory

    This data set contains supplemental data related to the final core analytical results table for CTEPP-OH. This includes sample collection data, for example, sample weight, air volume, creatinine, specific gravity, etc.

    The Children’s Total Exposure to Persistent Pesticides and Oth...

  12. A potential energy surface for the process H2 + H2O yielding H + H + H2O - Ab initio calculations and analytical representation

    NASA Technical Reports Server (NTRS)

    Schwenke, David W.; Walch, Stephen P.; Taylor, Peter R.

    1991-01-01

    Extensive ab initio calculations on the ground state potential energy surface of H2 + H2O were performed using a large contracted Gaussian basis set and a high level of correlation treatment. An analytical representation of the potential energy surface was then obtained which reproduces the calculated energies with an overall root-mean-square error of only 0.64 mEh. The analytic representation explicitly includes all nine internal degrees of freedom and is also well behaved as the H2 dissociates; it thus can be used to study collision-induced dissociation or recombination of H2. The strategy used to minimize the number of energy calculations is discussed, as well as other advantages of the present method for determining the analytical representation.

  13. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities, and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity will be discussed in detail.
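    To make the array primitives concrete, here is a minimal NumPy sketch (illustrative only, not Ophidia's operator API) of sub-setting, aggregation, and predicate evaluation on an in-memory data cube with axes (time, lat, lon):

```python
# Illustrative sketch of the array primitives listed above -- sub-setting,
# aggregation, and predicate evaluation -- on an in-memory data cube.
import numpy as np

rng = np.random.default_rng(0)
cube = rng.normal(15.0, 5.0, size=(120, 90, 180))   # 10 years of monthly fields

# (i) slicing (fix one index) and dicing (sub-box on several axes)
slice_first = cube[0]               # one time step
dice = cube[:, 30:60, 45:90]        # regional sub-cube

# (ii) aggregation along the time axis: max, min, avg
t_max, t_min, t_avg = dice.max(axis=0), dice.min(axis=0), dice.mean(axis=0)

# (iii) predicate evaluation over large arrays
frac_above = (dice > 20.0).mean()
print(t_avg.shape, f"{frac_above:.2%}")
```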

  14. Enhancements of nonpoint source monitoring of volatile organic compounds in ground water

    USGS Publications Warehouse

    Lapham, W.W.; Moran, M.J.; Zogorski, J.S.

    2000-01-01

    The U.S. Geological Survey (USGS) has compiled a national retrospective data set of analyses of volatile organic compounds (VOCs) in ground water of the United States. The data are from Federal, State, and local nonpoint-source monitoring programs, collected between 1985 and 1995. This data set is being used to augment data collected by the USGS National Water-Quality Assessment (NAWQA) Program to ascertain the occurrence of VOCs in ground water nationwide. Eleven attributes of the retrospective data set were evaluated to determine the suitability of the data to augment NAWQA data in answering occurrence questions of varying complexity. These 11 attributes are the VOC analyte list and the associated reporting levels for each VOC, well type, well-casing material, type of openings in the interval (screened interval or open hole), well depth, depth to the top and bottom of the open interval(s), depth to water level in the well, aquifer type (confined or unconfined), and aquifer lithology. VOCs frequently analyzed included solvents, industrial reagents, and refrigerants, but other VOCs of current interest were not frequently analyzed. About 70 percent of the sampled wells have the type of well documented in the data set, and about 74 percent have well depth documented. However, the data set generally lacks documentation of other characteristics, such as well-casing material, information about the screened or open interval(s), depth to water level in the well, and aquifer type and lithology. For example, only about 20 percent of the wells include information on depth to water level in the well and only about 14 percent of the wells include information about aquifer type. The three most important enhancements to VOC data collected in nonpoint-source monitoring programs for use in a national assessment of VOC occurrence in ground water would be an expanded VOC analyte list, recording the reporting level for each analyte for every analysis, and recording key ancillary information about each well. These enhancements would greatly increase the usefulness of VOC data in addressing complex occurrence questions, such as those that seek to explain the reasons for VOC occurrence and nonoccurrence in ground water of the United States.

  15. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    ERIC Educational Resources Information Center

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  16. Analytical quality goals derived from the total deviation from patients' homeostatic set points, with a margin for analytical errors.

    PubMed

    Bolann, B J; Asberg, A

    2004-01-01

    The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and the comparison of results with clinical decision limits. In this study the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) the stable systematic error should be approximately zero, and 2) a systematic error that would be detected by the control program with 90% probability should not be larger than half the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
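    A small numeric sketch of the random-error part of this budget, assuming standard deviations combine in quadrature; the roughly 12% total allowance quoted above also covers undetected systematic error, which is not modeled here:

```python
# Random-error part of the quality budget above, assuming the analytical SD
# is at its cap of 0.15 x the intra-individual SD and SDs add in quadrature.
# (Undetected systematic error, also covered by the ~12% allowance, is not
# modeled here.)
import math

sd_intra = 1.0                      # intra-individual SD, normalized
sd_anal = 0.15 * sd_intra           # maximum allowed analytical SD
sd_total = math.hypot(sd_intra, sd_anal)

# The 95% maximum absolute deviation scales linearly with the SD:
print(f"random-error inflation: {100 * (sd_total - 1):.1f}%")   # ~1.1%
```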

  17. Anisotropic cosmological solutions in R + R^2 gravity

    NASA Astrophysics Data System (ADS)

    Müller, Daniel; Ricciardone, Angelo; Starobinsky, Alexei A.; Toporensky, Aleksey

    2018-04-01

    In this paper we investigate the past evolution of an anisotropic Bianchi I universe in R+R^2 gravity. Using the dynamical system approach we show that there exists a new two-parameter set of solutions that includes both an isotropic "false radiation" solution and an anisotropic generalized Kasner solution, which is stable. We derive the analytic behavior of the shear from a specific property of f(R) gravity and the analytic asymptotic form of the Ricci scalar when approaching the initial singularity. Finally, we numerically check our results.

  18. Enhancing the chemical selectivity in discovery-based analysis with tandem ionization time-of-flight mass spectrometry detection for comprehensive two-dimensional gas chromatography.

    PubMed

    Freye, Chris E; Moore, Nicholas R; Synovec, Robert E

    2018-02-16

    The complementary information provided by tandem ionization time-of-flight mass spectrometry (TI-TOFMS) is investigated for comparative discovery-based analysis, when coupled with comprehensive two-dimensional gas chromatography (GC × GC). The TI conditions implemented were a hard ionization energy (70 eV) collected concurrently with a soft ionization energy (14 eV). Tile-based Fisher ratio (F-ratio) analysis is used to analyze diesel fuel spiked with twelve analytes at a nominal concentration of 50 ppm. F-ratio analysis is a supervised discovery-based technique that compares two different sample classes, in this case spiked and unspiked diesel, to reduce the complex GC × GC-TI-TOFMS data into a hit list of class-distinguishing analyte features. Hit lists of the 70 eV and 14 eV data sets, and the single hit list produced when the two data sets are fused together, are all investigated. For the 70 eV hit list, eleven of the twelve analytes were found in the top thirteen hits. For the 14 eV hit list, nine of the twelve analytes were found in the top nine hits, with the other three analytes either not found or well down the hit list. As expected, the F-ratios per m/z used to calculate each average F-ratio per hit emphasized smaller fragment ions for the 70 eV data set and larger fragment ions for the 14 eV data set, supporting the notion that complementary information was provided. The discovery rate was improved when F-ratio analysis was performed on the fused data sets, resulting in eleven of the twelve analytes being at the top of the single hit list. Using PARAFAC, analytes that were "discovered" were deconvoluted in order to obtain their identification via match values (MVs). The locations of the analytes and the "F-ratio spectra" obtained from F-ratio analysis were used to guide the deconvolution. Eight of the twelve analytes were successfully deconvoluted and identified using the in-house library for the 70 eV data set. PARAFAC deconvolution of the two separate data sets provided increased confidence in the identification of "discovered" analytes. Herein, we explore the limit of analyte discovery and the limit of analyte identification, and demonstrate a general workflow for the investigation of key chemical features in complex samples. Copyright © 2018 Elsevier B.V. All rights reserved.
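    A hedged sketch of the per-feature, two-class Fisher ratio that underlies this kind of supervised discovery analysis; it is a simplified stand-in for the authors' tile-based implementation, with synthetic data in place of GC × GC-TI-TOFMS measurements:

```python
# Two-class Fisher ratio per feature: squared class-mean separation over
# pooled within-class variance, then features ranked into a "hit list".
# Synthetic data; a simplified stand-in for the tile-based method above.
import numpy as np

rng = np.random.default_rng(1)
spiked = rng.normal(0.0, 0.2, size=(6, 500))     # 6 runs x 500 features
unspiked = rng.normal(0.0, 0.2, size=(6, 500))
spiked[:, 42] += 1.0                             # one class-distinguishing feature

num = (spiked.mean(0) - unspiked.mean(0)) ** 2
den = spiked.var(0, ddof=1) + unspiked.var(0, ddof=1)
f_ratio = num / den

hit_list = np.argsort(f_ratio)[::-1]             # features ranked by F-ratio
print(hit_list[:3])                              # feature 42 should rank first
```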

  19. Multiple Hypnotizabilities: Differentiating the Building Blocks of Hypnotic Response

    ERIC Educational Resources Information Center

    Woody, Erik Z.; Barnier, Amanda J.; McConkey, Kevin M.

    2005-01-01

    Although hypnotizability can be conceptualized as involving component subskills, standard measures do not differentiate them from a more general unitary trait, partly because the measures include limited sets of dichotomous items. To overcome this, the authors applied full-information factor analysis, a sophisticated analytic approach for…

  20. Physical Property Analysis and Report for Sediments at 100-BC-5 Operable Unit, Boreholes C7505, C7506, C7507, and C7665

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindberg, Michael J.

    2010-09-28

    Between October 14, 2009, and February 22, 2010, sediment samples were received from the 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100-BC-5 OU. The analyses for this project were performed at the 325 Building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.

  1. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing full characterization (elemental assay; isotopic composition; metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation (SME) Program, and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  2. Measuring research progress in photovoltaics

    NASA Technical Reports Server (NTRS)

    Jackson, B.; Mcguire, P.

    1986-01-01

    The role and some results of the project analysis and integration function in the Flat-plate Solar Array (FSA) Project are presented. Activities included supporting the decision-making process, preparing plans for project direction, setting goals for project activities, measuring progress within the project, and developing and maintaining analytical models.

  3. Completeness of the Coulomb Wave Functions in Quantum Mechanics

    ERIC Educational Resources Information Center

    Mukunda, N.

    1978-01-01

    Gives an explicit and elementary proof that the radial energy eigenfunctions for the hydrogen atom in quantum mechanics, bound and scattering states included, form a complete set. The proof uses some properties of the confluent hypergeometric functions and the Cauchy residue theorem from analytic function theory. (Author/GA)

  4. China-ASEAN Relations in Higher Education: An Analytical Framework

    ERIC Educational Resources Information Center

    Welch, Anthony

    2012-01-01

    China's dramatic economic rise has tended to overshadow other wider perspectives on the developing China and Association of Southeast Asian Nations (ASEAN) relationship, including in higher education. The article examines contemporary relations between China and ASEAN, set against the longer term development of cultural and trade relations. It is…

  5. Big Data Analytics for a Smart Green Infrastructure Strategy

    NASA Astrophysics Data System (ADS)

    Barrile, Vincenzo; Bonfa, Stefano; Bilotta, Giuliana

    2017-08-01

    As is well known, Big Data is a term for data sets so large or complex that traditional data processing applications are not sufficient to process them. The term "Big Data" often refers to the use of predictive analytics, user behavior analytics, or other advanced data analytics methods that extract value from data, and rarely to a particular size of data set. This is especially true for the huge amount of Earth Observation data that satellites constantly orbiting the Earth transmit daily.

  6. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended here to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term-rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term-rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
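    To illustrate the term-rewriting framework the abstract refers to, the following toy rewriting system is one possible sketch; the tuple encoding of terms, the '?'-prefixed variables, and the Peano-addition rules are all invented for this example:

```python
# Toy term-rewriting system. Terms are nested tuples; pattern variables are
# "?"-prefixed strings. Encoding and rules are invented for this sketch.

def is_var(p):
    return isinstance(p, str) and p.startswith("?")

def match(pattern, term, env):
    """Try to match pattern against term, extending the binding env."""
    if is_var(pattern):
        if pattern in env:
            return env[pattern] == term
        env[pattern] = term
        return True
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term):
        return all(match(p, t, env) for p, t in zip(pattern, term))
    return pattern == term

def substitute(template, env):
    """Instantiate a rule's right-hand side with the bindings in env."""
    if is_var(template):
        return env[template]
    if isinstance(template, tuple):
        return tuple(substitute(t, env) for t in template)
    return template

def rewrite_once(term, rules):
    """One outermost rewrite pass; returns (new_term, changed)."""
    for lhs, rhs in rules:
        env = {}
        if match(lhs, term, env):
            return substitute(rhs, env), True
    if isinstance(term, tuple):
        subs, changed = zip(*(rewrite_once(t, rules) for t in term))
        return tuple(subs), any(changed)
    return term, False

def normalize(term, rules):
    """Rewrite until no rule applies (assumes the rules terminate)."""
    changed = True
    while changed:
        term, changed = rewrite_once(term, rules)
    return term

RULES = [
    (("add", "zero", "?x"), "?x"),                                   # 0 + x -> x
    (("add", ("succ", "?x"), "?y"), ("succ", ("add", "?x", "?y"))),  # s(x) + y -> s(x + y)
]

two_plus_one = ("add", ("succ", ("succ", "zero")), ("succ", "zero"))
print(normalize(two_plus_one, RULES))  # ('succ', ('succ', ('succ', 'zero')))
```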

  7. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets, correlated with functional relatedness, to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.

  8. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition, it provides some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous-drug safe-handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and provide a shorter response time than the classical analytical techniques now in use.

  9. Non-universal critical exponents in earthquake complex networks

    NASA Astrophysics Data System (ADS)

    Pastén, Denisse; Torres, Felipe; Toledo, Benjamín A.; Muñoz, Víctor; Rogan, José; Valdivia, Juan Alejandro

    2018-02-01

    The problem of universality of critical exponents in complex networks is studied based on networks built from seismic data sets. Using two data sets corresponding to Chilean seismicity (northern zone, including the 2014 Mw = 8.2 earthquake in Iquique; and central zone, without major earthquakes), directed networks for each set are constructed. Connectivity and betweenness centrality distributions are calculated and found to be scale-free, with respective exponents γ and δ. The expected relation between both characteristic exponents, δ > (γ + 1)/2, is verified for both data sets. However, unlike the expectation for certain scale-free analytical complex networks, the value of δ is found to be non-universal.
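    A sketch of how the exponent relation can be checked numerically; the Hill estimator below is a continuous-distribution stand-in for the discrete maximum-likelihood fits normally used for networks, and the Pareto samples are synthetic stand-ins for degree and betweenness data:

```python
# Estimate power-law exponents for two heavy-tailed samples and check the
# relation delta > (gamma + 1) / 2. Synthetic data; Hill estimator used as
# a simple stand-in for a proper discrete power-law fit.
import numpy as np

def hill_exponent(x, xmin):
    """Continuous MLE: alpha_hat = 1 + n / sum(log(x / xmin))."""
    tail = x[x >= xmin]
    return 1.0 + tail.size / np.log(tail / xmin).sum()

rng = np.random.default_rng(2)
degree = rng.pareto(1.5, 5000) + 1.0        # true pdf exponent ~2.5
betweenness = rng.pareto(1.0, 5000) + 1.0   # true pdf exponent ~2.0

gamma = hill_exponent(degree, 1.0)
delta = hill_exponent(betweenness, 1.0)
print(f"gamma={gamma:.2f}, delta={delta:.2f}, "
      f"delta > (gamma+1)/2: {delta > (gamma + 1) / 2}")
```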

  10. Publishing Outside the Box: Popular Press Books.

    PubMed

    Vyse, Stuart

    2014-10-01

    Writing and publishing popular press books requires a set of skills not natural to basic and applied researchers trained to publish in peer-refereed behavior analytic journals or to practice behavior analysis in applied settings. This article provides suggestions and examples. These include finding a distinctive idea, securing a contract, hiring an agent (or not), deciding on a publisher, and writing engagingly for a broad audience. The last is the greatest challenge. Among my recommendations are to read good prose, good models, and good books about publishing; talk to experienced colleagues; read aloud to judge the appropriateness of your vocabulary and style; and interject humor, imagery, and drama. Book publishing is a long and difficult process, but it is possible, and it has great potential for bringing behavior analytic research, practice, and theory to the attention of the general public.

  11. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
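    A minimal sketch of the response-modeling pattern described above, with ordinary least squares standing in for the Genetic Function Approximation and synthetic descriptors in place of the measured film-analyte data:

```python
# QSAR-style response model: regress sensor response on molecular
# descriptors, then predict responses for analytes held out of training.
# Ordinary least squares stands in for the GFA used by the authors.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 5))              # 40 analytes x 5 descriptors (synthetic)
true_w = np.array([0.8, -0.3, 0.0, 1.2, 0.5])
y = X @ true_w + rng.normal(0, 0.1, 40)   # synthetic film-resistance change

model = LinearRegression().fit(X[:30], y[:30])   # training analytes
r2_test = model.score(X[30:], y[30:])            # held-out test analytes
print(f"test R^2 = {r2_test:.3f}")
```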

  12. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  13. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
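    The general pattern, Monte Carlo over one set of variables while conjugate parameters are integrated out in closed form, can be shown with a toy model (not the genealogical model itself): a Metropolis chain samples a mean while the noise scale is marginalized analytically.

```python
# Toy illustration of MCMC with analytic marginalization: y ~ Normal(mu,
# sigma^2) with a Jeffreys prior on sigma. Integrating sigma out gives a
# closed-form marginal likelihood for mu, which the chain targets directly.
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(2.0, 1.5, size=50)
n = y.size

def log_marginal(mu):
    # log p(y | mu), sigma integrated out under p(sigma) ~ 1/sigma:
    # proportional to -(n/2) * log(sum((y - mu)^2))
    return -0.5 * n * np.log(np.sum((y - mu) ** 2))

mu, chain = 0.0, []
for _ in range(20000):
    proposal = mu + rng.normal(0, 0.3)               # symmetric random walk
    if np.log(rng.uniform()) < log_marginal(proposal) - log_marginal(mu):
        mu = proposal
    chain.append(mu)

print(f"posterior mean of mu ~ {np.mean(chain[5000:]):.2f}")   # near 2.0
```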

  14. The Synergistic Engineering Environment

    NASA Technical Reports Server (NTRS)

    Cruz, Jonathan

    2006-01-01

    The Synergistic Engineering Environment (SEE) is a system of software dedicated to aiding the understanding of space mission operations. The SEE can integrate disparate sets of data with analytical capabilities, geometric models of spacecraft, and a visualization environment, all contributing to the creation of an interactive simulation of spacecraft. Initially designed to satisfy needs pertaining to the International Space Station, the SEE has been broadened in scope to include spacecraft ranging from those in low orbit around the Earth to those on deep-space missions. The SEE includes analytical capabilities in rigid-body dynamics, kinematics, orbital mechanics, and payload operations. These capabilities enable a user to perform real-time interactive engineering analyses focusing on diverse aspects of operations, including flight attitudes and maneuvers, docking of visiting spacecraft, robotic operations, impingement of spacecraft-engine exhaust plumes, obscuration of instrumentation fields of view, communications, and alternative assembly configurations. .

  15. Merging OLTP and OLAP - Back to the Future

    NASA Astrophysics Data System (ADS)

    Lehner, Wolfgang

    When the terms "Data Warehousing" and "Online Analytical Processing" were coined in the 1990s by Kimball, Codd, and others, there was an obvious need for separating data and workload for operational transactional-style processing and decision-making implying complex analytical queries over large and historic data sets. Large data warehouse infrastructures have been set up to cope with the special requirements of analytical query answering for multiple reasons: For example, analytical thinking heavily relies on predefined navigation paths to guide the user through the data set and to provide different views on different aggregation levels.Multi-dimensional queries exploiting hierarchically structured dimensions lead to complex star queries at a relational backend, which could hardly be handled by classical relational systems.

  16. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).
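
    The sparse-storage-plus-screening strategy the abstract mentions can be illustrated in a few lines; the sketch below (with arbitrary toy matrix elements, not PyVCI's actual data structures) screens small couplings, stores the Hamiltonian sparsely, and extracts the lowest levels with an iterative eigensolver.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(1)
n = 2000                                      # toy VCI basis size
H = rng.normal(scale=1e-3, size=(n, n))
H = 0.5 * (H + H.T)                           # symmetrize the couplings
H[np.diag_indices(n)] = np.sort(rng.uniform(500.0, 4000.0, n))  # diagonal energies

H[np.abs(H) < 1e-4] = 0.0                     # numerical screening of negligible elements
H_sparse = csr_matrix(H)                      # sparse storage eases the memory burden

# Lowest few levels via an iterative (Lanczos-type) eigensolver,
# avoiding a full dense diagonalization.
energies, _ = eigsh(H_sparse, k=5, which='SA')
print(energies)
```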

  17. Wideband analytical equivalent circuit for one-dimensional periodic stacked arrays.

    PubMed

    Molero, Carlos; Rodríguez-Berral, Raúl; Mesa, Francisco; Medina, Francisco; Yakovlev, Alexander B

    2016-01-01

    A wideband equivalent circuit is proposed for the accurate analysis of scattering from a set of stacked slit gratings illuminated by a plane wave with transverse magnetic or electric polarization that impinges normally or obliquely along one of the principal planes of the structure. The slit gratings are printed on dielectric slabs of arbitrary thickness, including the case of closely spaced gratings that interact by higher-order modes. A Π-circuit topology is obtained for a pair of coupled arrays, with fully analytical expressions for all the circuit elements. This equivalent Π circuit is employed as the basis to derive the equivalent circuit of finite stacks with any given number of gratings. Analytical expressions for the Brillouin diagram and the Bloch impedance are also obtained for infinite periodic stacks.

  18. Majoring in the Rest of Your Life. Career Secrets for College Students.

    ERIC Educational Resources Information Center

    Carter, Carol

    Primarily intended for college freshmen, this book provides practical advice and hints on ways to succeed in college and on setting career goals. Thirteen chapters outline and discuss various life skills and "tools" for succeeding in college and on the job, including planning and organizing; problem solving/analytical skills;…

  19. School Choice and the Achievement Gap

    ERIC Educational Resources Information Center

    Jeynes, William H.

    2014-01-01

    The possibility is examined that school choice programs could be a means to reducing the achievement gap. Data based on meta-analytic research and the examination of nationwide data sets suggest that school choice programs that include private schools could reduce the achievement gap by 25%. The propounding of this possibility is based on research…

  20. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  1. Multivariate Protein Signatures of Pre-Clinical Alzheimer's Disease in the Alzheimer's Disease Neuroimaging Initiative (ADNI) Plasma Proteome Dataset

    PubMed Central

    Johnstone, Daniel; Milward, Elizabeth A.; Berretta, Regina; Moscato, Pablo

    2012-01-01

    Background: Recent Alzheimer's disease (AD) research has focused on finding biomarkers to identify disease at the pre-clinical stage of mild cognitive impairment (MCI), allowing treatment to be initiated before irreversible damage occurs. Many studies have examined brain imaging or cerebrospinal fluid but there is also growing interest in blood biomarkers. The Alzheimer's Disease Neuroimaging Initiative (ADNI) has generated data on 190 plasma analytes in 566 individuals with MCI, AD or normal cognition. We conducted independent analyses of this dataset to identify plasma protein signatures predicting pre-clinical AD. Methods and Findings: We focused on identifying signatures that discriminate cognitively normal controls (n = 54) from individuals with MCI who subsequently progress to AD (n = 163). Based on p value, apolipoprotein E (APOE) showed the strongest difference between these groups (p = 2.3×10⁻¹³). We applied a multivariate approach based on combinatorial optimization ((α,β)-k Feature Set Selection), which retains information about individual participants and maintains the context of interrelationships between different analytes, to identify the optimal set of analytes (signature) to discriminate these two groups. We identified 11-analyte signatures achieving values of sensitivity and specificity between 65% and 86% for both MCI and AD groups, depending on whether APOE was included and other factors. Classification accuracy was improved by considering “meta-features,” representing the difference in relative abundance of two analytes, with an 8-meta-feature signature consistently achieving sensitivity and specificity both over 85%. Generating signatures based on longitudinal rather than cross-sectional data further improved classification accuracy, returning sensitivities and specificities of approximately 90%. Conclusions: Applying these novel analysis approaches to the powerful and well-characterized ADNI dataset has identified sets of plasma biomarkers for pre-clinical AD. While studies of independent test sets are required to validate the signatures, these analyses provide a starting point for developing a cost-effective and minimally invasive test capable of diagnosing AD in its pre-clinical stages. PMID:22485168
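
    The "meta-feature" idea (classifying on pairwise differences in relative abundance rather than raw analyte levels) is easy to sketch. The toy example below substitutes ordinary logistic regression for the paper's combinatorial (α,β)-k feature set selection and runs on synthetic data, so the numbers mean nothing; it only shows the construction.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
X = rng.normal(size=(217, 20))     # synthetic stand-in: 217 subjects x 20 analytes
y = rng.integers(0, 2, size=217)   # 0 = control, 1 = MCI-to-AD progressor

# Meta-features: the difference in relative abundance for every analyte pair.
pairs = list(combinations(range(X.shape[1]), 2))
M = np.stack([X[:, i] - X[:, j] for i, j in pairs], axis=1)

pred = cross_val_predict(LogisticRegression(max_iter=1000), M, y, cv=5)
sens = np.sum((pred == 1) & (y == 1)) / np.sum(y == 1)
spec = np.sum((pred == 0) & (y == 0)) / np.sum(y == 0)
print(f"sensitivity={sens:.2f}  specificity={spec:.2f}")
```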

  2. Analytical performance of the ThyroSeq v3 genomic classifier for cancer diagnosis in thyroid nodules.

    PubMed

    Nikiforova, Marina N; Mercurio, Stephanie; Wald, Abigail I; Barbi de Moura, Michelle; Callenberg, Keith; Santana-Santos, Lucas; Gooding, William E; Yip, Linwah; Ferris, Robert L; Nikiforov, Yuri E

    2018-04-15

    Molecular tests have clinical utility for thyroid nodules with indeterminate fine-needle aspiration (FNA) cytology, although their performance requires further improvement. This study evaluated the analytical performance of the newly created ThyroSeq v3 test. ThyroSeq v3 is a DNA- and RNA-based next-generation sequencing assay that analyzes 112 genes for a variety of genetic alterations, including point mutations, insertions/deletions, gene fusions, copy number alterations, and abnormal gene expression, and it uses a genomic classifier (GC) to separate malignant lesions from benign lesions. It was validated in 238 tissue samples and 175 FNA samples with known surgical follow-up. Analytical performance studies were conducted. In the training tissue set of samples, ThyroSeq GC detected more than 100 genetic alterations, including BRAF, RAS, TERT, and DICER1 mutations, NTRK1/3, BRAF, and RET fusions, 22q loss, and gene expression alterations. GC cutoffs were established to distinguish cancer from benign nodules with 93.9% sensitivity, 89.4% specificity, and 92.1% accuracy. This correctly classified most papillary, follicular, and Hurthle cell lesions, medullary thyroid carcinomas, and parathyroid lesions. In the FNA validation set, the GC sensitivity was 98.0%, the specificity was 81.8%, and the accuracy was 90.9%. Analytical accuracy studies demonstrated a minimal required nucleic acid input of 2.5 ng, a 12% minimal acceptable tumor content, and reproducible test results under variable stress conditions. The ThyroSeq v3 GC analyzes 5 different classes of molecular alterations and provides high accuracy for detecting all common types of thyroid cancer and parathyroid lesions. The analytical sensitivity, specificity, and robustness of the test have been successfully validated and indicate its suitability for clinical use. Cancer 2018;124:1682-90. © 2018 American Cancer Society.

  3. Paper-based analytical devices for clinical diagnosis: recent advances in the fabrication techniques and sensing mechanisms

    PubMed Central

    Sher, Mazhar; Zhuang, Rachel; Demirci, Utkan; Asghar, Waseem

    2017-01-01

    Introduction: There is a significant interest in developing inexpensive portable biosensing platforms for various applications including disease diagnostics, environmental monitoring, food safety, and water testing at the point-of-care (POC) settings. Current diagnostic assays available in the developed world require sophisticated laboratory infrastructure and expensive reagents. Hence, they are not suitable for resource-constrained settings with limited financial resources, basic health infrastructure, and few trained technicians. Cellulose and flexible transparency paper-based analytical devices have demonstrated enormous potential for developing robust, inexpensive and portable devices for disease diagnostics. These devices offer promising solutions to disease management in resource-constrained settings where the vast majority of the population cannot afford expensive and highly sophisticated treatment options. Areas covered: In this review, the authors describe currently developed cellulose and flexible transparency paper-based microfluidic devices, device fabrication techniques, and sensing technologies that are integrated with these devices. The authors also discuss the limitations and challenges associated with these devices and their potential in clinical settings. Expert commentary: In recent years, cellulose and flexible transparency paper-based microfluidic devices have demonstrated the potential to become future healthcare options despite a few limitations such as low sensitivity and reproducibility. PMID:28103450

  4. Paper-based analytical devices for clinical diagnosis: recent advances in the fabrication techniques and sensing mechanisms.

    PubMed

    Sher, Mazhar; Zhuang, Rachel; Demirci, Utkan; Asghar, Waseem

    2017-04-01

    There is a significant interest in developing inexpensive portable biosensing platforms for various applications including disease diagnostics, environmental monitoring, food safety, and water testing at the point-of-care (POC) settings. Current diagnostic assays available in the developed world require sophisticated laboratory infrastructure and expensive reagents. Hence, they are not suitable for resource-constrained settings with limited financial resources, basic health infrastructure, and few trained technicians. Cellulose and flexible transparency paper-based analytical devices have demonstrated enormous potential for developing robust, inexpensive and portable devices for disease diagnostics. These devices offer promising solutions to disease management in resource-constrained settings where the vast majority of the population cannot afford expensive and highly sophisticated treatment options. Areas covered: In this review, the authors describe currently developed cellulose and flexible transparency paper-based microfluidic devices, device fabrication techniques, and sensing technologies that are integrated with these devices. The authors also discuss the limitations and challenges associated with these devices and their potential in clinical settings. Expert commentary: In recent years, cellulose and flexible transparency paper-based microfluidic devices have demonstrated the potential to become future healthcare options despite a few limitations such as low sensitivity and reproducibility.

  5. Tissue-aware RNA-Seq processing and normalization for heterogeneous and sparse data.

    PubMed

    Paulson, Joseph N; Chen, Cho-Yi; Lopes-Ramos, Camila M; Kuijjer, Marieke L; Platig, John; Sonawane, Abhijeet R; Fagny, Maud; Glass, Kimberly; Quackenbush, John

    2017-10-03

    Although ultrahigh-throughput RNA-Sequencing has become the dominant technology for genome-wide transcriptional profiling, the vast majority of RNA-Seq studies typically profile only tens of samples, and most analytical pipelines are optimized for these smaller studies. However, projects are generating ever-larger data sets comprising RNA-Seq data from hundreds or thousands of samples, often collected at multiple centers and from diverse tissues. These complex data sets present significant analytical challenges due to batch and tissue effects, but provide the opportunity to revisit the assumptions and methods that we use to preprocess, normalize, and filter RNA-Seq data - critical first steps for any subsequent analysis. We find that analysis of large RNA-Seq data sets requires both careful quality control and accounting for the sparsity due to the heterogeneity intrinsic in multi-group studies. We developed Yet Another RNA Normalization software pipeline (YARN), which includes quality control and preprocessing, gene filtering, and normalization steps designed to facilitate downstream analysis of large, heterogeneous RNA-Seq data sets, and we demonstrate its use with data from the Genotype-Tissue Expression (GTEx) project. An R package instantiating YARN is available at http://bioconductor.org/packages/yarn.

  6. A General Methodology for the Translation of Behavioral Terms into Vernacular Languages.

    PubMed

    Virues-Ortega, Javier; Martin, Neil; Schnerch, Gabriel; García, Jesús Ángel Miguel; Mellichamp, Fae

    2015-05-01

    As the field of behavior analysis expands internationally, the need for comprehensive and systematic glossaries of behavioral terms in the vernacular languages of professionals and clients becomes crucial. We created a Spanish-language glossary of behavior-analytic terms by developing and employing a systematic set of decision-making rules for the inclusion of terms. We then submitted the preliminary translation to a multi-national advisory committee to evaluate the transnational acceptability of the glossary. This method led to a translated corpus of over 1200 behavioral terms. The end products of this work included the following: (a) a Spanish-language glossary of behavior analytic terms that are publicly available over the Internet through the Behavior Analyst Certification Board and (b) a set of translation guidelines summarized here that may be useful for the development of glossaries of behavioral terms into other vernacular languages.

  7. An analytic superfield formalism for tree superamplitudes in D=10 and D=11

    NASA Astrophysics Data System (ADS)

    Bandos, Igor

    2018-05-01

    Tree amplitudes of 10D supersymmetric Yang-Mills theory (SYM) and 11D supergravity (SUGRA) are collected in multi-particle counterparts of analytic on-shell superfields. These have essentially the same form as their chiral 4D counterparts describing N=4 SYM and N=8 SUGRA, but with components dependent on a different set of bosonic variables. These are the D=10 and D=11 spinor helicity variables, the set of which includes the spinor frame variable (Lorentz harmonics) and a scalar density, and generalized homogeneous coordinates of the coset SO(D-2)/SO(D-4)⊗ U(1) (internal harmonics). We present an especially convenient parametrization of the spinor harmonics (Lorentz covariant gauge fixed with the use of an auxiliary gauge symmetry) and use this to find (a gauge fixed version of) the 3-point tree superamplitudes of 10D SYM and 11D SUGRA which generalize the 4 dimensional anti-MHV superamplitudes.

  8. Dendritic cell immunotherapy followed by cART interruption during HIV-1 infection induces plasma protein markers of cellular immunity and neutrophil recruitment.

    PubMed

    van den Ham, Henk-Jan; Cooper, Jason D; Tomasik, Jakub; Bahn, Sabine; Aerts, Joeri L; Osterhaus, Albert D M E; Gruters, Rob A; Andeweg, Arno C

    2018-01-01

    To characterize the host response to dendritic cell-based immunotherapy and subsequent combined antiretroviral therapy (cART) interruption in HIV-1-infected individuals at the plasma protein level. An autologous dendritic cell (DC) therapeutic vaccine was administered to HIV-infected individuals, stable on cART. The effect of vaccination was evaluated at the plasma protein level during the period preceding cART interruption, during analytical therapy interruption and at viral reactivation. Healthy controls and post-exposure prophylactically treated healthy individuals were included as controls. Plasma marker ('analyte') levels including cytokines, chemokines, growth factors, and hormones were measured in trial participants and control plasma samples using a multiplex immunoassay. Analyte levels were analysed using principal component analysis, cluster analysis and limma. Blood neutrophil counts were analysed using linear regression. Plasma analyte levels of HIV-infected individuals are markedly different from those of healthy controls and HIV-negative individuals receiving post-exposure prophylaxis. Viral reactivation following cART interruption also affects multiple analytes, but cART interruption itself has only a minor effect. We find that Thyroxine-Binding Globulin (TBG) levels and late-stage neutrophil numbers correlate with the time off cART after DC vaccination. Furthermore, analysis shows that cART alters several regulators of blood glucose levels, including C-peptide, chromogranin-A and leptin. HIV reactivation is associated with the upregulation of CXCR3 ligands. Chronic HIV infection leads to a change in multiple plasma analyte levels, as does virus reactivation after cART interruption. Furthermore, we find evidence for the involvement of TBG and neutrophils in the response to DC-vaccination in the setting of HIV-infection.

  9. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.
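
    To make the SPARQL angle concrete, here is a small self-contained example with the rdflib Python library; the ex: vocabulary (ex:value, ex:confidence) is invented for illustration and is not from the report's benchmark suite.

```python
import rdflib

g = rdflib.Graph()
# Hypothetical benchmark fragment: observations annotated with a confidence score.
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:obs1 ex:value 42.1 ; ex:confidence 0.93 .
ex:obs2 ex:value 17.8 ; ex:confidence 0.41 .
""", format="turtle")

# The kind of query an uncertainty-quantification benchmark might pose:
# return only observations whose confidence exceeds a threshold.
q = """
PREFIX ex: <http://example.org/>
SELECT ?obs ?v WHERE {
    ?obs ex:value ?v ; ex:confidence ?c .
    FILTER (?c > 0.5)
}
"""
for row in g.query(q):
    print(row.obs, row.v)
```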

  10. eTRIKS platform: Conception and operation of a highly scalable cloud-based platform for translational research and applications development.

    PubMed

    Bussery, Justin; Denis, Leslie-Alexandre; Guillon, Benjamin; Liu, Pengfeï; Marchetti, Gino; Rahal, Ghita

    2018-04-01

    We describe the genesis, design and evolution of a computing platform designed and built to improve the success rate of biomedical translational research. The eTRIKS project platform was developed with the aim of building a platform that can securely host heterogeneous types of data and provide an optimal environment to run tranSMART analytical applications. Many types of data can now be hosted, including multi-OMICS data, preclinical laboratory data and clinical information, including longitudinal data sets. During the last two years, the platform has matured into a robust translational research knowledge management system that is able to host other data mining applications and support the development of new analytical tools. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Geologic setting, petrophysical characteristics, and regional heterogeneity patterns of the Smackover in southwest Alabama. Draft topical report on Subtasks 2 and 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopaska-Merkel, D.C.; Mann, S.D.; Tew, B.H.

    1992-06-01

    This is the draft topical report on Subtasks 2 and 3 of DOE contract number DE-FG22-89BC14425, entitled "Establishment of an oil and gas database for increased recovery and characterization of oil and gas carbonate reservoir heterogeneity." This volume constitutes the final report on Subtask 3, which had as its primary goal the geological modeling of reservoir heterogeneity in Smackover reservoirs of southwest Alabama. This goal was interpreted to include a thorough analysis of Smackover reservoirs, which was required for an understanding of Smackover reservoir heterogeneity. This report is divided into six sections (including this brief introduction). Section two, entitled "Geologic setting," presents a concise summary of Jurassic paleogeography, structural setting, and stratigraphy in southwest Alabama. This section also includes a brief review of sedimentologic characteristics and stratigraphic framework of the Smackover, and a summary of the diagenetic processes that strongly affected Smackover reservoirs in Alabama. Section three, entitled "Analytical methods," summarizes all nonroutine aspects of the analytical procedures used in this project. The major topics are thin-section description, analysis of commercial porosity and permeability data, capillary-pressure analysis, and field characterization. "Smackover reservoir characteristics" are described in section four, which begins with a general summary of the petrographic characteristics of porous and permeable Smackover strata. This is followed by a more-detailed petrophysical description of Smackover reservoirs.

  12. Analytical modelling of temperature effects on an AMPA-type synapse.

    PubMed

    Kufel, Dominik S; Wojcik, Grzegorz M

    2018-05-11

    It was previously reported that temperature may significantly influence neural dynamics on different levels of brain function. Thus, in computational neuroscience, it would be useful to make models scalable over a wide range of brain temperatures. However, a lack of experimental data and the absence of temperature-dependent analytical models of synaptic conductance do not allow temperature effects to be included at the multi-neuron modeling level. In this paper, we propose a first step toward dealing with this problem: a new analytical model of AMPA-type synaptic conductance that is able to incorporate temperature effects under low-frequency stimulation. It was constructed from a Markov-model description of AMPA receptor kinetics using a set of coupled ODEs. The closed-form solution for the set of differential equations was found using an uncoupling assumption (introduced in the paper) with a few simplifications motivated both by experimental data and by Monte Carlo simulation of synaptic transmission. The model may be used for a computationally efficient and biologically accurate implementation of temperature effects on AMPA receptor conductance in large-scale neural network simulations. As a result, it may open a wide range of new possibilities for researching the influence of temperature on certain aspects of brain functioning.
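
    A minimal numerical counterpart of such a kinetic description can be written with SciPy; the sketch below uses a toy three-state closed/open/desensitized scheme with a Q10 scaling of the rates, with all rate constants and the Q10 value as illustrative assumptions rather than the authors' fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy three-state kinetic scheme C <-> O <-> D (closed/open/desensitized).
# Rates and the Q10 value are illustrative assumptions, not fitted parameters.
def rates(T_celsius, q10=2.5, T_ref=35.0):
    scale = q10 ** ((T_celsius - T_ref) / 10.0)   # Q10 temperature scaling
    return dict(kco=2.0e3 * scale, koc=1.0e3 * scale,
                kod=5.0e2 * scale, kdo=2.0e1 * scale)  # s^-1

def rhs(t, y, r):
    c, o, d = y
    return [r['koc'] * o - r['kco'] * c,
            r['kco'] * c + r['kdo'] * d - (r['koc'] + r['kod']) * o,
            r['kod'] * o - r['kdo'] * d]

r = rates(T_celsius=25.0)
sol = solve_ivp(rhs, (0.0, 0.02), [1.0, 0.0, 0.0], args=(r,), max_step=1e-5)
g_max = 1e-9   # peak single-synapse conductance in S (assumed)
print("peak open fraction:", sol.y[1].max(),
      "-> peak conductance:", g_max * sol.y[1].max())
```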

  13. The influence of parametric and external noise in act-and-wait control with delayed feedback.

    PubMed

    Wang, Jiaxing; Kuske, Rachel

    2017-11-01

    We apply several novel semi-analytic approaches for characterizing and calculating the effects of noise in a system with act-and-wait control. For concrete illustration, we apply these to a canonical balance model for an inverted pendulum to study the combined effect of delay and noise within the act-and-wait setting. While the act-and-wait control facilitates strong stabilization through deadbeat control, a comparison of different models with continuous vs. discrete updating of the control strategy in the active period illustrates how delays combined with the imprecise application of the control can seriously degrade the performance. We give several novel analyses of a generalized act-and-wait control strategy, allowing flexibility in the updating of the control strategy, in order to understand the sensitivities to delays and random fluctuations. In both the deterministic and stochastic settings, we give analytical and semi-analytical results that characterize and quantify the dynamics of the system. These results include the size and shape of stability regions, densities for the critical eigenvalues that capture the rate of reaching the desired stable equilibrium, and amplification factors for sustained fluctuations in the context of external noise. They also provide the dependence of these quantities on the length of the delay and the active period. In particular, we see that the combined influence of delay, parametric error, or external noise and on-off control can qualitatively change the dynamics, thus reducing the robustness of the control strategy. We also capture the dependence on how frequently the control is updated, allowing an interpolation between continuous and frequent updating. In addition to providing insights for these specific models, the methods we propose are generalizable to other settings with noise, delay, and on-off control, where analytical techniques are otherwise severely scarce.
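
    A toy simulation conveys the act-and-wait idea: delayed feedback is applied only during periodic "act" windows, with noise injected into the plant. All parameters below are illustrative, not taken from the paper; depending on the gains, delay, and period lengths, the system may or may not stabilize, which is exactly the sensitivity the study quantifies.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T = 1e-3, 10.0
tau = 0.05                      # feedback delay in seconds (assumed)
t_act, t_wait = 0.1, 0.2        # act-and-wait window lengths in seconds (assumed)
a = 9.81 / 1.0                  # linearized inverted pendulum: x'' = a*x + u(t - tau)
kp, kd = 30.0, 8.0              # PD gains (assumed)

n = int(T / dt); lag = int(tau / dt)
x = np.zeros(n); v = np.zeros(n); u = np.zeros(n)
x[0] = 0.01                     # small initial tilt
for i in range(1, n):
    phase = (i * dt) % (t_act + t_wait)
    active = phase < t_act                              # control only in the "act" window
    ud = u[i - 1 - lag] if i - 1 - lag >= 0 else 0.0    # delayed control input
    acc = a * x[i - 1] + ud + 0.05 * rng.normal()       # external noise on the plant
    v[i] = v[i - 1] + acc * dt                          # semi-implicit Euler step
    x[i] = x[i - 1] + v[i] * dt
    u[i] = -(kp * x[i] + kd * v[i]) if active else 0.0  # on-off delayed PD feedback

print("final |x|:", abs(x[-1]))
```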

  14. The Complexity of Bodily Events through an Ethnographer's Gaze: Focusing on the Youngest Children in Preschool

    ERIC Educational Resources Information Center

    Rossholt, Nina

    2009-01-01

    This article discusses theoretical, methodological and analytical strategies for researching the material subject. The discussion relates to discursive practices in a preschool setting with children of one and two years of age, where the material subject includes both bodily and discursive practices. Using critical ethnography research, the author…

  15. Analytical Model and Optimized Design of Power Transmitting Coil for Inductively Coupled Endoscope Robot.

    PubMed

    Ke, Quan; Luo, Weijie; Yan, Guozheng; Yang, Kai

    2016-04-01

    A wireless power transfer system based on weakly inductive coupling makes it possible to provide the endoscope microrobot (EMR) with effectively unlimited power. To facilitate patient inspection with the EMR system, the diameter of the transmitting coil is enlarged to 69 cm. Due to the large transmitting range, a high quality factor of the Litz-wire transmitting coil is necessary to generate a sufficiently intense magnetic field efficiently. Thus, this paper builds an analytical model of the transmitting coil and then optimizes the parameters of the coil by maximizing the quality factor. The lumped model of the transmitting coil includes three parameters: ac resistance, self-inductance, and stray capacitance. Based on the exact two-dimensional solution, an accurate analytical expression for the ac resistance is derived. Several transmitting coils of different specifications are used to verify this analytical expression, which is in good agreement with the measured results except for coils with a large number of strands. The quality factor of transmitting coils can then be well predicted with the available analytical expressions for self-inductance and stray capacitance. Owing to the exact estimation of the quality factor, the appropriate number of turns for the transmitting coil is set to 18-40 within the restrictions imposed by the transmitting circuit and human tissue. To supply enough energy for the next generation of the EMR, equipped with a Ø9.5×10.1 mm receiving coil, the number of turns of the transmitting coil is optimally set to 28, which can transfer a maximum power of 750 mW with a remarkable delivery efficiency of 3.55%.
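
    For orientation, the design trade-off rests on two standard lumped-circuit relations (notation assumed here): the quality factor and the self-resonant frequency set by the extracted parameters.

```latex
Q(\omega) = \frac{\omega L}{R_{\mathrm{ac}}(\omega)},
\qquad
f_{\mathrm{self}} = \frac{1}{2\pi\sqrt{L\,C_{\mathrm{stray}}}}
```

    Increasing the number of turns raises L and hence Q, but it also raises the ac resistance and the stray capacitance, pulling the self-resonant frequency down toward the operating frequency; the optimum of 28 turns reported above sits at this trade-off.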

  16. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated, and validated according to the requirements on analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD, and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After the validation procedure was completed, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  17. Analytical derivatives of the individual state energies in ensemble density functional theory method. I. General formalism

    DOE PAGES

    Filatov, Michael; Liu, Fang; Martínez, Todd J.

    2017-07-21

    The state-averaged (SA) spin-restricted ensemble-referenced Kohn-Sham (REKS) method and its state interaction (SI) extension, SI-SA-REKS, enable one to describe correctly the shape of the ground and excited potential energy surfaces of molecules undergoing bond-breaking/bond-formation reactions, including features such as conical intersections crucial for theoretical modeling of non-adiabatic reactions. Until recently, application of the SA-REKS and SI-SA-REKS methods to modeling the dynamics of such reactions was obstructed by the lack of analytical energy derivatives. Here, the analytical derivatives of the individual SA-REKS and SI-SA-REKS energies are derived. The final analytic gradient expressions are formulated entirely in terms of traces of matrix products and are presented in a form convenient for implementation in traditional quantum chemical codes employing basis-set expansions of the molecular orbitals. Finally, we will describe the implementation and benchmarking of the derived formalism in a subsequent article of this series.

  18. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample, whether quantitative or qualitative, can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high that it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by a processing error associated with the individual sample in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g., caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g., an incorrect pipetting volume due to air bubbles in a sample), both of which can lead to inaccurate results and risks for patients.
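
    One way to formalize the definition given above (an illustrative reading, not a formula from the paper): flag an irregular analytical error in an individual sample when

```latex
\left| x_{\mathrm{routine}} - x_{\mathrm{ref}} \right|
\;>\; k\,u_{\mathrm{process}} + \lvert b \rvert
```

    where x_routine is the routine assay result, x_ref the reference measurement procedure result, u_process the process measurement uncertainty, b the method bias relative to the reference measurement system, and k a coverage factor (k = 2 would be a typical choice).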

  19. Analytical solutions for one-, two-, and three-dimensional solute transport in ground-water systems with uniform flow

    USGS Publications Warehouse

    Wexler, Eliezer J.

    1992-01-01

    Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems having uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of selected solutions, source codes for the computer programs, and samples of program input and output also are included.
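
    As a taste of the solution family such reports compile, the classic one-dimensional continuous-source solution (Ogata and Banks, 1961) is easy to evaluate; the sketch below is a generic implementation with assumed parameter values, not the USGS programs themselves.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """1-D advective-dispersive transport from a continuous source at x = 0
    in uniform flow (Ogata & Banks, 1961): relative concentration c/c0."""
    arg1 = (x - v * t) / (2.0 * np.sqrt(D * t))
    arg2 = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(arg1) + np.exp(v * x / D) * erfc(arg2))

# Concentration profile after 100 days; v = 0.1 m/d and D = 0.5 m^2/d are
# assumed illustrative values.
x = np.linspace(0.1, 50.0, 6)
print(ogata_banks(x, t=100.0, v=0.1, D=0.5))
```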

  20. Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Jaramillo, Diego Felipe; Téllez, Gabriel; Einstein, T. L.

    2013-03-01

    While most Monte Carlo simulations assume only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest neighbor (NNN) interactions and, more generally, interactions out to q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e., the sum over n'th neighbor distribution functions), which we investigated recently [2]. For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.

  1. Standing in the gap: reflections on translating the Jung-Neumann correspondence.

    PubMed

    McCartney, Heather

    2016-04-01

    This paper considers the experience of translating the correspondence between C.G. Jung and Erich Neumann as part of the Philemon series. The translator explores the similarities between analytical work and the task of translation by means of the concepts of the dialectical third and the interactional field. The history and politics of the translation of analytic writing and their consequences for the lingua franca of analysis are discussed. Key themes within the correspondence are outlined, including Jung and Neumann's pre-war exploration of Judaism and the unconscious, the post-war difficulties around the publication of Neumann's Depth Psychology and a New Ethic set against the early years of the C.G. Jung Institute in Zurich, and the development of the correspondents' relationship over time. © 2016, The Society of Analytical Psychology.

  2. Evaluation of one dimensional analytical models for vegetation canopies

    NASA Technical Reports Server (NTRS)

    Goel, Narendra S.; Kuusk, Andres

    1992-01-01

    The SAIL model for one-dimensional homogeneous vegetation canopies has been modified to include the specular reflectance and hot spot effects. This modified model and the Nilson-Kuusk model are evaluated by comparing the reflectances given by them against those given by a radiosity-based computer model, Diana, for a set of canopies, characterized by different leaf area index (LAI) and leaf angle distribution (LAD). It is shown that for homogeneous canopies, the analytical models are generally quite accurate in the visible region, but not in the infrared region. For architecturally realistic heterogeneous canopies of the type found in nature, these models fall short. These shortcomings are quantified.

  3. Media Literacy Interventions: A Meta-Analytic Review

    PubMed Central

    Jeong, Se-Hoon; Cho, Hyunyi; Hwang, Yoori

    2012-01-01

    Although numerous media literacy interventions have been developed and delivered over the past 3 decades, a comprehensive meta-analytic assessment of their effects has not been available. This study investigates the average effect size and moderators of 51 media literacy interventions. Media literacy interventions had positive effects (d=.37) on outcomes including media knowledge, criticism, perceived realism, influence, behavioral beliefs, attitudes, self-efficacy, and behavior. Moderator analyses indicated that interventions with more sessions were more effective, but those with more components were less effective. Intervention effects did not vary by the agent, target age, the setting, audience involvement, the topic, the country, or publication status. PMID:22736807

  4. Set this house on fire: the self-analysis of Raymond Carver.

    PubMed

    Tutter, Adele

    2011-10-01

    The convergence of features of Raymond Carver's short-story oeuvre and of psychoanalytic methodology suggests that Carver's writing served as the fulcrum and focus of a self-analytic experience. Within this model, his stories function as container and mirror of myriad aspects of the writer's self. Tracing the developmental arc of the contextual meanings of one motif--fire--through six stories and their ur-texts demonstrates gains comparable to certain analytic goals, including enhanced integration, accountability, and self-awareness. Over time, Carver's narratives of rage, impotence, and despair give way to a new story: of mourning, forgiveness, and the rekindling of hope.

  5. Determination of hydroxytyrosol and tyrosol by liquid chromatography for the quality control of cosmetic products based on olive extracts.

    PubMed

    Miralles, Pablo; Chisvert, Alberto; Salvador, Amparo

    2015-01-01

    An analytical method for the simultaneous determination of hydroxytyrosol and tyrosol in different types of olive extract raw materials and cosmetic cream samples has been developed. The determination was performed by liquid chromatography with UV spectrophotometric detection. Different chromatographic parameters, such as mobile phase pH and composition, oven temperature, and sample preparation variables were studied. The best chromatographic separation was obtained under the following conditions: a C18 column set at 35°C and isocratic elution with a mixture of ethanol and 1% acetic acid solution at pH 5 (5:95, v/v) as the mobile phase, pumped at 1 mL min(-1). The detection wavelength was set at 280 nm, and the total run time required for the chromatographic analysis was 10 min, except for cosmetic cream samples, where a 20 min run time was required (including a cleaning step). The method was satisfactorily applied to 23 samples, including solid, water-soluble and fat-soluble olive extracts and cosmetic cream samples containing hydroxytyrosol and tyrosol. Good recoveries (95-107%) and repeatability (1.1-3.6%) were obtained, as well as limits of detection below the μg mL(-1) level. These good analytical features, together with its environmentally friendly characteristics, make the presented method suitable both for controlling the whole manufacturing process of raw materials containing the target analytes and for the quality control of the finished cosmetic products. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Analytic model of a multi-electron atom

    NASA Astrophysics Data System (ADS)

    Skoromnik, O. D.; Feranchuk, I. D.; Leonau, A. U.; Keitel, C. H.

    2017-12-01

    A fully analytical approximation for the observable characteristics of many-electron atoms is developed via a complete and orthonormal hydrogen-like basis with a single-effective charge parameter for all electrons of a given atom. The basis completeness allows us to employ the secondary-quantized representation for the construction of regular perturbation theory, which includes in a natural way correlation effects, converges fast and enables an effective calculation of the subsequent corrections. The hydrogen-like basis set provides a possibility to perform all summations over intermediate states in closed form, including both the discrete and continuous spectra. This is achieved with the help of the decomposition of the multi-particle Green function in a convolution of single-electronic Coulomb Green functions. We demonstrate that our fully analytical zeroth-order approximation describes the whole spectrum of the system, provides accuracy, which is independent of the number of electrons and is important for applications where the Thomas-Fermi model is still utilized. In addition already in second-order perturbation theory our results become comparable with those via a multi-configuration Hartree-Fock approach.

  7. User's manual for the one-dimensional hypersonic experimental aero-thermodynamic (1DHEAT) data reduction code

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.

    1995-01-01

    A FORTRAN computer code for the reduction and analysis of experimental heat transfer data has been developed. This code can be utilized to determine heat transfer rates from surface temperature measurements made using either thin-film resistance gages or coaxial surface thermocouples. Both an analytical and a numerical finite-volume heat transfer model are implemented in this code. The analytical solution is based on a one-dimensional, semi-infinite wall thickness model with the approximation of constant substrate thermal properties, which is empirically corrected for the effects of variable thermal properties. The finite-volume solution is based on a one-dimensional, implicit discretization. The finite-volume model directly incorporates the effects of variable substrate thermal properties and does not require the semi-infinite wall thickness approximation used in the analytical model. This model also includes the option of a multiple-layer substrate. Fast, accurate results can be obtained using either method. This code has been used to reduce several sets of aerodynamic heating data, of which samples are included in this report.
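
    The analytical (semi-infinite, constant-property) data-reduction model is commonly evaluated with the Cook-Felderman discretization of the Duhamel integral; the sketch below illustrates that general approach on a synthetic temperature trace and is not the 1DHEAT source. The substrate property product is an assumed value.

```python
import numpy as np

def cook_felderman(t, Tw, rho_c_k):
    """Heat flux from a surface-temperature history on a one-dimensional,
    semi-infinite substrate with constant properties (Cook-Felderman
    discretization of the Duhamel integral)."""
    q = np.zeros_like(Tw)
    coef = 2.0 * np.sqrt(rho_c_k / np.pi)
    for n in range(1, len(t)):
        s = 0.0
        for i in range(1, n + 1):
            s += (Tw[i] - Tw[i - 1]) / (
                np.sqrt(t[n] - t[i]) + np.sqrt(t[n] - t[i - 1]))
        q[n] = coef * s
    return q

t = np.linspace(0.0, 0.5, 251)
Tw = 300.0 + 40.0 * np.sqrt(t)            # synthetic thin-film gage trace
q = cook_felderman(t, Tw, rho_c_k=8.9e6)  # rho*c*k of the substrate (assumed, SI)
# Sanity check: for Tw proportional to sqrt(t), the recovered flux is constant.
print(q[-1])
```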

  8. Versatile electrophoresis-based self-test platform.

    PubMed

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications, including the use of the platform for veterinary purposes and for sodium and creatinine measurements, are included. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Reconnaissance-Level Assessment of Water Quality Near Flandreau, South Dakota

    DTIC Science & Technology

    2002-01-01

    associations with adverse health effects have been established. Concentrations of some selected analytes were less than U.S. Environmental ...unregulated synthetic organic compounds in aquatic environments. This study provides information concerning the occurrence of selected organic compounds... water sites. Selected data from various other investigations also are described. A total of 15 environmental samples, which included two sets of

  10. Utilization of Negative Ion ESI-MS and Tandem Mass Spectrometry to Detect and Confirm the NADH-Boric Acid Complex

    ERIC Educational Resources Information Center

    Kim, Danny H.; Eckhert, Curtis D.; Faull, Kym F.

    2011-01-01

    Mass spectrometry (MS) is a powerful analytical technique that is now widely used in the chemical, physical, engineering, and life sciences, with rapidly growing applications in many areas including clinical, forensic, pharmaceutical, and environmental fields. The increase in use of MS in both academic and industrial settings for research and…

  11. The Utility of the GRE Analytical Score for Selection into a Graduate Program in Educational Psychology.

    ERIC Educational Resources Information Center

    Mowsesian, Richard; Hays, William L.

    The Graduate Record Examination (GRE) Aptitude Test has been in use since 1938. In 1975 the GRE Aptitude Test was broadened to include an experimental set of items designed to tap a respondent's recognition of logical relationships and consistency of interrelated statements, and to make inferences from abstract relationships. To test the…

  12. Analytic energy gradients for the coupled-cluster singles and doubles method with the density-fitting approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozkaya, Uğur, E-mail: ugur.bozkaya@hacettepe.edu.tr; Department of Chemistry, Atatürk University, Erzurum 25240; Sherrill, C. David

    2016-05-07

    An efficient implementation is presented for analytic gradients of the coupled-cluster singles and doubles (CCSD) method with the density-fitting approximation, denoted DF-CCSD. Frozen core terms are also included. When applied to a set of alkanes, the DF-CCSD analytic gradients are significantly accelerated compared to conventional CCSD for larger molecules. The efficiency of our DF-CCSD algorithm arises from the acceleration of several different terms, which are designated as the “gradient terms”: computation of particle density matrices (PDMs) and the generalized Fock matrix (GFM), solution of the Z-vector equation, formation of the relaxed PDMs and GFM, back-transformation of PDMs and GFM to the atomic orbital (AO) basis, and evaluation of gradients in the AO basis. For the largest member of the alkane set (C10H22), the computational times for the gradient terms (with the cc-pVTZ basis set) are 2582.6 (CCSD) and 310.7 (DF-CCSD) min, respectively, a speedup of more than 8-fold. For gradient-related terms, the DF approach avoids the usage of four-index electron repulsion integrals. Based on our previous study [U. Bozkaya, J. Chem. Phys. 141, 124108 (2014)], our formalism completely avoids construction or storage of the 4-index two-particle density matrix (TPDM), using instead 2- and 3-index TPDMs. The DF approach introduces negligible errors for equilibrium bond lengths and harmonic vibrational frequencies.

  13. Metadata management for high content screening in OMERO

    PubMed Central

    Li, Simon; Besson, Sébastien; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Lindner, Dominik; Linkert, Melissa; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Allan, Chris; Burel, Jean-Marie; Moore, Josh; Swedlow, Jason R.

    2016-01-01

    High content screening (HCS) experiments create a classic data management challenge—multiple, large sets of heterogeneous structured and unstructured data, that must be integrated and linked to produce a set of “final” results. These different data include images, reagents, protocols, analytic output, and phenotypes, all of which must be stored, linked and made accessible for users, scientists, collaborators and where appropriate the wider community. The OME Consortium has built several open source tools for managing, linking and sharing these different types of data. The OME Data Model is a metadata specification that supports the image data and metadata recorded in HCS experiments. Bio-Formats is a Java library that reads recorded image data and metadata and includes support for several HCS screening systems. OMERO is an enterprise data management application that integrates image data, experimental and analytic metadata and makes them accessible for visualization, mining, sharing and downstream analysis. We discuss how Bio-Formats and OMERO handle these different data types, and how they can be used to integrate, link and share HCS experiments in facilities and public data repositories. OME specifications and software are open source and are available at https://www.openmicroscopy.org. PMID:26476368

  14. Metadata management for high content screening in OMERO.

    PubMed

    Li, Simon; Besson, Sébastien; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Lindner, Dominik; Linkert, Melissa; Moore, William J; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Allan, Chris; Burel, Jean-Marie; Moore, Josh; Swedlow, Jason R

    2016-03-01

    High content screening (HCS) experiments create a classic data management challenge-multiple, large sets of heterogeneous structured and unstructured data, that must be integrated and linked to produce a set of "final" results. These different data include images, reagents, protocols, analytic output, and phenotypes, all of which must be stored, linked and made accessible for users, scientists, collaborators and where appropriate the wider community. The OME Consortium has built several open source tools for managing, linking and sharing these different types of data. The OME Data Model is a metadata specification that supports the image data and metadata recorded in HCS experiments. Bio-Formats is a Java library that reads recorded image data and metadata and includes support for several HCS screening systems. OMERO is an enterprise data management application that integrates image data, experimental and analytic metadata and makes them accessible for visualization, mining, sharing and downstream analysis. We discuss how Bio-Formats and OMERO handle these different data types, and how they can be used to integrate, link and share HCS experiments in facilities and public data repositories. OME specifications and software are open source and are available at https://www.openmicroscopy.org. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. The Same Fish: Creating Space for Therapeutic Relationships, Play, and Development in a School for Children with Special Needs.

    PubMed

    Alston, Richard; Sosland, Rachel; Tuohy, Anne; Weiler, Nori Anna; Zeitlin, Diane

    2015-01-01

    This paper represents an attempt to describe psychoanalytically informed work applied in a school setting with children with special needs. While many therapists at the Parkside School are trained in analytic techniques and principles, these ideas have not traditionally been applied to children with language-based learning difficulties. Over the years, we have found that analytic ideas such as transference, countertransference, projective identification, containment, and attachment are especially salient to our understanding of these very complex children. Despite being in a school--a nontraditional setting for psychoanalysis--children are seen in individual and group therapy, often more than once a week. We believe that therapeutic relationships and play (sometimes bringing a child to a place of being able to play) are especially mutative with children with language-based learning challenges. Play and relationship provide a holding environment that, over time, allows for the reorganization of a child's often immature developmental capacities into a sense of agency that captures more clearly a child's innate potential. This article includes case studies of children with complex language-based learning difficulties, including autism spectrum disorders.

  16. [The requirements of standard and conditions of interchangeability of medical articles].

    PubMed

    Men'shikov, V V; Lukicheva, T I

    2013-11-01

    The article deals with the possibility of applying specific approaches to the evaluation of the interchangeability of medical articles for laboratory analysis. In developing standardized analytical technologies for laboratory medicine and formulating the requirements of standards addressed to manufacturers of medical articles, clinically validated requirements are to be followed. These requirements include the sensitivity and specificity of techniques, the accuracy and precision of research results, and the stability of reagents' quality under particular conditions of transportation and storage. The validity of requirements formulated in standards and addressed to manufacturers of medical articles can be proved using a reference system, which includes master forms and standard samples, reference techniques, and reference laboratories. This approach is supported by data from the evaluation of testing systems for measuring levels of thyrotropic hormone, thyroid hormones, and glycated hemoglobin HbA1c. Versions of testing systems can be considered interchangeable only if their results correspond to, and are comparable with, the results of the reference technique. In the absence of a functioning reference system, the resources of the Joint Committee for Traceability in Laboratory Medicine make it possible for manufacturers of reagent sets to apply certified reference materials in developing the manufacture of sets for a large listing of analytes.

  17. Boeing Smart Rotor Full-scale Wind Tunnel Test Data Report

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Hagerty, Brandon; Salazar, Denise

    2016-01-01

    A full-scale helicopter smart material actuated rotor technology (SMART) rotor test was conducted in the USAF National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel at NASA Ames. The SMART rotor system is a five-bladed MD 902 bearingless rotor with active trailing-edge flaps. The flaps are actuated using piezoelectric actuators. Rotor performance, structural loads, and acoustic data were obtained over a wide range of rotor shaft angles of attack, thrust, and airspeeds. The primary test objective was to acquire unique validation data for the high-performance computing analyses developed under the Defense Advanced Research Projects Agency (DARPA) Helicopter Quieting Program (HQP). Other research objectives included quantifying the ability of the on-blade flaps to achieve vibration reduction, rotor smoothing, and performance improvements. This data set of rotor performance and structural loads can be used for analytical and experimental comparison studies with other full-scale rotor systems and for analytical validation of computer simulation models. The purpose of this final data report is to document a comprehensive, high-quality data set that includes only data points where the flap was actively controlled and each of the five flaps behaved in a similar manner.

  18. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with a focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was three- to sevenfold larger than the analytical variation (7-13%) and hence the dominant component of the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
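
    Assuming the analytical and pre-analytical components are independent, their coefficients of variation pool in quadrature, and the 95%-uncertainty interval follows as ±2CVT. A minimal sketch of this arithmetic in Python, with illustrative values taken from the ranges reported above:

      import math

      def total_cv(cv_analytical: float, cv_preanalytical: float) -> float:
          """Pool two independent variance components into a total CV."""
          return math.sqrt(cv_analytical**2 + cv_preanalytical**2)

      # Illustrative values from the reported ranges: 13% analytical, 26% pre-analytical.
      cv_t = total_cv(0.13, 0.26)      # ~0.29, i.e. ~29% total variation

      # 95%-uncertainty interval (+/- 2*CV_T) around a hypothetical concentration.
      measured = 1.0                   # ng/mg
      low, high = measured * (1 - 2 * cv_t), measured * (1 + 2 * cv_t)
      print(f"CV_T = {cv_t:.2f}; 95% interval: {low:.2f}-{high:.2f} ng/mg")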

  19. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in multiple domains (e.g., climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, offering a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization for managing large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g., OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g., IPython) and the straightforward adoption of a strong set of related libraries (e.g., SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the Analytics Workflow Management System (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the distribution of the datasets - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse-grained for distributed task orchestration, and fine-grained at the level of a single data analytics cluster instance) will be presented and discussed.
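
    To give a concrete flavour of the datacube-oriented, server-side model described above, the sketch below shows what a minimal client-side workflow might look like through the project's Python bindings (PyOphidia). The method names, parameters, and file name here are assumptions for illustration, not verified API:

      # Hypothetical PyOphidia workflow; names and signatures are assumed.
      from PyOphidia import cube

      cube.Cube.setclient(read_env=True)               # connect to an Ophidia server

      # Import a NetCDF file as a datacube, chunked server-side into fragments.
      tasmax = cube.Cube(src_path="tasmax_CMIP5.nc",   # hypothetical dataset
                         measure="tasmax",
                         imp_dim="time")

      # Server-side, parallel reduction: yearly averages as a trend building block.
      yearly = tasmax.reduce2(dim="time", concept_level="y", operation="avg")

      # Export a small result for client-side work with NumPy/SciPy.
      data = yearly.export_array()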

  20. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  1. Geospatial Analytics in Retail Site Selection and Sales Prediction.

    PubMed

    Ting, Choo-Yee; Ho, Chiung Ching; Yee, Hui Jia; Matsah, Wan Razali

    2018-03-01

    Studies have shown that certain features from geography, demography, trade area, and environment can play a vital role in retail site selection, largely due to the impact they exert on retail performance. Although the relevant features can be elicited by domain experts, determining the optimal feature set can be an intractable and labor-intensive exercise. The challenges center on (1) how to determine the features that are important to a particular retail business and (2) how to estimate retail sales performance at a new location. The challenges become more apparent when the features vary across time. In this light, this study proposed a nonintervening approach employing feature selection algorithms followed by sales prediction through similarity-based methods. The results of prediction were validated by domain experts. In this study, data sets from different sources were transformed and aggregated before an analytics data set ready for analysis could be obtained. The data sets included data about feature location, population count, property type, education status, and monthly sales from 96 branches of a telecommunication company in Malaysia. The findings suggested that (1) optimal retail performance can only be achieved through fulfillment of specific location features together with the surrounding trade area characteristics and (2) similarity-based methods can provide a solution to retail sales prediction.
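
    The two challenges map naturally onto a two-step pipeline. The sketch below approximates the approach with off-the-shelf tools: a feature-selection pass to rank candidate location features, followed by similarity-based (nearest-neighbour) prediction of sales at a new site. The synthetic data, feature names, and the choice of mutual information and k-NN regression are stand-ins for the paper's specific algorithms:

      import numpy as np
      import pandas as pd
      from sklearn.feature_selection import mutual_info_regression
      from sklearn.neighbors import KNeighborsRegressor

      # Synthetic stand-in for the aggregated analytics data set (96 branches).
      rng = np.random.default_rng(0)
      features = ["population_count", "property_type_score",
                  "education_index", "competitor_count"]
      X = pd.DataFrame(rng.random((96, 4)), columns=features)
      y = 3.0 * X["population_count"] + rng.normal(0, 0.1, 96)    # monthly sales

      # Step 1: rank candidate location features by relevance to sales.
      scores = mutual_info_regression(X, y)
      top = [f for f, _ in sorted(zip(features, scores), key=lambda t: -t[1])[:2]]

      # Step 2: predict sales at a new site from its most similar existing branches.
      knn = KNeighborsRegressor(n_neighbors=5, weights="distance").fit(X[top], y)
      new_site = pd.DataFrame(rng.random((1, len(top))), columns=top)
      print(top, knn.predict(new_site))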

  2. Analytical solutions for one-, two-, and three-dimensional solute transport in ground-water systems with uniform flow

    USGS Publications Warehouse

    Wexler, Eliezer J.

    1989-01-01

    Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented in this report for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems with uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of select solutions, source codes for the computer programs, and samples of program input and output also are included.
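
    To make the flavour of such closed-form solutions concrete, the sketch below evaluates one classic member of this family, the one-dimensional Ogata-Banks solution for a continuous source in uniform flow. It is offered as a representative example, not necessarily a formulation from the report itself:

      import numpy as np
      from scipy.special import erfc

      def ogata_banks(x, t, v, D, c0):
          """Concentration c(x, t) for 1-D advection-dispersion, continuous source c0."""
          a = (x - v * t) / (2.0 * np.sqrt(D * t))
          b = (x + v * t) / (2.0 * np.sqrt(D * t))
          return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

      # Example: 100 m downstream after 500 days, v = 0.1 m/d, D = 1.0 m^2/d.
      print(ogata_banks(x=100.0, t=500.0, v=0.1, D=1.0, c0=1.0))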

  3. Simultaneous determination of three herbicides by differential pulse voltammetry and chemometrics.

    PubMed

    Ni, Yongnian; Wang, Lin; Kokot, Serge

    2011-01-01

    A novel differential pulse voltammetry (DPV) method was researched and developed for the simultaneous determination of Pendimethalin, Dinoseb and sodium 5-nitroguaiacolate (5NG) with the aid of chemometrics. The voltammograms of these three compounds overlapped significantly, so chemometrics methods were applied to facilitate the simultaneous determination of the three analytes. These included classical least squares (CLS), principal component regression (PCR), partial least squares (PLS) and radial basis function-artificial neural networks (RBF-ANN). A separately prepared verification data set was used to confirm the calibrations, which were built from the original and first-derivative data matrices of the voltammograms. On the basis of the relative prediction errors and recoveries of the analytes, the RBF-ANN and DPLS (D = first-derivative spectra) models performed best and are particularly recommended for application. The DPLS calibration model was applied satisfactorily to the prediction of the three analytes in market vegetables and lake water samples.
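
    As a sketch of the calibration idea, the snippet below fits a PLS model to first-derivative voltammograms of three-component mixtures, mirroring the DPLS setup; the random arrays are placeholders for measured DPV currents and known concentrations:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.random((30, 200))       # 30 calibration mixtures x 200 potentials
      Y = rng.random((30, 3))         # concentrations of the three herbicides

      X_deriv = np.gradient(X, axis=1)            # first-derivative "spectra"
      pls = PLSRegression(n_components=5).fit(X_deriv, Y)

      # Separately prepared verification set, as in the paper.
      X_new = np.gradient(rng.random((5, 200)), axis=1)
      print(pls.predict(X_new))       # predicted concentrations, one row per sample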

  4. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
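
    The report defines its own statistical correlation measure; for orientation, a widely used present-day measure for comparing test and NASTRAN mode shapes is the modal assurance criterion (MAC), sketched here for illustration (it is not necessarily the statistic used in the report):

      import numpy as np

      def mac(phi_test: np.ndarray, phi_fem: np.ndarray) -> np.ndarray:
          """MAC matrix between columns of two mode-shape matrices (dof x modes)."""
          num = np.abs(phi_test.T @ phi_fem) ** 2
          den = np.outer(np.sum(phi_test * phi_test, axis=0),
                         np.sum(phi_fem * phi_fem, axis=0))
          return num / den

      rng = np.random.default_rng(0)
      phi_t, phi_a = rng.random((50, 4)), rng.random((50, 4))   # hypothetical shapes
      print(np.round(mac(phi_t, phi_a), 2))   # values near 1 flag correlated modes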

  5. Analysis of propellant feedline dynamics

    NASA Technical Reports Server (NTRS)

    Holster, J. L.; Astleford, W. J.; Gerlach, C. R.

    1973-01-01

    An analytical model and corresponding computer program for studying disturbances of liquid propellants in typical engine feedline systems were developed. The model includes the effects of steady turbulent mean flow, the influence of distributed compliances, the effects of local compliances, and various factors causing structural-hydraulic coupling. The computer program was set up such that the amplitude and phase of the ratio of terminal pressure to input excitation are calculated over any desired frequency range for an arbitrary assembly of various feedline components. A user's manual is included.

  6. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
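
    To illustrate the iterative family, the sketch below implements a minimal Kaczmarz/ART loop for a generic linear imaging model Ax = b; it is a stand-in for the projection process rather than a production CT code:

      import numpy as np

      def art(A: np.ndarray, b: np.ndarray, n_sweeps: int = 200) -> np.ndarray:
          """Algebraic reconstruction technique: cyclic row-wise Kaczmarz updates."""
          x = np.zeros(A.shape[1])
          for _ in range(n_sweeps):
              for i in range(A.shape[0]):
                  a_i = A[i]
                  x += (b[i] - a_i @ x) / (a_i @ a_i) * a_i   # project x onto row i
          return x

      rng = np.random.default_rng(0)
      A = rng.random((40, 20))                 # hypothetical projection matrix
      x_true = rng.random(20)
      x_rec = art(A, A @ x_true)
      print(np.linalg.norm(x_rec - x_true))    # error shrinks as sweeps increase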

  7. Dendritic cell immunotherapy followed by cART interruption during HIV-1 infection induces plasma protein markers of cellular immunity and neutrophil recruitment

    PubMed Central

    Cooper, Jason D.; Tomasik, Jakub; Bahn, Sabine; Aerts, Joeri L.; Osterhaus, Albert D. M. E.; Gruters, Rob A.; Andeweg, Arno C.

    2018-01-01

    Objectives To characterize the host response to dendritic cell-based immunotherapy and subsequent combined antiretroviral therapy (cART) interruption in HIV-1-infected individuals at the plasma protein level. Design An autologous dendritic cell (DC) therapeutic vaccine was administered to HIV-infected individuals, stable on cART. The effect of vaccination was evaluated at the plasma protein level during the period preceding cART interruption, during analytical therapy interruption and at viral reactivation. Healthy controls and post-exposure prophylactically treated healthy individuals were included as controls. Methods Plasma marker (‘analyte’) levels including cytokines, chemokines, growth factors, and hormones were measured in trial participants and control plasma samples using a multiplex immunoassay. Analyte levels were analysed using principal component analysis, cluster analysis and limma. Blood neutrophil counts were analysed using linear regression. Results Plasma analyte levels of HIV-infected individuals are markedly different from those of healthy controls and HIV-negative individuals receiving post-exposure prophylaxis. Viral reactivation following cART interruption also affects multiple analytes, but cART interruption itself has only a minor effect. We find that Thyroxine-Binding Globulin (TBG) levels and late-stage neutrophil numbers correlate with the time off cART after DC vaccination. Furthermore, analysis shows that cART alters several regulators of blood glucose levels, including C-peptide, chromogranin-A and leptin. HIV reactivation is associated with the upregulation of CXCR3 ligands. Conclusions Chronic HIV infection leads to a change in multiple plasma analyte levels, as does virus reactivation after cART interruption. Furthermore, we find evidence for the involvement of TBG and neutrophils in the response to DC vaccination in the setting of HIV infection. PMID:29389978

  8. Exact Local Correlations and Full Counting Statistics for Arbitrary States of the One-Dimensional Interacting Bose Gas

    NASA Astrophysics Data System (ADS)

    Bastianello, Alvise; Piroli, Lorenzo; Calabrese, Pasquale

    2018-05-01

    We derive exact analytic expressions for the n-body local correlations in the one-dimensional Bose gas with contact repulsive interactions (Lieb-Liniger model) in the thermodynamic limit. Our results are valid for arbitrary states of the model, including ground and thermal states, stationary states after a quantum quench, and nonequilibrium steady states arising in transport settings. Calculations for these states are explicitly presented and physical consequences are critically discussed. We also show that the n-body local correlations are directly related to the full counting statistics for the particle-number fluctuations in a short interval, for which we provide an explicit analytic result.

  9. On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.

    2003-01-01

    A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.

  10. Variational Trajectory Optimization Tool Set: Technical description and user's manual

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.

    1993-01-01

    The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.

  11. The singular values of the imbedding operators of some classes of analytic functions of several variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parfenov, O.G.

    1994-12-25

    We discuss three results. The first exhibits the order of decrease of the s-values as a function of the CR-dimension of a compact set on which we approximate the class of analytic functions being studied. The second is an asymptotic formula for the case when the domain of analyticity and the compact set are Reinhardt domains. The third is the computation of the s-values of a special operator that is of interest for approximation theory on one-dimensional manifolds.

  12. Big Data: Are Biomedical and Health Informatics Training Programs Ready?

    PubMed Central

    Hersh, W.; Ganesh, A. U. Jai

    2014-01-01

    Summary Objectives The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one’s area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Conclusions Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in “deep analytical talent” as well as those who need knowledge to support such individuals. PMID:25123740

  13. Big Data: Are Biomedical and Health Informatics Training Programs Ready? Contribution of the IMIA Working Group for Health and Medical Informatics Education.

    PubMed

    Otero, P; Hersh, W; Jai Ganesh, A U

    2014-08-15

    The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.

  14. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  15. A Meta-Analytic Review of Tactile-Cued Self-Monitoring Interventions Used by Students in Educational Settings

    ERIC Educational Resources Information Center

    McDougall, Dennis; Ornelles, Cecily; Mersberg, Kawika; Amona, Kekama

    2015-01-01

    In this meta-analytic review, we critically evaluate procedures and outcomes from nine intervention studies in which students used tactile-cued self-monitoring in educational settings. Findings suggest that most tactile-cued self-monitoring interventions have moderate to strong effects, have emerged only recently, and have not yet achieved the…

  16. Geophysical Data Sets in GeoMapApp

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2017-12-01

    GeoMapApp (http://www.geomapapp.org), a free map-based data tool developed at Lamont-Doherty Earth Observatory, provides access to hundreds of integrated geoscience data sets that are useful for geophysical studies. Examples include earthquake and volcano catalogues, gravity and magnetics data, seismic velocity tomographic models, geological maps, geochemical analytical data, lithospheric plate boundary information, geodetic velocities, and high-resolution bathymetry and land elevations. Users can also import and analyse their own data files. Data analytical functions provide contouring, shading, profiling, layering and transparency, allowing multiple data sets to be seamlessly compared. A new digitization and field planning portal allow stations and waypoints to be generated. Sessions can be saved and shared with colleagues and students. In this eLightning presentation we will demonstrate some of GeoMapApp's capabilities with a focus upon subduction zones and tectonics. In the attached screen shot of the Cascadia margin, the contoured depth to the top of the subducting Juan de Fuca slab is overlain on a shear wave velocity depth slice. Geochemical data coloured on Al2O3 and scaled on MgO content is shown as circles. The stack of data profiles was generated along the white line.

  17. Analytical approaches to the determination of spin-dependent parton distribution functions at NNLO approximation

    NASA Astrophysics Data System (ADS)

    Salajegheh, Maral; Nejad, S. Mohammad Moosavi; Khanpour, Hamzeh; Tehrani, S. Atashbar

    2018-05-01

    In this paper, we present the SMKA18 analysis, a first attempt to extract the set of next-to-next-to-leading-order (NNLO) spin-dependent parton distribution functions (spin-dependent PDFs) and their uncertainties, determined through the Laplace transform technique and the Jacobi polynomial approach. Using the Laplace transformations, we present an analytical solution for the spin-dependent Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution equations at NNLO approximation. The results are extracted using a wide range of proton g1p(x, Q2), neutron g1n(x, Q2), and deuteron g1d(x, Q2) spin-dependent structure function data sets, including the most recent high-precision measurements from COMPASS16 experiments at CERN, which are playing an increasingly important role in global spin-dependent fits. Careful estimations of uncertainties have been done using standard Hessian error propagation. We compare our results with the available spin-dependent inclusive deep inelastic scattering data set and other results for the spin-dependent PDFs in the literature. The results obtained for the spin-dependent PDFs as well as the spin-dependent structure functions are clearly explained at both small and large values of x.

  18. Evolution of an Implementation-Ready Interprofessional Pain Assessment Reference Model

    PubMed Central

    Collins, Sarah A; Bavuso, Karen; Swenson, Mary; Suchecki, Christine; Mar, Perry; Rocha, Roberto A.

    2017-01-01

    Standards to increase the consistency of comprehensive pain assessments are important for safety, quality, and analytics activities, including meeting Joint Commission requirements and learning the best management strategies and interventions for the current prescription opioid epidemic. In this study we describe the development and validation of a Pain Assessment Reference Model ready for implementation on EHR forms and flowsheets. Our process resulted in 5 successive revisions of the reference model, which more than doubled the number of data elements to 47. The organization of the model evolved during validation sessions with panels totaling 48 subject matter experts (SMEs) to include 9 sets of data elements, with one set recommended as a minimal data set. The reference model also evolved when implemented in EHR forms and flowsheets, indicating specifications such as cascading logic that are important to inform secondary use of data. PMID:29854125

  19. Using business analytics to improve outcomes.

    PubMed

    Rivera, Jose; Delaney, Stephen

    2015-02-01

    Orlando Health has brought its hospital and physician practice revenue cycle systems into better balance using four sets of customized analytics: Physician performance analytics gauge the total net revenue for every employed physician. Patient-pay analytics provide financial risk scores for all patients on both the hospital and physician practice sides. Revenue management analytics bridge the gap between the back-end central business office and front-end physician practice managers and administrators. Enterprise management analytics allow the hospitals and physician practices to share important information about common patients.

  20. A case history: from traumatic repetition towards psychic representability.

    PubMed

    Bichi, Estela L

    2008-06-01

    This paper is devoted principally to a case history concerning an analytic process extending over a period of almost ten years. The patient is B, who consulted the author after a traumatic episode. Although that was her reason for commencing treatment, a history of previous traumatogenic situations, including a rape during her adolescence, subsequently came to light. The author describes three stages of the treatment, reflected in three different settings in accordance with the work done by both patient and analyst in enabling B to own and work through her infantile and adult traumatic experiences. The process of transformation of traumatic traces lacking psychic representation, which was undertaken by both members of the analytic couple from the beginning of the treatment, was eventually approached in a particular way on the basis of their respective creative capacities, which facilitated the patient's psychic progress towards representability and the possibility of working through the experiences of the past. Much of the challenge of this case involved the analyst's capacity to maintain and at the same time consolidate her analytic posture within her internal setting, while doing her best to overcome any possible misfit (Balint, 1968) between her own technique and the specific complexities of the individual patient. The account illustrates the alternation of phases, at the beginning of the analysis, of remembering and interpretation on the one hand and of the representational void and construction on the other. In the case history proper and in her detailed summing up, the author refers to the place of the analyst during the analytic process, the involvement of her psychic functioning, and the importance of her capacity to work on and make use of her countertransference and self-analytic introspection, with a view to neutralizing any influence that aspects of her 'real person' might have had on the analytic field and on the complex processes taking place within it.

  1. First 25-hydroxyvitamin D assay for general chemistry analyzers.

    PubMed

    Saida, Fakhri B; Chen, Xiaoru; Tran, Kiet; Dou, Chao; Yuan, Chong

    2015-03-01

    25-Hydroxyvitamin D [25(OH)D], the predominant circulating form of vitamin D, is an accurate indicator of the general vitamin D status of an individual. Because vitamin D deficiencies have been linked to several pathologies (including osteoporosis and rickets), accurate monitoring of 25(OH)D levels is becoming increasingly important in clinical settings. Current 25(OH)D assays are either chromatographic or immunoassay-based assays. These assays include HPLC, liquid chromatography-tandem mass spectrometry (LC-MS/MS), enzyme-immunosorbent, immunochemiluminescence, immunofluorescence and radioimmunoassay. All these assays use heterogeneous formats that require phase separation and special instrumentations. In this article, we present an overview of these assays and introduce the first homogeneous assay of 25(OH)D for use on general chemistry analyzers. A special emphasis is put on the unique challenges posed by the 25(OH)D analyte. These challenges include a low detection limit, the dissociation of the analyte from its serum transporter and the inactivation of various binding proteins without phase separation steps.

  2. Model verification of large structural systems

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1977-01-01

    A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.

  3. Assessing the global reach and value of a provider-facing healthcare app using large-scale analytics.

    PubMed

    O'Reilly-Shah, Vikas; Easton, George; Gillespie, Scott

    2017-01-01

    The rapid global adoption of mobile health (mHealth) smartphone apps by healthcare providers presents challenges and opportunities in medicine. Challenges include ensuring the delivery of high-quality, up-to-date and optimised information. Opportunities include the ability to study global practice patterns, access to medical and surgical care and continuing medical education needs. We studied users of a free anaesthesia calculator app used worldwide. We combined traditional app analytics with in-app surveys to collect user demographics and feedback. 31 173 subjects participated. Users were from 206 countries and represented a spectrum of healthcare provider roles. Low-income country users had greater rates of app use (p<0.001) and ascribed greater importance of the app to their practice (p<0.001). Physicians from low-income countries were more likely to adopt the app (p<0.001). The app was used primarily for paediatric patients. The app was used around the clock, peaking during times typical for first start cases. This mHealth app is a valuable decision support tool for global healthcare providers, particularly those in more resource-limited settings and with less training. App adoption and use may provide a mechanism for measuring longitudinal changes in access to surgical care and engaging providers in resource-limited settings. In-app surveys and app analytics provide a window into healthcare provider behaviour at a breadth and level of detail previously impossible to achieve. Given the potentially immense value of crowdsourced information, healthcare providers should be encouraged to participate in these types of studies.

  4. Clinical implementation of RNA signatures for pharmacogenomic decision-making

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L

    2011-01-01

    RNA profiling is increasingly used to predict drug response, dose, or toxicity based on analysis of drug pharmacokinetic or pharmacodynamic pathways. Before implementing multiplexed RNA arrays in clinical practice, validation studies are carried out to demonstrate sufficient evidence of analytic and clinical performance, and to establish an assay protocol with quality assurance measures. Pathologists assure quality by selecting input tissue and by interpreting results in the context of the input tissue as well as the technologies that were used and the clinical setting in which the test was ordered. A strength of RNA profiling is the array-based measurement of tens to thousands of RNAs at once, including redundant tests for critical analytes or pathways to promote confidence in test results. Instrument and reagent manufacturers are crucial for supplying reliable components of the test system. Strategies for quality assurance include careful attention to RNA preservation and quality checks at pertinent steps in the assay protocol, beginning with specimen collection and proceeding through the various phases of transport, processing, storage, analysis, interpretation, and reporting. Specimen quality is checked by probing housekeeping transcripts, while spiked and exogenous controls serve as a check on analytic performance of the test system. Software is required to manipulate abundant array data and present it for interpretation by a laboratory physician who reports results in a manner facilitating therapeutic decision-making. Maintenance of the assay requires periodic documentation of personnel competency and laboratory proficiency. These strategies are shepherding genomic arrays into clinical settings to provide added value to patients and to the larger health care system. PMID:23226056

  5. Assessing the global reach and value of a provider-facing healthcare app using large-scale analytics

    PubMed Central

    Easton, George; Gillespie, Scott

    2017-01-01

    Background The rapid global adoption of mobile health (mHealth) smartphone apps by healthcare providers presents challenges and opportunities in medicine. Challenges include ensuring the delivery of high-quality, up-to-date and optimised information. Opportunities include the ability to study global practice patterns, access to medical and surgical care and continuing medical education needs. Methods We studied users of a free anaesthesia calculator app used worldwide. We combined traditional app analytics with in-app surveys to collect user demographics and feedback. Results 31 173 subjects participated. Users were from 206 countries and represented a spectrum of healthcare provider roles. Low-income country users had greater rates of app use (p<0.001) and ascribed greater importance of the app to their practice (p<0.001). Physicians from low-income countries were more likely to adopt the app (p<0.001). The app was used primarily for paediatric patients. The app was used around the clock, peaking during times typical for first start cases. Conclusions This mHealth app is a valuable decision support tool for global healthcare providers, particularly those in more resource-limited settings and with less training. App adoption and use may provide a mechanism for measuring longitudinal changes in access to surgical care and engaging providers in resource-limited settings. In-app surveys and app analytics provide a window into healthcare provider behaviour at a breadth and level of detail previously impossible to achieve. Given the potentially immense value of crowdsourced information, healthcare providers should be encouraged to participate in these types of studies. PMID:29082007

  6. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    PubMed Central

    Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine

    2015-01-01

    Background In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402

  7. Internally insulated thermal storage system development program

    NASA Technical Reports Server (NTRS)

    Scott, O. L.

    1980-01-01

    A cost effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank was performed. Data from a present thermocline test tank was compared to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.,) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.

  8. Internally insulated thermal storage system development program

    NASA Astrophysics Data System (ADS)

    Scott, O. L.

    1980-03-01

    A cost effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank was performed. Data from a present thermocline test tank was compared to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.,) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.

  9. Installation-restoration program. Phase 2. Confirmation/quantification. Stage 1 for Mather AFB, Sacramento, California. Volume 1. Final report, September 1983-June 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    A Problem Confirmation Study was performed at three sites on Mather AFB identified in the Phase I investigation as requiring further study (the Air command and warning Area, the 7100 Area, the West Ditch) and the Northeast Perimeter. The field investigation was conducted from February 1984 to June 1985 and included installation of 11 monitor wells, collection of groundwater samples from the monitor wells and 15 base production wells, and collection of sediment samples from two locations on the West Ditch. Analytes included oil and grease, TOC, volatile organic compounds (VOA), as well as dimethylnitrosamine, phenols, pesticides, and dissolved metals at some specific sites. Based on the hydrogeologic complexity of the physical setting and the findings of the sampling and analytical work, follow-on investigations were recommended at all three sites.

  10. Particle Demagnetization in Collisionless Magnetic Reconnection

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2006-01-01

    The dissipation mechanism of magnetic reconnection remains a subject of intense scientific interest. On the one hand, a set of recent studies has shown that particle inertia-based processes, which include thermal and bulk inertial effects, provide the reconnection electric field in the diffusion region. In this presentation, we present analytical theory results, as well as 2.5- and three-dimensional PIC simulations of guide field magnetic reconnection. We will show that diffusion region scale sizes in moderate and large guide field cases are determined by electron Larmor radii, and that analytical estimates of diffusion region dimensions need to include a description of the heat flux tensor. The dominant electron dissipation process appears to be based on thermal electron inertia, expressed through nongyrotropic electron pressure tensors. We will argue that this process remains viable in three dimensions by means of a detailed comparison of high resolution particle-in-cell simulations.

  11. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined with the foremost concept of the "brain-to-brain turnaround time loop". This concept comprises several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more riskily, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phases of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and it should be better defined as "total analytical error" to avoid confusion and misinterpretation. According to the hierarchical approach to classifying strategies for setting analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is at the top and therefore should be applied as much as possible to direct analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.

  12. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  13. Development of a bird banding recapture database

    USGS Publications Warehouse

    Tautin, J.; Doherty, P.F.; Metras, L.

    2001-01-01

    Recaptures (and resightings) constitute the vast majority of post-release data from banded or otherwise marked nongame birds. A powerful suite of contemporary analytical models is available for using recapture data to estimate population size, survival rates and other parameters, and many banders collect recapture data for their project-specific needs. However, despite widely recognized, broader programmatic needs for more and better data, banders' recapture data are not centrally reposited and made available for use by others. To address this need, the US Bird Banding Laboratory, the Canadian Bird Banding Office and the Georgia Cooperative Fish and Wildlife Research Unit are developing a bird banding recapture database. In this poster we discuss the critical steps in developing the database, including: determining exactly which recapture data should be included; developing a standard record format and structure for the database; developing electronic means for collecting, vetting and disseminating the data; and, most importantly, developing metadata descriptions and individual data set profiles to facilitate the user's selection of appropriate analytical models. We provide examples of individual data sets to be included in the database, and we assess the feasibility of developing a prescribed program for obtaining recapture data from banders who do not presently collect them. It is expected that the recapture database eventually will contain millions of records, made available publicly for a variety of avian research and management purposes.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirkov, Leonid; Makarewicz, Jan, E-mail: jama@amu.edu.pl

    An ab initio intermolecular potential energy surface (PES) has been constructed for the benzene-krypton (BKr) van der Waals (vdW) complex. The interaction energy has been calculated at the coupled cluster level of theory with single, double, and perturbatively included triple excitations using different basis sets. As a result, a few analytical PESs of the complex have been determined. They allowed a prediction of the complex structure and its vibrational vdW states. The vibrational energy level pattern exhibits a distinct polyad structure. Comparison of the equilibrium structure, the dipole moment, and vibrational levels of BKr with their experimental counterparts has allowed us to design an optimal basis set composed of a small Dunning’s basis set for the benzene monomer, a larger effective core potential adapted basis set for Kr and additional midbond functions. Such a basis set yields vibrational energy levels that agree very well with the experimental ones as well as with those calculated from the available empirical PES derived from the microwave spectra of the BKr complex. The basis proposed can be applied to larger complexes including Kr because of a reasonable computational cost and accurate results.

  15. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and of nonlinearity between the property and spectra of samples in quantitative spectral analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and unknown samples; the Euclidean distance between the net analyte signal of a sample and the net analyte signals of the calibration samples is then calculated and used as a similarity index. According to the defined similarity index, a local calibration set is individually selected for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
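
    A condensed sketch of this local strategy appears below, with scikit-learn's PLS as the local model. The NAS computation is reduced to a mean-centring placeholder, and all shapes and values are hypothetical:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      def select_local_set(nas_cal, nas_unknown, k=30):
          """Pick the k calibration samples whose NAS is closest (Euclidean)."""
          d = np.linalg.norm(nas_cal - nas_unknown, axis=1)
          return np.argsort(d)[:k]

      rng = np.random.default_rng(0)
      X_cal, y_cal = rng.random((200, 500)), rng.random(200)   # NIR spectra, property
      nas_cal = X_cal - X_cal.mean(axis=0)     # placeholder for the true NAS step
      x_u = rng.random(500)
      nas_u = x_u - X_cal.mean(axis=0)

      idx = select_local_set(nas_cal, nas_u)                   # similarity index
      local = PLSRegression(n_components=5).fit(X_cal[idx], y_cal[idx])
      print(local.predict(x_u.reshape(1, -1)))                 # local prediction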

  16. Microwave assisted solvent extraction and coupled-column reversed-phase liquid chromatography with UV detection use of an analytical restricted-access-medium column for the efficient multi-residue analysis of acidic pesticides in soils.

    PubMed

    Hogendoorn, E A; Huls, R; Dijkman, E; Hoogerbrugge, R

    2001-12-14

    A screening method has been developed for the determination of acidic pesticides in various types of soils. The methodology is based on the use of microwave assisted solvent extraction (MASE) for fast and efficient extraction of the analytes from the soils, and coupled-column reversed-phase liquid chromatography (LC-LC) with UV detection at 228 nm for the instrumental analysis of uncleaned extracts. Four types of soils, including sand, clay and peat, with organic matter contents ranging from 0.3 to 13%, and ten acidic pesticides of different chemical families (bentazone, bromoxynil, metsulfuron-methyl, 2,4-D, MCPA, MCPP, 2,4-DP, 2,4,5-T, 2,4-DB and MCPB) were selected as matrices and analytes, respectively. The method development included the selection of suitable MASE and LC-LC conditions. The latter consisted of the selection of a 5-microm GFF-II internal surface reversed-phase (ISRP, Pinkerton) analytical column (50 x 4.6 mm, I.D.) as the first column in the RAM-C18 configuration, in combination with an optimised linear gradient elution including on-line cleanup of sample extracts and reconditioning of the columns. The method was validated with the analysis of freshly spiked samples and samples with aged residues (120 days). The four types of soils were spiked with the ten acidic pesticides at levels between 20 and 200 microg/kg. Weighted regression of the recovery data showed, for most analyte-matrix combinations (including freshly spiked samples and aged residues), that the method provides overall recoveries between 60 and 90% with relative standard deviations of the intra-laboratory reproducibility between 5 and 25%; LODs were obtained between 5 and 50 microg/kg. Evaluation of the data set with principal component analysis revealed that (i) an increase in the organic matter content of the soil samples and (ii) aged residues negatively affect the recovery of the analytes.

  17. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROJECTION - PROJECT SUMMARY

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  18. Features Students Really Expect from Learning Analytics

    ERIC Educational Resources Information Center

    Schumacher, Clara; Ifenthaler, Dirk

    2016-01-01

    In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…

  19. Translation of proteomic biomarkers into FDA approved cancer diagnostics: issues and challenges

    PubMed Central

    2013-01-01

    Tremendous efforts have been made over the past few decades to discover novel cancer biomarkers for use in clinical practice. However, a striking discrepancy exists between the effort directed toward biomarker discovery and the number of markers that make it into clinical practice. One of the confounding issues in translating a novel discovery into clinical practice is that quite often the scientists working on biomarker discovery have limited knowledge of the analytical, diagnostic, and regulatory requirements for a clinical assay. This review provides an introduction to such considerations with the aim of generating more extensive discussion for study design, assay performance, and regulatory approval in the process of translating new proteomic biomarkers from discovery into cancer diagnostics. We first describe the analytical requirements for a robust clinical biomarker assay, including concepts of precision, trueness, specificity and analytical interference, and carryover. We next introduce the clinical considerations of diagnostic accuracy, receiver operating characteristic analysis, positive and negative predictive values, and clinical utility. We finish the review by describing components of the FDA approval process for protein-based biomarkers, including classification of biomarker assays as medical devices, analytical and clinical performance requirements, and the approval process workflow. While we recognize that the road from biomarker discovery, validation, and regulatory approval to the translation into the clinical setting could be long and difficult, the reward for patients, clinicians and scientists could be rather significant. PMID:24088261

  20. Analytical and molecular dynamics studies on the impact loading of single-layered graphene sheet by fullerene

    NASA Astrophysics Data System (ADS)

    Hosseini-Hashemi, Shahrokh; Sepahi-Boroujeni, Amin; Sepahi-Boroujeni, Saeid

    2018-04-01

    The normal impact performance of a system comprising a fullerene molecule and a single-layered graphene sheet is studied in the present paper. First, through a mathematical approach, a new contact law is derived to describe the overall non-bonding interaction forces of the "hollow indenter-target" system. Preliminary verifications show that the derived contact law gives a reliable picture of the force field of the system, in good agreement with the results of molecular dynamics (MD) simulations. Afterwards, the equation of transverse motion of the graphene sheet is formulated on the basis of both the nonlocal theory of elasticity and the assumptions of classical plate theory. Then, to derive the dynamic behavior of the system, the set comprising the proposed contact law and the equations of motion of both the graphene sheet and the fullerene molecule is solved numerically. To evaluate the outcomes of this method, the problem is also modeled by MD simulation. Despite intrinsic differences between the analytical and MD methods, as well as various errors arising from the transient nature of the problem, acceptable agreement is established between the analytical and MD outcomes. As a result, the proposed analytical method can be reliably used to address similar impact problems. Furthermore, it is found that a single-layered graphene sheet is capable of trapping fullerenes approaching with low velocities; otherwise, in the case of rebound, the sheet effectively absorbs the predominant portion of the fullerene's energy.
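
    A reduced two-degree-of-freedom sketch of the coupled solution step is given below: the fullerene coordinate and a single effective sheet mode exchange momentum through a non-bonding contact force. The Lennard-Jones-style force is a stand-in for the paper's derived contact law, and all parameters are illustrative reduced units:

      import numpy as np
      from scipy.integrate import solve_ivp

      m_f, m_s, k_s = 1.0, 5.0, 50.0       # fullerene mass, sheet modal mass/stiffness

      def contact_force(gap):
          """Repulsive-attractive force between hollow indenter and sheet (stand-in)."""
          g = max(gap, 0.3)                # keep the force finite near contact
          return 4.0 * (12.0 / g**13 - 6.0 / g**7)

      def rhs(t, y):
          z_f, v_f, w_s, v_s = y           # fullerene and sheet positions/velocities
          f = contact_force(z_f - w_s)
          return [v_f, f / m_f, v_s, (-f - k_s * w_s) / m_s]

      sol = solve_ivp(rhs, (0.0, 20.0), [3.0, -0.8, 0.0, 0.0], max_step=1e-2)
      print("final fullerene velocity:", sol.y[1, -1])   # > 0 indicates rebound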

  1. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high-level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing the hypotheses and techniques from social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question of how the influence and attitude (sometimes interpreted as intent) of adversarial actors can be formulated and computed algorithmically, as a higher-level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both the social science and computing fields, as well as the mathematical and computational formulation for the specific context of the situation to be analyzed. The study of `influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or `attitude' in social theory terminology, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so their `impact' and `threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.

  2. Novel predictive models for metabolic syndrome risk: a "big data" analytic approach.

    PubMed

    Steinberg, Gregory B; Church, Bruce W; McCall, Carol J; Scott, Adam B; Kalis, Brian P

    2014-06-01

We applied a proprietary "big data" analytic platform--Reverse Engineering and Forward Simulation (REFS)--to dimensions of metabolic syndrome extracted from a large data set compiled from Aetna's databases for 1 large national customer. Our goals were to accurately predict subsequent risk of metabolic syndrome and its various factors on both a population and individual level. The study data set included demographic, medical claim, pharmacy claim, laboratory test, and biometric screening results for 36,944 individuals. The platform reverse-engineered functional models of systems from diverse and large data sources and provided a simulation framework for insight generation. The platform interrogated data sets from the results of 2 Comprehensive Metabolic Syndrome Screenings (CMSSs) as well as complete coverage records; complete data from medical claims, pharmacy claims, and lab results for 2010 and 2011; and responses to health risk assessment questions. The platform predicted subsequent risk of metabolic syndrome, both overall and by risk factor, on population and individual levels, with ROC/AUC varying from 0.80 to 0.88. We demonstrated that improving waist circumference and blood glucose yielded the largest benefits on subsequent risk and medical costs. We also showed that adherence to prescribed medications and, particularly, adherence to routine scheduled outpatient doctor visits reduced subsequent risk. The platform generated individualized insights using available heterogeneous data within 3 months. The accuracy and short time to insight with this type of analytic platform allowed Aetna to develop targeted cost-effective care management programs for individuals with or at risk for metabolic syndrome.
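
    REFS is proprietary and the Aetna data are not public, but the evaluation step behind figures like "ROC/AUC 0.80-0.88" is generic; the sketch below reproduces it with synthetic features and a plain logistic regression, purely for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for claims/biometric features; names hypothetical.
    rng = np.random.default_rng(0)
    n = 5000
    X = rng.normal(size=(n, 4))          # e.g., waist, glucose, HDL, BP scores
    beta = np.array([1.2, 1.0, -0.6, 0.4])
    p = 1 / (1 + np.exp(-(X @ beta - 1.0)))
    y = rng.binomial(1, p)               # subsequent metabolic-syndrome flag

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"ROC AUC: {auc:.2f}")
    ```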

  3. Analytical assessment of some characteristic ratios for s-wave superconductors

    NASA Astrophysics Data System (ADS)

    Gonczarek, Ryszard; Krzyzosiak, Mateusz; Gonczarek, Adam; Jacak, Lucjan

    2018-04-01

We evaluate some thermodynamic quantities and characteristic ratios that describe low- and high-temperature s-wave superconducting systems. Based on a set of fundamental equations derived within the conformal transformation method, a simple model is proposed and studied analytically. After including a one-parameter class of fluctuations in the density of states, the mathematical structure of the s-wave superconducting gap, the free energy difference, and the specific heat difference is found and discussed in an analytic manner. Both the zero-temperature limit T = 0 and the subcritical temperature range T ≲ Tc are discussed using the method of successive approximations. The equation for the ratio R1, relating the zero-temperature energy gap and the critical temperature, is formulated and solved numerically for various values of the model parameter. Other thermodynamic quantities are analyzed, including a characteristic ratio R2, quantifying the dynamics of the specific heat jump at the critical temperature. It is shown that the obtained model results coincide with experimental data for low-Tc superconductors. The prospect of application of the presented model in studies of high-Tc superconductors and other superconducting systems of the new generation is also discussed.
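
    For orientation, the weak-coupling BCS benchmarks against which such ratios are customarily compared take the universal values below (conventions may differ slightly from the paper's definitions); deviations from these benchmarks are what the fluctuation-modified density of states is invoked to explain.

    ```latex
    R_1 \equiv \frac{2\Delta(0)}{k_B T_c} \approx 3.53, \qquad
    R_2 \equiv \left.\frac{\Delta C}{C_N}\right|_{T_c} \approx 1.43
    ```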

  4. Exploring the Different Trajectories of Analytical Thinking Ability Factors: An Application of the Second-Order Growth Curve Factor Model

    ERIC Educational Resources Information Center

    Saengprom, Narumon; Erawan, Waraporn; Damrongpanit, Suntonrapot; Sakulku, Jaruwan

    2015-01-01

The purposes of this study were 1) to compare analytical thinking ability by testing the same sets of students 5 times and 2) to develop and verify whether the analytical thinking ability of students corresponds to a second-order growth curve factor model. Samples were 1,093 eighth-grade students. The results revealed that 1) analytical thinking ability scores…

  5. Functional neuroimaging correlates of thinking flexibility and knowledge structure in memory: Exploring the relationships between clinical reasoning and diagnostic thinking.

    PubMed

    Durning, Steven J; Costanzo, Michelle E; Beckman, Thomas J; Artino, Anthony R; Roy, Michael J; van der Vleuten, Cees; Holmboe, Eric S; Lipner, Rebecca S; Schuwirth, Lambert

    2016-06-01

Diagnostic reasoning involves the thinking steps up to and including arrival at a diagnosis. Dual process theory posits that a physician's thinking is based on both non-analytic (fast, subconscious) thinking and analytic thinking that is slower, more conscious, effortful, and characterized by comparing and contrasting alternatives. Expertise in clinical reasoning may relate to the two dimensions measured by the diagnostic thinking inventory (DTI): memory structure and flexibility in thinking. We explored the functional magnetic resonance imaging (fMRI) correlates of these two aspects of the DTI. Participants answered and reflected upon multiple-choice questions (MCQs) during fMRI. The DTI was completed shortly after the scan. The brain processes associated with the two dimensions of the DTI were correlated with fMRI phases - assessing flexibility in thinking during analytical clinical reasoning, memory structure during non-analytical clinical reasoning, and the total DTI during both non-analytical and analytical reasoning in experienced physicians. Each DTI component was associated with distinct functional neuroanatomic activation patterns, particularly in the prefrontal cortex. Our findings support diagnostic thinking conceptual models and indicate mechanisms through which cognitive demands may induce functional adaptation within the prefrontal cortex. This provides additional objective validity evidence for the use of the DTI in medical education and practice settings.

  6. Analytic Cognitive Style Predicts Religious and Paranormal Belief

    ERIC Educational Resources Information Center

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J.; Fugelsang, Jonathan A.

    2012-01-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined…

  7. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...

  8. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDES IN SPIKE SAMPLES

    EPA Science Inventory

    The Pesticides in Spikes data set contains the analytical results of measurements of up to 17 pesticides in 12 control samples (spikes) from 11 households. Measurements were made in samples of blood serum. Controls were used to assess recovery of target analytes from a sample m...

  9. FT-midIR determination of fatty acid profiles, including trans fatty acids, in bakery products after focused microwave-assisted Soxhlet extraction.

    PubMed

    Ruiz-Jiménez, J; Priego-Capote, F; Luque de Castro, M D

    2006-08-01

A study of the feasibility of Fourier transform medium infrared spectroscopy (FT-midIR) for analytical determination of fatty acid profiles, including trans fatty acids, is presented. The training and validation sets (75%, or 102 samples, and 25%, or 36 samples, of the samples once the spectral outliers had been removed) used to develop FT-midIR general equations were built with samples from 140 commercial and home-made bakery products. The concentration of the analytes in the samples used for this study is within the typical range found in these kinds of products. Both sets were independent; thus, the validation set was used only for testing the equations. The criterion used for the selection of the validation set was samples with the highest number of neighbours and the most separation between them (H < 0.6). Partial least squares regression and cross-validation were used for multivariate calibration. The FT-midIR method does not require post-extraction manipulation and gives information about the fatty acid profile in two minutes. The 14:0, 16:0, 18:0, 18:1 and 18:2 fatty acids can be determined with excellent precision, and other fatty acids with good precision, according to the Shenk criteria (R² ≥ 0.90 with SEP = 1-1.5 SEL, and R² = 0.70-0.89 with SEP = 2-3 SEL, respectively). The results obtained with the proposed method were compared with those provided by the conventional method based on GC-MS. At the 95% significance level, the differences between the values obtained for the different fatty acids were within the experimental error.
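
    The chemometric step described above (PLS regression with cross-validation) can be sketched as follows; matrix shapes, component count, and the random stand-in data are illustrative assumptions, so real spectra are needed for meaningful R² and SEP values.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # X holds mid-IR spectra (one row per sample), y a fatty-acid content.
    X = np.random.rand(138, 800)   # 138 samples x 800 wavenumber points
    y = np.random.rand(138)        # e.g., 18:1 content (g per 100 g fat)

    pls = PLSRegression(n_components=10)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()  # cross-validation
    r2 = 1 - np.sum((y - y_cv)**2) / np.sum((y - y.mean())**2)
    sep = np.std(y - y_cv, ddof=1)  # standard error of prediction
    print(f"R2 = {r2:.2f}, SEP = {sep:.3f}")
    ```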

  10. Labour Market Driven Learning Analytics

    ERIC Educational Resources Information Center

    Kobayashi, Vladimer; Mol, Stefan T.; Kismihók, Gábor

    2014-01-01

    This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.

  11. ASVCP guidelines: quality assurance for point-of-care testing in veterinary medicine.

    PubMed

    Flatland, Bente; Freeman, Kathleen P; Vap, Linda M; Harr, Kendal E

    2013-12-01

Point-of-care testing (POCT) refers to any laboratory testing performed outside the conventional reference laboratory and implies close proximity to patients. Instrumental POCT systems consist of small, handheld or benchtop analyzers. These have potential utility in many veterinary settings, including private clinics, academic veterinary medical centers, the community (eg, remote area veterinary medical teams), and research applications in academia, government, and industry. Concern about the quality of veterinary in-clinic testing has been expressed in the published veterinary literature; however, little guidance focusing on POCT is available. Recognizing this void, the ASVCP formed a subcommittee in 2009 charged with developing quality assurance (QA) guidelines for veterinary POCT. Guidelines were developed through literature review and a consensus process. Major recommendations include (1) taking a formalized approach to POCT within the facility, (2) use of written policies, standard operating procedures, forms, and logs, (3) operator training, including periodic assessment of skills, (4) assessment of instrument analytical performance and use of both statistical quality control and external quality assessment programs, (5) use of properly established or validated reference intervals, and (6) ensuring accurate patient results reporting. Where possible, given instrument analytical performance, use of a validated 1-3s control rule for interpretation of control data is recommended. These guidelines are aimed at veterinarians and veterinary technicians seeking to improve management of POCT in their clinical or research setting, and address QA of small chemistry and hematology instruments. These guidelines are not intended to be all-inclusive; rather, they provide a minimum standard for maintenance of POCT instruments in the veterinary setting. © 2013 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
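
    The single-rule 1-3s criterion recommended above is mechanically simple: reject the analytical run if any control observation falls outside the established mean ± 3 SD. A minimal sketch, with illustrative control-material numbers:

    ```python
    def violates_13s(control_results, target_mean, target_sd):
        """Flag a run if any control observation falls outside mean +/- 3 SD
        (the single-rule 1-3s criterion)."""
        lo, hi = target_mean - 3 * target_sd, target_mean + 3 * target_sd
        return any(not (lo <= x <= hi) for x in control_results)

    # Example: glucose control material with an established mean of
    # 5.5 mmol/L and SD of 0.2 mmol/L (illustrative numbers).
    print(violates_13s([5.4, 5.7, 6.2], 5.5, 0.2))  # True -> reject the run
    ```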

  12. Mind-sets matter: a meta-analytic review of implicit theories and self-regulation.

    PubMed

    Burnette, Jeni L; O'Boyle, Ernest H; VanEpps, Eric M; Pollack, Jeffrey M; Finkel, Eli J

    2013-05-01

    This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included articles that reported a quantifiable assessment of implicit theories and at least 1 self-regulatory process or outcome. With a random effects approach used, meta-analytic results (total unique N = 28,217; k = 113) across diverse achievement domains (68% academic) and populations (age range = 5-42; 10 different nationalities; 58% from United States; 44% female) demonstrated that implicit theories predict distinct self-regulatory processes, which, in turn, predict goal achievement. Incremental theories, which, in contrast to entity theories, are characterized by the belief that human attributes are malleable rather than fixed, significantly predicted goal setting (performance goals, r = -.151; learning goals, r = .187), goal operating (helpless-oriented strategies, r = -.238; mastery-oriented strategies, r = .227), and goal monitoring (negative emotions, r = -.233; expectations, r = .157). The effects for goal setting and goal operating were stronger in the presence (vs. absence) of ego threats such as failure feedback. Discussion emphasizes how the present theoretical analysis merges an implicit theory perspective with self-control theory to advance scholarship and unlock major new directions for basic and applied research.
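
    As a sketch of the pooling step behind correlations like those reported above, the following applies a standard DerSimonian-Laird random-effects model to Fisher-transformed correlations; it is a generic textbook procedure, not necessarily the authors' exact pipeline, and the input values are invented.

    ```python
    import numpy as np

    def random_effects_r(rs, ns):
        """DerSimonian-Laird random-effects pooling of correlations
        via Fisher's z-transform."""
        rs, ns = np.asarray(rs, float), np.asarray(ns, float)
        z = np.arctanh(rs)                 # Fisher z-transform
        v = 1.0 / (ns - 3)                 # within-study variance of z
        w = 1.0 / v
        z_fixed = np.sum(w * z) / np.sum(w)
        q = np.sum(w * (z - z_fixed) ** 2)         # heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(rs) - 1)) / c)   # between-study variance
        w_star = 1.0 / (v + tau2)                  # random-effects weights
        z_re = np.sum(w_star * z) / np.sum(w_star)
        return np.tanh(z_re)               # back-transform to r

    print(random_effects_r([0.15, 0.25, 0.20], [120, 300, 80]))
    ```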

  13. An Analytical and Experimental Investigation of FM-by-Noise Jamming

    DTIC Science & Technology

    1992-12-01

connected with this research. Preface: This investigation had two major goals. The first goal was to provide a lucid, complete, and authoritative...duplicate the experiments and noise quality measurements are included. Complete listings of the programs written for this investigation are contained...written specifically for a cleared audience in a mutually-understood, classified setting. This thesis seeks to provide a lucid, complete, and authoritative

  14. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    NASA Astrophysics Data System (ADS)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.
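
    For context, the analytic form most often used as a baseline in this literature is the generalized Wigner surmise (the paper's finite-range treatment generalizes beyond pure inverse-square repulsion):

    ```latex
    P(s) = a\, s^{\varrho} \exp\!\left(-b\, s^{2}\right),
    ```

    where s is the terrace width in units of its mean, the exponent ϱ encodes the strength of the step-step repulsion, and a and b are fixed by the normalization conditions ⟨1⟩ = ⟨s⟩ = 1.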

  15. Analytical modeling of flash-back phenomena. [premixed/prevaporized combustion system

    NASA Technical Reports Server (NTRS)

    Feng, C. C.

    1979-01-01

To understand flame flash-back phenomena more extensively, an analytical model was formulated and a numerical program was written and tested to solve the set of differential equations describing the model. Results show that, under a given set of conditions, a flame propagates in the boundary layer on a flat plate when the free stream is at or below 1.8 m/s.

  16. Chemistry Data for Geothermometry Mapping of Deep Hydrothermal Reservoirs in Southeastern Idaho

    DOE Data Explorer

    Earl Mattson

    2016-01-18

This dataset includes the chemistry of geothermal water samples from the Eastern Snake River Plain and surrounding area. The samples included in this dataset were collected during the springs and summers of 2014 and 2015. All chemical analyses of the samples were conducted in the Analytical Laboratory at the Center for Advanced Energy Studies in Idaho Falls, Idaho. This data set supersedes the #425 submission and is the final submission for AOP 3.1.2.1 for INL. Isotopic data collected by Mark Conrad will be submitted in a separate file.

  17. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact, all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.

  18. BiSet: Semantic Edge Bundling with Biclusters for Sensemaking.

    PubMed

    Sun, Maoyuan; Mi, Peng; North, Chris; Ramakrishnan, Naren

    2016-01-01

Identifying coordinated relationships is an important task in data analytics. For example, an intelligence analyst might want to discover three suspicious people who all visited the same four cities. Existing techniques that display individual relationships, such as between lists of entities, require repetitious manual selection and significant mental aggregation in cluttered visualizations to find coordinated relationships. In this paper, we present BiSet, a visual analytics technique to support interactive exploration of coordinated relationships. In BiSet, we model coordinated relationships as biclusters and algorithmically mine them from a dataset. Then, we visualize the biclusters in context as bundled edges between sets of related entities. Thus, bundles enable analysts to infer task-oriented semantic insights about potentially coordinated activities. We make bundles first-class objects and add a new layer, "in-between", to contain these bundle objects. On this basis, bundles serve to organize entities represented in lists and visually reveal their membership. Users can interact with edge bundles to organize related entities, and vice versa, for sensemaking purposes. With a usage scenario, we demonstrate how BiSet supports the exploration of coordinated relationships in text analytics.

  19. Barycentric parameterizations for isotropic BRDFs.

    PubMed

    Stark, Michael M; Arvo, James; Smits, Brian

    2005-01-01

A bidirectional reflectance distribution function (BRDF) is often expressed as a function of four real variables: two spherical coordinates in each of the "incoming" and "outgoing" directions. However, many BRDFs reduce to functions of fewer variables. For example, isotropic reflection can be represented by a function of three variables. Some BRDF models can be reduced further. In this paper, we introduce new sets of coordinates which we use to reduce the dimensionality of several well-known analytic BRDFs as well as empirically measured BRDF data. The proposed coordinate systems are barycentric with respect to a triangular support with a direct physical interpretation. One coordinate set is based on the BRDF model proposed by Lafortune. Another set, based on a model of Ward, is associated with the "halfway" vector common in analytical BRDF formulas. Through these coordinate sets we establish lower bounds on the approximation error inherent in the models on which they are based. We present a third set of coordinates, not based on any analytical model, that performs well in approximating measured data. Finally, our proposed variables suggest novel ways of constructing and visualizing BRDFs.
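
    The "halfway" vector underlying the Ward-based coordinate set is simply the normalized bisector of the incoming and outgoing directions; the sketch below computes it (the example directions are assumed, and the paper's specific barycentric mapping is not reproduced here).

    ```python
    import numpy as np

    def halfway(w_in, w_out):
        """Unit halfway vector h = (w_in + w_out) / |w_in + w_out|."""
        h = np.asarray(w_in, float) + np.asarray(w_out, float)
        return h / np.linalg.norm(h)

    w_i = np.array([0.0, 0.6, 0.8])   # unit incoming direction (assumed)
    w_o = np.array([0.0, -0.6, 0.8])  # unit outgoing direction (assumed)
    print(halfway(w_i, w_o))          # -> [0, 0, 1], the specular normal
    ```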

  20. Ab Initio and Analytic Intermolecular Potentials for Ar-CF₄

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vayner, Grigoriy; Alexeev, Yuri; Wang, Jiangping

    2006-03-09

Ab initio calculations at the CCSD(T) level of theory are performed to characterize the Ar + CF₄ intermolecular potential. Extensive calculations, with and without a correction for basis set superposition error (BSSE), are performed with the cc-pVTZ basis set. Additional calculations are performed with other correlation consistent (cc) basis sets to extrapolate the Ar-CF₄ potential energy minimum to the complete basis set (CBS) limit. Both the size of the basis set and BSSE have substantial effects on the Ar + CF₄ potential. Calculations with the cc-pVTZ basis set and without a BSSE correction appear to give a good representation of the potential at the CBS limit and with a BSSE correction. In addition, MP2 theory is found to give potential energies in very good agreement with those determined by the much higher level CCSD(T) theory. Two analytic potential energy functions were determined for Ar + CF₄ by fitting the cc-pVTZ calculations both with and without a BSSE correction. These analytic functions were written as a sum of two-body potentials, and excellent fits to the ab initio potentials were obtained by representing each two-body interaction as a Buckingham potential.
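
    The Buckingham form referred to above is standard; in a pairwise-additive representation the Ar-CF₄ interaction is written as a sum over the five Ar-atom pairs,

    ```latex
    V_{\mathrm{Ar\text{-}CF_4}} = \sum_{i=1}^{5} V_2(r_{\mathrm{Ar},i}),
    \qquad
    V_2(r) = A\, e^{-B r} - \frac{C}{r^{6}},
    ```

    with separate (A, B, C) parameter sets for the Ar-C and Ar-F interactions; the fitted constants themselves are given in the paper.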

  1. bigSCale: an analytical framework for big-scale single-cell data.

    PubMed

    Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger

    2018-06-01

Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines the speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin (Reln)-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. © 2018 Iacono et al.; Published by Cold Spring Harbor Laboratory Press.

  2. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gives insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use big data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.

  3. Ensembl Genomes 2013: scaling up access to genome-wide data.

    PubMed

    Kersey, Paul Julian; Allen, James E; Christensen, Mikkel; Davis, Paul; Falin, Lee J; Grabmueller, Christoph; Hughes, Daniel Seth Toney; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Langridge, Nicholas; McDowall, Mark D; Maheswari, Uma; Maslen, Gareth; Nuhn, Michael; Ong, Chuang Kee; Paulini, Michael; Pedro, Helder; Toneva, Iliana; Tuli, Mary Ann; Walts, Brandon; Williams, Gareth; Wilson, Derek; Youens-Clark, Ken; Monaco, Marcela K; Stein, Joshua; Wei, Xuehong; Ware, Doreen; Bolser, Daniel M; Howe, Kevin Lee; Kulesha, Eugene; Lawson, Daniel; Staines, Daniel Michael

    2014-01-01

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species. The project exploits and extends technologies for genome annotation, analysis and dissemination, developed in the context of the vertebrate-focused Ensembl project, and provides a complementary set of resources for non-vertebrate species through a consistent set of programmatic and interactive interfaces. These provide access to data including reference sequence, gene models, transcriptional data, polymorphisms and comparative analysis. This article provides an update to the previous publications about the resource, with a focus on recent developments. These include the addition of important new genomes (and related data sets) including crop plants, vectors of human disease and eukaryotic pathogens. In addition, the resource has scaled up its representation of bacterial genomes, and now includes the genomes of over 9000 bacteria. Specific extensions to the web and programmatic interfaces have been developed to support users in navigating these large data sets. Looking forward, analytic tools to allow targeted selection of data for visualization and download are likely to become increasingly important in future as the number of available genomes increases within all domains of life, and some of the challenges faced in representing bacterial data are likely to become commonplace for eukaryotes in future.

  4. Load sharing in distributed real-time systems with state-change broadcasts

    NASA Technical Reports Server (NTRS)

    Shin, Kang G.; Chang, Yi-Chieh

    1989-01-01

A decentralized dynamic load-sharing (LS) method based on state-change broadcasts is proposed for a distributed real-time system. Whenever the state of a node changes from underloaded to fully loaded and vice versa, the node broadcasts this change to a set of nodes, called a buddy set, in the system. The performance of the method is evaluated with both analytic modeling and simulation. It is modeled first by an embedded Markov chain for which numerical solutions are derived. The model solutions are then used to calculate the distribution of queue lengths at the nodes and the probability of meeting task deadlines. The analytical results show that buddy sets of 10 nodes outperform those of fewer than 10 nodes, and the incremental benefit gained from increasing the buddy set size beyond 15 nodes is insignificant. These and other analytical results are verified by simulation. The proposed LS method is shown to meet task deadlines with a very high probability.
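
    A toy discrete-time simulation conveys the mechanism: nodes broadcast only underloaded/fully-loaded state changes to their buddy set, and an overloaded node transfers an arriving task to a buddy it believes is underloaded. The threshold, arrival rate, and buddy-set size below are illustrative assumptions, not the paper's parameters.

    ```python
    import random

    random.seed(1)
    N, BUDDY_SIZE, THRESH, SLOTS = 64, 10, 3, 10_000
    queue = [0] * N
    buddies = [random.sample([j for j in range(N) if j != i], BUDDY_SIZE)
               for i in range(N)]
    underloaded = [True] * N     # each node's state, as known from broadcasts
    transfers = stuck = broadcasts = 0

    for _ in range(SLOTS):
        for i in range(N):
            if random.random() < 0.7:                  # task arrival at node i
                if queue[i] >= THRESH:                 # node i is fully loaded
                    targets = [j for j in buddies[i] if underloaded[j]]
                    if targets:
                        queue[random.choice(targets)] += 1   # transfer task
                        transfers += 1
                    else:
                        queue[i] += 1                  # no known receiver
                        stuck += 1
                else:
                    queue[i] += 1
        for i in range(N):
            if queue[i]:
                queue[i] -= 1                          # serve one task per slot
            state = queue[i] < THRESH
            if state != underloaded[i]:                # state change: broadcast
                underloaded[i] = state
                broadcasts += 1

    print(f"transfers={transfers} stuck={stuck} broadcasts={broadcasts}")
    ```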

  5. Designing a Marketing Analytics Course for the Digital Age

    ERIC Educational Resources Information Center

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  6. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDE METABOLITES IN SPIKE SAMPLES

    EPA Science Inventory

    The Pesticides in Spikes data set contains the analytical results of measurements of up to 17 pesticides in 12 control samples (spikes) from 11 households. Measurements were made in samples of blood serum. Controls were used to assess recovery of target analytes from a sample m...

  7. Investigation of characteristics of feed system instabilities

    NASA Technical Reports Server (NTRS)

    Vaage, R. D.; Fidler, L. E.; Zehnle, R. A.

    1972-01-01

    The relationship between the structural and feed system natural frequencies in structure-propulsion system coupled longitudinal oscillations (pogo) is investigated. The feed system frequencies are usually very dependent upon the compressibility (compliance) of cavitation bubbles that exist to some extent in all operating turbopumps. This document includes: a complete review of cavitation mechanisms; development of a turbopump cavitation compliance model; an accumulation and analysis of all available cavitation compliance test data; and a correlation of empirical-analytical results. The analytical model is based on the analysis of flow relative to a set of cascaded blades, having any described shape, and assumes phase changes occur under conditions of isentropic equilibrium. Analytical cavitation compliance predictions for the J-2 LOX, F-1 LOX, H-1 LOX and LR87 oxidizer turbopump inducers do not compare favorably with test data. The model predicts much less cavitation than is derived from the test data. This implies that mechanisms other than blade cavitation contribute significantly to the total amount of turbopump cavitation.

  8. Analytical Optimization of the Net Residual Dispersion in SPM-Limited Dispersion-Managed Systems

    NASA Astrophysics Data System (ADS)

    Xiao, Xiaosheng; Gao, Shiming; Tian, Yu; Yang, Changxi

    2006-05-01

    Dispersion management is an effective technique to suppress the nonlinear impairment in fiber transmission systems, which includes tuning the amounts of precompensation, residual dispersion per span (RDPS), and net residual dispersion (NRD) of the systems. For self-phase modulation (SPM)-limited systems, optimizing the NRD is necessary because it can greatly improve the system performance. In this paper, an analytical method is presented to optimize NRD for SPM-limited dispersion-managed systems. The method is based on the correlation between the nonlinear impairment and the output pulse broadening of SPM-limited systems; therefore, dispersion-managed systems can be optimized through minimizing the output single-pulse broadening. A set of expressions is derived to calculate the output pulse broadening of the SPM-limited dispersion-managed system, from which the analytical result of optimal NRD is obtained. Furthermore, with the expressions of pulse broadening, how the nonlinear impairment depends on the amounts of precompensation and RDPS can be revealed conveniently.
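
    The paper's pulse-broadening expressions are not reproduced here, but the classic result they build on is the broadening factor of a chirped Gaussian pulse under group-velocity dispersion (SPM enters through the accumulated nonlinear chirp C):

    ```latex
    \frac{T_1(z)}{T_0} =
    \sqrt{\left(1 + \frac{C\,\beta_2 z}{T_0^{2}}\right)^{2}
          + \left(\frac{\beta_2 z}{T_0^{2}}\right)^{2}},
    ```

    where T0 is the input half-width, β2 the GVD parameter, and z the propagation distance; the optimal NRD is then the value that minimizes the accumulated broadening at the receiver.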

  9. Electrochemical determination of inorganic mercury and arsenic--A review.

    PubMed

    Zaib, Maria; Athar, Muhammad Makshoof; Saeed, Asma; Farooq, Umar

    2015-12-15

"Inorganic mercury and arsenic" is a term encompassing the As(III), As(V) and Hg(II) species. These metal ions have been extensively studied due to their toxicity-related issues. Different analytical methods are used to monitor inorganic mercury and arsenic in a variety of samples at trace level. The present study reviews various analytical techniques available for the detection of inorganic mercury and arsenic, with particular emphasis on electrochemical methods, especially stripping voltammetry. A detailed critical evaluation of methods, the advantages of electrochemical methods over other analytical methods, and the various electrode materials available for mercury and arsenic analysis are presented in this review. Modified carbon paste electrodes provide better determination due to better deposition, with a linear and improved response under the studied set of conditions. Biological materials may be a potent and economical alternative to macro-electrodes and chemically modified carbon paste electrodes in the stripping analysis of inorganic mercury and arsenic. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Generalized bipartite quantum state discrimination problems with sequential measurements

    NASA Astrophysics Data System (ADS)

    Nakahira, Kenji; Kato, Kentaro; Usuda, Tsuyoshi Sasaki

    2018-02-01

    We investigate an optimization problem of finding quantum sequential measurements, which forms a wide class of state discrimination problems with the restriction that only local operations and one-way classical communication are allowed. Sequential measurements from Alice to Bob on a bipartite system are considered. Using the fact that the optimization problem can be formulated as a problem with only Alice's measurement and is convex programming, we derive its dual problem and necessary and sufficient conditions for an optimal solution. Our results are applicable to various practical optimization criteria, including the Bayes criterion, the Neyman-Pearson criterion, and the minimax criterion. In the setting of the problem of finding an optimal global measurement, its dual problem and necessary and sufficient conditions for an optimal solution have been widely used to obtain analytical and numerical expressions for optimal solutions. Similarly, our results are useful to obtain analytical and numerical expressions for optimal sequential measurements. Examples in which our results can be used to obtain an analytical expression for an optimal sequential measurement are provided.
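
    For comparison, the unrestricted (global-measurement) primal under the Bayes criterion is the familiar semidefinite program below; the paper's contribution is the analogue of this dual machinery when the POVM is constrained to sequential, one-way-communication form.

    ```latex
    \max_{\{\Pi_i\}} \ \sum_i p_i \,\mathrm{Tr}\!\left(\Pi_i \rho_i\right)
    \quad \text{subject to} \quad \Pi_i \succeq 0, \quad \sum_i \Pi_i = I,
    ```

    where the ρ_i are the candidate states with prior probabilities p_i and {Π_i} is the measurement to be optimized.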

  11. A global multicenter study on reference values: 2. Exploration of sources of variation across the countries.

    PubMed

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Shimizu, Yoshihisa; Xia, Liangyu; Hoffmann, Mariza; Shah, Swarup; Matsha, Tandi; Wassung, Janette; Smit, Francois; Ruzhanskaya, Anna; Straseski, Joely; Bustos, Daniel N; Kimura, Shogo; Takahashi, Aki

    2017-04-01

The intent of this study, based on a global multicenter study of reference values (RVs) for serum analytes, was to explore biological sources of variation (SVs) of the RVs among 12 countries around the world. As described in the first part of this paper, RVs of 50 major serum analytes from 13,396 healthy individuals living in 12 countries were obtained. Analyzed in this study were 23 clinical chemistry analytes and 8 analytes measured by immunoturbidimetry. Multiple regression analysis was performed for each gender, country by country and analyte by analyte, by setting four major SVs (age, BMI, and levels of drinking and smoking) as a fixed set of explanatory variables. For analytes with skewed distributions, log-transformation was applied. The association of each source of variation with RVs was expressed as the partial correlation coefficient (rp). Obvious gender and age-related changes in the RVs were observed in many analytes, almost consistently between countries. Compilation of age-related variations of RVs after adjusting for between-country differences revealed peculiar patterns specific to each analyte. Judged from the rp, BMI-related changes were observed for many nutritional and inflammatory markers in almost all countries. However, the slope of the linear regression of BMI vs. RV differed greatly among countries for some analytes. Alcohol- and smoking-related changes were observed less conspicuously and in a limited number of analytes. The features of sex, age, alcohol, and smoking-related changes in RVs of the analytes were largely comparable worldwide. The finding of differences in BMI-related changes among countries for some analytes is quite relevant to understanding ethnic differences in susceptibility to nutritionally related diseases. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Adequate mathematical modelling of environmental processes

    NASA Astrophysics Data System (ADS)

    Chashechkin, Yu. D.

    2012-04-01

In environmental observations and laboratory visualization, both large-scale flow components, like currents, jets, vortices, and waves, and a fine structure are registered (different examples are given). Conventional mathematical modeling, both analytical and numerical, is directed mostly at the description of energetically important flow components; the role of fine structure still remains obscure. The variety of existing models makes it difficult to choose the most adequate one and to assess their mutual degree of correspondence. The goal of the talk is to give a scrutinizing analysis of the kinematics and dynamics of flows. A difference between the concept of "motion", as a transformation of a vector space into itself with distance conservation, and the concept of "flow", as displacement and rotation of deformable "fluid particles", is underlined. Basic physical quantities of the flow, namely density, momentum, energy (entropy), and admixture concentration, are selected as physical parameters defined by the fundamental set, which includes the differential D'Alembert, Navier-Stokes, Fourier and/or Fick equations and a closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups have shown that only the fundamental set is characterized by the ten-parameter Galilean group reflecting the basic principles of mechanics. The presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing equation sets, which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account the condition of compatibility. The high order of the set indicates the complex structure of complete solutions, corresponding to the physical structure of real flows. Analytical solutions of a number of problems, including flows induced by diffusion on topography and generation of periodic internal waves by compact sources in weakly dissipative media, as well as numerical solutions of the same problems, are constructed. They include a regular perturbed function describing the large-scale component and a rich family of singular perturbed functions corresponding to fine flow components. Solutions are compared with data of laboratory experiments performed on the facilities of USU "HPC IPMec RAS" with support from the Ministry of Education and Science RF (Goscontract No. 16.518.11.7059). Related problems of completeness and accuracy of laboratory and environmental measurements are discussed.

  13. Development of a new procedure for the determination of captopril in pharmaceutical formulations employing chemiluminescence and a multicommuted flow analysis approach.

    PubMed

    Lima, Manoel J A; Fernandes, Ridvan N; Tanaka, Auro A; Reis, Boaventura F

    2016-02-01

    This paper describes a new technique for the determination of captopril in pharmaceutical formulations, implemented by employing multicommuted flow analysis. The analytical procedure was based on the reaction between hypochlorite and captopril. The remaining hypochlorite oxidized luminol that generated electromagnetic radiation detected using a homemade luminometer. To the best of our knowledge, this is the first time that this reaction has been exploited for the determination of captopril in pharmaceutical products, offering a clean analytical procedure with minimal reagent usage. The effectiveness of the proposed procedure was confirmed by analyzing a set of pharmaceutical formulations. Application of the paired t-test showed that there was no significant difference between the data sets at a 95% confidence level. The useful features of the new analytical procedure included a linear response for captopril concentrations in the range 20.0-150.0 µmol/L (r = 0.997), a limit of detection (3σ) of 2.0 µmol/L, a sample throughput of 164 determinations per hour, reagent consumption of 9 µg luminol and 42 µg hypochlorite per determination and generation of 0.63 mL of waste. A relative standard deviation of 1% (n = 6) for a standard solution containing 80 µmol/L captopril was also obtained. Copyright © 2015 John Wiley & Sons, Ltd.
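
    The figures of merit quoted above follow standard definitions; the sketch below fits a calibration line and estimates a 3σ detection limit from an assumed blank standard deviation (all numbers illustrative, not the paper's raw data).

    ```python
    import numpy as np

    conc = np.array([20, 40, 60, 80, 100, 120, 150.0])  # umol/L captopril
    signal = np.array([8.1, 16.3, 23.9, 32.2, 40.0, 48.5, 60.2])  # emission, a.u.

    slope, intercept = np.polyfit(conc, signal, 1)   # calibration line
    r = np.corrcoef(conc, signal)[0, 1]              # linearity check
    sigma_blank = 0.27          # SD of the blank emission (assumed)
    lod = 3 * sigma_blank / slope                    # 3-sigma detection limit
    print(f"r = {r:.4f}, LOD = {lod:.1f} umol/L")
    ```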

  14. Analytical flow duration curves for summer streamflow in Switzerland

    NASA Astrophysics Data System (ADS)

    Santos, Ana Clara; Portela, Maria Manuela; Rinaldo, Andrea; Schaefli, Bettina

    2018-04-01

This paper proposes a systematic assessment of the performance of an analytical modeling framework for streamflow probability distributions for a set of 25 Swiss catchments. These catchments show a wide range of hydroclimatic regimes, most notably including snow-influenced streamflows. The model parameters are calculated from a spatially averaged gridded daily precipitation data set and from observed daily discharge time series, both in a forward estimation mode (direct parameter calculation from observed data) and in an inverse estimation mode (maximum likelihood estimation). The performance of the linear and the nonlinear model versions is assessed in terms of reproducing observed flow duration curves and their natural variability. Overall, the nonlinear model version outperforms the linear model for all regimes, but the linear model shows a notable performance increase with catchment elevation. More importantly, the obtained results demonstrate that the analytical model performs well for summer discharge for all analyzed streamflow regimes, ranging from rainfall-driven regimes with summer low flow to snow and glacier regimes with summer high flow. These results suggest that the model's encoding of discharge-generating events based on stochastic soil moisture dynamics is more flexible than previously thought. As shown in this paper, the presence of snowmelt or ice melt is accommodated by a relative increase in the discharge-generating frequency, a key parameter of the model. Explicit quantification of this frequency increase as a function of mean catchment meteorological conditions is left for future research.

  15. The Geochemical Databases GEOROC and GeoReM - What's New?

    NASA Astrophysics Data System (ADS)

    Sarbas, B.; Jochum, K. P.; Nohl, U.; Weis, U.

    2017-12-01

The geochemical databases GEOROC (http://georoc.mpch-mainz.gwdg.de) and GeoReM (http://georem.mpch-mainz.gwdg.de) are maintained by the Max Planck Institute for Chemistry in Mainz, Germany. Both online databases have become crucial tools for geoscientists from different research areas. They are regularly upgraded with new tools and new data from recent publications obtained from a wide range of international journals. GEOROC is a collection of published analyses of volcanic rocks and mantle xenoliths. Recently, data for plutonic rocks have been added. The analyses include major and trace element concentrations, radiogenic and non-radiogenic isotope ratios, as well as analytical ages for whole rocks, glasses, minerals and inclusions. Samples come from eleven geological settings and span the whole geological age scale from Archean to Recent. Metadata include, among others, geographic location, rock class and rock type, geological age, degree of alteration, analytical method, laboratory, and reference. The GEOROC web page allows selection of samples by geological setting, geography, chemical criteria, rock or sample name, and bibliographic criteria. In addition, it provides a large number of precompiled files for individual locations, minerals and rock classes. GeoReM is a database collecting information about reference materials of geological and environmental interest, such as rock powders, synthetic and natural glasses, as well as mineral, isotopic, biological, river water and seawater reference materials. It contains published data and compilation values (major and trace element concentrations and mass fractions, radiogenic and stable isotope ratios). Metadata comprise, among others, uncertainty, analytical method and laboratory. Reference materials are important for calibration, method validation, quality control and establishing metrological traceability. GeoReM offers six different search strategies: samples or materials (published values), samples (GeoReM preferred values), chemical criteria, chemical criteria based on bibliography, bibliography, as well as methods and institutions.

  16. Pair mobility functions for rigid spheres in concentrated colloidal dispersions: Stresslet and straining motion couplings

    NASA Astrophysics Data System (ADS)

    Su, Yu; Swan, James W.; Zia, Roseanna N.

    2017-03-01

Accurate modeling of particle interactions arising from hydrodynamic, entropic, and other microscopic forces is essential to understanding and predicting particle motion and suspension behavior in complex and biological fluids. The long-range nature of hydrodynamic interactions can be particularly challenging to capture. In dilute dispersions, pair-level interactions are sufficient and can be modeled in detail by analytical relations derived by Jeffrey and Onishi [J. Fluid Mech. 139, 261-290 (1984)] and Jeffrey [Phys. Fluids A 4, 16-29 (1992)]. In more concentrated dispersions, analytical modeling of many-body hydrodynamic interactions quickly becomes intractable, leading to the development of simplified models. These include mean-field approaches that smear out particle-scale structure and essentially assume that long-range hydrodynamic interactions are screened by crowding, as particle mobility decays at high concentrations. Toward the development of an accurate and simplified model for the hydrodynamic interactions in concentrated suspensions, we recently computed a set of effective pair hydrodynamic functions coupling particle motion to a hydrodynamic force and torque at volume fractions up to 50%, utilizing accelerated Stokesian dynamics and a fast stochastic sampling technique [Zia et al., J. Chem. Phys. 143, 224901 (2015)]. We showed that the hydrodynamic mobility in suspensions of colloidal spheres is not screened, and the power-law decay of the hydrodynamic functions persists at all concentrations studied. In the present work, we extend these mobility functions to include the couplings of particle motion and straining flow to the hydrodynamic stresslet. The couplings computed in these two articles constitute a set of orthogonal coupling functions that can be utilized to compute equilibrium properties in suspensions at arbitrary concentration and are readily applied to solve many-body hydrodynamic interactions analytically.

  17. Tidally averaged circulation in Puget Sound sub-basins: Comparison of historical data, analytical model, and numerical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khangaonkar, Tarang; Yang, Zhaoqing; Kim, Tae Yun

    2011-07-20

Through extensive field data collection and analysis efforts conducted since the 1950s, researchers have established an understanding of the characteristic features of circulation in Puget Sound. The pattern ranges from the classic fjordal behavior in some basins, with shallow brackish outflow and compensating inflow immediately below, to the typical two-layer flow observed in many partially mixed estuaries with saline inflow at depth. An attempt at reproducing this behavior by fitting an analytical formulation to past data is presented, followed by the application of a three-dimensional circulation and transport numerical model. The analytical treatment helped identify key physical processes and parameters, but quickly reconfirmed that the response is complex and would require site-specific parameterization to include effects of sills and interconnected basins. The numerical model of Puget Sound, developed using an unstructured-grid finite volume method, allowed resolution of the sub-basin geometric features, including the presence of major islands, and site-specific strong advective vertical mixing created by bathymetry and multiple sills. The model was calibrated using available recent short-term oceanographic time series data sets from different parts of the Puget Sound basin. The results are compared against (1) recent velocity and salinity data collected in Puget Sound from 2006 and (2) a composite data set from previously analyzed historical records, mostly from the 1970s. The results highlight the ability of the model to reproduce velocity and salinity profile characteristics, their variations among Puget Sound sub-basins, and tidally averaged circulation. Sensitivity of residual circulation to variations in freshwater inflow and the resulting salinity gradient in fjordal sub-basins of Puget Sound is examined.

  18. Analytical performance assessment of orbital configurations

    NASA Astrophysics Data System (ADS)

    Hitzl, D. L.; Krakowski, D. C.

    1981-08-01

    The system analysis of an orbital communication network of N satellites has been conducted. Specifically, the problem of connecting, in an optimal way, a set of ground-based laser transmitters to a second set of ground receivers on another part of the earth via a number of relay satellites fitted with retroreflectors has been addressed. A computer program has been developed which can treat either the so-called 'single-bounce' or 'double-bounce' cases. Sample results included in this paper consider a double-bounce orbital network composed of 12 relay satellites in 6 hour elliptical orbits together with 16 transceiver (delivery) satellites in 4.8 hour elliptical orbits.

  19. FLORIDA TOWER FOOTPRINT EXPERIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON,T.B.; DIETZ, R.N.; WILKE, R.

    2007-01-01

The Florida Footprint experiments were a series of field programs in which perfluorocarbon tracers were released in different configurations centered on a flux tower to generate a data set that can be used to test transport and dispersion models. These models are used to determine the sources of the CO2 that cause the fluxes measured at eddy covariance towers. Experiments were conducted in a managed slash pine forest, 10 km northeast of Gainesville, Florida, in 2002, 2004, and 2006, and in atmospheric conditions that ranged from well mixed to very stable, including the transition period between convective conditions at midday and stable conditions after sunset. There were a total of 15 experiments. The characteristics of the PFTs, details of sampling and analysis methods, quality control measures, and analytical statistics including confidence limits are presented. Details of the field programs, including tracer release rates, tracer source configurations, and configuration of the samplers, are discussed. The result of this experiment is a high-quality, well-documented tracer and meteorological data set that can be used to improve and validate canopy dispersion models.

  20. Throughput and latency programmable optical transceiver by using DSP and FEC control.

    PubMed

    Tanimura, Takahito; Hoshida, Takeshi; Kato, Tomoyuki; Watanabe, Shigeki; Suzuki, Makoto; Morikawa, Hiroyuki

    2017-05-15

We propose and experimentally demonstrate a proof of concept of a programmable optical transceiver that enables simultaneous optimization of multiple programmable parameters (modulation format, symbol rate, power allocation, and FEC) to satisfy throughput, signal quality, and latency requirements. The proposed optical transceiver also accommodates multiple sub-channels that can transport different optical signals with different requirements. The multiple degrees of freedom of the parameters often make it difficult to find the optimum combination, due to an explosion of the number of combinations. The proposed optical transceiver reduces the number of combinations and finds feasible sets of programmable parameters by using constraints on the parameters combined with a precise analytical model. For precise BER prediction with a specified set of parameters, we model the sub-channel BER as a function of OSNR, modulation format, symbol rate, and power difference between sub-channels. Next, we formulate simple constraints on the parameters and combine the constraints with the analytical model to seek feasible sets of programmable parameters. Finally, we experimentally demonstrate the end-to-end operation of the proposed optical transceiver in an offline manner, including low-density parity-check (LDPC) FEC encoding and decoding, under a specific use case with a latency-sensitive application and 40-km transmission.
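
    A toy version of the pruning idea reads as follows: enumerate candidate operating points, discard those violating simple throughput and latency constraints, and pass only the survivors to the precise BER model. Formats, rates, overheads, and limits are hypothetical placeholders, not the paper's parameter set.

    ```python
    from itertools import product

    FORMATS = {"QPSK": 2, "16QAM": 4, "64QAM": 6}     # bits per symbol
    SYMBOL_RATES = [16e9, 32e9, 64e9]                 # baud
    FEC_OVERHEADS = {"low-latency": 0.07, "strong": 0.25}

    def feasible(need_gbps=200, max_overhead=0.20):
        """Yield parameter sets surviving the throughput/latency constraints."""
        for (fmt, bps), rs, (fec, oh) in product(
                FORMATS.items(), SYMBOL_RATES, FEC_OVERHEADS.items()):
            net = bps * rs / (1 + oh) / 1e9           # net data rate, Gb/s
            if net < need_gbps:       # throughput constraint
                continue
            if oh > max_overhead:     # latency proxy: cap FEC overhead
                continue
            yield fmt, rs, fec, round(net, 1)

    for candidate in feasible():
        print(candidate)  # survivors go on to the analytical BER model
    ```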

  1. Neutron activation and other analytical data for plutonic rocks from North America and Africa. National Uranium Resource Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, V.; Fay, W.M.; Cook, J.R.

    1982-09-01

    The objective of this report is to retrieve the elements of an analytical study of granites and other associated plutonic rocks which was begun as a part of the U.S. Department of Energy's National Uranium Resource Evaluation (NURE) program. A discussion of the Savannah River Laboratory (SRL) neutron activation analysis system is given so that a user will understand the limitations of the data. Enough information is given so that an experienced geochemist can clean up the data set to the extent required by any project. The data are generally good as they are presented. It is intended that the data be read from a magnetic tape written to accompany this report. Microfiche tables of the data follow the text. These tables were prepared from data on the tape, and programs which will read the tape are presented in the section THE DATA TAPE. It is our intent to write a later paper which will include a thoroughly scrubbed data set and a technical discussion of results of the study. 1 figure.

  2. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with that of the first-principles standardless analysis procedure DTSA on a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08, and 0.35 in 66% of the cases for POEMA, GENESIS, and DTSA, respectively.
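
    A minimal sketch of the underlying idea, fitting an assumed analytical spectrum model to a measured spectrum by least squares; the two-Gaussian model and all values are illustrative, not POEMA's actual parameterization:

        import numpy as np
        from scipy.optimize import curve_fit

        def model(e, c1, c2, width):
            """Toy analytical spectrum: two Gaussian characteristic lines whose
            amplitudes play the role of elemental concentrations."""
            return (c1 * np.exp(-(e - 1.74)**2 / (2 * width**2)) +  # Si K-alpha, keV
                    c2 * np.exp(-(e - 6.40)**2 / (2 * width**2)))   # Fe K-alpha, keV

        energy = np.linspace(1.0, 8.0, 400)
        rng = np.random.default_rng(0)
        measured = model(energy, 3.0, 1.5, 0.08) + rng.normal(0, 0.02, energy.size)

        popt, pcov = curve_fit(model, energy, measured, p0=[1.0, 1.0, 0.1])
        print(popt, np.sqrt(np.diag(pcov)))   # estimates and their uncertainties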

  3. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge and application gaps. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  4. The pitfalls of hair analysis for toxicants in clinical practice: three case reports.

    PubMed Central

    Frisch, Melissa; Schwartz, Brian S

    2002-01-01

    Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463

  5. Ecological Risk Assessment of Explosive Residues in Rodents, Reptiles, Amphibians, and Fish

    DTIC Science & Technology

    2004-03-01

    [Fragmented indexing excerpts] oligonucleotide primers were designed according to the sequence for pendrin in Mus musculus, and PCR was carried out using a Failsafe kit (Epicentre, WI)... Project No. T9700, PERCHLORATE ANALYTICAL, Phase V: as a calibration curve is run each time a set of samples is analyzed, we routinely include an... FINAL REPORT FY2002, SERDP Project ER-1235: IDENTIFICATION OF PERCHLORATE-CONTAMINATED AND REFERENCE SITES

  6. Defining and Detecting Complex Peak Relationships in Mass Spectral Data: The Mz.unity Algorithm.

    PubMed

    Mahieu, Nathaniel G; Spalding, Jonathan L; Gelman, Susan J; Patti, Gary J

    2016-09-20

    Analysis of a single analyte by mass spectrometry can result in the detection of more than 100 degenerate peaks. These degenerate peaks complicate spectral interpretation and are challenging to annotate. In mass spectrometry-based metabolomics, this degeneracy leads to inflated false discovery rates, data sets containing an order of magnitude more features than analytes, and an inefficient use of resources during data analysis. Although software has been introduced to annotate spectral degeneracy, current approaches are unable to represent several important classes of peak relationships. These include heterodimers and higher complex adducts, distal fragments, relationships between peaks in different polarities, and complex adducts between features and background peaks. Here we outline sources of peak degeneracy in mass spectra that are not annotated by current approaches and introduce a software package called mz.unity to detect these relationships in accurate mass data. Using mz.unity, we find that data sets contain many more complex relationships than we anticipated. Examples include the adduct of glutamate and nicotinamide adenine dinucleotide (NAD), fragments of NAD detected in the same or opposite polarities, and the adduct of glutamate and a background peak. Further, the complex relationships we identify show that several assumptions commonly made when interpreting mass spectral degeneracy do not hold in general. These contributions provide new tools and insight to aid in the annotation of complex spectral relationships and provide a foundation for improved data set identification. Mz.unity is an R package and is freely available at https://github.com/nathaniel-mahieu/mz.unity as well as on our laboratory Web site, http://pattilab.wustl.edu/software/.
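
    A toy version of one such relationship class, flagging peaks explained as the merged adduct of two other protonated ions; the tolerance is arbitrary and the masses approximate the glutamate/NAD example mentioned above:

        import numpy as np
        from itertools import combinations

        PROTON = 1.007276   # proton mass, for [M+H]+ bookkeeping

        def find_merged_adducts(mz, ppm=5.0):
            """Index triples (i, j, k) where peaks i and j explain peak k."""
            hits = []
            for (i, a), (j, b) in combinations(enumerate(mz), 2):
                target = a + b - PROTON   # [M1+M2+H]+ from two [M+H]+ ions
                for k, c in enumerate(mz):
                    if k not in (i, j) and abs(c - target) / target * 1e6 <= ppm:
                        hits.append((i, j, k))
            return hits

        peaks = np.array([148.0604, 664.1164, 811.1693])  # Glu, NAD, their adduct
        print(find_merged_adducts(peaks))   # -> [(0, 1, 2)]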

  7. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  8. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  9. Analytical model for screening potential CO2 repositories

    USGS Publications Warehouse

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.
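
    For flavor, a toy single-phase screening estimate of injection-induced pressure buildup using the Cooper-Jacob logarithmic approximation; this is far simpler than the two-phase analytical model assessed in the paper, and every value below is hypothetical:

        import numpy as np

        Q = 0.05      # m^3/s, volumetric injection rate
        mu = 5e-4     # Pa*s, brine viscosity
        k = 1e-13     # m^2, permeability
        b = 50.0      # m, aquifer thickness
        phi = 0.2     # porosity
        ct = 1e-9     # 1/Pa, total compressibility
        r = 500.0     # m, distance from the injector
        t = 3.15e7    # s, about one year of injection

        alpha = k / (phi * mu * ct)   # hydraulic diffusivity, m^2/s
        dp = Q * mu / (4 * np.pi * k * b) * np.log(2.25 * alpha * t / r**2)
        print(f"pressure buildup at {r:.0f} m after 1 yr: {dp / 1e6:.2f} MPa")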

  10. Modeling microelectrode biosensors: free-flow calibration can substantially underestimate tissue concentrations

    PubMed Central

    Wall, Mark J.

    2016-01-01

    Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue environment and experimentally verify our key predictions. PMID:27927788
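
    A back-of-envelope sketch of the effect, assuming a linearized surface consumption rate and a fixed depleted-layer thickness (all parameter values hypothetical): at steady state the flux balance D_eff*(c_bulk - c_s)/L = k*c_s gives the surface concentration c_s directly.

        D_free = 7.6e-10    # m^2/s, free diffusion coefficient of the analyte
        lam = 1.6           # tissue tortuosity; D_eff = D_free / lam**2
        D_eff = D_free / lam**2
        L = 50e-6           # m, thickness of the depleted diffusion layer
        k = 2e-5            # m/s, effective consumption rate at the sensor

        c_bulk = 1.0        # normalized tissue concentration far from the sensor
        c_surface = c_bulk / (1 + k * L / D_eff)
        # Free-flow calibration assumes this ratio is 1, so the tissue
        # concentration is underestimated by roughly this factor (~0.23 here).
        print(f"surface/bulk concentration ratio: {c_surface:.2f}")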

  11. Modeling microelectrode biosensors: free-flow calibration can substantially underestimate tissue concentrations.

    PubMed

    Newton, Adam J H; Wall, Mark J; Richardson, Magnus J E

    2017-03-01

    Microelectrode amperometric biosensors are widely used to measure concentrations of analytes in solution and tissue including acetylcholine, adenosine, glucose, and glutamate. A great deal of experimental and modeling effort has been directed at quantifying the response of the biosensors themselves; however, the influence that the macroscopic tissue environment has on biosensor response has not been subjected to the same level of scrutiny. Here we identify an important issue in the way microelectrode biosensors are calibrated that is likely to have led to underestimations of analyte tissue concentrations. Concentration in tissue is typically determined by comparing the biosensor signal to that measured in free-flow calibration conditions. In a free-flow environment the concentration of the analyte at the outer surface of the biosensor can be considered constant. However, in tissue the analyte reaches the biosensor surface by diffusion through the extracellular space. Because the enzymes in the biosensor break down the analyte, a density gradient is set up resulting in a significantly lower concentration of analyte near the biosensor surface. This effect is compounded by the diminished volume fraction (porosity) and reduction in the diffusion coefficient due to obstructions (tortuosity) in tissue. We demonstrate this effect through modeling and experimentally verify our predictions in diffusive environments. NEW & NOTEWORTHY Microelectrode biosensors are typically calibrated in a free-flow environment where the concentrations at the biosensor surface are constant. However, when in tissue, the analyte reaches the biosensor via diffusion and so analyte breakdown by the biosensor results in a concentration gradient and consequently a lower concentration around the biosensor. This effect means that naive free-flow calibration will underestimate tissue concentration. We develop mathematical models to better quantify the discrepancy between the calibration and tissue environment and experimentally verify our key predictions. Copyright © 2017 the American Physiological Society.

  12. A coupled-mode theory for multiwaveguide systems satisfying the reciprocity theorem and power conservation

    NASA Technical Reports Server (NTRS)

    Chuang, Shun-Lien

    1987-01-01

    Two sets of coupled-mode equations for multiwaveguide systems are derived using a generalized reciprocity relation: one set for a lossless system, and the other for a general lossy or lossless system. The second set of equations also reduces to the first set in the lossless case under the condition that the transverse field components are chosen to be real. Analytical relations between the coupling coefficients are shown and applied to the coupled-mode equations. It is shown analytically that these results satisfy exactly both the reciprocity theorem and power conservation. New orthogonal relations between the supermodes are derived in matrix form, with the overlap integrals taken into account.
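
    As a generic illustration of the power-conservation constraint in matrix form (not the paper's specific derivation): with mode-amplitude vector a(z), coupling matrix C, and a power matrix K built from the overlap integrals,

        \frac{d\mathbf{a}}{dz} = -i\,\mathsf{C}\,\mathbf{a}, \qquad
        P(z) = \mathbf{a}^{\dagger}\mathsf{K}\,\mathbf{a}, \qquad
        \frac{dP}{dz} = i\,\mathbf{a}^{\dagger}\!\left(\mathsf{C}^{\dagger}\mathsf{K} - \mathsf{K}\,\mathsf{C}\right)\mathbf{a},

    so power is conserved for every excitation exactly when C†K = KC, which is the kind of relation the coupling coefficients must satisfy.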

  13. 3D Tensorial Elastodynamics for Isotropic Media on Vertically Deformed Meshes

    NASA Astrophysics Data System (ADS)

    Shragge, J. C.

    2017-12-01

    Solutions of the 3D elastodynamic wave equation are sometimes required in industrial and academic applications of elastic reverse-time migration (E-RTM) and full waveform inversion (E-FWI) that involve vertically deformed meshes. Examples include incorporating irregular free-surface topography and handling internal boundaries (e.g., water bottom) directly into the computational meshes. In 3D E-RTM and E-FWI applications, the number of forward modeling simulations can reach the tens of thousands (per iteration), which necessitates the development of stable, accurate, and efficient 3D elastodynamics solvers. For topographic scenarios, most finite-difference solution approaches use a change-of-variable strategy that has a number of associated computational challenges, including difficulties in handling the free-surface boundary condition. In this study, I follow a tensorial approach and use a generalized family of analytic transforms to develop a set of analytic equations for 3D elastodynamics that directly incorporates vertical grid deformations. Importantly, this analytic approach allows for the specification of an analytic free-surface boundary condition appropriate for vertically deformed meshes. These equations are both straightforward and efficient to solve using a velocity-stress formulation with mimetic finite-difference (MFD) operators implemented on a fully staggered grid. Moreover, I demonstrate that the use of MFD methods allows stable, accurate, and efficient numerical solutions to be simulated for typical topographic scenarios. Examples demonstrate that high-quality elastic wavefields can be generated for topographic surfaces exhibiting significant topographic relief.

  14. Analytic Scattering and Refraction Models for Exoplanet Transit Spectra

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Fortney, Jonathan J.; Hubbard, William B.

    2017-12-01

    Observations of exoplanet transit spectra are essential to understanding the physics and chemistry of distant worlds. The effects of opacity sources and many physical processes combine to set the shape of a transit spectrum. Two such key processes—refraction and cloud and/or haze forward-scattering—have seen substantial recent study. However, models of these processes are typically complex, which prevents their incorporation into observational analyses and standard transit spectrum tools. In this work, we develop analytic expressions that allow for the efficient parameterization of forward-scattering and refraction effects in transit spectra. We derive an effective slant optical depth that includes a correction for forward-scattered light, and present an analytic form of this correction. We validate our correction against a full-physics transit spectrum model that includes scattering, and we explore the extent to which the omission of forward-scattering effects may bias models. Also, we verify a common analytic expression for the location of a refractive boundary, which we express in terms of the maximum pressure probed in a transit spectrum. This expression is designed to be easily incorporated into existing tools, and we discuss how the detection of a refractive boundary could help indicate the background atmospheric composition by constraining the bulk refractivity of the atmosphere. Finally, we show that opacity from Rayleigh scattering and collision-induced absorption will outweigh the effects of refraction for Jupiter-like atmospheres whose equilibrium temperatures are above 400-500 K.

  15. Multi-crosswell profile 3D imaging and method

    DOEpatents

    Washbourne, John K.; Rector, III, James W.; Bube, Kenneth P.

    2002-01-01

    A method for characterizing the value of a particular property, for example seismic velocity, of a subsurface region of ground is described. In one aspect, the value of the particular property is represented using at least one continuous analytic function such as a Chebychev polynomial. The seismic data may include data derived from at least one crosswell dataset for the subsurface region of interest and may also include other data. In either instance, data may simultaneously be used from a first crosswell dataset in conjunction with one or more other crosswell datasets and/or with the other data. In another aspect, the value of the property is characterized in three dimensions throughout the region of interest using crosswell and/or other data. In still another aspect, crosswell datasets for highly deviated or horizontal boreholes are inherently useful. The method is performed, in part, by fitting a set of vertically spaced layer boundaries, represented by an analytic function such as a Chebychev polynomial, within and across the region encompassing the boreholes such that a series of layers is defined between the layer boundaries. Initial values of the particular property are then established between the layer boundaries and across the subterranean region using a series of continuous analytic functions. The continuous analytic functions are then adjusted to more closely match the value of the particular property across the subterranean region of ground to determine the value of the particular property for any selected point within the region.
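
    A minimal sketch of representing a property profile with a continuous analytic function, here a Chebyshev series fitted with NumPy (synthetic velocity data; the patent's inversion machinery is not reproduced):

        import numpy as np
        from numpy.polynomial import chebyshev as C

        depth = np.linspace(0.0, 1000.0, 21)   # m, sample points along a well
        velocity = 1800 + 0.9 * depth + 40 * np.sin(depth / 150.0)  # synthetic

        coeffs = C.chebfit(depth, velocity, deg=5)   # degree-5 Chebyshev fit
        smooth = C.chebval(depth, coeffs)            # continuous representation
        print(np.max(np.abs(smooth - velocity)))     # worst-case fit residual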

  16. Mspire-Simulator: LC-MS shotgun proteomic simulator for creating realistic gold standard data.

    PubMed

    Noyce, Andrew B; Smith, Rob; Dalgleish, James; Taylor, Ryan M; Erb, K C; Okuda, Nozomu; Prince, John T

    2013-12-06

    The most important step in any quantitative proteomic pipeline is feature detection (aka peak picking). However, generating quality hand-annotated data sets to validate the algorithms, especially for lower abundance peaks, is nearly impossible. An alternative for creating gold standard data is to simulate it with features closely mimicking real data. We present Mspire-Simulator, a free, open-source shotgun proteomic simulator that goes beyond previous simulation attempts by generating LC-MS features with realistic m/z and intensity variance along with other noise components. It also includes machine-learned models for retention time and peak intensity prediction and a genetic algorithm to custom fit model parameters for experimental data sets. We show that these methods are applicable to data from three different mass spectrometers, including two fundamentally different types, and show visually and analytically that simulated peaks are nearly indistinguishable from actual data. Researchers can use simulated data to rigorously test quantitation software, and proteomic researchers may benefit from overlaying simulated data on actual data sets.
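
    A toy sketch of simulating a single LC-MS feature with m/z and intensity noise; the distributions and magnitudes are assumptions, far simpler than Mspire-Simulator's machine-learned models:

        import numpy as np

        rng = np.random.default_rng(42)
        rt = np.linspace(0, 30, 120)             # retention-time grid, s
        center, sigma, height = 15.0, 2.5, 1e6   # elution peak parameters

        intensity = height * np.exp(-(rt - center)**2 / (2 * sigma**2))
        intensity *= rng.lognormal(0.0, 0.05, rt.size)        # scan-to-scan jitter
        intensity += rng.normal(0, 500, rt.size).clip(min=0)  # baseline noise

        mz_true = 445.1200
        mz_obs = mz_true + rng.normal(0, mz_true * 2e-6, rt.size)  # ~2 ppm jitter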

  17. Demonstrating the use of web analytics and an online survey to understand user groups of a national network of river level data

    NASA Astrophysics Data System (ADS)

    Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene

    2016-04-01

    The number of local, national, and international networks of online environmental sensors is rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data, as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors maintained by the Scottish Environment Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package, we assessed whether the quality of data provided by the free Google Analytics service is good enough for research purposes; which sensors were being used, when, and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile users based on information from traffic sources. Web analytics data consist of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several different major user groups. To improve our understanding of how the use of this national network of river level data may provide insights into the interactions between individuals and their usage of hydrological information, we ran an online survey linked to the SEPA river level pages for one year. We collected over 2000 complete responses to the survey. The survey included questions on user activities and the importance of river level information for those activities, alongside questions on what additional information users drew on in their decision making, e.g., precipitation, and on when and which river pages they visited. In this presentation we will present results from our analysis of the web analytics and online survey, and the insights they provide into the user groups of this national network of river level data.

  18. Clinical chemistry reference values for 75-year-old apparently healthy persons.

    PubMed

    Huber, Klaus Roland; Mostafaie, Nazanin; Stangl, Gerhard; Worofka, Brigitte; Kittl, Eva; Hofmann, Jörg; Hejtman, Milos; Michael, Rainer; Weissgram, Silvia; Leitha, Thomas; Jungwirth, Susanne; Fischer, Peter; Tragl, Karl-Heinz; Bauer, Kurt

    2006-01-01

    Clinical chemistry reference values for elderly persons are sparse and mostly intermixed with those for younger subjects. To understand the links between metabolism and aging, it is paramount to differentiate between "normal" physiological processes in apparently healthy elderly subjects and metabolic changes due to long-lasting diseases. The Vienna Transdanube Aging (VITA) study, which began in 2000 and is continuing, will allow us to do just that, because more than 600 male and female volunteers aged exactly 75 years (to exclude any influence of the "aging" factor in this cohort) are participating in this study. Extensive clinical, neurological, biochemical, psychological, genetic, and radiological analyses, with a special emphasis on consumption of medication and abuse of drugs, were performed on each of the probands. The multitude of data and questionnaires obtained made possible an a posteriori approach to select individuals fulfilling criteria for a reference sample group of apparently healthy 75-year-old subjects for our study. Specific analytes were quantified on automated clinical analyzers, while manual methods were used for hormonal analytes. All clinical chemistry analytes were evaluated using in-depth statistical analyses with SPSS for Windows. In all, reference intervals for 45 analytes could be established. These include routine parameters for the assessment of organ functions, as well as hormone concentrations and hematological appraisals. Because all patients were reevaluated after exactly 30 months in the course of this study, we had the opportunity to reassess their health status at the age of 77.5 years. This was very useful for validation of the first-round data set. Data from the second-round evaluation corroborate the reference limits of the baseline analysis and further confirm our inclusion and exclusion criteria. In summary, we have established a reliable set of reference data for hormonal, hematological, and clinical chemistry analytes for elderly subjects. These values will be very useful for our future attempts to correlate disease states and aging processes with metabolic factors.
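
    For reference, a minimal nonparametric reference-interval computation of the kind such studies report (synthetic values; real studies add outlier exclusion and partitioning by covariates):

        import numpy as np

        rng = np.random.default_rng(1)
        analyte = rng.normal(loc=140, scale=4, size=600)  # e.g., sodium, mmol/L

        lower, upper = np.percentile(analyte, [2.5, 97.5])  # central 95% interval
        print(f"reference interval: {lower:.1f}-{upper:.1f} mmol/L")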

  19. A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes

    PubMed Central

    Ma, Xin; Shen, Jianping

    2017-01-01

    The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094

  1. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so users can further analyze their data in other analytic tools.

  2. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  3. Physical Analytics: An emerging field with real-world applications and impact

    NASA Astrophysics Data System (ADS)

    Hamann, Hendrik

    2015-03-01

    In the past, most information on the internet originated from humans or computers. However, with the emergence of cyber-physical systems, vast amounts of data are now being created by sensors on devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics community. In this presentation we use examples to highlight the opportunities in this new subject of "Physical Analytics" for highly interdisciplinary research (including physics, engineering, and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows physical principles to be applied to everyday problems in a much more effective and informed way than was possible in the past. Much as traditional applied physics and engineering have made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of "configurable" enabling technologies for Physical Analytics, including ultralow-power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine-learning-based physical model blending, and physical-analytics-based automation and control. Then we discuss in detail several concrete applications of Physical Analytics, ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.

  4. Analytic theory of orbit contraction and ballistic entry into planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Longuski, J. M.; Vinh, N. X.

    1980-01-01

    A space object traveling through an atmosphere is governed by two forces: aerodynamic and gravitational. On this premise, equations of motion are derived to provide a set of universal entry equations applicable to all regimes of atmospheric flight, from orbital motion under the dissipative force of drag, through the dynamic phase of reentry, and finally to the point of contact with the planetary surface. Rigorous mathematical techniques, such as averaging, Poincare's method of small parameters, and Lagrange's expansion, are applied to obtain a highly accurate, purely analytic theory for orbit contraction and ballistic entry into planetary atmospheres. The theory has a wide range of applications to modern problems, including orbit decay of artificial satellites, atmospheric capture of planetary probes, atmospheric grazing, and ballistic reentry of manned and unmanned space vehicles.

  5. Expressivism, Relativism, and the Analytic Equivalence Test

    PubMed Central

    Frápolli, Maria J.; Villanueva, Neftalí

    2015-01-01

    The purpose of this paper is to show that, pace Field (2009), MacFarlane's assessment relativism and expressivism should be sharply distinguished. We do so by arguing that relativism and expressivism exemplify two very different approaches to context-dependence. Relativism, on the one hand, shares with other contemporary approaches a bottom-up, building-block model, while expressivism is part of a different tradition, one that might include Lewis' epistemic contextualism and Frege's content individuation, with which it shares an organic model for dealing with context-dependence. The building-block model and the organic model, and thus relativism and expressivism, are set apart with the aid of a particular test: only the building-block model is compatible with the idea that there might be analytically equivalent, and yet different, propositions. PMID:26635690

  6. Quality specification and status of internal quality control of cardiac biomarkers in China from 2011 to 2016.

    PubMed

    Li, Tingting; Wang, Wei; Zhao, Haijian; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo

    2017-09-07

    This study aimed to investigate the status of internal quality control (IQC) for cardiac biomarkers from 2011 to 2016, so as to gain overall knowledge of the precision level of measurements in China and to set appropriate precision specifications. Internal quality control data for cardiac biomarkers, including creatine kinase MB (CK-MB) (μg/L), CK-MB (U/L), myoglobin (Mb), cardiac troponin I (cTnI), cardiac troponin T (cTnT), and homocysteine (HCY), were collected by a web-based external quality assessment (EQA) system. Percentages of laboratories meeting five precision quality specifications for current coefficients of variation (CVs) were calculated. Then, appropriate precision specifications were chosen for these six analytes. Finally, the CVs and IQC practice were further analyzed with different grouping methods. The current CVs remained nearly constant over the 6 years. cTnT had the highest pass rates every year against all five specifications, whereas HCY had the lowest. Overall, most analytes had a satisfactory performance (pass rates >80%), except for HCY, if one-third TEa or the minimum specification was employed. When the optimal specification was applied, the performance of most analytes was frustrating (pass rates <60%), except for cTnT. The appropriate precision specifications for the six analytes were set as current CVs of less than 9.20%, 9.90%, 7.50%, 10.54%, 7.63%, and 6.67%, respectively. The data on IQC practices indicated wide variation and substantial progress. The precision performance of cTnT was already satisfactory, while that of the other five analytes, especially HCY, was still frustrating; thus, ongoing investigation and continuous improvement of IQC are still needed. © 2017 Wiley Periodicals, Inc.
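
    A minimal sketch of the pass-rate computation against a precision specification; the CV distribution and the chosen limit are hypothetical:

        import numpy as np

        rng = np.random.default_rng(7)
        lab_cvs = rng.lognormal(np.log(6.0), 0.4, size=500)  # current CV (%) per lab

        spec_limit = 7.50   # % CV, an assumed precision specification
        pass_rate = np.mean(lab_cvs <= spec_limit) * 100
        print(f"pass rate: {pass_rate:.1f}%")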

  7. National survey on intra-laboratory turnaround time for some most common routine and stat laboratory analyses in 479 laboratories in China.

    PubMed

    Fei, Yang; Zeng, Rong; Wang, Wei; He, Falin; Zhong, Kun; Wang, Zhiguo

    2015-01-01

    The aims were to investigate the state of the art of intra-laboratory turnaround time (intra-TAT), to provide suggestions, and to find out whether laboratories accredited to International Organization for Standardization (ISO) 15189 or by the College of American Pathologists (CAP) show better intra-TAT performance than non-accredited ones. A total of 479 Chinese clinical laboratories participating in the external quality assessment programs of chemistry, blood gas, and haematology tests organized by the National Centre for Clinical Laboratories in China were included in our study. General information and the median intra-TAT of routine and stat tests over the preceding week were collected via questionnaires. The response rates for clinical biochemistry, blood gas, and haematology testing were 36% (479/1307), 38% (228/598), and 36% (449/1250), respectively. More than 50% of laboratories indicated that they had set intra-TAT median goals, and almost 60% declared that they monitored intra-TAT for essentially every analyte they performed. Among all analytes we investigated, the intra-TAT of haematology analytes was shorter than that of biochemistry analytes, while the intra-TAT of blood gas analytes was the shortest. There were significant differences between median intra-TAT on different days of the week for routine tests. However, there were no significant differences in median intra-TAT reported by accredited and non-accredited laboratories. Many laboratories in China are aware of intra-TAT control and are making efforts to reach the target, but there is still room for improvement. Accredited laboratories have better intra-TAT monitoring and target setting than non-accredited ones, but there are no significant differences in the median intra-TAT they report.

  8. National survey on intra-laboratory turnaround time for some most common routine and stat laboratory analyses in 479 laboratories in China

    PubMed Central

    Fei, Yang; Zeng, Rong; Wang, Wei; He, Falin; Zhong, Kun

    2015-01-01

    Introduction The aims were to investigate the state of the art of intra-laboratory turnaround time (intra-TAT), to provide suggestions, and to find out whether laboratories accredited to International Organization for Standardization (ISO) 15189 or by the College of American Pathologists (CAP) show better intra-TAT performance than non-accredited ones. Materials and methods A total of 479 Chinese clinical laboratories participating in the external quality assessment programs of chemistry, blood gas, and haematology tests organized by the National Centre for Clinical Laboratories in China were included in our study. General information and the median intra-TAT of routine and stat tests over the preceding week were collected via questionnaires. Results The response rates for clinical biochemistry, blood gas, and haematology testing were 36% (479/1307), 38% (228/598), and 36% (449/1250), respectively. More than 50% of laboratories indicated that they had set intra-TAT median goals, and almost 60% declared that they monitored intra-TAT for essentially every analyte they performed. Among all analytes we investigated, the intra-TAT of haematology analytes was shorter than that of biochemistry analytes, while the intra-TAT of blood gas analytes was the shortest. There were significant differences between median intra-TAT on different days of the week for routine tests. However, there were no significant differences in median intra-TAT reported by accredited and non-accredited laboratories. Conclusions Many laboratories in China are aware of intra-TAT control and are making efforts to reach the target, but there is still room for improvement. Accredited laboratories have better intra-TAT monitoring and target setting than non-accredited ones, but there are no significant differences in the median intra-TAT they report. PMID:26110033

  9. Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.

    2008-12-01

    The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market-driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and homemade user applications for workflow-specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.

  10. Approach of Decision Making Based on the Analytic Hierarchy Process for Urban Landscape Management

    NASA Astrophysics Data System (ADS)

    Srdjevic, Zorica; Lakicevic, Milena; Srdjevic, Bojan

    2013-03-01

    This paper proposes a two-stage group decision making approach to urban landscape management and planning supported by the analytic hierarchy process. The proposed approach combines an application of the consensus convergence model and the weighted geometric mean method. The application of the proposed approach is shown on a real urban landscape planning problem with a park-forest in Belgrade, Serbia. Decision makers were policy makers, i.e., representatives of several key national and municipal institutions, and experts coming from different scientific fields. As a result, the most suitable management plan from the set of plans is recognized. It includes both native vegetation renewal in degraded areas of park-forest and continued maintenance of its dominant tourism function. Decision makers included in this research consider the approach to be transparent and useful for addressing landscape management tasks. The central idea of this paper can be understood in a broader sense and easily applied to other decision making problems in various scientific fields.

  11. Approach of decision making based on the analytic hierarchy process for urban landscape management.

    PubMed

    Srdjevic, Zorica; Lakicevic, Milena; Srdjevic, Bojan

    2013-03-01

    This paper proposes a two-stage group decision making approach to urban landscape management and planning supported by the analytic hierarchy process. The proposed approach combines an application of the consensus convergence model and the weighted geometric mean method. The application of the proposed approach is shown on a real urban landscape planning problem with a park-forest in Belgrade, Serbia. Decision makers were policy makers, i.e., representatives of several key national and municipal institutions, and experts coming from different scientific fields. As a result, the most suitable management plan from the set of plans is recognized. It includes both native vegetation renewal in degraded areas of park-forest and continued maintenance of its dominant tourism function. Decision makers included in this research consider the approach to be transparent and useful for addressing landscape management tasks. The central idea of this paper can be understood in a broader sense and easily applied to other decision making problems in various scientific fields.
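
    A small sketch of the two ingredients named above: a priority vector from a pairwise comparison matrix via the geometric-mean method, and aggregation of several decision makers' judgments by a weighted geometric mean (matrices and weights invented):

        import numpy as np

        def priorities(A):
            """Geometric-mean priority vector of a pairwise comparison matrix."""
            g = np.prod(A, axis=1) ** (1.0 / A.shape[0])
            return g / g.sum()

        # Two decision makers comparing three management plans (Saaty's 1-9 scale).
        A1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
        A2 = np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]])
        w = np.array([0.6, 0.4])   # decision-maker weights, e.g., from consensus

        # Element-wise weighted geometric mean of the individual judgments.
        A_group = np.exp(w[0] * np.log(A1) + w[1] * np.log(A2))
        print(priorities(A_group))   # group ranking of the three plans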

  12. Variable Refractive Index Effects on Radiation in Semitransparent Scattering Multilayered Regions

    NASA Technical Reports Server (NTRS)

    Siegel, R.; Spuckler, C. M.

    1993-01-01

    A simple set of equations is derived for predicting the temperature distribution and radiative energy flow in a semitransparent layer consisting of an arbitrary number of laminated sublayers that absorb, emit, and scatter radiation. Each sublayer can have a different refractive index and optical thickness. The plane composite region is heated on each exterior side by a different amount of incident radiation. The results are for the limiting case where heat conduction within the layers is very small relative to radiative transfer, and is neglected. The interfaces are assumed diffuse, and all interface reflections are included in the analysis. The thermal behavior is readily calculated from the analytical expressions that are obtained. By using many sublayers, the analytical expressions provide the temperature distribution and heat flow for a diffusing medium with a continuously varying refractive index, including internal reflection effects caused by refractive index gradients. Temperature and heat flux results are given to show the effect of variations in refractive index and optical thickness through the multilayer laminate.

  13. Compressive Detection of Highly Overlapped Spectra Using Walsh-Hadamard-Based Filter Functions.

    PubMed

    Corcoran, Timothy C

    2018-03-01

    In the chemometric context in which spectral loadings of the analytes are already known, spectral filter functions may be constructed which allow the scores of mixtures of analytes to be determined directly, on the fly, by applying a compressive detection strategy. Rather than collecting the entire spectrum over the relevant region for the mixture, a filter function may be applied within the spectrometer itself so that only the scores are recorded. Consequently, compressive detection shrinks data sets tremendously. The Walsh functions, the binary basis used in Walsh-Hadamard transform spectroscopy, form a complete orthonormal set well suited to compressive detection. A method for constructing filter functions using binary fourfold linear combinations of Walsh functions is detailed using mathematics borrowed from genetic algorithm work, as a means of optimizing said functions for a specific set of analytes. These filter functions can be constructed to automatically strip the baseline from the analysis. Monte Carlo simulations were performed with a mixture of four highly overlapped Raman loadings and with ten excitation-emission matrix loadings; both sets showed a very high degree of spectral overlap. Reasonable estimates of the true scores were obtained in both simulations using noisy data sets, proving the linearity of the method.
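
    A bare-bones sketch of the compressive-detection step: project a synthetic spectrum onto a few rows of a Hadamard matrix (Walsh functions, up to ordering), so that only the scores are recorded; the paper's fourfold combinations and genetic-algorithm optimization are not reproduced:

        import numpy as np
        from scipy.linalg import hadamard

        n = 64   # spectral channels (power of two for a Hadamard matrix)
        x = np.linspace(0, 1, n)
        spectrum = np.exp(-(x - 0.3)**2 / 0.002) + 0.5 * np.exp(-(x - 0.7)**2 / 0.004)

        H = hadamard(n)              # +1/-1 entries
        filters = H[1:5]             # a handful of Walsh-like filter functions
        scores = filters @ spectrum  # all the instrument would need to record
        print(scores)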

  14. Comparative Characterization of Crofelemer Samples Using Data Mining and Machine Learning Approaches With Analytical Stability Data Sets.

    PubMed

    Nariya, Maulik K; Kim, Jae Hyun; Xiong, Jian; Kleindl, Peter A; Hewarathna, Asha; Fisher, Adam C; Joshi, Sangeeta B; Schöneich, Christian; Forrest, M Laird; Middaugh, C Russell; Volkin, David B; Deeds, Eric J

    2017-11-01

    There is growing interest in generating physicochemical and biological analytical data sets to compare complex mixture drugs, for example, products from different manufacturers. In this work, we compare various crofelemer samples prepared from a single lot by filtration with varying molecular weight cutoffs combined with incubation for different times at different temperatures. The 2 preceding articles describe experimental data sets generated from analytical characterization of fractionated and degraded crofelemer samples. In this work, we use data mining techniques such as principal component analysis and mutual information scores to help visualize the data and determine discriminatory regions within these large data sets. The mutual information score identifies chemical signatures that differentiate crofelemer samples. These signatures, in many cases, would likely be missed by traditional data analysis tools. We also found that supervised learning classifiers robustly discriminate samples with around 99% classification accuracy, indicating that mathematical models of these physicochemical data sets are capable of identifying even subtle differences in crofelemer samples. Data mining and machine learning techniques can thus identify fingerprint-type attributes of complex mixture drugs that may be used for comparative characterization of products. Copyright © 2017 American Pharmacists Association®. All rights reserved.
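
    A schematic of the analysis pattern described, on synthetic stand-in data: PCA for visualization coordinates, mutual information to rank discriminatory channels, and a supervised classifier to check separability:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(60, 200))   # 60 samples x 200 spectral channels
        y = np.repeat([0, 1, 2], 20)     # three sample groups
        X[y == 1, 50] += 1.5             # plant a subtle discriminatory band

        coords = PCA(n_components=2).fit_transform(X)   # for a scores plot
        mi = mutual_info_classif(X, y, random_state=0)  # channel rankings
        print("most discriminatory channel:", int(np.argmax(mi)))
        print("CV accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())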

  15. Skin and Soft Tissue Surgery in the Office Versus Operating Room Setting: An Analysis Based on Individual-Level Medicare Data.

    PubMed

    Kantor, Jonathan

    2018-03-23

    The relative volume of skin and soft tissue excision and reconstructive procedures performed in the outpatient office versus facility (ambulatory surgical center or hospital) differs by specialty, and has major implications for quality of care, outcomes, development of guidelines, resident education, health care economics, and patient perception. To assess the relative volume of surgical procedures performed in each setting (office vs ambulatory surgery center [ASC]/hospital) by dermatologists and nondermatologists. A cross-sectional analytical study was performed using the Medicare public use file (PUF) for 2014, which includes every patient seen in an office, ASC, or hospital in the United States billed to Medicare part B. Data were divided by physician specialty and setting. A total of 9,316,307 individual encounters were included in the Medicare PUF. Dermatologists account for 195,001 (2.1%) of the total. Dermatologists were more likely to perform surgical procedures in an office setting only (odds ratio 5.48 [95% confidence interval 5.05-5.95], p < .0001) than other specialists in aggregate. More than 90% of surgical procedures are performed in an office setting, and dermatologists are more than 5 times as likely as other specialists to operate in an office setting.
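
    The headline statistic is an odds ratio from a 2x2 specialty-by-setting table; a sketch with invented counts (not the Medicare PUF values):

        import numpy as np

        # rows: dermatologists, other specialists; cols: office-only, facility
        table = np.array([[180_000, 15_000],
                          [4_500_000, 2_050_000]], dtype=float)

        odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
        se_log = np.sqrt((1.0 / table).sum())   # standard error of log(OR)
        lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log)
        print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")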

  16. Strategies to define performance specifications in laboratory medicine: 3 years on from the Milan Strategic Conference.

    PubMed

    Panteghini, Mauro; Ceriotti, Ferruccio; Jones, Graham; Oosterhuis, Wytze; Plebani, Mario; Sandberg, Sverre

    2017-10-26

    Measurements in clinical laboratories produce results needed in the diagnosis and monitoring of patients. These results are always characterized by some uncertainty. What quality is needed, and what measurement errors can be tolerated without jeopardizing patient safety, should therefore be defined and specified for each analyte having clinical use. When these specifications are defined, the total examination process will be "fit for purpose", and laboratory professionals should then set up rules to control the measuring systems to ensure they perform within specifications. The laboratory community has used different models to set performance specifications (PS). Recently, it was felt that there was a need to revisit the different models and, at the same time, to emphasize the presuppositions for using each of them. Therefore, in 2014 the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) organized a Strategic Conference in Milan. It was felt that there was a need for more detailed discussions on, for instance, PS for EQAS, on which measurands should use which models to set PS, and on how to set PS for the extra-analytical phases. There was also a need to critically evaluate the quality of data from biological variation studies and to discuss further the use of the total error (TE) concept. Consequently, EFLM established five Task and Finish Groups (TFGs) to address each of these topics. The TFGs are finishing their activity in 2017, and the content of this paper includes deliverables from these groups.

  17. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  18. Analytical formulation of cellular automata rules using data models

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.

    2009-05-01

    We present a unique method for converting traditional cellular automata (CA) rules into analytical function form. CA rules have been successfully used for morphological image processing and volumetric shape recognition and classification. Further, the use of CA rules as analog models in the physical and biological sciences can be significantly extended if analytical (as opposed to discrete) models can be formulated. We show that such transformations are possible. We use as our example John Horton Conway's famous "Game of Life" rule set. We show that using Data Modeling, we are able to derive both polynomial and bi-spectrum models of the IF-THEN rules that yield equivalent results. Further, we demonstrate that the "Game of Life" rule set can be modeled using the multi-fluxion, yielding a closed form nth order derivative and integral. All of the demonstrated analytical forms of the CA rule are general and applicable to real-time use.
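
    As a hypothetical illustration of what an analytical form of a CA rule can look like (this is not the paper's Data Modeling or multi-fluxion construction), the Game of Life update rule, alive next step if n == 3, or if s == 1 and n == 2, where s is the cell state and n the neighbor sum, can be written exactly as a polynomial using Lagrange indicator polynomials and checked by enumeration:

        def lagrange_indicator(k, points=range(9)):
            """Polynomial in n equal to 1 at n == k and 0 at the other integers 0..8."""
            def L(n):
                val = 1.0
                for j in points:
                    if j != k:
                        val *= (n - j) / (k - j)
                return val
            return L

        L2, L3 = lagrange_indicator(2), lagrange_indicator(3)
        f = lambda s, n: L3(n) + s * L2(n)   # analytic (polynomial) form of the rule

        # Verify exact agreement with the discrete IF-THEN rule on all 18 cases.
        for s in (0, 1):
            for n in range(9):
                assert round(f(s, n)) == (1 if n == 3 or (s == 1 and n == 2) else 0)
        print("polynomial form reproduces the Game of Life rule")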

  19. CytometryML: a markup language for analytical cytology

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Stephanie H.; Leif, Suzanne B.

    2003-06-01

    Cytometry Markup Language, CytometryML, is a proposed new analytical cytology data standard. CytometryML is a set of XML schemas for encoding both flow cytometry and digital microscopy text-based data types. CytometryML schemas reference both DICOM (Digital Imaging and Communications in Medicine) codes and FCS keywords. These schemas provide representations for the keywords in FCS 3.0 and will soon include DICOM microscopic image data. Flow Cytometry Standard (FCS) list-mode has been mapped to the DICOM Waveform Information Object. A preliminary version of a list-mode binary data type, which does not presently exist in DICOM, has been designed. This binary type is required to enhance the storage and transmission of flow cytometry and digital microscopy data. Index files based on Waveform indices will be used to rapidly locate the cells present in individual subsets. DICOM has the advantage of employing standard file types, TIF and JPEG, for digital microscopy. Using an XML schema-based representation means that standard commercial software packages such as Excel and MathCad can be used to analyze, display, and store analytical cytometry data. Furthermore, by providing one standard for both DICOM data and analytical cytology data, it eliminates the need to create and maintain special-purpose interfaces for analytical cytology data, thereby integrating the data into the larger DICOM and other clinical communities. A draft version of CytometryML is available at www.newportinstruments.com.
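
    Since the schemas themselves are not reproduced in the abstract, the following encoding sketch is purely illustrative; the element and attribute names are hypothetical stand-ins, not actual CytometryML vocabulary.

        # Illustrative only: element and attribute names below are hypothetical.
        import xml.etree.ElementTree as ET

        root = ET.Element("Cytometry")
        param = ET.SubElement(root, "Parameter", id="FS1")
        ET.SubElement(param, "Name").text = "Forward Scatter"  # an FCS $PnN-style keyword
        ET.SubElement(param, "BitsPerValue").text = "16"       # an FCS $PnB-style keyword
        print(ET.tostring(root, encoding="unicode"))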

  20. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  1. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  2. Knowledge-Based Motion Control of AN Intelligent Mobile Autonomous System

    NASA Astrophysics Data System (ADS)

    Isik, Can

    An Intelligent Mobile Autonomous System (IMAS), which is equipped with vision and low level sensors to cope with unknown obstacles, is modeled as a hierarchy of path planning and motion control. This dissertation concentrates on the lower level of this hierarchy (Pilot) with a knowledge-based controller. The basis of a theory of knowledge-based controllers is established, using the example of the Pilot level motion control of IMAS. In this context, the knowledge-based controller with a linguistic world concept is shown to be adequate for the minimum time control of an autonomous mobile robot motion. The Pilot level motion control of IMAS is approached in the framework of production systems. The three major components of the knowledge-based control that are included here are the hierarchies of the database, the rule base and the rule evaluator. The database, which is the representation of the state of the world, is organized as a semantic network, using a concept of minimal admissible vocabulary. The hierarchy of rule base is derived from the analytical formulation of minimum-time control of IMAS motion. The procedure introduced for rule derivation, which is called analytical model verbalization, utilizes the concept of causalities to describe the system behavior. A realistic analytical system model is developed and the minimum-time motion control in an obstacle strewn environment is decomposed to a hierarchy of motion planning and control. The conditions for the validity of the hierarchical problem decomposition are established, and the consistency of operation is maintained by detecting the long term conflicting decisions of the levels of the hierarchy. The imprecision in the world description is modeled using the theory of fuzzy sets. The method developed for the choice of the rule that prescribes the minimum-time motion control among the redundant set of applicable rules is explained and the usage of fuzzy set operators is justified. Also included in the dissertation are the description of the computer simulation of Pilot within the hierarchy of IMAS control and the simulated experiments that demonstrate the theoretical work.

  3. Finding accurate frontiers: A knowledge-intensive approach to relational learning

    NASA Technical Reports Server (NTRS)

    Pazzani, Michael; Brunk, Clifford

    1994-01-01

    An approach to analytic learning is described that searches for accurate entailments of a Horn Clause domain theory. A hill-climbing search, guided by an information based evaluation function, is performed by applying a set of operators that derive frontiers from domain theories. The analytic learning system is one component of a multi-strategy relational learning system. We compare the accuracy of concepts learned with this analytic strategy to concepts learned with an analytic strategy that operationalizes the domain theory.

  4. An Ethnographic Study of Stigma and Ageism in Residential Care or Assisted Living

    PubMed Central

    Dobbs, Debra; Eckert, J. Kevin; Rubinstein, Bob; Keimig, Lynn; Clark, Leanne; Frankowski, Ann Christine; Zimmerman, Sheryl

    2013-01-01

    Purpose: This study explored aspects of stigmatization for older adults who live in residential care or assisted living (RC–AL) communities and what these settings have done to address stigma. Design and Methods: We used ethnography and other qualitative data-gathering and analytic techniques to gather data from 309 participants (residents, family and staff) from six RC–AL settings in Maryland. We entered the transcript data into Atlas.ti 5.0. We analyzed the data by using grounded theory techniques for emergent themes. Results: Four themes emerged that relate to stigma in RC–AL: (a) ageism in long-term care; (b) stigma as related to disease and illness; (c) sociocultural aspects of stigma; and (d) RC–AL as a stigmatizing setting. Some strategies used in RC–AL settings to combat stigma include family member advocacy on behalf of stigmatized residents, assertion of resident autonomy, administrator awareness of potential stigmatization, and recognition of resident preferences and strengths rather than their limitations. Implications: Findings suggest that changes could be made to the structure as well as the process of care delivery to minimize the occurrence of stigma in RC–AL settings. Structural changes include an examination of how best, given the resident case mix, to accommodate care for persons with dementia (e.g., separate units or integrated care); processes of care include staff PMID:18728301

  5. Hybrid neural network and fuzzy logic approaches for rendezvous and capture in space

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Castellano, Timothy

    1991-01-01

    The nonlinear behavior of many practical systems and the unavailability of quantitative data regarding the input-output relations make the analytical modeling of these systems very difficult. On the other hand, approximate reasoning-based controllers, which do not require analytical models, have demonstrated a number of successful applications, such as the subway system in the city of Sendai. These applications have mainly concentrated on emulating the performance of a skilled human operator in the form of linguistic rules. However, the process of learning and tuning the control rules to achieve the desired performance remains a difficult task. Fuzzy Logic Control is based on fuzzy set theory. A fuzzy set is an extension of a crisp set. Crisp sets only allow full membership or no membership at all, whereas fuzzy sets allow partial membership. In other words, an element may partially belong to a set.
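
    A minimal numeric illustration of that distinction, for a hypothetical linguistic term "close to the target" over a distance d in meters:

        def crisp_close(d, threshold=1.0):
            return 1.0 if d <= threshold else 0.0   # full membership or none

        def fuzzy_close(d, full=1.0, none=5.0):
            # Linear ramp: fully "close" below 1 m, not "close" beyond 5 m,
            # and partial membership in between.
            if d <= full:
                return 1.0
            if d >= none:
                return 0.0
            return (none - d) / (none - full)

        for d in (0.5, 2.0, 4.0, 6.0):
            print(d, crisp_close(d), round(fuzzy_close(d), 2))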

  6. Do new concepts for deriving permissible limits for analytical imprecision and bias have any advantages over existing consensus?

    PubMed

    Petersen, Per Hyltoft; Sandberg, Sverre; Fraser, Callum G

    2011-04-01

    The Stockholm conference held in 1999 on "Strategies to set global analytical quality specifications (AQS) in laboratory medicine" reached a consensus and advocated the ubiquitous application of a hierarchical structure of approaches to setting AQS. This approach has been widely used over the last decade, although several issues remain unanswered. A number of new suggestions have been recently proposed for setting AQS. One of these recommendations is described by Haeckel and Wosniok in this issue of Clinical Chemistry and Laboratory Medicine. Their concept is to estimate the increase in false-positive results using conventional population-based reference intervals, the delta false-positive rate due to analytical imprecision and bias, and relate the results directly to the current analytical quality attained. Thus, the actual estimates in the laboratory for imprecision and bias are compared to the AQS. These values are classified in a ranking system according to the closeness to the AQS, and this combination is the new idea of the proposal. Other new ideas have been proposed recently. We await with great interest, as should others, whether these newer approaches become widely used and prove worthy of incorporation into the hierarchy.

  7. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  8. Dominating Scale-Free Networks Using Generalized Probabilistic Methods

    PubMed Central

    Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.

    2014-01-01

    We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
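
    The abstract does not spell out the selection strategies, so the following is only a generic sketch of a degree-dependent probabilistic selection with a repair step, using networkx; the particular selection probability is an illustrative choice, not the paper's.

        import networkx as nx
        import numpy as np

        rng = np.random.default_rng(1)
        G = nx.barabasi_albert_graph(1000, 2, seed=1)   # scale-free test graph

        # Select each node with probability increasing in its degree.
        dmax = max(d for _, d in G.degree())
        D = {v for v, d in G.degree() if rng.random() < d / dmax}

        # Repair step: any node neither selected nor adjacent to a selected
        # node adds itself, guaranteeing a valid dominating set.
        for v in G:
            if v not in D and not any(u in D for u in G[v]):
                D.add(v)

        assert all(v in D or any(u in D for u in G[v]) for v in G)
        print("dominating set size:", len(D), "of", G.number_of_nodes())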

  9. Dynamic Digital Maps as Vehicles for Distributing Digital Geologic Maps and Embedded Analytical Data and Multimedia

    NASA Astrophysics Data System (ADS)

    Condit, C. D.; Mninch, M.

    2012-12-01

    The Dynamic Digital Map (DDM) is an ideal vehicle for the professional geologist to use to describe the geologic setting of key sites to the public in a format that integrates and presents maps and associated analytical data and multimedia without the need for an ArcGIS interface. Maps with field trip guide stops that include photographs, movies, figures, and animations, showing, for example, how the features seen in the field formed, or how data might be best visualized in "time-frame" sequences, are ideally included in DDMs. DDMs distribute geologic maps, images, movies, analytical data, and text such as field guides in an integrated, cross-platform, web-enabled format that is intuitive to use, easily and quickly searchable, and requires no additional proprietary software to operate. Maps, photos, movies, and animations are stored outside the program, which acts as an organizational framework and index to present these data. Once created, the DDM can be downloaded from the web site hosting it in the flavor matching the user's operating system (e.g. Linux, Windows, and Macintosh) as zip, dmg, or tar files (and soon as iOS and Android tablet apps). When decompressed, the DDM can then access its associated data directly from that site with no browser needed. Alternatively, the entire package can be distributed and used from CD, DVD, or flash-memory storage. The intent of this presentation is to introduce the variety of geology that can be accessed from the over 25 DDMs created to date, concentrating on the DDM of the Springerville Volcanic Field. We will highlight selected features of some of them, introduce a simplified interface to the original DDM (which we renamed DDMC, for Classic), and give a brief look at the recently (2010-2011) completed geologic maps of the Springerville Volcanic Field to see examples of each of the features discussed above, along with a display of the integrated analytical data set. We will also highlight the differences between the classic DDMCs and the new Dynamic Digital Map Extended (DDME), designed from the ground up to take advantage of the expanded connectedness this redesigned program will accommodate.

  10. New fundamental parameters for attitude representation

    NASA Astrophysics Data System (ADS)

    Patera, Russell P.

    2017-08-01

    A new attitude parameter set is developed to clarify the geometry of combining finite rotations in a rotational sequence and in combining infinitesimal angular increments generated by angular rate. The resulting parameter set of six Pivot Parameters represents a rotation as a great circle arc on a unit sphere that can be located at any clocking location in the rotation plane. Two rotations are combined by linking their arcs at either of the two intersection points of the respective rotation planes. In a similar fashion, linking rotational increments produced by angular rate is used to derive the associated kinematical equations, which are linear and have no singularities. Included in this paper is the derivation of twelve Pivot Parameter elements that represent all twelve Euler Angle sequences, which enables efficient conversions between Pivot Parameters and any Euler Angle sequence. Applications of this new parameter set include the derivation of quaternions and the quaternion composition rule, as well as the derivation of the analytical solution to time-dependent coning motion. The relationships between Pivot Parameters and traditional parameter sets are included in this work. Pivot Parameters are well suited for a variety of aerospace applications due to their effective composition rule, singularity free kinematic equations, efficient conversion to and from Euler Angle sequences and clarity of their geometrical foundation.
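
    For comparison with the composition ideas above, the conventional unit-quaternion composition rule (which the paper rederives) can be sketched as follows; this is the standard formulation, not the Pivot Parameter one.

        import numpy as np

        def quat_from_axis_angle(axis, angle):
            """Unit quaternion [w, x, y, z] for a rotation about a given axis."""
            axis = np.asarray(axis, float)
            axis /= np.linalg.norm(axis)
            return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])

        def quat_mul(q1, q2):
            """Quaternion product q1 * q2 (q2 applied first)."""
            w1, v1 = q1[0], q1[1:]
            w2, v2 = q2[0], q2[1:]
            return np.concatenate([[w1 * w2 - v1 @ v2],
                                   w1 * v2 + w2 * v1 + np.cross(v1, v2)])

        qz = quat_from_axis_angle([0, 0, 1], np.pi / 2)   # 90 deg about z
        qx = quat_from_axis_angle([1, 0, 0], np.pi / 2)   # then 90 deg about x
        print(quat_mul(qx, qz))                           # [0.5, 0.5, -0.5, 0.5]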

  11. Exploring Teacher-Student Interactions and Moral Reasoning Practices in Drama Classrooms

    ERIC Educational Resources Information Center

    Freebody, Kelly

    2010-01-01

    The research reported here brings together three settings of conceptual and methodological inquiry: the sociological setting of socio-economic theory; the curricular/pedagogic setting of educational drama; and the analytic setting of ethnomethodologically informed analyses of conversation analysis and membership categorisation analysis. Students…

  12. Effectiveness of job search interventions: a meta-analytic review.

    PubMed

    Liu, Songqi; Huang, Jason L; Wang, Mo

    2014-07-01

    The current meta-analytic review examined the effectiveness of job search interventions in facilitating job search success (i.e., obtaining employment). Major theoretical perspectives on job search interventions, including behavioral learning theory, theory of planned behavior, social cognitive theory, and coping theory, were reviewed and integrated to derive a taxonomy of critical job search intervention components. Summarizing the data from 47 experimentally or quasi-experimentally evaluated job search interventions, we found that the odds of obtaining employment were 2.67 times higher for job seekers participating in job search interventions compared to job seekers in the control group, who did not participate in such intervention programs. Our moderator analysis also suggested that job search interventions that contained certain components, including teaching job search skills, improving self-presentation, boosting self-efficacy, encouraging proactivity, promoting goal setting, and enlisting social support, were more effective than interventions that did not include such components. More important, job search interventions effectively promoted employment only when both skill development and motivation enhancement were included. In addition, we found that job search interventions were more effective in helping younger and older (vs. middle-aged) job seekers, short-term (vs. long-term) unemployed job seekers, and job seekers with special needs and conditions (vs. job seekers in general) to find employment. Furthermore, meta-analytic path analysis revealed that increased job search skills, job search self-efficacy, and job search behaviors partially mediated the positive effect of job search interventions on obtaining employment. Theoretical and practical implications and future research directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  13. Authentic Oral Language Production and Interaction in CALL: An Evolving Conceptual Framework for the Use of Learning Analytics within the SpeakApps Project

    ERIC Educational Resources Information Center

    Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine

    2014-01-01

    This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…

  14. Enhance your team-based qualitative research.

    PubMed

    Fernald, Douglas H; Duclos, Christine W

    2005-01-01

    Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems from differing skills and styles, and how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.

  15. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

    Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
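
    A client request against such a WPS endpoint might look like the sketch below; the URL, operation identifier, and datainputs syntax are illustrative assumptions, not the actual CDAS API.

        import requests

        params = {
            "service": "WPS",
            "request": "Execute",
            "identifier": "CDAS.average",   # hypothetical kernel name
            "datainputs": '[variable="tas";domain="global";period="1980-2010"]',
        }
        # Hypothetical endpoint; a real deployment would publish its own URL.
        resp = requests.get("https://example.nasa.gov/wps", params=params, timeout=60)
        print(resp.status_code)
        print(resp.text[:500])              # WPS responses are XML status documents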

  16. ANALYTiC: An Active Learning System for Trajectory Classification.

    PubMed

    Soares Junior, Amilcar; Renso, Chiara; Matwin, Stan

    2017-01-01

    The increasing availability and use of positioning devices has resulted in large volumes of trajectory data. However, semantic annotations for such data are typically added by domain experts, which is a time-consuming task. Machine-learning algorithms can help infer semantic annotations from trajectory data by learning from sets of labeled data. Specifically, active learning approaches can minimize the set of trajectories to be annotated while preserving good performance measures. The ANALYTiC web-based interactive tool visually guides users through this annotation process.
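
    A generic uncertainty-sampling loop of the kind such tools build on can be sketched as follows; the trajectory features and the labeling oracle are simulated stand-ins, not ANALYTiC's own models.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))                       # stand-in trajectory features
        true_labels = (X[:, 0] + X[:, 1] > 0).astype(int)   # stand-in annotation oracle

        # Seed with a few labeled examples from each class.
        labeled = list(np.where(true_labels == 0)[0][:5]) + \
                  list(np.where(true_labels == 1)[0][:5])

        for _ in range(20):                                 # 20 annotation rounds
            clf = LogisticRegression().fit(X[labeled], true_labels[labeled])
            proba = clf.predict_proba(X)[:, 1]
            candidates = np.argsort(np.abs(proba - 0.5))    # most uncertain first
            next_idx = next(int(i) for i in candidates if i not in labeled)
            labeled.append(next_idx)                        # "annotate" the queried item

        print("labels used:", len(labeled),
              "accuracy:", (clf.predict(X) == true_labels).mean())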

  17. Metal-amplified Density Assays, (MADAs), including a Density-Linked Immunosorbent Assay (DeLISA).

    PubMed

    Subramaniam, Anand Bala; Gonidec, Mathieu; Shapiro, Nathan D; Kresse, Kayleigh M; Whitesides, George M

    2015-02-21

    This paper reports the development of Metal-amplified Density Assays, or MADAs - a method of conducting quantitative or multiplexed assays, including immunoassays, by using Magnetic Levitation (MagLev) to measure metal-amplified changes in the density of beads labeled with biomolecules. The binding of target analytes (i.e., proteins, antibodies, antigens) to complementary ligands immobilized on the surface of the beads, followed by a chemical amplification of the binding in a form that results in a change in the density of the beads (achieved by using gold nanoparticle-labeled biomolecules, and electroless deposition of gold or silver), translates analyte binding events into changes in density measurable using MagLev. A minimal model based on diffusion-limited growth of hemispherical nuclei on a surface reproduces the dynamics of the assay. A MADA - when performed with antigens and antibodies - is called a Density-Linked Immunosorbent Assay, or DeLISA. Two immunoassays provided a proof of principle: a competitive quantification of the concentration of neomycin in whole milk, and a multiplexed detection of antibodies against Hepatitis C virus NS3 protein and syphilis T. pallidum p47 protein in serum. MADAs, including DeLISAs, require, besides the requisite biomolecules and amplification reagents, minimal specialized equipment (two permanent magnets, a ruler or a capillary with calibrated length markings) and no electrical power to obtain a quantitative readout of analyte concentration. With further development, the method may be useful in resource-limited or point-of-care settings.

  18. Analytic proof of the existence of the Lorenz attractor in the extended Lorenz model

    NASA Astrophysics Data System (ADS)

    Ovsyannikov, I. I.; Turaev, D. V.

    2017-01-01

    We give an analytic (free of computer assistance) proof of the existence of a classical Lorenz attractor for an open set of parameter values of the Lorenz model in the form of Yudovich-Morioka-Shimizu. The proof is based on detection of a homoclinic butterfly with a zero saddle value and rigorous verification of one of the Shilnikov criteria for the birth of the Lorenz attractor; we also supply a proof for this criterion. The results are applied in order to give an analytic proof for the existence of a robust, pseudohyperbolic strange attractor (the so-called discrete Lorenz attractor) for an open set of parameter values in a 4-parameter family of 3D Hénon-like diffeomorphisms.

  19. An analytic data analysis method for oscillatory slug tests.

    PubMed

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extremities, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
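
    The extreme-point bookkeeping behind such an analysis can be sketched as below on a synthetic record; the step from the measured periods and amplitude decay to hydraulic conductivity (the van der Kamp-type equations) is not reproduced here.

        import numpy as np
        from scipy.signal import argrelextrema

        t = np.linspace(0, 30, 3000)
        w = np.exp(-0.12 * t) * np.cos(1.1 * t)        # synthetic oscillatory response

        imax = argrelextrema(w, np.greater)[0]
        imin = argrelextrema(w, np.less)[0]
        idx = np.sort(np.concatenate([imax, imin]))    # all extrema, in time order

        half_period = np.diff(t[idx]).mean()           # successive extrema are T/2 apart
        amps = np.abs(w[idx])
        log_dec = np.log(amps[:-1] / amps[1:]).mean()  # amplitude decay per half cycle

        print("period estimate:", 2 * half_period)                # ~ 2*pi/1.1
        print("decay constant estimate:", log_dec / half_period)  # ~ 0.12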

  20. Creating Web Area Segments with Google Analytics

    EPA Pesticide Factsheets

    Segments allow you to quickly access data for a predefined set of Sessions or Users, such as government or education users, or sessions in a particular state. You can then apply this segment to any report within the Google Analytics (GA) interface.

  1. The U.S. national nuclear forensics library, nuclear materials information program, and data dictionary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamont, Stephen Philip; Brisson, Marcia; Curry, Michael

    2011-02-17

    Nuclear forensics assessments to determine material process history require careful comparison of sample data to both measured and modeled nuclear material characteristics. Developing centralized databases, or nuclear forensics libraries, to house this information is an important step to ensure all relevant data will be available for comparison during a nuclear forensics analysis and help expedite the assessment of material history. The approach most widely accepted by the international community at this time is the implementation of National Nuclear Forensics libraries, which would be developed and maintained by individual nations. This is an attractive alternative to an international database since it provides an understanding that each country has data on materials produced and stored within its borders, but eliminates the need to reveal any proprietary or sensitive information to other nations. To support the concept of National Nuclear Forensics libraries, the United States Department of Energy has developed a model library, based on a data dictionary, or set of parameters designed to capture all nuclear forensic relevant information about a nuclear material. Specifically, information includes material identification, collection background and current location, analytical laboratories where measurements were made, material packaging and container descriptions, physical characteristics including mass and dimensions, chemical and isotopic characteristics, particle morphology or metallurgical properties, process history including facilities, and measurement quality assurance information. While not necessarily required, it may also be valuable to store modeled data sets including reactor burn-up or enrichment cascade data for comparison. It is fully expected that only a subset of this information is available or relevant to many materials, and much of the data populating a National Nuclear Forensics library would be process analytical or material accountability measurement data as opposed to a complete forensic analysis of each material in the library.

  2. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure developments to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  3. Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J; Newes, Emily K

    2017-12-05

    The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios using the same analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting, and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.

  4. A Numerical-Analytical Approach Based on Canonical Transformations for Computing Optimal Low-Thrust Transfers

    NASA Astrophysics Data System (ADS)

    da Silva Fernandes, S.; das Chagas Carvalho, F.; Bateli Romão, J. V.

    2018-04-01

    A numerical-analytical procedure based on infinitesimal canonical transformations is developed for computing optimal time-fixed low-thrust limited power transfers (no rendezvous) between coplanar orbits with small eccentricities in an inverse-square force field. The optimization problem is formulated as a Mayer problem with a set of non-singular orbital elements as state variables. Second order terms in eccentricity are considered in the development of the maximum Hamiltonian describing the optimal trajectories. The two-point boundary value problem of going from an initial orbit to a final orbit is solved by means of a two-stage Newton-Raphson algorithm which uses an infinitesimal canonical transformation. Numerical results are presented for some transfers between circular orbits with moderate radius ratio, including a preliminary analysis of Earth-Mars and Earth-Venus missions.

  5. Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J; Newes, Emily K

    The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios using the same analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting, and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.

  6. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature.

    PubMed

    Ye, Ning; Yin, Hengfu; Liu, Jingjing; Dai, Xiaogang; Yin, Tongming

    2015-01-01

    The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for exploiting its biological meaning. When searching for coexpression genes, the data mining process is largely affected by the selection of algorithms. Thus, it is highly desirable to provide multiple options of algorithms in a user-friendly analytical toolkit to explore gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify the coexpression genes, and enables the users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing multiple algorithms for analyzing the gene expression signatures.

  7. An analytical model for subsurface irradiance and remote sensing reflectance in deep and shallow case-2 waters.

    PubMed

    Albert, A; Mobley, C

    2003-11-03

    Subsurface remote sensing signals, represented by the irradiance reflectance and the remote sensing reflectance, were investigated. The present study is based on simulations with the radiative transfer program Hydrolight, using optical properties of Lake Constance (German: Bodensee) derived from in-situ measurements of the water constituents and the bottom characteristics. Analytical equations are derived for the irradiance reflectance and remote sensing reflectance for deep and shallow water applications. The inputs of the parameterization are the inherent optical properties of the water: absorption a(λ) and backscattering bb(λ). Additionally, the solar zenith angle θs, the viewing angle θv, and the surface wind speed u are considered. For shallow water applications the bottom albedo RB and the bottom depth zB are included in the parameterizations. The result is a complete set of analytical equations for the remote sensing signals R and Rrs in deep and shallow waters with an accuracy better than 4%. In addition, parameterizations of apparent optical properties were derived for the upward and downward diffuse attenuation coefficients Ku and Kd.
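
    A generic sketch of this class of parameterization expresses the signal in terms of x = bb/(a + bb); the polynomial coefficients below are placeholders rather than the coefficients fitted in the paper, and the dependence on θs, θv, wind speed, RB, and zB is omitted.

        import numpy as np

        def irradiance_reflectance(a, bb, coeffs=(0.0, 0.33, 0.1)):
            """R(0-) as a polynomial in x = bb/(a + bb); coeffs are placeholders."""
            x = bb / (a + bb)
            return sum(c * x**i for i, c in enumerate(coeffs))

        a = np.array([0.5, 1.0, 2.0])      # absorption, 1/m
        bb = np.array([0.05, 0.05, 0.05])  # backscattering, 1/m
        print(irradiance_reflectance(a, bb))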

  8. Nuclear magnetic resonance signal dynamics of liquids in the presence of distant dipolar fields, revisited

    PubMed Central

    Barros, Wilson; Gochberg, Daniel F.; Gore, John C.

    2009-01-01

    The description of the nuclear magnetic resonance magnetization dynamics in the presence of long-range dipolar interactions, which is based upon approximate solutions of Bloch–Torrey equations including the effect of a distant dipolar field, has been revisited. New experiments show that approximate analytic solutions have a broader regime of validity as well as dependencies on pulse-sequence parameters that seem to have been overlooked. In order to explain these experimental results, we developed a new method consisting of calculating the magnetization via an iterative formalism where both diffusion and distant dipolar field contributions are treated as integral operators incorporated into the Bloch–Torrey equations. The solution can be organized as a perturbative series, whereby access to higher order terms allows one to set better boundaries on validity regimes for analytic first-order approximations. Finally, the method legitimizes the use of simple analytic first-order approximations under less demanding experimental conditions, it predicts new pulse-sequence parameter dependencies for the range of validity, and clarifies weak points in previous calculations. PMID:19425789

  9. A General Simulator Using State Estimation for a Space Tug Navigation System. [computerized simulation, orbital position estimation and flight mechanics

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III

    1975-01-01

    A general simulation program (GSP) is presented involving nonlinear state estimation for space vehicle flight navigation systems. A complete explanation of the iterative guidance mode guidance law, derivation of the dynamics, coordinate frames, and state estimation routines is given so as to fully clarify the assumptions and approximations involved, so that simulation results can be placed in their proper perspective. A complete set of computer acronyms and their definitions, as well as explanations of the subroutines used in the GSP simulator, are included. To facilitate input/output, a complete set of compatible numbers, with units, is included to aid in data development. Format specifications, output data phrase meanings and purposes, and computer card data input are clearly spelled out. A large number of simulation and analytical studies were used to determine the validity of the simulator itself as well as various data runs.

  10. Evaluation of pre-analytical conditions and comparison of the performance of several digital PCR assays for the detection of major EGFR mutations in circulating DNA from non-small cell lung cancers: the CIRCAN_0 study

    PubMed Central

    Garcia, Jessica; Dusserre, Eric; Cheynet, Valérie; Bringuier, Pierre Paul; Brengle-Pesce, Karen; Wozny, Anne-Sophie; Rodriguez-Lafrasse, Claire; Freyer, Gilles; Brevet, Marie; Payen, Léa; Couraud, Sébastien

    2017-01-01

    Noninvasive somatic detection assays are suitable for repetitive tumor characterization or for detecting the appearance of somatic resistance during lung cancer. Molecular diagnosis based on circulating free DNA (cfDNA) offers the opportunity to track the genomic evolution of the tumor, and was chosen to assess the molecular profile of several EGFR alterations, including deletions in exon 19 (delEX19), the L858R substitution in exon 21, and the EGFR resistance mutation T790M in exon 20. Our study aimed at determining optimal pre-analytical conditions and EGFR mutation detection assays for analyzing cfDNA using the picoliter-droplet digital polymerase chain reaction (ddPCR) assay. Within the framework of the CIRCAN project set up at the Lyon University Hospital, plasma samples were collected to establish a pre-analytical and analytical workflow of cfDNA analysis. We evaluated all of the steps from blood sampling to mutation detection output, including shipping conditions (4 h versus 24 h in EDTA tubes), the reproducibility of cfDNA extraction, the specificity/sensitivity of ddPCR (using external controls), and the comparison of different PCR assays for the detection of the three most important EGFR hotspots, which highlighted the increased sensitivity of our in-house primers/probes. Hence, we have described a new protocol facilitating the molecular detection of somatic mutations in cancer patients from liquid biopsies, improving their diagnosis and introducing a less traumatic monitoring system during tumor progression. PMID:29152135

  11. Analytical energy gradients for explicitly correlated wave functions. I. Explicitly correlated second-order Møller-Plesset perturbation theory

    NASA Astrophysics Data System (ADS)

    Győrffy, Werner; Knizia, Gerald; Werner, Hans-Joachim

    2017-12-01

    We present the theory and algorithms for computing analytical energy gradients for explicitly correlated second-order Møller-Plesset perturbation theory (MP2-F12). The main difficulty in F12 gradient theory arises from the large number of two-electron integrals for which effective two-body density matrices and integral derivatives need to be calculated. For efficiency, the density fitting approximation is used for evaluating all two-electron integrals and their derivatives. The accuracies of various previously proposed MP2-F12 approximations [3C, 3C(HY1), 3*C(HY1), and 3*A] are demonstrated by computing equilibrium geometries for a set of molecules containing first- and second-row elements, using double-ζ to quintuple-ζ basis sets. Generally, the convergence of the bond lengths and angles with respect to the basis set size is strongly improved by the F12 treatment, and augmented triple-ζ basis sets are sufficient to closely approach the basis set limit. The results obtained with the different approximations differ only very slightly. This paper is the first step towards analytical gradients for coupled-cluster singles and doubles with perturbative treatment of triple excitations, which will be presented in the second part of this series.

  12. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both theoretical and experimental viewpoints. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
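
    A minimal principal component regression sketch on synthetic stand-in voltammetric data illustrates the calibration idea; the caution of the paper is that the training rows must be collected under the same experimental conditions as the data to be analyzed.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        n_train, n_points = 40, 100
        template = np.sin(np.linspace(0, np.pi, n_points))  # stand-in voltammogram shape
        conc = rng.uniform(0, 1, n_train)                   # known training concentrations
        X_train = np.outer(conc, template) + 0.01 * rng.normal(size=(n_train, n_points))

        # PCA captures the current-potential covariation; regression maps
        # principal component scores back to concentration.
        pcr = make_pipeline(PCA(n_components=2), LinearRegression())
        pcr.fit(X_train, conc)

        X_new = np.outer([0.3, 0.7], template)              # unknowns to quantify
        print(pcr.predict(X_new))                           # approximately [0.3, 0.7]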

  13. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  14. A geovisual analytic approach to understanding geo-social relationships in the international trade network.

    PubMed

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly 'balkanized' (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above.
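
    The Monte-Carlo validation step can be sketched generically as a permutation test, comparing an observed statistic against its distribution under random relabeling; the data and statistic below are stand-ins, not the paper's two indicators.

        import numpy as np

        rng = np.random.default_rng(0)
        cluster = rng.normal(1.0, 1.0, 50)   # e.g., distances within a network cluster
        others = rng.normal(0.0, 1.0, 50)    # e.g., distances outside the cluster
        observed = cluster.mean() - others.mean()

        pooled = np.concatenate([cluster, others])
        null = []
        for _ in range(10000):               # Monte-Carlo relabelings
            rng.shuffle(pooled)
            null.append(pooled[:50].mean() - pooled[50:].mean())

        p_value = np.mean(np.abs(null) >= abs(observed))
        print(f"observed difference {observed:.2f}, permutation p = {p_value:.4f}")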

  15. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  16. Teachable, high-content analytics for live-cell, phase contrast movies.

    PubMed

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing them with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  17. A Geovisual Analytic Approach to Understanding Geo-Social Relationships in the International Trade Network

    PubMed Central

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M.

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly ‘balkanized’ (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above. PMID:24558409

  18. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  19. Aggregating Individual Preferences in the Analytic Hierarchy Process Applied to the 1983 Battelle TAV Study.

    DTIC Science & Technology

    1985-03-15

    elicitation - rankings, ratings, and pairwise comparisons, 2) Value Theory: includes an explanation of the AHP and fuzzy set theory, and 3) Group... AHP are better tools for these "fuzzy" applications. These results apply directly to this thesis. The original Battelle survey used direct ratings to...independent of three aggregation techniques: geometric mean input, arithmetic mean vector output, and majority rule output. The AHP consistency index was

  20. Laboratory approach for diagnosis of toluene-based inhalant abuse in a clinical setting

    PubMed Central

    Jain, Raka; Verma, Arpita

    2016-01-01

    The steady increase of inhalant abuse is a great challenge for analytical toxicologists. This review describes an overview of inhalant abuse including the extent of the problem, types of products abused, modes of administration, pharmacology and effects of inhalants, the role of laboratory, interpretation of laboratory results and clinical considerations. Regular laboratory screening for inhalant abuse as well as other substance abuse and health risk behaviors must be a part of standard clinical care. PMID:26957863

  1. Integrable Time-Dependent Quantum Hamiltonians

    NASA Astrophysics Data System (ADS)

    Sinitsyn, Nikolai A.; Yuzbashyan, Emil A.; Chernyak, Vladimir Y.; Patra, Aniket; Sun, Chen

    2018-05-01

    We formulate a set of conditions under which the nonstationary Schrödinger equation with a time-dependent Hamiltonian is exactly solvable analytically. The main requirement is the existence of a non-Abelian gauge field with zero curvature in the space of system parameters. Known solvable multistate Landau-Zener models satisfy these conditions. Our method provides a strategy to incorporate time dependence into various quantum integrable models while maintaining their integrability. We also validate some prior conjectures, including the solution of the driven generalized Tavis-Cummings model.
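
    For reference, the zero-curvature condition mentioned above is the standard flatness requirement on a non-Abelian gauge field A; written for two parameters (t, x), it reads (a textbook identity, not an equation quoted from this paper):

        \[
        F_{tx} = \partial_t A_x - \partial_x A_t + [A_t, A_x] = 0 .
        \]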

  2. TEM Study of SAFARI-2000 Aerosols

    NASA Technical Reports Server (NTRS)

    Buseck, Peter R.

    2004-01-01

    The aim of our research was to obtain data on the chemical and physical properties of individual aerosol particles from biomass smoke plumes in southern Africa and from air masses in the region that are affected by the smoke. We used analytical transmission electron microscopy (ATEM), including energy-dispersive X-ray spectrometry (EDS) and electron energy-loss spectroscopy (EELS), and field-emission scanning electron microscopy (FESEM) to study aerosol particles from several smoke and haze samples and from a set of cloud samples.

  3. FAST TRACK COMMUNICATION: Uniqueness of static black holes without analyticity

    NASA Astrophysics Data System (ADS)

    Chruściel, Piotr T.; Galloway, Gregory J.

    2010-08-01

    We show that the hypothesis of analyticity in the uniqueness theory of vacuum, or electrovacuum, static black holes is not needed. More generally, we show that prehorizons covering a closed set cannot occur in well-behaved domains of outer communications.

  4. The precision of wet atmospheric deposition data from national atmospheric deposition program/national trends network sites determined with collocated samplers

    USGS Publications Warehouse

    Nilles, M.A.; Gordon, J.D.; Schroder, L.J.

    1994-01-01

    A collocated, wet-deposition sampler program has been operated since October 1988 by the U.S. Geological Survey to estimate the overall sampling precision of wet atmospheric deposition data collected at selected sites in the National Atmospheric Deposition Program and National Trends Network (NADP/NTN). A duplicate set of wet-deposition sampling instruments was installed adjacent to existing sampling instruments at four different NADP/NTN sites for each year of the study. Wet-deposition samples from collocated sites were collected and analysed using standard NADP/NTN procedures. Laboratory analyses included determinations of pH, specific conductance, and concentrations of major cations and anions. The estimates of precision included all variability in the data-collection system, from the point of sample collection through storage in the NADP/NTN database. Sampling precision was determined from the absolute value of differences in the analytical results for the paired samples, in terms of median relative and absolute difference. The median relative difference for Mg2+, Na+, K+, and NH4+ concentration and deposition was quite variable between sites and exceeded 10% at most sites. Relative error for analytes whose concentrations typically approached laboratory method detection limits was greater than for analytes whose concentrations did not. The median relative difference for SO42- and NO3- concentration, specific conductance, and sample volume at all sites was less than 7%. Precision for H+ concentration and deposition ranged from less than 10% at sites with typically high levels of H+ concentration to greater than 30% at sites with low H+ concentration. The median difference for analyte concentration and deposition was typically 1.5 to 2 times greater for samples collected during the winter than during other seasons at two northern sites. Likewise, the median relative difference in sample volume for winter samples was more than double the annual median relative difference at the two northern sites. Bias accounted for less than 25% of the collocated variability in analyte concentration and deposition from weekly collocated precipitation samples at most sites.

  5. Small-x asymptotics of the quark helicity distribution: Analytic results

    DOE PAGES

    Kovchegov, Yuri V.; Pitonyak, Daniel; Sievert, Matthew D.

    2017-06-15

    We analytically solve the evolution equations for the small-x asymptotic behavior of the (flavor-singlet) quark helicity distribution in the large-Nc limit. These evolution equations form a set of coupled integro-differential equations, which previously could only be solved numerically. That approximate numerical solution, however, revealed simplifying properties of the small-x asymptotics, which we exploit here to obtain an analytic solution.

  6. Analytic representation of FK/Fπ in two loop chiral perturbation theory

    NASA Astrophysics Data System (ADS)

    Ananthanarayan, B.; Bijnens, Johan; Friot, Samuel; Ghosh, Shayan

    2018-05-01

    We present an analytic representation of FK/Fπ as calculated in three-flavor two-loop chiral perturbation theory, which involves expressing three mass scale sunsets in terms of Kampé de Fériet series. We demonstrate how approximations may be made to obtain relatively compact analytic representations. An illustrative set of fits using lattice data is also presented, which shows good agreement with existing fits.

  7. Analyzing chromatographic data using multilevel modeling.

    PubMed

    Wiczling, Paweł

    2018-06-01

    It is relatively easy to collect chromatographic measurements for a large number of analytes, especially with gradient chromatographic methods coupled with mass spectrometry detection. Such data often have a hierarchical or clustered structure. For example, analytes with similar hydrophobicity and dissociation constant tend to be more alike in their retention than a randomly chosen set of analytes. Multilevel models recognize the existence of such data structures by assigning a model for each parameter, with its parameters also estimated from data. In this work, a multilevel model is proposed to describe retention time data obtained from a series of wide linear organic modifier gradients of different gradient duration and different mobile phase pH for a large set of acids and bases. The multilevel model consists of (1) the same deterministic equation describing the relationship between retention time and analyte-specific and instrument-specific parameters, (2) covariance relationships relating various physicochemical properties of the analyte to chromatographically specific parameters through quantitative structure-retention relationship based equations, and (3) stochastic components of intra-analyte and interanalyte variability. The model was implemented in Stan, which provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods. Graphical abstract: relationships between log k and MeOH content for acidic, basic, and neutral compounds with different log P (CI: credible interval; PSA: polar surface area).
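
    A minimal sketch of the three-layer structure described above, assuming a simple linear-solvent-strength retention equation and illustrative population constants (none of the numbers are from the paper; a full treatment would fit all layers jointly, e.g. by MCMC in Stan):

        import numpy as np

        rng = np.random.default_rng(0)

        # Layer 2: covariance (QSRR-like) relationships tying analyte-specific
        # parameters to a physicochemical descriptor (log P); all hypothetical.
        a0, a1 = 1.2, 0.85        # log kw ~ a0 + a1 * logP
        b0, b1 = 2.0, 0.35        # S (solvent strength) ~ b0 + b1 * logP
        tau_kw, tau_S = 0.3, 0.2  # Layer 3a: inter-analyte SDs
        sigma = 0.05              # Layer 3b: intra-analyte (residual) SD, min
        t0 = 1.0                  # column dead time, min

        logP = rng.uniform(0.0, 4.0, size=20)
        log_kw = a0 + a1 * logP + rng.normal(0.0, tau_kw, 20)
        S = b0 + b1 * logP + rng.normal(0.0, tau_S, 20)

        # Layer 1: deterministic retention equation (isocratic LSS form)
        phi = 0.5                               # organic modifier fraction
        k = 10.0 ** (log_kw - S * phi)          # retention factor
        t_obs = t0 * (1.0 + k) + rng.normal(0.0, sigma, 20)
        print(t_obs.round(2))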

  8. A comparative analysis of the categorization of multidimensional stimuli: I. Unidimensional classification does not necessarily imply analytic processing; evidence from pigeons (Columba livia), squirrels (Sciurus carolinensis), and humans (Homo sapiens).

    PubMed

    Wills, A J; Lea, Stephen E G; Leaver, Lisa A; Osthaus, Britta; Ryan, Catriona M E; Suret, Mark B; Bryant, Catherine M L; Chapman, Sue J A; Millar, Louise

    2009-11-01

    Pigeons (Columba livia), gray squirrels (Sciurus carolinensis), and undergraduates (Homo sapiens) learned discrimination tasks involving multiple mutually redundant dimensions. First, pigeons and undergraduates learned conditional discriminations between stimuli composed of three spatially separated dimensions, after first learning to discriminate the individual elements of the stimuli. When subsequently tested with stimuli in which one of the dimensions took an anomalous value, the majority of both species categorized test stimuli by their overall similarity to training stimuli. However, some individuals of both species categorized them according to a single dimension. In a second set of experiments, squirrels, pigeons, and undergraduates learned go/no-go discriminations using multiple simultaneous presentations of stimuli composed of three spatially integrated, highly salient dimensions. The tendency to categorize test stimuli including anomalous dimension values unidimensionally was higher than in the first set of experiments and did not differ significantly between species. The authors conclude that unidimensional categorization of multidimensional stimuli is not diagnostic for analytic cognitive processing, and that any differences between humans' and pigeons' behavior in such tasks are not due to special features of avian visual cognition.

  9. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2014-12-23

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the usage of three advertising channels. Several recent works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery process.
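
    As a back-of-the-envelope illustration of how such parameter settings trade off (a simplified independent-events assumption, not the authors' model), the expected latency can be sketched as:

        # Rough expected BLE discovery latency: each advertising event is
        # assumed to be caught with probability scan_window / scan_interval.
        def expected_discovery_latency(adv_interval_ms: float,
                                       scan_interval_ms: float,
                                       scan_window_ms: float) -> float:
            p = min(1.0, scan_window_ms / scan_interval_ms)  # catch probability
            return adv_interval_ms / p   # geometric mean waiting time x interval

        print(expected_discovery_latency(100.0, 100.0, 100.0))   # continuous scan
        print(expected_discovery_latency(100.0, 1000.0, 100.0))  # 10% duty cycle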

  10. Analysis of Latency Performance of Bluetooth Low Energy (BLE) Networks

    PubMed Central

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2015-01-01

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the usage of three advertising channels. Several recent works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery process. PMID:25545266

  11. Workplace Skills Taught in a Simulated Analytical Department

    NASA Astrophysics Data System (ADS)

    Sonchik Marine, Susan

    2001-11-01

    Integration of workplace skills into the academic setting is paramount for any chemical technology program. In addition to the expected chemistry content, courses must build proficiency in oral and written communication skills, computer skills, laboratory safety, and logical troubleshooting. Miami University's Chemical Technology II course is set up as a contract analytical laboratory. Students apply the advanced sampling techniques, quality assurance, standard methods, and statistical analyses they have studied. For further integration of workplace skills, weekly "department meetings" are held where the students, as members of the department, report on their work in process, present completed projects, and share what they have learned and what problems they have encountered. Information is shared between the experienced members of the department and those encountering problems or starting a new project. The instructor, as department manager, makes announcements, reviews company and department status, and assigns work for the coming week. The department members report results to clients in formal reports or in short memos. Factors affecting the success of the "department meeting" approach include the formality of the meeting room; use of an official agenda; the frequency, time, and duration of the meeting; and accountability of the students.

  12. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML)

    PubMed Central

    Lechevalier, D.; Ak, R.; Ferguson, M.; Law, K. H.; Lee, Y.-T. T.; Rachuri, S.

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an extensible-markup-language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain. PMID:29202125

  13. Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).

    PubMed

    Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S

    2017-01-01

    This paper describes Gaussian process regression (GPR) models presented in predictive model markup language (PMML). PMML is an extensible-markup-language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being employed in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
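
    A minimal numpy sketch of GPR prediction with the confidence bounds that the new PMML element is intended to carry (kernel, data, and hyperparameters are illustrative):

        import numpy as np

        def rbf(x1, x2, ls=1.0, var=1.0):
            # Squared-exponential kernel between two 1-D input sets
            d2 = (x1[:, None] - x2[None, :]) ** 2
            return var * np.exp(-0.5 * d2 / ls ** 2)

        X = np.linspace(0.0, 5.0, 8)      # hypothetical training inputs
        y = np.sin(X)                     # hypothetical training targets

        noise = 1e-4
        L = np.linalg.cholesky(rbf(X, X) + noise * np.eye(len(X)))
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

        Xs = np.linspace(0.0, 5.0, 50)    # prediction grid
        Ks = rbf(X, Xs)
        mu = Ks.T @ alpha                                  # predictive mean
        v = np.linalg.solve(L, Ks)
        var = np.maximum(rbf(Xs, Xs).diagonal() - (v ** 2).sum(axis=0), 0.0)
        lower, upper = mu - 1.96 * np.sqrt(var), mu + 1.96 * np.sqrt(var)
        print(lower[:3].round(3), upper[:3].round(3))      # 95% confidence bounds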

  14. Correlation of finite element free vibration predictions using random vibration test data. M.S. Thesis - Cleveland State Univ.

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1994-01-01

    Finite element analysis is regularly used during the engineering cycle of mechanical systems to predict the response to static, thermal, and dynamic loads. The finite element model (FEM) used to represent the system is often correlated with physical test results to determine the validity of the analytical results provided. Results from dynamic testing provide one means of performing this correlation. One of the most common methods of measuring accuracy is classical modal testing, whereby vibratory mode shapes are compared to mode shapes provided by finite element analysis. The degree of correlation between the test and analytical mode shapes can be shown mathematically using the cross-orthogonality check. A great deal of time and effort can be expended in generating the set of test-acquired mode shapes needed for the cross-orthogonality check. In most situations, response data from vibration tests are digitally processed to generate the mode shapes from a combination of modal parameters, forcing functions, and recorded response data. An alternate method is proposed in which the same correlation of analytical and test-acquired mode shapes can be achieved without conducting the modal survey. Instead, a procedure is detailed in which a minimum of test information, specifically the acceleration response data from a random vibration test, is used to generate a set of equivalent local accelerations to be applied to the reduced analytical model at discrete points corresponding to the test measurement locations. The static solution of the analytical model then produces a set of deformations that, once normalized, can be used to represent the test-acquired mode shapes in the cross-orthogonality relation. The method proposed has been shown to provide accurate results for both a simple analytical model and a complex space flight structure.
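
    For concreteness, the mass-weighted cross-orthogonality check mentioned above can be sketched as follows (the mass matrix and mode shapes are illustrative, not from the thesis):

        import numpy as np

        def cross_orthogonality(phi_test, phi_fem, M):
            # Mass-normalize each mode-shape set, then form phi_test^T M phi_fem;
            # diagonal terms near 1 and off-diagonal terms near 0 indicate
            # well-correlated, mass-orthogonal mode shapes.
            def normalize(phi):
                norms = np.sqrt(np.einsum('ji,jk,ki->i', phi, M, phi))
                return phi / norms
            return normalize(phi_test).T @ M @ normalize(phi_fem)

        # Illustrative 3-DOF system: FEM modes from the generalized eigenproblem
        M = np.diag([2.0, 1.0, 1.0])
        K = np.array([[4.0, -2.0, 0.0], [-2.0, 4.0, -2.0], [0.0, -2.0, 2.0]])
        Mi = np.diag(1.0 / np.sqrt(np.diag(M)))
        _, u = np.linalg.eigh(Mi @ K @ Mi)
        phi_fem = (Mi @ u)[:, :2]               # first two mass-orthogonal modes

        rng = np.random.default_rng(1)
        phi_test = phi_fem + 0.02 * rng.normal(size=(3, 2))  # noisy "test" shapes
        print(cross_orthogonality(phi_test, phi_fem, M).round(3))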

  15. Routine development of objectively derived search strategies.

    PubMed

    Hausner, Elke; Waffenschmidt, Siw; Kaiser, Thomas; Simon, Michael

    2012-02-29

    Over the past few years, information retrieval has become more and more professionalized, and information specialists are considered full members of a research team conducting systematic reviews. Research groups preparing systematic reviews and clinical practice guidelines have been the driving force in the development of search strategies, but open questions remain regarding the transparency of the development process and the available resources. An empirically guided approach to the development of a search strategy provides a way to increase transparency and efficiency. Our aim in this paper is to describe the empirically guided development process for search strategies as applied by the German Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, or "IQWiG"). This strategy consists of the following steps: generation of a test set, as well as the development, validation and standardized documentation of the search strategy. We illustrate our approach by means of an example, that is, a search for literature on brachytherapy in patients with prostate cancer. For this purpose, a test set was generated, including a total of 38 references from 3 systematic reviews. The development set for the generation of the strategy included 25 references. After application of textual analytic procedures, a strategy was developed that included all references in the development set. To test the search strategy on an independent set of references, the remaining 13 references in the test set (the validation set) were used. The validation set was also completely identified. Our conclusion is that an objectively derived approach similar to that used in search filter development is a feasible way to develop and validate reliable search strategies. Besides creating high-quality strategies, the widespread application of this approach will result in a substantial increase in the transparency of the development process of search strategies.

  16. Impression Management and Interview and Job Performance Ratings: A Meta-Analysis of Research Design with Tactics in Mind.

    PubMed

    Peck, Jessica A; Levashina, Julia

    2017-01-01

    Impression management (IM) is pervasive in interview and job performance settings. We meta-analytically examine IM by self- and other-focused tactics to establish base rates of tactic usage, to understand the impact of tactics on interview and job performance ratings, and to examine the moderating effects of research design. Our results suggest IM is used more frequently in the interview rather than job performance settings. Self-focused tactics are more effective in the interview rather than in job performance settings, and other-focused tactics are more effective in job performance settings rather than in the interview. We explore several research design moderators including research fidelity, rater, and participants. IM has a somewhat stronger impact on interview ratings in lab settings than field settings. IM also has a stronger impact on interview ratings when the target of IM is also the rater of performance than when the rater of performance is an observer. Finally, labor market participants use IM more frequently and more effectively than students in interview settings. Our research has implications for understanding how different IM tactics function in interview and job performance settings and the effects of research design on IM frequency and impact.

  17. Supersaturation Control using Analytical Crystal Size Distribution Estimator for Temperature Dependent in Nucleation and Crystal Growth Phenomena

    NASA Astrophysics Data System (ADS)

    Zahari, Zakirah Mohd; Zubaidah Adnan, Siti; Kanthasamy, Ramesh; Saleh, Suriyati; Samad, Noor Asma Fazli Abdul

    2018-03-01

    The specification of the crystal product is usually given in terms of a crystal size distribution (CSD). To this end, an optimal cooling strategy is necessary to achieve the target CSD. Direct design control involving an analytical CSD estimator is one approach that can be used to generate the set-point. However, the effects of temperature on the crystal growth rate are neglected in the estimator. Thus, the temperature dependence of the crystal growth rate needs to be considered in order to provide an accurate set-point. The objective of this work is to extend the analytical CSD estimator with an Arrhenius expression covering the effects of temperature on the growth rate. The application of this work is demonstrated through a potassium sulphate crystallisation process. Based on a specified target CSD, the extended estimator is capable of generating the required set-point, and a proposed controller successfully maintained the operation at the set-point to achieve the target CSD. Comparison with other cooling strategies shows a reduction of up to 18.2% in the total number of undesirable crystals generated from secondary nucleation, relative to a linear cooling strategy.
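
    A minimal sketch of the Arrhenius-type temperature dependence referred to above (all constants are illustrative, not the paper's):

        import math

        R = 8.314  # gas constant, J/(mol K)

        def growth_rate(supersaturation, T_kelvin, kg0=1.4e5, Eg=4.0e4, g=1.5):
            # G = kg0 * exp(-Eg / (R T)) * S^g  (illustrative units)
            return kg0 * math.exp(-Eg / (R * T_kelvin)) * supersaturation ** g

        print(growth_rate(0.02, 313.15))  # warmer -> faster growth
        print(growth_rate(0.02, 293.15))  # cooler -> slower growth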

  18. Report: Industrial Hygiene: Safer Working through Analytical Chemistry.

    ERIC Educational Resources Information Center

    Hemingway, Ronald E.

    1980-01-01

    The analytical chemist is involved in the recognition, evaluation, and control of chemical hazards in the workplace environment. These goals can be achieved by setting up a monitoring program; this should be a combination of planning, calibration, sampling, and analysis of toxic substances. (SMB)

  19. Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity

    ERIC Educational Resources Information Center

    Overton, Willis F.; Ennis, Michelle D.

    2006-01-01

    Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…

  20. Solar electric geocentric transfer with attitude constraints: Analysis

    NASA Technical Reports Server (NTRS)

    Sackett, L. L.; Malchow, H. L.; Delbaum, T. N.

    1975-01-01

    A time-optimal or nearly time-optimal trajectory program was developed for solar electric geocentric transfer with or without attitude constraints and with an optional initial high-thrust stage. The method of averaging reduces computation time. A nonsingular set of orbital elements is used. The constraints, which are those of one of the SERT-C designs, introduce complexities into the analysis, and the solution yields possible discontinuous changes in thrust direction. The power degradation due to Van Allen radiation is modeled analytically. A wide range of solar cell characteristics is assumed. Effects such as oblateness and shadowing are included. The analysis and the results of many example runs are included.

  1. Furthering the Understanding of Parent–Child Relationships: A Nursing Scholarship Review Series. Part 3: Interaction and the Parent–Child Relationship—Assessment and Intervention Studies

    PubMed Central

    Pridham, Karen A.; Lutz, Kristin F.; Anderson, Lori S.; Riesch, Susan K.; Becker, Patricia T.

    2010-01-01

    PURPOSE This integrative review concerns nursing research on parent–child interaction and relationships published from 1980 through 2008 and includes assessment and intervention studies in clinically important settings (e.g., feeding, teaching, play). CONCLUSIONS Directions for research include development of theoretical frameworks, valid observational systems, and multivariate and longitudinal data analytic strategies. PRACTICE IMPLICATIONS Observation of social–emotional as well as task-related interaction qualities in the context of assessing parent–child relationships could generate new questions for nursing research and for family-centered nursing practice. PMID:20074112

  2. Nonlinear estimation for arrays of chemical sensors

    NASA Astrophysics Data System (ADS)

    Yosinski, Jason; Paffenroth, Randy

    2010-04-01

    Reliable detection of hazardous materials is a fundamental requirement of any national security program. Such materials can take a wide range of forms including metals, radioisotopes, volatile organic compounds, and biological contaminants. In particular, detection of hazardous materials in highly challenging conditions - such as in cluttered ambient environments, where complex collections of analytes are present, and with sensors lacking specificity for the analytes of interest - is an important part of a robust security infrastructure. Sophisticated single sensor systems provide good specificity for a limited set of analytes but often have cumbersome hardware and environmental requirements. On the other hand, simple, broadly responsive sensors are easily fabricated and efficiently deployed, but such sensors individually have neither the specificity nor the selectivity to address analyte differentiation in challenging environments. However, arrays of broadly responsive sensors can provide much of the sensitivity and selectivity of sophisticated sensors but without the substantial hardware overhead. Unfortunately, arrays of simple sensors are not without their challenges - the selectivity of such arrays can only be realized if the data is first distilled using highly advanced signal processing algorithms. In this paper we will demonstrate how the use of powerful estimation algorithms, based on those commonly used within the target tracking community, can be extended to the chemical detection arena. Herein our focus is on algorithms that not only provide accurate estimates of the mixture of analytes in a sample, but also provide robust measures of ambiguity, such as covariances.
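
    A minimal sketch of the estimation idea described above, assuming a simple linear response model for the array (the response matrix and noise level are hypothetical):

        import numpy as np

        rng = np.random.default_rng(3)
        A = rng.uniform(0.1, 1.0, size=(12, 3))      # 12 broad sensors x 3 analytes
        x_true = np.array([0.5, 0.0, 1.2])           # true concentrations
        sigma = 0.02
        y = A @ x_true + rng.normal(0.0, sigma, 12)  # noisy array reading

        # Least-squares mixture estimate plus a covariance that quantifies
        # the ambiguity of the estimate, in the spirit of tracking filters.
        x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
        cov = sigma ** 2 * np.linalg.inv(A.T @ A)
        print(x_hat.round(3))
        print(np.sqrt(np.diag(cov)).round(3))        # 1-sigma ambiguity per analyte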

  3. The general 2-D moments via integral transform method for acoustic radiation and scattering

    NASA Astrophysics Data System (ADS)

    Smith, Jerry R.; Mirotznik, Mark S.

    2004-05-01

    The moments via integral transform method (MITM) is a technique to analytically reduce the 2-D method of moments (MoM) impedance double integrals into single integrals. By using a special integral representation of the Green's function, the impedance integral can be analytically simplified to a single integral in terms of transformed shape and weight functions. The reduced expression requires fewer computations and reduces the fill times of the MoM impedance matrix. Furthermore, the resulting integral is analytic for nearly arbitrary shape and weight function sets. The MITM technique is developed for mixed boundary conditions, and predictions with basic shape and weight function sets are presented. Comparisons of accuracy and speed between MITM and brute force are presented. [Work sponsored by ONR and the NSWCCD ILIR Board.]

  4. A multi-species reactive transport model to estimate biogeochemical rates based on single-well push-pull test data

    NASA Astrophysics Data System (ADS)

    Phanikumar, Mantha S.; McGuire, Jennifer T.

    2010-08-01

    Push-pull tests are a popular technique to investigate various aquifer properties and microbial reaction kinetics in situ. Most previous studies have interpreted push-pull test data using approximate analytical solutions to estimate (generally first-order) reaction rate coefficients. Though useful, these analytical solutions may not be able to describe important complexities in rate data. This paper reports the development of a multi-species, radial-coordinate numerical model (PPTEST) that includes the effects of sorption, reaction lag time, and arbitrary reaction-order kinetics to estimate rates in the presence of mixing interfaces such as those created between injected "push" water and native aquifer water. The model has the ability to describe an arbitrary number of species and user-defined reaction rate expressions, including Monod/Michaelis-Menten kinetics. The FORTRAN code uses a finite-difference numerical model based on the advection-dispersion-reaction equation and was developed to describe the radial flow and transport during a push-pull test. The accuracy of the numerical solutions was assessed by comparing numerical results with analytical solutions and field data available in the literature. The model described the observed breakthrough data for tracers (chloride and iodide-131) and reactive components (sulfate and strontium-85) well and was found to be useful for testing hypotheses related to the complex set of processes operating near mixing interfaces.
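
    A minimal sketch of an explicit finite-difference step for radial advection-dispersion-reaction with first-order decay, in the spirit of such a model (grid and parameters are illustrative, not PPTEST's):

        import numpy as np

        nr, dr, dt = 200, 0.01, 0.5            # radial cells, cell size (m), step (s)
        r = dr * (np.arange(nr) + 1.0)
        D, Q, b, lam = 1e-6, 1e-4, 1.0, 1e-5   # dispersion, pump rate, thickness, decay
        v = Q / (2.0 * np.pi * r * b)          # radial velocity during injection

        def step(c):
            dcdr = np.gradient(c, dr)
            d2cdr2 = np.gradient(dcdr, dr)
            # radial advection-dispersion-reaction update
            return c + dt * (D * (d2cdr2 + dcdr / r) - v * dcdr - lam * c)

        c = np.zeros(nr)
        c[0] = 1.0                             # injected slug at the well
        for _ in range(1000):
            c = step(c)
        print(round(float(c.max()), 4))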

  5. Turbomachinery noise

    NASA Astrophysics Data System (ADS)

    Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.

    1991-08-01

    Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms such as rotor strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far field observation point have been formulated. Three dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.

  6. Health-related quality of life, satisfaction, and economic outcome measures in studies of prostate cancer screening and treatment, 1990-2000.

    PubMed

    McNaughton-Collins, Mary; Walker-Corkery, Elizabeth; Barry, Michael J

    2004-01-01

    Prostate cancer outcomes research incorporates a broad spectrum of endpoints, from clinical or intermediate endpoints, such as tumor shrinkage or patient compliance, to final endpoints, such as survival or disease-free survival. Three types of nontraditional endpoints of growing interest, namely health-related quality of life (QOL), satisfaction with care, and economic cost impact, hold the promise of improving our ability to understand the full burden of prostate cancer screening and treatment. In this article we review the last decade's published literature regarding the health-related QOL, satisfaction, and economic outcomes of prostate cancer screening and treatment to determine the "state of the science" of outcomes measurement. The focus is the enumeration of the types of outcome measurement used in the studies, not the determination of the results of the studies. Studies were identified by searching Medline (1990-2000). Articles were included if they presented original data on any patient-centered outcome (including costs or survival alone) for men screened and treated for prostate cancer. Review papers were excluded unless they were quantitative syntheses of the results of other primary studies. Economic and decision-analytic papers were included if they presented information on outcomes of real or hypothetical patient cohorts. Each retrieved article was reviewed by one of the authors. Included papers were assigned one primary, mutually exclusive study design. For the "primary data" studies, information was abstracted on care setting, dates of the study, sample size, racial distribution, age, tumor differentiation, tumor stage, survival, statistical power, and types of outcomes measures (QOL-generic, QOL-cancer specific, QOL-prostate cancer specific, satisfaction, costs, utilities, and other). For the "economic and decision analytic" papers, information was abstracted on stage of disease, age range, outcomes, costs, and whether utilities were measured. Of the 198 included papers, there were 161 primary data papers categorized as follows: randomized trial (n = 28), nonrandomized trial (n = 13), prospective or retrospective cohort study (n = 55), case-control study (n = 0), cross-sectional study (n = 63), and meta-analysis (n = 2). The remaining 37 papers were economic and decision analytic papers. Among the 149 primary data papers that contained patient outcome data, there were 42 standard instruments used, accounting for 44% (179 of 410) of the measures overall. Almost three-quarters (71%) of papers included one, two, or three outcomes measures of all types (standard and nonstandard); three papers included seven outcomes measures, and one paper included nine. Over the 11-year time period, there was a nonsignificant trend toward more frequent use of standardized QOL instruments and a statistically significant trend toward increased reporting of race (P = .003). Standardization of the measurement of health-related QOL, satisfaction with care, and economic cost impact among men screened and treated for prostate cancer is needed. A core set of similar questions, both generic and disease-specific, should ideally be asked in every study, although investigators should be encouraged to include additional question sets as appropriate to individual studies to get a more complete picture of how patients screened and treated for this condition are doing over time.

  7. Downstream processing and chromatography based analytical methods for production of vaccines, gene therapy vectors, and bacteriophages.

    PubMed

    Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš

    2015-01-01

    Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by complexity of the starting material, number of purification methods to choose from, regulations that are setting the frame for the final product and analytical methods for upstream and downstream monitoring. This review gives an overview on the nanoplex downstream challenges and chromatography based analytical methods for efficient monitoring of the nanoplex production.

  8. Downstream processing and chromatography based analytical methods for production of vaccines, gene therapy vectors, and bacteriophages

    PubMed Central

    Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš

    2015-01-01

    Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by complexity of the starting material, number of purification methods to choose from, regulations that are setting the frame for the final product and analytical methods for upstream and downstream monitoring. This review gives an overview on the nanoplex downstream challenges and chromatography based analytical methods for efficient monitoring of the nanoplex production. PMID:25751122

  9. Evaluation of Resuspension from Propeller Wash in DoD Harbors

    DTIC Science & Technology

    2016-09-01

    Environmental Research and Development Center; FANS: Finite Analytic Navier-Stokes Solver; FOV: Field of View; ICP-MS: Inductively Coupled Plasma with...Model (1984) and the Finite Analytic Navier-Stokes Solver (FANS) model (Chen et al., 2003) were set up to simulate and evaluate flow velocities and...model for evaluating the resuspension potential of propeller wash by a tugboat and the FANS model for a DDG. The Finite-Analytic Navier-Stokes (FANS

  10. Calculation of response matrix of CaSO4:Dy based neutron dosimeter using Monte Carlo code FLUKA and measurement of 241Am-Be spectra

    NASA Astrophysics Data System (ADS)

    Chatterjee, S.; Bakshi, A. K.; Tripathy, S. P.

    2010-09-01

    The response matrix for the CaSO4:Dy based neutron dosimeter was generated using the Monte Carlo code FLUKA over the energy range from thermal to 20 MeV for a set of eight Bonner spheres of diameter 3-12″, including the bare configuration. The response of the neutron dosimeter was measured for the above set of spheres for a 241Am-Be neutron source covered with 2 mm of lead. An analytical expression for the response function was devised as a function of sphere mass. Using the Frascati Unfolding Iteration Tool (FRUIT) unfolding code, the neutron spectrum of 241Am-Be was unfolded and compared with the standard IAEA spectrum.

  11. The Neutral Islands during the Late Epoch of Reionization

    NASA Astrophysics Data System (ADS)

    Xu, Yidong; Yue, Bin; Chen, Xuelei

    2018-05-01

    The large-scale structure of the ionization field during the epoch of reionization (EoR) can be modeled by the excursion set theory. While the growth of ionized regions during the early stages is described by the "bubble model", the shrinking of neutral regions after the percolation of the ionized regions calls for an "island model". An excursion-set-based analytical model and a semi-numerical code (islandFAST) have been developed. The ionizing background and the bubbles inside the islands are also included in the treatment. With two kinds of absorbers of ionizing photons, i.e. the large-scale under-dense neutral islands and the small-scale over-dense clumps, the ionizing background is self-consistently evolved in the model.

  12. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, Jon D.

    1990-01-01

    Uncertainty of frequency response using the fuzzy set method and on-orbit response prediction using laboratory test data to refine an analytical model are emphasized with respect to large space structures. Two aspects of the fuzzy set approach were investigated relative to its application to large structural dynamics problems: (1) minimizing the number of parameters involved in computing possible intervals; and (2) the treatment of extrema which may occur in the parameter space enclosed by all possible combinations of the important parameters of the model. Extensive printer graphics were added to the SSID code to help facilitate model verification, and an application of this code to the LaRC Ten Bay Truss is included in the appendix to illustrate this graphics capability.

  13. Comparison between numeric and approximate analytic solutions for the prediction of soil metal uptake by roots. Example of cadmium.

    PubMed

    Schneider, André; Lin, Zhongbing; Sterckeman, Thibault; Nguyen, Christophe

    2018-04-01

    The dissociation of metal complexes in the soil solution can increase the availability of metals for root uptake. When it is accounted for in models of the bioavailability of soil metals, the number of partial differential equations (PDEs) increases, and the computation time needed to numerically solve these equations may be problematic when a large number of simulations are required, for example for sensitivity analyses or when considering root architecture. This work presents analytical solutions for the set of PDEs describing the bioavailability of soil metals, including the kinetics of complexation, for three scenarios in which the metal complex in solution is fully inert, fully labile, or partially labile. The analytical solutions are only valid i) at steady state, when the PDEs become ordinary differential equations, the transient phase not being covered, ii) when diffusion is the major mechanism of transport and convection is therefore negligible, and iii) when there is no between-root competition. The formulation of the analytical solutions is for cylindrical geometry, but the solutions rely on the spread of the depletion profile around the root, which was modelled assuming a planar geometry. The analytical solutions were evaluated by comparison with the corresponding PDEs for cadmium in the case of French agricultural soils. Provided that convection was much lower than diffusion (Péclet number < 0.02), the cumulative uptakes calculated from the analytical solutions were in very good agreement with those calculated from the PDEs, even in the case of a partially labile complex. The analytical solutions can be used instead of the PDEs to predict root uptake of metals. The analytical solutions were also used to build an indicator of the contribution of a complex to the uptake of the metal by roots, which can be helpful for predicting the effect of soluble organic matter on the bioavailability of soil metals. Copyright © 2017 Elsevier B.V. All rights reserved.
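
    As a quick check of the diffusion-dominated regime invoked above, the Péclet number can be computed directly (the values below are illustrative):

        def peclet(v, r0, D):
            # Pe = v * r0 / D, with v the water flux at the root surface (m/s),
            # r0 the root radius (m), and D the effective diffusivity (m^2/s)
            return v * r0 / D

        print(peclet(1e-8, 2e-4, 1e-10))  # 0.02: right at the validity threshold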

  14. Mental health problems of aging and the aged from the viewpoint of analytical psychology*

    PubMed Central

    Bash, K. W.

    1959-01-01

    According to Jung's analytical psychology man is either predominantly extravert or predominantly introvert. Whichever he is, he must in most cases, in order to satisfy the biological drives of the earlier part of his life, adapt himself to an extraverted culture and thus become largely extravert. In the later part of life, as biological involution sets in, this attitude and the values attached thereto no longer suffice. The strains set up by the resulting need for a reorientation in life are a fruitful source of mental disorder. PMID:20604058

  15. Constraint-Referenced Analytics of Algebra Learning

    ERIC Educational Resources Information Center

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  16. FACTOR ANALYTIC MODELS OF CLUSTERED MULTIVARIATE DATA WITH INFORMATIVE CENSORING

    EPA Science Inventory

    This paper describes a general class of factor analytic models for the analysis of clustered multivariate data in the presence of informative missingness. We assume that there are distinct sets of cluster-level latent variables related to the primary outcomes and to the censorin...

  17. NHEXAS PHASE I ARIZONA STUDY--METALS IN AIR ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Air data set contains analytical results for measurements of up to 11 metals in 369 air samples over 175 households. Samples were taken by pumping standardized air volumes through filters at indoor and outdoor sites around each household being sampled. The primary...

  18. NHEXAS PHASE I MARYLAND STUDY--PESTICIDE METABOLITES IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticide Metabolites in Urine data set contains analytical results for measurements of up to 9 pesticides in 345 urine samples over 79 households. Each sample was collected from the primary respondent within each household during the study and represented the first morning ...

  19. NHEXAS PHASE I ARIZONA STUDY--PESTICIDE METABOLITES IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticide Metabolites in Urine data set contains analytical results for measurements of up to 4 pesticide metabolites in 176 urine samples over 176 households. Each sample was collected from the primary respondent within each household during Stage III of the NHEXAS study. ...

  20. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN DERMAL WIPES ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dermal Wipe Samples data set contains analytical results for measurements of up to 8 pesticides in 40 dermal wipe samples over 40 households. Each sample was collected from the primary respondent within each household. The sampling period occurred on the last ...

  1. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDES IN REPLICATE SAMPLES

    EPA Science Inventory

    The Pesticides in Replicates data set contains the analytical results of measurements of up to 10 pesticides in 68 replicate (duplicate) samples from 41 households. Measurements were made in samples of indoor air, dust, soil, drinking water, food, and beverages. Duplicate sampl...

  2. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Blood Serum data set contains analytical results for measurements of up to 17 pesticides in 358 blood samples over 79 households. Each sample was collected via a venous sample from the primary respondent within each household by a phlebotomist. Samples were ge...

  3. NHEXAS PHASE I ARIZONA STUDY--METALS IN WATER ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Water data set contains analytical results for measurements of up to 11 metals in 314 water samples over 211 households. Sample collection was undertaken at the tap and any additional drinking water source used extensively within each residence. The primary metals...

  4. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  5. New analytical solutions to the two-phase water faucet problem

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-06-17

    The one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom's analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom transient solutions for the gas phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom solutions are also presented.
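
    For reference, the pre-existing Ransom transient solution mentioned above can be sketched as follows (standard benchmark values assumed: inlet velocity 10 m/s, inlet liquid fraction 0.8; this is the classic solution, not the paper's new steady-state set):

        import math

        g, v0, a0 = 9.81, 10.0, 0.8  # gravity, inlet velocity, inlet liquid fraction

        def ransom(x, t):
            x_front = v0 * t + 0.5 * g * t ** 2
            if x <= x_front:                      # fluid that entered after t = 0
                v = math.sqrt(v0 ** 2 + 2.0 * g * x)
                alpha = a0 * v0 / v               # liquid continuity
            else:                                 # initial column, falling freely
                v = v0 + g * t
                alpha = a0
            return v, alpha

        print(ransom(6.0, 0.4))  # liquid velocity and fraction at x = 6 m, t = 0.4 s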

  6. Sensitive glow discharge ion source for aerosol and gas analysis

    DOEpatents

    Reilly, Peter T. A. [Knoxville, TN]

    2007-08-14

    A high sensitivity glow discharge ion source system for analyzing particles includes an aerodynamic lens having a plurality of constrictions for receiving an aerosol including at least one analyte particle in a carrier gas and focusing the analyte particles into a collimated particle beam. A separator separates the carrier gas from the analyte particle beam, wherein the analyte particle beam or vapors derived from the analyte particle beam are selectively transmitted out of the separator. A glow discharge ionization source includes a discharge chamber having an entrance orifice for receiving the analyte particle beam or analyte vapors, and a target electrode and discharge electrode therein. An electric field applied between the target electrode and discharge electrode generates an analyte ion stream from the analyte vapors, which is directed out of the discharge chamber through an exit orifice, such as to a mass spectrometer. High analyte sensitivity is obtained by pumping the discharge chamber exclusively through the exit orifice and the entrance orifice.

  7. Materials Compatibility Testing in Concentrated Hydrogen Peroxide

    NASA Technical Reports Server (NTRS)

    Boxwell, R.; Bromley, G.; Mason, D.; Crockett, D.; Martinez, L.; McNeal, C.; Lyles, G. (Technical Monitor)

    2000-01-01

    Materials test methods from the 1960's have been used as a starting point in evaluating materials for today's space launch vehicles. These established test methods have been modified to incorporate today's analytical laboratory equipment. The Orbital test objective was to evaluate a wide range of materials, reflecting the revolution in polymer and composite materials that has occurred since the 1960's. Testing is accomplished in three stages, from rough screening to detailed analytical tests. Several interesting test observations made during this testing are included in the paper. A summary of the set-up, test, and evaluation of long-term storage sub-scale tanks is also included. This sub-scale tank test ran for seven months before being stopped due to a polar boss material breakdown. Chemical evaluations of the hydrogen peroxide and the residue left on the polar boss surface identify the material breakdown quite clearly. The paper concludes with recommendations for future testing and a specific effort underway within the industry to standardize the test methods used in evaluating materials.

  8. Quality control for federal clean water act and safe drinking water act regulatory compliance.

    PubMed

    Askew, Ed

    2013-01-01

    QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection limit or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
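
    As a concrete instance of one of these QC computations: under 40 CFR Part 136 Appendix B, the method detection limit is the standard deviation of at least seven low-level spiked replicates multiplied by the one-sided Student's t value at 99% confidence. A minimal sketch (the replicate values are illustrative only):

      from statistics import stdev
      from scipy.stats import t

      def method_detection_limit(replicates, confidence=0.99):
          # MDL = t(n-1, 99%) * s, per 40 CFR Part 136 Appendix B
          n = len(replicates)
          if n < 7:
              raise ValueError("Appendix B requires at least 7 replicates")
          return t.ppf(confidence, n - 1) * stdev(replicates)

      # seven replicate spike results in ug/L (illustrative values)
      print(round(method_detection_limit([1.8, 2.1, 1.9, 2.3, 2.0, 1.7, 2.2]), 2))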

  9. Computer program for analysis of high speed, single row, angular contact, spherical roller bearing, SASHBEAN. Volume 2: Mathematical formulation and analysis

    NASA Technical Reports Server (NTRS)

    Aggarwal, Arun K.

    1993-01-01

    Spherical roller bearings have typically been used in applications with speeds limited to about 5000 rpm and loads limited for operation at less than about 0.25 million DN. However, spherical roller bearings are now being designed for high load and high speed applications, including aerospace applications. A computer program, SASHBEAN, was developed to provide an analytical tool to design, analyze, and predict the performance of high speed, single row, angular contact (including zero contact angle), spherical roller bearings. The material presented is the mathematical formulation and the analytical methods used to develop the computer program SASHBEAN. For a given set of operating conditions, the program calculates the bearing's ring deflections (axial and radial), roller deflections, contact areas and stresses, depth and magnitude of maximum shear stresses, axial thrust, rolling element and cage rotational speeds, lubrication parameters, fatigue lives, and rates of heat generation. Centrifugal forces and gyroscopic moments are fully considered. The program is also capable of performing steady-state and time-transient thermal analyses of the bearing system.

  10. Ab initio electronic transport and thermoelectric properties of solids from full and range-separated hybrid functionals

    NASA Astrophysics Data System (ADS)

    Sansone, Giuseppe; Ferretti, Andrea; Maschio, Lorenzo

    2017-09-01

    Within the semiclassical Boltzmann transport theory in the constant relaxation-time approximation, we perform an ab initio study of the transport properties of selected systems, including crystalline solids and nanostructures. A local (Gaussian) basis set is adopted and exploited to analytically evaluate band velocities as well as to access full and range-separated hybrid functionals (such as B3LYP, PBE0, or HSE06) at a moderate computational cost. As a consequence of the analytical derivative, our approach is computationally efficient and does not suffer from problems related to band crossings. We investigate and compare the performance of a variety of hybrid functionals in evaluating the Boltzmann conductivity. Demonstrative examples include silicon and aluminum bulk crystals as well as two thermoelectric materials (CoSb3, Bi2Te3). We observe that hybrid functionals, in addition to providing more realistic band gaps as expected, lead to larger bandwidths and hence allow for a better estimate of transport properties, also in metallic systems. As a nanostructure prototype, we also investigate conductivity in boron-nitride (BN) substituted graphene, in which graphene nanoribbons (nanoroads) alternate with BN ones.
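
    The constant relaxation-time expression at the heart of this approach can be made concrete with a toy model. The sketch below evaluates sigma = 2 e^2 tau * integral dk/(2 pi) v(k)^2 (-df/deps) for a one-dimensional tight-binding band with an analytic band velocity; it illustrates the formula only, not the authors' Gaussian-basis machinery, and all parameter values are assumed.

      import numpy as np

      e = 1.602176634e-19        # elementary charge (C)
      kB = 8.617333262e-5        # Boltzmann constant (eV/K)
      hbar = 6.582119569e-16     # reduced Planck constant (eV*s)
      t_hop, a = 1.0, 3e-10      # hopping (eV) and lattice constant (m) -- assumed
      tau, T, mu = 1e-14, 300.0, 0.0  # relaxation time (s), temperature (K), chemical potential (eV)

      k = np.linspace(-np.pi / a, np.pi / a, 200001)
      dk = k[1] - k[0]
      eps = -2.0 * t_hop * np.cos(k * a)             # band energy (eV)
      v = 2.0 * t_hop * a * np.sin(k * a) / hbar     # analytic band velocity (m/s)
      ex = np.exp((eps - mu) / (kB * T))
      mdfde = ex / (kB * T * (1.0 + ex) ** 2)        # -df/deps (1/eV)
      # one factor of e converts the 1/eV in (-df/deps) to 1/J, so e^2 -> e here;
      # the prefactor 2 counts spin
      sigma = 2.0 * e * tau * np.sum(v**2 * mdfde) * dk / (2.0 * np.pi)
      print(f"1D conductivity: {sigma:.3e} S*m")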

  11. Adding In-Plane Flexibility to the Equations of Motion of a Single Rotor Helicopter

    NASA Technical Reports Server (NTRS)

    Curtiss, H. C., Jr.

    2000-01-01

    This report describes a way to add the effects of main rotor blade flexibility in the in-plane or lead-lag direction to a large set of non-linear equations of motion for a single rotor helicopter with rigid blades (1). Differences between the frequency of the regressing lag mode predicted by the equations of (1) and that measured in flight (2) for a UH-60 helicopter indicate that some element is missing from the analytical model of (1), which assumes rigid blades. A previous study (3) noted a similar discrepancy for the CH-53 helicopter. Using an analytical model that is relatively simple compared to (1), it was shown in (3) that a mechanical lag damper significantly increases the coupling between the rigid lag mode and the first flexible mode. This increased coupling due to a powerful lag damper produces an increase in the lowest lag frequency when viewed in a frame rotating with the blade. Flight test measurements normally indicate the frequency of this mode in a non-rotating or fixed frame. This report presents the additions to the full equations of motion necessary to include main rotor blade lag flexibility. Since these additions are made to a very complex nonlinear dynamic model, a discussion of the results obtained from a simplified set of equations of motion is included in order to provide physical insight. The reduced model illustrates the physics involved in the coupling and should indicate trends in the full model.

  12. Tides and tidal stress: Applications to Europa

    NASA Astrophysics Data System (ADS)

    Hurford, Terry Anthony, Jr.

    A review of analytical techniques and documentation of previously inaccessible mathematical formulations is applied to the study of Jupiter's satellite Europa. Compared with numerical codes that are commonly used to model global tidal effects, analytical models of tidal deformation give deeper insight into the mechanics of tides, and can better reveal the nature of the dependence of observable effects on key parameters. I develop analytical models for tidal deformation of multi-layered bodies. Previous studies of Europa, based on numerical computation, show only isolated examples from parameter space. My results show a systematic dependence of tidal response on the thicknesses and material parameters of Europa's core, rocky mantle, liquid water ocean, and outer layer of ice. As in the earlier work, I restrict these studies to incompressible materials. Any set of Love numbers h2 and k2 that describes a planet's tidal deformation could be fit by a range of ice thickness values, by adjusting other parameters such as mantle rigidity or core size, an important result for mission planning. Inclusion of compression into multilayer models has been addressed analytically, uncovering several issues that are not explicit in the literature. Full evaluation with compression is here restricted to a uniform sphere. A set of singularities in the classical solution, which correspond to instabilities due to self-gravity, has been identified and mapped in parameter space. The analytical models of tidal response yield the stresses anywhere within the body, including on its surface. Crack patterns (such as cycloids) on Europa are probably controlled by these stresses. However, in contrast to previous studies, which used a thin shell approximation of the tidal stress, I consider how other tidal models compare with the observed tectonic features. In this way the relationship between Europa's surface tectonics and the global tidal distortion can be constrained. While large-scale tidal deformations probe internal structure deep within a body, small-scale deformations can probe internal structure at shallower depths. I have used photoclinometry to obtain topographic profiles across terrain adjacent to Europan ridges to detect the effects of loading on the lithosphere. Lithospheric thicknesses have been determined and correlated with types and ages of terrain.
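
    The simplest analytical case mentioned above, a uniform incompressible sphere, has closed-form Love numbers h2 = (5/2)/(1 + mu_eff) and k2 = (3/2)/(1 + mu_eff), where mu_eff = 19*mu/(2*rho*g*R) measures rigidity against self-gravity. A minimal sketch; the Europa-like input values are assumed for illustration, not taken from the dissertation:

      import numpy as np

      def love_numbers_uniform(mu, rho, R):
          # Tidal Love numbers of a uniform, incompressible, self-gravitating sphere
          G = 6.674e-11                         # gravitational constant (SI)
          g = 4.0 * np.pi * G * rho * R / 3.0   # surface gravity of the uniform body
          mu_eff = 19.0 * mu / (2.0 * rho * g * R)
          return 2.5 / (1.0 + mu_eff), 1.5 / (1.0 + mu_eff)

      # illustrative, Europa-like inputs: rigidity 4 GPa, mean density 3013 kg/m^3,
      # radius 1561 km (values assumed for demonstration only)
      h2, k2 = love_numbers_uniform(4e9, 3013.0, 1.561e6)
      print(f"h2 = {h2:.3f}, k2 = {k2:.3f}")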

  13. High temperature ion channels and pores

    NASA Technical Reports Server (NTRS)

    Cheley, Stephen (Inventor); Gu, Li Qun (Inventor); Bayley, Hagan (Inventor); Kang, Xiaofeng (Inventor)

    2011-01-01

    The present invention includes an apparatus, system and method for stochastic sensing of an analyte by a protein pore. The protein pore may be an engineered protein pore, such as an ion channel that operates at temperatures above 55.degree. C. and even as high as near 100.degree. C. The analyte may be any reactive analyte, including chemical weapons, environmental toxins and pharmaceuticals. The analyte covalently bonds to the sensor element to produce a detectable signal, such as a change in electrical current. Detection of the signal allows identification of the analyte and determination of its concentration in a sample solution. Multiple analytes present in the same solution may also be detected.

  14. Template based rotation: A method for functional connectivity analysis with a priori templates

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

    Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based analyses. This flexibility owes to the reduced spatial and temporal orthogonality constraints of template based rotation as compared to dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630

  15. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  16. Analytical three-dimensional neutron transport benchmarks for verification of nuclear engineering codes. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganapol, B.D.; Kornreich, D.E.

    Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation as modified in the second year renewal application includes the following three primary tasks. Task 1 on two-dimensional neutron transport is divided into (a) single medium searchlight problem (SLP) and (b) two-adjacent half-space SLP. Task 2 on three-dimensional neutron transport covers (a) point source in arbitrary geometry, (b) single medium SLP, and (c) two-adjacent half-space SLP. Task 3 on code verification includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.

  17. Development of an achiral supercritical fluid chromatography method with ultraviolet absorbance and mass spectrometric detection for impurity profiling of drug candidates. Part II. Selection of an orthogonal set of stationary phases.

    PubMed

    Lemasson, Elise; Bertin, Sophie; Hennig, Philippe; Boiteux, Hélène; Lesellier, Eric; West, Caroline

    2015-08-21

    Impurity profiling of organic products that are synthesized as possible drug candidates requires complementary analytical methods to ensure that all impurities are identified. Supercritical fluid chromatography (SFC) is a very useful tool to achieve this objective, as an adequate selection of stationary phases can provide orthogonal separations so as to maximize the chances to see all impurities. In this series of papers, we have developed a method for achiral SFC-MS profiling of drug candidates, based on a selection of 160 analytes issued from Servier Research Laboratories. In the first part of this study, focusing on mobile phase selection, a gradient elution with carbon dioxide and methanol comprising 2% water and 20 mM ammonium acetate proved to be the best in terms of chromatographic performance, while also providing good MS response [1]. The objective of this second part was the selection of an orthogonal set of ultra-high performance stationary phases, which was carried out in two steps. Firstly, a reduced set of analytes (20) was used to screen 23 columns. The columns selected were all 1.7-2.5 μm fully porous or 2.6-2.7 μm superficially porous particles, with a variety of stationary phase chemistries. Derringer desirability functions were used to rank the columns according to retention window, column efficiency evaluated with peak width of selected analytes, and the proportion of analytes successfully eluted with good peak shapes. The columns providing the worst performances were thus eliminated and a shorter selection of columns (11) was obtained. Secondly, based on 160 tested analytes, the 11 columns were ranked again. The retention data obtained on these columns were then compared to define a reduced set of the best columns providing the greatest orthogonality, to maximize the chances to see all impurities within a limited number of runs. Two high-performance columns were thus selected: ACQUITY UPC(2) HSS C18 SB and Nucleoshell HILIC. Copyright © 2015 Elsevier B.V. All rights reserved.
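
    The Derringer desirability ranking mentioned above combines several quality criteria into a single score per column: each response is mapped onto a [0, 1] desirability and the scores are combined as a geometric mean. A minimal sketch with hypothetical column names and made-up responses (the criteria mirror those in the abstract; the cutoff values are assumed):

      import numpy as np

      def desirability(y, lo, hi, maximize=True):
          # linear Derringer d(y): 0 at the worst acceptable value, 1 at the best
          d = (y - lo) / (hi - lo) if maximize else (hi - y) / (hi - lo)
          return np.clip(d, 0.0, 1.0)

      # hypothetical responses per column: retention window (min), mean peak
      # width (s), and fraction of analytes eluted with good peak shape
      columns = ["col_A", "col_B", "col_C"]
      window = np.array([6.0, 9.5, 8.0])
      width = np.array([4.0, 6.5, 3.0])
      eluted = np.array([0.85, 0.95, 0.90])

      D = (desirability(window, 5.0, 10.0)
           * desirability(width, 2.0, 7.0, maximize=False)
           * desirability(eluted, 0.8, 1.0)) ** (1.0 / 3.0)   # geometric mean
      for name, score in sorted(zip(columns, D), key=lambda p: -p[1]):
          print(f"{name}: D = {score:.2f}")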

  18. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    NASA Astrophysics Data System (ADS)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and run these models in real time. A geospatial data platform called Physical Analytics Information Repository and Services (PAIRS) has been developed on top of an open source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of the global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed no-SQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data layers based on specific conditions (e.g., analyze the flooding risk of a property based on topography, the soil's ability to hold water, and forecasted precipitation) or retrieve information about locations that share similar weather and vegetation patterns during extreme weather events like heat waves.
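
    The combined location/time pixel index is the load-bearing design choice here. The sketch below illustrates the general idea with a Morton (Z-order) bit interleaving plus a time suffix, so that nearby pixels share key prefixes and one scan retrieves a pixel's history; it is a generic illustration, not IBM's actual key layout.

      def pairs_style_key(lat, lon, timestamp, bits=16):
          # Quantize lat/lon to `bits` bits each, interleave them Z-order style
          # so spatially close pixels sort together, then append the time stamp.
          y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
          x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
          z = 0
          for i in range(bits):                  # bit interleaving
              z |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
          return (z << 32) | (timestamp & 0xFFFFFFFF)

      # two observations of the same pixel at consecutive times share a key prefix
      print(hex(pairs_style_key(40.0, -73.9, 1_420_070_400)))
      print(hex(pairs_style_key(40.0, -73.9, 1_420_074_000)))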

  19. Screening of 23 β-lactams in foodstuffs by LC-MS/MS using an alkaline QuEChERS-like extraction.

    PubMed

    Bessaire, Thomas; Mujahid, Claudia; Beck, Andrea; Tarres, Adrienne; Savoy, Marie-Claude; Woo, Pei-Mun; Mottier, Pascal; Desmarchelier, Aurélien

    2018-04-01

    A fast and robust high-performance LC-MS/MS screening method was developed for the analysis of β-lactam antibiotics in foods of animal origin: eggs, raw milk, processed dairy ingredients, infant formula, and meat- and fish-based products including baby foods. QuEChERS extraction with some adaptations enabled 23 drugs to be simultaneously monitored. Screening target concentrations were set at levels adequate to ensure compliance with current European, Chinese, US and Canadian regulations. The method was fully validated according to the European Community Reference Laboratories Residues Guidelines using 93 food samples of different composition. False-negative and false-positive rates were below 5% for all analytes. The method is adequate for use in high-throughput routine laboratories. A 1-year study was additionally conducted to assess the stability of the 23 analytes in the working standard solution.

  20. Analytic EoS and PTW strength model recommendation for Starck Ta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjue, Sky K.; Prime, Michael B.

    2016-09-01

    The purpose of this document is to provide an analytic EoS and PTW strength model for Starck Ta that can be consistently used between different platforms and simulations at three labs. This should provide a consistent basis for comparison of the results of calculations, but not the best implementation for matching a wide variety of experimental data. Another version using SESAME tables should follow, which will provide a better physical representation over a broader range of conditions. The data sets available at the time only include one Hopkinson bar test at a strain rate of 1800/s; a broader range of high-rate calibration data would be preferred. The resulting fit gives the PTW parameter p = 0. To avoid numerical issues, p = 0.001 has been used in FLAG. The PTW parameters that apply above the maximum strain rate in the data use the values from the original publication.

  1. Cross-section and rate formulas for electron-impact ionization, excitation, deexcitation, and total depopulation of excited atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vriens, L.; Smeets, A.H.M.

    1980-09-01

    For electron-induced ionization, excitation, and de-excitation, mainly from excited atomic states, a detailed analysis is presented of the dependence of the cross sections and rate coefficients on electron energy and temperature, and on atomic parameters. A wide energy range is covered, including sudden as well as adiabatic collisions. By combining the available experimental and theoretical information, a set of simple analytical formulas is constructed for the cross sections and rate coefficients of the processes mentioned, for the total depopulation, and for three-body recombination. The formulas account for large deviations from classical and semiclassical scaling, as found for excitation. They agree with experimental data and with the theories in their respective ranges of validity, but have a wider range of validity than the separate theories. The simple analytical form further facilitates the application in plasma modeling.

  2. Exact semiclassical expansions for one-dimensional quantum oscillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delabaere, E.; Dillinger, H.; Pham, F.

    1997-12-01

    A set of rules is given for dealing with WKB expansions in the one-dimensional analytic case, whereby such expansions are not considered as approximations but as exact encodings of wave functions, thus allowing for analytic continuation with respect to whichever parameters the potential function depends on, with an exact control of small exponential effects. These rules, which include also the case when there are double turning points, are illustrated on various examples, and applied to the study of bound state or resonance spectra. In the case of simple oscillators, it is thus shown that the Rayleigh-Schrödinger series is Borel resummable, yielding the exact energy levels. In the case of the symmetrical anharmonic oscillator, one gets a simple and rigorous justification of the Zinn-Justin quantization condition, and of its solution in terms of "multi-instanton expansions." © 1997 American Institute of Physics.

  3. Saying goodbye to the hero: Jung, Liber Novus and conversion from addiction.

    PubMed

    Addenbrooke, Mary

    2015-06-01

    Two chapters in Liber Novus throw fresh light on Jung's epistemology of addiction. Taking these as a starting point, the nature of the challenges that patients confront in leaving addiction behind is explored. It is suggested that an archetypal process of separation is constellated at the point of quitting as the precursor to a life without the object of the addiction. A short account is given of Jung's part in the inception of Alcoholics Anonymous and the potential role of a 'conversion experience' as an initiation into psychological reorientation away from the negative individuation experienced by the hero. The case of a patient addicted to heroin illustrates the contribution of an analytic approach in an NHS setting, alongside the work of other staff in a rehabilitation centre. Certain challenges of working with addicted people are outlined, including arousal of the psychotherapist's rescue fantasies. © 2015, The Society of Analytical Psychology.

  4. Human life support during interplanetary travel and domicile. III - Mars expedition system trade study

    NASA Technical Reports Server (NTRS)

    Seshan, P. K.; Ferrall, Joseph F.; Rohatgi, Naresh K.

    1991-01-01

    Several alternative configurations of life-support systems (LSSs) for a Mars mission are compared analytically on a quantitative basis in terms of weight, volume, and power. A baseline technology set is utilized for the illustration of systems including totally open loop, carbon dioxide removal only, partially closed loop, and totally closed loop. The analytical model takes advantage of a modular, top-down hierarchical breakdown of LSS subsystems into functional elements that represent individual processing technologies. The open-loop systems are not competitive in terms of weight for both long-duration orbiters and short-duration lander vehicles; power demands are lowest with the open loop and highest with the closed loop. The closed-loop system can reduce vehicle weight by over 70,000 lbs and thereby overcome the power penalty of 1600 W; the closed-loop variety is championed as the preferred system for a Mars expedition.
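
    The headline trade reduces to a simple equivalent-mass comparison: convert the closed loop's extra power draw into the mass of the hardware needed to supply it, then weigh it against the mass saved. A minimal sketch; the saving and penalty figures come from the abstract, but the pounds-per-watt conversion factor is an assumed placeholder, not a figure from the paper:

      # Hypothetical equivalent-mass trade in the spirit of the study
      WEIGHT_SAVING_LB = 70_000.0   # closed loop vs open loop (from abstract)
      POWER_PENALTY_W = 1_600.0     # extra power drawn by the closed loop (from abstract)
      LB_PER_WATT = 0.5             # assumed mass cost of supplying 1 W of power

      net_saving = WEIGHT_SAVING_LB - POWER_PENALTY_W * LB_PER_WATT
      print(f"net closed-loop saving: {net_saving:,.0f} lb")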

  5. Analytical interatomic potential for modeling nonequilibrium processes in the W-C-H system

    NASA Astrophysics Data System (ADS)

    Juslin, N.; Erhart, P.; Träskelin, P.; Nord, J.; Henriksson, K. O. E.; Nordlund, K.; Salonen, E.; Albe, K.

    2005-12-01

    A reactive interatomic potential based on an analytical bond-order scheme is developed for the ternary system W-C-H. The model combines Brenner's hydrocarbon potential with parameter sets for W-W, W-C, and W-H interactions and is adjusted to materials properties of reference structures with different local atomic coordinations including tungsten carbide, W-H molecules, as well as H dissolved in bulk W. The potential has been tested in various scenarios, such as surface, defect, and melting properties, none of which were considered in the fitting. The intended area of application is simulations of hydrogen and hydrocarbon interactions with tungsten, which have a crucial role in fusion reactor plasma-wall interactions. Furthermore, this study shows that the angular-dependent bond-order scheme can be extended to second nearest-neighbor interactions, which are relevant in body-centered-cubic metals. Moreover, it provides a possibly general route for modeling metal carbides.

  6. Simultaneous quantitative analysis of olmesartan, amlodipine and hydrochlorothiazide in their combined dosage form utilizing classical and alternating least squares based chemometric methods.

    PubMed

    Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S

    2016-03-01

    Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. Multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS) in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative tools of analysis as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
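
    As a minimal illustration of the classical least squares step underlying these calibration models (without the NAP/OSC/DOSC pre-processing), CLS fits pure-component spectra from calibration mixtures and then inverts them for an unknown; all numbers below are synthetic:

      import numpy as np

      # Classical least squares (CLS): calibration solves A = C @ K for the
      # pure-component spectra K; prediction solves a_unknown = c @ K for c.
      rng = np.random.default_rng(0)
      K_true = rng.random((3, 50))               # 3 analytes x 50 wavelengths
      C_cal = rng.random((12, 3))                # 12 calibration mixtures
      A_cal = C_cal @ K_true + 0.001 * rng.standard_normal((12, 50))

      K_hat, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)    # estimate spectra
      c_true = np.array([0.4, 0.1, 0.7])
      a_unknown = c_true @ K_true                # "measured" spectrum of an unknown
      c_hat, *_ = np.linalg.lstsq(K_hat.T, a_unknown, rcond=None)
      print(np.round(c_hat, 3))                  # recovers ~[0.4, 0.1, 0.7]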

  7. Influence of Wake Models on Calculated Tiltrotor Aerodynamics

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.

  8. Conformal Bootstrap in Mellin Space

    NASA Astrophysics Data System (ADS)

    Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda

    2017-02-01

    We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.

  9. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
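
    The parallelization pattern described (partition the corpus, process shards independently, merge the partial results) can be sketched in a few lines; the toy word-count below stands in for the real text processing engine:

      from multiprocessing import Pool
      from collections import Counter

      def process_shard(docs):
          # Stand-in for the engine's per-shard work: tokenize and count
          c = Counter()
          for doc in docs:
              c.update(doc.lower().split())
          return c

      if __name__ == "__main__":
          corpus = ["visual analytics of text", "scalable text processing engine",
                    "massive textual datasets"] * 1000
          shards = [corpus[i::4] for i in range(4)]     # partition across 4 workers
          with Pool(4) as pool:
              merged = sum(pool.map(process_shard, shards), Counter())  # reassemble
          print(merged.most_common(3))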

  10. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
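
    A minimal sketch of the kind of conversion this practice describes, assuming a Jupyter Notebook with ipywidgets and matplotlib installed; risk_pipeline is a hypothetical stand-in for a real analytics step:

      # In a Jupyter Notebook cell: wrap an analytics step in interactive
      # controls so other researchers can re-run it without touching the code.
      import numpy as np
      import matplotlib.pyplot as plt
      from ipywidgets import interact, FloatSlider

      def risk_pipeline(threshold=0.5):
          # Stand-in pipeline step: flag values above a cutoff and plot live
          values = np.random.default_rng(42).random(200)
          flagged = values > threshold
          plt.scatter(np.arange(200), values, c=flagged, cmap="coolwarm", s=12)
          plt.axhline(threshold, ls="--")
          plt.title(f"{flagged.sum()} records above threshold")
          plt.show()

      interact(risk_pipeline, threshold=FloatSlider(min=0.0, max=1.0, step=0.05, value=0.5))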

  11. NHEXAS PHASE I MARYLAND STUDY--PAHS IN AIR ANALYTICAL RESULTS

    EPA Science Inventory

    The PAHs in Air data set contains analytical results for measurements of up to 11 PAHs in 127 air samples over 51 households. Twenty-four-hour samples were taken over a one-week period using a continuous pump and solenoid apparatus pumping a standardized air volume through an UR...

  12. NHEXAS PHASE I ARIZONA STUDY--PESTICIDES IN DERMAL WIPES ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dermal Wipes data set contains analytical results for measurements of up to 3 pesticides in 177 dermal wipe samples over 177 households. Each sample was collected from the primary respondent within each household during Stage III of the NHEXAS study. The Derma...

  13. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKE SAMPLES

    EPA Science Inventory

    The Metals in Spike Samples data set contains the analytical results of measurements of up to 11 metals in 38 control samples (spikes) from 18 households. Measurements were made in spiked samples of dust, food, beverages, blood, urine, and dermal wipe residue. Spiked samples we...

  14. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--PESTICIDES IN DERMAL ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dermal Wipes data set contains analytical results for measurements of up to 8 pesticides in 86 dermal wipe samples over 86 households. Each sample was collected from the primary respondent within each household. The Dermal/Pesticide hand wipe was collected 7 d...

  15. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN AIR ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Air data set contains analytical results for measurements of up to 11 metals in 344 air samples over 86 households. Samples were taken by pumping standardized air volumes through filters at indoor and outdoor sites around each household being sampled. The primary ...

  16. NHEXAS PHASE I ARIZONA STUDY--METALS IN DERMAL WIPES ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dermal Wipes data set contains analytical results for measurements of up to 11 metals in 179 dermal wipe samples over 179 households. Each sample was collected from the primary respondent within each household during Stage III of the NHEXAS study. The sampling per...

  17. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR METALS IN REPLICATE SAMPLES

    EPA Science Inventory

    The Metals in Replicate Samples data set contains the analytical results of measurements of up to 27 metals in 133 replicate (duplicate) samples from 62 households. Measurements were made in samples of soil, blood, tap water, and drinking water. Duplicate samples for a small pe...

  18. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  19. NHEXAS PHASE I ARIZONA STUDY--PESTICIDES IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dust data set contains analytical results for measurements of up to 3 pesticides in 437 dust samples over 278 households. Samples were taken by collecting dust from the indoor floor areas in the main room and in the bedroom of the primary resident. The primary...

  20. NHEXAS PHASE I ARIZONA STUDY--METALS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dust data set contains analytical results for measurements of up to 11 metals in 562 dust samples over 388 households. Samples were taken by collecting dust samples from the indoor floor areas in the main room and in the bedroom of the primary resident. In additio...

  1. How Dispositional Learning Analytics Helps Understanding the Worked-Example Principle

    ERIC Educational Resources Information Center

    Tempelaar, Dirk

    2017-01-01

    This empirical study aims to demonstrate how Dispositional Learning Analytics can contribute in the investigation of the effectiveness of didactical scenarios in authentic settings, where previous research has mostly been laboratory based. Using a showcase based on learning processes of 1080 students in a blended introductory quantitative course,…

  2. Social Learning Analytics: Navigating the Changing Settings of Higher Education

    ERIC Educational Resources Information Center

    de Laat, Maarten; Prinsen, Fleur R.

    2014-01-01

    Current trends and challenges in higher education (HE) require a reorientation towards openness, technology use and active student participation. In this article we will introduce Social Learning Analytics (SLA) as instrumental in formative assessment practices, aimed at supporting and strengthening students as active learners in increasingly open…

  3. Utility of the summation chromatographic peak integration function to avoid manual reintegrations in the analysis of targeted analytes

    USDA-ARS?s Scientific Manuscript database

    As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...

  4. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--PESTICIDE METABOLITES IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticide Metabolites in Urine data set contains the analytical results for measurements of up to 8 pesticide metabolites in 86 samples over 86 households. Each sample was collected from the primary respondent within each household. The sample consists of the first morning ...

  5. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR PESTICIDE METABOLITES IN BLANK SAMPLES

    EPA Science Inventory

    The Pesticide Metabolites in Blank Samples data set contains the analytical results of measurements of up to 4 pesticide metabolites in 3 blank samples from 3 households. Measurements were made in blank samples of urine. Blank samples were used to assess the potential for sampl...

  6. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN WATER ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Water data set contains analytical results for measurements of up to 10 pesticides in 388 water samples over 80 households. One-liter samples of tap water were collected after a two-minute flush from the tap identified by the resident as that most commonly used...

  7. NHEXAS PHASE I ARIZONA STUDY--METALS IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Urine data set contains analytical results for measurements of up to 6 metals in 176 urine samples over 176 households. Each sample was collected from the primary respondent within each household during Stage III of the NHEXAS study. The sample consists of the fir...

  8. NHEXAS PHASE I MARYLAND STUDY--METALS IN SOIL ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Soil data set contains analytical results for measurements of up to 4 metals in 277 soil samples over 75 households. Composite samples were obtained from up to 24 locations around the outside of the specific residence and combined into a single sample. The primary...

  9. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR PESTICIDES IN BLANK SAMPLES

    EPA Science Inventory

    The Pesticides in Blank Samples data set contains the analytical results of measurements of up to 4 pesticides in 43 blank samples from 29 households. Measurements were made in blank samples of dust, indoor and outdoor air, food and beverages, blood, urine, and dermal wipe resid...

  10. NHEXAS PHASE I MARYLAND STUDY--METALS IN DERMAL WIPES ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dermal Wipe Samples data set contains analytical results for measurements of up to 4 metals in 343 dermal wipe samples over 80 households. Each sample was collected from the primary respondent within each household. The sampling period occurred on the first day of...

  11. NHEXAS PHASE I MARYLAND STUDY--METALS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dust data set contains analytical results for measurements of up to 4 metals in 282 dust samples over 80 households. Samples were obtained by collecting dust samples from the indoor floor areas in the main activity room using a modified vacuum cleaner device that c...

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN WATER ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Water data set contains analytical results for measurements of up to 11 metals in 98 water samples over 61 households. Sample collection was undertaken at the tap and any additional drinking water source used extensively within each residence. The primary metals o...

  13. NHEXAS PHASE I MARYLAND STUDY--METALS IN AIR ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Air data set contains analytical results for measurements of up to 4 metals in 458 air samples over 79 households. Twenty-four-hour samples were taken over a one-week period using a continuous pump and solenoid apparatus by pumping a standardized air volume through...

  14. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR METALS IN BLANKS

    EPA Science Inventory

    The Metals in Blanks data set contains the analytical results of measurements of up to 11 metals in 115 blank samples from 58 households. Measurements were made in blank samples of indoor and outdoor air, drinking water, beverages, urine, and blood. Blank samples were used to a...

  15. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN FOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Duplicate Diet Food data set contains analytical results for measurements of up to 10 pesticides in 682 food samples over 80 households. Each sample was collected as a duplicate of the food consumed by the primary respondent during a four-day period commencing ...

  16. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dust data set contains analytical results for measurements of up to 9 pesticides in 126 dust samples over 50 households. Samples were obtained by collecting dust samples from the indoor floor areas in the main activity room using a modified vacuum cleaner devic...

  17. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR METALS IN REPLICATE SAMPLES

    EPA Science Inventory

    The Metals in Replicates data set contains the analytical results of measurements of up to 11 metals in 88 replicate (duplicate) samples from 52 households. Measurements were made in samples of indoor and outdoor air, drinking water, food, and beverages. Duplicate samples for a...

  18. NHEXAS PHASE I MARYLAND STUDY--METALS IN FOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Duplicate Diet Food data set contains analytical results for measurements of up to 11 metals in 773 food samples over 80 households. Each sample was collected as a duplicate of the food consumed by the primary respondent during a four-day period commencing with the...

  19. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN AIR ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Air data set contains analytical results for measurements of up to 9 pesticides in 127 air samples over 51 households. Samples were taken by pumping standardized air volumes through URG impactors with a 10 um cutpoint and polyurethane foam (PUF) filters at indo...

  20. NHEXAS PHASE I MARYLAND STUDY--PESTICIDES IN SOIL ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Soil data set contains analytical results for measurements of up to 9 pesticides in 60 soil samples over 41 households. Composite samples were obtained from up to 24 locations around the outside of the specific residence and combined into a single sample. Only...

  1. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR PESTICIDE METABOLITES IN SPIKE SAMPLES

    EPA Science Inventory

    The Pesticide Metabolites in Spike Samples data set contains the analytical results of measurements of up to 4 pesticide metabolites in 3 control samples (spikes) from 3 households. Measurements were made in spiked samples of urine. Spiked samples were used to assess recovery o...

  2. NHEXAS PHASE I ARIZONA STUDY--METALS IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Blood data set contains analytical results for measurements of up to 2 metals in 165 blood samples over 165 households. Each sample was collected as a venous sample from the primary respondent within each household during Stage III of the NHEXAS study. The samples...

  3. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDES IN BLANKS

    EPA Science Inventory

    The Pesticides in Blanks data set contains the analytical results of measurements of up to 20 pesticides in 70 blank samples from 46 households. Measurements were made in blank samples of indoor air, dust, soil, drinking water, food, beverages, and blood serum. Blank samples we...

  4. NHEXAS PHASE I MARYLAND STUDY--METALS IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Blood data set contains analytical results for measurements of up to 2 metals in 374 blood samples over 80 households. Each sample was collected via a venous sample from the primary respondent within each household by a phlebotomist. Samples were generally drawn o...

  5. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR PESTICIDE METABOLITES IN BLANKS

    EPA Science Inventory

    The Pesticide Metabolites in Blanks data set contains the analytical results of measurements of up to 4 pesticide metabolites in 14 blank samples from 13 households. Measurements were made in blank samples of urine. Blank samples were used to assess the potential for sample con...

  6. NHEXAS PHASE I MARYLAND STUDY--METALS IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Urine data set contains analytical results for measurements of up to 3 metals in 376 urine samples over 80 households. Each sample was collected from the primary respondent within each household during the study and represented the first morning void of either Day ...

  7. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  8. Teaching Critical & Analytical Thinking in High School Biology?

    ERIC Educational Resources Information Center

    McDonald, Gaby

    2012-01-01

    How can critical and analytical thinking be improved so that they mimic real-life research and prepare students for university courses? The data sets obtained in students' experiments were used to encourage students to evaluate results, experiments, and published information critically. Examples show that students can learn to compare and defend…

  9. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--QA ANALYTICAL RESULTS FOR PARTICULATE MATTER IN BLANK SAMPLES

    EPA Science Inventory

    The Particulate Matter in Blank Samples data set contains the analytical results for measurements of two particle sizes in 12 samples. Filters were pre-weighed, loaded into impactors, kept unexposed in the laboratory, unloaded and post-weighed. Positive weight gains for laborat...

  10. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    ERIC Educational Resources Information Center

    Dawson, Shane; Siemens, George

    2014-01-01

    The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…

  11. Multivariate analysis of chromatographic retention data as a supplementary means for grouping structurally related compounds.

    PubMed

    Fasoula, S; Zisi, Ch; Sampsonidis, I; Virgiliou, Ch; Theodoridis, G; Gika, H; Nikitas, P; Pappa-Louisi, A

    2015-03-27

    In the present study a series of 45 metabolite standards belonging to four chemically similar metabolite classes (sugars, amino acids, nucleosides and nucleobases, and amines) was subjected to LC analysis on three HILIC columns under 21 different gradient conditions, with the aim of exploring whether the retention properties of these analytes are determined by the chemical group to which they belong. Two multivariate techniques, principal component analysis (PCA) and discriminant analysis (DA), were used for statistical evaluation of the chromatographic data and extraction of similarities between chemically related compounds. The total variance explained by the first two principal components of PCA was found to be about 98%, whereas both statistical analyses indicated that all analytes are successfully grouped into four clusters of chemical structure based on the retention obtained in four, or at least three, chromatographic runs, which, however, should be performed on two different HILIC columns. Moreover, leave-one-out cross-validation of the above retention data set showed that the chemical group to which an analyte belongs can be correctly predicted 95.6% of the time when the analyte is subjected to LC analysis under the same four or three experimental conditions under which the whole set of analytes was run beforehand. That, in turn, may assist with disambiguation of analyte identification in complex biological extracts. Copyright © 2015 Elsevier B.V. All rights reserved.
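
    A minimal sketch of the PCA-plus-leave-one-out workflow on a retention matrix (rows = analytes, columns = chromatographic runs); the data and class structure below are synthetic placeholders, not the paper's measurements:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      rng = np.random.default_rng(1)
      # synthetic stand-in: 45 analytes x 4 runs, 4 chemical classes with
      # class-dependent retention offsets
      labels = np.repeat([0, 1, 2, 3], [12, 11, 11, 11])
      X = rng.normal(0.0, 0.3, (45, 4)) + labels[:, None] * 1.0

      pca = PCA(n_components=2).fit(X)
      # the paper reports ~98% for the real retention data
      print("variance explained:", pca.explained_variance_ratio_.sum())

      # leave-one-out prediction of the chemical class from retention alone
      acc = cross_val_score(LinearDiscriminantAnalysis(), X, labels,
                            cv=LeaveOneOut()).mean()
      print(f"LOO classification accuracy: {acc:.1%}")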

  12. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an automated computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused by either heavy metal exposures in the environment or clinical complications in hospital. The simulation validity was confirmed against commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach combines visualization methods (hierarchical horizontal axes, multi-factor stacked bar charts, and interactive semi-automated filtering of input and output data) with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. In the first example, our Visual Analytics approach enabled the analyst to observe how element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered to be in local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. In the second example, our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions, including low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
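
    The automatic sensitivity-analysis step that accompanies the visual filtering can be illustrated with a simple parameter sweep; `run_model` below is a hypothetical stand-in for the geochemical simulator, and the parameter names are invented.

```python
# Run a placeholder model over all parameter combinations and rank each
# input by the strength of its monotone relationship with the output.
import itertools
import numpy as np
from scipy.stats import spearmanr

def run_model(params):
    # placeholder geochemical response, e.g. an element concentration
    return params["calcite"] * 0.8 - params["pyrite"] * 0.3 + params["T"] * 0.01

grid = {
    "calcite": np.linspace(0, 1, 5),
    "pyrite": np.linspace(0, 1, 5),
    "T": np.linspace(25, 90, 5),   # degrees C
}
runs = [dict(zip(grid, combo)) for combo in itertools.product(*grid.values())]
output = np.array([run_model(p) for p in runs])

for name in grid:
    rho, _ = spearmanr([p[name] for p in runs], output)
    print(f"{name}: Spearman rho = {rho:+.2f}")
```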

  14. Analytical performance specifications for external quality assessment - definitions and descriptions.

    PubMed

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

    External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 proposed three models to set APS, and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) on clear terminology for EQA APS. The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.
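
    For illustration, the six recommended elements can be captured as a simple record structure; the field names below paraphrase the paper's list and are not an official EFLM schema.

```python
# Illustrative data structure for the six recommended terminology elements.
from dataclasses import dataclass

@dataclass
class EqaAps:
    material_matrix_commutability: str  # 1) EQA material matrix and its commutability
    target_value_method: str            # 2) method used to assign the target value
    applied_data_set: str               # 3) data set to which the APS are applied
    analytical_property: str            # 4) property assessed (total error, bias, ...)
    selection_rationale: str            # 5) rationale for selecting the APS
    milan_model: str                    # 6) Milan model(s) used to set the APS

aps = EqaAps("commutable serum pool", "reference method value",
             "single data points", "total error",
             "biological variation", "Milan model 2")
print(aps)
```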

  15. Full value documentation in the Czech Food Composition Database.

    PubMed

    Machackova, M; Holasova, M; Maskova, E

    2010-11-01

    The aim of this project was to launch a new Food Composition Database (FCDB) Programme in the Czech Republic; to implement a methodology for food description and value documentation according to the standards designed by the European Food Information Resource (EuroFIR) Network of Excellence; and to start the compilation of a pilot FCDB. Foods for the initial data set were selected from the list of foods included in the Czech Food Consumption Basket. Selection of 24 priority components was based on the range of components used in former Czech tables. The priority list was extended with components for which original Czech analytical data or calculated data were available. Values that were input into the compiled database were documented according to the EuroFIR standards within the entities FOOD, COMPONENT, VALUE and REFERENCE using Excel sheets. Foods were described using the LanguaL Thesaurus. A template for documentation of data according to the EuroFIR standards was designed. The initial data set comprised documented data for 162 foods. Values were based on original Czech analytical data (available for traditional and fast foods, milk and milk products, wheat flour types), data derived from literature (for example, fruits, vegetables, nuts, legumes, eggs) and calculated data. The Czech FCDB programme has been successfully relaunched. Inclusion of the Czech data set into the EuroFIR eSearch facility confirmed compliance of the database format with the EuroFIR standards. Excel spreadsheets are applicable for full value documentation in the FCDB.

  16. Electrocardiographic interpretation skills of cardiology residents: are they competent?

    PubMed

    Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C

    2014-12-01

    Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  17. An active learning representative subset selection method using net analyte signal.

    PubMed

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-05

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, it is generally not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and the existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by norm computation. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. A validation test shows that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced. Copyright © 2018 Elsevier B.V. All rights reserved.
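
    A schematic version of the selection loop, on synthetic spectra, might look as follows; the NAS projection is simplified here to the component of a candidate spectrum orthogonal to the already-selected spectra, which may differ from the paper's exact construction from concentrations and spectra.

```python
# Sequentially add the candidates that are least explained by the
# spectra already in the calibration set.
import numpy as np

def residual_norms(selected_spectra, candidates):
    """Norm of each candidate's component orthogonal to the selected set."""
    _, _, Vt = np.linalg.svd(selected_spectra, full_matrices=False)
    P = np.eye(selected_spectra.shape[1]) - Vt.T @ Vt   # orthogonal projector
    return np.linalg.norm(candidates @ P, axis=1)

rng = np.random.default_rng(0)
selected = rng.normal(size=(5, 50))      # spectra already in the calibration set
candidates = rng.normal(size=(40, 50))   # unmeasured candidate spectra

for _ in range(3):                       # add the 3 most "novel" candidates
    i = int(np.argmax(residual_norms(selected, candidates)))
    selected = np.vstack([selected, candidates[i]])
    candidates = np.delete(candidates, i, axis=0)
print("calibration set size:", len(selected))
```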

  18. An active learning representative subset selection method using net analyte signal

    NASA Astrophysics Data System (ADS)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, it is generally not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and the existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by norm computation. The distance between the candidate set and the selected set is then computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. A validation test shows that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.

  19. Institutional, Financial, Legal, and Cultural Factors in a Distance Learning Program.

    PubMed

    Blakeman, Rachel; Haseley, Dennis

    2015-06-01

    As psychoanalytic institutes evolve, adapting to the contemporary financial and social environment, the integration of new technologies into psychoanalytic education presents opportunities for expansion to candidates residing beyond the usual geographic boundaries. While the teaching of analytic content through distance learning programs appears to be relatively straightforward, factors including legalities, traditional mind-sets, and cross-cultural issues need to be considered as complicating the situation, as illustrated by one U.S. institute's distance learning initiative with a group in South Korea. © 2015 by the American Psychoanalytic Association.

  20. Next generation data harmonization

    NASA Astrophysics Data System (ADS)

    Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg

    2015-05-01

    Analysts are presented with a never-ending stream of data sources. Subsets of data sources relevant to a problem are often easy to identify, but the process of aligning the data sets is time consuming. Semantic technologies, however, allow for fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all of these approaches to perform analysis across disparate data sources documenting violent, extremist events.

  1. A review of gear housing dynamics and acoustics literature

    NASA Technical Reports Server (NTRS)

    Lim, Teik Chin; Singh, Rajendra

    1989-01-01

    A review of the available literature on gear housing vibration and noise radiation is presented. Analytical and experimental methodologies used for bearing dynamics, housing vibration and noise, mounts and suspensions, and the overall gear and housing system are discussed. Typical design guidelines, as outlined by various investigators, are also included. Results of this review indicate that although many attempts have been made to characterize the dynamics of gearbox system components, no comprehensive set of design criteria currently exists. Moreover, the literature contains conflicting reports concerning relevant design guidelines.

  2. Design and analytical study of a rotor airfoil

    NASA Technical Reports Server (NTRS)

    Dadone, L. U.

    1978-01-01

    An airfoil section for use on helicopter rotor blades was defined and analyzed by means of potential flow/boundary layer interaction and viscous transonic flow methods to meet as closely as possible a set of advanced airfoil design objectives. The design efforts showed that the first priority objectives, including selected low speed pitching moment, maximum lift and drag divergence requirements can be met, though marginally. The maximum lift requirement at M = 0.5 and most of the profile drag objectives cannot be met without some compromise of at least one of the higher order priorities.

  3. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN HAIR ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Hair data set contains the analytical results for measurements of mercury (CAS # 7439-97-6) collected from 40 of the 86 primary household residents (IRN=01) in the Arizona Border Survey. A hair sample was collected from the occipital region of the participant's head...

  4. Educational Change in Post-Conflict Contexts: Reflections on the South African Experience 20 Years Later

    ERIC Educational Resources Information Center

    Christie, Pam

    2016-01-01

    Reflecting on South African experience, this paper develops an analytical framework using the work of Henri Lefebvre and Nancy Fraser to understand why socially just arrangements may be so difficult to achieve in post-conflict reconstruction. The paper uses Lefebvre's analytic to trace three sets of entangled practices…

  5. NHEXAS PHASE I ARIZONA STUDY--QA ANALYTICAL RESULTS FOR METALS IN BLANK SAMPLES

    EPA Science Inventory

    The Metals in Blank Samples data set contains the analytical results of measurements of up to 27 metals in 82 blank samples from 26 households. Measurements were made in blank samples of dust, indoor and outdoor air, personal air, food, beverages, blood, urine, and dermal wipe r...

  6. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN DERMAL ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dermal Wipes data set contains analytical results for measurements of up to 11 metals in 86 dermal wipe samples over 86 households. Each sample was collected from the primary respondent within each household. The sampling period occurred on the first day of the fi...

  7. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKE SAMPLES

    EPA Science Inventory

    The Metals in Spike Samples data set contains the analytical results of measurements of up to 11 metals in 15 control samples (spikes) from 11 households. Measurements were made in spiked samples of dust, food, and dermal wipe residue. Spiked samples were used to assess recover...

  8. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--QA ANALYTICAL RESULTS FOR METALS IN REPLICATE SAMPLES

    EPA Science Inventory

    The Metals in Replicate Samples data set contains the analytical results of measurements of up to 2 metals in 172 replicate (duplicate) samples from 86 households. Measurements were made in samples of blood. Duplicate samples for a small percentage of the total number of sample...

  9. Triple Helix Systems: An Analytical Framework for Innovation Policy and Practice in the Knowledge Society

    ERIC Educational Resources Information Center

    Ranga, Marina; Etzkowitz, Henry

    2013-01-01

    This paper introduces the concept of Triple Helix systems as an analytical construct that synthesizes the key features of university--industry--government (Triple Helix) interactions into an "innovation system" format, defined according to systems theory as a set of components, relationships and functions. Among the components of Triple…

  10. NHEXAS PHASE I ARIZONA STUDY--METALS-XRF IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals-XRF in Dust data set contains X-ray fluorescence (XRF) analytical results for measurements of up to 27 metals in 384 dust samples over 384 households. Samples were taken by collecting dust from the indoor floor areas in the main room and in the bedroom of the primary ...

  11. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--PESTICIDES IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides in Dust data set contains analytical results for measurements of up to 8 pesticides in 91 dust samples over 91 households. Samples were taken by collecting dust from the indoor floor areas in the main room and in the bedroom of the primary resident. The primary p...

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--PAHS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The PAHs in Dust data set contains the analytical results for measurements of up to 21 polynuclear aromatic hydrocarbons (PAHs) in 91 dust samples over 91 households. Samples were taken by collecting dust from the indoor floor areas from the main room and in the bedroom of the p...

  13. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Dust data set contains analytical results for measurements of up to 11 metals in 182 dust samples over 91 households. Samples were taken by collecting dust samples from the indoor floor areas in the main room and in the bedroom of the primary resident. In addition...

  14. A Meta-Analytic Review of School-Based Prevention for Cannabis Use

    ERIC Educational Resources Information Center

    Porath-Waller, Amy J.; Beasley, Erin; Beirness, Douglas J.

    2010-01-01

    This investigation used meta-analytic techniques to evaluate the effectiveness of school-based prevention programming in reducing cannabis use among youth aged 12 to 19. It summarized the results from 15 studies published in peer-reviewed journals since 1999 and identified features that influenced program effectiveness. The results from the set of…

  15. NHEXAS PHASE I MARYLAND STUDY--METALS IN WATER ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Water data set contains analytical results for measurements of up to 11 metals in 400 water samples over 80 households. One-liter samples of tap water were collected after a two minute flush from the tap identified by the resident as that most commonly used for dri...

  16. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN URINE ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Urine data set contains analytical results for measurements of up to 7 metals in 86 urine samples over 86 households. Each sample was collected from the primary respondent within each household. The sample consists of the first morning void following the 24-hour d...

  17. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals in Blood data set contains analytical results for measurements of up to 2 metals in 86 blood samples over 86 households. Each sample was collected as a venous sample from the primary respondent within each household. The samples consisted of two 3-mL tubes. The prim...

  18. NHEXAS PHASE I MARYLAND STUDY--QA ANALYTICAL RESULTS FOR METALS IN SPIKE SAMPLES

    EPA Science Inventory

    The Metals in Spikes data set contains the analytical results of measurements of up to 4 metals in 71 control samples (spikes) from 47 households. Measurements were made in samples of indoor and outdoor air, blood, and urine. Controls were used to assess recovery of target anal...

  19. Degrees of School Democracy: A Holistic Framework

    ERIC Educational Resources Information Center

    Woods, Philip A.; Woods, Glenys J.

    2012-01-01

    This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…

  20. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  1. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques spanning multiple spatial and spectral scales, including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered, co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  2. The life and work of Melanie Klein in the British Psycho-Analytical Society.

    PubMed

    King, P H

    1983-01-01

    This paper describes certain aspects of the life and work of Melanie Klein in the British Psycho-Analytical Society. It attempts to highlight the reciprocity of the relationship between Melanie Klein and other members of that Society by showing how the climate of psychoanalytical opinion that was prevalent among members of that Society during the first decade of her stay in London, and which encouraged discussion of clinical work and interest in psychoanalytical discovery, provided a congenial setting for her to become firmly established as an active member of the British Society and to continue her contributions to psychoanalytic theory and clinical expertise. The paper also traces the development of Melanie Klein's main theoretical contributions, together with relevant criticisms of them as they emerged, against the background of the history of the British Psycho-Analytical Society. It describes the controversies that arose as to whether or not her ideas could properly be viewed within the framework of psychoanalytic theory, as formulated by Freud, and the attempted resolution of these controversies, together with some comments on the repercussions of these theoretical disagreements on relationships within the Society. An extensive list of references is included to facilitate a more detailed study of the subject.

  3. Proactive Supply Chain Performance Management with Predictive Analytics

    PubMed Central

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605
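
    One of the KPI predictive data mining steps can be sketched as a plain supervised regression; the features and the KPI below are invented placeholders rather than the paper's semantic model.

```python
# Train a regression model on historical KPI data and evaluate it on a
# held-out split. Feature and KPI names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))   # e.g. order volume, lead time, inventory level
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.2, size=500)  # e.g. fill rate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("held-out MAE:", mean_absolute_error(y_te, model.predict(X_te)))
```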

  4. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  5. [Chiral separation of five beta-blockers using di-n-hexyl L-tartrate-boric acid complex as mobile phase additive by reversed-phase liquid chromatography].

    PubMed

    Yang, Juan; Wang, Lijuan; Guo, Qiaoling; Yang, Gengliang

    2012-03-01

    A reversed-phase high performance liquid chromatographic (HPLC) method using the di-n-hexyl L-tartrate-boric acid complex as a chiral mobile phase additive was developed for the enantioseparation of five beta-blockers: propranolol, esmolol, metoprolol, bisoprolol and sotalol. In order to obtain better enantioseparation, the influences of the concentrations of di-n-hexyl L-tartrate and boric acid, the type, concentration and pH of the buffer, the methanol content, and the molecular structure of the analytes were extensively investigated. The separation of the analytes was performed on a Venusil MP-C18 column (250 mm x 4.6 mm, 5 microm). The mobile phase was 15 mmol/L ammonium acetate-methanol containing 60 mmol/L boric acid and 70 mmol/L di-n-hexyl L-tartrate (pH 6.00). The volume ratios of 15 mmol/L ammonium acetate to methanol were 20:80 for propranolol, esmolol, metoprolol and bisoprolol, and 30:70 for sotalol. The flow rate was 0.5 mL/min and the detection wavelength was set at 214 nm. Under the optimized conditions, baseline enantioseparation was obtained for each of the five analytes.

  6. Proactive supply chain performance management with predictive analytics.

    PubMed

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.

  7. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies are identified and potential future research directions are suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. igun - A program for the simulation of positive ion extraction including magnetic fields

    NASA Astrophysics Data System (ADS)

    Becker, R.; Herrmannsfeldt, W. B.

    1992-04-01

    igun is a program for the simulation of positive ion extraction from plasmas. It is based on the well known program egun for the calculation of electron and ion trajectories in electron guns and lenses. The mathematical treatment of the plasma sheath is based on a simple analytical model, which provides a numerically stable calculation of the sheath potentials. In contrast to other ion extraction programs, igun is able to determine the extracted ion current in succeeding cycles of iteration by itself. However, it is also possible to set values of current, plasma density, or ion current density. Either axisymmetric or rectangular coordinates can be used, including axisymmetric or transverse magnetic fields.

  9. Lyapunov dimension formula for the global attractor of the Lorenz system

    NASA Astrophysics Data System (ADS)

    Leonov, G. A.; Kuznetsov, N. V.; Korzhemanova, N. A.; Kusakin, D. V.

    2016-12-01

    The exact Lyapunov dimension formula for the Lorenz system for a positive-measure set of parameters, including the classical values, was first obtained analytically by G.A. Leonov in 2002. Leonov used a construction technique based on special Lyapunov-type functions, which he had developed in 1991. Later it was shown that considering a larger class of Lyapunov-type functions permits proving the validity of this formula for all parameters of the system such that all equilibria of the system are hyperbolically unstable. In the present work the validity of the formula for the Lyapunov dimension is proved for a wider range of parameter values, including all parameters which satisfy the classical physical limitations.
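
    For reference, the Lorenz system and the Lyapunov dimension formula in question are commonly stated as follows (with the classical parameter values sigma = 10, r = 28, b = 8/3); this is the form usually attributed to Leonov in the literature.

```latex
% Lorenz system and the exact Lyapunov dimension of its global attractor K.
\begin{aligned}
&\dot{x} = \sigma (y - x), \qquad
 \dot{y} = r x - y - x z, \qquad
 \dot{z} = x y - b z,\\
&\dim_{L} K \;=\; 3 \;-\;
 \frac{2\,(\sigma + b + 1)}{\sigma + 1 + \sqrt{(\sigma - 1)^{2} + 4 r \sigma}}.
\end{aligned}
```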

  10. Characterization and modeling of SET/RESET cycling induced read-disturb failure time degradation in a resistive switching memory

    NASA Astrophysics Data System (ADS)

    Su, Po-Cheng; Hsu, Chun-Chi; Du, Sin-I.; Wang, Tahui

    2017-12-01

    Read operation induced disturbance of the SET-state in a tungsten oxide resistive switching memory is investigated. We observe that the reduction of oxygen vacancy density during read-disturb follows a power-law dependence on cumulative read-disturb time. Our study shows that the SET-state read-disturb immunity progressively degrades by orders of magnitude as the SET/RESET cycle number increases. To explore the cause of the read-disturb degradation, we perform a constant voltage stress to emulate high-field stress effects in SET/RESET cycling. We find that the read-disturb failure time degradation is attributed to high-field stress-generated oxide traps. Since the stress-generated traps may substitute for some of the oxygen vacancies in forming conductive percolation paths in a switching dielectric, a stressed cell has a reduced oxygen vacancy density in the SET-state, which in turn results in a shorter read-disturb failure time. We develop an analytical read-disturb degradation model including both cycling-induced oxide trap creation and read-disturb-induced oxygen vacancy reduction. Our model can well reproduce the measured read-disturb failure time degradation in a cycled cell without using fitting parameters.
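
    A toy numerical rendering of this degradation picture is shown below; all constants are illustrative assumptions rather than values fitted to the paper's data.

```python
# Oxygen-vacancy density decays as a power law of cumulative read time, and
# cycling-induced traps lower the starting density, shortening the time to
# reach the failure threshold. Arbitrary units throughout.
def failure_time(n_cycles, n0=1.0, trap_rate=1e-4, alpha=0.1, n_crit=0.5):
    """Read-disturb failure time vs. SET/RESET cycle count (toy model)."""
    n_start = n0 - trap_rate * n_cycles**0.5   # traps displace some vacancies
    # n(t) = n_start * t**(-alpha) reaches n_crit at:
    return (n_start / n_crit) ** (1.0 / alpha)

for cycles in (1e2, 1e4, 1e6):
    print(f"{cycles:.0e} cycles -> failure time ~ {failure_time(cycles):.2e}")
```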

  11. Parameter Optimization for Feature and Hit Generation in a General Unknown Screening Method-Proof of Concept Study Using a Design of Experiment Approach for a High Resolution Mass Spectrometry Procedure after Data Independent Acquisition.

    PubMed

    Elmiger, Marco P; Poetzsch, Michael; Steuer, Andrea E; Kraemer, Thomas

    2018-03-06

    High resolution mass spectrometry and modern data independent acquisition (DIA) methods enable the creation of general unknown screening (GUS) procedures. However, even when DIA is used, its potential is far from being exploited, because the untargeted acquisition is often followed by a targeted search. Applying an actual GUS (including untargeted screening) produces an immense amount of data that must be dealt with. An optimization of the parameters regulating the feature detection and hit generation algorithms of the data processing software could significantly reduce the amount of unnecessary data and thereby the workload. Design of experiment (DoE) approaches allow a simultaneous optimization of multiple parameters. In a first step, parameters are evaluated as crucial or noncrucial; in a second step, the crucial parameters are optimized. The aim of this study was to reduce the number of hits without missing analytes. The parameter settings obtained from the optimization were compared to the standard settings by analyzing a test set of blood samples spiked with 22 relevant analytes as well as 62 authentic forensic cases. The optimization led to a marked reduction of workload (12.3 to 1.1% and 3.8 to 1.1% hits for the test set and the authentic cases, respectively) while simultaneously increasing the identification rate (68.2 to 86.4% and 68.8 to 88.1%, respectively). This proof of concept study emphasizes the great potential of DoE approaches to master the data overload resulting from modern data independent acquisition methods used for general unknown screening procedures by optimizing software parameters.
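
    The screening idea can be illustrated with a two-level full-factorial sweep; the parameter names and the scoring function below are hypothetical stand-ins for the vendor software's settings and outputs.

```python
# Enumerate all two-level combinations of three (invented) processing
# parameters and keep the setting with the fewest hits that still finds
# every spiked analyte.
import itertools

levels = {
    "mass_tolerance_ppm": (2, 10),
    "min_peak_width_s": (1, 5),
    "intensity_threshold": (1e3, 1e4),
}

def score(settings):
    # stand-in for "run the software, count hits and found analytes"
    hits = settings["mass_tolerance_ppm"] * 100 / settings["min_peak_width_s"]
    found = 22 if settings["intensity_threshold"] < 5e3 else 20
    return hits, found

best = None
for combo in itertools.product(*levels.values()):
    settings = dict(zip(levels, combo))
    hits, found = score(settings)
    if found == 22 and (best is None or hits < best[0]):
        best = (hits, settings)
print("fewest hits without losing analytes:", best)
```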

  12. Odor Recognition vs. Classification in Artificial Olfaction

    NASA Astrophysics Data System (ADS)

    Raman, Baranidharan; Hertz, Joshua; Benkstein, Kurt; Semancik, Steve

    2011-09-01

    Most studies in chemical sensing have focused on the problem of precise identification of chemical species encountered during the training phase (the recognition problem). However, generalization of training to predict the chemical composition of untrained gases based on their similarity with analytes in the training set (the classification problem) has received very limited attention. These two analytical tasks pose conflicting constraints on the system. While correct recognition requires detection of molecular features that are unique to an analyte, generalization to untrained chemicals requires detection of features that are common across a desired class of analytes. A simple solution that addresses both issues simultaneously can be obtained from biological olfaction, where the odor class and identity information are decoupled and extracted individually over time. Mimicking this approach, we proposed a hierarchical scheme that allows initial discrimination between broad chemical classes (e.g. contains oxygen), followed by finer refinement into sub-classes using additional data (e.g. ketones vs. alcohols) and, eventually, specific compositions (e.g. ethanol vs. methanol) [1]. We validated this approach using an array of temperature-controlled chemiresistors. We demonstrated that a small set of training analytes is sufficient to allow generalization to novel chemicals and that the scheme provides robust categorization despite aging. Here, we provide further characterization of this approach.
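
    A minimal version of the hierarchical scheme, on synthetic sensor data, could proceed as follows: a coarse classifier assigns the broad class, then a class-specific classifier refines the identity. Labels and data are placeholders.

```python
# Two-stage (coarse -> fine) classification of sensor-array responses.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))                    # sensor-array responses
coarse = (X[:, 0] > 0).astype(int)               # e.g. oxygenated vs. not
fine = coarse * 2 + (X[:, 1] > 0)                # e.g. ketone/alcohol subclasses

coarse_clf = LogisticRegression().fit(X, coarse)
fine_clfs = {c: LogisticRegression().fit(X[coarse == c], fine[coarse == c])
             for c in (0, 1)}

def classify(x):
    c = int(coarse_clf.predict(x[None])[0])           # step 1: broad class
    return c, int(fine_clfs[c].predict(x[None])[0])   # step 2: refinement

print(classify(X[0]))
```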

  13. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96-well plates can place a considerable strain upon assay capabilities, generating a bottleneck that significantly limits the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which techniques are the most appropriate to use and how best to deploy them to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  14. Impression Management and Interview and Job Performance Ratings: A Meta-Analysis of Research Design with Tactics in Mind

    PubMed Central

    Peck, Jessica A.; Levashina, Julia

    2017-01-01

    Impression management (IM) is pervasive in interview and job performance settings. We meta-analytically examine IM by self- and other-focused tactics to establish base rates of tactic usage, to understand the impact of tactics on interview and job performance ratings, and to examine the moderating effects of research design. Our results suggest IM is used more frequently in the interview rather than job performance settings. Self-focused tactics are more effective in the interview rather than in job performance settings, and other-focused tactics are more effective in job performance settings rather than in the interview. We explore several research design moderators including research fidelity, rater, and participants. IM has a somewhat stronger impact on interview ratings in lab settings than field settings. IM also has a stronger impact on interview ratings when the target of IM is also the rater of performance than when the rater of performance is an observer. Finally, labor market participants use IM more frequently and more effectively than students in interview settings. Our research has implications for understanding how different IM tactics function in interview and job performance settings and the effects of research design on IM frequency and impact. PMID:28261135

  15. Reducing the Analytical Bottleneck for Domain Scientists: Lessons from a Climate Data Visualization Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Bertini, Enrico

    2016-01-01

    The gap between the large-scale data production rate and the rate of generation of data-driven scientific insights has led to an analytical bottleneck in scientific domains like climate and biology. This is primarily due to the lack of innovative analytical tools that can help scientists efficiently analyze and explore alternative hypotheses about the data, and communicate their findings effectively to a broad audience. In this paper, by reflecting on a set of successful collaborative research efforts between a group of climate scientists and visualization researchers, we examine how interactive visualization can help reduce the analytical bottleneck for domain scientists.

  16. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as a set of expansion functions for putting both sources into comparable form. The treatment of dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated on small classical structures.
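
    The expansion idea can be sketched as a least-squares fit of each experimental mode in the analytical basis, with the residual serving as a mismatch measure; this is a generic reconstruction, not the program's exact statistic.

```python
# Expand an experimental mode shape in a basis of analytical modes.
import numpy as np

def expand(analytical_modes, experimental_mode):
    """analytical_modes: (n_dof, n_modes); experimental_mode: (n_dof,)."""
    coeffs, *_ = np.linalg.lstsq(analytical_modes, experimental_mode, rcond=None)
    fit = analytical_modes @ coeffs
    residual = np.linalg.norm(experimental_mode - fit)
    residual /= np.linalg.norm(experimental_mode)   # relative mismatch
    return coeffs, residual

A = np.random.default_rng(3).normal(size=(50, 10))  # 10 analytical modes, 50 DOF
phi = A[:, 2] + 0.05 * np.random.default_rng(4).normal(size=50)  # noisy test mode
c, r = expand(A, phi)
print("dominant analytical mode:", int(np.argmax(np.abs(c))),
      "relative residual:", round(r, 3))
```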

  17. Modeling of energy release systems from OTEC plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denno, K.

    1983-12-01

    This paper presents an analytical scope for the controlling functions of OTEC operation for the ultimate production of sizable bulk ΔT as well as H2, N2 and NH3. The controlling parametric functions include the oceanic and ammonia Reynolds numbers, which depend implicitly and explicitly on the ocean water velocity, mass-volume, duration of ΔT extraction, and the inlet and outlet water temperatures internally and externally. Solutions for the oceanic and ammonia Reynolds numbers have been established, setting the deciding constraints on water velocity, boundary temperatures, mass-volume, and other plant parameters. Linkage between the OTEC plant and other conventional as well as advanced energy systems has been expressed in terms of a set of balance and coordinating energy equations.

  18. SWAT system performance predictions

    NASA Astrophysics Data System (ADS)

    Parenti, Ronald R.; Sasiela, Richard J.

    1993-03-01

    In the next phase of Lincoln Laboratory's SWAT (Short-Wavelength Adaptive Techniques) program, the performance of a 241-actuator adaptive-optics system will be measured using a variety of synthetic-beacon geometries. As an aid in this experimental investigation, a detailed set of theoretical predictions has also been assembled. The computational tools that have been applied in this study include a numerical approach in which Monte-Carlo ray-trace simulations of accumulated phase error are developed, and an analytical analysis of the expected system behavior. This report describes the basis of these two computational techniques and compares their estimates of overall system performance. Although their regions of applicability tend to be complementary rather than redundant, good agreement is usually obtained when both sets of results can be derived for the same engagement scenario.

  19. Mirion--a software package for automatic processing of mass spectrometric images.

    PubMed

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources tailored for spatial scanning has allowed the acquisition of large data sets. Subsequent data processing, however, is still a bottleneck in the analytical process, as manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds has turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and more easily than plain numbers. Image generation is thus a complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
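
    Since Mirion reads the open imzML format, data prepared for it can also be inspected with open parsers; the sketch below uses the third-party pyimzML reader with a placeholder file and m/z value, and is not part of Mirion itself.

```python
# Build a single ion image from an imzML file with the pyimzML reader.
import matplotlib.pyplot as plt
from pyimzml.ImzMLParser import ImzMLParser, getionimage

parser = ImzMLParser("example.imzML")         # hypothetical file path
image = getionimage(parser, 760.5, tol=0.25)  # e.g. intensity at m/z 760.5

plt.imshow(image, cmap="viridis")
plt.title("m/z 760.5 +/- 0.25")
plt.colorbar(label="intensity")
plt.savefig("ion_image.png")
```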

  20. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    PubMed

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Aimed at the characteristics of spontaneous combustion gases, such as the variety of gases involved, low limits of detection, and critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is introduced first by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. Parameter setting, sample preparation, feature variable abstraction and analysis model building are taken into consideration, and the methods of sample preparation, feature abstraction and analysis modelling are introduced in detail. Eleven gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm(-1). The testing results show that the detection limit for all analytes is less than 2 x 10(-6). All the detection limits meet the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument and the analysis method used in this paper is competent for on-line measurement of spontaneous combustion gases.
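
    One common quantification route for such multi-gas FTIR spectra is classical least squares against reference absorbance spectra, assuming Beer-Lambert additivity; the spectra below are synthetic placeholders, and the paper's actual analysis model may differ.

```python
# Estimate 11 gas concentrations from a mixture spectrum by least squares
# against reference spectra (Beer-Lambert additivity assumed).
import numpy as np

rng = np.random.default_rng(6)
K = rng.random((600, 11))          # reference spectra of 11 gases (600 points)
c_true = rng.random(11) * 1e-6     # "true" concentrations (mole fraction)
mixture = K @ c_true + 1e-9 * rng.normal(size=600)   # measured absorbance

c_est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
print("max absolute error:", float(np.max(np.abs(c_est - c_true))))
```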

  1. Pretreatment and integrated analysis of spectral data reveal seaweed similarities based on chemical diversity.

    PubMed

    Wei, Feifei; Ito, Kengo; Sakata, Kenji; Date, Yasuhiro; Kikuchi, Jun

    2015-03-03

    Extracting useful information from large, high-dimensional data sets is a major challenge for data-driven approaches. The present study was aimed at developing novel integrated analytical strategies for comprehensively characterizing seaweed similarities based on chemical diversity. The chemical compositions of 107 seaweed and 2 seagrass samples were analyzed using multiple techniques, including Fourier transform infrared (FT-IR) and solid- and solution-state nuclear magnetic resonance (NMR) spectroscopy, thermogravimetry-differential thermal analysis (TG-DTA), inductively coupled plasma-optical emission spectrometry (ICP-OES), CHNS/O total elemental analysis, and isotope ratio mass spectrometry (IR-MS). The spectral data were preprocessed using non-negative matrix factorization (NMF) and NMF combined with multivariate curve resolution-alternating least-squares (MCR-ALS) methods in order to separate individual component information from the overlapping and/or broad spectral peaks. Integrated analysis of the preprocessed chemical data demonstrated distinct discrimination of different seaweed species. Further network analysis revealed a close correlation between the heavy metal elements and characteristic components of brown algae, such as cellulose, alginic acid, and sulfated mucopolysaccharides, providing a componential basis for their metal-sorbing potential. These results suggest that this integrated analytical strategy is useful for extracting and identifying the chemical characteristics of diverse seaweeds based on large chemical data sets, particularly complicated overlapping spectral data.
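
    The NMF preprocessing step can be sketched with scikit-learn; the matrix below is synthetic, standing in for the study's 109 spectra (107 seaweeds plus 2 seagrasses).

```python
# Factor a non-negative spectral matrix into component "spectra" and
# per-sample weights before integrated analysis.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
true_components = rng.random((3, 200))   # 3 underlying spectral shapes
weights = rng.random((109, 3))           # 109 samples, as in the study
X = weights @ true_components + 0.01 * rng.random((109, 200))

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)    # sample-wise component weights
H = model.components_         # resolved component "spectra"
print(W.shape, H.shape)       # (109, 3) (3, 200)
```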

  2. Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.

    PubMed

    Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko

    2015-01-01

    Here, we set out to improve our previously developed analytical method for methylmercury, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. The modification enabled derivatization at the optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The analytical method was validated as suitable for determining compliance with the provisional regulation value for methylmercury in fish, set in the Food Sanitation Law.

  3. Exposing the Backstage: Critical Reflections on a Longitudinal Qualitative Study of Residents’ Care Networks in Assisted Living

    PubMed Central

    Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.

    2018-01-01

    In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072

  4. Exposing the Backstage: Critical Reflections on a Longitudinal Qualitative Study of Residents' Care Networks in Assisted Living.

    PubMed

    Kemp, Candace L; Ball, Mary M; Morgan, Jennifer Craft; Doyle, Patrick J; Burgess, Elisabeth O; Dillard, Joy A; Barmon, Christina E; Fitzroy, Andrea F; Helmly, Victoria E; Avent, Elizabeth S; Perkins, Molly M

    2017-07-01

    In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents' care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building.

  5. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
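
    As a hedged illustration of the verification step this abstract describes, the sketch below compares a Monte Carlo k-effective estimate against an exact analytic value for a one-group infinite-medium problem (k = νΣf/Σa). It is not MCNP code; the constants, the MC estimate, and the acceptance window are all invented for illustration.

        def verify_keff(k_mc, sigma_mc, k_exact, n_sigma=3.0):
            """Pass if the analytic value lies within n_sigma of the MC estimate."""
            return abs(k_mc - k_exact) <= n_sigma * sigma_mc

        # Hypothetical one-group infinite-medium problem: k = nu*Sigma_f / Sigma_a.
        nu_sigma_f, sigma_a = 0.0268, 0.0266     # invented one-group constants (1/cm)
        k_exact = nu_sigma_f / sigma_a           # exact analytic k-infinity
        print(k_exact)                           # ~1.00752
        print(verify_keff(k_mc=1.00731, sigma_mc=0.00042, k_exact=k_exact))  # True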

  6. The BioExtract Server: a web-based bioinformatic workflow platform

    PubMed Central

    Lushbough, Carol M.; Jennewein, Douglas M.; Brendel, Volker P.

    2011-01-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet. PMID:21546552
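
    The BioExtract Server itself is driven through its web interface, but the shape of a recorded workflow (query a distributed source, keep the extract, run a tool on it) can be sketched in script form. The example below uses Biopython's NCBI Entrez interface as a stand-in; the search term and three-step structure are illustrative assumptions, not the platform's API.

        from Bio import Entrez, SeqIO

        Entrez.email = "user@example.org"       # NCBI requires a contact address

        # Step 1: query a distributed data source (NCBI protein database).
        handle = Entrez.esearch(db="protein", term="myoglobin AND mammals[ORGN]", retmax=5)
        ids = Entrez.read(handle)["IdList"]
        handle.close()

        # Step 2: save the query results as a data extract.
        handle = Entrez.efetch(db="protein", id=",".join(ids),
                               rettype="fasta", retmode="text")
        records = list(SeqIO.parse(handle, "fasta"))
        handle.close()

        # Step 3: apply a (trivial) analytic tool to the extract.
        for rec in records:
            print(rec.id, len(rec.seq))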

  7. Assessing the utility of transcriptome data for inferring phylogenetic relationships among coleoid cephalopods.

    PubMed

    Lindgren, Annie R; Anderson, Frank E

    2018-01-01

    Historically, deep-level relationships within the molluscan class Cephalopoda (squids, cuttlefishes, octopods and their relatives) have remained elusive due in part to the considerable morphological diversity of extant taxa, a limited fossil record for species that lack a calcareous shell and difficulties in sampling open ocean taxa. Many conflicts identified by morphologists in the early 1900s remain unresolved today in spite of advances in morphological, molecular and analytical methods. In this study we assess the utility of transcriptome data for resolving cephalopod phylogeny, with special focus on the orders of Decapodiformes (open-eye squids, bobtail squids, cuttlefishes and relatives). To do so, we took new and previously published transcriptome data and used a unique cephalopod core ortholog set to generate a dataset that was subjected to an array of filtering and analytical methods to assess the impacts of: taxon sampling, ortholog number, compositional and rate heterogeneity and incongruence across loci. Analyses indicated that datasets that maximized taxonomic coverage but included fewer orthologs were less stable than datasets that sacrificed taxon sampling to increase the number of orthologs. Clades recovered irrespective of dataset, filtering or analytical method included Octopodiformes (Vampyroteuthis infernalis + octopods), Decapodiformes (squids, cuttlefishes and their relatives), and orders Oegopsida (open-eyed squids) and Myopsida (e.g., loliginid squids). Ordinal-level relationships within Decapodiformes were the most susceptible to dataset perturbation, further emphasizing the challenges associated with uncovering relationships at deep nodes in the cephalopod tree of life. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Applying Advanced Analytical Approaches to Characterize the Impact of Specific Clinical Gaps and Profiles on the Management of Rheumatoid Arthritis.

    PubMed

    Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven

    2016-01-01

    The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology has the ability to not only characterize learning gaps but also identify those proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to the knowledge of rheumatologists and confidence concerning 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was also identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities to those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.

  9. Analytic Morse/long-range potential energy surfaces and "adiabatic-hindered-rotor" treatment for a symmetric top-linear molecule dimer: A case study of CH3F-H2

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Long; Ma, Yong-Tao; Zhai, Yu; Li, Hui

    2018-03-01

    A first effective six-dimensional ab initio potential energy surface (PES) for CH3F-H2 which explicitly includes the intramolecular Q3 stretching normal mode of the CH3F monomer is presented. The electronic structure computations have been carried out at the explicitly correlated coupled cluster level of theory [CCSD(T)-F12a] with an augmented correlation-consistent triple zeta basis set. Five-dimensional analytical intermolecular PESs for ν3(CH3F) = 0 and 1 are then obtained by fitting the vibrationally averaged potentials to the Morse/Long-Range (MLR) potential function form. The MLR function form is applied to the nonlinear molecule-linear molecule case for the first time. These fits to 25 015 points have root-mean-square deviations of 0.74 cm⁻¹ and 0.082 cm⁻¹ for interaction energies less than 0.0 cm⁻¹. Using the adiabatic hindered-rotor approximation, three-dimensional PESs for CH3F-paraH2 are generated from the 5D PESs over all possible orientations of the hydrogen monomer. The infrared and microwave spectra for the CH3F-paraH2 dimer are predicted for the first time. These analytic PESs can be used for modeling the dynamical behavior in CH3F-(H2)N clusters, including the possible appearance of microscopic superfluidity.
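
    To make the fitting step concrete, here is a minimal one-dimensional sketch of a Morse/Long-Range (MLR) fit with a single C6 long-range term and a constant exponent function, a heavily simplified stand-in for the five-dimensional fits described above. The grid, noise level, and parameter values are hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        def mlr(r, De, re, C6, phi):
            """1-D MLR form: single C6 long-range term, constant exponent phi."""
            u, u_re = C6 / r**6, C6 / re**6
            p = 3.0
            y = (r**p - re**p) / (r**p + re**p)      # reduced radial variable y_p(r)
            return De * (1.0 - (u / u_re) * np.exp(-phi * y))**2 - De

        r = np.linspace(3.0, 12.0, 60)               # hypothetical grid of separations
        rng = np.random.default_rng(0)
        v_ref = mlr(r, 120.0, 3.8, 1.5e4, 2.0) + rng.normal(0, 0.02, r.size)

        popt, _ = curve_fit(mlr, r, v_ref, p0=[100.0, 4.0, 1.0e4, 1.5])
        rmsd = np.sqrt(np.mean((mlr(r, *popt) - v_ref) ** 2))
        print("De, re, C6, phi =", np.round(popt, 3), " RMSD =", round(float(rmsd), 4))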

  10. Forensic toxicology.

    PubMed

    Drummer, Olaf H

    2010-01-01

    Forensic toxicology has developed as a forensic science in recent years and is now widely used to assist in death investigations, in civil and criminal matters involving drug use, in drugs of abuse testing in correctional settings and custodial medicine, in road and workplace safety, in matters involving environmental pollution, as well as in sports doping. Drugs most commonly targeted include amphetamines, benzodiazepines, cannabis, cocaine and the opiates, but can be any other illicit substance or almost any over-the-counter or prescribed drug, as well as poisons available to the community. The discipline requires high level skills in analytical techniques with a solid knowledge of pharmacology and pharmacokinetics. Modern techniques rely heavily on immunoassay screening analyses and mass spectrometry (MS) for confirmatory analyses using either high-performance liquid chromatography or gas chromatography as the separation technique. Tandem MS has become more and more popular compared to single-stage MS. It is essential that analytical systems are fully validated and fit for the purpose and the assay batches are monitored with quality controls. External proficiency programs monitor both the assay and the personnel performing the work. For a laboratory to perform optimally, it is vital that the circumstances and context of the case are known and the laboratory understands the limitations of the analytical systems used, including drug stability. Drugs and poisons can change concentration postmortem due to poor or unequal quality of blood and other specimens, anaerobic metabolism and redistribution. The latter provides the largest handicap in the interpretation of postmortem results.

  11. Evaluation of a reduced centrifugation time and higher centrifugal force on various general chemistry and immunochemistry analytes in plasma and serum.

    PubMed

    Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B

    2017-09-01

    Background Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all studies have been done using collection tubes with gel separator. Methods In this study, we compared a centrifugation time of 5 min at 3000 × g to a standard protocol of 10 min at 2200 × g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 × g; mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions We found that a centrifugation protocol of 5 min at 3000 × g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.
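
    A minimal sketch of the acceptance check described above: percentage bias of paired results from the alternative protocol against the standard one, compared to an assumed allowable-bias limit. The paired values and the 5% limit are hypothetical, not data from the study.

        import numpy as np

        # Paired LDH results (U/L) from the same samples under both protocols (invented).
        std  = np.array([180.0, 195.0, 210.0, 175.0, 188.0])  # 10 min at 2200 x g
        fast = np.array([192.0, 206.0, 224.0, 187.0, 199.0])  # 5 min at 3000 x g

        diff_pct = 100.0 * (fast - std) / std                 # per-sample % difference
        mean_bias, sd_bias = diff_pct.mean(), diff_pct.std(ddof=1)

        allowable_bias = 5.0                                  # assumed limit, %
        print(f"mean bias {mean_bias:+.1f}% (SD {sd_bias:.1f}%)")
        print("within limits" if abs(mean_bias) <= allowable_bias else "exceeds limit")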

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as massless gas phase and no wall and interfacial frictions, analytical solutions had been previously obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for the purposes of code assessment, benchmark and numerical verifications. In our previous study, Ransom’s solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom’s analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the gas phase density’s effect on pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom’s transient solutions for the gas phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom’s solutions are also presented.
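
    For reference, the widely used transient Ransom solution (massless gas, no wall or interfacial friction) can be evaluated directly; a sketch follows. The inlet conditions are chosen to match the common test configuration (10 m/s inlet liquid velocity, 20% inlet void fraction, 12 m pipe) rather than taken from this paper.

        import numpy as np

        def faucet(x, t, v0=10.0, alpha_l0=0.8, g=9.8):
            """Liquid velocity and gas void fraction for the Ransom water faucet."""
            front = v0 * t + 0.5 * g * t**2              # depth the inlet signal has reached
            v = np.where(x <= front, np.sqrt(v0**2 + 2 * g * x), v0 + g * t)
            alpha_l = np.where(x <= front, alpha_l0 * v0 / v, alpha_l0)
            return v, 1.0 - alpha_l

        x = np.linspace(0.0, 12.0, 7)                    # positions along the 12 m pipe
        v, void = faucet(x, t=0.5)
        print(np.round(v, 3))
        print(np.round(void, 3))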

  13. Detection method for dissociation of multiple-charged ions

    DOEpatents

    Smith, Richard D.; Udseth, Harold R.; Rockwood, Alan L.

    1991-01-01

    Dissociations of multiple-charged ions are detected and analyzed by charge-separation tandem mass spectrometry. Analyte molecules are ionized to form multiple-charged parent ions. A particular charge parent ion state is selected in a first-stage mass spectrometer and its mass-to-charge ratio (M/Z) is detected to determine its mass and charge. The selected parent ions are then dissociated, each into a plurality of fragments including a set of daughter ions each having a mass of at least one molecular weight and a charge of at least one. Sets of daughter ions resulting from the dissociation of one parent ion (sibling ions) vary in number but typically include two to four ions, one or more multiply-charged. A second stage mass spectrometer detects mass-to-charge ratio (m/z) of the daughter ions and a temporal or temporo-spatial relationship among them. This relationship is used to correlate the daughter ions to determine which (m/z) ratios belong to a set of sibling ions. Values of mass and charge of each of the sibling ions are determined simultaneously from their respective (m/z) ratios such that the sibling ion charges are integers and sum to the parent ion charge.
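
    A sketch of the correlation step the patent describes: given a parent ion of known mass and charge, enumerate integer charge assignments for a set of correlated daughter m/z values and keep those whose charges sum to the parent charge and whose masses sum to the parent mass. The arithmetic below simplifies mass to (m/z) × z, ignoring adduct/proton masses, and all numbers are hypothetical.

        from itertools import product

        def assign_charges(parent_mass, parent_charge, daughter_mz, tol=0.5):
            """Find integer daughter charges consistent with the parent ion."""
            hits = []
            for charges in product(range(1, parent_charge + 1), repeat=len(daughter_mz)):
                if sum(charges) != parent_charge:
                    continue
                masses = [mz * z for mz, z in zip(daughter_mz, charges)]
                if abs(sum(masses) - parent_mass) <= tol:
                    hits.append(list(zip(daughter_mz, charges, masses)))
            return hits

        # Hypothetical parent (mass 10000, charge 5) dissociating into two daughters.
        print(assign_charges(10000.0, 5, daughter_mz=[2333.333, 1500.0]))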

  14. The epidemiology of substance use among street children in resource-constrained settings: a systematic review and meta-analysis

    PubMed Central

    Embleton, Lonnie; Mwangi, Ann; Vreeman, Rachel; Ayuku, David; Braitstein, Paula

    2013-01-01

    Aims To compile and analyze critically the literature published on street children and substance use in resource-constrained settings. Methods We searched the literature systematically and used meta-analytical procedures to synthesize literature that met the review’s inclusion criteria. Pooled-prevalence estimates and 95% confidence intervals (CI) were calculated using the random-effects model for life-time substance use by geographical region as well as by type of substance used. Results Fifty studies from 22 countries were included into the review. Meta-analysis of combined life-time substance use from 27 studies yielded an overall drug use pooled-prevalence estimate of 60% (95% CI = 51–69%). Studies from 14 countries contributed to an overall pooled prevalence for street children’s reported inhalant use of 47% (95% CI = 36–58%). This review reveals significant gaps in the literature, including a dearth of data on physical and mental health outcomes, HIV and mortality in association with street children’s substance use. Conclusions Street children from resource-constrained settings reported high life-time substance use. Inhalants are the predominant substances used, followed by tobacco, alcohol and marijuana. PMID:23844822
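
    For readers unfamiliar with the pooling step, the following is a minimal DerSimonian-Laird random-effects sketch of the kind used for the prevalence estimates above; the study counts are invented, not data from the review.

        import numpy as np

        events = np.array([55, 120, 38, 210])   # hypothetical lifetime users per study
        n      = np.array([90, 200, 70, 350])   # hypothetical study sample sizes
        p = events / n
        v = p * (1 - p) / n                     # within-study variance of each proportion

        w = 1 / v                               # fixed-effect weights
        p_fixed = (w * p).sum() / w.sum()
        q = (w * (p - p_fixed) ** 2).sum()      # Cochran's Q heterogeneity statistic
        c = w.sum() - (w ** 2).sum() / w.sum()
        tau2 = max(0.0, (q - (len(p) - 1)) / c) # DerSimonian-Laird between-study variance

        w_re = 1 / (v + tau2)                   # random-effects weights
        pooled = (w_re * p).sum() / w_re.sum()
        se = np.sqrt(1 / w_re.sum())
        print(f"pooled prevalence {pooled:.2f} "
              f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")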

  15. Enhancing Students' Numeracy and Analytical Skills with the Use of Technology: A Career Preparation Approach

    ERIC Educational Resources Information Center

    Hurst, Jessica L.

    2013-01-01

    In today's competitive job market, education is increasingly touted as necessary preparation for the workplace (Joyce, Hassall, Montano, & Anes, 2006). The literature suggests that one of the most important skill sets identified by both educators and practitioners is the need for numerical competency and analytical proficiency (Joyce et…

  16. A simulation-based evaluation of methods for inferring linear barriers to gene flow

    Treesearch

    Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol

    2012-01-01

    Different analytical techniques used on the same data set may lead to different conclusions about the existence and strength of genetic structure. Therefore, reliable interpretation of the results from different methods depends on the efficacy and reliability of different statistical methods. In this paper, we evaluated the performance of multiple analytical methods to...

  17. The Challenges of Teaching Business Analytics: Finding Real Big Data for Business Students

    ERIC Educational Resources Information Center

    Yap, Alexander Y.; Drye, Sherrie L.

    2018-01-01

    This research shares the challenges of bringing in real-world big business data into the classroom so students can experience how today's business decisions can improve with the strategic use of data analytics. Finding a true big data set that provides real world business transactions and operational data has been a challenge for academics…

  18. Does Incubation Enhance Problem Solving? A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Sio, Ut Na; Ormerod, Thomas C.

    2009-01-01

    A meta-analytic review of empirical studies that have investigated incubation effects on problem solving is reported. Although some researchers have reported increased solution rates after an incubation period (i.e., a period of time in which a problem is set aside prior to further attempts to solve), others have failed to find effects. The…

  19. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--QA ANALYTICAL RESULTS FOR METALS IN BLANK SAMPLES

    EPA Science Inventory

    The Metals in Blank Samples data set contains the analytical results of measurements of up to 27 metals in 52 blank samples. Measurements were made in blank samples of dust, indoor air, food, water, and dermal wipe residue. Blank samples were used to assess the potential for sa...

  20. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--METALS/XRF IN DUST ANALYTICAL RESULTS

    EPA Science Inventory

    The Metals-XRF in Dust data set contains X-ray fluorescence (XRF) analytical results for measurements of up to 27 metals in 91 dust samples over 91 households. Samples were taken by collecting dust from the indoor floor areas in the main room and in the bedroom of the primary re...

  1. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY-PESTICIDES AND POLYCHLORINATED BIPHENYLS (PCBS) IN BLOOD ANALYTICAL RESULTS

    EPA Science Inventory

    The Pesticides and PCBs in Blood data set contains analytical results for measurements of up to 11 pesticides and up to 36 PCBs in 86 blood samples over 86 households. Each sample was collected as a venous sample from the primary respondent within each household. The samples co...

  2. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    ERIC Educational Resources Information Center

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors…

  3. A Conversation-Analytic Perspective on the Organization of Teacher-Led Clarification and Its Implications for L2 Teacher Training

    ERIC Educational Resources Information Center

    Atar, Cihat; Seedhouse, Paul

    2018-01-01

    This study analyses teacher-led clarification sequences in a university second language classroom setting from a conversation-analytic perspective. In the literature, there are many studies of clarification requests, but the focus is on individual categories and quantification. No previous study has examined clarification, as reconceptualised in…

  4. Sequential Multiplex Analyte Capturing for Phosphoprotein Profiling*

    PubMed Central

    Poetz, Oliver; Henzler, Tanja; Hartmann, Michael; Kazmaier, Cornelia; Templin, Markus F.; Herget, Thomas; Joos, Thomas O.

    2010-01-01

    Microarray-based sandwich immunoassays can simultaneously detect dozens of proteins. However, their use in quantifying large numbers of proteins is hampered by cross-reactivity and incompatibilities caused by the immunoassays themselves. Sequential multiplex analyte capturing addresses these problems by repeatedly probing the same sample with different sets of antibody-coated, magnetic suspension bead arrays. As a miniaturized immunoassay format, suspension bead array-based assays fulfill the criteria of the ambient analyte theory, and our experiments reveal that the analyte concentrations are not significantly changed. The value of sequential multiplex analyte capturing was demonstrated by probing tumor cell line lysates for the abundance of seven different receptor tyrosine kinases and their degree of phosphorylation and by measuring the complex phosphorylation pattern of the epidermal growth factor receptor in the same sample from the same cavity. PMID:20682761

  5. Separation of very hydrophobic analytes by micellar electrokinetic chromatography IV. Modeling of the effective electrophoretic mobility from carbon number equivalents and octanol-water partition coefficients.

    PubMed

    Huhn, Carolin; Pyell, Ute

    2008-07-11

    We investigate whether relationships derived within a previously developed optimization scheme for micellar electrokinetic chromatography can be used to model the effective electrophoretic mobilities of analytes that differ strongly in their properties (polarity and type of interaction with the pseudostationary phase). The modeling is based on two parameter sets: (i) carbon number equivalents or octanol-water partition coefficients as analyte descriptors and (ii) four coefficients describing properties of the separation electrolyte (based on retention data for a homologous series of alkyl phenyl ketones used as reference analytes). The applicability of the proposed model is validated by comparing experimental and calculated effective electrophoretic mobilities. The results demonstrate that the model can effectively be used to predict effective electrophoretic mobilities of neutral analytes from the determined carbon number equivalents or from octanol-water partition coefficients, provided that the solvation parameters of the analytes of interest are similar to those of the reference analytes.
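
    A minimal sketch of the kind of model described, assuming (as an illustration, not the paper's exact equations) a linear log k versus carbon-number-equivalent relation calibrated on the ketone homologues, plus the standard MEKC relation μeff = μmc·k/(1+k) for neutral analytes; all coefficient values are invented.

        # Assumed calibration from the alkyl phenyl ketone homologues:
        # log10 k = a * CNE + b; mu_mc is the effective mobility of the micelles.
        a, b = 0.28, -1.1            # hypothetical slope/intercept of log k vs. CNE
        mu_mc = -4.2e-4              # hypothetical micelle mobility, cm^2 V^-1 s^-1

        def mu_eff(cne):
            k = 10 ** (a * cne + b)              # retention factor from the descriptor
            return mu_mc * k / (1 + k)           # MEKC relation for neutral analytes

        for cne in (3.0, 5.0, 7.0):
            print(cne, f"{mu_eff(cne):.2e}")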

  6. Hidden Costs: the ethics of cost-effectiveness analyses for health interventions in resource-limited settings

    PubMed Central

    Rutstein, Sarah E.; Price, Joan T.; Rosenberg, Nora E.; Rennie, Stuart M.; Biddle, Andrea K.; Miller, William C.

    2017-01-01

    Cost-effectiveness analysis (CEA) is an increasingly appealing tool for evaluating and comparing health-related interventions in resource-limited settings. The goal is to inform decision-makers regarding the health benefits and associated costs of alternative interventions, helping guide allocation of limited resources by prioritizing interventions that offer the most health for the least money. Although only one component of a more complex decision-making process, CEAs influence the distribution of healthcare resources, directly influencing morbidity and mortality for the world’s most vulnerable populations. However, CEA-associated measures are frequently setting-specific valuations, and CEA outcomes may violate ethical principles of equity and distributive justice. We examine the assumptions and analytical tools used in CEAs that may conflict with societal values. We then evaluate contextual features unique to resource-limited settings, including the source of health-state utilities and disability weights; implications of CEA thresholds in light of economic uncertainty; and the role of external donors. Finally, we explore opportunities to help align interpretation of CEA outcomes with values and budgetary constraints in resource-limited settings. The ethical implications of CEAs in resource-limited settings are vast. It is imperative that CEA outcome summary measures and implementation thresholds adequately reflect societal values and ethical priorities in resource-limited settings. PMID:27141969

  7. Hidden costs: The ethics of cost-effectiveness analyses for health interventions in resource-limited settings.

    PubMed

    Rutstein, Sarah E; Price, Joan T; Rosenberg, Nora E; Rennie, Stuart M; Biddle, Andrea K; Miller, William C

    2017-10-01

    Cost-effectiveness analysis (CEA) is an increasingly appealing tool for evaluating and comparing health-related interventions in resource-limited settings. The goal is to inform decision-makers regarding the health benefits and associated costs of alternative interventions, helping guide allocation of limited resources by prioritising interventions that offer the most health for the least money. Although only one component of a more complex decision-making process, CEAs influence the distribution of health-care resources, directly influencing morbidity and mortality for the world's most vulnerable populations. However, CEA-associated measures are frequently setting-specific valuations, and CEA outcomes may violate ethical principles of equity and distributive justice. We examine the assumptions and analytical tools used in CEAs that may conflict with societal values. We then evaluate contextual features unique to resource-limited settings, including the source of health-state utilities and disability weights, implications of CEA thresholds in light of economic uncertainty, and the role of external donors. Finally, we explore opportunities to help align interpretation of CEA outcomes with values and budgetary constraints in resource-limited settings. The ethical implications of CEAs in resource-limited settings are vast. It is imperative that CEA outcome summary measures and implementation thresholds adequately reflect societal values and ethical priorities in resource-limited settings.
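
    The core arithmetic such analyses rest on is the incremental cost-effectiveness ratio; a toy computation follows, with all costs, effects, and the willingness-to-pay threshold invented for illustration.

        def icer(cost_a, eff_a, cost_b, eff_b):
            """Incremental cost per extra unit of health effect (e.g. DALY averted)."""
            return (cost_b - cost_a) / (eff_b - eff_a)

        ratio = icer(cost_a=120_000, eff_a=400, cost_b=200_000, eff_b=520)
        threshold = 1_000   # assumed willingness-to-pay, $ per DALY averted
        print(f"ICER ${ratio:.0f}/DALY ->", "adopt" if ratio <= threshold else "reject")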

  8. Association of Protein Translation and Extracellular Matrix Gene Sets with Breast Cancer Metastasis: Findings Uncovered on Analysis of Multiple Publicly Available Datasets Using Individual Patient Data Approach.

    PubMed

    Chowdhury, Nilotpal; Sapru, Shantanu

    2015-01-01

    Microarray analysis has revolutionized the role of genomic prognostication in breast cancer. However, most studies are single series studies, and suffer from methodological problems. We sought to use a meta-analytic approach in combining multiple publicly available datasets, while correcting for batch effects, to reach a more robust oncogenomic analysis. The aim of the present study was to find gene sets associated with distant metastasis free survival (DMFS) in systemically untreated, node-negative breast cancer patients, from publicly available genomic microarray datasets. Four microarray series (having 742 patients) were selected after a systematic search and combined. Cox regression for each gene was done for the combined dataset (univariate, as well as multivariate - adjusted for expression of Cell cycle related genes) and for the 4 major molecular subtypes. The centre and microarray batch effects were adjusted by including them as random effects variables. The Cox regression coefficients for each analysis were then ranked and subjected to a Gene Set Enrichment Analysis (GSEA). Gene sets representing protein translation were independently negatively associated with metastasis in the Luminal A and Luminal B subtypes, but positively associated with metastasis in Basal tumors. Proteinaceous extracellular matrix (ECM) gene set expression was positively associated with metastasis, after adjustment for expression of cell cycle related genes on the combined dataset. Finally, the positive association of the proliferation-related genes with metastases was confirmed. To the best of our knowledge, the results depicting mixed prognostic significance of protein translation in breast cancer subtypes are being reported for the first time. We attribute this to our study combining multiple series and performing a more robust meta-analytic Cox regression modeling on the combined dataset, thus discovering 'hidden' associations. This methodology seems to yield new and interesting results and may be used as a tool to guide new research.

  9. Association of Protein Translation and Extracellular Matrix Gene Sets with Breast Cancer Metastasis: Findings Uncovered on Analysis of Multiple Publicly Available Datasets Using Individual Patient Data Approach

    PubMed Central

    Chowdhury, Nilotpal; Sapru, Shantanu

    2015-01-01

    Introduction Microarray analysis has revolutionized the role of genomic prognostication in breast cancer. However, most studies are single series studies, and suffer from methodological problems. We sought to use a meta-analytic approach in combining multiple publicly available datasets, while correcting for batch effects, to reach a more robust oncogenomic analysis. Aim The aim of the present study was to find gene sets associated with distant metastasis free survival (DMFS) in systemically untreated, node-negative breast cancer patients, from publicly available genomic microarray datasets. Methods Four microarray series (having 742 patients) were selected after a systematic search and combined. Cox regression for each gene was done for the combined dataset (univariate, as well as multivariate – adjusted for expression of Cell cycle related genes) and for the 4 major molecular subtypes. The centre and microarray batch effects were adjusted by including them as random effects variables. The Cox regression coefficients for each analysis were then ranked and subjected to a Gene Set Enrichment Analysis (GSEA). Results Gene sets representing protein translation were independently negatively associated with metastasis in the Luminal A and Luminal B subtypes, but positively associated with metastasis in Basal tumors. Proteinaceous extracellular matrix (ECM) gene set expression was positively associated with metastasis, after adjustment for expression of cell cycle related genes on the combined dataset. Finally, the positive association of the proliferation-related genes with metastases was confirmed. Conclusion To the best of our knowledge, the results depicting mixed prognostic significance of protein translation in breast cancer subtypes are being reported for the first time. We attribute this to our study combining multiple series and performing a more robust meta-analytic Cox regression modeling on the combined dataset, thus discovering 'hidden' associations. This methodology seems to yield new and interesting results and may be used as a tool to guide new research. PMID:26080057
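
    As a hedged sketch of the per-gene ranking step (with the batch and centre random effects omitted for brevity), the following fits one Cox model per gene with the lifelines package on synthetic data and ranks the coefficients as input for a GSEA-style enrichment.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        genes = [f"g{i}" for i in range(5)]
        df = pd.DataFrame(rng.normal(size=(120, len(genes))), columns=genes)
        df["time"] = rng.exponential(60, 120)    # months of follow-up (synthetic)
        df["event"] = rng.integers(0, 2, 120)    # 1 = distant metastasis observed

        coefs = {}
        for g in genes:
            cph = CoxPHFitter().fit(df[[g, "time", "event"]],
                                    duration_col="time", event_col="event")
            coefs[g] = cph.params_[g]            # log hazard ratio for this gene

        ranked = sorted(coefs, key=coefs.get, reverse=True)   # ranked list for GSEA
        print(ranked)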

  10. Concept for facilitating analyst-mediated interpretation of qualitative chromatographic-mass spectral data: an alternative to manual examination of extracted ion chromatograms.

    PubMed

    Borges, Chad R

    2007-07-01

    A chemometrics-based data analysis concept has been developed as a substitute for manual inspection of extracted ion chromatograms (XICs), which facilitates rapid, analyst-mediated interpretation of GC- and LC/MS(n) data sets from samples undergoing qualitative batchwise screening for prespecified sets of analytes. Automatic preparation of data into two-dimensional row space-derived scatter plots (row space plots) eliminates the need to manually interpret hundreds to thousands of XICs per batch of samples while keeping all interpretation of raw data directly in the hands of the analyst, saving great quantities of human time without loss of integrity in the data analysis process. For a given analyte, two analyte-specific variables are automatically collected by a computer algorithm and placed into a data matrix (i.e., placed into row space): the first variable is the ion abundance corresponding to scan number x and analyte-specific m/z value y, and the second variable is the ion abundance corresponding to scan number x and analyte-specific m/z value z (a second ion). These two variables serve as the two axes of the aforementioned row space plots. In order to collect appropriate scan number (retention time) information, it is necessary to analyze, as part of every batch, a sample containing a mixture of all analytes to be tested. When pure standard materials of tested analytes are unavailable, but representative ion m/z values are known and retention time can be approximated, data are evaluated based on two-dimensional scores plots from principal component analysis of small time range(s) of mass spectral data. The time-saving efficiency of this concept is directly proportional to the percentage of negative samples and to the total number of samples processed simultaneously.
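
    A minimal sketch of the row space plot idea, on synthetic data: for each sample, take the abundances of two analyte-specific m/z values at the analyte's expected scan number and plot them against each other, so one scatter point replaces the manual review of two XICs per sample. The array shapes and signal values are invented.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(2)
        n_samples, n_scans = 40, 300
        # data[s, scan, j]: abundance of analyte-specific ion j in sample s
        data = rng.normal(50.0, 5.0, size=(n_samples, n_scans, 2))
        positives = rng.choice(n_samples, 6, replace=False)
        scan_x = 150                                    # analyte's expected scan number
        data[positives, scan_x, :] += [400.0, 250.0]    # signal on both diagnostic ions

        x, y = data[:, scan_x, 0], data[:, scan_x, 1]   # the two row-space variables
        plt.scatter(x, y)
        plt.xlabel("abundance at m/z y")
        plt.ylabel("abundance at m/z z")
        plt.show()    # positive samples separate from the blank cluster on both axes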

  11. Three-dimensional eddy current solution of a polyphase machine test model (abstract)

    NASA Astrophysics Data System (ADS)

    Pahner, Uwe; Belmans, Ronnie; Ostovic, Vlado

    1994-05-01

    This abstract describes a three-dimensional (3D) finite element solution of a test model that has been reported in the literature. The model is a basis for calculating the current redistribution effects in the end windings of turbogenerators. The aim of the study is to see whether the analytical results of the test model can be found using a general purpose finite element package, thus indicating that the finite element model is accurate enough to treat real end winding problems. The real end winding problems cannot be solved analytically, as the geometry is far too complicated. The model consists of a polyphase coil set, containing 44 individual coils. This set generates a two pole mmf distribution on a cylindrical surface. The rotating field causes eddy currents to flow in the inner massive and conducting rotor. In the analytical solution a perfect sinusoidal mmf distribution is put forward. The finite element model contains 85824 tetrahedra and 16451 nodes. A complex single scalar potential representation is used in the nonconducting parts. The computation time required was 3 h and 42 min. The flux plots show that the field distribution is acceptable. Furthermore, the induced currents are calculated and compared with the values found from the analytical solution. The distribution of the eddy currents is very close to the distribution of the analytical solution. The most important results are the losses, both local and global. The value of the overall losses is less than 2% away from those of the analytical solution. Also the local distribution of the losses is at any given point less than 7% away from the analytical solution. The deviations of the results are acceptable and are partially due to the fact that the sinusoidal mmf distribution was not modeled perfectly in the finite element method.

  12. Sustained prediction ability of net analyte preprocessing methods using reduced calibration sets. Theoretical and experimental study involving the spectrophotometric analysis of multicomponent mixtures.

    PubMed

    Goicoechea, H C; Olivieri, A C

    2001-07-01

    A newly developed multivariate method involving net analyte preprocessing (NAP) was tested using central composite calibration designs of progressively decreasing size regarding the multivariate simultaneous spectrophotometric determination of three active components (phenylephrine, diphenhydramine and naphazoline) and one excipient (methylparaben) in nasal solutions. Its performance was evaluated and compared with that of partial least-squares (PLS-1). Minimisation of the calibration predicted error sum of squares (PRESS) as a function of a moving spectral window helped to select appropriate working spectral ranges for both methods. The comparison of NAP and PLS results was carried out using two tests: (1) the elliptical joint confidence region for the slope and intercept of a predicted versus actual concentrations plot for a large validation set of samples and (2) the D-optimality criterion concerning the information content of the calibration data matrix. Extensive simulations and experimental validation showed that, unlike PLS, the NAP method is able to furnish highly satisfactory results when the calibration set is reduced from a full four-component central composite to a fractional central composite, as expected from the modelling requirements of net analyte based methods.
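
    The net-analyte idea at the heart of NAP methods can be sketched briefly: project a mixture spectrum orthogonal to the space spanned by the other components, leaving a signal that responds only to the analyte of interest. The Gaussian spectra and concentrations below are synthetic illustrations, not the paper's data.

        import numpy as np

        wl = np.linspace(0.0, 1.0, 200)                       # wavelength axis
        gauss = lambda c, w: np.exp(-((wl - c) / w) ** 2)
        S = np.column_stack([gauss(0.3, 0.08),                # analyte of interest
                             gauss(0.5, 0.10),                # interferent 1
                             gauss(0.7, 0.09)])               # interferent 2

        others = S[:, 1:]                                     # everything but the analyte
        P = np.eye(len(wl)) - others @ np.linalg.pinv(others) # orthogonal projector

        conc = np.array([0.4, 1.2, 0.8])                      # hypothetical concentrations
        r = S @ conc + np.random.default_rng(3).normal(0, 1e-3, len(wl))

        nas_pure = P @ S[:, 0]                                # net analyte signal, pure
        estimate = (P @ r) @ nas_pure / (nas_pure @ nas_pure)
        print(f"estimated analyte concentration: {estimate:.3f} (true 0.400)")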

  13. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminant can be due to the manufacturing process, to raw materials used or to the migration of phthalates from packaging when plastic (polyvinyl chloride--PVC) is used. 8 phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system with electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit at 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012 Elsevier B.V. All rights reserved.
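
    The quantification scheme (external calibration using an internal standard) reduces to a linear fit of peak-area ratio against standard concentration, then inversion for the sample; a sketch follows with invented numbers.

        import numpy as np

        conc_std  = np.array([0.5, 1.0, 2.0, 3.5, 5.0])        # standard conc., ug/mL
        ratio_std = np.array([0.21, 0.40, 0.83, 1.42, 2.05])   # area(phthalate)/area(IS)

        slope, intercept = np.polyfit(conc_std, ratio_std, 1)  # calibration line

        ratio_sample = 0.97                                    # ratio measured in extract
        conc_sample = (ratio_sample - intercept) / slope
        print(f"{conc_sample:.2f} ug/mL in the injected extract")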

  14. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative- and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered “biomarkers.” The heterogeneity of measurement platforms and inherent variability of biological systems confounds the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method in which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory response in the two diseases, with particular processes specific to each disease. PMID:24808906
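
    A toy sketch of comparing data sets at the annotation level rather than the gene level: test each annotation for enrichment (hypergeometric test) separately in two hit lists that share few genes, then intersect the significant annotations. This illustrates the idea only and is not the categoryCompare implementation; all gene and annotation sets are invented.

        from scipy.stats import hypergeom

        def enriched(sig_genes, universe, annotation, alpha=0.05):
            """Hypergeometric enrichment of an annotation in a significant-gene list."""
            hits = len(sig_genes & annotation)
            p = hypergeom.sf(hits - 1, len(universe),
                             len(annotation & universe), len(sig_genes))
            return p < alpha

        universe = {f"g{i}" for i in range(1000)}
        annotations = {"toy_process": {f"g{i}" for i in range(40)}}

        sig_a = {f"g{i}" for i in range(0, 30)}     # hit list from platform A
        sig_b = {f"g{i}" for i in range(15, 55)}    # hit list from platform B

        shared = {name for name, genes in annotations.items()
                  if enriched(sig_a, universe, genes) and enriched(sig_b, universe, genes)}
        print(shared)   # processes implicated by both data sets despite different genes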

  15. Prescriber barriers and enablers to minimising potentially inappropriate medications in adults: a systematic review and thematic synthesis.

    PubMed

    Anderson, Kristen; Stowasser, Danielle; Freeman, Christopher; Scott, Ian

    2014-12-08

    To synthesise qualitative studies that explore prescribers' perceived barriers and enablers to minimising potentially inappropriate medications (PIMs) chronically prescribed in adults. A qualitative systematic review was undertaken by searching PubMed, EMBASE, Scopus, PsycINFO, CINAHL and INFORMIT from inception to March 2014, combined with an extensive manual search of reference lists and related citations. A quality checklist was used to assess the transparency of the reporting of included studies and the potential for bias. Thematic synthesis identified common subthemes and descriptive themes across studies from which an analytical construct was developed. Study characteristics were examined to explain differences in findings. All healthcare settings. Medical and non-medical prescribers of medicines to adults. Prescribers' perspectives on factors which shape their behaviour towards continuing or discontinuing PIMs in adults. 21 studies were included; most explored primary care physicians' perspectives on managing older, community-based adults. Barriers and enablers to minimising PIMs emerged within four analytical themes: problem awareness; inertia secondary to lower perceived value proposition for ceasing versus continuing PIMs; self-efficacy in regard to personal ability to alter prescribing; and feasibility of altering prescribing in routine care environments given external constraints. The first three themes are intrinsic to the prescriber (eg, beliefs, attitudes, knowledge, skills, behaviour) and the fourth is extrinsic (eg, patient, work setting, health system and cultural factors). The PIMs examined and practice setting influenced the themes reported. A multitude of highly interdependent factors shape prescribers' behaviour towards continuing or discontinuing PIMs. A full understanding of prescriber barriers and enablers to changing prescribing behaviour is critical to the development of targeted interventions aimed at deprescribing PIMs and reducing the risk of iatrogenic harm. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  16. Uncertainty of relative sensitivity factors in glow discharge mass spectrometry

    NASA Astrophysics Data System (ADS)

    Meija, Juris; Methven, Brad; Sturgeon, Ralph E.

    2017-10-01

    The concept of the relative sensitivity factors required for the correction of the measured ion beam ratios in pin-cell glow discharge mass spectrometry is examined in detail. We propose a data-driven model for predicting the relative response factors, which relies on a non-linear least squares adjustment and analyte/matrix interchangeability phenomena. The model provides a self-consistent set of response factors for any analyte/matrix combination of any element that appears as either an analyte or matrix in at least one known response factor.
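
    One way to realize such a self-consistent set, sketched here as an assumption rather than the authors' exact model, is to give each element a single log-scale factor f so that log RSF(analyte, matrix) = f_analyte - f_matrix, then fit the factors to the known pairs by least squares; any unmeasured analyte/matrix combination follows from the fitted factors. The RSF values below are invented.

        import numpy as np

        elements = ["Fe", "Cu", "Ni", "Zn"]
        idx = {e: i for i, e in enumerate(elements)}
        measured = [("Cu", "Fe", 1.8), ("Ni", "Fe", 1.2), ("Zn", "Cu", 0.9)]  # invented

        A = np.zeros((len(measured) + 1, len(elements)))
        b = np.zeros(len(measured) + 1)
        for row, (analyte, matrix, rsf) in enumerate(measured):
            A[row, idx[analyte]], A[row, idx[matrix]] = 1.0, -1.0
            b[row] = np.log(rsf)
        A[-1, idx["Fe"]] = 1.0          # gauge fixing: f_Fe = 0

        f, *_ = np.linalg.lstsq(A, b, rcond=None)
        predict = lambda an, ma: np.exp(f[idx[an]] - f[idx[ma]])
        print(f"RSF(Zn in Fe matrix) ~ {predict('Zn', 'Fe'):.2f}")   # never measured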

  17. Charge transport and trapping in organic field effect transistors exposed to polar analytes

    NASA Astrophysics Data System (ADS)

    Duarte, Davianne; Sharma, Deepak; Cobb, Brian; Dodabalapur, Ananth

    2011-03-01

    Pentacene based organic thin-film transistors were used to study the effects of polar analytes on charge transport and trapping behavior during vapor sensing. Three sets of devices with differing morphology and mobility (0.001-0.5 cm2/V s) were employed. All devices show enhanced trapping upon exposure to analyte molecules. The organic field effect transistors with different mobilities also provide evidence for morphology dependent partition coefficients. This study helps provide a physical basis for many reports on organic transistor based sensor response.

  18. Nonlinear Dynamics of Nanomechanical Resonators

    NASA Astrophysics Data System (ADS)

    Ramakrishnan, Subramanian; Gulak, Yuiry; Sundaram, Bala; Benaroya, Haym

    2007-03-01

    Nanoelectromechanical systems (NEMS) offer great promise for many applications including motion and mass sensing. Recent experimental results suggest the importance of nonlinear effects in NEMS, an issue which has not been addressed fully in theory. We report on a nonlinear extension of a recent analytical model by Armour et al. [1] for the dynamics of a single-electron transistor (SET) coupled to a nanomechanical resonator. We consider the nonlinear resonator motion in both (a) the Duffing and (b) nonlinear pendulum regimes. The corresponding master equations are derived and solved numerically, and we consider moment approximations as well. In the Duffing case with hardening stiffness, we observe that the resonator is damped by the SET at a significantly higher rate. In the cases of softening stiffness and the pendulum, there exist regimes where the SET adds energy to the resonator. To our knowledge, this is the first instance of a single model displaying both negative and positive resonator damping in different dynamical regimes. The implications of the results for SET sensitivity, as well as for as-yet-unexplained experimental results, will be discussed. 1. Armour et al., Phys. Rev. B 69, 125313 (2004).

  19. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
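
    The AHP step at the core of such tool sets can be sketched compactly: derive criterion weights as the principal eigenvector of a pairwise-comparison matrix and check its consistency index. The criteria and judgments below are hypothetical, not taken from the study's tool.

        import numpy as np

        criteria = ["cost", "schedule", "traffic impact"]
        A = np.array([[1.0, 3.0, 5.0],     # hypothetical pairwise judgments
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        vals, vecs = np.linalg.eig(A)
        i = np.argmax(vals.real)
        w = np.abs(vecs[:, i].real)
        w /= w.sum()                       # normalized priority weights

        n = len(criteria)
        ci = (vals.real[i] - n) / (n - 1)  # consistency index (0 = perfectly consistent)
        print(dict(zip(criteria, np.round(w, 3))), f"CI = {ci:.3f}")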

  20. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…
