Sample records for factor analytic approaches

  1. Comparison of three methods for wind turbine capacity factor estimation.

    PubMed

    Ditkovich, Y; Kuperman, A

    2014-01-01

    Three approaches to calculating the capacity factor of fixed-speed wind turbines are reviewed and compared using a case study. The first, "quasiexact" approach uses raw discrete wind data (in histogram form) and the manufacturer-provided turbine power curve (also in discrete form) to calculate the capacity factor numerically. The second, "analytic" approach employs a continuous probability distribution function fitted to the wind data together with a continuous turbine power curve obtained by double polynomial fitting of the manufacturer-provided power curve data. The latter approach, while an approximation, can be solved analytically and thus provides valuable insight into the factors affecting the capacity factor. Moreover, several other figures of merit of wind turbine performance may be derived from the analytical approach. The third, "approximate" approach, valid only for Rayleigh winds, employs a nonlinear approximation of the capacity factor versus average wind speed curve and requires only the rated power and rotor diameter of the turbine. The results obtained with the three approaches are shown to be very close, reinforcing the validity of the analytically derived approximations, which may be used for wind turbine performance evaluation.
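
    As a minimal illustration of the "quasiexact" computation described above, the sketch below estimates a capacity factor from a discrete wind-speed histogram and a discrete power curve; all numbers are invented, not the paper's case-study data:

    ```python
    import numpy as np

    # Hypothetical wind-speed histogram: bin centers (m/s) and relative frequencies.
    wind_speeds = np.array([3, 5, 7, 9, 11, 13, 15])
    frequencies = np.array([0.10, 0.20, 0.25, 0.20, 0.13, 0.08, 0.04])
    frequencies = frequencies / frequencies.sum()   # normalize to a probability mass function

    # Hypothetical manufacturer power curve sampled at the same speeds (kW); rated power 2000 kW.
    power_curve = np.array([50, 300, 800, 1400, 1900, 2000, 2000])
    rated_power = 2000.0

    # "Quasiexact" capacity factor: expected power output divided by rated power.
    expected_power = np.sum(frequencies * power_curve)
    print(f"Estimated capacity factor: {expected_power / rated_power:.3f}")
    ```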

  2. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  3. Factor-Analytic and Individualized Approaches to Constructing Brief Measures of ADHD Behaviors

    ERIC Educational Resources Information Center

    Volpe, Robert J.; Gadow, Kenneth D.; Blom-Hoffman, Jessica; Feinberg, Adam B.

    2009-01-01

    Two studies were performed to examine a factor-analytic and an individualized approach to creating short progress-monitoring measures from the longer "ADHD-Symptom Checklist-4" (ADHD-SC4). In Study 1, teacher ratings on items of the ADHD:Inattentive (IA) and ADHD:Hyperactive-Impulsive (HI) scales of the ADHD-SC4 were factor analyzed in a normative…

  4. Comparison of Three Methods for Wind Turbine Capacity Factor Estimation

    PubMed Central

    Ditkovich, Y.; Kuperman, A.

    2014-01-01

    Three approaches to calculating the capacity factor of fixed-speed wind turbines are reviewed and compared using a case study. The first, “quasiexact” approach uses raw discrete wind data (in histogram form) and the manufacturer-provided turbine power curve (also in discrete form) to calculate the capacity factor numerically. The second, “analytic” approach employs a continuous probability distribution function fitted to the wind data together with a continuous turbine power curve obtained by double polynomial fitting of the manufacturer-provided power curve data. The latter approach, while an approximation, can be solved analytically and thus provides valuable insight into the factors affecting the capacity factor. Moreover, several other figures of merit of wind turbine performance may be derived from the analytical approach. The third, “approximate” approach, valid only for Rayleigh winds, employs a nonlinear approximation of the capacity factor versus average wind speed curve and requires only the rated power and rotor diameter of the turbine. The results obtained with the three approaches are shown to be very close, reinforcing the validity of the analytically derived approximations, which may be used for wind turbine performance evaluation. PMID:24587755

  5. Significance Testing in Confirmatory Factor Analytic Models.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; Hocevar, Dennis

    Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…

  6. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  7. Resilience: A Meta-Analytic Approach

    ERIC Educational Resources Information Center

    Lee, Ji Hee; Nam, Suk Kyung; Kim, A-Reum; Kim, Boram; Lee, Min Young; Lee, Sang Min

    2013-01-01

    This study investigated the relationship between psychological resilience and its relevant variables by using a meta-analytic method. The results indicated that the largest effect on resilience was found to stem from the protective factors, a medium effect from risk factors, and the smallest effect from demographic factors. (Contains 4 tables.)

  8. The Structure of Temperament in Preschoolers: A Two-Stage Factor Analytic Approach

    PubMed Central

    Dyson, Margaret W.; Olino, Thomas M.; Durbin, C. Emily; Goldsmith, H. Hill; Klein, Daniel N.

    2012-01-01

    The structure of temperament traits in young children has been the subject of extensive debate, with separate models proposing different trait dimensions. This research has relied almost exclusively on parent-report measures. The present study used an alternative approach, a laboratory observational measure, to explore the structure of temperament in preschoolers. A 2-stage factor analytic approach, exploratory factor analyses (n = 274) followed by confirmatory factor analyses (n = 276), was used. We retrieved an adequately fitting model that consisted of 5 dimensions: Sociability, Positive Affect/Interest, Dysphoria, Fear/Inhibition, and Constraint versus Impulsivity. This solution overlaps with, but is also distinct from, the major models derived from parent-report measures. PMID:21859196
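
    A schematic outline of such a two-stage, split-sample procedure is sketched below with scikit-learn; the data are random placeholders, and the confirmatory stage is only indicated, since CFA is normally fitted in dedicated SEM software rather than scikit-learn:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.model_selection import train_test_split

    # Hypothetical item-level data: 550 children x 20 observed temperament indicators.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(550, 20))

    # Stage 1: exploratory factor analysis on a random half of the sample.
    X_efa, X_cfa = train_test_split(X, test_size=0.5, random_state=0)
    efa = FactorAnalysis(n_components=5, rotation="varimax").fit(X_efa)
    loadings = efa.components_.T          # items x factors loading matrix
    print(np.round(loadings[:5], 2))      # inspect loadings of the first few items

    # Stage 2 (sketch only): the structure suggested by the loadings would then be fixed
    # and tested with confirmatory factor analysis on the held-out half (X_cfa), typically
    # in dedicated SEM software such as lavaan or Mplus.
    ```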

  9. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography and to implement a quality-by-design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a center point to assess curvature was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors examined for their effects on the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of the MPC significantly affected retention time, while an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plate number and the resolution of NIF and its degradants. The selected analytical condition was validated over the range of 1–16 µg/mL and showed good linearity, precision, and accuracy, as well as efficiency owing to an analysis time within 10 min.
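
    A sketch of how main effects, the interaction, and curvature can be estimated from a 2² full factorial design with a center point is given below; the coded factor levels and responses are invented and only illustrate the kind of calculation behind the optimization described above:

    ```python
    import numpy as np

    # Hypothetical 2^2 full factorial design in coded levels with a center point.
    # Factors: mobile phase composition (MPC) and flow rate (FR); response: resolution.
    design = np.array([
        [-1, -1],
        [+1, -1],
        [-1, +1],
        [+1, +1],
        [ 0,  0],   # center point, used to check for curvature
    ])
    resolution = np.array([1.8, 2.6, 1.5, 2.9, 2.1])

    factorial, y = design[:4], resolution[:4]
    effect_mpc = y[factorial[:, 0] == +1].mean() - y[factorial[:, 0] == -1].mean()
    effect_fr  = y[factorial[:, 1] == +1].mean() - y[factorial[:, 1] == -1].mean()
    prod = factorial[:, 0] * factorial[:, 1]
    interaction = y[prod == +1].mean() - y[prod == -1].mean()
    curvature = resolution[4] - y.mean()   # center-point response vs. factorial average

    print(effect_mpc, effect_fr, interaction, curvature)
    ```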

  10. Taxometric and Factor Analytic Models of Anxiety Sensitivity among Youth: Exploring the Latent Structure of Anxiety Psychopathology Vulnerability

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Stewart, Sherry; Comeau, Nancy

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), a well-established affect-sensitivity individual difference factor, among youth by employing taxometric and factor analytic approaches in an integrative manner. Taxometric analyses indicated that AS, as indexed by the Child Anxiety Sensitivity…

  11. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered in nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

    A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results to the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Psychometric Structure of a Comprehensive Objective Structured Clinical Examination: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Volkan, Kevin; Simon, Steven R.; Baker, Harley; Todres, I. David

    2004-01-01

    Problem Statement and Background: While the psychometric properties of Objective Structured Clinical Examinations (OSCEs) have been studied, their latent structures have not been well characterized. This study examines a factor analytic model of a comprehensive OSCE and addresses implications for measurement of clinical performance. Methods: An…

  13. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  14. Sample Size and Power Estimates for a Confirmatory Factor Analytic Model in Exercise and Sport: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying

    2011-01-01

    Monte Carlo methods can be used in data analytic situations (e.g., validity studies) to make decisions about sample size and to estimate power. The purpose of using Monte Carlo methods in a validity study is to improve the methodological approach within a study where the primary focus is on construct validity issues and not on advancing…
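
    The sketch below illustrates the general simulate-fit-test logic of Monte Carlo power estimation under an assumed one-factor model; it uses a deliberately simplified proxy test rather than a full CFA fit, which in practice would be run in SEM software at each replication:

    ```python
    import numpy as np
    from scipy import stats

    # Monte Carlo power sketch (not the authors' procedure): simulate data from a
    # hypothetical one-factor model many times and record how often a simple test
    # detects the factor structure at a given sample size.
    rng = np.random.default_rng(42)
    loading, n, n_items, alpha, reps = 0.6, 150, 6, 0.05, 1000

    detections = 0
    for _ in range(reps):
        factor = rng.normal(size=n)
        noise = rng.normal(size=(n, n_items))
        items = loading * factor[:, None] + np.sqrt(1 - loading**2) * noise
        # Proxy "fit test": is the correlation between the first two indicators significant?
        _, p = stats.pearsonr(items[:, 0], items[:, 1])
        detections += (p < alpha)

    print(f"Estimated power at n={n}: {detections / reps:.2f}")
    ```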

  15. Analytic study of orbiter landing profiles

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1981-01-01

    A broad survey of possible orbiter landing configurations was made with specific goals of defining boundaries for the landing task. The results suggest that the center of the corridors between marginal and routine represents a more or less optimal preflare condition for regular operations. Various constraints used to define the boundaries are based largely on qualitative judgements from earlier flight experience with the X-15 and lifting body research aircraft. The results should serve as useful background for expanding and validating landing simulation programs. The analytic approach offers a particular advantage in identifying trends due to the systematic variation of factors such as vehicle weight, load factor, approach speed, and aim point. Limitations, such as a constant load factor during the flare and a fixed gear deployment time interval, can be removed by increasing the flexibility of the computer program. This analytic definition of landing profiles of the orbiter may suggest additional studies, including more configurations or more comparisons of landing profiles within and beyond the corridor boundaries.

  16. A Factor Analytic and Regression Approach to Functional Age: Potential Effects of Race.

    ERIC Educational Resources Information Center

    Colquitt, Alan L.; And Others

    Factor analysis and multiple regression are two major approaches used to look at functional age, which takes account of the extensive variation in the rate of physiological and psychological maturation throughout life. To examine the role of racial or cultural influences on the measurement of functional age, a battery of 12 tests concentrating on…

  17. Science and the Nonscience Major: Addressing the Fear Factor in the Chemical Arena Using Forensic Science

    ERIC Educational Resources Information Center

    Labianca, Dominick A.

    2007-01-01

    This article describes an approach to minimizing the "fear factor" in a chemistry course for the nonscience major, and also addresses relevant applications to other science courses, including biology, geology, and physics. The approach emphasizes forensic science and affords students the opportunity to hone their analytical skills in an…

  18. A Confirmatory Approach to Examining the Factor Structure of the Strengths and Difficulties Questionnaire (SDQ): A Large Scale Cohort Study

    ERIC Educational Resources Information Center

    Niclasen, Janni; Skovgaard, Anne Mette; Andersen, Anne-Marie Nybo; Somhovd, Mikael Julius; Obel, Carsten

    2013-01-01

    The aim of this study was to examine the factor structure of the Strengths and Difficulties Questionnaire (SDQ) using a Structural Confirmatory Factor Analytic approach. The Danish translation of the SDQ was distributed to 71,840 parents and teachers of 5-7 and 10-12-year-old boys and girls from four large scale cohorts. Three theoretical models…

  19. Cognitive-analytical therapy for a patient with functional neurological symptom disorder-conversion disorder (psychogenic myopia): A case study.

    PubMed

    Nasiri, Hamid; Ebrahimi, Amrollah; Zahed, Arash; Arab, Mostafa; Samouei, Rahele

    2015-05-01

    Functional neurological symptom disorder commonly presents with symptoms and defects of sensory and motor functions. Therefore, it is often mistaken for a medical condition. It is well known that functional neurological symptom disorder is more often caused by psychological factors. There are three main approaches, namely analytical, cognitive and biological, to managing conversion disorder. Any of these approaches can be applied through short-term treatment programs. In this case study, a 12-year-old boy diagnosed with functional neurological symptom disorder (psychogenic myopia) was treated with cognitive-analytical therapy. The outcome of this treatment modality proved successful.

  20. An analytical approach for the calculation of stress-intensity factors in transformation-toughened ceramics

    NASA Astrophysics Data System (ADS)

    Müller, W. H.

    1990-12-01

    Stress-induced transformation toughening in Zirconia-containing ceramics is described analytically by means of a quantitative model: A Griffith crack which interacts with a transformed, circular Zirconia inclusion. Due to its volume expansion, a ZrO2-particle compresses its flanks, whereas a particle in front of the crack opens the flanks such that the crack will be attracted and finally absorbed. Erdogan's integral equation technique is applied to calculate the dislocation functions and the stress-intensity-factors which correspond to these situations. In order to derive analytical expressions, the elastic constants of the inclusion and the matrix are assumed to be equal.

  1. Flexible aircraft dynamic modeling for dynamic analysis and control synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1989-01-01

    The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory, is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.

  2. At-line nanofractionation with parallel mass spectrometry and bioactivity assessment for the rapid screening of thrombin and factor Xa inhibitors in snake venoms.

    PubMed

    Mladic, Marija; Zietek, Barbara M; Iyer, Janaki Krishnamoorthy; Hermarij, Philip; Niessen, Wilfried M A; Somsen, Govert W; Kini, R Manjunatha; Kool, Jeroen

    2016-02-01

    Snake venoms comprise complex mixtures of peptides and proteins causing modulation of diverse physiological functions upon envenomation of the prey organism. The components of snake venoms are studied as research tools and as potential drug candidates. However, the bioactivity determination with subsequent identification and purification of the bioactive compounds is a demanding and often laborious effort involving different analytical and pharmacological techniques. This study describes the development and optimization of an integrated analytical approach for activity profiling and identification of venom constituents targeting the cardiovascular system, thrombin and factor Xa enzymes in particular. The approach developed encompasses reversed-phase liquid chromatography (RPLC) analysis of a crude snake venom with parallel mass spectrometry (MS) and bioactivity analysis. The analytical and pharmacological parts of this approach are linked using at-line nanofractionation. This implies that the bioactivity is assessed after high-resolution nanofractionation (6 s/well) onto high-density 384-well microtiter plates and subsequent freeze drying of the plates. The nanofractionation and bioassay conditions were optimized for maintaining LC resolution and achieving good bioassay sensitivity. The developed integrated analytical approach was successfully applied for the fast screening of snake venoms for compounds affecting thrombin and factor Xa activity. Parallel accurate MS measurements provided correlation of observed bioactivity to peptide/protein masses. This resulted in identification of a few interesting peptides with activity towards the drug target factor Xa from a screening campaign involving venoms of 39 snake species. Besides this, many positive protease activity peaks were observed in most venoms analysed. These protease fingerprint chromatograms were found to be similar for evolutionarily closely related species and as such might serve as generic snake protease bioactivity fingerprints in biological studies on venoms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. The Development of Verbal and Visual Working Memory Processes: A Latent Variable Approach

    ERIC Educational Resources Information Center

    Koppenol-Gonzalez, Gabriela V.; Bouwmeester, Samantha; Vermunt, Jeroen K.

    2012-01-01

    Working memory (WM) processing in children has been studied with different approaches, focusing on either the organizational structure of WM processing during development (factor analytic) or the influence of different task conditions on WM processing (experimental). The current study combined both approaches, aiming to distinguish verbal and…

  4. Net growth rate of continuum heterogeneous biofilms with inhibition kinetics.

    PubMed

    Gonzo, Elio Emilio; Wuertz, Stefan; Rajal, Veronica B

    2018-01-01

    Biofilm systems can be modeled using a variety of analytical and numerical approaches, usually by making simplifying assumptions regarding biofilm heterogeneity and activity as well as effective diffusivity. Inhibition kinetics, albeit common in experimental systems, are rarely considered and analytical approaches are either lacking or consider effective diffusivity of the substrate and the biofilm density to remain constant. To address this obvious knowledge gap an analytical procedure to estimate the effectiveness factor (dimensionless substrate mass flux at the biofilm-fluid interface) was developed for a continuum heterogeneous biofilm with multiple limiting-substrate Monod kinetics to different types of inhibition kinetics. The simple perturbation technique, previously validated to quantify biofilm activity, was applied to systems where either the substrate or the inhibitor is the limiting component, and cases where the inhibitor is a reaction product or the substrate also acts as the inhibitor. Explicit analytical equations are presented for the effectiveness factor estimation and, therefore, the calculation of biomass growth rate or limiting substrate/inhibitor consumption rate, for a given biofilm thickness. The robustness of the new biofilm model was tested using kinetic parameters experimentally determined for the growth of Pseudomonas putida CCRC 14365 on phenol. Several additional cases have been analyzed, including examples where the effectiveness factor can reach values greater than unity, characteristic of systems with inhibition kinetics. Criteria to establish when the effectiveness factor can reach values greater than unity in each of the cases studied are also presented.

  5. Consistent approach to describing aircraft HIRF protection

    NASA Technical Reports Server (NTRS)

    Rimbey, P. R.; Walen, D. B.

    1995-01-01

    The high intensity radiated fields (HIRF) certification process as currently implemented comprises an inconsistent combination of factors that tend to emphasize worst case scenarios in assessing commercial airplane certification requirements. By examining these factors, which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and methods of measuring equipment susceptibilities, activities leading to an approach to appraising airplane vulnerability to HIRF are proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum. Additional tools such as statistical methods must be adopted to arrive at more realistic requirements to reflect commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.

  6. Exact mode volume and Purcell factor of open optical systems

    NASA Astrophysics Data System (ADS)

    Muljarov, E. A.; Langbein, W.

    2016-12-01

    The Purcell factor quantifies the change of the radiative decay of a dipole in an electromagnetic environment relative to free space. Designing this factor is at the heart of photonics technology, striving to develop ever smaller or less lossy optical resonators. The Purcell factor can be expressed using the electromagnetic eigenmodes of the resonators, introducing the notion of a mode volume for each mode. This approach allows an analytic treatment, reducing the Purcell factor and other observables to sums over eigenmode resonances. Calculating the mode volumes requires a correct normalization of the modes. We introduce an exact normalization of modes, not relying on perfectly matched layers. We present an analytic theory of the Purcell effect based on this exact mode normalization and the resulting effective mode volume. We use a homogeneous dielectric sphere in vacuum, which is analytically solvable, to exemplify these findings. We furthermore verify the applicability of the normalization to numerically determined modes of a finite dielectric cylinder.
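
    For orientation, the textbook Purcell factor expression that ties the quality factor and effective mode volume together is sketched below with invented numbers; the paper's contribution is the exact mode normalization entering the effective mode volume, not this standard formula:

    ```python
    import math

    # Textbook Purcell factor: F_P = (3 / (4 * pi^2)) * (lambda / n)^3 * (Q / V_eff).
    # Shown only to illustrate the role of the effective mode volume discussed above.
    def purcell_factor(wavelength_m, refractive_index, quality_factor, mode_volume_m3):
        return (3.0 / (4.0 * math.pi**2)) * (wavelength_m / refractive_index)**3 \
               * quality_factor / mode_volume_m3

    # Hypothetical numbers: resonance at 800 nm, n = 1.5, Q = 1e4, V_eff = 0.5 (lambda/n)^3.
    lam, n, Q = 800e-9, 1.5, 1e4
    V_eff = 0.5 * (lam / n)**3
    print(f"Purcell factor: {purcell_factor(lam, n, Q, V_eff):.1f}")
    ```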

  7. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  8. Vortex-assisted magnetic β-cyclodextrin/attapulgite-linked ionic liquid dispersive liquid-liquid microextraction coupled with high-performance liquid chromatography for the fast determination of four fungicides in water samples.

    PubMed

    Yang, Miyi; Xi, Xuefei; Wu, Xiaoling; Lu, Runhua; Zhou, Wenfeng; Zhang, Sanbing; Gao, Haixiang

    2015-02-13

    A novel microextraction technique combining magnetic solid-phase microextraction (MSPME) with ionic liquid dispersive liquid-liquid microextraction (IL-DLLME) to determine four fungicides is presented in this work for the first time. The main factors affecting the extraction efficiency were optimized by the one-factor-at-a-time approach and the impacts of these factors were studied by an orthogonal design. Without a tedious clean-up procedure, analytes were extracted from the sample to the adsorbent and organic solvent and then desorbed in acetonitrile prior to chromatographic analysis. Under the optimum conditions, good linearity and high enrichment factors were obtained for all analytes, with correlation coefficients ranging from 0.9998 to 1.0000 and enrichment factors ranging from 135- to 159-fold. The recoveries for the proposed approach were between 98% and 115%, the limits of detection were between 0.02 and 0.04 μg L(-1), and the RSDs ranged from 2.96 to 4.16. The method was successfully applied in the analysis of four fungicides (azoxystrobin, chlorothalonil, cyprodinil and trifloxystrobin) in environmental water samples. The recoveries for the real water samples ranged between 81% and 109%. The procedure proved to be a time-saving, environmentally friendly, and efficient analytical technique. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Proposal of a risk-factor-based analytical approach for integrating occupational health and safety into project risk evaluation.

    PubMed

    Badri, Adel; Nadeau, Sylvie; Gbodossou, André

    2012-09-01

    Excluding occupational health and safety (OHS) from project management is no longer acceptable. Numerous industrial accidents have exposed the ineffectiveness of conventional risk evaluation methods as well as negligence of risk factors having major impact on the health and safety of workers and nearby residents. Lack of reliable and complete evaluations from the beginning of a project generates bad decisions that could end up threatening the very existence of an organization. This article supports a systematic approach to the evaluation of OHS risks and proposes a new procedure based on the number of risk factors identified and their relative significance. A new concept called risk factor concentration along with weighting of risk factor categories as contributors to undesirable events are used in the analytical hierarchy process multi-criteria comparison model with Expert Choice(©) software. A case study is used to illustrate the various steps of the risk evaluation approach and the quick and simple integration of OHS at an early stage of a project. The approach allows continual reassessment of criteria over the course of the project or when new data are acquired. It was thus possible to differentiate the OHS risks from the risk of drop in quality in the case of the factory expansion project. Copyright © 2011 Elsevier Ltd. All rights reserved.
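
    A generic sketch of the analytical hierarchy process step used in such an approach is shown below: priority weights are derived from a reciprocal pairwise comparison matrix via its principal eigenvector and then checked for consistency. The comparison values are hypothetical and not taken from the case study:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons of three risk-factor categories (Saaty scale).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                      # priority weights summing to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    ri = 0.58                                     # Saaty's random index for n = 3
    print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / ri, 3))
    ```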

  10. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    PubMed

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires the characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for the quantitation of the matrix effect. The CVs(%) of internal standard normalized matrix factors recommended by the European Medicines Agency were evaluated against internal standard normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with both calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
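
    A minimal sketch of the matrix-factor calculation compared above is given below, using invented peak areas; the CV(%) of the internal-standard-normalized matrix factor across matrix lots is the quantity recommended by the European Medicines Agency:

    ```python
    import numpy as np

    # Invented peak areas for one analyte and its internal standard (IS) in six matrix lots
    # (post-extraction spiked samples) and in neat solution.
    analyte_matrix = np.array([980.0, 1015.0, 950.0, 1002.0, 970.0, 995.0])
    is_matrix = np.array([495.0, 510.0, 480.0, 505.0, 490.0, 500.0])
    analyte_neat, is_neat = 1000.0, 500.0

    # Matrix factor = area in matrix / area in neat solution, normalized by the IS matrix factor.
    is_normalized_mf = (analyte_matrix / analyte_neat) / (is_matrix / is_neat)
    cv = 100 * is_normalized_mf.std(ddof=1) / is_normalized_mf.mean()
    print(f"CV% of IS-normalized matrix factor: {cv:.2f}")

    # The relative matrix effect in the spirit of Matuszewski et al. (2003) is the CV% of the
    # IS-normalized responses in matrix alone; with a single neat-solution reference the two
    # calculations differ only by a constant factor, so in practice the values are close.
    ```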

  11. Universal analytical scattering form factor for shell-, core-shell, or homogeneous particles with continuously variable density profile shape.

    PubMed

    Foster, Tobias

    2011-09-01

    A novel analytical and continuous density distribution function with a widely variable shape is reported and used to derive an analytical scattering form factor that allows us to universally describe the scattering from particles with the radial density profile of homogeneous spheres, shells, or core-shell particles. Because the profile is composed of the sum of two Fermi-Dirac distribution functions, the shape of the density profile can be altered continuously from step-like via Gaussian-like or parabolic to asymptotically hyperbolic by varying a single "shape parameter", d. Using this density profile, the scattering form factor can be calculated numerically. An analytical form factor can be derived using an approximate expression for the original Fermi-Dirac distribution function. This approximation is accurate for sufficiently small rescaled shape parameters, d/R (R being the particle radius), up to values of d/R ≈ 0.1, and thus captures step-like, Gaussian-like, and parabolic as well as asymptotically hyperbolic profile shapes. It is expected that this form factor is particularly useful in a model-dependent analysis of small-angle scattering data since the applied continuous and analytical function for the particle density profile can be compared directly with the density profile extracted from the data by model-free approaches like the generalized inverse Fourier transform method. © 2011 American Chemical Society
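
    A numerical sketch in the spirit of the profile described above is shown below: a radial density built from two Fermi-Dirac terms and the corresponding spherically symmetric form factor evaluated by quadrature. The weighting of the two terms and all parameter values are invented, and this is not the paper's closed-form expression:

    ```python
    import numpy as np

    # Radial density profile from two Fermi-Dirac terms and the form factor amplitude
    # F(q) = integral of 4*pi*r^2 * rho(r) * sin(qr)/(qr) dr, computed on a grid.
    R, d = 10.0, 1.0                      # particle radius and "shape parameter" (arbitrary units)

    def density(r, R, d):
        inner = 1.0 / (1.0 + np.exp((r - 0.5 * R) / d))
        outer = 1.0 / (1.0 + np.exp((r - R) / d))
        return outer - 0.5 * inner        # hypothetical weighting of the two terms

    r = np.linspace(1e-6, 3 * R, 4000)
    q = np.linspace(1e-3, 2.0, 500)
    rho = density(r, R, d)
    # np.sinc(x) = sin(pi*x)/(pi*x), so sin(qr)/(qr) = np.sinc(q*r/pi).
    integrand = 4 * np.pi * r**2 * rho[None, :] * np.sinc(q[:, None] * r[None, :] / np.pi)
    F = np.trapz(integrand, r, axis=1)
    print(F[:5] / F[0])                   # normalized so that F(q -> 0) = 1
    ```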

  12. Semi-Analytic Reconstruction of Flux in Finite Volume Formulations

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2006-01-01

    Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction for the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm but reconstruction time is approximately a factor of four larger than for Roe. Though both are second-order accurate schemes, Roe's method approaches a grid converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws including effects of thermochemical nonequilibrium in the Navier-Stokes equations is developed.

  13. Evaluating Effective Teaching in College Level Economics Using Student Ratings of Instruction: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Agbetsiafa, Douglas

    2010-01-01

    This paper explores the factors that affect students' evaluation of economic instruction using a sample of 1300 completed rating instruments at a comprehensive four-year mid-western public university. The study uses factor analysis to determine the validity and reliability of the evaluation instrument in assessing instructor or course…

  14. Investigating the Relationship between Test-Taker Background Characteristics and Test Performance in a Heterogeneous English-as-a-Second-Language (ESL) Test Population: A Factor Analytic Approach. Research Report. ETS RR-15-25

    ERIC Educational Resources Information Center

    Manna, Venessa F.; Yoo, Hanwook

    2015-01-01

    This study examined the heterogeneity in the English-as-a-second-language (ESL) test population by modeling the relationship between test-taker background characteristics and test performance as measured by the "TOEFL iBT"® using a confirmatory factor analysis (CFA) with covariate approach. The background characteristics studied…

  15. An approach for environmental risk assessment of engineered nanomaterials using Analytical Hierarchy Process (AHP) and fuzzy inference rules.

    PubMed

    Topuz, Emel; van Gestel, Cornelis A M

    2016-01-01

    The usage of Engineered Nanoparticles (ENPs) in consumer products is relatively new and there is a need to conduct environmental risk assessment (ERA) to evaluate their impacts on the environment. However, alternative approaches are required for ERA of ENPs because of the huge gap in data and knowledge compared to conventional pollutants and their unique properties that make it difficult to apply existing approaches. This study aims to propose an ERA approach for ENPs by integrating Analytical Hierarchy Process (AHP) and fuzzy inference models, which provide a systematic evaluation of risk factors and reduce uncertainty about the data and information, respectively. Risk is assumed to be the combination of occurrence likelihood, exposure potential and toxic effects in the environment. A hierarchy was established to evaluate the sub-factors of these components. Evaluation was made with fuzzy numbers to reduce uncertainty and incorporate expert judgements. The overall score of each component was combined with fuzzy inference rules by using expert judgements. The proposed approach reports the risk class and its membership degree, such as Minor (0.7). Therefore, results are precise and helpful for determining risk management strategies. Moreover, priority weights calculated by comparing the risk factors based on their importance for the risk enable users to understand which factors have the greatest effect on the risk. The proposed approach was applied to Ag (two nanoparticles with different coatings) and TiO2 nanoparticles in different case studies. The results verified the proposed benefits of the approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. The Anxiety Sensitivity Index--Revised: Confirmatory Factor Analyses, Structural Invariance in Caucasian and African American Samples, and Score Reliability and Validity

    ERIC Educational Resources Information Center

    Arnau, Randolph C.; Broman-Fulks, Joshua J.; Green, Bradley A.; Berman, Mitchell E.

    2009-01-01

    The most commonly used measure of anxiety sensitivity is the 36-item Anxiety Sensitivity Index--Revised (ASI-R). Exploratory factor analyses have produced several different factors structures for the ASI-R, but an acceptable fit using confirmatory factor analytic approaches has only been found for a 21-item version of the instrument. We evaluated…

  17. An Analytical Thermal Model for Autonomous Soaring Research

    NASA Technical Reports Server (NTRS)

    Allen, Michael

    2006-01-01

    A viewgraph presentation describing an analytical thermal model used to enable research on autonomous soaring for a small UAV aircraft is given. The topics include: 1) Purpose; 2) Approach; 3) SURFRAD Data; 4) Convective Layer Thickness; 5) Surface Heat Budget; 6) Surface Virtual Potential Temperature Flux; 7) Convective Scaling Velocity; 8) Other Calculations; 9) Yearly trends; 10) Scale Factors; 11) Scale Factor Test Matrix; 12) Statistical Model; 13) Updraft Strength Calculation; 14) Updraft Diameter; 15) Updraft Shape; 16) Smoothed Updraft Shape; 17) Updraft Spacing; 18) Environment Sink; 19) Updraft Lifespan; 20) Autonomous Soaring Research; 21) Planned Flight Test; and 22) Mixing Ratio.

  18. Replica Analysis for Portfolio Optimization with Single-Factor Model

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
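
    The covariance structure implied by a single-factor model, and the resulting difference in minimum investment risk between correlated and independent return rates, can be illustrated with the toy calculation below (invented parameters; not the replica-analysis computation itself):

    ```python
    import numpy as np

    # Under a single-factor model the return covariance is
    # Sigma = sigma_f^2 * beta beta^T + diag(residual variances).
    rng = np.random.default_rng(1)
    n_assets = 200
    beta = rng.normal(1.0, 0.3, size=n_assets)       # factor loadings
    resid_var = rng.uniform(0.5, 1.5, size=n_assets)
    sigma_f2 = 0.5                                    # factor variance

    def min_variance(cov):
        # Global minimum-variance portfolio under the budget constraint sum(w) = 1.
        ones = np.ones(cov.shape[0])
        w = np.linalg.solve(cov, ones)
        w /= ones @ w
        return w @ cov @ w

    cov_corr = sigma_f2 * np.outer(beta, beta) + np.diag(resid_var)
    cov_indep = np.diag(sigma_f2 * beta**2 + resid_var)   # same variances, no correlation

    print("risk with correlated returns :", round(min_variance(cov_corr), 4))
    print("risk with independent returns:", round(min_variance(cov_indep), 4))
    ```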

  19. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Analyzing Response Times in Tests with Rank Correlation Approaches

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jorg-Tobias

    2013-01-01

    It is common practice to log-transform response times before analyzing them with standard factor analytical methods. However, sometimes the log-transformation is not capable of linearizing the relation between the response times and the latent traits. Therefore, a more general approach to response time analysis is proposed in the current…

  1. Construction of RFIF using VVSFs with application

    NASA Astrophysics Data System (ADS)

    Katiyar, Kuldip; Prasad, Bhagwati

    2017-10-01

    A method of variable vertical scaling factors (VVSFs) is proposed to define the recurrent fractal interpolation function (RFIF) for fitting the data sets. A generalization of one of the recent methods using analytic approach is presented for finding variable vertical scaling factors. An application of it in reconstruction of an EEG signal is also given.

  2. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes.

    PubMed

    Del Piero, Larissa B; Saxbe, Darby E; Margolin, Gayla

    2016-06-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and non-linear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes

    PubMed Central

    Del Piero, Larissa B.; Saxbe, Darby E.; Margolin, Gayla

    2016-01-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and nonlinear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. PMID:27038840

  4. The predictive accuracy of analytical formulas and semiclassical approaches for α decay half-lives of superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Zhao, T. L.; Bao, X. J.; Guo, S. Q.

    2018-02-01

    Systematic calculations on the α decay half-lives are performed by using three analytical formulas and two semiclassical approaches. For the three analytical formulas, the experimental α decay half-lives and Qα values of the 66 reference nuclei have been used to obtain the coefficients. We get only four adjustable parameters to describe α decay half-lives for even-even, odd-A, and odd-odd nuclei. By comparison between the calculated values from ten analytical formulas and experimental data, it is shown that the new universal decay law (NUDL) formula is the most accurate one to reproduce the experimental α decay half-lives of the superheavy nuclei (SHN). Meanwhile it is found that the experimental α decay half-lives of SHN are well reproduced by the Royer formula although many parameters are contained. The results show that the NUDL formula and the generalized liquid drop model (GLDM2) with consideration of the preformation factor can give fairly equivalent results for the superheavy nuclei.

  5. Managing knowledge business intelligence: A cognitive analytic approach

    NASA Astrophysics Data System (ADS)

    Surbakti, Herison; Ta'a, Azman

    2017-10-01

    The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies have a strong association with the process of KM for attaining competitive advantage. KM is strongly influenced by human and social factors, which it turns into the most valuable assets when an efficient system is run under BI tactics and technologies. However, predictive analytics is rooted in the field of BI. Extracting tacit knowledge, to be used as a new source for BI analysis, is a big challenge. Advanced analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This is a big challenge nowadays, and this paper aims to elaborate on this initial work in detail.

  6. Adapting Surface Ground Motion Relations to Underground conditions: A case study for the Sudbury Neutrino Observatory in Sudbury, Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Babaie Mahani, A.; Eaton, D. W.

    2013-12-01

    Ground Motion Prediction Equations (GMPEs) are widely used in Probabilistic Seismic Hazard Assessment (PSHA) to estimate ground-motion amplitudes at Earth's surface as a function of magnitude and distance. Certain applications, such as hazard assessment for caprock integrity in the case of underground storage of CO2, waste disposal sites, and underground pipelines, require subsurface estimates of ground motion; at present, such estimates depend upon theoretical modeling and simulations. The objective of this study is to derive correction factors for GMPEs to enable estimation of amplitudes in the subsurface. We use a semi-analytic approach along with finite-difference simulations of ground-motion amplitudes for surface and underground motions. Spectral ratios of underground to surface motions are used to calculate the correction factors. Two predictive methods are used. The first is a semi-analytic approach based on a quarter-wavelength method that is widely used for earthquake site-response investigations; the second is a numerical approach based on elastic finite-difference simulations of wave propagation. Both methods are evaluated using recordings of regional earthquakes by broadband seismometers installed at the surface and at depths of 1400 m and 2100 m in the Sudbury Neutrino Observatory, Canada. Overall, both methods provide a reasonable fit to the peaks and troughs observed in the ratios of real data. The finite-difference method, however, has the capability to simulate ground motion ratios more accurately than the semi-analytic approach.

  7. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Onchorynchus mykiss): reaction norm and factor analytic models.

    PubMed

    Sae-Lim, Panya; Komen, Hans; Kause, Antti; Mulder, Han A

    2014-02-26

    Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. Day*Degree and photoperiod were identified as environmental variables responsible for the strong GxE interaction for body weight at harvest in rainbow trout across four environments. Both the reaction norm and the factor analytic models can help identify the environmental variables responsible for GxE interaction. A factor analytic model is preferred over a reaction norm model when limited information on differences in environmental variables between farms is available.
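
    The fixed-effect idea behind a reaction norm, a genotype-specific intercept and slope on an environmental covariate such as Day*Degree, is sketched below with invented numbers; actual analyses fit this as a random-regression (reaction norm) mixed model, and the factor analytic alternative as a structured genetic covariance across environments, in dedicated software:

    ```python
    import numpy as np

    # Toy reaction-norm sketch: body weight per genotype modelled as intercept + slope on a
    # standardized environmental covariate; GxE appears as variation in slopes across genotypes.
    rng = np.random.default_rng(7)
    day_degree = np.array([2400.0, 2900.0, 3400.0, 3900.0])    # four environments (invented)
    x = (day_degree - day_degree.mean()) / day_degree.std()    # standardized covariate

    n_genotypes = 5
    true_slopes = rng.normal(0.0, 30.0, n_genotypes)           # genotype-specific sensitivity
    weights = 800 + true_slopes[:, None] * x[None, :] + rng.normal(0, 10, (n_genotypes, 4))

    for g in range(n_genotypes):
        slope, intercept = np.polyfit(x, weights[g], 1)
        print(f"genotype {g}: intercept={intercept:6.1f} g  slope={slope:6.1f} g per SD of Day*Degree")
    ```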

  8. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Onchorynchus mykiss): reaction norm and factor analytic models

    PubMed Central

    2014-01-01

    Background: Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Methods: Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. Results: The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. Conclusions: Day*Degree and photoperiod were identified as environmental variables responsible for the strong GxE interaction for body weight at harvest in rainbow trout across four environments. Both the reaction norm and the factor analytic models can help identify the environmental variables responsible for GxE interaction. A factor analytic model is preferred over a reaction norm model when limited information on differences in environmental variables between farms is available. PMID:24571451

  9. Study of different HILIC, mixed-mode, and other aqueous normal-phase approaches for the liquid chromatography/mass spectrometry-based determination of challenging polar pesticides.

    PubMed

    Vass, Andrea; Robles-Molina, José; Pérez-Ortega, Patricia; Gilbert-López, Bienvenida; Dernovics, Mihaly; Molina-Díaz, Antonio; García-Reyes, Juan F

    2016-07-01

    The aim of the study was to evaluate the performance of different chromatographic approaches for the liquid chromatography/mass spectrometry (LC-MS(/MS)) determination of 24 highly polar pesticides. The studied compounds, which are in most cases unsuitable for conventional LC-MS(/MS) multiresidue methods, were tested with nine different chromatographic conditions, including two different hydrophilic interaction liquid chromatography (HILIC) columns, two zwitterionic-type mixed-mode columns, three normal-phase columns operated in HILIC mode (bare silica and two silica-based chemically bonded columns (cyano and amino)), and two standard reversed-phase C18 columns. Different sets of chromatographic parameters in positive (for 17 analytes) and negative ionization modes (for nine analytes) were examined. In order to compare the different approaches, a semi-quantitative classification was proposed, calculated as the percentage of an empirical performance value, which consisted of three main features: (i) capacity factor (k) to characterize analyte separation from the void, (ii) relative response factor, and (iii) peak shape based on analytes' peak width. While no single method was able to provide appropriate detection of all the 24 studied species in a single run, the best-suited approach for the compounds ionized in positive mode was based on a UHPLC HILIC column with 1.8 μm particle size, providing appropriate results for 22 out of the 24 species tested. In contrast, the detection of glyphosate and aminomethylphosphonic acid could only be achieved with a zwitterionic-type mixed-mode column, which proved to be suitable only for the pesticides detected in negative ion mode. Finally, the selected approach (UHPLC HILIC) was found to be useful for the determination of multiple pesticides in oranges using HILIC-ESI-MS/MS, with limits of quantitation in the low microgram per kilogram range in most cases. Graphical Abstract: HILIC improves separation of multiclass polar pesticides.
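    The empirical performance value described above combines the capacity factor, the relative response factor, and peak shape; the exact weighting used by the authors is not reproduced here, so the sketch below assumes an equal-weighted score expressed as a percentage, with invented retention data.

```python
# Hedged sketch of a semi-quantitative performance score combining capacity
# factor, relative response and peak width, assuming equal weights (the
# authors' exact formula is not reproduced here). All values are invented.
def capacity_factor(t_r: float, t_0: float) -> float:
    """k = (tR - t0) / t0, the classical retention (capacity) factor."""
    return (t_r - t_0) / t_0

def performance_score(t_r, t_0, response, best_response,
                      peak_width, best_width, k_target=2.0) -> float:
    """Return a 0-100 score from three equally weighted features (assumption)."""
    k = capacity_factor(t_r, t_0)
    k_score = min(k / k_target, 1.0)           # reward separation from the void
    response_score = response / best_response  # relative response factor
    shape_score = best_width / peak_width      # narrower peaks score higher
    return 100.0 * (k_score + response_score + shape_score) / 3.0

# Example: one analyte on a hypothetical HILIC column.
print(round(performance_score(t_r=4.2, t_0=1.0, response=8.0e5,
                              best_response=1.0e6, peak_width=0.12,
                              best_width=0.10), 1))
```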

  10. A Learning Analytics Approach to Investigating Factors Affecting EFL Students' Oral Performance in a Flipped Classroom

    ERIC Educational Resources Information Center

    Lin, Chi-Jen; Hwang, Gwo-Jen

    2018-01-01

    Flipped classrooms have been widely adopted and discussed by school teachers and researchers in the past decade. However, few studies have been conducted to formally evaluate the effectiveness of flipped classrooms in terms of improving EFL students' English oral presentation, not to mention investigating factors affecting their flipped learning…

  11. PNEUMATIC MICROVALVE FOR ELECTROKINETIC SAMPLE PRECONCENTRATION AND CAPILLARY ELECTROPHORESIS INJECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cong, Yongzheng; Rausch, Sarah J.; Geng, Tao

    2014-10-27

    Here we show that a closed pneumatic microvalve on a PDMS chip can serve as a semipermeable membrane under an applied potential, enabling current to pass through while blocking the passage of charged analytes. Enrichment of both anionic and cationic species has been demonstrated, and concentration factors of ~70 have been achieved in just 8 s. Once analytes are concentrated, the valve is briefly opened and the sample is hydrodynamically injected onto an integrated microchip or capillary electrophoresis (CE) column. In contrast to existing preconcentration approaches, the membrane-based method described here enables both rapid analyte concentration as well as high-resolution separations.

  12. Examining Factors Associated with (In)Stability in Social Information Processing among Urban School Children: A Latent Transition Analytic Approach

    ERIC Educational Resources Information Center

    Goldweber, Asha; Bradshaw, Catherine P.; Goodman, Kimberly; Monahan, Kathryn; Cooley-Strickland, Michele

    2011-01-01

    There is compelling evidence for the role of social information processing (SIP) in aggressive behavior. However, less is known about factors that influence stability versus instability in patterns of SIP over time. Latent transition analysis was used to identify SIP patterns over one year and examine how community violence exposure, aggressive…

  13. Theory of ground state factorization in quantum cooperative systems.

    PubMed

    Giampaolo, Salvatore M; Adesso, Gerardo; Illuminati, Fabrizio

    2008-05-16

    We introduce a general analytic approach to the study of factorization points and factorized ground states in quantum cooperative systems. The method allows us to determine rigorously the existence, location, and exact form of separable ground states in a large variety of, generally nonexactly solvable, spin models belonging to different universality classes. The theory applies to translationally invariant systems, irrespective of spatial dimensionality, and for spin-spin interactions of arbitrary range.

  14. Testing the multidimensionality of the inventory of school motivation in a Dutch student sample.

    PubMed

    Korpershoek, Hanke; Xu, Kun; Mok, Magdalena Mo Ching; McInerney, Dennis M; van der Werf, Greetje

    2015-01-01

    A factor analytic and a Rasch measurement approach were applied to evaluate the multidimensional nature of the school motivation construct among more than 7,000 Dutch secondary school students. The Inventory of School Motivation (McInerney and Ali, 2006) was used, which intends to measure four motivation dimensions (mastery, performance, social, and extrinsic motivation), each comprising two first-order factors. One unidimensional model and three multidimensional models (4-factor, 8-factor, higher order) were fit to the data. Results of both approaches showed that the multidimensional models validly represented school motivation among Dutch secondary school pupils, whereas model fit of the unidimensional model was poor. The differences in model fit between the three multidimensional models were small, although a different model was favoured by the two approaches. The need for improvement of some of the items and the need to increase measurement precision of several first-order factors are discussed.

  15. Surrogate analyte approach for quantitation of endogenous NAD⁺ in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    PubMed

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD⁺ in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5 N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25 μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. ¹³C₅-NAD⁺ was used as the surrogate analyte for the authentic analyte, NAD⁺. The standard curve ranging from 0.250 to 25.0 μg/mL in acidified human blood for ¹³C₅-NAD⁺ was fitted to a 1/x² weighted linear regression model. The LC-MS/MS response between the surrogate analyte and the authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD⁺ concentration from the ¹³C₅-NAD⁺ standard curve since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. Average extraction recovery of ¹³C₅-NAD⁺ was 94.6% across the curve range. The matrix factor was 0.99 for both high and low QC, indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD⁺ in 29 male and 21 female human subjects. This assay was also used to study the circadian effect on the endogenous level of NAD⁺ in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
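    The 1/x²-weighted calibration step can be sketched in a few lines of Python; the concentrations and responses below are invented and do not come from the validated assay.

```python
# Minimal sketch of a 1/x^2-weighted linear calibration for a surrogate
# analyte standard curve, with back-calculation of an unknown. Concentrations
# and instrument responses below are invented for illustration.
import numpy as np

conc = np.array([0.25, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0])             # ug/mL standards
response = np.array([0.024, 0.051, 0.098, 0.26, 0.49, 1.02, 2.48])  # peak-area ratio

w = 1.0 / conc**2                      # 1/x^2 weighting
W = np.diag(w)
X = np.column_stack([np.ones_like(conc), conc])

# Weighted least squares: beta = (X' W X)^-1 X' W y
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ response)
intercept, slope = beta

def back_calc(y):
    """Convert an observed response into a concentration using the fit."""
    # If the surrogate/authentic response factor differed by more than ~5%,
    # it would be applied here as an extra correction (not shown).
    return (y - intercept) / slope

print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print(f"unknown at response 0.75 -> {back_calc(0.75):.2f} ug/mL")
```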

  16. SANC: the process γγ → ZZ

    NASA Astrophysics Data System (ADS)

    Bardin, D.; Bondarenko, S.; Christova, P.; Kalinovskaya, L.; von Schlippe, W.; Uglov, E.

    2017-11-01

    The implementation of the process γγ → ZZ at the one-loop level within the SANC system multichannel approach is considered. The derived one-loop scalar form factors can be used for any cross channel after an appropriate permutation of their arguments, the Mandelstam variables s, t, u. To check the correctness of the results, we verify the independence of the scalar form factors from the gauge parameters and the validity of the Ward identity (external photon transversality). We present the complete analytical results for the covariant and tensor structures and helicity amplitudes for this process. We make an extensive comparison of our analytical and numerical results with those existing in the literature.

  17. Enhancement in the sensitivity of microfluidic enzyme-linked immunosorbent assays through analyte preconcentration.

    PubMed

    Yanagisawa, Naoki; Dutta, Debashis

    2012-08-21

    In this Article, we describe a microfluidic enzyme-linked immunosorbent assay (ELISA) method whose sensitivity can be substantially enhanced through preconcentration of the target analyte around a semipermeable membrane. The reported preconcentration has been accomplished in our current work via electrokinetic means allowing a significant increase in the amount of captured analyte relative to nonspecific binding in the trapping/detection zone. Upon introduction of an enzyme substrate into this region, the rate of generation of the ELISA reaction product (resorufin) was observed to increase by over a factor of 200 for the sample and 2 for the corresponding blank compared to similar assays without analyte trapping. Interestingly, in spite of nonuniformities in the amount of captured analyte along the surface of our analysis channel, the measured fluorescence signal in the preconcentration zone increased linearly with time over an enzyme reaction period of 30 min and at a rate that was proportional to the analyte concentration in the bulk sample. In our current study, the reported technique has been shown to reduce the smallest detectable concentration of the tumor marker CA 19-9 and Blue Tongue Viral antibody by over 2 orders of magnitude compared to immunoassays without analyte preconcentration. When compared to microwell based ELISAs, the reported microfluidic approach not only yielded a similar improvement in the smallest detectable analyte concentration but also reduced the sample consumption in the assay by a factor of 20 (5 μL versus 100 μL).

  18. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    PubMed

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  19. Variable-centered and person-centered approaches to studying Mexican-origin mother-daughter cultural orientation dissonance.

    PubMed

    Bámaca-Colbert, Mayra Y; Gayles, Jochebed G

    2010-11-01

    The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern, metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.

  20. USAF Summer Research Program - 1994 Summer Faculty Research Program Final Reports, Volume 2A, Armstrong Laboratory.

    DTIC Science & Technology

    1994-12-01

    meta-analytic approach. Journal of Applied Psychology, 76, 432-446. Raju, N.S., & Dowhower, D.P. (1991). The effect of second-order sampling on the... Psychology. Harper, G., & Kember, D. (1989). Interpretation of factor analysis from the approaches to studying inventory. British Journal of Educational... Contemporary Educational Psychology, 12, 381-385. Trigwell, K., & Prosser, M. (1991). Relating approaches to study and quality of learning outcomes at

  1. The Effects of College Students' Personal Values on Changes in Learning Approaches

    ERIC Educational Resources Information Center

    Lietz, Petra; Matthews, Bobbie

    2010-01-01

    Many studies of changes in learning approaches have used data from different age groups at one point in time only (Gow and Kember, High Educ 19:307-322, 1990; Watkins and Hattie, Br J Educ Psychol 51:384-393, 1981) or have analyzed the effects of just two or three factors using single level analytical techniques (Cano, Br J Educ Psychol…

  2. Evaluation of soil erosion risk using Analytic Network Process and GIS: a case study from Spanish mountain olive plantations.

    PubMed

    Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc

    2009-07-01

    The study presents an approach that combined objective information such as sampling or experimental data with subjective information such as expert opinions. This combined approach was based on the Analytic Network Process method. It was applied to evaluate soil erosion risk and overcomes one of the drawbacks of USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of this method is that it can be used if there are insufficient experimental data. The lack of experimental data can be compensated for through the use of expert evaluations. As an example of the proposed approach, the risk of soil erosion was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.

  3. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  4. A Meta Analytical Approach Regarding School Effectiveness: The True Size of School Effects and the Effect Size of Educational Leadership.

    ERIC Educational Resources Information Center

    Bosker, Roel J.; Witziers, Bob

    School-effectiveness research has not yet been able to identify the factors of effective and noneffective schools, the real contribution of the significant factors, the true sizes of school effects, and the generalizability of school-effectiveness results. This paper presents findings of a meta analysis, the Dutch PSO programme, that was used to…

  5. Study of single nucleotide polymorphisms of tumour necrosis factors and HSP genes in nasopharyngeal carcinoma in North East India.

    PubMed

    Lakhanpal, Meena; Singh, Laishram Chandreshwor; Rahman, Tashnin; Sharma, Jagnnath; Singh, M Madhumangal; Kataki, Amal Chandra; Verma, Saurabh; Pandrangi, Santhi Latha; Singh, Y Mohan; Wajid, Saima; Kapur, Sujala; Saxena, Sunita

    2016-01-01

    Nasopharyngeal carcinoma (NPC) is an epithelial tumour with a distinctive racial and geographical distribution. A high incidence of NPC has been reported from China, Southeast Asia, and the northeast (NE) region of India. The immune mechanism plays an important role in the pathogenesis of NPC. Tumour necrosis factors (TNFs) and heat shock protein 70 (HSP 70) constitute significant components of innate as well as adaptive host immunity. Multi-analytical approaches including logistic regression (LR), classification and regression tree (CART) and multifactor dimensionality reduction (MDR) were applied in 120 NPC cases and 100 controls to explore high-order interactions among TNF-α (-308 G>A), TNF β (+252 A>G), HSP 70-1 (+190 G>C), and HSP 70-hom (+2437 T>C) genes and environmental risk factors. TNF β was identified as the primary etiological factor by all three analytical approaches. Individual analysis of results showed a protective effect of the TNF β GG genotype (adjusted odds ratio (OR2) = 0.27, 95 % CI = 0.125-0.611, P = 0.001) and the HSP 70 (+2437) CC genotype (OR2 = 0.17, 95 % CI = 0.043-0.69, P = 0.013), while the AG genotype of TNF β was significantly associated with risk of NPC (OR2 = 1.97, 95 % CI = 1.019-3.83, P = 0.04). Analysis of environmental factors demonstrated association of alcohol consumption, living in mud houses and use of firewood for cooking as major risk factors for NPC. Individual haplotype association analysis showed a significant risk associated with the GTGA haplotype (OR = 68.61, 95 % CI = 2.47-190.37, P = 0.013), while a protective effect was observed with the CCAA and GCGA haplotypes (OR = 0.19, 95 % CI = 0.05-0.75, P = 0.019; OR = 0.01, 95 % CI = 0.05-0.30, P = 0.007). The multi-analytical approaches applied in this study helped in the identification of distinct gene-gene and gene-environment interactions significant in risk assessment of NPC.

  6. Scalable non-negative matrix tri-factorization.

    PubMed

    Čopar, Andrej; Žitnik, Marinka; Zupan, Blaž

    2017-01-01

    Matrix factorization is a well-established pattern discovery tool that has seen numerous applications in biomedical data analytics, such as gene expression co-clustering, patient stratification, and gene-disease association mining. Matrix factorization learns a latent data model that takes a data matrix and transforms it into a latent feature space enabling generalization, noise removal and feature discovery. However, factorization algorithms are numerically intensive, and hence there is a pressing challenge to scale current algorithms to work with large datasets. Our focus in this paper is matrix tri-factorization, a popular method that is not limited by the assumption of standard matrix factorization about data residing in one latent space. Matrix tri-factorization solves this by inferring a separate latent space for each dimension in a data matrix, and a latent mapping of interactions between the inferred spaces, making the approach particularly suitable for biomedical data mining. We developed a block-wise approach for latent factor learning in matrix tri-factorization. The approach partitions a data matrix into disjoint submatrices that are treated independently and fed into a parallel factorization system. An appealing property of the proposed approach is its mathematical equivalence with serial matrix tri-factorization. In a study on large biomedical datasets we show that our approach scales well on multi-processor and multi-GPU architectures. On a four-GPU system we demonstrate that our approach can be more than 100-times faster than its single-processor counterpart. A general approach for scaling non-negative matrix tri-factorization is proposed. The approach is especially useful for parallel matrix factorization implemented in a multi-GPU environment. We expect the new approach will be useful in emerging procedures for latent factor analysis, notably for data integration, where many large data matrices need to be collectively factorized.
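    As a point of reference for the factor model being scaled, a plain (non-parallel) non-negative matrix tri-factorization can be written with standard multiplicative updates; the sketch below is a generic illustration, not the authors' block-wise GPU implementation.

```python
# Generic sketch of non-negative matrix tri-factorization (X ~ U S V^T) using
# multiplicative updates that minimize ||X - U S V^T||_F^2. This is only an
# illustration of the factor model; it is not the paper's block-wise code.
import numpy as np

def nmtf(X, k1, k2, n_iter=200, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = rng.random((n, k1)); S = rng.random((k1, k2)); V = rng.random((m, k2))
    for _ in range(n_iter):
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
    return U, S, V

rng = np.random.default_rng(1)
X = rng.random((60, 40))
U, S, V = nmtf(X, k1=5, k2=4)
print("reconstruction error:", np.linalg.norm(X - U @ S @ V.T))

# A block-wise scheme in the spirit of the paper would partition X into
# disjoint row/column blocks, update the corresponding slices of U and V in
# parallel workers, and share S across them (sketched here only as a comment).
```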

  7. Psychosocial Factors Versus Single Predictors: A Factor Analytic Approach to Cardiovascular Outcomes in The Women’s Ischemia Syndrome Evaluation (WISE) Study

    DTIC Science & Technology

    2010-02-18

    later than men (Bello & Mosca, 2004). Also, for women taking oral contraceptives, smoking significantly increases their risk of developing CVD... to include both the physiological processes involved in stress and the stress response as well as the emotional and psychological aspects of stress... and stressors (Mason, 1975). The emotional and psychological aspects of stress are critical components in the link between psychosocial factors and

  8. Strategies to increase the use of child safety seats : an assessment of current knowledge

    DOT National Transportation Integrated Search

    1986-12-01

    This analytic literature research reports on characteristics of child-safety-seat (CSS) users and nonusers, and on the efficacy of approaches to increasing CSS use. It concentrates on human factors issues in CSS use, excluding technical studies on de...

  9. Evaluation of the Current Status of the Combinatorial Approach for the Study of Phase Diagrams

    PubMed Central

    Wong-Ng, W.

    2012-01-01

    This paper provides an evaluation of the effectiveness of using the high throughput combinatorial approach for preparing phase diagrams of thin film and bulk materials. Our evaluation is based primarily on examples of combinatorial phase diagrams that have been reported in the literature as well as based on our own laboratory experiments. Various factors that affect the construction of these phase diagrams are examined. Instrumentation and analytical approaches needed to improve data acquisition and data analysis are summarized. PMID:26900530

  10. Food risk perceptions, gender, and individual differences in avoidance and approach motivation, intuitive and analytic thinking styles, and anxiety.

    PubMed

    Leikas, Sointu; Lindeman, Marjaana; Roininen, Katariina; Lähteenmäki, Liisa

    2007-03-01

    Risks appear to be perceived in two different ways, affectively and rationally. Finnish adult internet users were contacted via e-mail and asked to fill an internet questionnaire consisting of questions of food risks and measures of avoidance and approach motivation, analytic and intuitive information processing style, trait anxiety, and gender in order to find out (1) whether food risks are perceived two-dimensionally, (2) how individual differences in motivation, information processing, and anxiety are associated with the different dimensions of food risk perceptions, and (3) whether gender moderates these associations. The data were analyzed by factor, correlation and regression analyses. Three factors emerged: risk scariness, risk likelihood, and risks of cardiovascular disease. Personality and gender x personality interactions predicted food risk perceptions. Results showed that food risk perceptions generally form two dimensions; scariness and likelihood, but that this may depend on the nature of the risk. In addition, results imply that individuals with high avoidance motivation perceive food risks as scarier and more likely than others, and that individuals with an analytic information processing style perceive food risks as less likely than others. Trait anxiety seems to be associated with higher food risk perceptions only among men.

  11. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Evaluation Criteria for Micro-CAI: A Psychometric Approach

    PubMed Central

    Wallace, Douglas; Slichter, Mark; Bolwell, Christine

    1985-01-01

    The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
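    A minimal sketch of the item reliability plus factor analytic workflow, run on simulated Likert-style responses (not the project's instrument or data), might look as follows.

```python
# Minimal sketch of standard item reliability (Cronbach's alpha) followed by
# an exploratory factor analysis, on simulated Likert-style item data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students, n_items = 200, 8
latent = rng.normal(size=(n_students, 1))
items = np.clip(np.rint(3 + latent + rng.normal(scale=0.8, size=(n_students, n_items))), 1, 5)

def cronbach_alpha(X):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print("Cronbach's alpha:", round(cronbach_alpha(items), 3))

fa = FactorAnalysis(n_components=1).fit(items)
print("factor loadings:", np.round(fa.components_.ravel(), 2))
```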

  13. Analytic strategies to evaluate the association of time-varying exposures to HIV-related outcomes: Alcohol consumption as an example.

    PubMed

    Cook, Robert L; Kelso, Natalie E; Brumback, Babette A; Chen, Xinguang

    2016-01-01

    As persons with HIV are living longer, there is a growing need to investigate factors associated with chronic disease, rate of disease progression and survivorship. Many risk factors for this high-risk population change over time, such as participation in treatment, alcohol consumption and drug abuse. Longitudinal datasets are increasingly available, particularly clinical data that contain multiple observations of health exposures and outcomes over time. Several analytic options are available for assessment of longitudinal data; however, it can be challenging to choose the appropriate analytic method for specific combinations of research questions and types of data. The purpose of this review is to help researchers choose the appropriate methods to analyze longitudinal data, using alcohol consumption as an example of a time-varying exposure variable. When selecting the optimal analytic method, one must consider aspects of exposure (e.g. timing, pattern, and amount) and outcome (fixed or time-varying), while also addressing minimizing bias. In this article, we will describe several analytic approaches for longitudinal data, including developmental trajectory analysis, generalized estimating equations, and mixed effect models. For each analytic strategy, we describe appropriate situations to use the method and provide an example that demonstrates the use of the method. Clinical data related to alcohol consumption and HIV are used to illustrate these methods.
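    As a hedged illustration of two of the methods named above, the sketch below fits a GEE and a linear mixed-effects model to simulated longitudinal data with a time-varying exposure; the variable names and data are invented.

```python
# Hedged sketch of two of the approaches described above (GEE and a mixed
# effects model) applied to simulated longitudinal data with a time-varying
# exposure; variable names and values are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subj, n_visits = 100, 4
subj = np.repeat(np.arange(n_subj), n_visits)
visit = np.tile(np.arange(n_visits), n_subj)
alcohol = rng.binomial(1, 0.4, size=subj.size)           # time-varying exposure
subj_effect = rng.normal(scale=1.0, size=n_subj)[subj]   # subject heterogeneity
outcome = 50 - 2.0 * alcohol - 0.5 * visit + subj_effect + rng.normal(scale=2.0, size=subj.size)

df = pd.DataFrame({"subj": subj, "visit": visit, "alcohol": alcohol, "y": outcome})
X = sm.add_constant(df[["alcohol", "visit"]])

gee = sm.GEE(df["y"], X, groups=df["subj"],
             cov_struct=sm.cov_struct.Exchangeable()).fit()
mixed = sm.MixedLM(df["y"], X, groups=df["subj"]).fit()

print(gee.params)    # population-averaged effects
print(mixed.params)  # subject-specific (conditional) effects
```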

  14. Multiple Hypnotizabilities: Differentiating the Building Blocks of Hypnotic Response

    ERIC Educational Resources Information Center

    Woody, Erik Z.; Barnier, Amanda J.; McConkey, Kevin M.

    2005-01-01

    Although hypnotizability can be conceptualized as involving component subskills, standard measures do not differentiate them from a more general unitary trait, partly because the measures include limited sets of dichotomous items. To overcome this, the authors applied full-information factor analysis, a sophisticated analytic approach for…

  15. Translucent Radiosity: Efficiently Combining Diffuse Inter-Reflection and Subsurface Scattering.

    PubMed

    Sheng, Yu; Shi, Yulong; Wang, Lili; Narasimhan, Srinivasa G

    2014-07-01

    It is hard to efficiently model the light transport in scenes with translucent objects for interactive applications. The inter-reflection between objects and their environments and the subsurface scattering through the materials intertwine to produce visual effects like color bleeding, light glows, and soft shading. Monte-Carlo based approaches have demonstrated impressive results but are computationally expensive, and faster approaches model either only inter-reflection or only subsurface scattering. In this paper, we present a simple analytic model that combines diffuse inter-reflection and isotropic subsurface scattering. Our approach extends the classical work in radiosity by including a subsurface scattering matrix that operates in conjunction with the traditional form factor matrix. This subsurface scattering matrix can be constructed using analytic, measurement-based or simulation-based models and can capture both homogeneous and heterogeneous translucencies. Using a fast iterative solution to radiosity, we demonstrate scene relighting and dynamically varying object translucencies at near interactive rates.
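    Assuming a formulation of the form (I - RF - S)B = E, where F is the form factor matrix, R the diagonal matrix of diffuse reflectances, and S the subsurface scattering matrix, the extended radiosity system reduces to a linear solve; the exact operator in the paper may differ, and all matrices below are toy values.

```python
# Toy linear-algebra sketch of radiosity extended with a subsurface scattering
# matrix S, assuming a formulation of the form (I - R F - S) B = E; the exact
# operator used in the paper may differ. Values are for three surface patches.
import numpy as np

E = np.array([1.0, 0.0, 0.0])            # emitted radiosity per patch
R = np.diag([0.6, 0.5, 0.7])             # diffuse reflectances
F = np.array([[0.0, 0.3, 0.2],           # form factors (each row sums to <= 1)
              [0.3, 0.0, 0.4],
              [0.2, 0.4, 0.0]])
S = np.array([[0.00, 0.05, 0.01],        # subsurface transport between patches
              [0.05, 0.00, 0.05],
              [0.01, 0.05, 0.00]])

I = np.eye(3)
B = np.linalg.solve(I - R @ F - S, E)    # direct solve; the paper iterates
print("patch radiosities:", np.round(B, 3))
```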

  16. The examination of headache activity using time-series research designs.

    PubMed

    Houle, Timothy T; Remble, Thomas A; Houle, Thomas A

    2005-05-01

    The majority of research conducted on headache has utilized cross-sectional designs which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. The blending of a time-series approach with an interactive process model allows consideration of the relationships of intra-individual dynamic processes, while not precluding the researcher from examining inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without its problems and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examine the time-series interactive process model.
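    The autocorrelation problem flagged above can be checked directly on a daily series; the sketch below uses a simulated AR(1)-like headache-activity series with a single shock day, not real patient data.

```python
# Sketch of the autocorrelation check discussed above, on a simulated daily
# headache-activity series (an AR(1)-like process plus one "shock" day).
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
n_days = 120
y = np.zeros(n_days)
for t in range(1, n_days):
    y[t] = 0.6 * y[t - 1] + rng.normal(scale=1.0)   # carry-over from day to day
y[60] += 4.0                                        # an isolated shock

lags = acf(y, nlags=7)
for lag, r in enumerate(lags):
    print(f"lag {lag}: autocorrelation {r:+.2f}")
# Non-trivial autocorrelation at lags 1-2 means ordinary regression standard
# errors would be misleading; a time-series model should absorb it.
```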

  17. Analytic structure of the S-matrix for singular quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camblong, Horacio E.; Epele, Luis N.; Fanchiotti, Huner

    2015-06-15

    The analytic structure of the S-matrix of singular quantum mechanics is examined within a multichannel framework, with primary focus on its dependence with respect to a parameter (Ω) that determines the boundary conditions. Specifically, a characterization is given in terms of salient mathematical and physical properties governing its behavior. These properties involve unitarity and associated current-conserving Wronskian relations, time-reversal invariance, and Blaschke factorization. The approach leads to an interpretation of effective nonunitary solutions in singular quantum mechanics and their determination from the unitary family.

  18. Accounting for context in studies of health inequalities: a review and comparison of analytic approaches.

    PubMed

    Schempf, Ashley H; Kaufman, Jay S

    2012-10-01

    A common epidemiologic objective is to evaluate the contribution of residential context to individual-level disparities by race or socioeconomic position. We reviewed analytic strategies to account for the total (observed and unobserved factors) contribution of environmental context to health inequalities, including conventional fixed effects (FE) and hybrid FE implemented within a random effects (RE) or a marginal model. To illustrate results and limitations of the various analytic approaches of accounting for the total contextual component of health disparities, we used data on births nested within neighborhoods as an applied example of evaluating neighborhood confounding of racial disparities in gestational age at birth, including both a continuous and a binary outcome. Ordinary and RE models provided disparity estimates that can be substantially biased in the presence of neighborhood confounding. Both FE and hybrid FE models can account for cluster level confounding and provide disparity estimates unconfounded by neighborhood, with the latter having greater flexibility in allowing estimation of neighborhood-level effects and intercept/slope variability when implemented in a RE specification. Given the range of models that can be implemented in a hybrid approach and the frequent goal of accounting for contextual confounding, this approach should be used more often. Published by Elsevier Inc.
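    One way to read the hybrid fixed-effects specification is as a random-intercept model that enters the cluster mean of the exposure and the within-cluster deviation as separate terms (a within-between, Mundlak-style setup); the sketch below applies this to simulated neighborhood data, not the birth records analyzed in the paper.

```python
# Sketch of a "hybrid" fixed-effects specification: a random-intercept model
# that enters the neighborhood mean of the exposure and the within-neighborhood
# deviation as separate terms (within-between / Mundlak-style). Data simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_hoods, n_per = 50, 40
hood = np.repeat(np.arange(n_hoods), n_per)
hood_disadvantage = rng.normal(size=n_hoods)                            # unobserved context
exposure = rng.binomial(1, 0.3 + 0.1 * (hood_disadvantage[hood] > 0))   # e.g. race/SEP marker
gest_age = 39 - 0.3 * exposure - 0.5 * hood_disadvantage[hood] + rng.normal(scale=1.5, size=hood.size)

df = pd.DataFrame({"hood": hood, "x": exposure, "y": gest_age})
df["x_mean"] = df.groupby("hood")["x"].transform("mean")   # between-neighborhood part
df["x_dev"] = df["x"] - df["x_mean"]                       # within-neighborhood part

X = sm.add_constant(df[["x_dev", "x_mean"]])
hybrid = sm.MixedLM(df["y"], X, groups=df["hood"]).fit()
print(hybrid.params)   # x_dev is the disparity estimate unconfounded by neighborhood
```

    The within-cluster coefficient (x_dev) plays the role of the conventional fixed-effects estimate, while keeping the flexibility of the random-effects machinery noted in the abstract.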

  19. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
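    The structure of the final model (a soft-voting ensemble over a multinomial logistic regression and a second simple learner) can be sketched with scikit-learn; because a voting-feature-intervals classifier is not available there, Gaussian naive Bayes stands in, and the features and outcome are synthetic rather than the Utah CHF data.

```python
# Sketch of the final model structure: a soft-voting ensemble over a
# multinomial logistic regression and a second simple learner. The study paired
# logistic regression with a voting-feature-intervals classifier, which
# scikit-learn does not provide, so Gaussian naive Bayes stands in here.
# Features and outcome are synthetic readmission-style data.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([
    rng.integers(40, 90, n),          # discretizable age
    rng.binomial(1, 0.3, n),          # anemia indicator
    rng.integers(0, 4, n),            # discharge disposition code
])
logit = 0.04 * (X[:, 0] - 65) + 0.8 * X[:, 1] + 0.4 * (X[:, 2] == 2) - 1.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 30-day readmission (synthetic)

model = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)), ("nb", GaussianNB())],
    voting="soft",                    # average predicted probabilities
)
print("CV AUC:", cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```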

  20. The Prioritization of Clinical Risk Factors of Obstructive Sleep Apnea Severity Using Fuzzy Analytic Hierarchy Process

    PubMed Central

    Maranate, Thaya; Pongpullponsak, Adisak; Ruttanaumpawan, Pimon

    2015-01-01

    Recently, there has been a problem of shortage of sleep laboratories that can accommodate the patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes particularly in patients with severe obstructive sleep apnea (OSA). For this reason, the prioritization in polysomnography (PSG) queueing should be endorsed based on disease severity. To date, there have been conflicting data whether clinical information can predict OSA severity. The 1,042 suspected OSA patients underwent diagnostic PSG study at Siriraj Sleep Center during 2010-2011. A total of 113 variables were obtained from sleep questionnaires and anthropometric measurements. The 19 groups of clinical risk factors consisting of 42 variables were categorized into each OSA severity. This study aimed to array these factors by employing Fuzzy Analytic Hierarchy Process approach based on normalized weight vector. The results revealed that the first rank of clinical risk factors in Severe, Moderate, Mild, and No OSA was nighttime symptoms. The overall sensitivity/specificity of the approach to these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that the urgent PSG appointment should include clinical risk factors of Severe OSA group. In addition, the screening for Mild from No OSA patients in sleep center setting using symptoms during sleep is also recommended (sensitivity = 87.12% and specificity = 72.22%). PMID:26221183
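    The normalized weight vector at the core of the approach can be illustrated with a crisp AHP-style calculation from a pairwise comparison matrix using the geometric-mean method; full fuzzy AHP adds triangular fuzzy numbers and a defuzzification step, which are omitted here, and the criteria and judgments below are hypothetical.

```python
# Simplified sketch of deriving a normalized weight vector from a pairwise
# comparison matrix using the geometric-mean method. Full fuzzy AHP would use
# triangular fuzzy numbers and an extent-analysis/defuzzification step, which
# is omitted; the criteria and comparison matrix are hypothetical.
import numpy as np

criteria = ["nighttime symptoms", "daytime symptoms", "anthropometrics", "comorbidity"]

# A[i, j] = how much more important criterion i is than j (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0, 4.0],
    [1/3, 1.0, 3.0, 2.0],
    [1/5, 1/3, 1.0, 1.0],
    [1/4, 1/2, 1.0, 1.0],
])

geo_mean = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = geo_mean / geo_mean.sum()          # normalized weight vector

for name, w in sorted(zip(criteria, weights), key=lambda p: -p[1]):
    print(f"{name:20s} {w:.3f}")
```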

  1. Selecting Evaluation Comparison Groups: A Cluster Analytic Approach.

    ERIC Educational Resources Information Center

    Davis, Todd Mclin; McLean, James E.

    A persistent problem in the evaluation of field-based projects is the lack of no-treatment comparison groups. Frequently, potential comparison groups are confounded by socioeconomic, racial, or other factors. Among the possible methods for dealing with this problem are various matching procedures, but they are cumbersome to use with multiple…

  2. A Meta-Analysis of the Predictors of Cyberbullying Perpetration and Victimization

    ERIC Educational Resources Information Center

    Guo, Siying

    2016-01-01

    Previous studies have investigated various aspects of cyberbullying. Using meta-analytic approaches, this study primarily aimed to determine the target factors predicting individuals' perpetration and victimization in cyberbullying. A meta-analysis of 77 studies containing 418 primary effect sizes was conducted to examine the relative magnitude…

  3. Newton Algorithms for Analytic Rotation: An Implicit Function Approach

    ERIC Educational Resources Information Center

    Boik, Robert J.

    2008-01-01

    In this paper implicit function-based parameterizations for orthogonal and oblique rotation matrices are proposed. The parameterizations are used to construct Newton algorithms for minimizing differentiable rotation criteria applied to "m" factors and "p" variables. The speed of the new algorithms is compared to that of existing algorithms and to…

  4. Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions

    DTIC Science & Technology

    2014-12-05

    test efficiency that tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with... the smaller, less complex munition systems. 8.1 ANALYTICAL S3 TEST APPROACH. The Analytical S3 test approach, as shown in Figure 3, evaluates... assets than the Analytical S3 Test approach to establish the safety margin of the system. This approach is generally applicable to small munitions

  5. Testing the dimensional structure of DSM-5 posttraumatic stress disorder symptoms in a nonclinical trauma-exposed adolescent sample.

    PubMed

    Liu, Liyong; Wang, Li; Cao, Chengqi; Qing, Yulan; Armour, Cherie

    2016-02-01

    The current study investigated the underlying dimensionality of DSM-5 posttraumatic stress disorder (PTSD) symptoms in a trauma-exposed Chinese adolescent sample using a confirmatory factor analytic (CFA) alternative model approach. The sample consisted of 559 students (242 females and 314 males) ranging in age from 12 to 18 years (M = 15.8, SD = 1.3). Participants completed the PTSD Checklist for DSM-5, the Major Depression Disorder and Panic Disorder subscales of the Revised Children's Anxiety and Depression Scale, and the Aggressive Behavior subscale of the Youth Self-Report. Confirmatory factor analytic results indicated that a seven-factor model comprised of intrusion, avoidance, negative affect, anhedonia, externalizing behavior, anxious arousal, and dysphoric arousal factors emerged as the best-fitting model. Further analyses showed that the external measures of psychopathological variables including major depressive disorder, panic disorder, and aggressive behavior were differentially associated with the resultant factors. These findings support and extend previous findings for the newly refined seven-factor hybrid model, and carry clinical and research implications for trauma-related psychopathology. © 2015 Association for Child and Adolescent Mental Health.

  6. Method development and qualification of capillary zone electrophoresis for investigation of therapeutic monoclonal antibody quality.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2016-10-01

    Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature. Most of them are performed based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE method development. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results were presented. The analytical system stability was represented by the calculated RSD% value of the area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
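    The intermediate-precision criteria quoted above (RSD% of area percentage below 0.8% and of migration time below 5%) reduce to a short calculation; the replicate values below are invented.

```python
# Small sketch of the intermediate-precision check quoted above: relative
# standard deviation (RSD%) of peak area percentage and migration time across
# replicate CZE runs. The replicate values are invented for illustration.
import numpy as np

def rsd_percent(x):
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

area_percent   = [63.1, 63.4, 63.0, 63.3, 62.9, 63.2]   # main-peak area %
migration_time = [11.8, 12.0, 12.3, 11.7, 12.1, 12.4]   # minutes

print(f"RSD% area      = {rsd_percent(area_percent):.2f}  (criterion < 0.8%)")
print(f"RSD% migration = {rsd_percent(migration_time):.2f}  (criterion < 5%)")
```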

  7. Structural analysis and design of multivariable control systems: An algebraic approach

    NASA Technical Reports Server (NTRS)

    Tsay, Yih Tsong; Shieh, Leang-San; Barnett, Stephen

    1988-01-01

    The application of algebraic system theory to the design of controllers for multivariable (MV) systems is explored analytically using an approach based on state-space representations and matrix-fraction descriptions. Chapters are devoted to characteristic lambda matrices and canonical descriptions of MIMO systems; spectral analysis, divisors, and spectral factors of nonsingular lambda matrices; feedback control of MV systems; and structural decomposition theories and their application to MV control systems.

  8. The Evaluation of a Self-Enumerated Scale of Quality of Life (CASP-19) in the Context of Research on Ageing: A Combination of Exploratory and Confirmatory Approaches

    ERIC Educational Resources Information Center

    Wiggins, R. D.; Netuveli, G.; Hyde, M.; Higgs, P.; Blane, D.

    2008-01-01

    This paper describes the conceptual development of a self-enumerated scale of quality of life (CASP-19) and presents an empirical evaluation of its structure using a combination of exploratory and confirmatory factor analytic approaches across three different survey settings for older people living in England and Wales in the new millennium. All…

  9. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations

    PubMed Central

    Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.

    2015-01-01

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076

  10. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations.

    PubMed

    Mohanasubha, R; Chandrasekar, V K; Senthilvelan, M; Lakshmanan, M

    2015-04-08

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle-Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples.

  11. Differential solute gas response in ionic-liquid-based QCM arrays: elucidating design factors responsible for discriminative explosive gas sensing.

    PubMed

    Rehman, Abdul; Hamilton, Andrew; Chung, Alfred; Baker, Gary A; Wang, Zhe; Zeng, Xiangqun

    2011-10-15

    An eight-sensor array coupling a chemoselective room-temperature ionic liquid (RTIL) with quartz crystal microbalance (QCM) transduction is presented in this work in order to demonstrate the power of this approach in differentiating closely related analytes in sensory devices. The underlying mechanism behind the specific sensory response was explored by (i) studying mass loading and viscoelasticity effects of the sensing layers, predominantly through variation in damping impedance, the combination of which determines the sensitivity; (ii) creation of a solvation model based on Abraham's solvation descriptors which reveals the fact that polarizability and lipophilicity are the main factors influencing the dissolution of gas analytes into the RTILs; and (iii) determination of enthalpy and entropy values for the studied interactions and comparison via a simulation model, which is also effective for pattern discrimination, in order to establish a foundation for the analytical scientist as well as inspiration for synthetic pathways and innovative research into next-generation sensory approaches. The reported sensors displayed an excellent sensitivity with detection limit of <0.2%, fast response and recovery, and a workable temperature range of 27-55 °C and even higher. Linear discriminant analysis (LDA) showed a discrimination accuracy of 86-92% for nitromethane and 1-ethyl-2-nitrobenzene, 71% for different mixtures of nitromethane, and 100% for these analytes when thermodynamic parameters were used as input data. We envisage applications to detecting other nitroaromatics and security-related gas targets, and high-temperature or real-time situations where manual access is restricted, opening up new horizons in chemical sensing. © 2011 American Chemical Society
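    The pattern-discrimination step can be sketched as linear discriminant analysis on eight-channel array responses; the frequency-shift data below are simulated, not the measured RTIL/QCM responses reported in the paper.

```python
# Sketch of the pattern-discrimination step: linear discriminant analysis on
# responses from an eight-sensor array. The response data are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
analytes = {"nitromethane": 0, "1-ethyl-2-nitrobenzene": 1, "blank": 2}
signatures = rng.normal(size=(len(analytes), 8))        # mean response per analyte

X, y = [], []
for label in analytes.values():
    X.append(signatures[label] + rng.normal(scale=0.3, size=(30, 8)))  # 30 exposures each
    y.extend([label] * 30)
X = np.vstack(X); y = np.array(y)

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated discrimination accuracy: {acc:.2%}")
```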

  12. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axis, multi-factor stacked bar charts and interactive semi-automated filtering for input and output data together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. For the first example our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. For the second example our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions. This includes low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.

  13. Plate and butt-weld stresses beyond elastic limit, material and structural modeling

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1991-01-01

    Ultimate safety factors of high performance structures depend on stress behavior beyond the elastic limit, a region not too well understood. An analytical modeling approach was developed to gain fundamental insights into inelastic responses of simple structural elements. Nonlinear material properties were expressed in engineering stress and strain variables and combined with strength-of-materials stress and strain equations, similar to a numerical piece-wise linear method. Integrations are continuous, which allows for more detailed solutions. Included with interesting results are the classical combined axial tension and bending load model and the strain gauge conversion to stress beyond the elastic limit. Material discontinuity stress factors in butt-welds were derived. This is a working-type document with analytical methods and results applicable to all industries of high reliability structures.
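    The combined axial tension and bending case can be sketched numerically: a linear strain field over a rectangular section is integrated with a nonlinear stress-strain law to recover the axial force and bending moment. A bilinear hardening law stands in here for measured engineering stress-strain data, and all parameter values are hypothetical rather than taken from the report.

```python
# Numerical sketch of combined axial tension and bending beyond the elastic
# limit: a linear strain field over a rectangular section is integrated with a
# nonlinear (here bilinear hardening) stress-strain curve to recover the axial
# force and bending moment. Material and section values are hypothetical.
import numpy as np

E, Sy, Et = 70e9, 300e6, 7e9          # modulus, yield stress, tangent modulus (Pa)
b, h = 0.05, 0.10                     # section width and height (m)

def stress(strain):
    """Bilinear stress-strain law, symmetric in tension and compression."""
    eps_y = Sy / E
    return np.where(np.abs(strain) <= eps_y,
                    E * strain,
                    np.sign(strain) * (Sy + Et * (np.abs(strain) - eps_y)))

def section_response(eps_axial, curvature, n=400):
    """Integrate stress over the depth for a given linear strain distribution."""
    y = np.linspace(-h / 2, h / 2, n)          # distance from section mid-plane
    dy = y[1] - y[0]
    sig = stress(eps_axial + curvature * y)
    N = np.sum(sig) * b * dy                   # axial force (N)
    M = np.sum(sig * y) * b * dy               # bending moment (N*m)
    return N, M

N, M = section_response(eps_axial=0.003, curvature=0.05)
print(f"axial force ~ {N/1e3:.0f} kN, moment ~ {M/1e3:.1f} kN*m")
```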

  14. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross data analytics over different domains. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time, batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various and mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  15. Cross-Disciplinary Consultancy to Enhance Predictions of Asthma Exacerbation Risk in Boston.

    PubMed

    Reid, Margaret; Gunn, Julia; Shah, Snehal; Donovan, Michael; Eggo, Rosalind; Babin, Steven; Stajner, Ivanka; Rogers, Eric; Ensor, Katherine B; Raun, Loren; Levy, Jonathan I; Painter, Ian; Phipatanakul, Wanda; Yip, Fuyuen; Nath, Anjali; Streichert, Laura C; Tong, Catherine; Burkom, Howard

    2016-01-01

    This paper continues an initiative conducted by the International Society for Disease Surveillance with funding from the Defense Threat Reduction Agency to connect near-term analytical needs of public health practice with technical expertise from the global research community. The goal is to enhance investigation capabilities of day-to-day population health monitors. A prior paper described the formation of consultancies for requirements analysis and dialogue regarding costs and benefits of sustainable analytic tools. Each funded consultancy targets a use case of near-term concern to practitioners. The consultancy featured here focused on improving predictions of asthma exacerbation risk in demographic and geographic subdivisions of the city of Boston, Massachusetts, USA based on the combination of known risk factors for which evidence is routinely available. A cross-disciplinary group of 28 stakeholders attended the consultancy on March 30-31, 2016 at the Boston Public Health Commission. Known asthma exacerbation risk factors are upper respiratory virus transmission, particularly in school-age children, harsh or extreme weather conditions, and poor air quality. Meteorological subject matter experts described availability and usage of data sources representing these risk factors. Modelers presented multiple analytic approaches including mechanistic models, machine learning approaches, simulation techniques, and hybrids. Health department staff and local partners discussed surveillance operations, constraints, and operational system requirements. Attendees valued the direct exchange of information among public health practitioners, system designers, and modelers. Discussion finalized design of an 8-year de-identified dataset of Boston ED patient records for modeling partners who sign a standard data use agreement.

  16. Cross-Disciplinary Consultancy to Enhance Predictions of Asthma Exacerbation Risk in Boston

    PubMed Central

    Reid, Margaret; Gunn, Julia; Shah, Snehal; Donovan, Michael; Eggo, Rosalind; Babin, Steven; Stajner, Ivanka; Rogers, Eric; Ensor, Katherine B.; Raun, Loren; Levy, Jonathan I.; Painter, Ian; Phipatanakul, Wanda; Yip, Fuyuen; Nath, Anjali; Streichert, Laura C.; Tong, Catherine

    2016-01-01

    This paper continues an initiative conducted by the International Society for Disease Surveillance with funding from the Defense Threat Reduction Agency to connect near-term analytical needs of public health practice with technical expertise from the global research community. The goal is to enhance investigation capabilities of day-to-day population health monitors. A prior paper described the formation of consultancies for requirements analysis and dialogue regarding costs and benefits of sustainable analytic tools. Each funded consultancy targets a use case of near-term concern to practitioners. The consultancy featured here focused on improving predictions of asthma exacerbation risk in demographic and geographic subdivisions of the city of Boston, Massachusetts, USA based on the combination of known risk factors for which evidence is routinely available. A cross-disciplinary group of 28 stakeholders attended the consultancy on March 30-31, 2016 at the Boston Public Health Commission. Known asthma exacerbation risk factors are upper respiratory virus transmission, particularly in school-age children, harsh or extreme weather conditions, and poor air quality. Meteorological subject matter experts described availability and usage of data sources representing these risk factors. Modelers presented multiple analytic approaches including mechanistic models, machine learning approaches, simulation techniques, and hybrids. Health department staff and local partners discussed surveillance operations, constraints, and operational system requirements. Attendees valued the direct exchange of information among public health practitioners, system designers, and modelers. Discussion finalized design of an 8-year de-identified dataset of Boston ED patient records for modeling partners who sign a standard data use agreement. PMID:28210420

  17. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However, these approaches, which require a ...

  18. Restructuring the rotor analysis program C-60

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.

  19. Linking Teacher Competences to Organizational Citizenship Behaviour: The Role of Empowerment

    ERIC Educational Resources Information Center

    Kasekende, Francis; Munene, John C.; Otengei, Samson Omuudu; Ntayi, Joseph Mpeera

    2016-01-01

    Purpose: The purpose of this paper is to examine the relationship between teacher competences and organizational citizenship behavior (OCB) with empowerment as a mediating factor. Design/methodology/approach: The study took a cross-sectional descriptive and analytical design. Using cluster and random sampling procedures, data were obtained from 383…

  20. Factors Associated with Faculty Use of Student Data for Instructional Improvement

    ERIC Educational Resources Information Center

    Svinicki, Marilla D.; Williams, Kyle; Rackley, Kadie; Sanders, Anke J. Z.; Pine, Lisa; Stewart, Julie

    2016-01-01

    Much is being said in education about the value of adopting data-based or analytics approaches to instructional improvement. One important group of stakeholders in this effort is the faculty. "In many cases, the key constituency group is faculty, whose powerful voice and genuine participation often determine the success or failure of…

  1. An Alternative Classification Scheme for Teaching Performance Incentives Using a Factor Analytic Approach.

    ERIC Educational Resources Information Center

    Mertler, Craig A.

    This study attempted to (1) expand the dichotomous classification scheme typically used by educators and researchers to describe teaching incentives and (2) offer administrators and teachers an alternative framework within which to develop incentive systems. Elementary, middle, and high school teachers in Ohio rated 10 commonly instituted teaching…

  2. Estimating and testing mediation and moderation in within-subject designs.

    PubMed

    Judd, C M; Kenny, D A; McClelland, G H

    2001-06-01

    Analyses designed to detect mediation and moderation of treatment effects are increasingly prevalent in research in psychology. The mediation question concerns the processes that produce a treatment effect. The moderation question concerns factors that affect the magnitude of that effect. Although analytic procedures have been reasonably well worked out in the case in which the treatment varies between participants, no systematic procedures for examining mediation and moderation have been developed in the case in which the treatment varies within participants. The authors present an analytic approach to these issues using ordinary least squares estimation.
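
    A rough sketch in the spirit of this difference-score approach (our paraphrase, assuming two within-subject conditions and a single candidate mediator; consult the paper for the exact model and tests) regresses the outcome difference on the mediator difference and the centered mediator sum:

        import numpy as np

        def within_subject_mediation(y1, y2, m1, m2):
            """OLS difference-score regression for a two-condition within-subject design.

            y1, y2 -- outcome under conditions 1 and 2 (one value per participant)
            m1, m2 -- candidate mediator under conditions 1 and 2
            Returns (intercept, slope on mediator difference, slope on centered mediator sum):
            the intercept is the residual treatment effect, the difference slope indexes
            mediation, and the sum slope indexes a moderation-type dependence.
            """
            y_diff = np.asarray(y2, float) - np.asarray(y1, float)
            m_diff = np.asarray(m2, float) - np.asarray(m1, float)
            m_sum = np.asarray(m1, float) + np.asarray(m2, float)
            m_sum = m_sum - m_sum.mean()                       # center the sum
            X = np.column_stack([np.ones_like(y_diff), m_diff, m_sum])
            beta, *_ = np.linalg.lstsq(X, y_diff, rcond=None)
            return beta

        # Hypothetical data for 50 participants
        rng = np.random.default_rng(1)
        m1 = rng.normal(size=50)
        m2 = m1 + 1.0 + rng.normal(scale=0.3, size=50)
        y1 = 0.5 * m1 + rng.normal(scale=0.3, size=50)
        y2 = 0.5 * m2 + rng.normal(scale=0.3, size=50)
        print(within_subject_mediation(y1, y2, m1, m2))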

  3. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  4. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  5. Bimodal fuzzy analytic hierarchy process (BFAHP) for coronary heart disease risk assessment.

    PubMed

    Sabahi, Farnaz

    2018-04-04

    Rooted deeply in medical multiple criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in the fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), which augments fuzzy numbers with two aspects of knowledge, probability and validity, to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, we compute the fuzzy probability of risk factors using the Bayesian formulation. In the BFAHP approach, these fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix. We then aggregate fuzzy probabilities and fuzzy validities in a pairwise manner for each risk factor and each alternative. BFAHP decides between affected and not affected by ranking high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients from Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner can increase confidence in the results and is clinically useful, especially in the face of incomplete information, when compared with actual outcomes. Applied to CHD risk assessment on this dataset, the proposed BFAHP yields a high accuracy rate, above 85%, for correct prediction. In addition, this paper recognizes that the risk factors of diastolic blood pressure in men and high-density lipoprotein in women are more important in CHD than other risk factors. Copyright © 2018 Elsevier Inc. All rights reserved.
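
    For readers unfamiliar with the underlying machinery, the sketch below shows a conventional fuzzy AHP weighting step with triangular fuzzy numbers (a Buckley-style geometric-mean calculation, not the BFAHP of the paper; the comparison values are hypothetical):

        import numpy as np

        def fuzzy_ahp_weights(tfn_matrix):
            """Crisp priority weights from a reciprocal matrix of triangular fuzzy numbers.

            tfn_matrix -- array of shape (n, n, 3); entry [i, j] = (l, m, u) compares
            criterion i with criterion j. Uses the fuzzy geometric mean of each row,
            multiplies by the inverse of the fuzzy total (which reverses the TFN bounds),
            and defuzzifies by the centroid.
            """
            M = np.asarray(tfn_matrix, dtype=float)
            n = M.shape[0]
            r = np.prod(M, axis=1) ** (1.0 / n)                    # fuzzy geometric means, shape (n, 3)
            total = r.sum(axis=0)                                  # fuzzy sum (L, M, U)
            w_fuzzy = r * np.array([1.0 / total[2], 1.0 / total[1], 1.0 / total[0]])
            w_crisp = w_fuzzy.mean(axis=1)                         # centroid defuzzification
            return w_crisp / w_crisp.sum()

        # Hypothetical pairwise comparisons of three CHD risk factors.
        tfn = np.array([
            [[1, 1, 1],          [2, 3, 4],        [4, 5, 6]],
            [[1/4, 1/3, 1/2],    [1, 1, 1],        [1, 2, 3]],
            [[1/6, 1/5, 1/4],    [1/3, 1/2, 1],    [1, 1, 1]],
        ])
        print(fuzzy_ahp_weights(tfn))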

  6. In-house validation of a liquid chromatography-tandem mass spectrometry method for the determination of selective androgen receptor modulators (SARMS) in bovine urine.

    PubMed

    Schmidt, Kathrin S; Mankertz, Joachim

    2018-06-01

    A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library, making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies according to an isochronous approach proved that the analytes are stable in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied at two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.
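
    As a hedged sketch of what such a two-level factorial effect analysis computes (not the authors' validation software; the design and recovery values are hypothetical), each main effect is simply the difference between the mean response at the factor's high and low levels:

        import numpy as np

        def main_effects(design, response):
            """Main effects for a two-level factorial design.

            design   -- array of shape (n_runs, n_factors), coded -1/+1
            response -- array of shape (n_runs,) with the measured results
            Effect of factor k = mean(response at +1) - mean(response at -1).
            """
            design = np.asarray(design, dtype=float)
            response = np.asarray(response, dtype=float)
            return np.array([
                response[design[:, k] > 0].mean() - response[design[:, k] < 0].mean()
                for k in range(design.shape[1])
            ])

        # Hypothetical 2^3 design over operator, extract storage duration and cartridge lot.
        design = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
        response = np.array([98.0, 99.0, 97.0, 100.0, 99.0, 98.0, 101.0, 100.0])  # recoveries, %
        print(main_effects(design, response))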

  7. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of the PDFs are difficult to specify in advance. When the shapes of the PDFs are known, different optimization schemes can be applied to experimental data in order to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding, and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10⁴). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.
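
    To make the contrast concrete, here is a toy version of the Monte Carlo core that such optimization schemes normally rely on (a simplified sketch with competing exponential interevent PDFs, not the paper's CRP model); in this simple case the relative frequencies are known analytically to be rate_i / sum(rates), which is exactly the kind of closed-form replacement the authors advocate:

        import numpy as np

        def mc_event_frequencies(rates, n_events=100_000, seed=0):
            """Estimate relative frequencies of competing event types by Monte Carlo.

            rates -- exponential interevent rates of the competing event types.
            At each step the event whose sampled waiting time is smallest 'fires'.
            """
            rng = np.random.default_rng(seed)
            rates = np.asarray(rates, dtype=float)
            waits = rng.exponential(1.0 / rates, size=(n_events, rates.size))
            winners = waits.argmin(axis=1)
            counts = np.bincount(winners, minlength=rates.size)
            return counts / counts.sum()

        print(mc_event_frequencies([1.0, 2.0, 7.0]))   # analytically [0.1, 0.2, 0.7]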

  8. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    PubMed

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

    To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children aged 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels differed significantly in the test groups [25(OH)D <10, <15 and <20 ng/mL] compared with the reference group [25(OH)D >25 ng/mL]. ANCOVA also showed that only children with severe vitamin D deficiency [25(OH)D <10 ng/mL] had significantly higher parathyroid hormone levels (Δ = 15; P = .0334). Hockey stick regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
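
    For readers unfamiliar with "hockey stick" modeling, the sketch below fits a one-breakpoint piecewise-linear model by grid search (a generic sketch, not the study's software; the simulated data and breakpoint are hypothetical):

        import numpy as np

        def hockey_stick_fit(x, y, n_grid=100):
            """Fit y = b0 + b1 * min(x - c, 0) over a grid of candidate breakpoints c.

            Models an outcome that changes with x only below a threshold c and is flat
            above it; returns the breakpoint and coefficients with the lowest SSE.
            """
            x = np.asarray(x, float)
            y = np.asarray(y, float)
            best = None
            for c in np.linspace(np.percentile(x, 5), np.percentile(x, 95), n_grid):
                X = np.column_stack([np.ones_like(x), np.minimum(x - c, 0.0)])
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                sse = np.sum((y - X @ beta) ** 2)
                if best is None or sse < best[0]:
                    best = (sse, c, beta)
            return best[1], best[2]

        # Hypothetical data: SBP falls as 25(OH)D rises, then flattens near 27 ng/mL.
        rng = np.random.default_rng(2)
        d = rng.uniform(5, 45, 200)
        sbp = 110 + np.where(d < 27, -1.2 * (d - 27), 0.0) + rng.normal(scale=4, size=200)
        print(hockey_stick_fit(d, sbp))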

  9. Multivariate Analysis and Machine Learning in Cerebral Palsy Research

    PubMed Central

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP. PMID:29312134

  10. Multivariate Analysis and Machine Learning in Cerebral Palsy Research.

    PubMed

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP.

  11. Understanding suicide risk within the Research Domain Criteria (RDoC) framework: A meta-analytic review.

    PubMed

    Glenn, Catherine R; Kleiman, Evan M; Cha, Christine B; Deming, Charlene A; Franklin, Joseph C; Nock, Matthew K

    2018-01-01

    The field is in need of novel and transdiagnostic risk factors for suicide. The National Institute of Mental Health's Research Domain Criteria (RDoC) provides a framework that may help advance research on suicidal behavior. We conducted a meta-analytic review of existing prospective risk and protective factors for suicidal thoughts and behaviors (ideation, attempts, and deaths) that fall within one of the five RDoC domains or relate to a prominent suicide theory. Predictors were selected from a database of 4,082 prospective risk and protective factors for suicide outcomes. A total of 460 predictors met inclusion criteria for this meta-analytic review, and most examined risk (vs. protective) factors for suicidal thoughts and behaviors. The overall effect of risk factors was statistically significant, but relatively small, in predicting suicide ideation (weighted mean odds ratio: wOR = 1.72; 95% CI: 1.59-1.87), suicide attempt (wOR = 1.66; 95% CI: 1.57-1.76), and suicide death (wOR = 1.41; 95% CI: 1.24-1.60). Across all suicide outcomes, most risk factors related to the Negative Valence Systems domain, although effect sizes were of similar magnitude across RDoC domains. This study demonstrated that the RDoC framework provides a novel and promising approach to suicide research; however, relatively few studies of suicidal behavior fit within this framework. Future studies must go beyond the "usual suspects" of suicide risk factors (e.g., mental disorders, sociodemographics) to understand the processes that combine to lead to this deadly outcome. © 2017 Wiley Periodicals, Inc.
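
    As a hedged illustration of how a weighted mean odds ratio of this kind can be pooled (a simple fixed-effect, inverse-variance sketch on the log scale; the review's exact weighting scheme may differ, and all numbers below are hypothetical):

        import numpy as np

        def weighted_mean_or(odds_ratios, ci_lower, ci_upper):
            """Inverse-variance weighted mean odds ratio pooled on the log scale.

            Standard errors are recovered from 95% confidence limits, assuming
            log-symmetric intervals: se = (ln(upper) - ln(lower)) / (2 * 1.96).
            """
            log_or = np.log(odds_ratios)
            se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
            w = 1.0 / se ** 2
            pooled = np.sum(w * log_or) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
            return np.exp(pooled), ci

        # Hypothetical effects from three prospective studies.
        print(weighted_mean_or([1.5, 1.9, 1.6], [1.2, 1.4, 1.1], [1.9, 2.6, 2.3]))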

  12. Scattering of In-Plane Waves by Elastic Wedges

    NASA Astrophysics Data System (ADS)

    Mohammadi, K.; Asimaki, D.; Fradkin, L.

    2014-12-01

    The scattering of seismic waves by elastic wedges has been a topic of interest in seismology and geophysics for many decades. Analytical, semi-analytical, experimental and numerical studies on idealized wedges have provided insight into the seismic behavior of continental margins, mountain roots and crustal discontinuities. Published results, however, have almost exclusively focused on incident Rayleigh waves and out-of-plane body (SH) waves. Complementing the existing body of work, we here present results from our study on the response of elastic wedges to incident P or SV waves, an idealized problem that can provide valuable insight to the understanding and parameterization of topographic amplification of seismic ground motion. We first show our earlier work on explicit finite difference simulations of SV-wave scattering by elastic wedges over a wide range of internal angles. We next present a semi-analytical solution that we developed using the approach proposed by Gautesen, to describe the scattered wavefield in the immediate vicinity of the wedge's tip (near-field). We use the semi-analytical solution to validate the numerical analyses, and improve resolution of the amplification factor at the wedge vertex that spikes when the internal wedge angle approaches the critical angle of incidence.

  13. High-throughput method for the quantitation of metabolites and co-factors from homocysteine-methionine cycle for nutritional status assessment.

    PubMed

    Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping

    2016-09-01

    There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover the metabolites and co-factors. We report the development and validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides robust coverage of the central metabolites and co-factors in a single analysis. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.

  14. Investigation of hydrophobic substrates for solution residue analysis utilizing an ambient desorption liquid sampling-atmospheric pressure glow discharge microplasma.

    PubMed

    Paing, Htoo W; Marcus, R Kenneth

    2018-03-12

    A practical method for the preparation of solution residue samples for analysis with the ambient desorption liquid sampling-atmospheric pressure glow discharge optical emission spectroscopy (AD-LS-APGD-OES) microplasma is described. Initial efforts, involving placement of solution aliquots in wells drilled into copper substrates, proved unsuccessful. A design-of-experiment (DOE) approach was carried out to determine the influential factors during sample deposition, including solution volume, solute concentration, number of droplets deposited, and the solution matrix. These various aspects are manifested in the mass of analyte deposited as well as the size/shape of the product residue. Statistical analysis demonstrated that only those initial attributes were significant factors for the emission response of the analyte. Various approaches were investigated to better control the location/uniformity of the deposited sample. Three alternative substrates, a glass slide, a poly(tetrafluoro)ethylene (PTFE) sheet, and a polydimethylsiloxane (PDMS)-coated glass slide, were evaluated with respect to the microplasma analytical performance. Co-deposition with simple organic dyes provided an accurate means of determining the location of the analyte with only minor influence on emission responses. The PDMS-coated glass provided the best performance by virtue of providing a uniform spatial distribution of the residue material. This uniformity improved the limits of detection by approximately 22× for 20 μL and 4× for 2 μL relative to the other two substrates. While they operate by fundamentally different processes, this choice of substrate is not restricted to the LS-APGD, but may also be applicable to other AD methods such as DESI, DART, or LIBS. Further developments will be directed towards a field-deployable ambient desorption OES source for quantitative analysis of microvolume solution residues of nuclear forensics importance.

  15. Determination of immersion factors for radiance sensors in marine and inland waters: a semi-analytical approach using refractive index approximation

    NASA Astrophysics Data System (ADS)

    Dev, Pravin J.; Shanmugam, P.

    2016-05-01

    Underwater radiometers are generally calibrated in air using a standard source. Immersion factors are required for these radiometers to account for the change in in-water measurements with respect to in-air measurements due to the different refractive index of the medium. The immersion factors previously determined for the RAMSES series of commercial radiometers manufactured by TriOS are applicable to clear oceanic waters. In typical inland and turbid productive coastal waters, these experimentally determined immersion factors yield significant errors in water-leaving radiances (Lw) and hence remote sensing reflectances (Rrs). To overcome this limitation, a semi-analytical method based on a refractive index approximation is proposed in this study, with the aim of obtaining reliable Lw and Rrs from RAMSES radiometers for turbid and productive waters within coastal and inland water environments. We also briefly show the effect of the pure-water immersion factors (Ifw) and the newly derived If on Lw and Rrs for clear and turbid waters. The remaining issues beyond the immersion factor coefficients, such as transmission and the air-water and water-air Fresnel reflectances, are also discussed.
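
    For reference, a commonly used expression for the radiance-sensor immersion factor in clear water (our paraphrase of the standard air-calibration correction, not the semi-analytical formulation proposed in the paper) is

        I_f(\lambda) = \frac{n_w(\lambda)\,\bigl[n_w(\lambda) + n_g(\lambda)\bigr]^{2}}{\bigl[1 + n_g(\lambda)\bigr]^{2}}

    where n_w is the refractive index of water and n_g that of the sensor window; the paper's semi-analytical approach modifies the refractive index treatment for turbid and productive waters.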

  16. Investigating the two-moment characterisation of subcellular biochemical networks.

    PubMed

    Ullah, Mukhtar; Wolkenhauer, Olaf

    2009-10-07

    While ordinary differential equations (ODEs) form the conceptual framework for modelling many cellular processes, specific situations demand stochastic models to capture the influence of noise. The most common formulation of stochastic models for biochemical networks is the chemical master equation (CME). While stochastic simulations are a practical way to realise the CME, analytical approximations offer more insight into the influence of noise. Towards that end, the two-moment approximation (2MA) is a promising addition to the established analytical approaches including the chemical Langevin equation (CLE) and the related linear noise approximation (LNA). The 2MA approach directly tracks the mean and (co)variance which are coupled in general. This coupling is not obvious in CME and CLE and ignored by LNA and conventional ODE models. We extend previous derivations of 2MA by allowing (a) non-elementary reactions and (b) relative concentrations. Often, several elementary reactions are approximated by a single step. Furthermore, practical situations often require the use of relative concentrations. We investigate the applicability of the 2MA approach to the well-established fission yeast cell cycle model. Our analytical model reproduces the clustering of cycle times observed in experiments. This is explained through multiple resettings of M-phase promoting factor (MPF), caused by the coupling between mean and (co)variance, near the G2/M transition.
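
    As a hedged illustration of why the mean and (co)variance become coupled (a generic second-order moment expansion, not necessarily the paper's exact derivation), write μ for the vector of means, σ_kl for the covariances, S for the stoichiometry matrix and a_j for the propensity functions; then

        \frac{d\mu_i}{dt} = \sum_j S_{ij}\left[ a_j(\boldsymbol{\mu}) + \frac{1}{2}\sum_{k,l}\left.\frac{\partial^{2} a_j}{\partial x_k\,\partial x_l}\right|_{\boldsymbol{\mu}} \sigma_{kl} \right]

    so any nonlinear propensity feeds the (co)variance back into the mean equation, a coupling that deterministic ODE models and the LNA do not capture.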

  17. Alcohol expectancy multiaxial assessment: a memory network-based approach.

    PubMed

    Goldman, Mark S; Darkes, Jack

    2004-03-01

    Despite several decades of activity, alcohol expectancy research has yet to merge measurement approaches with developing memory theory. This article offers an expectancy assessment approach built on a conceptualization of expectancy as an information processing network. The authors began with multidimensional scaling models of expectancy space, which served as heuristics to suggest confirmatory factor analytic dimensional models for entry into covariance structure predictive models. It is argued that this approach permits a relatively thorough assessment of the broad range of potential expectancy dimensions in a format that is very flexible in terms of instrument length and specificity versus breadth of focus. ((c) 2004 APA, all rights reserved)

  18. The areal reduction factor: A new analytical expression for the Lazio Region in central Italy

    NASA Astrophysics Data System (ADS)

    Mineo, C.; Ridolfi, E.; Napolitano, F.; Russo, F.

    2018-05-01

    For the study and modeling of hydrological phenomena, both in urban and rural areas, a proper estimation of the areal reduction factor (ARF) is crucial. In this paper, we estimated the ARF from observed rainfall data as the ratio between the average rainfall occurring in a specific area and the point rainfall. Then, we compared the obtained ARF values with some of the most widespread empirical approaches in the literature, which are used when rainfall observations are not available. Results highlight that the literature formulations can lead to a substantial over- or underestimation of the ARF estimated from observed data. These findings can have severe consequences, especially in the design of hydraulic structures where empirical formulations are extensively applied. The aim of this paper is to present a new analytical relationship with an explicit dependence on the rainfall duration and area that can better represent the ARF-area trend over the study area. The analytical curve presented here can find important application in estimating ARF values for design purposes. The test study area is the Lazio Region (central Italy).
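
    In symbols, the observation-based estimate used here can be written as (our notation)

        \mathrm{ARF}(A, d) = \frac{\bar{P}_A(d)}{P_0(d)}

    where \bar{P}_A(d) is the average rainfall of duration d over the area A and P_0(d) is the corresponding point rainfall; the new analytical expression then describes how this ratio varies with A for each duration d.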

  19. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type that would be capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.

  20. An approach to get thermodynamic properties from speed of sound

    NASA Astrophysics Data System (ADS)

    Núñez, M. A.; Medina, L. A.

    2017-01-01

    An approach for estimating thermodynamic properties of gases from the speed of sound u is proposed. The square of the speed of sound u², the compression factor Z, and the molar heat capacity at constant volume C_V are connected by two coupled nonlinear partial differential equations. Previous approaches to solving this system differ in the conditions used on the range of temperature values [T_min, T_max]. In this work we propose the use of Dirichlet boundary conditions at T_min and T_max. The virial series of the compression factor, Z = 1 + Bρ + Cρ² + …, and of other properties reduces the problem to the solution of a recursive set of linear ordinary differential equations for the virial coefficients B, C, …. Analytic solutions of the B equation for argon are used to study the stability of our approach and of previous ones under perturbation errors in the input data. The results show that the approach yields B with a relative error bounded essentially by that of the boundary values, whereas the error of other approaches can be some orders of magnitude larger.

  1. Headspace versus direct immersion solid phase microextraction in complex matrixes: investigation of analyte behavior in multicomponent mixtures.

    PubMed

    Gionfriddo, Emanuela; Souza-Silva, Érica A; Pawliszyn, Janusz

    2015-08-18

    This work aims to investigate the behavior of analytes in complex mixtures and matrixes with the use of solid-phase microextraction (SPME). Various factors that influence analyte uptake, such as coating chemistry, extraction mode, the physicochemical properties of the analytes, and matrix complexity, were considered. At first, an aqueous system containing analytes bearing different hydrophobicities, molecular weights, and chemical functionalities was investigated by using commercially available liquid and solid porous coatings. The differences in the mass transfer mechanisms resulted in a more pronounced occurrence of coating saturation in headspace mode. Contrariwise, direct immersion extraction minimizes the occurrence of artifacts related to coating saturation and provides enhanced extraction of polar compounds. In addition, matrix-compatible PDMS-modified solid coatings, characterized by a new morphology that avoids coating fouling, were compared to their nonmodified analogues. The obtained results indicate that PDMS-modified coatings reduce artifacts associated with coating saturation, even in headspace mode. This factor, coupled with their matrix compatibility, makes the use of direct SPME very practical as a quantification approach and the best choice for metabolomics studies where wide coverage is intended. To further understand analyte uptake in a system where additional interactions occur due to matrix components, ex vivo and in vivo sampling conditions were simulated using a starch matrix model, with the aim of mimicking plant-derived materials. Our results corroborate the fact that matrix handling can affect analyte/matrix equilibria, with consequent release of high concentrations of previously bound hydrophobic compounds, potentially leading to coating saturation. Direct immersion SPME limited the occurrence of these artifacts, which confirms the suitability of SPME for in vivo applications. These findings shed light on the implementation of in vivo SPME strategies in quantitative metabolomics studies of complex plant-based systems.

  2. The Measurement of Economic, Social and Environmental Performance of Countries: A Novel Approach

    ERIC Educational Resources Information Center

    Cracolici, Maria Francesca; Cuffaro, Miranda; Nijkamp, Peter

    2010-01-01

    This paper presents a new analytical framework for assessing spatial disparities among countries. It takes for granted that the analysis of a country's performance cannot be limited solely to either economic or social factors. The aim of the paper is to combine relevant economic and "non-economic" (mainly social) aspects of a country's performance…

  3. Three Groups' Perception of Broadcasting in the Public Interest: A Factor Analytical Approach to Definition.

    ERIC Educational Resources Information Center

    Brown, Barbara

    Since the Federal Communications Commission is to be a regulation service in the public interest, several studies investigated what several Midwestern American groups would consider "in public interest." The study began in 1991 with an examination of college students' attitudes. A second part of the study (in 1992) administered…

  4. Handling Missing Data in Educational Research Using SPSS

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb

    2012-01-01

    This study looked at the effect of a number of factors such as the choice of analytical method, the handling method for missing data, sample size, and proportion of missing data, in order to evaluate the effect of missing data treatment on accuracy of estimation. In order to accomplish this a methodological approach involving simulated data was…

  5. Using Primary Sources to Teach Civil War History: A Case Study in Pedagogical Decision Making

    ERIC Educational Resources Information Center

    Snook, David L.

    2017-01-01

    This exploratory study combined the process of modified analytic induction with a mixed methods approach to analyze various factors that affected or might have affected participating teachers' decisions to use or not use various primary source based teaching strategies to teach historical thinking skills. Four participating eighth and ninth grade…

  6. The Shiver-Shimmer Factor: Musical Spirituality, Emotion, and Education

    ERIC Educational Resources Information Center

    Bogdan, Deanne

    2010-01-01

    This article offers one approach to exploring the question of in what sense music educators can speak of music and its moving power as spiritual by inquiring into what might count as a "musical spiritual experience" in emotional terms. The essay's analytic framework employs the distinction between two related concepts which I call the "shiver" and…

  7. Validity of Particle-Counting Method Using Laser-Light Scattering for Detecting Platelet Aggregation in Diabetic Patients

    NASA Astrophysics Data System (ADS)

    Nakadate, Hiromichi; Sekizuka, Eiichi; Minamitani, Haruyuki

    We aimed to study the validity of a new analytical approach that reflected the phase from platelet activation to the formation of small platelet aggregates. We hoped that this new approach would enable us to use the particle-counting method with laser-light scattering to measure platelet aggregation in healthy controls and in diabetic patients without complications. We measured agonist-induced platelet aggregation for 10 min. Agonist was added to the platelet-rich plasma 1 min after measurement started. We compared the total scattered light intensity from small aggregates over a 10-min period (established analytical approach) and that over a 2-min period from 1 to 3 min after measurement started (new analytical approach). Consequently platelet aggregation in diabetics with HbA1c ≥ 6.5% was significantly greater than in healthy controls by both analytical approaches. However, platelet aggregation in diabetics with HbA1c < 6.5%, i.e. patients in the early stages of diabetes, was significantly greater than in healthy controls only by the new analytical approach, not by the established analytical approach. These results suggest that platelet aggregation as detected by the particle-counting method using laser-light scattering could be applied in clinical examinations by our new analytical approach.

  8. Geometric factor and influence of sensors in the establishment of a resistivity-moisture relation in soil samples

    NASA Astrophysics Data System (ADS)

    López-Sánchez, M.; Mansilla-Plaza, L.; Sánchez-de-laOrden, M.

    2017-10-01

    Prior to field-scale research, soil samples are analysed at the laboratory scale for electrical resistivity calibrations. Currently, there are a variety of field instruments to estimate the water content in soils using different physical phenomena. These instruments can be used to develop moisture-resistivity relationships on the same soil samples. This ensures that measurements are performed on the same material and under the same conditions (e.g., humidity and temperature). A geometric factor, which depends on the electrode locations, is applied in order to calculate the apparent electrical resistivity of the laboratory test cells. This geometric factor can be determined in three different ways: by an analytical approximation, by laboratory trials (experimental approximation), or by the analysis of a numerical model. The first, the analytical approximation, is not appropriate for complex cells or arrays, and both the experimental and the numerical approximations can lead to inaccurate results. Therefore, we propose a novel approach that obtains a compromise solution between both techniques, providing a more precise determination of the geometric factor.
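
    For orientation, the textbook expression for a four-electrode measurement on the surface of a homogeneous half-space (not the laboratory-cell factor discussed in the paper, which finite cell boundaries modify) is

        \rho_a = k\,\frac{\Delta V}{I},
        \qquad
        k = 2\pi\left(\frac{1}{AM} - \frac{1}{BM} - \frac{1}{AN} + \frac{1}{BN}\right)^{-1}

    where A and B are the current electrodes, M and N the potential electrodes, and AM, BM, AN, BN the corresponding electrode separations; for a test cell, this analytical k is precisely what the experimental or numerical approximations are meant to replace.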

  9. Dynamic calibration approach for determining catechins and gallic acid in green tea using LC-ESI/MS.

    PubMed

    Bedner, Mary; Duewer, David L

    2011-08-15

    Catechins and gallic acid are antioxidant constituents of Camellia sinensis, or green tea. Liquid chromatography with both ultraviolet (UV) absorbance and electrospray ionization mass spectrometric (ESI/MS) detection was used to determine catechins and gallic acid in three green tea matrix materials that are commonly used as dietary supplements. The results from both detection modes were evaluated with 14 quantitation models, all of which were based on the analyte response relative to an internal standard. Half of the models were static, where quantitation was achieved with calibration factors that were constant over an analysis set. The other half were dynamic, with calibration factors calculated from response factor data interpolated to the time each sample was injected, to correct for potential variations in analyte response over time. For all analytes, the relatively nonselective UV responses were found to be very stable over time and independent of the calibrant concentration; comparable results with low variability were obtained regardless of the quantitation model used. Conversely, the highly selective MS responses were found to vary both with time and as a function of the calibrant concentration. A dynamic quantitation model based on polynomial data-fitting was used to reduce the variability in the quantitative results obtained from the MS data.
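
    A minimal sketch of the dynamic idea (assuming simple linear interpolation of response factors between calibrant injections; the paper used polynomial data-fitting, and all numbers below are hypothetical):

        import numpy as np

        def dynamic_calibration(cal_times, cal_rf, sample_times, sample_ratio):
            """Quantify samples with time-interpolated internal-standard response factors.

            cal_times    -- injection times of the calibrant (e.g., minutes into the run)
            cal_rf       -- response factor at those times
                            (analyte area / internal-standard area, per unit concentration)
            sample_times -- injection times of the samples
            sample_ratio -- analyte/internal-standard area ratio measured in each sample
            Returns the estimated concentrations.
            """
            rf_at_sample = np.interp(sample_times, cal_times, cal_rf)
            return np.asarray(sample_ratio, dtype=float) / rf_at_sample

        # Hypothetical run in which the ESI/MS response drifts downward over six hours.
        cal_times = np.array([0.0, 60.0, 120.0, 180.0, 240.0, 300.0, 360.0])
        cal_rf = np.array([1.00, 0.97, 0.92, 0.90, 0.86, 0.84, 0.80])
        print(dynamic_calibration(cal_times, cal_rf, [45.0, 210.0], [0.48, 0.44]))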

  10. Concentrations of metals in tissues of lowbush blueberry (Vaccinium angustifolium) near a copper-nickel smelter at Sudbury, Ontario, Canada: A factor analytic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagatto, G.; Shorthouse, J.D.; Crowder, A.A.

    1993-10-01

    Ecosystems damaged by emissions from the copper-nickel smelters of Inco and Falconbridge Ltd. near Sudbury, Ontario, Canada have provided a unique opportunity to study the effects of metal particulates and sulphur dioxide fumigations on plant and animal communities. The most infamous terrain in the Sudbury region is nearest the smelters (two active and one closed), where nearly all vegetation has been destroyed and soils eroded and contaminated. However, over the past twenty years, some species of plants have developed a tolerance to polluted soils and some denuded lands have been naturally and artificially revegetated. Furthermore, a series of unique anthropogenic forests have developed away from the smelters. Several studies on the accumulation of metals in plant tissues indicate that the levels of metals are usually highest closest to the smelters. Consequently, several studies have reported high correlations between plant concentrations of certain metals and distance from the source of pollution. However, tissue metal burdens are not always correlated with distance from the emission source, suggesting that other biological and physico-chemical factors may influence tissue metal burdens in the Sudbury habitat. The present study provides information on the metal burdens in another plant, lowbush blueberry, growing both near and away from the smelters. This study assesses the apparent influence of the Sudbury smelting operations on plant tissue burdens of five additional elements, along with copper and nickel, by using a factor analytic approach. This approach allows determination of the underlying factors that govern tissue metal burdens in a polluted environment and helps to refine the future direction of research in the Sudbury ecosystem. 12 refs., 2 tabs.
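
    To illustrate the general shape of such a factor analytic screening (a generic sketch with scikit-learn and simulated concentrations, not the study's data or software; the element groupings and loadings are hypothetical):

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Simulated tissue concentrations for 7 elements across 100 sampling sites:
        # a hypothetical "smelter" factor drives the first three elements, a "soil"
        # factor drives the next two, and the rest are mostly noise.
        rng = np.random.default_rng(3)
        smelter = rng.normal(size=(100, 1))
        soil = rng.normal(size=(100, 1))
        noise = rng.normal(scale=0.3, size=(100, 7))
        loadings = np.array([
            [0.9, 0.8, 0.7, 0.1, 0.0, 0.2, 0.1],   # smelter factor
            [0.0, 0.1, 0.2, 0.8, 0.9, 0.3, 0.2],   # soil factor
        ])
        X = smelter @ loadings[:1] + soil @ loadings[1:] + noise

        fa = FactorAnalysis(n_components=2, random_state=0)
        fa.fit(X)
        print(fa.components_)   # estimated loadings: which elements vary together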

  11. Interaction of a conductive crack and of an electrode at a piezoelectric bimaterial interface

    NASA Astrophysics Data System (ADS)

    Onopriienko, Oleg; Loboda, Volodymyr; Sheveleva, Alla; Lapusta, Yuri

    2018-06-01

    The interaction of a conductive crack and an electrode at a piezoelectric bimaterial interface is studied. The bimaterial is subjected to an in-plane electrical field parallel to the interface and an anti-plane mechanical loading. The problem is formulated and reduced, via the application of sectionally analytic vector functions, to a combined Dirichlet-Riemann boundary value problem. Simple analytical expressions for the stress, the electric field, and their intensity factors, as well as for the crack faces' displacement jump, are derived. Our numerical results illustrate the proposed approach and permit some conclusions to be drawn on the crack-electrode interaction.

  12. Properties of water as a novel stationary phase in capillary gas chromatography.

    PubMed

    Gallant, Jonathan A; Thurbide, Kevin B

    2014-09-12

    A novel method of separation that uses water as a stationary phase in capillary gas chromatography (GC) is presented. By applying a water phase to the interior walls of a stainless steel capillary, good separations were obtained for a large variety of analytes in this format. It was found that carrier gas humidification and backpressure were key factors in promoting stable operation over time at various temperatures. For example, with these measures in place, the retention time of an acetone test analyte was found to decrease by only 44 s after 100 min of operation at a column temperature of 100 °C. In terms of efficiency, under optimum conditions the method produced about 20,000 plates for an acetone test analyte on a 250 μm i.d. × 30 m column. Overall, retention on the stationary phase generally increased with analyte water solubility and polarity, but correlated relatively little with analyte volatility. Conversely, non-polar analytes were essentially unretained in the system. These features were applied to the direct analysis of different polar analytes in both aqueous and organic samples. Results suggest that this approach could provide an interesting alternative tool in capillary GC separations. Copyright © 2014 Elsevier B.V. All rights reserved.
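
    For context, plate numbers of the kind quoted above are conventionally computed from a peak's retention time t_R and its width at half height w_{1/2} (the abstract does not state which estimator was used, so this is given only as the common half-height formula):

        N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^{2}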

  13. Stimulus overselectivity four decades later: a review of the literature and its implications for current research in autism spectrum disorder.

    PubMed

    Ploog, Bertram O

    2010-11-01

    This review of several topics related to "stimulus overselectivity" (Lovaas et al., J Abnormal Psychol 77:211-222, 1971) has three main purposes: (1) to outline the factors that may contribute to overselectivity; (2) to link the behavior-analytical notion of overselectivity to current nonbehavior-analytical research and theory; and (3) to suggest remedial strategies based on the behavior-analytical approach. While it is clear that overselectivity is not specific to autism spectrum disorder (ASD) and also that not all persons with ASD exhibit overselectivity, it is prevalent in ASD and has critical implications for symptoms, treatment, research, and theory. Weak Central Coherence and Enhanced Perceptual Functioning theories are briefly considered. The research areas addressed here include theory of mind, joint attention, language development, and executive function.

  14. Application of person-centered analytic methodology in longitudinal research: exemplars from the Women's Health Initiative Clinical Trial data.

    PubMed

    Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea

    2014-02-01

    Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.

  15. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.

  16. Construct validity of the Beck Hopelessness Scale (BHS) among university students: A multitrait-multimethod approach.

    PubMed

    Boduszek, Daniel; Dhingra, Katie

    2016-10-01

    There is considerable debate about the underlying factor structure of the Beck Hopelessness Scale (BHS) in the literature. An established view is that it reflects a unitary or bidimensional construct in nonclinical samples. There are, however, reasons to reconsider this conceptualization. Based on previous factor analytic findings from both clinical and nonclinical studies, the aim of the present study was to compare 16 competing models of the BHS in a large university student sample (N = 1,733). Sixteen distinct factor models were specified and tested using conventional confirmatory factor analytic techniques, along with confirmatory bifactor modeling. A 3-factor solution with 2 method effects (i.e., a multitrait-multimethod model) provided the best fit to the data. The reliability of this conceptualization was supported by McDonald's coefficient omega and the differential relationships exhibited between the 3 hopelessness factors ("feelings about the future," "loss of motivation," and "future expectations") and measures of goal disengagement, brooding rumination, suicide ideation, and suicide attempt history. The results provide statistical support for a 3-trait and 2-method factor model, and hence the 3 dimensions of hopelessness theorized by Beck. The theoretical and methodological implications of these findings are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
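
    For readers unfamiliar with the reliability coefficient mentioned above, here is a minimal sketch of McDonald's omega for a single factor with standardized loadings and uncorrelated uniquenesses (a generic formula, not the study's exact bifactor computation; the loadings below are hypothetical):

        import numpy as np

        def mcdonalds_omega(loadings):
            """McDonald's omega from standardized factor loadings.

            Assumes one factor and uncorrelated uniquenesses:
            omega = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2)).
            """
            lam = np.asarray(loadings, dtype=float)
            common = lam.sum() ** 2
            unique = np.sum(1.0 - lam ** 2)
            return common / (common + unique)

        # Hypothetical standardized loadings for a "loss of motivation" factor.
        print(mcdonalds_omega([0.71, 0.65, 0.58, 0.62]))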

  17. Separating method factors and higher order traits of the Big Five: a meta-analytic multitrait-multimethod approach.

    PubMed

    Chang, Luye; Connelly, Brian S; Geeza, Alexis A

    2012-02-01

    Though most personality researchers now recognize that ratings of the Big Five are not orthogonal, the field has been divided about whether these trait intercorrelations are substantive (i.e., driven by higher order factors) or artifactual (i.e., driven by correlated measurement error). We used a meta-analytic multitrait-multirater study to estimate trait correlations after common method variance was controlled. Our results indicated that common method variance substantially inflates trait correlations, and, once controlled, correlations among the Big Five became relatively modest. We then evaluated whether two different theories of higher order factors could account for the pattern of Big Five trait correlations. Our results did not support Rushton and colleagues' (Rushton & Irwing, 2008; Rushton et al., 2009) proposed general factor of personality, but Digman's (1997) α and β metatraits (relabeled by DeYoung, Peterson, and Higgins (2002) as Stability and Plasticity, respectively) produced a viable fit. However, our models showed considerable overlap between Stability and Emotional Stability and between Plasticity and Extraversion, raising the question of whether these metatraits are redundant with their dominant Big Five traits. This pattern of findings was robust when we included only studies whose observers were intimately acquainted with targets. Our results underscore the importance of using a multirater approach to studying personality and the need to separate the causes and outcomes of higher order metatraits from those of the Big Five. We discussed the implications of these findings for the array of research fields in which personality is studied.

  18. Approximate Formula for the Vertical Asymptote of Projectile Motion in Midair

    ERIC Educational Resources Information Center

    Chudinov, Peter Sergey

    2010-01-01

    The classic problem of the motion of a point mass (projectile) thrown at an angle to the horizon is reviewed. The air drag force is taken into account with the drag factor assumed to be constant. An analytical approach is used for the investigation. An approximate formula is obtained for one of the characteristics of the motion--the vertical…

  19. Removal of iron interferences by solvent extraction for geochemical analysis by atomic-absorption spectrophotometry

    USGS Publications Warehouse

    Zhou, L.; Chao, T.T.; Sanzolone, R.F.

    1985-01-01

    Iron is a common interferent in the determination of many elements in geochemical samples. Two approaches for its removal have been taken. The first involves removal of iron by extraction with methyl isobutyl ketone (MIBK) from hydrochloric acid medium, leaving the analytes in the aqueous phase. The second consists of reduction of iron(III) to iron(II) by ascorbic acid to minimize its extraction into MIBK, so that the analytes may be isolated by extraction. Elements of interest can then be determined using the aqueous solution or the organic extract, as appropriate. Operating factors such as the concentration of hydrochloric acid, amounts of iron present, number of extractions, the presence or absence of a salting-out agent, and the optimum ratio of ascorbic acid to iron have been determined. These factors have general applications in geochemical analysis by atomic-absorption spectrophotometry. © 1985.

  20. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a large number of accidents, most of which are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) is essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of that method is that it ignores the subjective uncertainties arising from the linguistic input values of some factors, and it suffers from low resolution, fixed weighting, and sharp class boundaries. To remedy these deficiencies and improve the earlier method, this paper presents a novel methodology for assessing RFS using a fuzzy approach. The fuzzy approach provides an effective tool for handling subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during development of the method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that the methodology is effective and efficient in assessing RFS.
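
    To make the weighting step concrete, the sketch below computes classical (crisp) AHP priority weights and a consistency ratio from a pairwise comparison matrix; the fuzzy AHP used in the paper replaces such crisp judgements with fuzzy numbers, which is not shown here. The 3x3 comparison matrix is hypothetical.

```python
# Sketch of classical (crisp) AHP weighting, the backbone that fuzzy AHP extends
# with fuzzy pairwise judgements. The 3x3 comparison matrix below is hypothetical.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])            # pairwise comparisons of three risk factors

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # priority weights, sum to 1

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)            # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's tabulated random index
CR = CI / RI                               # consistency ratio; < 0.1 is usually acceptable

print("weights:", np.round(w, 3), "CR:", round(CR, 3))
```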

  1. Genome wide approaches to identify protein-DNA interactions.

    PubMed

    Ma, Tao; Ye, Zhenqing; Wang, Liguo

    2018-05-29

    Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques for studying genome-wide in vivo protein-DNA interactions. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is now deprecated and ChIP-seq has become the most widely used technique for identifying transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to a single nucleotide. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq, and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, so each inherently includes its own set of statistical assumptions and biases; choosing the most appropriate analytic program for a given experiment therefore requires careful consideration. Moreover, most programs have only a command-line interface, so their installation and use require basic computational expertise in Unix/Linux. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
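
    As an illustration of the statistics behind peak detection in such data, the sketch below applies the Poisson enrichment test that underlies many ChIP-seq peak callers: a window is flagged when its ChIP read count is improbably high given a depth-scaled background estimated from the control. It is a generic sketch, not the algorithm of any particular tool, and the counts are hypothetical.

```python
# Minimal sketch of the Poisson enrichment test that underlies many ChIP-seq
# peak callers: is the ChIP read count in a window surprisingly high given the
# background rate estimated from the control? Counts here are hypothetical.
from scipy.stats import poisson

def window_pvalue(chip_count, control_count, chip_depth, control_depth):
    """One-sided Poisson p-value for enrichment of a genomic window."""
    expected = control_count * (chip_depth / control_depth)   # depth-scaled background
    expected = max(expected, 1e-3)                            # avoid a zero rate
    return poisson.sf(chip_count - 1, expected)               # P(X >= chip_count)

print(window_pvalue(chip_count=85, control_count=20,
                    chip_depth=2.0e7, control_depth=2.5e7))
```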

  2. Technosocial Modeling of IED Threat Scenarios and Attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Brothers, Alan J.; Coles, Garill A.

    2009-03-23

    This paper describes an approach for integrating sociological and technical models to develop a more complete threat assessment. Current approaches to analyzing and addressing threats tend to focus on the technical factors. This paper addresses the development of predictive models that encompass behavioral as well as technical factors. Using improvised explosive device (IED) attacks as motivation, the model supports identification of intervention activities 'left of boom' as well as prioritization of attack modalities. We show how Bayes nets integrate social factors associated with IED attacks into a general threat model containing the technical and organizational steps from planning through obtaining the IED to initiation of the attack. The social models are computationally based representations of the relevant social science literature describing human decision making and physical factors. When combined with technical models, the resulting model provides improved knowledge integration into threat assessment for monitoring. This paper discusses the construction of IED threat scenarios, the integration of diverse factors into an analytical framework for threat assessment, indicator identification for future threats, and future research directions.

  3. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
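
    The sketch below computes the cosine similarity score between a reference shape and noisy observations and sets a detection threshold empirically from simulated normal cycles; the paper's contribution is an analytical, noise-level-dependent form of that threshold, which is not reproduced here. Signals and noise level are synthetic.

```python
# Sketch of the cosine similarity score between a reference shape and a noisy
# observation. The paper derives the score's distribution analytically as a
# function of noise level; here an empirical percentile stands in for that
# analytical threshold. Signals are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
reference = np.sin(2 * np.pi * t)                 # reference shape (e.g., an ECG-like cycle)

def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

sigma = 0.8                                       # strong measurement noise
normal_scores = [cosine_similarity(reference, reference + sigma * rng.standard_normal(t.size))
                 for _ in range(2000)]
threshold = np.percentile(normal_scores, 1)       # empirical 1% threshold at this noise level

anomaly = np.sin(2 * np.pi * t + 0.9) + sigma * rng.standard_normal(t.size)
print("threshold:", round(threshold, 3),
      "anomaly score:", round(cosine_similarity(reference, anomaly), 3))
```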

  4. Volkov basis for simulation of interaction of strong laser pulses and solids

    NASA Astrophysics Data System (ADS)

    Kidd, Daniel; Covington, Cody; Li, Yonghui; Varga, Kálmán

    2018-01-01

    An efficient and accurate basis comprised of Volkov states is implemented and tested for time-dependent simulations of interactions between strong laser pulses and crystalline solids. The Volkov states are eigenstates of the free electron Hamiltonian in an electromagnetic field and analytically represent the rapidly oscillating time-dependence of the orbitals, allowing significantly faster time propagation than conventional approaches. The Volkov approach can be readily implemented in plane-wave codes by multiplying the potential energy matrix elements with a simple time-dependent phase factor.
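
    The sketch below illustrates, in one dimension and atomic units, the kind of time-dependent Volkov phase referred to above: phi_k(t) = 1/2 ∫ (k + A(t'))^2 dt', whose difference between two momenta multiplies a plane-wave potential matrix element V_{k'k}. The pulse shape, amplitude, and momenta are toy values, not taken from the paper.

```python
# Hedged sketch (atomic units, 1D, velocity gauge): the Volkov phase
# phi_k(t) = 1/2 * integral of (k + A(t'))^2 dt', and the factor
# exp(i*(phi_kp - phi_k)) multiplying a plane-wave potential matrix element
# V_{k'k}. The laser pulse A(t) below is a toy example, not from the paper.
import numpy as np

t = np.linspace(0.0, 400.0, 4001)                           # time grid (a.u.)
dt = t[1] - t[0]
A0, omega = 0.5, 0.057                                      # toy amplitude and frequency
A = A0 * np.sin(np.pi * t / t[-1])**2 * np.cos(omega * t)   # sin^2-envelope pulse

def volkov_phase(k, A, dt):
    """Cumulative Volkov phase phi_k(t) on the time grid (trapezoidal rule)."""
    integrand = 0.5 * (k + A)**2
    return np.concatenate(([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * dt)))

k, kp = 0.3, 0.8                                            # two plane-wave momenta
phase_factor = np.exp(1j * (volkov_phase(kp, A, dt) - volkov_phase(k, A, dt)))
print(phase_factor[-1])                                     # factor multiplying V_{k'k} at final time
```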

  5. Transfer function concept for ultrasonic characterization of material microstructures

    NASA Technical Reports Server (NTRS)

    Vary, A.; Kautz, H. E.

    1986-01-01

    The approach given depends on treating material microstructures as elastomechanical filters that have analytically definable transfer functions. These transfer functions can be defined in terms of the frequency dependence of the ultrasonic attenuation coefficient. The transfer function concept provides a basis for synthesizing expressions that characterize polycrystalline materials relative to microstructural factors such as mean grain size, grain-size distribution functions, and grain boundary energy transmission. Although the approach is nonrigorous, it leads to a rational basis for combining the previously mentioned diverse and fragmented equations for ultrasonic attenuation coefficients.

  6. Modeling evolution of dark matter substructure and annihilation boost

    NASA Astrophysics Data System (ADS)

    Hiroshima, Nagisa; Ando, Shin'ichiro; Ishiyama, Tomoaki

    2018-06-01

    We study the evolution of dark matter substructures, especially how they lose mass and change density profile after they fall into the gravitational potential of larger host halos. We develop an analytical prescription that models the subhalo mass evolution and calibrate it to results of N-body numerical simulations of various scales, from very small (Earth size) to large (galaxies to clusters) halos. We then combine the results with halo accretion histories and calculate a subhalo mass function that is physically motivated down to Earth-mass scales. Our results—valid for arbitrary host masses and redshifts—have reasonable agreement with those of numerical simulations at resolved scales. Our analytical model also enables self-consistent calculations of the boost factor of dark matter annihilation, which we find to increase from tens of percent at the smallest (Earth) and intermediate (dwarf) masses to a factor of several at galaxy size, and to become as large as a factor of ~10 for the largest halos (clusters) at small redshifts. Our analytical approach can accommodate substructures in the subhalos (sub-subhalos) in a consistent framework, which we find to give up to a factor of a few enhancement to the annihilation boost. The presence of the subhalos enhances the intensity of the isotropic gamma-ray background by a factor of a few, and as a result, the measurement by the Fermi Large Area Telescope excludes annihilation cross sections greater than ~4 × 10^-26 cm^3 s^-1 for dark matter masses up to ~200 GeV.

  7. Accurate potentiometric determination of lipid membrane-water partition coefficients and apparent dissociation constants of ionizable drugs: electrostatic corrections.

    PubMed

    Elsayed, Mustafa M A; Vierl, Ulrich; Cevc, Gregor

    2009-06-01

    Potentiometric lipid membrane-water partition coefficient studies have neglected electrostatic interactions to date; this leads to incorrect results. We herein show how to account properly for such interactions in potentiometric data analysis. We conducted potentiometric titration experiments to determine lipid membrane-water partition coefficients of four illustrative drugs: bupivacaine, diclofenac, ketoprofen and terbinafine. We then analyzed the results conventionally and with an improved analytical approach that considers Coulombic electrostatic interactions. The new analytical approach delivers robust partition coefficient values. In contrast, the conventional data analysis yields apparent partition coefficients of the ionized drug forms that depend on experimental conditions (mainly the lipid-drug ratio and the bulk ionic strength). This is due to changing electrostatic effects originating from bound drug and/or lipid charges. A membrane comprising 10 mol-% mono-charged molecules in a 150 mM (monovalent) electrolyte solution yields results that differ by a factor of 4 from those for uncharged membranes. Allowance for the Coulombic electrostatic interactions is a prerequisite for accurate and reliable determination of lipid membrane-water partition coefficients of ionizable drugs from potentiometric titration data. The same conclusion applies to all analytical methods involving drug binding to a surface.

  8. Decoding the mechanical fingerprints of biomolecules.

    PubMed

    Dudko, Olga K

    2016-01-01

    The capacity of biological macromolecules to act as exceedingly sophisticated and highly efficient cellular machines - switches, assembly factors, pumps, or motors - is realized through their conformational transitions, that is, their folding into distinct shapes and selective binding to other molecules. Conformational transitions can be induced, monitored, and manipulated by pulling individual macromolecules apart with an applied force. Pulling experiments reveal, for a given biomolecule, the relationship between applied force and molecular extension. Distinct signatures in the force-extension relationship identify a given biomolecule and thus serve as the molecule's 'mechanical fingerprints'. But, how can these fingerprints be decoded to uncover the energy barriers crossed by the molecule in the course of its conformational transition, as well as the associated timescales? This review summarizes a powerful class of approaches to interpreting single-molecule force spectroscopy measurements - namely, analytically tractable approaches. On the fundamental side, analytical theories have the power to reveal the unifying principles underneath the bewildering diversity of biomolecules and their behaviors. On the practical side, analytical expressions that result from these theories are particularly well suited for a direct fit to experimental data, yielding the important parameters that govern biological processes at the molecular level.
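
    One widely cited analytical result in this class is the Dudko-Hummer-Szabo expression for the force-dependent escape rate over a single barrier; the sketch below evaluates it for a few pulling forces. The parameter values are hypothetical and chosen only for illustration.

```python
# Sketch of one widely cited analytical result in this class: the
# Dudko-Hummer-Szabo expression for the force-dependent escape rate k(F).
# Parameter values below are hypothetical, chosen only for illustration.
import numpy as np

def dhs_rate(F, k0, x_dd, dG, nu=2/3, kT=4.114):
    """k(F) in the Dudko-Hummer-Szabo model.
    F in pN, x_dd (distance to the barrier) in nm, dG (barrier height) and kT in pN*nm.
    nu = 1 (Bell), 2/3 (linear-cubic), or 1/2 (cusp) free-energy surface."""
    u = 1.0 - nu * F * x_dd / dG
    u = np.clip(u, 1e-12, None)            # guard against the barrier vanishing
    return k0 * u**(1.0/nu - 1.0) * np.exp((dG / kT) * (1.0 - u**(1.0/nu)))

forces = np.array([5.0, 10.0, 15.0, 20.0])               # pN
print(dhs_rate(forces, k0=1e-3, x_dd=0.4, dG=80.0))      # s^-1, toy parameters
```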

  9. Analytical modeling of light transport in scattering materials with strong absorption.

    PubMed

    Meretska, M L; Uppu, R; Vissenberg, G; Lagendijk, A; Ijzerman, W L; Vos, W L

    2017-10-02

    We have investigated the transport of light through slabs that both scatter and strongly absorb, a situation that occurs in diverse application fields ranging from biomedical optics and powder technology to solid-state lighting. In particular, we study the transport of light in the visible wavelength range between 420 and 700 nm through silicone plates filled with YAG:Ce3+ phosphor particles, which even re-emit absorbed light at different wavelengths. We measure the total transmission, the total reflection, and the ballistic transmission of light through these plates. We obtain average single-particle properties, namely the scattering cross-section σ_s, the absorption cross-section σ_a, and the anisotropy factor µ, using an analytical approach, namely the P3 approximation to the radiative transfer equation. We verify the extracted transport parameters using Monte-Carlo simulations of the light transport. Our approach fully describes the light propagation in phosphor diffuser plates that are used in white LEDs and that exhibit strong absorption (L/l_a > 1) up to L/l_a = 4, where L is the slab thickness and l_a is the absorption mean free path. In contrast, the widely used diffusion theory fails to describe this parameter range. Our approach is a suitable analytical tool for industry, since it provides a fast yet accurate determination of key transport parameters, and since it introduces predictive power into the design process of white light emitting diodes.

  10. Exploring the Different Trajectories of Analytical Thinking Ability Factors: An Application of the Second-Order Growth Curve Factor Model

    ERIC Educational Resources Information Center

    Saengprom, Narumon; Erawan, Waraporn; Damrongpanit, Suntonrapot; Sakulku, Jaruwan

    2015-01-01

    The purposes of this study were 1) Compare analytical thinking ability by testing the same sets of students 5 times 2) Develop and verify whether analytical thinking ability of students corresponds to second-order growth curve factors model. Samples were 1,093 eighth-grade students. The results revealed that 1) Analytical thinking ability scores…

  11. Correlation between polymerization shrinkage stress and C-factor depends upon cavity compliance.

    PubMed

    Wang, Zhengzhi; Chiang, Martin Y M

    2016-03-01

    The literature reports inconsistent results on using the configuration factor (C-factor) as an indicator of the polymerization shrinkage stress (PS) generated by dental restorative composites under the constraint of the cavity configuration. The current study aimed at unraveling the complex effects of the C-factor on PS using analytical and experimental approaches together, so that the reported inconsistency can be explained and the clinical significance of the C-factor can be assessed comprehensively. Analytical models based on linear elasticity were established to predict the PS measured in instruments (testing systems) with different compliances, and the complex effects of the C-factor on PS were derived. The analyses were validated by experiments using a cantilever beam-based instrument with systematic variation of the instrumental compliance. As a general trend, PS decreased with increasing C-factor when measured by instruments with high compliance. However, this trend gradually diminished and eventually reversed (PS increased with increasing C-factor) as the system compliance was decreased. Our study indicates that the correlation between PS and the C-factor is highly dependent on the compliance of the testing instrument used for PS measurement. This suggests that the current concept of the C-factor's role in stress development and transmission to tooth structures, namely that a higher C-factor produces higher PS owing to the reduced flow capacity of more confined material, can be misleading. Thus, the compliance of the prepared tooth (cavity) structure should also be considered when evaluating the effect of the C-factor on PS. Published by Elsevier Ltd.

  12. Multidimensional assessment of awareness in early-stage dementia: a cluster analytic approach.

    PubMed

    Clare, Linda; Whitaker, Christopher J; Nelis, Sharon M; Martyr, Anthony; Markova, Ivana S; Roth, Ilona; Woods, Robert T; Morris, Robin G

    2011-01-01

    Research on awareness in dementia has yielded variable and inconsistent associations between awareness and other factors. This study examined awareness using a multidimensional approach and applied cluster analytic techniques to identify associations between the level of awareness and other variables. Participants were 101 individuals with early-stage dementia (PwD) and their carers. Explicit awareness was assessed at 3 levels: performance monitoring in relation to memory; evaluative judgement in relation to memory, everyday activities and socio-emotional functioning; and metacognitive reflection in relation to the experience and impact of the condition. Implicit awareness was assessed with an emotional Stroop task. Different measures of explicit awareness scores were related only to a limited extent. Cluster analysis yielded 3 groups with differing degrees of explicit awareness. These groups showed no differences in implicit awareness. Lower explicit awareness was associated with greater age, lower MMSE scores, poorer recall and naming scores, lower anxiety and greater carer stress. Multidimensional assessment offers a more robust approach to classifying PwD according to level of awareness and hence to examining correlates and predictors of awareness. Copyright © 2011 S. Karger AG, Basel.
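
    The sketch below illustrates the generic cluster-analytic step: standardizing a set of awareness scores and grouping participants with k-means into three clusters, mirroring the three groups reported. The data are simulated for illustration and are not the study's data.

```python
# Sketch of the general cluster-analytic step: grouping participants by their
# standardized awareness scores (k-means with 3 clusters, mirroring the 3
# groups reported). The data below are simulated, not the study's data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Columns: performance monitoring, evaluative judgement, metacognitive reflection
scores = np.vstack([rng.normal(loc=m, scale=1.0, size=(34, 3))
                    for m in (-1.0, 0.0, 1.0)])             # 102 simulated participants

X = StandardScaler().fit_transform(scores)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                                   # group sizes
```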

  13. Achieving optimal SERS through enhanced experimental design

    PubMed Central

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J.

    2016-01-01

    One of the current limitations surrounding surface‐enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal‐based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd. PMID:27587905
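
    As a small illustration of the design-of-experiments idea contrasted with one-factor-at-a-time optimisation, the sketch below enumerates a full-factorial design over a few candidate SERS factors. The factors and levels are hypothetical examples, not recommendations from the review.

```python
# Sketch of a full-factorial design over a few candidate SERS factors, as an
# alternative to one-factor-at-a-time optimisation. Factors and levels below
# are hypothetical examples.
from itertools import product

factors = {
    "aggregating_salt_mM": [10, 50, 100],
    "analyte_colloid_ratio": [0.1, 0.5, 1.0],
    "incubation_time_min": [1, 5, 15],
}

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design), "runs; first run:", design[0])            # 27 runs for a 3^3 design
```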

  14. Override the controversy: Analytic thinking predicts endorsement of evolution.

    PubMed

    Gervais, Will M

    2015-09-01

    Despite overwhelming scientific consensus, popular opinions regarding evolution are starkly divided. In the USA, for example, nearly one in three adults espouse a literal and recent divine creation account of human origins. Plausibly, resistance to scientific conclusions regarding the origins of species, like much resistance to other scientific conclusions (Bloom & Weisberg, 2007), gains support from reliably developing intuitions. Intuitions about essentialism, teleology, agency, and order may combine to make creationism potentially more cognitively attractive than evolutionary concepts. However, dual process approaches to cognition recognize that people can often analytically override their intuitions. Two large studies (total N=1324) found consistent evidence that a tendency to engage analytic thinking predicted endorsement of evolution, even controlling for relevant demographic, attitudinal, and religious variables. Meanwhile, exposure to religion predicted reduced endorsement of evolution. Cognitive style is one factor among many affecting opinions on the origin of species. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Achieving optimal SERS through enhanced experimental design.

    PubMed

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J; Goodacre, Royston

    2016-01-01

    One of the current limitations surrounding surface-enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal-based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd.

  16. Plasma biomarkers of depressive symptoms in older adults.

    PubMed

    Arnold, S E; Xie, S X; Leung, Y-Y; Wang, L-S; Kling, M A; Han, X; Kim, E J; Wolk, D A; Bennett, D A; Chen-Plotkin, A; Grossman, M; Hu, W; Lee, V M-Y; Mackin, R Scott; Trojanowski, J Q; Wilson, R S; Shaw, L M

    2012-01-03

    The pathophysiology of negative affect states in older adults is complex, and a host of central nervous system and peripheral systemic mechanisms may play primary or contributing roles. We conducted an unbiased analysis of 146 plasma analytes in a multiplex biochemical biomarker study in relation to number of depressive symptoms endorsed by 566 participants in the Alzheimer's Disease Neuroimaging Initiative (ADNI) at their baseline and 1-year assessments. Analytes that were most highly associated with depressive symptoms included hepatocyte growth factor, insulin polypeptides, pregnancy-associated plasma protein-A and vascular endothelial growth factor. Separate regression models assessed contributions of past history of psychiatric illness, antidepressant or other psychotropic medicine, apolipoprotein E genotype, body mass index, serum glucose and cerebrospinal fluid (CSF) τ and amyloid levels, and none of these values significantly attenuated the main effects of the candidate analyte levels for depressive symptoms score. Ensemble machine learning with Random Forests found good accuracy (~80%) in classifying groups with and without depressive symptoms. These data begin to identify biochemical biomarkers of depressive symptoms in older adults that may be useful in investigations of pathophysiological mechanisms of depression in aging and neurodegenerative dementias and as targets of novel treatment approaches.
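
    The sketch below mirrors the ensemble classification step described above: a random forest classifying participants with and without depressive symptoms from a matrix of plasma analyte levels, scored by cross-validation. The data are simulated noise with a weak planted signal, purely for illustration.

```python
# Sketch of the ensemble classification step: a random forest distinguishing
# participants with and without depressive symptoms from plasma analyte levels,
# evaluated by cross-validation. The analyte matrix is simulated, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n, p = 566, 146                                  # participants x plasma analytes
X = rng.standard_normal((n, p))
y = (X[:, :5].sum(axis=1) + rng.standard_normal(n) > 0).astype(int)   # toy outcome

clf = RandomForestClassifier(n_estimators=500, random_state=0)
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", acc.mean().round(3))
```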

  17. Application of analytical redundancy management to Shuttle crafts. [computerized simulation of microelectronic implementation

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Tabak, D.

    1979-01-01

    The study involves the bank of filters approach to analytical redundancy management since this is amenable to microelectronic implementation. Attention is given to a study of the UD factorized filter to determine if it gives more accurate estimates than the standard Kalman filter when data processing word size is reduced. It is reported that, as the word size is reduced, the effect of modeling error dominates the filter performance of the two filters. However, the UD filter is shown to maintain a slight advantage in tracking performance. It is concluded that because of the UD filter's stability in the serial processing mode, it remains the leading candidate for microelectronic implementation.
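
    For reference, the sketch below shows a standard floating-point Kalman filter predict/update step on a toy constant-velocity model. The UD factorized filter discussed above instead propagates the covariance as P = U D U^T (unit upper-triangular U, diagonal D), which behaves better at reduced word sizes; that factorized update is not reproduced here, and all matrices are toy values.

```python
# Reference sketch of a standard Kalman filter step. The UD factorized filter
# keeps the covariance as P = U*D*U^T for better numerical behaviour at short
# word lengths; that variant is not shown. Matrices below are toy values.
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])     # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # measure position only
Q = 0.01 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x = np.zeros(2)                            # state estimate
P = np.eye(2)                              # covariance estimate

def kalman_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [1.1, 2.0, 2.9, 4.2]:             # toy position measurements
    x, P = kalman_step(x, P, np.array([z]))
print(x)
```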

  18. Bioactive factors for tissue regeneration: state of the art.

    PubMed

    Ohba, Shinsuke; Hojo, Hironori; Chung, Ung-Il

    2012-07-01

    There are three components for the creation of new tissues: cell sources, scaffolds, and bioactive factors. Unlike conventional medical strategies, regenerative medicine requires not only analytical approaches but also integrative ones. Basic research has identified a number of bioactive factors that are necessary, but not sufficient, for organogenesis. In skeletal development, these factors include bone morphogenetic proteins (BMPs), transforming growth factor β (TGF-β), Wnts, hedgehogs (Hh), fibroblast growth factors (FGFs), insulin-like growth factors (IGFs), SRY box-containing gene (Sox) 9, Sp7, and runt-related transcription factors (Runx). Clinical and preclinical studies have been extensively performed to apply the knowledge to bone and cartilage regeneration. Given the large number of findings obtained so far, it would be a good time for a multi-disciplinary, collaborative effort to optimize these known factors and develop appropriate drug delivery systems for delivering them.

  19. The Influence of Teacher-Student Conflict on Teacher Ratings of Children's Externalizing and Internalizing Behaviors: A Multitrait-Multimethod Factor Analytic Approach

    ERIC Educational Resources Information Center

    Wickerd, Garry Dean

    2012-01-01

    Teachers' ratings of problem behaviors are gathered as a matter of course in psychological evaluations of children. Teacher ratings are weighted heavily in the results of educational evaluations (Frick et al., 2010). With such heavy emphasis on teacher behavioral ratings, it is important to understand the extent to which conflict between…

  20. The "A" Factor: Coming to Terms with the Question of Legacy in South African Education

    ERIC Educational Resources Information Center

    Soudien, Crain

    2007-01-01

    This paper attempts to offer an alternative framework for assessing education delivery in South Africa. Its purpose is to develop an analytic approach for understanding education delivery in South Africa in the last 11 years and to use this framework to pose a set of strategic questions about how policy might be framed to deal with delivery. The…

  1. Analysis of yield and oil from a series of canola breeding trials. Part II. Exploring variety by environment interaction using factor analysis.

    PubMed

    Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A

    2010-11-01

    Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
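
    The quantity behind the heat-maps mentioned above is the between-environment genetic correlation matrix implied by a fitted factor analytic model. The sketch below builds it from hypothetical loadings and specific variances as G = ΛΛ' + Ψ and rescales it to a correlation matrix; the numbers are not from the paper.

```python
# Sketch of the quantity behind the heat-maps: the between-environment genetic
# covariance implied by a factor analytic model, G = Lambda * Lambda' + Psi,
# converted to a correlation matrix. Loadings and specific variances are hypothetical.
import numpy as np

Lambda = np.array([[0.9, 0.1],                # environment loadings on 2 factors
                   [0.8, 0.3],
                   [0.2, 0.9],
                   [0.1, 0.8]])
Psi = np.diag([0.15, 0.20, 0.10, 0.25])       # environment-specific variances

G = Lambda @ Lambda.T + Psi                    # genetic covariance between environments
d = 1.0 / np.sqrt(np.diag(G))
corr = d[:, None] * G * d[None, :]             # genetic correlation matrix
print(np.round(corr, 2))                       # values a heat-map would display
```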

  2. Factor Analytic Approach to Transitive Text Mining using Medline Descriptors

    NASA Astrophysics Data System (ADS)

    Stegmann, J.; Grohmann, G.

    Matrix decomposition methods were applied to examples of noninteractive literature sets sharing implicit relations. Document-by-term matrices were created from downloaded PubMed literature sets, the terms being the Medical Subject Headings (MeSH descriptors) assigned to the documents. The loadings of the factors derived from singular value or eigenvalue matrix decomposition were sorted according to absolute values and subsequently inspected for positions of terms relevant to the discovery of hidden connections. It was found that only a small number of factors had to be screened to find key terms in close neighbourhood, being separated by a small number of terms only.
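
    The core operation described above can be reproduced in a few lines: decompose a small document-by-term matrix with the SVD and inspect term loadings sorted by absolute value. The tiny matrix and the MeSH-like terms below are hypothetical.

```python
# Sketch of the core operation: singular value decomposition of a
# document-by-term matrix and inspection of term loadings sorted by absolute
# value. The tiny matrix and MeSH-like terms are hypothetical.
import numpy as np

terms = ["Raynaud Disease", "Blood Viscosity", "Fish Oils",
         "Platelet Aggregation", "Migraine"]
X = np.array([[2, 1, 0, 1, 0],       # rows: documents, columns: term counts
              [1, 2, 0, 2, 0],
              [0, 1, 3, 1, 0],
              [0, 2, 2, 1, 1],
              [0, 0, 1, 0, 2]], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
for f in range(2):                                    # inspect the first two factors
    order = np.argsort(-np.abs(Vt[f]))                # terms by absolute loading
    print(f"factor {f}:", [(terms[i], round(Vt[f, i], 2)) for i in order])
```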

  3. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods able of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
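
    The sketch below illustrates the Monte Carlo ingredient of such models: placing a fixed number of non-overlapping circular domains at random on a unit sphere and histogramming the pairwise angular separations of their centers, a crude stand-in for the interdomain pair correlations treated analytically in the paper. The domain number and angular radius are hypothetical.

```python
# Sketch of the Monte Carlo ingredient: place N non-overlapping circular domains
# (angular radius theta_d) at random on a unit sphere and histogram the pairwise
# angular separations of their centers. Domain number and size are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
N, theta_d = 6, 0.35                       # number of domains, angular radius (rad)

def random_unit_vector():
    v = rng.standard_normal(3)
    return v / np.linalg.norm(v)

def sample_configuration():
    centers = []
    while len(centers) < N:
        c = random_unit_vector()
        # accept only if the new domain does not overlap any existing one
        if all(np.arccos(np.clip(c @ u, -1, 1)) > 2 * theta_d for u in centers):
            centers.append(c)
    return np.array(centers)

seps = []
for _ in range(500):                       # average over many configurations
    C = sample_configuration()
    cosines = np.clip(C @ C.T, -1, 1)
    iu = np.triu_indices(N, k=1)
    seps.extend(np.arccos(cosines[iu]))

hist, edges = np.histogram(seps, bins=18, range=(0, np.pi), density=True)
print(np.round(hist, 2))                   # angular pair-separation density
```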

  4. An analytical approach to separate climate and human contributions to basin streamflow variability

    NASA Astrophysics Data System (ADS)

    Li, Changbin; Wang, Liuming; Wanrui, Wang; Qi, Jiaguo; Linshan, Yang; Zhang, Yuan; Lei, Wu; Cui, Xia; Wang, Peng

    2018-04-01

    Climate variability and anthropogenic regulation are two interwoven factors in the ecohydrologic system across large basins. Understanding the roles that these two factors play under various hydrologic conditions is of great significance for basin hydrology and sustainable water utilization. In this study, we present an analytical approach, based on coupling the water balance method with the Budyko hypothesis, to derive effectiveness coefficients (ECs) of climate change as a way to disentangle its contribution and that of human activities to the variability of river discharge under different hydro-transitional situations. The climate-dominated streamflow change (ΔQc) from the EC approach was compared with that deduced by the elasticity method and the sensitivity index. The results suggest that the EC approach is valid and applicable for hydrologic studies at the large basin scale. Analyses of various scenarios revealed that the contributions of climate change and human activities to river discharge variation differed among the regions of the study area. Over the past several decades, climate change dominated hydro-transitions from dry to wet, while human activities played the key role in the reduction of streamflow during wet-to-dry periods. The remarkable decline of discharge upstream was mainly due to human interventions, although climate contributed more to runoff increases during dry periods in the semi-arid downstream. The derived effectiveness coefficients indicated a contribution ratio of 49% for climate and 51% for human activities at the basin scale from 1956 to 2015. This simple, mathematically derived approach, together with the case example of temporal segmentation and spatial zoning, can help explain the variation of river discharge in more detail at the large basin scale against the background of climate change and human regulation.
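
    For context, the sketch below implements the generic elasticity-based decomposition that the paper uses as a benchmark (not its EC approach): the climate-driven change in streamflow is estimated from assumed precipitation and potential-evapotranspiration elasticities, and the human-driven change is taken as the residual. All values are toy numbers.

```python
# Sketch of the generic elasticity-based decomposition used as a benchmark:
# climate-driven streamflow change from assumed elasticities, human-driven
# change as the residual. All numbers are toy values.

# Mean annual values for a "baseline" and a "changed" period (mm), hypothetical
Q1, Q2 = 120.0, 90.0          # streamflow
P1, P2 = 480.0, 455.0         # precipitation
E01, E02 = 1050.0, 1100.0     # potential evapotranspiration

eps_P, eps_E0 = 2.1, -1.1     # assumed climate elasticities of streamflow

dQ_total = Q2 - Q1
dQ_climate = (eps_P * (P2 - P1) / P1 + eps_E0 * (E02 - E01) / E01) * Q1
dQ_human = dQ_total - dQ_climate

share = 100 * abs(dQ_human) / (abs(dQ_human) + abs(dQ_climate))
print(f"dQ_climate = {dQ_climate:.1f} mm, dQ_human = {dQ_human:.1f} mm, "
      f"human share ~ {share:.0f}%")
```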

  5. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
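
    The standard-addition calculation mentioned above reduces to a simple regression, sketched below: spike the sample with known amounts of analyte, fit response against added concentration, and read the endogenous concentration off the magnitude of the x-intercept. The responses shown are hypothetical.

```python
# Sketch of the standard-addition calculation: regress response on added
# concentration; the endogenous concentration is the magnitude of the
# x-intercept. Responses below are hypothetical.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # spiked concentration (uM)
response = np.array([1.52, 2.23, 2.97, 4.51, 7.48])   # instrument response (arbitrary)

slope, intercept = np.polyfit(added, response, 1)      # linear fit
endogenous = intercept / slope                          # = |x-intercept|
print(f"estimated endogenous concentration ~ {endogenous:.1f} uM")
```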

  6. Evaluation and Prediction of Water Resources Based on AHP

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Sun, Anqi

    2017-01-01

    Nowadays, the shortage of water resources is a growing threat. To address the problem of water resources being constrained by a variety of factors, this paper establishes a water resources evaluation index (WREI) model, which adopts fuzzy comprehensive evaluation (FCE) based on the analytic hierarchy process (AHP). After considering the factors influencing water resources, we ignore secondary factors and arrange the main factors hierarchically by class, setting up a three-layer structure whose top layer is the WREI. AHP is first used to determine the weights, and fuzzy judgement is then applied to the targets; the combined use of the two algorithms reduces the subjective influence of AHP and overcomes the disadvantages of multi-level evaluation. To validate the model, we chose India as the target region. On the basis of the water resources evaluation index model, we use Matlab, combining grey prediction with linear prediction, to discuss India's ability to provide clean water and the trend of India's water resources over the next 15 years. The model, which has both theoretical support and practical significance, provides reliable data support and a reference for plans to improve water quality.
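
    The sketch below illustrates the fuzzy comprehensive evaluation step in such a scheme: AHP-derived indicator weights W are combined with a fuzzy judgement matrix R (membership of each indicator in each evaluation grade) to give an overall grade vector B = W·R. The weights, memberships, and grade labels are hypothetical, not values from the paper.

```python
# Sketch of the fuzzy comprehensive evaluation (FCE) step: AHP-derived weights W
# combined with a fuzzy judgement matrix R via weighted-average composition,
# B = W . R. Weights, memberships, and grade labels are hypothetical.
import numpy as np

W = np.array([0.45, 0.30, 0.15, 0.10])          # AHP weights of four indicators

# Rows: indicators; columns: membership in grades (good, fair, poor, critical)
R = np.array([[0.10, 0.30, 0.40, 0.20],
              [0.05, 0.25, 0.45, 0.25],
              [0.20, 0.40, 0.30, 0.10],
              [0.15, 0.35, 0.35, 0.15]])

B = W @ R                                        # weighted-average composition
B /= B.sum()                                     # normalize
grades = ["good", "fair", "poor", "critical"]
print(dict(zip(grades, np.round(B, 3))), "->", grades[int(np.argmax(B))])
```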

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safari, L., E-mail: laleh.safari@ist.ac.at; Department of Physics, University of Oulu, Box 3000, FI-90014 Oulu; Santos, J. P.

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.
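
    As a minimal point of reference, the single-orbital case has a well-known closed form: for a hydrogen-like 1s electron with effective charge Z (atomic units), the form factor is f(q) = [1 + (q/2Z)^2]^(-2), the Fourier transform of the 1s density. The screened-hydrogenic, independent-particle expression derived in the paper generalizes this; the sketch below only evaluates the 1s formula.

```python
# Sketch (atomic units): the closed-form 1s form factor of a hydrogen-like atom
# with effective nuclear charge Z, f(q) = [1 + (q/(2Z))^2]^(-2), i.e. the Fourier
# transform of the 1s electron density. The paper's independent-particle
# expression generalizes this single-orbital result.
import numpy as np

def f_1s(q, Z=1.0):
    """Nonrelativistic hydrogen-like 1s atomic form factor (a.u.)."""
    return 1.0 / (1.0 + (q / (2.0 * Z))**2)**2

q = np.linspace(0.0, 8.0, 5)                       # momentum transfer (a.u.)
print(np.round(f_1s(q, Z=1.0), 4))                 # decays from 1 at q = 0
```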

  8. Microbiological concerns and methodological approaches related to bacterial water quality in spaceflight

    NASA Technical Reports Server (NTRS)

    Pyle, Barry H.; Mcfeters, Gordon A.

    1992-01-01

    A number of microbiological issues are of critical importance to crew health and system performance in spacecraft water systems. This presentation reviews an array of these concerns, which include factors that influence water treatment and disinfection in spaceflight, such as biofilm formation and the physiological responses of bacteria in clean water systems. Factors associated with spaceflight, such as aerosol formation under conditions of microgravity, are also discussed within the context of airborne infections such as Legionellosis. Finally, a spectrum of analytical approaches is reviewed to provide an evaluation of methodological alternatives that have been suggested or used to detect microorganisms of interest in water systems. These range from classical approaches employing colony formation on specific microbiological growth media to direct (i.e., microscopic) and indirect (e.g., electrochemical) methods, as well as the use of molecular approaches and gene probes. These techniques are critically evaluated for their potential utility in determining microbiological water quality through the detection of microorganisms under the influence of the ambient environmental stress inherent in spaceflight water systems.

  9. Critical Factors in Data Governance for Learning Analytics

    ERIC Educational Resources Information Center

    Elouazizi, Noureddine

    2014-01-01

    This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…

  10. A Multiomics Approach to Identify Genes Associated with Childhood Asthma Risk and Morbidity.

    PubMed

    Forno, Erick; Wang, Ting; Yan, Qi; Brehm, John; Acosta-Perez, Edna; Colon-Semidey, Angel; Alvarez, Maria; Boutaoui, Nadia; Cloutier, Michelle M; Alcorn, John F; Canino, Glorisa; Chen, Wei; Celedón, Juan C

    2017-10-01

    Childhood asthma is a complex disease. In this study, we aim to identify genes associated with childhood asthma through a multiomics "vertical" approach that integrates multiple analytical steps using linear and logistic regression models. In a case-control study of childhood asthma in Puerto Ricans (n = 1,127), we used adjusted linear or logistic regression models to evaluate associations between several analytical steps of omics data, including genome-wide (GW) genotype data, GW methylation, GW expression profiling, cytokine levels, asthma-intermediate phenotypes, and asthma status. At each point, only the top genes/single-nucleotide polymorphisms/probes/cytokines were carried forward for subsequent analysis. In step 1, asthma modified the gene expression-protein level association for 1,645 genes; pathway analysis showed an enrichment of these genes in the cytokine signaling system (n = 269 genes). In steps 2-3, expression levels of 40 genes were associated with intermediate phenotypes (asthma onset age, forced expiratory volume in 1 second, exacerbations, eosinophil counts, and skin test reactivity); of those, methylation of seven genes was also associated with asthma. Of these seven candidate genes, IL5RA was also significant in analytical steps 4-8. We then measured plasma IL-5 receptor α levels, which were associated with asthma age of onset and moderate-severe exacerbations. In addition, in silico database analysis showed that several of our identified IL5RA single-nucleotide polymorphisms are associated with transcription factors related to asthma and atopy. This approach integrates several analytical steps and is able to identify biologically relevant asthma-related genes, such as IL5RA. It differs from other methods that rely on complex statistical models with various assumptions.

  11. Advantages of using tetrahydrofuran-water as mobile phases in the quantitation of cyclosporin A in monkey and rat plasma by liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Austin C; Li, Yinghe; Guirguis, Micheal S; Caldwell, Robert G; Shou, Wilson Z

    2007-01-04

    A new analytical method is described here for the quantitation of the anti-inflammatory drug cyclosporin A (CyA) in monkey and rat plasma. The method used tetrahydrofuran (THF)-water mobile phases to elute the analyte and the internal standard, cyclosporin C (CyC). The gradient mobile phase program successfully eluted CyA as a sharp peak and therefore improved the resolution between the analyte and possible interfering materials compared with previously reported analytical approaches, in which CyA eluted as a broad peak due to the rapid conversion between different conformers. The sharp peak resulting from this method facilitated quantitative calculation, as multiple smoothing and large bunching factors were not necessary. The chromatography in the new method was performed at 30 degrees C instead of the 65-70 degrees C reported previously. Other advantages of the method included simple and fast sample extraction by protein precipitation, direct injection of the extraction supernatant onto the column for analysis, and elimination of the evaporation and reconstitution steps needed in the solid phase extraction or liquid-liquid extraction methods reported before. This method is amenable to high-throughput analysis with a total chromatographic run time of 3 min. The approach has been verified as sensitive, linear (0.977-4000 ng/mL), accurate and precise for the quantitation of CyA in monkey and rat plasma. However, compared with the use of conventional mobile phases, the only drawback of this approach was a reduced detection response from the mass spectrometer, possibly caused by poor desolvation in the ionization source. This is the first report to demonstrate the advantages of using THF-water mobile phases to elute CyA in liquid chromatography.

  12. An approach to the design of wide-angle optical systems with special illumination and IFOV requirements

    NASA Astrophysics Data System (ADS)

    Pravdivtsev, Andrey V.

    2012-06-01

    The article presents an approach to the design of wide-angle optical systems with special illumination and instantaneous field of view (IFOV) requirements. Unevenness of illumination reduces the dynamic range of the system, which negatively affects the system's ability to perform its task. The resulting illumination on the detector depends, among other factors, on changes in the IFOV. It is also necessary to consider the IFOV in the synthesis of data processing algorithms, as it directly affects the potential "signal/background" ratio for the case of statistically homogeneous backgrounds. A numerical-analytical approach that simplifies the design of wide-angle optical systems with special illumination and IFOV requirements is presented. The solution can be used for optical systems whose field of view is greater than 180 degrees. Illumination calculation in optical CAD is based on the computationally expensive tracing of a large number of rays. The author proposes to use analytical expressions for some of the characteristics on which illumination depends. The remaining characteristics are determined numerically using less computationally expensive operands, and this calculation is not performed at every optimization step. The results of the analytical calculation are inserted into the merit function of the optical CAD optimizer. As a result, the optimizer load is reduced, since less computationally expensive operands are used. This reduces the time and resources required to develop a system with the desired characteristics. The proposed approach simplifies the creation and understanding of the requirements for the quality of the optical system, reduces the time and resources required to develop an optical system, and allows more efficient EOS to be created.

  13. Determinants of participation in prostate cancer screening: A simple analytical framework to account for healthy-user bias

    PubMed Central

    Tabuchi, Takahiro; Nakayama, Tomio; Fukushima, Wakaba; Matsunaga, Ichiro; Ohfuji, Satoko; Kondo, Kyoko; Kawano, Eiji; Fukuhara, Hiroyuki; Ito, Yuri; Oshima, Akira

    2015-01-01

    In Japan at present, fecal occult blood testing (FOBT) is recommended for cancer screening while routine population-based prostate-specific antigen (PSA) screening is not. In future it may be necessary to increase participation in the former and decrease it in the latter. Our objectives were to explore determinants of PSA-screening participation while simultaneously taking into account factors associated with FOBT. Data were gathered from a cross-sectional study conducted with random sampling of 6191 adults in Osaka city in 2011. Of 3244 subjects (return rate 52.4%), 936 men aged 40–64 years were analyzed using log-binomial regression to explore factors related to PSA-screening participation within 1 year. Only responders for cancer screening, defined as men who participated in either FOBT or PSA-testing, were used as main study subjects. Men who were older (prevalence ratio [PR] [95% confidence interval (CI)] = 2.17 [1.43, 3.28] for 60–64 years compared with 40–49 years), had technical or junior college education (PR [95% CI] = 1.76 [1.19, 2.59] compared with men with high school or less) and followed doctors' recommendations (PR [95% CI] = 1.50 [1.00, 2.26]) were significantly more likely to have PSA-screening after multiple variable adjustment among cancer-screening responders. Attenuation in PR of hypothesized common factors was observed among cancer-screening responders compared with the usual approach (among total subjects). Using the analytical framework to account for healthy-user bias, we found three factors related to participation in PSA-screening with attenuated association of common factors. This approach may provide a more sophisticated interpretation of participation in various screenings with different levels of recommendation. PMID:25456306
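
    As a small numerical companion to the log-binomial analysis described above, the sketch below computes a crude (unadjusted) prevalence ratio with a Wald-type confidence interval from a 2x2 table; the regression models in the paper estimate the same quantity with covariate adjustment. The counts are hypothetical.

```python
# Sketch of the crude (unadjusted) prevalence ratio that log-binomial regression
# generalizes with covariate adjustment: PR = risk(exposed) / risk(unexposed),
# with a Wald-type 95% CI on the log scale. Counts below are hypothetical.
import numpy as np

a, b = 60, 140    # exposed (e.g., doctor's recommendation): screened / not screened
c, d = 80, 360    # unexposed: screened / not screened

pr = (a / (a + b)) / (c / (c + d))
se_log = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))        # SE of log(PR)
lo, hi = np.exp(np.log(pr) + np.array([-1.96, 1.96]) * se_log)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```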

  14. Do adolescent delinquency and problem drinking share psychosocial risk factors? A literature review.

    PubMed

    Curcio, Angela L; Mak, Anita S; George, Amanda M

    2013-04-01

    Despite the prevalence and damaging effects of adolescent problem drinking, relative to delinquency, far less research has focused on drinking using an integrated theoretical approach. The aim of the current research was to review existing literature on psychosocial risk factors for delinquency and problem drinking, and explore whether integrating elements of social learning theory with an established psychosocial control theory of delinquency could explain adolescent problem drinking. We reviewed 71 studies published post-1990 with particular focus on articles that empirically researched risk factors for adolescent problem drinking and delinquency in separate and concurrent studies and meta-analytic reviews. We found shared risk factors for adolescent delinquency and problem drinking that are encompassed by an extension of psychosocial control theory. The potential of an extended psychosocial control theory providing a parsimonious theoretical approach to explaining delinquency, problem drinking and other adolescent problem behaviours, along with suggestions for future investigations, is discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Incorporating children's toxicokinetics into a risk framework.

    PubMed Central

    Ginsberg, Gary; Slikker, William; Bruckner, James; Sonawane, Babasaheb

    2004-01-01

    Children's responses to environmental toxicants will be affected by the way in which their systems absorb, distribute, metabolize, and excrete chemicals. These toxicokinetic factors vary during development, from in utero where maternal and placental processes play a large role, to the neonate in which emerging metabolism and clearance pathways are key determinants. Toxicokinetic differences between neonates and adults lead to the potential for internal dosimetry differences and increased or decreased risk, depending on the mechanisms for toxicity and clearance of a given chemical. This article raises a number of questions that need to be addressed when conducting a toxicokinetic analysis of in utero or childhood exposures. These questions are organized into a proposed framework for conducting the assessment that involves problem formulation (identification of early life stage toxicokinetic factors and chemical-specific factors that may raise questions/concerns for children); data analysis (development of analytic approach, construction of child/adult or child/animal dosimetry comparisons); and risk characterization (evaluation of how children's toxicokinetic analysis can be used to decrease uncertainties in the risk assessment). The proposed approach provides a range of analytical options, from qualitative to quantitative, for assessing children's dosimetry. Further, it provides background information on a variety of toxicokinetic factors that can vary as a function of developmental stage. For example, the ontogeny of metabolizing systems is described via reference to pediatric studies involving therapeutic drugs and evidence from in vitro enzyme studies. This type of resource information is intended to help the assessor begin to address the issues raised in this paper.

  16. Parametrizations of three-body hadronic B - and D -decay amplitudes in terms of analytic and unitary meson-meson form factors

    NASA Astrophysics Data System (ADS)

    Boito, D.; Dedonder, J.-P.; El-Bennich, B.; Escribano, R.; Kamiński, R.; Leśniak, L.; Loiseau, B.

    2017-12-01

    We introduce parametrizations of hadronic three-body B and D weak decay amplitudes that can be readily implemented in experimental analyses and are a sound alternative to the simplistic and widely used sum of Breit-Wigner type amplitudes, also known as the isobar model. These parametrizations can be particularly useful in the interpretation of CP asymmetries in the Dalitz plots. They are derived from previous calculations based on a quasi-two-body factorization approach in which two-body hadronic final-state interactions are fully taken into account in terms of unitary S- and P-wave ππ, πK, and KK̄ form factors. These form factors can be determined rigorously, fulfilling fundamental properties of quantum field-theory amplitudes such as analyticity and unitarity, and are in agreement with the low-energy behavior predicted by effective theories of QCD. They are derived from sets of coupled-channel equations using T-matrix elements constrained by experimental meson-meson phase shifts and inelasticities, chiral symmetry, and asymptotic QCD. We provide explicit amplitude expressions for the decays B± → π+π-π±, B → Kπ+π-, B± → K+K-K±, D+ → π-π+π+, D+ → K-π+π+, and D0 → K_S^0 π+π-, for which we have shown in previous studies that this approach is phenomenologically successful; in addition, we provide expressions for the D0 → K_S^0 K+K- decay. Other three-body hadronic channels can be parametrized likewise.

  17. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    PubMed

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    In modern analytical chemistry researchers pursue novel materials to meet analytical challenges such as improvements in sensitivity, selectivity, and detection limit. Metal-organic frameworks (MOFs) are an emerging class of microporous materials, and their unusual properties such as high surface area, good thermal stability, uniform structured nanoscale cavities, and the availability of in-pore functionality and outer-surface modification are attractive for diverse analytical applications. This Account summarizes our research on the analytical applications of MOFs ranging from sampling to chromatographic separation. MOFs have been either directly used or engineered to meet the demands of various analytical applications. Bulk MOFs with microsized crystals are convenient sorbents for direct application to in-field sampling and solid-phase extraction. Quartz tubes packed with MOF-5 have shown excellent stability, adsorption efficiency, and reproducibility for in-field sampling and trapping of atmospheric formaldehyde. The 2D copper(II) isonicotinate packed microcolumn has demonstrated large enhancement factors and good shape- and size-selectivity when applied to on-line solid-phase extraction of polycyclic aromatic hydrocarbons in water samples. We have explored the molecular sieving effect of MOFs for the efficient enrichment of peptides with simultaneous exclusion of proteins from biological fluids. These results show promise for the future of MOFs in peptidomics research. Moreover, nanosized MOFs and engineered thin films of MOFs are promising materials as novel coatings for solid-phase microextraction. We have developed an in situ hydrothermal growth approach to fabricate thin films of MOF-199 on etched stainless steel wire for solid-phase microextraction of volatile benzene homologues with large enhancement factors and wide linearity. Their high thermal stability and easy-to-engineer nanocrystals make MOFs attractive as new stationary phases to fabricate MOF-coated capillaries for high-resolution gas chromatography (GC). We have explored a dynamic coating approach to fabricate a MOF-coated capillary for the GC separation of important raw chemicals and persistent organic pollutants with high resolution and excellent selectivity. We have combined a MOF-coated fiber for solid-phase microextraction with a MOF-coated capillary for GC separation, which provides an effective MOF-based tandem molecular sieve platform for selective microextraction and high-resolution GC separation of target analytes in complex samples. Microsized MOFs with good solvent stability are attractive stationary phases for high-performance liquid chromatography (HPLC). These materials have shown high resolution and good selectivity and reproducibility in both the normal-phase HPLC separation of fullerenes and substituted aromatics on MIL-101 packed columns and position isomers on a MIL-53(Al) packed column and the reversed-phase HPLC separation of a wide range of analytes from nonpolar to polar and acidic to basic solutes. Despite the above achievements, further exploration of MOFs in analytical chemistry is needed. Especially, analytical application-oriented engineering of MOFs is imperative for specific applications.

  18. Determination of residual acetone and acetone related impurities in drug product intermediates prepared as Spray Dried Dispersions (SDD) using gas chromatography with headspace autosampling (GCHS).

    PubMed

    Quirk, Emma; Doggett, Adrian; Bretnall, Alison

    2014-08-05

    Spray Dried Dispersions (SDD) are uniform mixtures of a specific ratio of amorphous active pharmaceutical ingredient (API) and polymer prepared via a spray drying process. Volatile solvents are employed during spray drying to facilitate the formation of the SDD material. Following manufacture, analytical methodology is required to determine residual levels of the spray drying solvent and its associated impurities. Due to the high level of polymer in the SDD samples, direct liquid injection with Gas Chromatography (GC) is not a viable option for analysis. This work describes the development and validation of an analytical approach to determine residual levels of acetone and acetone related impurities, mesityl oxide (MO) and diacetone alcohol (DAA), in drug product intermediates prepared as SDDs using GC with headspace (HS) autosampling. The method development for these analytes presented a number of analytical challenges which had to be overcome before the levels of the volatiles of interest could be accurately quantified. GCHS could be used after two critical factors were implemented: (1) calculation and application of conversion factors to 'correct' for the reactions occurring between acetone, MO and DAA during generation of the headspace volume for analysis, and (2) the addition of an equivalent amount of polymer into all reference solutions used for quantitation to ensure comparability between the headspace volumes generated for both samples and external standards. This work describes the method development and optimisation of the standard preparation, the headspace autosampler operating parameters and the chromatographic conditions, together with a summary of the validation of the methodology. The approach has been demonstrated to be robust and suitable to accurately determine levels of acetone, MO and DAA in SDD materials over the linear concentration range 0.008-0.4 μL/mL, with minimum quantitation limits of 20 ppm for acetone and MO, and 80 ppm for DAA. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of effective combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  20. Analytical functions for beta and gamma absorbed fractions of iodine-131 in spherical and ellipsoidal volumes.

    PubMed

    Mowlavi, Ali Asghar; Fornasier, Maria Rossa; Mirzaei, Mohammd; Bregant, Paola; de Denaro, Mario

    2014-10-01

    The beta and gamma absorbed fractions in organs and tissues are key factors in radionuclide internal dosimetry based on the Medical Internal Radiation Dose (MIRD) approach. The aim of this study is to find suitable analytical functions for the beta and gamma absorbed fractions in spherical and ellipsoidal volumes with a uniform distribution of the iodine-131 radionuclide. The MCNPX code has been used to calculate the energy absorbed from the beta and gamma rays of iodine-131 uniformly distributed inside different ellipsoids and spheres, and the absorbed fractions have then been evaluated. We have found the fit parameters of a suitable analytical function for the beta absorbed fraction, depending on a generalized radius for the ellipsoid based on the radius of a sphere, and a linear fit function for the gamma absorbed fraction. The analytical functions obtained by fitting the Monte Carlo data can be used to obtain the absorbed fractions of iodine-131 beta and gamma rays for any volume of the thyroid lobe. Moreover, our results for the spheres are in good agreement with the results of MIRD and other published studies.
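    The functional forms used in the paper are not reproduced in this abstract, so the sketch below only illustrates the general workflow: fit a saturating analytical form (an assumed shape, not the authors') to hypothetical Monte Carlo beta absorbed-fraction data, and a linear function to the gamma data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical Monte Carlo results: generalized radius (cm) vs. beta absorbed fraction.
radius = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
af_beta = np.array([0.62, 0.78, 0.85, 0.89, 0.91, 0.93])

def beta_model(r, a, b):
    """Assumed saturating shape b*r/(r + a); not the functional form used by the authors."""
    return b * r / (r + a)

popt, _ = curve_fit(beta_model, radius, af_beta, p0=[0.3, 1.0])
print("beta fit parameters:", popt)

# The gamma absorbed fraction is reported to follow a linear fit; sketched with np.polyfit.
af_gamma = np.array([0.020, 0.035, 0.050, 0.064, 0.078, 0.090])
slope, intercept = np.polyfit(radius, af_gamma, 1)
print("gamma fit: slope =", slope, "intercept =", intercept)
```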

  1. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess the vulnerability to coastal risks such as coastal erosion or submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping is based on multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at the Monika and Sablette beaches. This technical approach combines GIS tools with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
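    FAHP criteria weights can be computed in several ways; the sketch below uses Buckley's geometric-mean method on an illustrative triangular-fuzzy pairwise comparison matrix for three criteria (the paper's seven-criterion matrix and fuzzy scale are not given in the abstract), assuming numpy.

```python
import numpy as np

# Illustrative triangular-fuzzy pairwise comparison matrix (low, mid, high) for 3 criteria.
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
], dtype=float)

# Buckley's geometric-mean method: fuzzy geometric mean of each row ...
geo = np.prod(M, axis=1) ** (1.0 / M.shape[0])   # one (l, m, u) triple per criterion
total = geo.sum(axis=0)                          # fuzzy sum over all criteria
# ... then fuzzy weights, dividing each (l, m, u) by the reversed total (u, m, l).
fuzzy_w = geo / total[::-1]
crisp_w = fuzzy_w.mean(axis=1)                   # centroid defuzzification
weights = crisp_w / crisp_w.sum()                # normalized criteria weights
print(weights)
```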

  2. Modified hollow Gaussian beam and its paraxial propagation

    NASA Astrophysics Data System (ADS)

    Cai, Yangjian; Chen, Chiyi; Wang, Fei

    2007-10-01

    A model named the modified hollow Gaussian beam (HGB) is proposed to describe a dark hollow beam with adjustable beam spot size, central dark size, and darkness factor. In this modified model, both the beam spot size and the central dark size converge to finite constants as the beam order approaches infinity, unlike the previous unmodified model, in which neither quantity converges in this limit. The dependence of the propagation factor on the beam order is found to be the same for the modified and unmodified HGBs. Based on the Collins integral, analytical formulas for the modified HGB propagating through aligned and misaligned optical systems are derived. Some numerical examples are given.

  3. Profiling physicochemical and planktonic features from discretely/continuously sampled surface water.

    PubMed

    Oita, Azusa; Tsuboi, Yuuri; Date, Yasuhiro; Oshima, Takahiro; Sakata, Kenji; Yokoyama, Akiko; Moriya, Shigeharu; Kikuchi, Jun

    2018-04-24

    There is an increasing need for assessing aquatic ecosystems that are globally endangered. Since aquatic ecosystems are complex, integrated consideration of multiple factors utilizing omics technologies can help us better understand aquatic ecosystems. An integrated strategy linking three analytical (machine learning, factor mapping, and forecast-error-variance decomposition) approaches for extracting the features of surface water from datasets comprising ions, metabolites, and microorganisms is proposed herein. The three developed approaches can be employed for diverse datasets of sample sizes and experimentally analyzed factors. The three approaches are applied to explore the features of bay water surrounding Odaiba, Tokyo, Japan, as a case study. Firstly, the machine learning approach separated 681 surface water samples within Japan into three clusters, categorizing Odaiba water into seawater with relatively low inorganic ions, including Mg, Ba, and B. Secondly, the factor mapping approach illustrated Odaiba water samples from the summer as rich in multiple amino acids and some other metabolites and poor in inorganic ions relative to other seasons based on their seasonal dynamics. Finally, forecast-error-variance decomposition using vector autoregressive models indicated that a type of microalgae (Raphidophyceae) grows in close correlation with alanine, succinic acid, and valine on filters and with isobutyric acid and 4-hydroxybenzoic acid in filtrate, Ba, and average wind speed. Our integrated strategy can be used to examine many biological, chemical, and environmental physical factors to analyze surface water. Copyright © 2018. Published by Elsevier B.V.
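    The forecast-error-variance decomposition step can be sketched with statsmodels' vector autoregression tools; the snippet below uses synthetic stand-in series and placeholder variable names rather than the study's measurements.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-in series; column names are placeholders, not the study's variables.
rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.normal(size=(120, 4)),
    columns=["raphidophyceae", "alanine", "succinic_acid", "wind_speed"],
)

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")  # lag order selected by AIC

# Forecast-error-variance decomposition over a 10-step horizon: how much of each
# series' forecast-error variance is attributed to shocks in each variable.
fevd = results.fevd(10)
fevd.summary()
```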

  4. Centrifugal ultrafiltration of human serum for improving immunoglobulin A quantification using attenuated total reflectance infrared spectroscopy.

    PubMed

    Elsohaby, Ibrahim; McClure, J Trenton; Riley, Christopher B; Bryanton, Janet; Bigsby, Kathryn; Shaw, R Anthony

    2018-02-20

    Attenuated total reflectance infrared (ATR-IR) spectroscopy is a simple, rapid and cost-effective method for the analysis of serum. However, the complex nature of serum remains a limiting factor for the reliability of this method. We investigated the benefits of coupling centrifugal ultrafiltration with ATR-IR spectroscopy for quantification of human serum IgA concentration. Human serum samples (n = 196) were analyzed for IgA using an immunoturbidimetric assay. ATR-IR spectra were acquired for whole serum samples and for the retentate (residue) reconstituted with saline following 300 kDa centrifugal ultrafiltration. IR-based analytical methods were developed for each of the two spectroscopic datasets, and the accuracies of the two methods were compared. The analytical methods were based upon partial least squares regression (PLSR) calibration models: one with 5 PLS factors (for whole serum) and a second with 9 PLS factors (for the reconstituted retentate). Comparison of the two sets of IR-based analytical results to reference IgA values revealed an improvement in the Pearson correlation coefficient (from 0.66 to 0.76) and a reduction in the root mean squared error of prediction for IR-based IgA concentrations (from 102 to 79 mg/dL) for the ultrafiltration retentate-based method as compared to the method built upon whole serum spectra. Depleting human serum of low molecular weight proteins using a 300 kDa centrifugal filter thus enhances the accuracy of IgA quantification by ATR-IR spectroscopy. Further evaluation and optimization of this general approach may ultimately lead to routine analysis of a range of high molecular-weight analytical targets that are otherwise unsuitable for IR-based analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
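    A minimal scikit-learn sketch of the PLSR calibration workflow described above is given below, using synthetic stand-in spectra rather than the study's ATR-IR data; the 9-factor setting mirrors the retentate model.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for ATR-IR spectra (rows = sera, columns = absorbances) and IgA values.
rng = np.random.default_rng(1)
X = rng.normal(size=(196, 400))
y = X[:, :5].sum(axis=1) * 40 + 250 + rng.normal(scale=30, size=196)

# 9-factor calibration model, mirroring the retentate method (5 factors for whole serum).
pls = PLSRegression(n_components=9)
y_pred = cross_val_predict(pls, X, y, cv=10).ravel()

r, _ = pearsonr(y, y_pred)
rmsep = np.sqrt(np.mean((y - y_pred) ** 2))
print(f"Pearson r = {r:.2f}, RMSEP = {rmsep:.0f} mg/dL")
```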

  5. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework, the Key Events Dose-Response Framework (KEDRF), for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  6. The use of native cation-exchange chromatography to study aggregation and phase separation of monoclonal antibodies

    PubMed Central

    Chen, Shuang; Lau, Hollis; Brodsky, Yan; Kleemann, Gerd R; Latypov, Ramil F

    2010-01-01

    This study introduces a novel analytical approach for studying aggregation and phase separation of monoclonal antibodies (mAbs). The approach is based on using analytical scale cation-exchange chromatography (CEX) for measuring the loss of soluble monomer in the case of individual and mixed protein solutions. Native CEX outperforms traditional size-exclusion chromatography in separating complex protein mixtures, offering an easy way to assess mAb aggregation propensity. Different IgG1 and IgG2 molecules were tested individually and in mixtures consisting of up to four protein molecules. Antibody aggregation was induced by four different stress factors: high temperature, low pH, addition of fatty acids, and rigorous agitation. The extent of aggregation was determined from the amount of monomeric protein remaining in solution after stress. Consequently, it was possible to address the role of specific mAb regions in antibody aggregation by co-incubating Fab and Fc fragments with their respective full-length molecules. Our results revealed that the relative contribution of Fab and Fc regions in mAb aggregation is strongly dependent on pH and the stress factor applied. In addition, the CEX-based approach was used to study reversible protein precipitation due to phase separation, which demonstrated its use for a broader range of protein–protein association phenomena. In all cases, the role of Fab and Fc was clearly dissected, providing important information for engineering more stable mAb-based therapeutics. PMID:20512972

  7. Spatio-Temporal Dimensions of Child Poverty in America, 1990-2010.

    PubMed

    Call, Maia A; Voss, Paul R

    2016-01-01

    The persistence of childhood poverty in the United States, a wealthy and developed country, continues to pose both an analytical dilemma and a public policy challenge, despite many decades of research and remedial policy implementation. In this paper, our goals are twofold, though our primary focus is methodological. We attempt both to examine the relationship between space, time, and previously established factors correlated with childhood poverty at the county level in the continental United States and to provide an empirical case study demonstrating an underutilized methodological approach. We analyze a spatially consistent dataset built from the 1990 and 2000 U.S. Censuses and the 2006-2010 American Community Survey. Our analytic approach includes cross-sectional spatial models to estimate the reproduction of poverty for each of the reference years as well as a fixed effects panel data model to analyze change in child poverty over time. In addition, we estimate a full space-time interaction model, which adjusts for spatial and temporal variation in these data. These models reinforce our understanding of the strong regional persistence of childhood poverty in the U.S. over time and suggest that the factors impacting childhood poverty remain much the same today as they were in past decades.
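    A fixed effects panel model of the kind described can be estimated with a simple within (demeaning) transformation; the sketch below uses a synthetic county-by-year panel with placeholder covariate names, not the paper's variables.

```python
import numpy as np
import pandas as pd

# Synthetic county-by-year panel; covariate names are illustrative, not the paper's.
rng = np.random.default_rng(2)
counties, years = 200, [1990, 2000, 2010]
n = counties * len(years)
df = pd.DataFrame({
    "county": np.repeat(np.arange(counties), len(years)),
    "year": np.tile(years, counties),
    "unemployment": rng.normal(6, 2, n),
    "single_parent_share": rng.normal(25, 5, n),
})
df["child_poverty"] = (5 + 1.2 * df["unemployment"]
                       + 0.4 * df["single_parent_share"] + rng.normal(0, 3, n))

# Within (fixed effects) estimator: demean by county, then pooled OLS on the demeaned data.
cols = ["child_poverty", "unemployment", "single_parent_share"]
demeaned = df[cols] - df.groupby("county")[cols].transform("mean")
X = demeaned[["unemployment", "single_parent_share"]].to_numpy()
y = demeaned["child_poverty"].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["unemployment", "single_parent_share"], beta)))
```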

  8. Monitoring Healthy Metabolic Trajectories with Nutritional Metabonomics

    PubMed Central

    Collino, Sebastiano; Martin, François-Pierre J.; Kochhar, Sunil; Rezzi, Serge

    2009-01-01

    Metabonomics is a well-established analytical approach for the analysis of physiological regulatory processes via the metabolic profiling of biofluids and tissues in living organisms. Its potential is fully exploited in the field of "nutrimetabonomics", which aims at assessing the metabolic effects of active ingredients and foods in individuals. Yet, one of the greatest challenges in nutrition research is to decipher the critical interactions between mammalian organisms and environmental factors, including the gut microbiota. "Nutrimetabonomics" is today foreseen as a powerful approach for future nutritional programs tailored to health maintenance and disease prevention. PMID:22253970

  9. Analysis of eighty-four commercial aviation incidents - Implications for a resource management approach to crew training

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1980-01-01

    A resource management approach to aircrew performance is defined and utilized in structuring an analysis of 84 exemplary incidents from the NASA Aviation Safety Reporting System. The distribution of enabling and associated (evolutionary) and recovery factors between and within five analytic categories suggests that resource management training be concentrated on: (1) interpersonal communications, with air traffic control information of major concern; (2) task management, mainly setting priorities and appropriately allocating tasks under varying workload levels; and (3) planning, coordination, and decisionmaking concerned with preventing and recovering from potentially unsafe situations in certain aircraft maneuvers.

  10. A Numerical-Analytical Approach to Modeling the Axial Rotation of the Earth

    NASA Astrophysics Data System (ADS)

    Markov, Yu. G.; Perepelkin, V. V.; Rykhlova, L. V.; Filippova, A. S.

    2018-04-01

    A model for the non-uniform axial rotation of the Earth is studied using a celestial-mechanical approach and numerical simulations. The application of an approximate model containing a small number of parameters to predict variations of the axial rotation velocity of the Earth over short time intervals is justified. This approximate model is obtained by averaging variable parameters that are subject to small variations due to non-stationarity of the perturbing factors. The model is verified and compared with predictions over a long time interval published by the International Earth Rotation and Reference Systems Service (IERS).

  11. A General Model for Performance Evaluation in DS-CDMA Systems with Variable Spreading Factors

    NASA Astrophysics Data System (ADS)

    Chiaraluce, Franco; Gambi, Ennio; Righi, Giorgia

    This paper extends previous analytical approaches for the study of CDMA systems to the relevant case of multipath environments where users can operate at different bit rates. This scenario is of interest for the Wideband CDMA strategy employed in UMTS, and the model permits the performance comparison of classic and more innovative spreading signals. The method is based on the characteristic function approach, which allows the various kinds of interference to be modeled accurately. Some numerical examples are given with reference to the ITU-R M.1225 Recommendation, but the analysis could be extended to different channel descriptions.

  12. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art wherein a new product idea is created rather than a visual representation that would be used directly in a final product. The purpose here is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of the CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps to establish the importance ratings of the DRs.

  13. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  14. Teaching Analytical Chemistry to Pharmacy Students: A Combined, Iterative Approach

    ERIC Educational Resources Information Center

    Masania, Jinit; Grootveld, Martin; Wilson, Philippe B.

    2018-01-01

    Analytical chemistry has often been a difficult subject to teach in a classroom or lecture-based context. Numerous strategies for overcoming the inherently practical-based difficulties have been suggested, each with differing pedagogical theories. Here, we present a combined approach to tackling the problem of teaching analytical chemistry, with…

  15. Analytical study of the heat loss attenuation by clothing on thermal manikins under radiative heat loads.

    PubMed

    Den Hartog, Emiel A; Havenith, George

    2010-01-01

    For wearers of protective clothing in radiation environments there are no quantitative guidelines available for the effect of a radiative heat load on heat exchange. Under the European Union funded project ThermProtect, an analytical effort was defined to address the issue of radiative heat load while wearing protective clothing. As much information has become available within the ThermProtect project from thermal manikin experiments in thermal radiation environments, these sets of experimental data are used to verify the analytical approach. The analytical approach provided a good prediction of the heat loss in the manikin experiments; 95% of the variance was explained by the model. The model has not yet been validated at high radiative heat loads and neglects some physical properties of the radiation emissivity. Still, it provides a pragmatic approach and may be useful for practical implementation in protective clothing standards for moderate thermal radiation environments.

  16. Immunogenicity of therapeutics: a matter of efficacy and safety.

    PubMed

    Nechansky, Andreas; Kircheis, Ralf

    2010-11-01

    The unwanted immunogenicity of therapeutic proteins is a major concern regarding patient safety. Furthermore, pharmacokinetics, pharmacodynamics and clinical efficacy can be seriously affected by the immunogenicity of therapeutic proteins. Authorities have fully recognized this issue and demand appropriate and well-characterized assays to detect anti-drug antibodies (ADAs). We provide an overview of the immunogenicity topic in general, the regulatory background, and insight into the underlying immunological mechanisms and the limited ability to predict clinical immunogenicity a priori. Furthermore, we comment on the analytical testing approach and the status quo of appropriate method validation. The review provides insight regarding the analytical approach that is expected by regulatory authorities overseeing immunogenicity testing requirements. Additionally, the factors influencing immunogenicity are summarized and key references regarding immunogenicity testing approaches and method validation are discussed. The unwanted immunogenicity of protein therapeutics is of major concern because of its potential to affect patient safety and drug efficacy. Analytical testing is sophisticated and requires more than one assay. Because immunogenicity in humans is hardly predictable, assay development has to start in a timely fashion, and for clinical studies immunogenicity assay validation is mandatory prior to analyzing patient serum samples. Regarding ADAs, the question remains as to when such antibodies are regarded as clinically relevant and what levels, if any, are acceptable. In summary, the detection of ADAs should raise the awareness of the physician concerning patient safety and of the sponsor/manufacturer concerning the immunogenic potential of the drug product.

  17. Toward an Empirical Multidimensional Structure of Anhedonia, Reward Sensitivity, and Positive Emotionality: An Exploratory Factor Analytic Study.

    PubMed

    Olino, Thomas M; McMakin, Dana L; Forbes, Erika E

    2016-11-20

    Positive emotionality, anhedonia, and reward sensitivity share motivational and experiential elements of approach motivation and pleasure. Earlier work has examined the interrelationships among these constructs from measures of extraversion. More recently, the Research Domain Criteria introduced the Positive Valence Systems as a primary dimension to better understand psychopathology. However, the suggested measures tapping this construct have not yet been integrated within the structural framework of personality, even at the level of self-report. Thus, this study conducted exploratory factor and exploratory bifactor analyses on 17 different dimensions relevant to approach motivation, spanning anhedonia, behavioral activation system functioning, and positive emotionality. Convergent validity of these dimensions is tested by examining associations with depressive symptoms. Relying on multiple indices of fit, our preferred model included a general factor along with specific factors of affiliation, positive emotion, assertiveness, and pleasure seeking. These factors demonstrated different patterns of association with depressive symptoms. We discuss the plausibility of this model and highlight important future directions for work on the structure of a broad Positive Valence Systems construct. © The Author(s) 2016.
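    A minimal sketch of the exploratory factor analysis step (not the bifactor models, which require a dedicated bifactor rotation) is shown below; it assumes the third-party factor_analyzer package and uses a synthetic stand-in for the 17 approach-motivation dimensions.

```python
import numpy as np
from factor_analyzer import FactorAnalyzer  # third-party package, assumed available

# Synthetic stand-in for 500 respondents scored on 17 approach-motivation-related dimensions.
rng = np.random.default_rng(3)
general = rng.normal(size=(500, 1))
X = general + rng.normal(scale=0.8, size=(500, 17))

# Exploratory factor analysis with an oblique (oblimin) rotation and five retained factors,
# loosely mirroring the general-plus-specific structure described in the abstract.
fa = FactorAnalyzer(n_factors=5, rotation="oblimin", method="minres")
fa.fit(X)
print(np.round(fa.loadings_, 2))       # 17 x 5 pattern matrix
print(fa.get_factor_variance()[1])     # proportion of variance explained per factor
```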

  18. Headspace single drop microextraction versus dispersive liquid-liquid microextraction using magnetic ionic liquid extraction solvents.

    PubMed

    An, Jiwoo; Rahn, Kira L; Anderson, Jared L

    2017-05-15

    A headspace single drop microextraction (HS-SDME) method and a dispersive liquid-liquid microextraction (DLLME) method were developed using two tetrachloromanganate ([MnCl4]2-)-based magnetic ionic liquids (MIL) as extraction solvents for the determination of twelve aromatic compounds, including four polyaromatic hydrocarbons, by reversed phase high-performance liquid chromatography (HPLC). The analytical performance of the developed HS-SDME method was compared to the DLLME approach employing the same MILs. In the HS-SDME approach, the magnetic field generated by the magnet was exploited to suspend the MIL solvent from the tip of a rod magnet. The utilization of MILs in HS-SDME resulted in a highly stable microdroplet under elevated temperatures and long extraction times, overcoming a common challenge encountered in traditional SDME approaches of droplet instability. The low UV absorbance of the [MnCl4]2--based MILs permitted direct analysis of the analyte-enriched extraction solvent by HPLC. In HS-SDME, the effects of ionic strength of the sample solution, temperature of the extraction system, extraction time, stir rate, and headspace volume on extraction efficiencies were examined. Coefficients of determination (R2) ranged from 0.994 to 0.999 and limits of detection (LODs) varied from 0.04 to 1.0 μg L-1, with relative recoveries from lake water ranging from 70.2% to 109.6%. For the DLLME method, parameters including disperser solvent type and volume, ionic strength of the sample solution, mass of extraction solvent, and extraction time were studied and optimized. Coefficients of determination for the DLLME method varied from 0.997 to 0.999 with LODs ranging from 0.05 to 1.0 μg L-1. Relative recoveries from lake water samples ranged from 68.7% to 104.5%. Overall, the DLLME approach permitted faster extraction times and higher enrichment factors for analytes with low vapor pressure, whereas the HS-SDME approach exhibited better extraction efficiencies for analytes with relatively higher vapor pressure. Copyright © 2017 Elsevier B.V. All rights reserved.
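    As a sketch of how figures of merit like these are typically derived from a calibration curve (the paper's own calculation convention is not stated in the abstract), the following numpy snippet fits a hypothetical calibration line, computes R2, and estimates ICH-style detection and quantitation limits from the residual standard deviation.

```python
import numpy as np

# Hypothetical calibration data for one analyte: spiked concentration (ug/L) vs. peak area.
conc = np.array([0.5, 1, 5, 10, 25, 50], dtype=float)
area = np.array([120, 260, 1250, 2480, 6300, 12550], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                     # coefficient of determination

# ICH-style limits from the residual standard deviation of the regression.
s_res = np.sqrt(ss_res / (len(conc) - 2))
lod = 3.3 * s_res / slope
loq = 10 * s_res / slope
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
```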

  19. Predictors of family strength: the integrated spiritual-religious/resilient perspective for understanding the healthy/strong family.

    PubMed

    Ghaffari, Majid; Fatehizade, Maryam; Ahmadi, Ahmad; Ghasemi, Vahid; Baghban, Iran

    2013-01-01

    The present study aimed to investigate the effects of spiritual well-being and family protective factors on family strength in a propositional structural model. The research population consisted of married residents of Isfahan, Iran, in 2012 who had preschool-aged children, were in the first decade of marriage, and had at least eight years of schooling. Three hundred ninety-five volunteer, unpaid participants were selected randomly through multi-stage sampling from seven regions of the city. The instruments used were the Spiritual Well-being Scale, the Inventory of Family Protective Factors, and the Family Strength Scale. Descriptive statistics and a structural equation modeling analytic approach were used. The analytic model predicted 82% of the variance of family strength. The total effect of spiritual well-being on family strength was higher than that of the family protective factors. Furthermore, spiritual well-being predicted 43% of the distribution of the family protective factors and had an indirect effect on family strength through the family protective factors (p < 0.001). The results of this study confirmed the interrelationships among spiritual well-being and family protective factors, and their simultaneous effects on family strength. Family counselors may employ an integrated spiritual-religious/resilient perspective to inform their strength-based work with individuals and their families.

  20. Defining dignity in terminally ill cancer patients: a factor-analytic approach.

    PubMed

    Hack, Thomas F; Chochinov, Harvey Max; Hassard, Thomas; Kristjanson, Linda J; McClement, Susan; Harlos, Mike

    2004-10-01

    The construct of 'dignity' is frequently raised in discussions about quality end of life care for terminal cancer patients, and is invoked by parties on both sides of the euthanasia debate. Lacking in this general debate has been an empirical explication of 'dignity' from the viewpoint of cancer patients themselves. The purpose of the present study was to use factor-analytic and regression methods to analyze dignity data gathered from 213 cancer patients having less than 6 months to live. Patients rated their sense of dignity, and completed measures of symptom distress and psychological well-being. The results showed that although the majority of patients had an intact sense of dignity, there were 99 (46%) patients who reported at least some, or occasional loss of dignity, and 16 (7.5%) patients who indicated that loss of dignity was a significant problem. The exploratory factor analysis yielded six primary factors: (1) Pain; (2) Intimate Dependency; (3) Hopelessness/Depression; (4) Informal Support Network; (5) Formal Support Network; and (6) Quality of Life. Subsequent regression analyses of modifiable factors produced a final two-factor (Hopelessness/Depression and Intimate Dependency) model of statistical significance. These results provide empirical support for the dignity model, and suggest that the provision of end of life care should include methods for treating depression, fostering hope, and facilitating functional independence. Copyright 2004 John Wiley & Sons, Ltd.

  1. High-frequency phase shift measurement greatly enhances the sensitivity of QCM immunosensors.

    PubMed

    March, Carmen; García, José V; Sánchez, Ángel; Arnau, Antonio; Jiménez, Yolanda; García, Pablo; Manclús, Juan J; Montoya, Ángel

    2015-03-15

    In spite of being widely used for in-liquid biosensing applications, sensitivity improvement of conventional (5-20 MHz) quartz crystal microbalance (QCM) sensors remains an unsolved and challenging task. With the help of a new electronic characterization approach based on phase change measurements at a constant fixed frequency, a highly sensitive and versatile high fundamental frequency (HFF) QCM immunosensor has successfully been developed and tested for use in pesticide (carbaryl and thiabendazole) analysis. The analytical performance of several immunosensors was compared in competitive immunoassays taking the carbaryl insecticide as the model analyte. The highest sensitivity was exhibited by the 100 MHz HFF-QCM carbaryl immunosensor. When results were compared with those reported for 9 MHz QCM, the analytical parameters clearly showed an improvement of one order of magnitude in sensitivity (estimated as the I50 value) and two orders of magnitude in the limit of detection (LOD): 30 μg L-1 vs 0.66 μg L-1 for the I50 value and 11 μg L-1 vs 0.14 μg L-1 for the LOD, for 9 and 100 MHz, respectively. For the fungicide thiabendazole, the I50 value was roughly the same as that previously reported for SPR under the same biochemical conditions, whereas the LOD improved by a factor of 2. The analytical performance achieved by high frequency QCM immunosensors surpasses that of conventional QCM and SPR, closely approaching the most sensitive ELISAs. The developed 100 MHz QCM immunosensor strongly improves sensitivity in biosensing and can therefore be considered a very promising new analytical tool for in-liquid applications where highly sensitive detection is required. Copyright © 2014 Elsevier B.V. All rights reserved.
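    The I50 of a competitive immunoassay is usually read off a four-parameter logistic fit of the standard curve; the sketch below shows that fit on hypothetical carbaryl calibration data (the paper's raw signals are not given), assuming scipy.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, i50, slope):
    """Four-parameter logistic for a competitive immunoassay (signal falls as analyte rises)."""
    return bottom + (top - bottom) / (1.0 + (x / i50) ** slope)

# Hypothetical carbaryl standard curve: concentration (ug/L) vs. normalized sensor response.
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
signal = np.array([0.99, 0.97, 0.90, 0.78, 0.55, 0.30, 0.12, 0.05])

popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 0.7, 1.0])
top, bottom, i50, slope = popt
print(f"I50 = {i50:.2f} ug/L")  # midpoint of the competitive curve, used as the sensitivity estimate
```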

  2. Biosensors for the determination of environmental inhibitors of enzymes

    NASA Astrophysics Data System (ADS)

    Evtugyn, Gennadii A.; Budnikov, Herman C.; Nikolskaya, Elena B.

    1999-12-01

    Characteristic features of functioning and practical application of enzyme-based biosensors for the determination of environmental pollutants as enzyme inhibitors are considered with special emphasis on the influence of the methods used for the measurement of the rates of enzymic reactions, of enzyme immobilisation procedure and of the composition of the reaction medium on the analytical characteristics of inhibitor assays. The published data on the development of biosensors for detecting pesticides and heavy metals are surveyed. Special attention is given to the use of cholinesterase-based biosensors in environmental and analytical monitoring. The approaches to the estimation of kinetic parameters of inhibition are reviewed and the factors determining the selectivity and sensitivity of inhibitor assays in environmental objects are analysed. The bibliography includes 195 references.

  3. Vibration and damping of laminated, composite-material plates including thickness-shear effects

    NASA Technical Reports Server (NTRS)

    Bert, C. W.; Siu, C. C.

    1972-01-01

    An analytical investigation of sinusoidally forced vibration of laminated, anisotropic plates including bending-stretching coupling, thickness-shear flexibility, all three types of inertia effects, and material damping is presented. In the analysis the effects of thickness-shear deformation are considered by the use of a shear correction factor K, analogous to that used by Mindlin for homogeneous plates. Two entirely different approaches for calculating the thickness-shear factor for a laminate are presented. Numerical examples indicate that the value of K depends on the layer properties and the stacking sequence of the laminate.

  4. [Impact factor, its variants and its influence in academic promotion].

    PubMed

    Puche, Rodolfo C

    2011-01-01

    Bibliometrics is a set of methods used to study or measure texts and information. While bibliometric methods are most often used in the field of library and information science, bibliometric variables have wide applications in other areas. One popular bibliometric variable is Garfield's Impact Factor (IF). The IF is used to explore the impact of a given field, the impact of a set of researchers, or the impact of a particular paper. This variable is used to assess academic output, and it is believed to adversely affect the traditional approach to the assessment of scientific research. In our country, the members of the evaluation committees of research-intensive institutions, e.g. the National Scientific and Technical Research Council (CONICET), use the IF to assess the quality of research. This article reviews the exponential growth of bibliometrics and attempts to expose the overall dissatisfaction with the analytical quality of the IF. Such dissatisfaction is expressed in the number of investigations attempting to obtain a variable of improved analytical quality.

  5. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    PubMed

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Cotinine analytical workshop report: consideration of analytical methods for determining cotinine in human body fluids as a measure of passive exposure to tobacco smoke.

    PubMed Central

    Watts, R R; Langone, J J; Knight, G J; Lewtas, J

    1990-01-01

    A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812

  7. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions, other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis. These factors included: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
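    The central composite step can be reproduced in outline with a quadratic response surface: the sketch below fits a second-order model to hypothetical design points for the two significant factors and locates the predicted optimum (factor levels and responses are illustrative, not the paper's data), assuming scikit-learn.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical central-composite design points for the two significant factors:
# carrier gas flow rate (mL/min) and oven ramp (degC/min), with analysis time (min) as response.
X = np.array([[1.0, 10], [1.0, 30], [2.0, 10], [2.0, 30], [0.8, 20],
              [2.2, 20], [1.5, 6], [1.5, 34], [1.5, 20], [1.5, 20]])
y = np.array([14.2, 9.8, 12.5, 8.1, 13.6, 9.0, 15.5, 8.6, 10.4, 10.2])

# Fit a full second-order (quadratic) response surface, the usual model for a CCD.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Locate the predicted optimum on a grid over the experimental region.
flow, ramp = np.meshgrid(np.linspace(0.8, 2.2, 50), np.linspace(6, 34, 50))
grid = np.column_stack([flow.ravel(), ramp.ravel()])
best = grid[np.argmin(model.predict(poly.transform(grid)))]  # minimize analysis time here
print("predicted optimum (flow, ramp):", best)
```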

  8. Decreased pain sensitivity due to trimethylbenzene exposure ...

    EPA Pesticide Factsheets

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight-of-evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the development of quantitative approaches for evidence integration, including the application of meta-analyses. The following hazard identification case study applies qualitative as well as meta-analytic approaches to trimethylbenzene (TMB) isomer exposure and the potential neurotoxic effects on pain sensitivity. In the meta-analytic approach, a pooled effect size is calculated, after consideration of multiple confounding factors, in order to determine whether the entire database under consideration indicates that TMBs are likely to be a neurotoxic hazard. The pain sensitivity studies included in the present analyses initially seem discordant in their results: effects on pain sensitivity are seen immediately after termination of exposure, appear to resolve 24 hours after exposure, and then reappear 50 days later following foot-shock. Qualitative consideration of the toxicological and toxicokinetic characteristics of the TMB isomers suggests that the observed differences between studies are due to testing time and can be explained through a complete consideration of the underlying biology of the effect and the nervous system as a whole. Meta-analyses and meta-regressions support this conclusion.
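    A pooled effect size of the kind described is commonly obtained with a random-effects model; the numpy sketch below applies the DerSimonian-Laird estimator to hypothetical per-study effect sizes and variances (the actual study-level data are not given here).

```python
import numpy as np

# Hypothetical per-study standardized effect sizes and their variances (pain-sensitivity endpoints).
effects = np.array([0.42, 0.15, 0.55, 0.05, 0.38])
variances = np.array([0.04, 0.06, 0.09, 0.05, 0.07])

# DerSimonian-Laird random-effects pooling.
w_fixed = 1.0 / variances
mean_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
q = np.sum(w_fixed * (effects - mean_fixed) ** 2)           # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)               # between-study variance
w_random = 1.0 / (variances + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
se = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled effect = {pooled:.2f}, 95% CI = ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")
```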

  9. A large-scale superhydrophobic surface-enhanced Raman scattering (SERS) platform fabricated via capillary force lithography and assembly of Ag nanocubes for ultratrace molecular sensing.

    PubMed

    Tan, Joel Ming Rui; Ruan, Justina Jiexin; Lee, Hiang Kwee; Phang, In Yee; Ling, Xing Yi

    2014-12-28

    An analytical platform with an ultratrace detection limit in the atto-molar (aM) concentration range is vital for forensic, industrial and environmental sectors that handle scarce or highly toxic samples. Superhydrophobic surface-enhanced Raman scattering (SERS) platforms are ideal for enhancing detection sensitivity by reducing the random spreading of an aqueous solution. However, the fabrication of superhydrophobic SERS platforms is generally limited by the use of sophisticated and expensive protocols and/or suffers from structural and signal inconsistency. Herein, we demonstrate the high-throughput fabrication of a stable and uniform superhydrophobic SERS platform for ultratrace molecular sensing. Large-area box-like micropatterns of the polymeric surface are first fabricated using capillary force lithography (CFL). Subsequently, plasmonic properties are incorporated into the patterned surfaces by decorating them with Ag nanocubes using the Langmuir-Schaefer technique. To create a stable superhydrophobic SERS platform, an additional 25 nm Ag film is coated over the Ag nanocube-decorated patterned template, followed by chemical functionalization with perfluorodecanethiol. The resulting superhydrophobic SERS platform demonstrates excellent water repellency, with a static contact angle of 165° ± 9° and a consequent analyte concentration factor of 59-fold as compared to its hydrophilic counterpart. By combining the analyte concentration effect of superhydrophobic surfaces with the intense electromagnetic "hot spots" of Ag nanocubes, our superhydrophobic SERS platform achieves an ultra-low detection limit of 10^-17 M (10 aM) for rhodamine 6G using just 4 μL of analyte solution, corresponding to an analytical SERS enhancement factor of 10^13. Our fabrication protocol demonstrates a simple, cost- and time-effective approach for the large-scale fabrication of a superhydrophobic SERS platform for ultratrace molecular detection.
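    For reference, the analytical SERS enhancement factor is conventionally computed as (I_SERS/c_SERS)/(I_ref/c_ref); the intensities and concentrations in the snippet below are made up to illustrate the arithmetic, not taken from the paper.

```python
# Analytical SERS enhancement factor, AEF = (I_SERS / c_SERS) / (I_ref / c_ref);
# intensities and concentrations below are illustrative, not the paper's measurements.
i_sers, c_sers = 2.0e3, 1.0e-17   # SERS peak intensity at a 10 aM analyte concentration
i_ref, c_ref = 2.0e4, 1.0e-3      # normal Raman intensity at a millimolar reference

aef = (i_sers / c_sers) / (i_ref / c_ref)
print(f"analytical enhancement factor ~ {aef:.1e}")  # ~1e13 with these placeholder numbers
```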

  10. Methods for Estimating Uncertainty in Factor Analytic Solutions

    EPA Science Inventory

    The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement (BS-DISP).

  11. A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.

    PubMed

    Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair

    2018-05-15

    Service perception analysis is crucial for understanding both user experience and network quality, as well as for the maintenance and optimization of mobile networks. Given the rapid development of the mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluate and optimize users' service perceptions has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze users' OTT service perception from the end-user's point of view rather than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQI) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed, whose goal is to identify the major factors that impact the perceptional degradation of the web browsing service as well as the significance of their contributions. Finally, a case study is presented to show the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.

  12. Analytical approaches for the detection of emerging therapeutics and non-approved drugs in human doping controls.

    PubMed

    Thevis, Mario; Schänzer, Wilhelm

    2014-12-01

    The number and diversity of potentially performance-enhancing substances is continuously growing, fueled by new pharmaceutical developments but also by the inventiveness and, at the same time, unscrupulousness of black-market (designer) drug producers and providers. In terms of sports drug testing, this situation necessitates reactive as well as proactive research and expansion of the analytical armamentarium to ensure timely, adequate, and comprehensive doping controls. This review summarizes literature published over the past 5 years on new drug entities, discontinued therapeutics, and 'tailored' compounds classified as doping agents according to the regulations of the World Anti-Doping Agency, with particular attention to analytical strategies enabling their detection in human blood or urine. Among these compounds, low- and high-molecular mass substances of peptidic (e.g. modified insulin-like growth factor-1, TB-500, hematide/peginesatide, growth hormone releasing peptides, AOD-9604, etc.) and non-peptidic (selective androgen receptor modulators, hypoxia-inducible factor stabilizers, siRNA, S-107 and ARM036/aladorian, etc.) as well as inorganic (cobalt) nature are considered and discussed in terms of specific requirements originating from physicochemical properties, concentration levels, metabolism, and their amenability for chromatographic-mass spectrometric or alternative detection methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Quantifying Acoustic Impacts on Marine Mammals and Sea Turtles: Methods and Analytical Approach for Phase III Training and Testing

    DTIC Science & Technology

    2017-06-16

    Quantifying Acoustic Impacts on Marine Mammals and Sea Turtles: Methods and Analytical Approach for Phase III Training and Testing (Sarah A. Blackstock, Joseph O..., December 2017). The report addresses the Navy's Phase III Study Areas as described in each Environmental Impact Statement/Overseas Environmental Impact Statement and describes the methods...

  14. [Laparoscopy for perforated duodenal ulcer: conversion and morbidity factors. A retrospective study of 290 cases].

    PubMed

    Ben Abid, Sadreddine; Mzoughi, Zeineb; Attaoui, Mohamed Amine; Talbi, Ghofrane; Arfa, Nafaa; Gharbi, Lassaad; Khalfallah, Mohamed Taher

    2014-12-01

    The feasibility and advantages of the laparoscopic approach to perforated duodenal ulcer no longer need to be demonstrated. Laparoscopic suture and peritoneal cleaning carry a conversion rate of between 10 and 23%, and although lower than with laparotomy, the morbidity of this approach is not absent. This study aims to analyze the factors associated with conversion after laparoscopic treatment of perforated duodenal ulcer, and to define the morbidity of this approach and its predictive factors. Methods: A retrospective descriptive study was conducted including all cases of perforated duodenal ulcer treated laparoscopically over a period of ten years, from January 2000 to December 2010. All patients were operated on by laparoscopy, with or without conversion, and conversion factors were recorded. A statistical analysis with logistic regression was performed to identify independent risk factors for conversion among those found statistically significant in univariate analysis; the significance level was set at 5%. Univariate and multivariate analyses were performed to identify morbidity factors. A total of 290 patients were included. The median age was 34 years. The intervention was completed laparoscopically in 91.4% of cases; the conversion rate was 8.6%. The following were identified as risk factors for conversion: age > 32 years, a known ulcer, progressive pain, renal failure, difficult peritoneal lavage, and a chronic-appearing ulcer. Postoperative morbidity was 5.1%. Three independent risk factors for surgical complications were identified: renal failure, age > 45 years, and a chronic-appearing ulcer. Laparoscopic treatment of perforated duodenal ulcer carries a risk of conversion. Morbidity is certainly lower than with laparotomy, and better knowledge of the predictive factors of morbidity is necessary for better management of this disease.

  15. A Superior Kirchhoff Method for Aeroacoustic Noise Prediction: The Ffowcs Williams-Hawkings Equation

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.

    1997-01-01

    The prediction of aeroacoustic noise is important; all new aircraft must meet noise certification requirements. Local noise standards can be even more stringent. The NASA noise reduction goal is to reduce perceived noise levels by a factor of two in 10 years. The objective of this viewgraph presentation is to demonstrate the superiority of the FW-H approach over the Kirchhoff method for aeroacoustics, both analytically and numerically.

  16. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in hydraulic conductivity as well. PMID:25248170

  17. Uncertainty of relative sensitivity factors in glow discharge mass spectrometry

    NASA Astrophysics Data System (ADS)

    Meija, Juris; Methven, Brad; Sturgeon, Ralph E.

    2017-10-01

    The concept of the relative sensitivity factors required for the correction of the measured ion beam ratios in pin-cell glow discharge mass spectrometry is examined in detail. We propose a data-driven model for predicting the relative response factors, which relies on a non-linear least squares adjustment and analyte/matrix interchangeability phenomena. The model provides a self-consistent set of response factors for any analyte/matrix combination of any element that appears as either an analyte or matrix in at least one known response factor.

  18. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  19. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches.

    PubMed

    Van Liew, Charles; Santoro, Maya S; Edwards, Larissa; Kang, Jeremy; Cronan, Terry A

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring (PAF), and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, Scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was the preferable solution, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed.
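
    Of the extraction criteria listed above, Horn's parallel analysis is the one that ultimately guided the retained solution. A generic sketch of that procedure is given below: factors are kept while the observed eigenvalues exceed those of random data of the same dimensions. This is an illustrative implementation, not the code used in the study.

    ```python
    # Generic Horn's parallel analysis: retain factors whose observed eigenvalues
    # exceed the mean eigenvalues of random data with the same shape.
    import numpy as np

    def parallel_analysis(data: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
        rng = np.random.default_rng(seed)
        n_obs, n_items = data.shape
        observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        random_eigs = np.empty((n_iter, n_items))
        for i in range(n_iter):
            noise = rng.standard_normal((n_obs, n_items))
            random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
        threshold = random_eigs.mean(axis=0)       # 95th percentile is also common
        return int(np.sum(observed > threshold))   # number of factors to retain
    ```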

  20. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches

    PubMed Central

    Edwards, Larissa; Kang, Jeremy

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring (PAF), and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, Scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was the preferable solution, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed. PMID:28070160

  1. Improving the efficiency of quantitative (1)H NMR: an innovative external standard-internal reference approach.

    PubMed

    Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V

    2014-01-01

    The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte by measuring a solution containing both the analyte and a standard. The standard must therefore meet the requirements of chemical compatibility and lack of resonance interference with the analyte, as well as having a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard of known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the issues of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of the NMR tubes, must be kept the same; any deviation will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and free of resonance interference with the analyte or external standard, whereas the external standard must only be of known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method significantly reduces the burden of searching for an appropriate standard for each analyte; the efficiency of the qNMR purity assay therefore increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
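
    The core of the approach described above is a ratio of ratios: both the analyte solution and the external standard solution are spiked with the same quantity of an internal reference, whose signal cancels the instrumental factors. The sketch below illustrates that principle with hypothetical symbol names; it is a simplified illustration of the idea, not the authors' validated procedure.

    ```python
    # Simplified illustration of the ratio-of-ratios principle: each tube
    # (analyte and external standard) is spiked with the same amount of an
    # internal reference, whose signal cancels instrument-dependent factors.
    # Symbols (hypothetical): I = peak integral, N = number of protons behind
    # the peak, M = molar mass, m = weighed mass, P = purity (fraction).

    def qnmr_purity(I_a, N_a, I_ref_a, I_s, N_s, I_ref_s, N_ref,
                    M_a, M_s, m_a, m_s, P_s):
        ratio_analyte = (I_a / N_a) / (I_ref_a / N_ref)    # analyte tube
        ratio_standard = (I_s / N_s) / (I_ref_s / N_ref)   # external-standard tube
        return (ratio_analyte / ratio_standard) * (M_a / M_s) * (m_s / m_a) * P_s
    ```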

  2. [Changing the focus: an exploratory study of drug use and workplace violence among women of popular classes in Rio de Janeiro, Brazil].

    PubMed

    David, Helena Maria Scherlowski Leal; Caufield, Catherine

    2005-01-01

    This exploratory study aimed to investigate factors related to the use of illicit and licit drugs and to workplace violence in a group of women from popular classes in the city of Rio de Janeiro. A descriptive and analytic quantitative approach was used, as well as a qualitative approach through in-depth interviews with women who had suffered or were suffering workplace violence, using the collective subject discourse analysis methodology. The results showed sociodemographic and work situations that can be considered possible risk factors for drug consumption and workplace violence. The qualitative analysis shows how this group perceives the phenomena of drug use and workplace violence, expanding the comprehension of these issues and providing conceptual and methodological elements for additional studies on this subject.

  3. A confirmatory factor analytic validation of the Tinnitus Handicap Inventory.

    PubMed

    Kleinstäuber, Maria; Frank, Ina; Weise, Cornelia

    2015-03-01

    Because the postulated three-factor structure of the internationally widely used Tinnitus Handicap Inventory (THI) had not yet been confirmed by a confirmatory factor analytic approach, this was the central aim of the current study. From a clinical setting, N=373 patients with chronic tinnitus completed the THI and further questionnaires assessing tinnitus-related and psychological variables. In order to analyze the psychometric properties of the THI, confirmatory factor analysis (CFA) and correlational analyses were conducted. CFA provided statistically significant support for a better fit of the data to the hypothesized three-factor structure (RMSEA=.049, WRMR=1.062, CFI=.965, TLI=.961) than to a general factor model (RMSEA=.062, WRMR=1.258, CFI=.942, TLI=.937). The calculation of Cronbach's alpha as an indicator of internal consistency revealed satisfactory values (.80-.91), with the exception of the catastrophic subscale (.65). High positive correlations of the THI and its subscales with other measures of tinnitus distress, anxiety, and depression, high negative correlations with tinnitus acceptance, moderate positive correlations with anxiety sensitivity, sleeping difficulties, and tinnitus loudness, and small correlations with the Big Five personality dimensions confirmed construct validity. Results show that the THI is a highly reliable and valid measure of tinnitus-related handicap. In contrast to the results of previous exploratory analyses, the current findings support a three-factor rather than a unifactorial structure. Future research is needed to replicate this result in different tinnitus populations. Copyright © 2015 Elsevier Inc. All rights reserved.
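
    As a side note on the internal-consistency values quoted above, Cronbach's alpha is easily computed from item-level scores; the snippet below is a generic NumPy implementation, not tied to the THI data.

    ```python
    # Generic Cronbach's alpha from an (n_respondents x n_items) score matrix.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_variances / total_variance)
    ```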

  4. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.

  5. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  6. Deducing the form factors for shear used in the calculus of the displacements based on strain energy methods. Mathematical approach for currently used shapes

    NASA Astrophysics Data System (ADS)

    Constantinescu, E.; Oanta, E.; Panait, C.

    2017-08-01

    The paper presents an initial study concerning the form factors for shear for a rectangular and for a circular cross section, using an analytical method and a numerical study. The numerical study divides the cross section into small areas and applies the defining relations directly in order to compute the corresponding integrals. Accurate values of the form factors increase the accuracy of the displacements computed by the use of strain energy methods. The knowledge resulting from this study will be used for several directions of development: calculus of the form factors for a ring-type cross section with a variable ratio of the inner and outer diameters, calculus of the geometrical characteristics of an inclined circular segment and, using a Boolean algebra that operates with geometrical shapes, of an inclined circular ring segment. These shapes may be used to analytically define the geometrical model of a complex composite section, i.e. a ship hull cross section. The corresponding calculus relations are also useful for the development of customized design commands in commercial CAD applications. The paper is a result of the authors' long-term development of original computer-based instruments in engineering.
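
    The numerical strategy described above (dividing the cross section into small areas and evaluating the defining integrals directly) can be illustrated for the rectangular case, for which the exact form factor for shear is 6/5. The sketch below is a generic illustration of that idea, not the authors' implementation.

    ```python
    # Numerical evaluation of the shear form factor f = (A / I^2) * integral of
    # (S(y)/b)^2 dA for a rectangle of width b and height h; the exact value is 6/5.
    import numpy as np

    def shear_form_factor_rectangle(b=1.0, h=1.0, n=2000):
        y = np.linspace(-h / 2, h / 2, n)
        dy = y[1] - y[0]
        A = b * h
        I = b * h**3 / 12.0
        S = (b / 2.0) * (h**2 / 4.0 - y**2)   # first moment of the area above y
        integrand = (S**2 / b**2) * b         # (S/b)^2 over dA = b * dy
        return (A / I**2) * np.sum(integrand) * dy

    print(shear_form_factor_rectangle())       # approximately 1.2
    ```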

  7. The Hispanic Stress Inventory--Adolescent Version: a culturally informed psychosocial assessment.

    PubMed

    Cervantes, Richard C; Fisher, Dennis G; Córdova, David; Napper, Lucy E

    2012-03-01

    A 2-phase study was conducted to develop a culturally informed measure of psychosocial stress for adolescents: the Hispanic Stress Inventory--Adolescent Version (HSI-A). Phase 1 involved item development through the collection of open-ended focus group interview data (n = 170) from a heterogeneous sample of Hispanic youths residing in the southwest and northeast United States. In Phase 2, we examined the psychometric properties of the HSI-A (n = 1,651), which involved the use of factor analytic procedures to determine the underlying scale structure of the HSI-A for foreign-born and U.S.-born participants in an aggregated analytic approach. An 8-factor solution was established, with factors that include Family Economic Stress, Acculturation-Gap Stress, Culture and Educational Stress, Immigration-Related Stress, Discrimination Stress, Family Immigration Stress, Community and Gang-Related Stress, and Family and Drug-Related Stress. Concurrent, related validity estimates were calculated to determine relations between HSI-A and other measures of child psychopathology and behavioral and emotional disturbances. HSI-A total stress appraisal scores were significantly correlated with both the Children's Depression Inventory and the Youth Self Report (p < .001). Reliability estimates for the HSI-A were conducted, and they yielded high reliability coefficients for most factor subscales, with the HSI-A total stress appraisal score reliability alpha at .92.

  8. The Hispanic Stress Inventory-Adolescent Version: A Culturally Informed Psychosocial Assessment

    PubMed Central

    Cervantes, Richard C.; Fisher, Dennis G.; Córdova, David; Napper, Lucy

    2012-01-01

    A 2-phase study was conducted to develop a culturally informed measure of psychosocial stress for adolescents, the Hispanic Stress Inventory-Adolescent Version (HSI-A). Phase 1 involved item development through the collection of open-ended focus group interview data (n=170) from a heterogeneous sample of Hispanic youth residing in the southwest and northeast United States. Phase 2 examined the psychometric properties of the HSI-A (n=1651), involving the use of factor analytic procedures to determine the underlying scale structure of the HSI-A for foreign-born and U.S.-born participants in an aggregated analytic approach. An eight-factor solution was established, with factors that include Family Economic Stress, Acculturation Gap Stress, Culture and Educational Stress, Immigration-Related Stress, Discrimination Stress, Family Immigration Stress, Community and Gang Violence Stress, and Family Drug-Related Stress. Concurrent related validity estimates were calculated to determine relationships between the HSI-A and other measures of child psychopathology and behavioral and emotional disturbances. HSI-A Total Stress Appraisal scores were significantly correlated with both the Children's Depression Inventory (CDI) and the Youth Self Report (YSR) (p<.001). Reliability estimates for the HSI-A yielded high reliability coefficients for most factor subscales, with the HSI-A Total Stress Appraisal score reliability at alpha=.92. PMID:21942232

  9. Combining Model-Based and Feature-Driven Diagnosis Approaches - A Case Study on Electromechanical Actuators

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Roychoudhury, Indranil; Balaban, Edward; Saxena, Abhinav

    2010-01-01

    Model-based diagnosis typically uses analytical redundancy to compare predictions from a model against observations from the system being diagnosed. However, this approach does not work well when it is not feasible to create analytic relations describing all the observed data, e.g., for vibration data, which is usually sampled at very high rates and requires very detailed finite element models to describe its behavior. In such cases, features (in the time and frequency domains) that contain diagnostic information are extracted from the data. Since this is a computationally intensive process, it is not efficient to extract all the features all the time. In this paper we present an approach that combines the analytic model-based and feature-driven diagnosis approaches. The analytic approach is used to reduce the set of possible faults, and features are then chosen to best distinguish among the remaining faults. We describe an implementation of this approach on the Flyable Electro-mechanical Actuator (FLEA) test bed.

  10. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  11. Exploring key factors in online shopping with a hybrid model.

    PubMed

    Chen, Hsiao-Ming; Wu, Chia-Huei; Tsai, Sang-Bing; Yu, Jian; Wang, Jiangtao; Zheng, Yuxiang

    2016-01-01

    Nowadays, the web increasingly influences retail sales. An in-depth analysis of consumer decision-making in the context of e-business has become an important issue for internet vendors. However, the factors affecting e-business are complicated and intertwined. To stimulate online sales, understanding the key influential factors and the causal relationships among them is important. To gain more insight into this issue, this paper introduces a hybrid method that combines the Decision Making Trial and Evaluation Laboratory (DEMATEL) with the analytic network process, called the DANP method, to find the factors that most strongly drive online business. The DEMATEL causal graph showed that the "online service" dimension has the highest degree of direct impact on the other dimensions; internet vendors are therefore advised to make strong efforts on service quality throughout the online shopping process. In addition, the study adopted DANP to measure the importance of key factors, among which "transaction security" proved to be the most important criterion. Hence, transaction security should be treated with top priority to boost online business. With the DANP approach, comprehensive information can be visualized so that decision makers can focus on the root causes and develop effective actions.
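
    For readers unfamiliar with DEMATEL, its core computation is compact: the direct-influence matrix is normalized and the total-relation matrix T = N(I - N)^-1 is derived, from which prominence (r + c) and relation (r - c) indices rank the dimensions and define the causal graph. The sketch below is a generic implementation of that step, not the study's full DANP model.

    ```python
    # Generic DEMATEL core: from a direct-influence matrix to the total-relation
    # matrix and the prominence / net cause-effect indices behind the causal graph.
    import numpy as np

    def dematel(direct: np.ndarray):
        n = direct.shape[0]
        norm = direct / max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
        total = norm @ np.linalg.inv(np.eye(n) - norm)   # T = N (I - N)^-1
        r = total.sum(axis=1)                            # influence given
        c = total.sum(axis=0)                            # influence received
        return total, r + c, r - c                       # prominence, cause/effect
    ```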

  12. Positive matrix factorization as source apportionment of soil lead and cadmium around a battery plant (Changxing County, China).

    PubMed

    Xue, Jian-long; Zhi, Yu-you; Yang, Li-ping; Shi, Jia-chun; Zeng, Ling-zao; Wu, Lao-sheng

    2014-06-01

    Chemical compositions of soil samples are multivariate in nature and provide datasets suitable for the application of multivariate factor analytical techniques. One of these techniques, positive matrix factorization (PMF), uses weighted least squares to fit the data matrix and determine the source contributions, with weights based on the error estimates of each data point. In this research, PMF was employed to apportion the sources of heavy metals in 104 soil samples taken within a 1-km radius of a lead battery plant contaminated site in Changxing County, Zhejiang Province, China. The site is heavily contaminated with high concentrations of lead (Pb) and cadmium (Cd). Combined with a geostatistical approach, PMF successfully partitioned the variance into sources related to soil background, agronomic practices, and the lead battery plants. It was estimated that the lead battery plants and the agronomic practices contributed 55.37% and 29.28%, respectively, of the total soil Pb. Soil Cd mainly came from the lead battery plants (65.92%), followed by the agronomic practices (21.65%) and soil parent materials (12.43%). This research indicates that PMF combined with geostatistics is a useful tool for source identification and apportionment.
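
    A full PMF implementation weights each residual by its measurement uncertainty. As a rough, unweighted stand-in, non-negative matrix factorization conveys the idea of decomposing the concentration matrix into non-negative source contributions and profiles; the snippet below is only that simplified analogue (assuming scikit-learn is available), not the PMF model used in the study.

    ```python
    # Simplified, unweighted analogue of PMF via non-negative matrix factorization:
    # X (samples x species) ~ G (contributions) @ F (profiles). True PMF also
    # weights each residual by its per-point uncertainty.
    import numpy as np
    from sklearn.decomposition import NMF

    def apportion_sources(X: np.ndarray, n_sources: int = 3):
        model = NMF(n_components=n_sources, init="nndsvda",
                    max_iter=1000, random_state=0)
        G = model.fit_transform(X)          # sample-by-source contributions
        F = model.components_               # source-by-species profiles
        shares = G.sum(axis=0) / G.sum()    # crude overall source shares
        return G, F, shares
    ```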

  13. Finding accurate frontiers: A knowledge-intensive approach to relational learning

    NASA Technical Reports Server (NTRS)

    Pazzani, Michael; Brunk, Clifford

    1994-01-01

    An approach to analytic learning is described that searches for accurate entailments of a Horn Clause domain theory. A hill-climbing search, guided by an information based evaluation function, is performed by applying a set of operators that derive frontiers from domain theories. The analytic learning system is one component of a multi-strategy relational learning system. We compare the accuracy of concepts learned with this analytic strategy to concepts learned with an analytic strategy that operationalizes the domain theory.

  14. A multitrait-multisource confirmatory factor analytic approach to the construct validity of ADHD and ODD rating scales with Malaysian children.

    PubMed

    Gomez, Rapson; Burns, G Leonard; Walsh, James A; Hafetz, Nina

    2005-04-01

    Confirmatory factor analysis (CFA) was used to model a multitrait by multisource matrix to determine the convergent and discriminant validity of measures of attention-deficit hyperactivity disorder (ADHD)-inattention (IN), ADHD-hyperactivity/impulsivity (HI), and oppositional defiant disorder (ODD) in 917 Malaysian elementary school children. The three trait factors were ADHD-IN, ADHD-HI, and ODD. The two source factors were parents and teachers. Similar to earlier studies with Australian and Brazilian children, the parent and teacher measures failed to show convergent and discriminant validity with Malaysian children. The study outlines the implications of such strong source effects in ADHD-IN, ADHD-HI, and ODD measures for the use of such parent and teacher scales to study the symptom dimensions.

  15. ESTIMATING UNCERTAINITIES IN FACTOR ANALYTIC MODELS

    EPA Science Inventory

    When interpreting results from factor analytic models as used in receptor modeling, it is important to quantify the uncertainties in those results. For example, if the presence of a species on one of the factors is necessary to interpret the factor as originating from a certain ...

  16. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
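
    The conventional cross-product procedure that the authors argue is narrower than the conceptual definition of moderation looks like the following in practice; the sketch uses hypothetical variable names and is included only to make the contrast with the proposed framework concrete.

    ```python
    # The conventional product-term test of moderation that the paper critiques:
    # fit y ~ x + m + x:m and test the interaction coefficient.
    # The data frame and variable names (y, x, m) are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    def product_term_moderation(df: pd.DataFrame):
        model = smf.ols("y ~ x * m", data=df).fit()   # expands to x + m + x:m
        return model.params["x:m"], model.pvalues["x:m"]
    ```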

  17. A new frequency approach for light flicker evaluation in electric power systems

    NASA Astrophysics Data System (ADS)

    Feola, Luigi; Langella, Roberto; Testa, Alfredo

    2015-12-01

    In this paper, a new analytical estimator for light flicker in the frequency domain is proposed; it is able to take into account frequency components that are neglected by the classical methods proposed in the literature. The analytical solutions apply to any generic stationary signal affected by interharmonic distortion. The proposed estimator is applied to numerous numerical case studies with the goal of showing (i) the correctness and the improvements of the analytical approach with respect to the other methods proposed in the literature and (ii) the accuracy of the results compared with those obtained by means of the classical International Electrotechnical Commission (IEC) flickermeter. The usefulness of the proposed analytical approach is that it can be included in signal processing tools for interharmonic penetration studies for the integration of renewable energy sources in future smart grids.

  18. Experimental Validation of the Transverse Shear Behavior of a Nomex Core for Sandwich Panels

    NASA Astrophysics Data System (ADS)

    Farooqi, M. I.; Nasir, M. A.; Ali, H. M.; Ali, Y.

    2017-05-01

    This work deals with determination of the transverse shear moduli of a Nomex® honeycomb core of sandwich panels. Their out-of-plane shear characteristics depend on the transverse shear moduli of the honeycomb core. These moduli were determined experimentally, numerically, and analytically. Numerical simulations were performed by using a unit cell model and three analytical approaches. Analytical calculations showed that two of the approaches provided reasonable predictions for the transverse shear modulus as compared with experimental results. However, the approach based upon the classical lamination theory showed large deviations from experimental data. Numerical simulations also showed a trend similar to that resulting from the analytical models.

  19. Formalising recall by genotype as an efficient approach to detailed phenotyping and causal inference.

    PubMed

    Corbin, Laura J; Tan, Vanessa Y; Hughes, David A; Wade, Kaitlin H; Paul, Dirk S; Tansey, Katherine E; Butcher, Frances; Dudbridge, Frank; Howson, Joanna M; Jallow, Momodou W; John, Catherine; Kingston, Nathalie; Lindgren, Cecilia M; O'Donavan, Michael; O'Rahilly, Stephen; Owen, Michael J; Palmer, Colin N A; Pearson, Ewan R; Scott, Robert A; van Heel, David A; Whittaker, John; Frayling, Tim; Tobin, Martin D; Wain, Louise V; Smith, George Davey; Evans, David M; Karpe, Fredrik; McCarthy, Mark I; Danesh, John; Franks, Paul W; Timpson, Nicholas J

    2018-02-19

    Detailed phenotyping is required to deepen our understanding of the biological mechanisms behind genetic associations. In addition, the impact of potentially modifiable risk factors on disease requires analytical frameworks that allow causal inference. Here, we discuss the characteristics of Recall-by-Genotype (RbG) as a study design aimed at addressing both these needs. We describe two broad scenarios for the application of RbG: studies using single variants and those using multiple variants. We consider the efficacy and practicality of the RbG approach, provide a catalogue of UK-based resources for such studies and present an online RbG study planner.

  20. The Identification and Significance of Intuitive and Analytic Problem Solving Approaches Among College Physics Students

    ERIC Educational Resources Information Center

    Thorsland, Martin N.; Novak, Joseph D.

    1974-01-01

    Described is an approach to assessment of intuitive and analytic modes of thinking in physics. These modes of thinking are associated with Ausubel's theory of learning. High ability in either intuitive or analytic thinking was associated with success in college physics, with high learning efficiency following a pattern expected on the basis of…

  1. Missed detection of significant positive and negative shifts in gentamicin assay: implications for routine laboratory quality practices.

    PubMed

    Koerbin, Gus; Liu, Jiakai; Eigenstetter, Alex; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-15

    A product recall was issued for the Roche/Hitachi Cobas Gentamicin II assays on 25th May 2016 in Australia, after a 15-20% positive analytical shift was discovered. Laboratories were advised to employ the Thermo Fisher Gentamicin assay as an alternative. Following the reintroduction of the revised assay on 12th September 2016, a second reagent recall was made on 20th March 2017 after the discovery of a 20% negative analytical shift due to an erroneous instrument adjustment factor. The practices of an index laboratory were examined to determine how the analytical shifts evaded detection by routine internal quality control (IQC) and external quality assurance (EQA) systems. The ability of patient result-based approaches, including the moving average (MovAvg) and moving sum of outliers (MovSO) approaches, to detect these shifts was also examined. Internal quality control data of the index laboratory were acceptable prior to the product recall. The practice of adjusting the IQC target following a change in assay method resulted in the missed negative shift when the revised Roche assay was reintroduced. While the EQA data of the Roche subgroup showed a clear negative bias relative to other laboratory methods, the results were considered a possible 'matrix effect'. The MovAvg method detected the positive shift before the product recall. The MovSO did not detect the negative shift in the index laboratory but did so in another laboratory 5 days before the second product recall. There are gaps in current laboratory quality practices that leave room for analytical errors to evade detection.
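
    The patient-result-based approaches mentioned above are straightforward to prototype. The sketch below implements a simple rolling-mean monitor that flags drift beyond a fixed fraction of a baseline mean; the window size, tolerance, and baseline are illustrative assumptions, and the code is not the published MovAvg or MovSO algorithm.

    ```python
    # Generic moving-average monitor over patient results: flag points where the
    # rolling mean drifts more than `tolerance` (fractional) from a baseline mean.
    # Window, tolerance, and baseline are illustrative assumptions.
    import pandas as pd

    def moving_average_flags(results: pd.Series, baseline_mean: float,
                             window: int = 50, tolerance: float = 0.10) -> pd.Series:
        rolling = results.rolling(window, min_periods=window).mean()
        return (rolling - baseline_mean).abs() > tolerance * baseline_mean
    ```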

  2. Equifinality in empirical studies of cultural transmission.

    PubMed

    Barrett, Brendan J

    2018-01-31

    Cultural systems exhibit equifinal behavior - a single final state may be arrived at via different mechanisms and/or from different initial states. Potential for equifinality exists in all empirical studies of cultural transmission, including controlled experiments, observational field research, and computational simulations. Acknowledging and anticipating the existence of equifinality is important in empirical studies of social learning and cultural evolution; it helps us understand the limitations of analytical approaches and can improve our ability to predict the dynamics of cultural transmission. Here, I illustrate and discuss examples of equifinality in studies of social learning, and how certain experimental designs might be prone to it. I then review examples of equifinality discussed in the social learning literature, namely the use of s-shaped diffusion curves to discern individual from social learning, and the operational definitions and analytical approaches used in studies of conformist transmission. While equifinality exists to some extent in all studies of social learning, I make suggestions for how to address instances of it, with an emphasis on using data simulation and methodological verification alongside modern statistical approaches that emphasize prediction and model comparison. In cases where evaluated learning mechanisms are equifinal due to non-methodological factors, I suggest that this is not always a problem if it helps us predict cultural change. In some cases, equifinal learning mechanisms might offer insight into how individual learning, social learning strategies, and other endogenous social factors might be important in structuring cultural dynamics and within- and between-group heterogeneity. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. An Analytical Solution for Yaw Maneuver Optimization on the International Space Station and Other Orbiting Space Vehicles

    NASA Technical Reports Server (NTRS)

    Dobrinskaya, Tatiana

    2015-01-01

    This paper suggests a new method for optimizing yaw maneuvers on the International Space Station (ISS). Yaw rotations are the most common large maneuvers on the ISS, often used for docking and undocking operations as well as for other activities. When maneuver optimization is used, large maneuvers, which were previously performed on thrusters, can be performed either using control moment gyroscopes (CMG) or with significantly reduced thruster firings. Maneuver optimization helps to save expensive propellant and reduce structural loads - an important factor for the ISS service life. In addition, optimized maneuvers reduce contamination of critical elements of the vehicle structure, such as the solar arrays. This paper presents an analytical solution for optimizing yaw attitude maneuvers. Equations describing the pitch and roll motion needed to counteract the major torques during a yaw maneuver are obtained, and a yaw rate profile is proposed. The paper also describes the physical basis of the suggested optimization approach. In the optimized case obtained, the torques are significantly reduced. This torque reduction was compared to the existing optimization method, which utilizes a computational solution. It was shown that the attitude profiles and the torque reduction match well between these two methods of optimization. Simulations using the ISS flight software showed similar propellant consumption for both methods. The analytical solution proposed in this paper has major benefits with respect to the computational approach. In contrast to the current computational solution, which can only be calculated on the ground, the analytical solution does not require extensive computational resources and can be implemented in the onboard software, thus making the maneuver execution automatic. The automatic maneuver significantly simplifies operations and, if necessary, allows a maneuver to be performed without communication with the ground. It also reduces the probability of command errors. The suggested analytical solution provides a new method of maneuver optimization that is less complicated, automatic, and more universal. The maneuver optimization approach presented in this paper can be used not only for the ISS but also for other orbiting space vehicles.

  4. Environmental Risk Assessment System for Phosphogypsum Tailing Dams

    PubMed Central

    Sun, Xin; Tang, Xiaolong; Yi, Honghong; Li, Kai; Zhou, Lianbi; Xu, Xianmang

    2013-01-01

    This paper may be of particular interest to readers as it provides a new environmental risk assessment system for phosphogypsum tailing dams. We studied the pollution source characteristics, environmental risk characteristics, and evaluation requirements of phosphogypsum tailing dams in order to identify applicable environmental risk assessment methods. Two analytical methods, the analytic hierarchy process (AHP) and fuzzy logic, were used to handle the complexity of the environmental and nonquantitative data. Using our assessment method, different risk factors can be ranked according to their contributions to the environmental risk, thereby allowing the calculation of their relative priorities during decision making. Thus, environmental decision-makers can use this approach to develop alternative management strategies for proposed, ongoing, and completed PG tailing dams. PMID:24382947

  5. Environmental risk assessment system for phosphogypsum tailing dams.

    PubMed

    Sun, Xin; Ning, Ping; Tang, Xiaolong; Yi, Honghong; Li, Kai; Zhou, Lianbi; Xu, Xianmang

    2013-01-01

    This paper may be of particular interest to readers as it provides a new environmental risk assessment system for phosphogypsum tailing dams. We studied the pollution source characteristics, environmental risk characteristics, and evaluation requirements of phosphogypsum tailing dams in order to identify applicable environmental risk assessment methods. Two analytical methods, the analytic hierarchy process (AHP) and fuzzy logic, were used to handle the complexity of the environmental and nonquantitative data. Using our assessment method, different risk factors can be ranked according to their contributions to the environmental risk, thereby allowing the calculation of their relative priorities during decision making. Thus, environmental decision-makers can use this approach to develop alternative management strategies for proposed, ongoing, and completed PG tailing dams.
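
    Of the two methods combined in this system, the AHP step reduces to extracting priority weights from a pairwise-comparison matrix, typically via its principal eigenvector together with a consistency index. The sketch below is a generic implementation of that step, not the paper's full AHP-fuzzy system.

    ```python
    # Generic AHP step: priority weights from a pairwise-comparison matrix via the
    # principal eigenvector, plus the consistency index CI = (lambda_max - n)/(n - 1).
    import numpy as np

    def ahp_weights(pairwise: np.ndarray):
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        weights = w / w.sum()
        n = pairwise.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)   # compare to a random index for CR
        return weights, ci
    ```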

  6. Analytics that Inform the University: Using Data You Already Have

    ERIC Educational Resources Information Center

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making at the University of Central Florida. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  7. Multivariate Approaches for Simultaneous Determination of Avanafil and Dapoxetine by UV Chemometrics and HPLC-QbD in Binary Mixtures and Pharmaceutical Product.

    PubMed

    2016-04-07

    Multivariate UV-spectrophotometric methods and a Quality by Design (QbD) HPLC method are described for the concurrent estimation of avanafil (AV) and dapoxetine (DP) in a binary mixture and in the dosage form. Chemometric methods were developed, including classical least-squares, principal component regression, partial least-squares, and multiway partial least-squares. Analytical figures of merit, such as sensitivity, selectivity, analytical sensitivity, LOD, and LOQ, were determined. The QbD workflow consists of three steps, starting with a screening approach to determine the critical process parameters and response variables, followed by an understanding of factors and levels, and lastly the application of a Box-Behnken design containing four critical factors that affect the method. From an Ishikawa diagram and a risk assessment tool, four main factors were selected for optimization. Design optimization, statistical calculation, and final-condition optimization of all the reactions were carried out. Twenty-five experiments were performed, and a quadratic model was used for all response variables. Desirability plots, surface plots, the design space, and three-dimensional plots were calculated. Under the optimized conditions, HPLC separation was achieved on a Phenomenex Gemini C18 column (250 × 4.6 mm, 5 μm) using acetonitrile-buffer (ammonium acetate buffer at pH 3.7 with acetic acid) as the mobile phase at a flow rate of 0.7 mL/min. Quantification was done at 239 nm, and the temperature was set at 20°C. The developed methods were validated and successfully applied to the simultaneous determination of AV and DP in the dosage form.

  8. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  9. Which Helper Behaviors and Intervention Styles Are Related to Better Short-Term Outcomes in Telephone Crisis Intervention? Results from a Silent Monitoring Study of Calls to the U.S. 1-800-SUICIDE Network

    ERIC Educational Resources Information Center

    Mishara, Brian L.; Chagnon, Francois; Daigle, Marc; Balan, Bogdan; Raymond, Sylvaine; Marcoux, Isabelle; Bardon, Cecile; Campbell, Julie K.; Berman, Alan

    2007-01-01

    A total of 2,611 calls to 14 helplines were monitored to observe helper behaviors and caller characteristics and changes during the calls. The relationship between intervention characteristics and call outcomes are reported for 1,431 crisis calls. Empathy and respect, as well as factor-analytically derived scales of supportive approach and good…

  10. Problematic eating behaviors among bariatric surgical candidates: a psychometric investigation and factor analytic approach.

    PubMed

    Gelinas, Bethany L; Delparte, Chelsea A; Wright, Kristi D; Hart, Regan

    2015-01-01

    Psychological factors (e.g., anxiety, depression) are routinely assessed in bariatric pre-surgical programs, as high levels of psychopathology are consistently related to poor program outcomes (e.g., failure to lose significant weight pre-surgery, weight regain post-surgery). Behavioral factors related to poor program outcomes and ways in which behavioral and psychological factors interact, have received little attention in bariatric research and practice. Potentially problematic behavioral factors are queried by Section H of the Weight and Lifestyle Inventory (WALI-H), in which respondents indicate the relevance of certain eating behaviors to obesity. A factor analytic investigation of the WALI-H serves to improve the way in which this assessment tool is interpreted and used among bariatric surgical candidates, and subsequent moderation analyses serve to demonstrate potential compounding influences of psychopathology on eating behavior factors. Bariatric surgical candidates (n =362) completed several measures of psychopathology and the WALI-H. Item responses from the WALI-H were subjected to principal axis factoring with oblique rotation. Results revealed a three-factor model including: (1) eating in response to negative affect, (2) overeating/desirability of food, and (3) eating in response to positive affect/social cues. All three behavioral factors of the WALI-H were significantly associated with measures of depression and anxiety. Moderation analyses revealed that depression did not moderate the relationship between anxiety and any eating behavior factor. Although single forms of psychopathology are related to eating behaviors, the combination of psychopathology does not appear to influence these problematic behaviors. Recommendations for pre-surgical assessment and treatment of bariatric surgical candidates are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Apolipoprotein E Polymorphism and Left Ventricular Failure in Beta-Thalassemia: A Multivariate Meta-Analysis.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Bagos, Pantelis G

    2017-09-01

    Apolipoprotein E (ApoE) is potentially a genetic risk factor for the development of left ventricular failure (LVF), the main cause of death in beta-thalassemia homozygotes. In the present study, we synthesize the results of independent studies examining the effect of ApoE on LVF development in thalassemic patients through a meta-analytic approach. However, all studies report more than one outcome, as patients are classified into three groups according to the severity of the symptoms and the genetic polymorphism. Thus, a multivariate meta-analytic method that addresses simultaneously multiple exposures and multiple comparison groups was developed. Four individual studies were included in the meta-analysis, involving 613 beta-thalassemic patients and 664 controls. The proposed method, which takes into account the correlation of log odds ratios (log(ORs)), revealed a statistically significant overall association (P-value = 0.009), mainly attributed to the contrast of the E4 versus E3 allele for patients with evidence (OR: 2.32, 95% CI: 1.19, 4.53) or patients with clinical and echocardiographic findings (OR: 3.34, 95% CI: 1.78, 6.26) of LVF. This study suggests that E4 is a genetic risk factor for LVF in beta-thalassemia major. The presented multivariate approach can be applied in several fields of research. © 2017 John Wiley & Sons Ltd/University College London.
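
    The multivariate model used here accounts for the correlation between log odds ratios that share a control group. As a point of reference, the familiar univariate fixed-effect (inverse-variance) pooling of log odds ratios is sketched below; it is only the simplified, uncorrelated special case, not the authors' multivariate estimator.

    ```python
    # Univariate fixed-effect (inverse-variance) pooling of log odds ratios --
    # the simplified, uncorrelated special case of the multivariate model above.
    import numpy as np

    def pool_log_or(log_or: np.ndarray, se: np.ndarray):
        w = 1.0 / se**2
        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = (np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se))
        return np.exp(pooled), ci   # pooled OR and 95% CI
    ```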

  12. Weighting of the data and analytical approaches may account for differences in overcoming the inadequate representativeness of the respondents to the third wave of a cohort study.

    PubMed

    Taylor, Anne W; Dal Grande, Eleonora; Grant, Janet; Appleton, Sarah; Gill, Tiffany K; Shi, Zumin; Adams, Robert J

    2013-04-01

    Attrition in cohort studies can cause the data to become unrepresentative of the original population. Although this is of little concern if intragroup comparisons are being made or cause and effect is being assessed, bias was assessed in this study so that intergroup or descriptive analyses could be undertaken. The North West Adelaide Health Study is a chronic disease and risk factor cohort study undertaken in Adelaide, South Australia. In the original wave (1999), clinical and self-report data were collected from 4,056 adults. In the third wave (2008-2010), 2,710 adults were still actively involved. Comparisons were made against two other data sources: the Australian Bureau of Statistics Estimated Residential Population and a regularly conducted chronic disease and risk factor surveillance system. Comparisons of demographics (age, sex, area, education, work status, and income) revealed statistically significant differences. In addition, smoking status, body mass index, and general health status differed statistically significantly from the comparison group. No statistically significant differences were found for alcohol risk. Although the third wave of this cohort study is not representative of the broader population on the variables assessed, weighting of the data and appropriate analytical approaches can account for the differences. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been written to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to relate it to product quality by design and process analytical technology (PAT).

  14. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been written to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to relate it to product quality by design and process analytical technology (PAT). PMID:25722723

  15. Model wall and recovery temperature effects on experimental heat transfer data analysis

    NASA Technical Reports Server (NTRS)

    Throckmorton, D. A.; Stone, D. R.

    1974-01-01

    Basic analytical procedures are used to illustrate, both qualitatively and quantitatively, the relative impact upon heat transfer data analysis of certain factors which may affect the accuracy of experimental heat transfer data. Inaccurate knowledge of adiabatic wall conditions results in a corresponding inaccuracy in the measured heat transfer coefficient. The magnitude of the resulting error is extreme for data obtained at wall temperatures approaching the adiabatic condition. High model wall temperatures and wall temperature gradients affect the level and distribution of heat transfer to an experimental model. The significance of each of these factors is examined and its impact upon heat transfer data analysis is assessed.
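
    The sensitivity described above follows directly from the defining relation h = q / (Taw - Tw): an error in the assumed adiabatic wall temperature is amplified as the wall temperature approaches it. The short worked example below quantifies that amplification with illustrative numbers, which are assumptions rather than data from the report.

    ```python
    # Worked example of the sensitivity described above, using h = q / (Taw - Tw).
    # All numbers are illustrative assumptions.
    def h_error_pct(q, Tw, Taw_true, Taw_error):
        h_true = q / (Taw_true - Tw)
        h_meas = q / (Taw_true + Taw_error - Tw)
        return 100.0 * (h_meas - h_true) / h_true

    # A 5 K error in Taw: modest when the wall is cold, severe near adiabatic.
    print(h_error_pct(q=1.0e4, Tw=300.0, Taw_true=800.0, Taw_error=5.0))  # about -1%
    print(h_error_pct(q=1.0e4, Tw=780.0, Taw_true=800.0, Taw_error=5.0))  # about -20%
    ```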

  16. A GRAPHICAL DIAGNOSTIC METHOD FOR ASSESSING THE ROTATION IN FACTOR ANALYTICAL MODELS OF ATMOSPHERIC POLLUTION. (R831078)

    EPA Science Inventory

    Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...

  17. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u-1 to 450 MeV u-1 or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  18. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u(-1) to 450 MeV u(-1) or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
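
    Neither abstract reproduces the 6-parameter model itself, but the flavor of a simple analytical range relation can be conveyed by the classic Bragg-Kleeman power law R ≈ αE^p, sketched below with commonly quoted coefficients for protons in water; this is a textbook approximation, not the authors' model.

    ```python
    # Classic Bragg-Kleeman power-law range approximation, R ~ alpha * E**p,
    # shown only to convey the flavor of a simple analytical range formula.
    # alpha = 0.0022 cm/MeV^p and p = 1.77 are commonly quoted values for
    # protons in water, not parameters of the paper's 6-parameter model.
    def proton_range_water_cm(energy_mev: float,
                              alpha: float = 0.0022, p: float = 1.77) -> float:
        return alpha * energy_mev**p

    print(proton_range_water_cm(100.0))  # roughly 7.6 cm for a 100 MeV proton
    ```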

  19. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binotti, M.; Zhu, G.; Gray, A.

    An analytical approach, as an extension of a newly developed method -- First-principle OPTical Intercept Calculation (FirstOPTIC) -- is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the FirstOPTIC code suite. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.

  1. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level]

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  2. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  3. Responding to obesity in Brazil: understanding the international and domestic politics of policy reform through a nested analytic approach to comparative analysis.

    PubMed

    Gómez, Eduardo J

    2015-02-01

    Why do governments pursue obesity legislation? And is the case of Brazil unique compared with other nations when considering the politics of policy reform? Using a nested analytic approach to comparative research, I found that theoretical frameworks accounting for why nations implement obesity legislation were not supported with cross-national statistical evidence. I then turned to the case of Brazil's response to obesity at three levels of government, national, urban, and rural, to propose alternative hypotheses for why nations pursue obesity policy. The case of Brazil suggests that the reasons that governments respond are different at these three levels. International forces, historical institutions, and social health movements were factors that prompted national government responses. At the urban and rural government levels, receiving federal financial assistance and human resource support appeared to be more important. The case of Brazil suggests that the international and domestic politics of responding to obesity are highly complex and that national and subnational political actors have different perceptions and interests when pursuing obesity legislation. Copyright © 2015 by Duke University Press.

  4. Cocontraction of pairs of antagonistic muscles: analytical solution for planar static nonlinear optimization approaches.

    PubMed

    Herzog, W; Binding, P

    1993-11-01

    It has been stated in the literature that static, nonlinear optimization approaches cannot predict coactivation of pairs of antagonistic muscles; however, numerical solutions of such approaches have predicted coactivation of pairs of one-joint and multijoint antagonists. Analytical support for either finding is not available in the literature for systems containing more than one degree of freedom. The purpose of this study was to investigate analytically the possibility of cocontraction of pairs of antagonistic muscles using a static nonlinear optimization approach for a multidegree-of-freedom, two-dimensional system. Analytical solutions were found using the Karush-Kuhn-Tucker conditions, which were necessary and sufficient for optimality in this problem. The results show that cocontraction of pairs of one-joint antagonistic muscles is not possible, whereas cocontraction of pairs of multijoint antagonists is. These findings suggest that cocontraction of pairs of antagonistic muscles may be an "efficient" way to accomplish many movement tasks.
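    As a concrete illustration of the kind of static, nonlinear optimization discussed above (not the authors' exact formulation), the sketch below minimizes a muscle-stress-cubed cost subject to a single-joint moment-equilibrium constraint; the moment arms, cross-sections, and required moment are hypothetical. Consistent with the analytical finding, the one-joint antagonist force is driven to zero.

      # Sketch of a static, nonlinear optimization of muscle forces (hypothetical numbers).
      # Cost = sum of (F_i / PCSA_i)**3 subject to joint-moment equilibrium and F_i >= 0.
      import numpy as np
      from scipy.optimize import minimize

      r = np.array([0.04, -0.035])      # m, moment arms (agonist +, one-joint antagonist -)
      pcsa = np.array([12.0, 10.0])     # cm^2, hypothetical physiological cross-sections
      M_req = 20.0                      # N*m, required net joint moment

      cost = lambda F: np.sum((F / pcsa) ** 3)                 # common stress-cubed criterion
      cons = {"type": "eq", "fun": lambda F: r @ F - M_req}    # moment equilibrium
      res = minimize(cost, x0=np.array([100.0, 100.0]), bounds=[(0, None)] * 2,
                     constraints=[cons], method="SLSQP")

      print(res.x)   # the one-joint antagonist force goes to ~0: no cocontraction predicted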

  5. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships; this paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  6. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3- to 7-year-olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  7. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Based on this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory testing is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  8. Is the Jeffreys' scale a reliable tool for Bayesian model comparison in cosmology?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesseris, Savvas; García-Bellido, Juan, E-mail: savvas.nesseris@uam.es, E-mail: juan.garciabellido@uam.es

    2013-08-01

    We are entering an era where progress in cosmology is driven by data, and alternative models will have to be compared and ruled out according to some consistent criterion. The most conservative and widely used approach is Bayesian model comparison. In this paper we explicitly calculate the Bayes factors for all models that are linear with respect to their parameters. We do this in order to test the so-called Jeffreys' scale and determine analytically how accurate its predictions are in a simple case where we fully understand and can calculate everything analytically. We also discuss the case of nested models, e.g., one with M_1 parameters and another with M_2 ⊃ M_1 parameters, and we derive analytic expressions for both the Bayes factor and the Figure of Merit, defined as the inverse area of the model parameter's confidence contours. With all this machinery and the use of an explicit example we demonstrate that the threshold nature of Jeffreys' scale is not a "one size fits all" reliable tool for model comparison and that it may lead to biased conclusions. Furthermore, we discuss the importance of choosing the right basis in the context of models that are linear with respect to their parameters and how that basis affects the parameter estimation and the derived constraints.
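    For models that are linear in their parameters with Gaussian noise and Gaussian priors, the Bayesian evidence is indeed analytic: marginalizing the parameters of d = A*theta + n gives d ~ N(0, C + A P A^T). The sketch below evaluates this for two nested models on synthetic data (all numbers are hypothetical, not the paper's worked example) and prints the log Bayes factor that one would compare against Jeffreys' thresholds.

      # Sketch: analytic Bayesian evidence for models linear in their parameters,
      # assuming Gaussian noise and zero-mean Gaussian priors. Data and covariances
      # below are synthetic/hypothetical.
      import numpy as np
      from scipy.stats import multivariate_normal

      def log_evidence(d, A, noise_cov, prior_cov):
          """log p(d | model) for d = A*theta + n, n ~ N(0, C), theta ~ N(0, P)."""
          cov = noise_cov + A @ prior_cov @ A.T   # marginal covariance of the data
          return multivariate_normal(mean=np.zeros(len(d)), cov=cov).logpdf(d)

      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 30)
      d = 1.0 + 0.5 * x + rng.normal(0.0, 0.1, x.size)   # synthetic data

      C = 0.1 ** 2 * np.eye(x.size)
      A1 = np.column_stack([np.ones_like(x)])            # M1: constant only
      A2 = np.column_stack([np.ones_like(x), x])         # M2: constant + slope (nested)

      lnB = log_evidence(d, A2, C, 10.0 * np.eye(2)) - log_evidence(d, A1, C, 10.0 * np.eye(1))
      print("ln Bayes factor (M2 vs M1):", lnB)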

  9. Electron transfer statistics and thermal fluctuations in molecular junctions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goswami, Himangshu Prabal; Harbola, Upendra

    2015-02-28

    We derive analytical expressions for the probability distribution function (PDF) of electron transport in a simple model of a quantum junction in the presence of thermal fluctuations. Our approach is based on the large deviation theory combined with the generating function method. For a large number of electrons transferred, the PDF is found to decay exponentially in the tails with different rates due to the applied bias. This asymmetry in the PDF is related to the fluctuation theorem. Statistics of fluctuations are analyzed in terms of the Fano factor. Thermal fluctuations play a quantitative role in determining the statistics of electron transfer; they tend to suppress the average current while enhancing the fluctuations in particle transfer. This gives rise to both bunching and antibunching phenomena as determined by the Fano factor. The thermal fluctuations and shot noise compete with each other and determine the net (effective) statistics of particle transfer. An exact analytical expression is obtained for the delay time distribution. The optimal values of the delay time between successive electron transfers can be lowered below the corresponding shot noise values by tuning the thermal effects.
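    The Fano factor referred to above is simply the variance-to-mean ratio of the transferred-charge counts. The sketch below estimates it from synthetic per-window counts (Poisson here, so F should come out near 1); sub-Poissonian statistics give F < 1 (antibunching) and super-Poissonian give F > 1 (bunching).

      # Sketch: estimating a Fano factor, F = Var(n) / <n>, from counts of transferred
      # electrons in fixed time windows. Counts are synthetic (Poisson), so F ~ 1.
      import numpy as np

      rng = np.random.default_rng(1)
      counts = rng.poisson(lam=50.0, size=10_000)   # hypothetical per-window transfer counts

      fano = counts.var(ddof=1) / counts.mean()
      print(f"Fano factor ~ {fano:.3f}")            # ~1 for a Poisson process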

  10. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on the science, technology, and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study reports Grade 8 students' analytical thinking and attitude toward science in teaching and learning about soil and its pollution through the science, technology, and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were assessed during their learning through participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students improved their analytical thinking capability. They could generate ideas and demonstrate characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated them. The paper discusses the implications of these findings for science teaching and learning through STS in Thailand.

  11. Pumping tests in nonuniform aquifers - The radially symmetric case

    USGS Publications Warehouse

    Butler, J.J.

    1988-01-01

    Traditionally, pumping-test-analysis methodology has been limited to applications involving aquifers whose properties are assumed uniform in space. This work attempts to assess the applicability of analytical methodology to a broader class of units with spatially varying properties. An examination of flow behavior in a simple configuration consisting of pumping from the center of a circular disk embedded in a matrix of differing properties is the basis for this investigation. A solution describing flow in this configuration is obtained through Laplace-transform techniques using analytical and numerical inversion schemes. Approaches for the calculation of flow properties in conditions that can be roughly represented by this simple configuration are proposed. Possible applications include a wide variety of geologic structures, as well as the case of a well skin resulting from drilling or development. Of more importance than the specifics of these techniques for analysis of water-level responses is the insight into flow behavior during a pumping test that is provided by the large-time form of the derived solution. The solution reveals that drawdown during a pumping test can be considered to consist of two components that are dependent and independent of near-well properties, respectively. Such an interpretation of pumping-test drawdown allows some general conclusions to be drawn concerning the relationship between parameters calculated using analytical approaches based on curve-matching and those calculated using approaches based on the slope of a semilog straight line plot. The infinite-series truncation that underlies the semilog analytical approaches is shown to remove further contributions of near-well material to total drawdown. In addition, the semilog distance-drawdown approach is shown to yield an expression that is equivalent to the Thiem equation. These results allow some general recommendations to be made concerning observation-well placement for pumping tests in nonuniform aquifers. The relative diffusivity of material on either side of a discontinuity is shown to be the major factor in controlling flow behavior during the period in which the front of the cone of depression is moving across the discontinuity. Though resulting from an analysis of flow in an idealized configuration, the insights of this work into flow behavior during a pumping test are applicable to a wide class of nonuniform units. © 1988.
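    The curve-matching and semilog straight-line analyses referred to above are built on the classical uniform-aquifer drawdown solution. The sketch below evaluates the Theis drawdown and its large-time semilog (Cooper-Jacob) approximation for hypothetical parameters; the disk-in-matrix solution derived in the paper is not reproduced.

      # Sketch: Theis drawdown for a uniform confined aquifer and its semilog
      # (Cooper-Jacob) large-time approximation. Parameter values are hypothetical.
      import numpy as np
      from scipy.special import exp1   # well function W(u) = E1(u)

      Q = 1.0e-3        # m^3/s, pumping rate
      T = 5.0e-4        # m^2/s, transmissivity
      S = 1.0e-4        # storativity (dimensionless)
      r = 30.0          # m, observation-well distance

      def theis_drawdown(t):
          u = r ** 2 * S / (4.0 * T * t)
          return Q / (4.0 * np.pi * T) * exp1(u)

      def cooper_jacob_drawdown(t):
          # valid once u is small (large time): s ~ (Q / 4 pi T) ln(2.25 T t / (r^2 S))
          return Q / (4.0 * np.pi * T) * np.log(2.25 * T * t / (r ** 2 * S))

      for t in (3.0e3, 3.0e4, 3.0e5):   # seconds
          print(t, theis_drawdown(t), cooper_jacob_drawdown(t))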

  12. Novel and sensitive reversed-phase high-pressure liquid chromatography method with electrochemical detection for the simultaneous and fast determination of eight biogenic amines and metabolites in human brain tissue.

    PubMed

    Van Dam, Debby; Vermeiren, Yannick; Aerts, Tony; De Deyn, Peter Paul

    2014-08-01

    A fast and simple RP-HPLC method with electrochemical detection (ECD) and ion pair chromatography was developed, optimized and validated in order to simultaneously determine eight different biogenic amines and metabolites in post-mortem human brain tissue in a single-run analytical approach. The compounds of interest are the indolamine serotonin (5-hydroxytryptamine, 5-HT), the catecholamines dopamine (DA) and (nor)epinephrine ((N)E), as well as their respective metabolites, i.e. 3,4-dihydroxyphenylacetic acid (DOPAC) and homovanillic acid (HVA), 5-hydroxy-3-indoleacetic acid (5-HIAA) and 3-methoxy-4-hydroxyphenylglycol (MHPG). A two-level fractional factorial experimental design was applied to study the effect of five experimental factors (i.e. the ion-pair counter-ion concentration, the level of organic modifier, the pH of the mobile phase, the temperature of the column, and the voltage setting of the detector) on the chromatographic behaviour. The cross effects between the five quantitative factors and the capacity and separation factors of the analytes were then analysed using a Standard Least Squares model. The optimized method was fully validated according to the requirements of SFSTP (Société Française des Sciences et Techniques Pharmaceutiques). Our human brain tissue sample preparation procedure is straightforward and relatively short, which allows samples to be loaded onto the HPLC system within approximately 4 h. Additionally, a high sample throughput was achieved after optimization due to a total runtime of at most 40 min per sample. The conditions and settings of the HPLC system were found to be accurate with high intra- and inter-assay repeatability, recovery and accuracy rates. The robust analytical method results in very low detection limits and good separation for all of the eight biogenic amines and metabolites in this complex mixture of biological analytes. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Computer Graphics Special Issue on 1992 Symposium on Interactive 3D graphics Held in Cambridge, Massachusetts on 29 March-1 April 1992

    DTIC Science & Technology

    1992-01-01

    [OCR-garbled excerpt; only fragments are recoverable] The fragments describe five representations of a desk (one highly detailed, with faces subdivided along gradients of radiosity), images created by the implementation together with timings and polygon counts, and radiosity approaches [9, 6] that sum the contributions of an approximating … . Cited reference: … and Winget, J., "Improving Radiosity Solutions Through the Use of Analytically Determined Form-Factors," Proc. SIGGRAPH '89. Plate 4, illuminated by both …

  14. Motivations for play in online games.

    PubMed

    Yee, Nick

    2006-12-01

    An empirical model of player motivations in online games provides the foundation to understand and assess how players differ from one another and how motivations of play relate to age, gender, usage patterns, and in-game behaviors. In the current study, a factor analytic approach was used to create an empirical model of player motivations. The analysis revealed 10 motivation subcomponents that grouped into three overarching components (achievement, social, and immersion). Relationships between motivations and demographic variables (age, gender, and usage patterns) are also presented.

  15. Critical current and linewidth reduction in spin-torque nano-oscillators by delayed self-injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalsa, Guru, E-mail: guru.khalsa@nist.gov; Stiles, M. D.; Grollier, J.

    2015-06-15

    Based on theoretical models, the dynamics of spin-torque nano-oscillators can be substantially modified by re-injecting the emitted signal into the input of the oscillator after some delay. Numerical simulations for vortex magnetic tunnel junctions show that, with reasonable parameters, this approach can decrease critical currents by as much as 25% and linewidths by a factor of 4. Analytical calculations, which agree well with simulations, demonstrate that these results can be generalized to any kind of spin-torque oscillator.

  16. Development of Analytical Systems for Evaluation of US Reconstitution and Recovery Programs.

    DTIC Science & Technology

    1980-09-01

    [OCR-garbled excerpt; only fragments are recoverable] Keywords: Program Evaluation; Economic Models; US Economy. Recoverable abstract fragments: "This study identifies economic models and … planning tasks are more complex and difficult than those faced by planners in the post-…s era. Also, because of those same factors and that the 1980s … comparative analysis outlined in the second study, while also concerned with the accomplishment of societal objectives, is somewhat different. The approach …"

  17. Development of the permeability/performance reference compound approach for in situ calibration of semipermeable membrane devices

    USGS Publications Warehouse

    Huckins, J.N.; Petty, J.D.; Lebo, J.A.; Almeida, F.V.; Booij, K.; Alvarez, D.A.; Cranor, W.L.; Clark, R.C.; Mogensen, B.B.

    2002-01-01

    Permeability/performance reference compounds (PRCs) are analytically noninterfering organic compounds with moderate to high fugacity from semipermeable membrane devices (SPMDs) that are added to the lipid prior to membrane enclosure. Assuming that isotropic exchange kinetics (IEK) apply and that SPMD-water partition coefficients are known, measurement of PRC dissipation rate constants during SPMD field exposures and laboratory calibration studies permits the calculation of an exposure adjustment factor (EAF). In theory, PRC-derived EAF ratios reflect changes in SPMD sampling rates (relative to laboratory data) due to differences in exposure temperature, membrane biofouling, and flow velocity-turbulence at the membrane surface. Thus, the PRC approach should allow for more accurate estimates of target solute/vapor concentrations in an exposure medium. Under some exposure conditions, the impact of environmental variables on SPMD sampling rates may approach an order of magnitude. The results of this study suggest that most of the effects of temperature, facial velocity-turbulence, and biofouling on the uptake rates of analytes with a wide range of hydrophobicities can be deduced from PRCs with a much narrower range of hydrophobicities. Finally, our findings indicate that the use of PRCs permits prediction of in situ SPMD sampling rates within 2-fold of directly measured values.
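    One common way the PRC correction described above is applied (assuming first-order, isotropic exchange kinetics) is to derive a dissipation rate constant from the fraction of PRC retained, take the field-to-laboratory ratio as the exposure adjustment factor, and scale the calibration sampling rate accordingly. The sketch below illustrates only that arithmetic; all numbers are hypothetical and the K_sw-based refinements discussed by the authors are omitted.

      # Sketch of a PRC-based exposure adjustment factor (EAF) calculation, assuming
      # first-order PRC dissipation. All values are hypothetical.
      import numpy as np

      def dissipation_rate(frac_retained, days):
          """First-order PRC loss: C_t/C_0 = exp(-ke * t)  ->  ke = -ln(C_t/C_0) / t."""
          return -np.log(frac_retained) / days

      ke_field = dissipation_rate(frac_retained=0.35, days=28.0)   # field deployment
      ke_lab = dissipation_rate(frac_retained=0.55, days=28.0)     # laboratory calibration

      eaf = ke_field / ke_lab                  # exposure adjustment factor
      rs_lab = 4.0                             # L/day, hypothetical calibration sampling rate
      rs_in_situ = rs_lab * eaf                # adjusted in situ sampling rate

      print(f"EAF = {eaf:.2f}, in situ Rs ~ {rs_in_situ:.2f} L/day")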

  18. Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.

  19. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
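    For context, the classical first-order net analyte signal (which the paper argues must be extended for multiway, second-order-advantage data) is the part of the pure analyte signal orthogonal to the interferent space; sensitivity and selectivity follow from its norm. The sketch below illustrates only that first-order calculation on synthetic Gaussian-band spectra (all data hypothetical).

      # Sketch of a classical (first-order) net analyte signal calculation, included only
      # to illustrate the figures of merit discussed above. Spectra are synthetic.
      import numpy as np

      wl = np.linspace(0, 1, 200)
      band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

      s_analyte = band(0.45, 0.05)                               # pure analyte spectrum (unit conc.)
      Z = np.column_stack([band(0.40, 0.07), band(0.60, 0.06)])  # interferent spectra

      P_orth = np.eye(wl.size) - Z @ np.linalg.pinv(Z)           # projector orthogonal to interferents
      nas = P_orth @ s_analyte                                   # net analyte signal vector

      sensitivity = np.linalg.norm(nas)                          # signal per unit concentration
      selectivity = sensitivity / np.linalg.norm(s_analyte)      # fraction of signal surviving projection
      print(f"sensitivity = {sensitivity:.3f}, selectivity = {selectivity:.3f}")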

  20. Chemical imaging of drug delivery systems with structured surfaces-a combined analytical approach of confocal raman microscopy and optical profilometry.

    PubMed

    Kann, Birthe; Windbergs, Maike

    2013-04-01

    Confocal Raman microscopy is an analytical technique with a steadily increasing impact in the field of pharmaceutics, as the instrumental setup allows for nondestructive visualization of component distribution within drug delivery systems. Here, attention is mainly focused on classic solid carrier systems such as tablets, pellets, or extrudates. Due to the opacity of these systems, Raman analysis is restricted either to exterior surfaces or cross sections. As Raman spectra are only recorded from one focal plane at a time, the sample is usually altered to create a smooth and even surface. However, this manipulation can lead to misinterpretation of the analytical results. Here, we present a trendsetting approach to overcome these analytical pitfalls with a combination of confocal Raman microscopy and optical profilometry. By acquiring a topography profile of the sample area of interest prior to Raman spectroscopy, the profile height information allowed the focal plane to be leveled to the sample surface for each spectrum acquisition. We first demonstrated the basic principle of this complementary approach in a case study using a tilted silica wafer. In a second step, we successfully adapted the two techniques to investigate an extrudate and a lyophilisate as two exemplary solid drug carrier systems. Component distribution analysis with the novel analytical approach was hampered neither by the curvature of the cylindrical extrudate nor by the highly structured surface of the lyophilisate. Therefore, the combined analytical approach holds great potential for implementation in diverse fields of the pharmaceutical sciences.

  1. Risk and protective factors, longitudinal research, and bullying prevention.

    PubMed

    Ttofi, Maria M; Farrington, David P

    2012-01-01

    This chapter presents the results from two systematic/meta-analytic reviews of longitudinal studies on the association of school bullying (perpetration and victimization) with adverse health and criminal outcomes later in life. Significant associations between the two predictors and the outcomes are found even after controlling for other major childhood risk factors that are measured before school bullying. The results indicate that effective antibullying programs should be encouraged. They could be viewed as a form of early crime prevention as well as an early form of public health promotion. The findings from a systematic/meta-analytic review on the effectiveness of antibullying programs are also presented. Overall, school-based antibullying programs are effective, leading to an average decrease in bullying of 20 to 23 percent and in victimization of 17 to 20 percent. The chapter emphasizes the lack of prospective longitudinal research in the area of school bullying, which does not allow examination of whether any given factor (individual, family, or social) is a correlate, a predictor, or a possible cause for bullying. This has important implications for future antibullying initiatives, as well as implications for the refinement of theories of school bullying. It is necessary to extend the framework of the traditional risk-focused approach by incorporating the notion of resiliency and investigating possible protective factors against school bullying and its negative consequences. Copyright © 2012 Wiley Periodicals, Inc., A Wiley Company.

  2. Shear joint capability versus bolt clearance

    NASA Technical Reports Server (NTRS)

    Lee, H. M.

    1992-01-01

    The results of a conservative analysis approach into the determination of shear joint strength capability for typical space-flight hardware as a function of the bolt-hole clearance specified in the design are presented. These joints are comprised of high-strength steel fasteners and abutments constructed of aluminum alloys familiar to the aerospace industry. A general analytical expression was first arrived at which relates bolt-hole clearance to the bolt shear load required to place all joint fasteners into a shear transferring position. Extension of this work allowed the analytical development of joint load capability as a function of the number of fasteners, shear strength of the bolt, bolt-hole clearance, and the desired factor of safety. Analysis results clearly indicate that a typical space-flight hardware joint can withstand significant loading when less than ideal bolt hole clearances are used in the design.

  3. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  4. Factors Affecting Higher Order Thinking Skills of Students: A Meta-Analytic Structural Equation Modeling Study

    ERIC Educational Resources Information Center

    Budsankom, Prayoonsri; Sawangboon, Tatsirin; Damrongpanit, Suntorapot; Chuensirimongkol, Jariya

    2015-01-01

    The purpose of the research is to develop and identify the validity of factors affecting higher order thinking skills (HOTS) of students. The thinking skills can be divided into three types: analytical, critical, and creative thinking. This analysis is done by applying the meta-analytic structural equation modeling (MASEM) based on a database of…

  5. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time, and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope, and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between -1.3 and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g(-1) in the sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
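    To make the optimization step above concrete, the sketch below constructs a three-factor Box-Behnken design in coded units (edge-midpoint runs for every factor pair plus replicated center points). The factor names match those reported in the abstract; the mapping from coded levels to real chromatographic settings is omitted and assumed.

      # Sketch: three-factor Box-Behnken design in coded units (-1, 0, +1).
      # Factor names follow the abstract; real levels are not specified here.
      from itertools import combinations, product
      import numpy as np

      factors = ["flow_rate", "gradient_slope", "initial_ACN"]
      k, n_center = len(factors), 3

      rows = []
      for i, j in combinations(range(k), 2):         # every pair of factors
          for a, b in product((-1, 1), repeat=2):    # 2^2 factorial on that pair
              run = [0] * k
              run[i], run[j] = a, b
              rows.append(run)
      rows.extend([[0] * k] * n_center)              # replicated center points

      design = np.array(rows)
      print(design.shape)    # (15, 3): 12 edge-midpoint runs + 3 center points
      print(design)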

  6. Rhodobase, a meta-analytical tool for reconstructing gene regulatory networks in a model photosynthetic bacterium.

    PubMed

    Moskvin, Oleg V; Bolotin, Dmitry; Wang, Andrew; Ivanov, Pavel S; Gomelsky, Mark

    2011-02-01

    We present Rhodobase, a web-based meta-analytical tool for analysis of transcriptional regulation in a model anoxygenic photosynthetic bacterium, Rhodobacter sphaeroides. The gene association meta-analysis is based on pooled data from 100 R. sphaeroides whole-genome DNA microarrays. Gene-centric regulatory networks were visualized using the StarNet approach (Jupiter, D.C., VanBuren, V., 2008. A visual data mining tool that facilitates reconstruction of transcription regulatory networks. PLoS ONE 3, e1717) with several modifications. We developed a means to identify and visualize operons and superoperons. We designed a framework for the cross-genome search for transcription factor binding sites that takes into account the high GC content and oligonucleotide usage profile characteristic of the R. sphaeroides genome. To facilitate reconstruction of directional relationships between co-regulated genes, we screened upstream sequences (-400 to +20 bp from start codons) of all genes for putative binding sites of bacterial transcription factors using a self-optimizing search method developed here. To test the performance of the meta-analysis tools and transcription factor site predictions, we reconstructed selected nodes of the R. sphaeroides transcription factor-centric regulatory matrix. The test revealed regulatory relationships that correlate well with the experimentally derived data. The database of transcriptional profile correlations, the network visualization engine, and the optimized search engine for transcription factor binding site analysis are available at http://rhodobase.org. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Beyond compartmentalization: a relational approach towards agency and vulnerability of young migrants.

    PubMed

    Huijsmans, Roy

    2012-01-01

    Based on fieldwork material from Lao People's Democratic Republic, this paper introduces an analytical framework that transcends compartmentalized approaches towards migration involving young people. The notions of fluid and institutionalized forms of migration illuminate key differences and commonalities in the relational fabric underpinning empirically diverse migration scenarios. Applying this framework to the role of networks in becoming a young migrant, this chapter sheds light on young migrants' differential scope for exercising agency. This redirects concerns about young migrants away from descriptive and static factors towards their relational position in the process of migration, which shapes their agency and vulnerability. Copyright © 2012 Wiley Periodicals, Inc., A Wiley Company.

  8. Modern Analytical Chemistry in the Contemporary World

    ERIC Educational Resources Information Center

    Šíma, Jan

    2016-01-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this approach is evidently improper and misleading. Therefore, the position of modern analytical chemistry among…

  9. Development of an Analytical Method for Dibutyl Phthalate Determination Using Surrogate Analyte Approach

    PubMed Central

    Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad

    2017-01-01

    Dibutyl phthalate (DBP) is a phthalic acid ester and is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenge in DBP analyses is the contamination of even analytical grade organic solvents with this compound and lack of availability of a true blank matrix to construct the calibration line. Standard addition method or using artificial matrices reduce the precision and accuracy of the results. In this study a surrogate analyte approach that is based on using deuterium labeled analyte (DBP-d4) to construct the calibration line was applied to determine DBP in hexane samples. PMID:28496469
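    The surrogate-analyte idea above can be illustrated with a short calculation: build the calibration line with the deuterated surrogate (DBP-d4), which is free of the background contamination, and apply it to the native DBP response. The sketch below assumes equal response factors for DBP and DBP-d4 and uses hypothetical peak areas and concentrations; a real method would establish the analyte/surrogate response ratio experimentally.

      # Sketch of surrogate-analyte quantification: calibrate on DBP-d4, quantify DBP.
      # All peak areas and concentrations are hypothetical; equal response factors assumed.
      import numpy as np

      conc_d4 = np.array([0.05, 0.1, 0.5, 1.0, 2.0])              # µg/mL spiked DBP-d4 levels
      area_d4 = np.array([1.1e4, 2.2e4, 1.08e5, 2.15e5, 4.3e5])   # detector responses

      slope, intercept = np.polyfit(conc_d4, area_d4, 1)          # linear calibration on surrogate

      area_dbp_sample = 7.9e4                                     # native DBP peak area in a sample
      conc_dbp = (area_dbp_sample - intercept) / slope            # assumes equal response factors
      print(f"DBP ~ {conc_dbp:.3f} µg/mL")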

  10. Noise-band factor analysis of cancer Fourier transform infrared evanescent-wave fiber optical (FTIR-FEW) spectra

    NASA Astrophysics Data System (ADS)

    Sukuta, Sydney; Bruch, Reinhard F.

    2002-05-01

    The goal of this study is to test the feasibility of using noise factor/eigenvector bands as general clinical analytical tools for diagnoses. We developed a new technique, Noise Band Factor Cluster Analysis (NBFCA), to diagnose benign tumors via their Fourier transform IR fiber optic evanescent wave spectral data for the first time. The middle IR region of human normal skin tissue and benign and melanoma tumors, were analyzed using this new diagnostic technique. Our results are not in full-agreement with pathological classifications hence there is a possibility that our approaches could complement or improve these traditional classification schemes. Moreover, the use of NBFCA make it much easier to delineate class boundaries hence this method provides results with much higher certainty.

  11. Analytical treatment of the deformation behavior of EUVL masks during electrostatic chucking

    NASA Astrophysics Data System (ADS)

    Brandstetter, Gerd; Govindjee, Sanjay

    2012-03-01

    A new analytical approach is presented to predict mask deformation during electrostatic chucking in next-generation extreme-ultraviolet lithography (EUVL). Given an arbitrary profile measurement of the mask and chuck non-flatness, this method has been developed as an alternative to time-consuming finite element simulations for overlay error correction algorithms. We consider the feature transfer of each harmonic component in the profile shapes via linear elasticity theory and demonstrate analytically how high spatial frequencies are filtered. The method is compared to presumably more accurate finite element simulations and has been tested successfully in an overlay error compensation experiment, where the residual error y-component could be reduced by a factor of 2. As a side outcome, the formulation provides a tool to estimate the critical pin-size and -pitch such that the distortion on the mask front-side remains within given tolerances. We find for a numerical example that pin-pitches of less than 5 mm will result in a mask pattern-distortion of less than 1 nm if the chucking pressure is below 30 kPa.

  12. Analytical treatment of the deformation behavior of extreme-ultraviolet-lithography masks during electrostatic chucking

    NASA Astrophysics Data System (ADS)

    Brandstetter, Gerd; Govindjee, Sanjay

    2012-10-01

    A new analytical approach is presented to predict mask deformation during electrostatic chucking in next-generation extreme-ultraviolet-lithography. Given an arbitrary profile measurement of the mask and chuck nonflatness, this method has been developed as an alternative to time-consuming finite element simulations for overlay error correction algorithms. We consider the feature transfer of each harmonic component in the profile shapes via linear elasticity theory and demonstrate analytically how high spatial frequencies are filtered. The method is compared to presumably more accurate finite element simulations and has been tested successfully in an overlay error compensation experiment, where the residual error y-component could be reduced by a factor of 2. As a side outcome, the formulation provides a tool to estimate the critical pin-size and -pitch such that the distortion on the mask front-side remains within given tolerances. We find for a numerical example that pin-pitches of less than 5 mm will result in a mask pattern distortion of less than 1 nm if the chucking pressure is below 30 kPa.
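    This record and its companion above describe the same harmonic-decomposition idea: each spatial-frequency component of the chuck non-flatness is transferred to the mask front side with an attenuation that grows with frequency. The sketch below only illustrates that filtering mechanism on a synthetic profile; the exponential transfer function exp(-k*t_mask) is an assumed stand-in, not the elasticity-theory transfer function derived by the authors, and the dimensions are approximate.

      # Illustrative harmonic filtering of a chuck non-flatness profile (synthetic data).
      # The exp(-k * t_mask) transfer function is an assumption used only to show how
      # high spatial frequencies are attenuated; it is not the paper's derived result.
      import numpy as np

      L = 152.0e-3                      # m, approximate reticle width
      t_mask = 6.35e-3                  # m, approximate mask thickness
      x = np.linspace(0.0, L, 1024, endpoint=False)

      # synthetic non-flatness: long-wavelength bow plus a short-wavelength ripple
      profile = 50e-9 * np.sin(2 * np.pi * x / L) + 10e-9 * np.sin(2 * np.pi * x / (L / 40))

      k = 2 * np.pi * np.fft.rfftfreq(x.size, d=x[1] - x[0])   # spatial angular frequencies
      transfer = np.exp(-k * t_mask)                           # assumed low-pass feature transfer

      front_side = np.fft.irfft(np.fft.rfft(profile) * transfer, n=x.size)
      print(f"input p-v: {np.ptp(profile)*1e9:.1f} nm, filtered p-v: {np.ptp(front_side)*1e9:.1f} nm")

    The short-wavelength ripple is almost completely suppressed while the long-wavelength bow passes nearly unchanged, which is the qualitative behavior the analytical treatment exploits.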

  13. Influence of Desorption Conditions on Analyte Sensitivity and Internal Energy in Discrete Tissue or Whole Body Imaging by IR-MALDESI

    NASA Astrophysics Data System (ADS)

    Rosen, Elias P.; Bokhart, Mark T.; Ghashghaei, H. Troy; Muddiman, David C.

    2015-06-01

    Analyte signal in a laser desorption/postionization scheme such as infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) is strongly coupled to the degree of overlap between the desorbed plume of neutral material from a sample and an orthogonal electrospray. In this work, we systematically examine the effect of desorption conditions on IR-MALDESI response to pharmaceutical drugs and endogenous lipids in biological tissue using a design of experiments approach. Optimized desorption conditions have then been used to conduct an untargeted lipidomic analysis of whole body sagittal sections of neonate mouse. IR-MALDESI response to a wide range of lipid classes has been demonstrated, with enhanced lipid coverage achieved by varying the laser wavelength used for mass spectrometry imaging (MSI). Targeted MS2 imaging (MS2I) of an analyte, cocaine, deposited beneath whole body sections allowed determination of tissue-specific ion response factors, and CID fragments of cocaine were monitored to comment on wavelength-dependent internal energy deposition based on the "survival yield" method.

  14. ADRA2B Deletion Variant and Enhanced Cognitive Processing of Emotional Information: A Meta-Analytical Review.

    PubMed

    Xie, Weizhen; Cappiello, Marcus; Meng, Ming; Rosenthal, Robert; Zhang, Weiwei

    2018-05-08

    This meta-analytical review examines whether a deletion variant in ADRA2B, a gene that encodes α 2B adrenoceptor in the regulation of norepinephrine availability, influences cognitive processing of emotional information in human observers. Using a multilevel modeling approach, this meta-analysis of 16 published studies with a total of 2,752 participants showed that ADRA2B deletion variant was significantly associated with enhanced perceptual and cognitive task performance for emotional stimuli. In contrast, this genetic effect did not manifest in overall task performance when non-emotional content was used. Furthermore, various study-level factors, such as targeted cognitive processes (memory vs. attention/perception) and task procedures (recall vs. recognition), could moderate the size of this genetic effect. Overall, with increased statistical power and standardized analytical procedures, this meta-analysis has established the contributions of ADRA2B to the interactions between emotion and cognition, adding to the growing literature on individual differences in attention, perception, and memory for emotional information in the general population. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. The advance care planning experiences of people with dementia, family caregivers and professionals: a synthesis of the qualitative literature.

    PubMed

    Ryan, Tony; Amen, Karwan M; McKeown, Jane

    2017-10-01

    There exists compelling evidence that advance care planning (ACP) remains a key factor in the delivery of appropriate end of life care and facilitates the timely transition to palliative care for people with dementia. Uptake of ACP within the dementia population is low, especially when compared with other conditions. Quantitative research has helped in identifying some of the key factors enabling or inhibiting the use of ACP within the dementia population. Qualitative research can, however, shed further light upon the experiences of all those involved. We carried out a search of the qualitative literature addressing the ACP experiences of people with dementia, family caregivers and professionals. An approach to qualitative synthesis involving coding of original text, developing descriptive themes and generating analytical themes was utilized. We identified five papers and subsequently five analytical themes: breadth and scope of future planning; challenges to ACP; postponing ACP; confidence in systems; and making ACP happen for people with dementia. The synthesized findings shed light on the ongoing challenges of the use and further development of ACP in the population of people with dementia. In particular, attention is drawn to the difficulties in the timing of ACP and the preference for informal approaches to planning within the families of people affected by dementia. The ACP capacity of the workforce is also addressed. The paper reveals considerable complexity in undertaking ACP in a context of dementia. It is suggested that the preference for informal approaches and the timing of initial conversations be considered and that the skills of those involved in initiating discussions should be given primacy.

  16. Modified Skvor/Starr approach in the mechanical-thermal noise analysis of condenser microphone.

    PubMed

    Tan, Chee Wee; Miao, Jianmin

    2009-11-01

    Simple analytical expressions of mechanical resistance, such as those formulated by Skvor/Starr, are widely used to describe the mechanical-thermal noise performance of a condenser microphone. However, the Skvor/Starr approach does not consider the location effect of acoustic holes in the backplate and overestimates the total equivalent mechanical resistance and mechanical-thermal noise. In this paper, a modified form of the Skvor/Starr approach is proposed to address this hole-location-dependent effect. A mode shape factor, which consists of the zero-order Bessel and modified Bessel functions, is included in Skvor's mechanical resistance formulation to account for the effect of the hole location in the backplate. With reference to two B&K microphones, the theoretical results of the A-weighted mechanical-thermal noise obtained by the modified Skvor/Starr approach are in good agreement with the reported experimental ones.
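    For orientation, the sketch below evaluates what is commonly quoted as the classical Skvor squeeze-film resistance of a perforated backplate, which is the baseline expression the paper modifies; the Bessel-function mode-shape factor introduced by the authors is not reproduced here, the dimensions are hypothetical, and the formula should be treated as an assumed textbook form rather than the paper's result.

      # Sketch: commonly quoted Skvor/Starr squeeze-film mechanical resistance of a
      # perforated backplate. Hypothetical dimensions; the paper's mode-shape-factor
      # modification is not included.
      import numpy as np

      def skvor_B(q):
          """Skvor's perforation function; q = fraction of backplate area occupied by holes."""
          return q / 2.0 - q ** 2 / 8.0 - np.log(q) / 4.0 - 3.0 / 8.0

      eta = 1.85e-5                 # Pa*s, dynamic viscosity of air
      A = np.pi * (2.0e-3) ** 2     # m^2, backplate area (2 mm radius, hypothetical)
      d = 20.0e-6                   # m, air-gap height
      n_holes = 100                 # number of acoustic holes
      q = 0.10                      # hole-area fraction

      R_mech = 12.0 * eta * A ** 2 / (n_holes * np.pi * d ** 3) * skvor_B(q)
      print(f"total equivalent mechanical resistance ~ {R_mech:.3e} N*s/m")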

  17. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Analytical approach for collective diffusion: One-dimensional lattice with the nearest neighbor and the next nearest neighbor lateral interactions

    NASA Astrophysics Data System (ADS)

    Tarasenko, Alexander

    2018-01-01

    Diffusion of particles adsorbed on a homogeneous one-dimensional lattice is investigated using a theoretical approach and MC simulations. The analytical dependencies calculated in the framework of this approach are tested against the numerical data. The close coincidence of the data obtained by these different methods demonstrates the correctness of the approach, which is based on the theory of the non-equilibrium statistical operator.

  19. An analytically based numerical method for computing view factors in real urban environments

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at the horizontal and vertical surfaces is presented for application to real urban morphology; the method is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against the analytical sky-view factor estimation for ideal street canyon geometries, showing consolidated confidence in accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
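    The ideal street-canyon case used for validation above has a well-known closed form: by the crossed-strings method, the sky-view factor of the whole canyon floor is sqrt(1 + (H/W)^2) - H/W, and closure splits the remainder between the two walls. The sketch below evaluates that analytic benchmark (notation ψ_ga, ψ_gw as in the abstract); the general surface-to-surface integration for real 3-D building data is not reproduced.

      # Sketch: analytic sky-view factor of the floor of an infinitely long street canyon,
      # the benchmark geometry used to validate the numerical method described above.
      import numpy as np

      def canyon_floor_svf(h_over_w):
          """Sky-view factor of the whole canyon floor for aspect ratio H/W (crossed strings)."""
          hw = np.asarray(h_over_w, dtype=float)
          return np.sqrt(1.0 + hw ** 2) - hw

      for hw in (0.5, 1.0, 2.0):
          psi_sky = canyon_floor_svf(hw)
          psi_wall = (1.0 - psi_sky) / 2.0      # by closure, split equally between the two walls
          print(f"H/W = {hw}:  psi_ga = {psi_sky:.3f}, psi_gw (each wall) = {psi_wall:.3f}")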

  20. Clinical chemistry reference values for 75-year-old apparently healthy persons.

    PubMed

    Huber, Klaus Roland; Mostafaie, Nazanin; Stangl, Gerhard; Worofka, Brigitte; Kittl, Eva; Hofmann, Jörg; Hejtman, Milos; Michael, Rainer; Weissgram, Silvia; Leitha, Thomas; Jungwirth, Susanne; Fischer, Peter; Tragl, Karl-Heinz; Bauer, Kurt

    2006-01-01

    Clinical chemistry reference values for elderly persons are sparse and mostly intermixed with those for younger subjects. To understand the links between metabolism and aging, it is paramount to differentiate between "normal" physiological processes in apparently healthy elderly subjects and metabolic changes due to long-lasting diseases. The Vienna Transdanube Aging (VITA) study, which began in 2000 and is continuing, will allow us to do just that, because more than 600 male and female volunteers aged exactly 75 years (to exclude any influence of the "aging" factor in this cohort) are participating in this study. Extensive clinical, neurological, biochemical, psychological, genetic, and radiological analyses, with a special emphasis on consumption of medication and abuse of drugs, were performed on each of the probands. The multitude of data and questionnaires obtained made possible an a posteriori approach to select individuals fulfilling criteria for a reference sample group of apparently healthy 75-year-old subjects for our study. Specific analytes were quantified on automated clinical analyzers, while manual methods were used for hormonal analytes. All clinical chemistry analytes were evaluated using in-depth statistical analyses with SPSS for Windows. In all, reference intervals for 45 analytes could be established. These include routine parameters for the assessment of organ functions, as well as hormone concentrations and hematological appraisals. Because all patients were reevaluated after exactly 30 months in the course of this study, we had the opportunity to reassess their health status at the age of 77.5 years. This was very useful for validation of the first round data set. Data of the second round evaluation corroborate the reference limits of the baseline analysis and further confirm our inclusion and exclusion criteria. In summary, we have established a reliable set of reference data for hormonal, hematological, and clinical chemistry analytes for elderly subjects. These values will be very useful for our future attempts to correlate disease states and aging processes with metabolic factors.

  1. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  2. Biomanufacturing process analytical technology (PAT) application for downstream processing: Using dissolved oxygen as an indicator of product quality for a protein refolding reaction.

    PubMed

    Pizarro, Shelly A; Dinges, Rachel; Adams, Rachel; Sanchez, Ailen; Winter, Charles

    2009-10-01

    Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC assay is used to assess product quality. The goal of understanding the oxygen needs of the reaction and its impact on quality is to make a product that is efficiently refolded to its native and active form with minimum oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the % dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using a constant mass transfer coefficient (kLa) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these runs are analyzed using RP-HPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to "actively manage process variability using a knowledge-based approach." (c) 2009 Wiley Periodicals, Inc.
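    The dynamic gassing-out scale-up mentioned above rests on a simple relation: after sparging out the oxygen and switching back to air, ln(C* - C) decays linearly in time with slope -kLa, so matching kLa across vessels keeps the DO-driven behavior comparable. The sketch below estimates kLa from a synthetic DO trace; all values are hypothetical and vessel-specific details are omitted.

      # Sketch: estimating kLa from a dynamic gassing-out DO trace (synthetic data).
      import numpy as np

      t = np.linspace(0.0, 300.0, 31)              # s, time after switching back to air
      kla_true = 0.01                              # 1/s, used only to generate synthetic data
      c_sat = 100.0                                # % DO at saturation
      do = c_sat * (1.0 - np.exp(-kla_true * t))   # re-oxygenation curve starting at 0 % DO

      slope, _ = np.polyfit(t, np.log(c_sat - do), 1)   # ln(C* - C) = ln(C*) - kLa * t
      print(f"estimated kLa ~ {-slope:.4f} 1/s")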

  3. Effect of Geometry on Electrokinetic Characterization of Solid Surfaces.

    PubMed

    Kumar, Abhijeet; Kleinen, Jochen; Venzmer, Joachim; Gambaryan-Roisman, Tatiana

    2017-08-01

    An analytical approach is presented to describe pressure-driven streaming current (I_str) and streaming potential (U_str) generation in geometrically complex samples, for which the classical Helmholtz-Smoluchowski (H-S) equation is known to be inaccurate. The new approach is valid under the same prerequisite conditions that are used for the development of the H-S equation, that is, the electrical double layers (EDLs) are sufficiently thin and surface conductivity and electroviscous effects are negligible. The analytical methodology is developed using linear velocity profiles to describe liquid flow inside the EDLs and using simplifying approximations to describe the macroscopic flow. At first, a general expression is obtained to describe the I_str generated in different cross sections of an arbitrarily shaped sample. Thereafter, assuming that the generated U_str varies only along the pressure-gradient direction, an expression describing the variation of the generated U_str along the sample length is obtained. These expressions describing I_str and U_str generation constitute the theoretical foundation of this work, which is first applied to a set of three nonuniform cross-sectional capillaries and thereafter to a square array of cylindrical fibers (model porous media) for both parallel and transverse fiber orientation cases. Although analytical solutions cannot be obtained for real porous substrates because of their random structure, the new theory provides useful insights into the effect of important factors such as fiber orientation, sample porosity, and sample dimensions. The solutions obtained for the model porous media are used to devise strategies for more accurate zeta potential determination of porous fiber plugs. The new approach could thus be useful in resolving the long-standing problem of sample geometry dependence of zeta potential measurements.
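
    For reference, the classical relation that this approach generalizes can be written, in one standard form and under the same thin-EDL assumptions (sign conventions vary between texts), as

        \frac{U_{\mathrm{str}}}{\Delta P} \;=\; \frac{\varepsilon_0 \varepsilon_r\,\zeta}{\eta\,\kappa_b},
        \qquad
        \frac{I_{\mathrm{str}}}{\Delta P} \;=\; -\,\frac{\varepsilon_0 \varepsilon_r\,\zeta}{\eta}\,\frac{A}{L},

    where zeta is the zeta potential, eta the viscosity, kappa_b the bulk conductivity, and A/L the cross-section-to-length ratio of a uniform channel; the work summarized above effectively replaces this uniform-geometry factor with cross-section-resolved expressions for arbitrarily shaped samples.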

  4. An implicit LU scheme for the Euler equations applied to arbitrary cascades. [new method of factoring]

    NASA Technical Reports Server (NTRS)

    Buratynski, E. K.; Caughey, D. A.

    1984-01-01

    An implicit scheme for solving the Euler equations is derived and demonstrated. The alternating-direction implicit (ADI) technique is modified, using two implicit-operator factors corresponding to lower-block-diagonal (L) or upper-block-diagonal (U) algebraic systems which can be easily inverted. The resulting LU scheme is implemented in finite-volume mode and applied to 2D subsonic and transonic cascade flows with differing degrees of geometric complexity. The results are presented graphically and found to be in good agreement with those of other numerical and analytical approaches. The LU method is also 2.0-3.4 times faster than ADI, suggesting its value in calculating 3D problems.
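
    Schematically, and only as a generic illustration of the factoring idea rather than the paper's exact operators, the implicit update is approximately factored as

        \mathcal{L}\,\mathcal{U}\,\delta Q \;=\; -\,\Delta t\,R\!\left(Q^{n}\right),
        \qquad
        Q^{n+1} \;=\; Q^{n} + \delta Q,

    where \mathcal{L} and \mathcal{U} are lower- and upper-block-diagonal factors approximating the full implicit operator; each factor is inverted by a single forward or backward block sweep, which removes the directional block-tridiagonal inversions required by ADI and accounts for the reported speed advantage.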

  5. Health care reforms.

    PubMed

    Marušič, Dorjan; Prevolnik Rupel, Valentina

    2016-09-01

    In large systems, such as health care, reforms are constantly underway. The article presents a definition of health care reform and the factors that influence its success. The factors discussed include knowledgeable personnel, the involvement of international experts and of all stakeholders in the country, electoral mandate and governmental support, leadership, and clear and transparent communication. The goals set need to be clear, and it is helpful to have good data and analytical support in the process. Despite all debates and experiences, it is impossible to clearly define the best approach to tackling health care reform, because governance structures, political will and the state of the economy differ from country to country.

  6. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  7. Mass Spectrometry Strategies for Clinical Metabolomics and Lipidomics in Psychiatry, Neurology, and Neuro-Oncology

    PubMed Central

    Wood, Paul L

    2014-01-01

    Metabolomics research has the potential to provide biomarkers for the detection of disease, for subtyping complex disease populations, for monitoring disease progression and therapy, and for defining new molecular targets for therapeutic intervention. These potentials are far from being realized because of a number of technical, conceptual, financial, and bioinformatics issues. Mass spectrometry provides analytical platforms that address the technical barriers to success in metabolomics research; however, the limited commercial availability of analytical and stable isotope standards has created a bottleneck for the absolute quantitation of a number of metabolites. Conceptual and financial factors contribute to the generation of statistically under-powered clinical studies, whereas bioinformatics issues result in the publication of a large number of unidentified metabolites. The path forward in this field involves targeted metabolomics analyses of large control and patient populations to define both the normal range of a defined metabolite and the potential heterogeneity (eg, bimodal) in complex patient populations. This approach requires that metabolomics research groups, in addition to developing a number of analytical platforms, build sufficient chemistry resources to supply the analytical standards required for absolute metabolite quantitation. Examples of metabolomics evaluations of sulfur amino-acid metabolism in psychiatry, neurology, and neuro-oncology and of lipidomics in neurology will be reviewed. PMID:23842599

  8. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability are now embraced across the majority of pharmaceutical companies and research labs. Researchers' attention is drawn toward implementing green analytical chemistry principles for more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure; by using safer solvents, the greenness profile of a methodology can be improved remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column and green solvents as the mobile phase. The use of a monolithic column allows efficient separation protocols at higher flow rates, which results in a short analysis time. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric, and the method was found to be excellent with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of produced waste. The proposed method reduces the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Measurements of Kepler Planet Masses and Eccentricities from Transit Timing Variations: Analytic and N-body Results

    NASA Astrophysics Data System (ADS)

    Hadden, Sam; Lithwick, Yoram

    2015-12-01

    Several Kepler planets reside in multi-planet systems where gravitational interactions result in transit timing variations (TTVs) that provide exquisitely sensitive probes of their masses and orbits. Measuring these planets' masses and orbits constrains their bulk compositions and can provide clues about their formation. However, inverting TTV measurements in order to infer planet properties can be challenging: it involves fitting a nonlinear model with a large number of parameters to noisy data, often with significant degeneracies between parameters. We present results from two complementary approaches to TTV inversion: Markov chain Monte Carlo simulations that use N-body integrations to compute transit times, and a simplified analytic model for computing the TTVs of planets near mean motion resonances. The analytic model allows for straightforward interpretations of N-body results and provides an independent estimate of parameter uncertainties that can be compared to MCMC results, which may be sensitive to factors such as priors. We have conducted extensive MCMC simulations along with analytic fits to model the TTVs of dozens of Kepler multi-planet systems. We find that the bulk of these sub-Jovian planets have low densities that necessitate significant gaseous envelopes. We also find that the planets' eccentricities are generally small but often definitively non-zero.
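
    One standard ingredient of such analytic near-resonance models, stated here only as a hedged illustration (the amplitude formulas, which carry the mass and eccentricity information, are not reproduced), is the "super-period" on which the TTV signal of a pair near a j:(j-1) mean motion resonance oscillates:

        # Sketch: TTV super-period for a planet pair near a j:(j-1) mean motion resonance.
        def ttv_super_period(p_inner, p_outer, j):
            """Period (same units as the inputs) of the near-resonant TTV oscillation."""
            return 1.0 / abs(j / p_outer - (j - 1) / p_inner)

        # Hypothetical pair just wide of the 3:2 resonance.
        print(ttv_super_period(p_inner=10.0, p_outer=15.2, j=3))   # ~380 days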

  10. Mass spectrometry strategies for clinical metabolomics and lipidomics in psychiatry, neurology, and neuro-oncology.

    PubMed

    Wood, Paul L

    2014-01-01

    Metabolomics research has the potential to provide biomarkers for the detection of disease, for subtyping complex disease populations, for monitoring disease progression and therapy, and for defining new molecular targets for therapeutic intervention. These potentials are far from being realized because of a number of technical, conceptual, financial, and bioinformatics issues. Mass spectrometry provides analytical platforms that address the technical barriers to success in metabolomics research; however, the limited commercial availability of analytical and stable isotope standards has created a bottleneck for the absolute quantitation of a number of metabolites. Conceptual and financial factors contribute to the generation of statistically under-powered clinical studies, whereas bioinformatics issues result in the publication of a large number of unidentified metabolites. The path forward in this field involves targeted metabolomics analyses of large control and patient populations to define both the normal range of a defined metabolite and the potential heterogeneity (eg, bimodal) in complex patient populations. This approach requires that metabolomics research groups, in addition to developing a number of analytical platforms, build sufficient chemistry resources to supply the analytical standards required for absolute metabolite quantitation. Examples of metabolomics evaluations of sulfur amino-acid metabolism in psychiatry, neurology, and neuro-oncology and of lipidomics in neurology will be reviewed.

  11. A practical model of thin disk regenerative amplifier based on analytical expression of ASE lifetime

    NASA Astrophysics Data System (ADS)

    Zhou, Huang; Chyla, Michal; Nagisetty, Siva Sankar; Chen, Liyuan; Endo, Akira; Smrz, Martin; Mocek, Tomas

    2017-12-01

    In this paper, a practical model of a thin disk regenerative amplifier has been developed based on an analytical approach in which Drew A. Copeland [1] evaluated the loss rate of the upper-state laser level due to amplified spontaneous emission (ASE) and derived an analytical expression for the effective lifetime of the upper-state laser level, taking the Lorentzian stimulated-emission line shape and total internal reflection into account. By adopting this analytical expression for the effective lifetime in the rate equations, we have developed a less numerically intensive model for predicting and analyzing the performance of a thin disk regenerative amplifier. Thanks to the model, an optimized combination of various parameters can be obtained to avoid saturation, period-doubling bifurcation, or first-pulse suppression prior to experiments. The effective lifetime due to ASE is also analyzed against various parameters. The simulated results fit the experimental data well. By fitting more experimental results with the numerical model, we can improve the model parameters, such as the reflective factor, which is used to determine the weight of boundary reflection within the influence of ASE. This practical model will be used to explore the scaling limits imposed by ASE on the thin disk regenerative amplifier being developed at the HiLASE Centre.
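
    Schematically, the role played by the ASE-derived lifetime in such a model can be indicated as follows (a generic point-model sketch, not the specific equation set of the work above):

        \frac{dN}{dt} \;=\; R_{\mathrm{pump}} \;-\; \frac{N}{\tau_{\mathrm{eff}}(N)} \;-\; c\,\sigma\,\phi\,N ,

    where N is the upper-state population density, phi the intracavity photon density, sigma the stimulated-emission cross section, and tau_eff(N) the analytically derived effective lifetime that lumps spontaneous emission together with ASE losses; replacing an explicit numerical ASE calculation by tau_eff(N) is what makes the model comparatively light-weight.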

  12. The experience of initiating injection drug use and its social context: a qualitative systematic review and thematic synthesis.

    PubMed

    Guise, Andy; Horyniak, Danielle; Melo, Jason; McNeil, Ryan; Werb, Dan

    2017-12-01

    Understanding the experience of initiating injection drug use and its social contexts is crucial to inform efforts to prevent transitions into this mode of drug consumption and support harm reduction. We reviewed and synthesized existing qualitative scientific literature systematically to identify the socio-structural contexts for, and experiences of, the initiation of injection drug use. We searched six databases (Medline, Embase, PsychINFO, CINAHL, IBSS and SSCI) systematically, along with a manual search, including key journals and subject experts. Peer-reviewed studies were included if they qualitatively explored experiences of or socio-structural contexts for injection drug use initiation. A thematic synthesis approach was used to identify descriptive and analytical themes throughout studies. From 1731 initial results, 41 studies reporting data from 1996 participants were included. We developed eight descriptive themes and two analytical (higher-order) themes. The first analytical theme focused on injecting initiation resulting from a social process enabled and constrained by socio-structural factors: social networks and individual interactions, socialization into drug-using identities and choices enabled and constrained by social context all combine to produce processes of injection initiation. The second analytical theme addressed pathways that explore varying meanings attached to injection initiation and how they link to social context: seeking pleasure, responses to increasing tolerance to drugs, securing belonging and identity and coping with pain and trauma. Qualitative research shows that injection drug use initiation has varying and distinct meanings for individuals involved and is a dynamic process shaped by social and structural factors. Interventions should therefore respond to the socio-structural influences on injecting drug use initiation by seeking to modify the contexts for initiation, rather than solely prioritizing the reduction of individual harms through behavior change. © 2017 Society for the Study of Addiction.

  13. Development and optimization of SPE-HPLC-UV/ELSD for simultaneous determination of nine bioactive components in Shenqi Fuzheng Injection based on Quality by Design principles.

    PubMed

    Wang, Lu; Qu, Haibin

    2016-03-01

    A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolution and signal-to-noise ratios of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of the SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space for SPE-HPLC-UV/ELSD was then constructed from Monte Carlo-calculated probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were completed at a selected working point. These results revealed that QbD principles are suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
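
    A hedged sketch of the Monte Carlo design-space idea described above; the model form, coefficient values, and the acceptance criterion below are illustrative, not those of the published method.

        # Sketch: Monte Carlo "probability of success" map over a grid of two coded factors,
        # propagating the uncertainty of fitted response-model coefficients (illustrative values).
        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical quadratic model for the critical resolution Rs vs. coded factors x1, x2.
        beta_mean = np.array([1.8, 0.30, -0.20, -0.15, -0.10, 0.05])   # b0, b1, b2, b11, b22, b12
        beta_cov = np.diag([0.02, 0.01, 0.01, 0.005, 0.005, 0.005]) ** 2

        def model(beta, x1, x2):
            return (beta[0] + beta[1] * x1 + beta[2] * x2
                    + beta[3] * x1 ** 2 + beta[4] * x2 ** 2 + beta[5] * x1 * x2)

        grid = np.linspace(-1.0, 1.0, 21)
        draws = rng.multivariate_normal(beta_mean, beta_cov, size=2000)

        prob_ok = np.empty((grid.size, grid.size))
        for i, x1 in enumerate(grid):
            for j, x2 in enumerate(grid):
                rs = model(draws.T, x1, x2)            # one predicted Rs per coefficient draw
                prob_ok[i, j] = np.mean(rs >= 1.5)     # acceptance criterion: Rs >= 1.5

        # The design space is the grid region where prob_ok exceeds a chosen threshold, e.g. 0.95.
        print(prob_ok.min(), prob_ok.max())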

  14. Joseph v. Brady: Synthesis Reunites What Analysis Has Divided

    ERIC Educational Resources Information Center

    Thompson, Travis

    2012-01-01

    Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…

  15. Analytical Protocol (GC/ECNIMS) for OSWER's Response to OIG Report (2005-P-00022) on Toxaphene Analysis

    EPA Science Inventory

    The research approached the large number and complexity of the analytes as four separate groups: technical toxaphene, toxaphene congeners (eight in number), chlordane, and organochlorine pesticides. This approach was advantageous because it eliminated potential interferences amon...

  16. Demonstration/Validation of the Snap Sampler Passive Ground Water Sampling Device for Sampling Inorganic Analytes at the Former Pease Air Force Base

    DTIC Science & Technology

    2009-07-01

    ...sampler is also an economic alternative for sampling for inorganic analytes. ...headspace and then covered with two layers of tightly fitting aluminum foil. To dissolve the analytes, the solutions were stirred for approximately

  17. Algebraic approach to small-world network models

    NASA Astrophysics Data System (ADS)

    Rudolph-Lilith, Michelle; Muller, Lyle E.

    2014-01-01

    We introduce an analytic model for directed Watts-Strogatz small-world graphs and deduce an algebraic expression of its defining adjacency matrix. The latter is then used to calculate the small-world digraph's asymmetry index and clustering coefficient in an analytically exact fashion, valid nonasymptotically for all graph sizes. The proposed approach is general and can be applied to all algebraically well-defined graph-theoretical measures, thus allowing for an analytical investigation of finite-size small-world graphs.
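
    A minimal numerical counterpart to such analytic results, using the familiar undirected Watts-Strogatz construction in networkx as a reference point (the paper itself treats directed graphs and exact, non-asymptotic expressions):

        # Sketch: empirical clustering and path length of Watts-Strogatz small-world graphs,
        # the kind of quantities the algebraic approach expresses analytically.
        import networkx as nx

        n, k = 1000, 10                      # nodes, nearest neighbours in the ring lattice
        for p in (0.0, 0.01, 0.1, 1.0):      # rewiring probability
            G = nx.connected_watts_strogatz_graph(n, k, p, seed=1)
            print(f"p={p:<5} clustering={nx.average_clustering(G):.3f} "
                  f"avg_path={nx.average_shortest_path_length(G):.2f}")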

  18. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  19. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    PubMed

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus, because the water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically and are also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results can indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for the determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In trace and ultra-trace level analysis, the detailed characteristics of the analytical procedure, as well as the properties of the analytical result, are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes ranged from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The expanded uncertainty (U) results obtained for the CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision component was found to contribute more to the combined standard uncertainty than the trueness component. Copyright © 2017 Elsevier B.V. All rights reserved.
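
    A hedged sketch of the single-laboratory (top-down) uncertainty budget implied above, combining a within-laboratory precision component with a trueness (bias) component; the numbers are illustrative, not the study's values.

        # Sketch: combined and expanded uncertainty from precision and trueness components,
        # as in single-laboratory validation approaches (illustrative relative uncertainties).
        import math

        u_precision = 0.035   # relative standard uncertainty from intermediate precision
        u_trueness = 0.020    # relative standard uncertainty of the bias (recovery) estimate

        u_combined = math.sqrt(u_precision ** 2 + u_trueness ** 2)
        U_expanded = 2.0 * u_combined          # coverage factor k = 2 (approx. 95 % coverage)
        print(f"u_c = {u_combined:.3f}, U = {U_expanded:.3f} (relative)")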

  20. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
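
    A hedged sketch of the cross-validation step mentioned above; synthetic data and a plain logistic model stand in for the actual dosimetric, clinical, and biological predictors.

        # Sketch: k-fold cross-validated discrimination (AUC) for a binary outcome model
        # built from mixed features (synthetic data, not the cohort described above).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_patients, n_features = 200, 10
        X = rng.normal(size=(n_patients, n_features))      # e.g. dose-volume metrics, biomarkers
        logit = X @ rng.normal(size=n_features) * 0.8 - 0.5
        y = rng.uniform(size=n_patients) < 1.0 / (1.0 + np.exp(-logit))   # toxicity outcome

        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5, scoring="roc_auc")
        print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")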

  1. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  2. A Social Identity Approach to Understanding and Promoting Physical Activity.

    PubMed

    Stevens, Mark; Rees, Tim; Coffee, Pete; Steffens, Niklas K; Haslam, S Alexander; Polman, Remco

    2017-10-01

    Against the backdrop of a global physical inactivity crisis, attempts to both understand and positively influence physical activity behaviours are characterized by a focus on individual-level factors (e.g. cognitions, attitudes, motivation). We outline a new perspective, drawn from an emerging body of work exploring the applicability of social identity and self-categorization theories to domains of sport and health, from which to understand and address this pervasive problem. This social identity approach suggests that the groups to which people belong can be, and often are, incorporated into their sense of self and, through this, are powerful determinants of physical activity-related behaviour. We start by reviewing the current state of physical activity research and highlighting the potential for the social identity approach to help understand how social factors influence these behaviours. Next, we outline the theoretical underpinnings of the social identity approach and provide three key examples that speak to the analytical and practical value of the social identity approach in physical activity settings. Specifically, we argue that social identity (1) can be harnessed to promote engagement in physical activity, (2) underpins exercise group behaviour, and (3) underpins effective leadership in exercise settings. We conclude by identifying prospects for a range of theory-informed research developments.

  3. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based, Graduate-Level Analytical Chemistry Course

    NASA Astrophysics Data System (ADS)

    Toh, Chee-Seng

    2007-04-01

    A research-focused approach is described for a nonlaboratory-based graduate-level module on analytical chemistry. The approach utilizes commonly practiced activities carried out in active research laboratories, in particular, activities involving logging of ideas and thoughts, journal clubs, proposal writing, classroom participation and discussions, and laboratory tours. This approach was adapted without compromising the course content and results suggest possible adaptation and implementation in other graduate-level courses.

  4. Determination of aerodynamic sensitivity coefficients in the transonic and supersonic regimes

    NASA Technical Reports Server (NTRS)

    Elbanna, Hesham M.; Carlson, Leland A.

    1989-01-01

    The quasi-analytical approach is developed to compute airfoil aerodynamic sensitivity coefficients in the transonic and supersonic flight regimes. Initial investigation verifies the feasibility of this approach as applied to the transonic small perturbation residual expression. Results are compared to those obtained by the direct (finite difference) approach and both methods are evaluated to determine their computational accuracies and efficiencies. The quasi-analytical approach is shown to be superior and worth further investigation.

  5. Introducing a new and rapid microextraction approach based on magnetic ionic liquids: Stir bar dispersive liquid microextraction.

    PubMed

    Chisvert, Alberto; Benedé, Juan L; Anderson, Jared L; Pierson, Stephen A; Salvador, Amparo

    2017-08-29

    With the aim of contributing to the development and improvement of microextraction techniques, a novel approach combining the principles and advantages of stir bar sorptive extraction (SBSE) and dispersive liquid-liquid microextraction (DLLME) is presented. This new approach, termed stir bar dispersive liquid microextraction (SBDLME), involves the addition of a magnetic ionic liquid (MIL) and a neodymium-core magnetic stir bar into the sample, allowing the MIL to coat the stir bar through physical (magnetic) forces. As long as the stirring rate is maintained at low speed, the MIL resists rotational (centrifugal) forces and remains on the stir bar surface in a manner closely resembling SBSE. When the stirring rate is increased, the rotational forces surpass the magnetic attraction and the MIL disperses into the sample solution in a manner similar to DLLME. After extraction, the stirring is stopped and the MIL returns to the stir bar without the requirement of an additional external magnetic field. The MIL-coated stir bar containing the preconcentrated analytes is thermally desorbed directly into a gas chromatographic system coupled to a mass spectrometric detector (TD-GC-MS). This novel approach opens new insights into the microextraction field by using the benefits provided by SBSE and DLLME simultaneously, such as automated thermal desorption and high surface contact area, respectively; most importantly, it enables the use of tailor-made solvents (i.e., MILs). To prove its utility, SBDLME has been used for the extraction of lipophilic organic UV filters from environmental water samples as a model analytical application, with excellent analytical features in terms of linearity, enrichment factors (67-791), limits of detection (low ng/L), intra- and inter-day repeatability (RSD < 15%) and relative recoveries (87-113%, 91-117% and 89-115% for river, sea and swimming pool water samples, respectively). Copyright © 2017 Elsevier B.V. All rights reserved.

  6. A novel approach to signal normalisation in atmospheric pressure ionisation mass spectrometry.

    PubMed

    Vogeser, Michael; Kirchhoff, Fabian; Geyer, Roland

    2012-07-01

    The aim of our study was to test an alternative principle of signal normalisation in LC-MS/MS. During analyses, post column infusion of the target analyte is done via a T-piece, generating an "area under the analyte peak" (AUP). The ratio of peak area to AUP is assessed as assay response. Acceptable analytical performance of this principle was found for an exemplary analyte. Post-column infusion may allow normalisation of ion suppression not requiring any additional standard compound. This approach can be useful in situations where no appropriate compound is available for classical internal standardisation. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education.

    PubMed

    Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-10-06

    Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.

  8. A Conceptual Analytics Model for an Outcome-Driven Quality Management Framework as Part of Professional Healthcare Education

    PubMed Central

    Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil

    2015-01-01

    Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve quality of education. Analytics, the use of data to generate insights and support decisions, have been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research of Academic and Learning analytics has mainly focused on technical issues. The focus of this study relates to its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the synthesizing process, and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840

  9. How Does Social Inequality Continue to Influence Young People's Trajectories through the Apprenticeship Pathway System in South Africa? An Analytical Approach

    ERIC Educational Resources Information Center

    Kruss, Glenda; Wildschut, Angelique

    2016-01-01

    The paper contributes by proposing an analytical approach that allows for the identification of patterns of participation in education and training and the labour market, through empirical measurement of the number of transitions and distinct trajectories traversed by groups of individuals. To illustrate the value of the approach, we focus on an…

  10. Integrated Data & Analysis in Support of Informed and Transparent Decision Making

    NASA Astrophysics Data System (ADS)

    Guivetchi, K.

    2012-12-01

    The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes ways to develop a common approach for data standards and for understanding, evaluating, and improving regional and statewide water management systems, and for common ways to evaluate and select from alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales. It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.

  11. Modeling cometary photopolarimetric characteristics with Sh-matrix method

    NASA Astrophysics Data System (ADS)

    Kolokolova, L.; Petrov, D.

    2017-12-01

    Cometary dust is dominated by particles of complex shape and structure, which are often considered to be fractal aggregates. Rigorous modeling of light scattering by such particles, even using parallelized codes and NASA supercomputer resources, is very demanding of computer time and memory. We present a new approach to modeling cometary dust that is based on the Sh-matrix technique (e.g., Petrov et al., JQSRT, 112, 2012). This method is based on the T-matrix technique (e.g., Mishchenko et al., JQSRT, 55, 1996) and was developed after it had been found that the shape-dependent factors could be separated from the size- and refractive-index-dependent factors and presented as a shape matrix, or Sh-matrix. Size and refractive index dependences are incorporated through analytical operations on the Sh-matrix to produce the elements of the T-matrix. The Sh-matrix method keeps all the advantages of the T-matrix method, including analytical averaging over particle orientation. Moreover, the surface integrals describing the Sh-matrix elements themselves can be solved analytically for particles of any shape. This makes the Sh-matrix approach an effective technique for simulating light scattering by particles of complex shape and surface structure. In this paper, we present cometary dust as an ensemble of Gaussian random particles. The shape of these particles is described by a log-normal distribution of their radius length and direction (Muinonen, EMP, 72, 1996). By changing one of the parameters of this distribution, the correlation angle, from 0 to 90 deg., we can model a variety of particles, from spheres to particles of random complex shape. We survey the angular and spectral dependencies of intensity and polarization resulting from light scattering by such particles, studying how they depend on particle shape, size, and composition (including porous particles to simulate aggregates) to find the best fit to the cometary observations.

  12. Dyadic confirmatory factor analysis of the inflammatory bowel disease family responsibility questionnaire.

    PubMed

    Greenley, Rachel Neff; Reed-Knight, Bonney; Blount, Ronald L; Wilson, Helen W

    2013-09-01

    Evaluate the factor structure of youth and maternal involvement ratings on the Inflammatory Bowel Disease Family Responsibility Questionnaire, a measure of family allocation of condition management responsibilities in pediatric inflammatory bowel disease. Participants included 251 youth aged 11-18 years with inflammatory bowel disease and their mothers. Item-level descriptive analyses, subscale internal consistency estimates, and confirmatory factor analyses of youth and maternal involvement were conducted using a dyadic data-analytic approach. Results supported the validity of 4 conceptually derived subscales including general health maintenance, social aspects, condition management tasks, and nutrition domains. Additionally, results indicated adequate support for the factor structure of a 21-item youth involvement measure and strong support for a 16-item maternal involvement measure. Additional empirical support for the validity of the Inflammatory Bowel Disease Family Responsibility Questionnaire was provided. Future research to replicate current findings and to examine the measure's clinical utility is warranted.
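
    One of the steps reported above, the subscale internal-consistency estimate, can be sketched as a Cronbach's alpha computation on a hypothetical item matrix (the questionnaire's actual items and data are not reproduced here).

        # Sketch: Cronbach's alpha for one subscale (rows = respondents, columns = items).
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Hypothetical 5-item "condition management tasks" subscale for 251 respondents.
        rng = np.random.default_rng(3)
        latent = rng.normal(size=(251, 1))
        items = latent + rng.normal(scale=0.8, size=(251, 5))
        print(f"alpha = {cronbach_alpha(items):.2f}")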

  13. Analysis on critical success factors for agile manufacturing evaluation in original equipment manufacturing industry-an AHP approach

    NASA Astrophysics Data System (ADS)

    Ajay Guru Dev, C.; Senthil Kumar, V. S.

    2016-09-01

    Manufacturing industries are facing challenges in the implementation of agile manufacturing in their products and processes. Agility is widely accepted as a new competitive concept in the manufacturing sector for fulfilling varying customer demand. Thus, evaluation of agile manufacturing in industries has become a necessity. The success of an organisation depends on its ability to identify the critical success factors and give them special and continued attention in order to bring about high performance. This paper proposes a set of critical success factors (CSFs), considered appropriate for the manufacturing sector, for evaluating agile manufacturing. The analytical hierarchy process (AHP) method is applied to prioritize the success factors by summarizing the opinions of experts. It is believed that the proposed CSFs enable and assist manufacturing industries to achieve higher performance in agile manufacturing so as to increase competitiveness.
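
    A hedged sketch of the AHP prioritization step, using a hypothetical 4x4 pairwise-comparison matrix; the paper's actual factors and expert judgments are not reproduced here.

        # Sketch: AHP priority vector (principal eigenvector) and consistency ratio
        # for a reciprocal pairwise-comparison matrix of candidate critical success factors.
        import numpy as np

        # Hypothetical judgments for four CSFs on Saaty's 1-9 scale (reciprocal matrix).
        A = np.array([[1.0, 3.0, 5.0, 2.0],
                      [1/3, 1.0, 3.0, 1/2],
                      [1/5, 1/3, 1.0, 1/4],
                      [1/2, 2.0, 4.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                      # priority vector (factor weights)

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
        cr = ci / 0.90                                # random index RI = 0.90 for n = 4
        print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))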

  14. VAST Challenge 2016: Streaming Visual Analytics

    DTIC Science & Technology

    2016-10-25

    ...understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time... Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia... Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries for what constitutes a visual analytics system, and to

  15. Use of experimental design in the investigation of stir bar sorptive extraction followed by ultra-high-performance liquid chromatography-tandem mass spectrometry for the analysis of explosives in water samples.

    PubMed

    Schramm, Sébastien; Vailhen, Dominique; Bridoux, Maxime Cyril

    2016-02-12

    A method for the sensitive quantification of trace amounts of organic explosives in water samples was developed using stir bar sorptive extraction (SBSE) followed by liquid desorption and ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). The proposed method was developed and optimized using a statistical design-of-experiments approach. Use of experimental designs allowed a complete study of 10 factors and 8 analytes, including nitro-aromatics, amino-nitro-aromatics and nitric esters. The liquid desorption study was performed using a full factorial experimental design followed by a kinetic study. Four variables were tested here: the liquid desorption mode (stirring or sonication), the chemical nature of the stir bar (PDMS or PDMS-PEG), the composition of the liquid desorption phase and, finally, the volume of solvent used for the liquid desorption. The SBSE extraction study, on the other hand, was performed using a Doehlert design; SBSE extraction conditions such as extraction time profiles, sample volume, modifier addition, and acetic acid addition were examined. After optimization of the experimental parameters, sensitivity was improved by a factor of 5-30, depending on the compound studied, owing to the enrichment factors reached using the SBSE method. Limits of detection were at the ng/L level for all analytes studied. Reproducibility of the extraction with different stir bars was close to the reproducibility of the analytical method (RSD between 4 and 16%). Extractions in various water sample matrices (spring, mineral and underground water) showed similar enrichment compared to ultrapure water, revealing very low matrix effects. Copyright © 2016 Elsevier B.V. All rights reserved.
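
    For the liquid-desorption screening step, a full two-level factorial over the four variables named above can be enumerated directly; the sketch below only generates the run table (factor labels follow the abstract, levels are illustrative).

        # Sketch: 2^4 full factorial run table for the liquid desorption screening step.
        from itertools import product

        factors = {
            "desorption_mode": ["stirring", "sonication"],
            "stir_bar_phase": ["PDMS", "PDMS-PEG"],
            "desorption_solvent": ["ACN", "ACN/MeOH"],     # illustrative compositions
            "solvent_volume_mL": [0.2, 1.0],               # illustrative volumes
        }

        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        for i, run in enumerate(runs, 1):
            print(i, run)
        # 16 runs; the measured responses (peak areas) would then be fitted to estimate
        # main effects and interactions before the kinetic (desorption time) study.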

  16. Altered amygdalar resting-state connectivity in depression is explained by both genes and environment.

    PubMed

    Córdova-Palomera, Aldo; Tornador, Cristian; Falcón, Carles; Bargalló, Nuria; Nenadic, Igor; Deco, Gustavo; Fañanás, Lourdes

    2015-10-01

    Recent findings indicate that alterations of amygdalar resting-state fMRI connectivity play an important role in the etiology of depression. While both depression and resting-state brain activity are shaped by genes and environment, the relative contributions of genetic and environmental factors mediating the relationship between amygdalar resting-state connectivity and depression remain largely unexplored. Likewise, novel neuroimaging research indicates that different mathematical representations of resting-state fMRI activity patterns are able to embed distinct information relevant to brain health and disease. The present study analyzed the influence of genes and environment on amygdalar resting-state fMRI connectivity, in relation to depression risk. High-resolution resting-state fMRI scans were analyzed to estimate functional connectivity patterns in a sample of 48 twins (24 monozygotic pairs) informative for depressive psychopathology (6 concordant, 8 discordant and 10 healthy control pairs). A graph-theoretical framework was employed to construct brain networks using two methods: (i) the conventional approach of filtered BOLD fMRI time-series and (ii) analytic components of this fMRI activity. Results using both methods indicate that depression risk is increased by environmental factors altering amygdalar connectivity. When analyzing the analytic components of the BOLD fMRI time-series, genetic factors altering the amygdala's neural activity at rest show an important contribution to depression risk. Overall, these findings show that both genes and environment modify different patterns of amygdalar resting-state connectivity to increase depression risk. The genetic relationship between amygdalar connectivity and depression may be better elicited by examining analytic components of the brain's resting-state BOLD fMRI signals. © 2015 Wiley Periodicals, Inc.
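
    A hedged sketch of the two connectivity representations contrasted above: correlations of the band-limited time-series versus correlations of their analytic-signal (Hilbert) amplitude envelopes. The signals are synthetic, and the study's actual preprocessing and graph construction are more involved.

        # Sketch: functional connectivity from filtered time-series vs. from the analytic
        # signal (Hilbert-transform amplitude envelope) of the same signals.
        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(7)
        n_regions, n_timepoints = 6, 300
        ts = np.cumsum(rng.normal(size=(n_regions, n_timepoints)), axis=1)  # toy BOLD-like series
        ts -= ts.mean(axis=1, keepdims=True)

        fc_raw = np.corrcoef(ts)                        # conventional connectivity matrix
        envelopes = np.abs(hilbert(ts, axis=1))         # analytic-signal amplitude envelopes
        fc_analytic = np.corrcoef(envelopes)

        print(np.round(fc_raw, 2))
        print(np.round(fc_analytic, 2))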

  17. Trace metal speciation in natural waters: Computational vs. analytical

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    1996-01-01

    Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.

  18. A Resource-Constrained Approach to Implementing Analytics in an Institution of Higher Education: An Experience Report

    ERIC Educational Resources Information Center

    Buerck, John P.; Mudigonda, Srikanth P.

    2014-01-01

    Academic analytics and learning analytics have been increasingly adopted by academic institutions of higher learning for improving student performance and retention. While several studies have reported the implementation details and the successes of specific analytics initiatives, relatively fewer studies exist in literature that describe the…

  19. Development of a robust space power system decision model

    NASA Astrophysics Data System (ADS)

    Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert

    2001-02-01

    NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and provide on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which uses a formal two-step Analytical Hierarchy Process (TAHP) methodology in the decision-making process to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which employs a hierarchic approach to structuring decision factors by weights and ranks system design options relative to one another on a consistent basis. This decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface application(s). This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.

  20. Horizon-absorbed energy flux in circularized, nonspinning black-hole binaries, and its effective-one-body representation

    NASA Astrophysics Data System (ADS)

    Nagar, Alessandro; Akcay, Sarp

    2012-02-01

    We propose, within the effective-one-body approach, a new, resummed analytical representation of the gravitational-wave energy flux absorbed by a system of two circularized (nonspinning) black holes. This expression is such that it is well-behaved in the strong-field, fast-motion regime, notably up to the effective-one-body-defined last unstable orbit. Building conceptually upon the procedure adopted to resum the multipolar asymptotic energy flux, we introduce a multiplicative decomposition of the multipolar absorbed flux made of three factors: (i) the leading-order contribution, (ii) an “effective source” and (iii) a new residual amplitude correction $(\tilde{\rho}^{\mathrm{H}}_{\ell m})^{2\ell}$. In the test-mass limit, we use a frequency-domain perturbative approach to accurately compute numerically the horizon-absorbed fluxes along a sequence of stable and unstable circular orbits, and we extract from them the functions $\tilde{\rho}^{\mathrm{H}}_{\ell m}$. These quantities are then fitted via rational functions. The resulting analytically represented test-mass knowledge is then suitably hybridized with lower-order analytical information that is valid for any mass ratio. This yields a resummed representation of the absorbed flux for a generic, circularized, nonspinning black-hole binary. Our result adds new information to the state-of-the-art calculation of the absorbed flux at fractional 5 post-Newtonian order [S. Taylor and E. Poisson, Phys. Rev. D 78, 084016 (2008)], which is recovered in the weak-field limit approximation by construction.
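
    In schematic form, the decomposition described above reads

        \dot{E}^{\mathrm{H}}_{\ell m} \;=\; \dot{E}^{\mathrm{H,LO}}_{\ell m}\;\hat{S}_{\mathrm{eff}}\;\big(\tilde{\rho}^{\mathrm{H}}_{\ell m}\big)^{2\ell},

    with the leading-order contribution, the effective source, and the residual amplitude correction entering as the three multiplicative factors (precise normalizations and powers of the source term are as defined in the paper); the test-mass fits supply $\tilde{\rho}^{\mathrm{H}}_{\ell m}$, which is then hybridized with the arbitrary-mass-ratio analytical information.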

  1. Multiplexed MRM-based quantitation of candidate cancer biomarker proteins in undepleted and non-enriched human plasma.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Borchers, Christoph H

    2013-07-01

    An emerging approach for multiplexed targeted proteomics involves bottom-up LC-MRM-MS, with stable isotope-labeled internal standard peptides, to accurately quantitate panels of putative disease biomarkers in biofluids. In this paper, we used this approach to quantitate 27 candidate cancer-biomarker proteins in human plasma that had not been treated by immunoaffinity depletion or enrichment techniques. These proteins have been reported as biomarkers for a variety of human cancers, from laryngeal to ovarian, with breast cancer having the highest correlation. We implemented measures to minimize the analytical variability, improve the quantitative accuracy, and increase the feasibility and applicability of this MRM-based method. We have demonstrated excellent retention time reproducibility (median interday CV: 0.08%) and signal stability (median interday CV: 4.5% for the analytical platform and 6.1% for the bottom-up workflow) for the 27 biomarker proteins (represented by 57 interference-free peptides). The linear dynamic range for the MRM assays spanned four orders of magnitude, with 25 assays covering a 10^3-10^4 range in protein concentration. The lowest abundance quantifiable protein in our biomarker panel was insulin-like growth factor 1 (calculated concentration: 127 ng/mL). Overall, the analytical performance of this assay demonstrates high robustness and sensitivity, and provides the necessary throughput and multiplexing capabilities required to verify and validate cancer-associated protein biomarker panels in human plasma, prior to clinical use. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to fulfil the current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and to the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Method optimization for drug impurity profiling in supercritical fluid chromatography: Application to a pharmaceutical mixture.

    PubMed

    Muscat Galea, Charlene; Didion, David; Clicq, David; Mangelings, Debby; Vander Heyden, Yvan

    2017-12-01

    A supercritical fluid chromatographic method for the separation of a drug and its impurities has been developed and optimized applying an experimental design approach and chromatogram simulations. Stationary-phase screening was followed by optimization of the modifier and injection-solvent composition. A design-of-experiments (DoE) approach was then used to optimize column temperature, back-pressure and gradient slope simultaneously. Regression models for the retention times and peak widths of all mixture components were built. The factor levels at different grid points were then used to predict the retention times and peak widths of the mixture components with these regression models, and the best separation for the worst-separated peak pair in the experimental domain was identified. A plot of the minimal resolutions was used to help identify the factor levels leading to the highest resolution between consecutive peaks. The effects of the DoE factors were visualized in a way that is familiar to the analytical chemist, i.e. by simulating the resulting chromatogram. The mixture of an active ingredient and seven impurities was separated in less than eight minutes. The approach discussed in this paper demonstrates how SFC methods can be developed and optimized efficiently using simple concepts and tools. Copyright © 2017 Elsevier B.V. All rights reserved.
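
    The grid-prediction step can be illustrated with a short, hedged sketch: the regression coefficients, factor ranges and peak-width model below are invented for illustration and are not taken from the study; the code only shows how predicted retention times and widths at every factor-level combination translate into a worst-case resolution that is then maximized.

      # Hedged sketch of the DoE grid search; all numbers are hypothetical.
      import itertools

      def predict(component, T, P, G):
          # component holds invented regression coefficients for retention time
          # as a function of temperature T, back-pressure P and gradient slope G
          a, b, c, d = component
          t_r = a + b * T + c * P + d * G
          width = 0.05 + 0.0005 * t_r      # invented peak-width model
          return t_r, width

      components = [(2.0, -0.010, 0.002, -0.05),
                    (2.3, -0.012, 0.002, -0.06),
                    (2.6, -0.009, 0.003, -0.07)]

      grid = itertools.product(range(25, 51, 5),     # temperature levels (degC)
                               range(105, 181, 15),  # back-pressure levels (bar)
                               range(1, 6))          # gradient-slope levels

      def min_resolution(T, P, G):
          # resolution Rs = 2*(t2 - t1)/(w1 + w2) for each consecutive peak pair
          peaks = sorted(predict(c, T, P, G) for c in components)
          rs = [2 * (t2 - t1) / (w1 + w2)
                for (t1, w1), (t2, w2) in zip(peaks, peaks[1:])]
          return min(rs)

      best = max(grid, key=lambda f: min_resolution(*f))
      print("factor levels maximizing the worst-case resolution:", best)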

  4. Evaluation of the availability of bound analyte for passive sampling in the presence of mobile binding matrix.

    PubMed

    Xu, Jianqiao; Huang, Shuyao; Jiang, Ruifen; Cui, Shufen; Luan, Tiangang; Chen, Guosheng; Qiu, Junlang; Cao, Chenyang; Zhu, Fang; Ouyang, Gangfeng

    2016-04-21

    Elucidating the availability of bound analytes for mass transfer through the diffusion boundary layers (DBLs) adjacent to passive samplers is important for understanding passive sampling kinetics in complex samples, in which the lability factor of the bound analyte in the DBL is an important parameter. In this study, the mathematical expression of the lability factor was deduced by assuming a pseudo-steady state during passive sampling, and the equation indicated that the lability factor is equal to the ratio of normalized concentration gradients between the bound and free analytes. Through the introduction of this mathematical expression, the modified effective average diffusion coefficient was proven to be more suitable for describing passive sampling kinetics in the presence of mobile binding matrices. Thereafter, the lability factors of polycyclic aromatic hydrocarbons (PAHs) bound to sodium dodecylsulphate (SDS) micelles as the binding matrices were determined according to the improved theory. The lability factors were observed to decrease with larger binding ratios and smaller micelle sizes, and were successfully used to predict the mass transfer efficiencies of PAHs through DBLs. This study promotes the understanding of the availability of bound analytes for passive sampling through theoretical improvements and experimental assessments. Copyright © 2016 Elsevier B.V. All rights reserved.
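
    As a hedged illustration of the statement that the lability factor equals the ratio of normalized concentration gradients of the bound and free analyte across the DBL (the symbol ξ and the normalization by the respective concentrations are notational choices made here, not necessarily those of the paper):

      \[
        \xi \;=\;
        \frac{\bigl(\partial C_{\mathrm{bound}}/\partial x\bigr)\,/\,C_{\mathrm{bound}}}
             {\bigl(\partial C_{\mathrm{free}}/\partial x\bigr)\,/\,C_{\mathrm{free}}},
      \]

      so that, in this convention, ξ approaches 1 for fully labile complexes and 0 for inert ones.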

  5. Internally insulated thermal storage system development program

    NASA Technical Reports Server (NTRS)

    Scott, O. L.

    1980-01-01

    A cost effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank were performed. Data from an existing thermocline test tank were compared with the model results to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.

  6. Internally insulated thermal storage system development program

    NASA Astrophysics Data System (ADS)

    Scott, O. L.

    1980-03-01

    A cost effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank were performed. Data from an existing thermocline test tank were compared with the model results to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.

  7. On the impact of cloudiness on the characteristics of nocturnal downslope flows

    NASA Astrophysics Data System (ADS)

    Ye, Z. J.; Segal, M.; Garratt, J. R.; Pielke, R. A.

    1989-10-01

    The effects of cloud-cover amount and the height of the cloud base on nighttime thermally induced downslope flow were investigated using analytical and numerical model approaches. The conclusions obtained with the analytical and numerical model evaluations agreed. It was concluded that (i) as cloud cover increases and/or the height of the cloud base decreases, the depth and the intensity of nighttime thermally induced downslope flows may decrease to as little as one sixth and one tenth of their clear-sky values, respectively, in the case of overcast low cloud; (ii) when skies suddenly cloud over around midnight, the development of the downslope flow is altered in different ways, either a reduction in intensity or a cessation of further development, depending on the fraction of cloud cover; and (iii) with a sudden clearing of overcast low cloud around midnight, the depth and the intensity of the downslope flow increase significantly.

  8. Ohmic Inflation of Hot Jupiters: an Analytical Approach

    NASA Astrophysics Data System (ADS)

    Ginzburg, Sivan; Sari, Re'em

    2015-12-01

    Many giant exoplanets in close orbits have observed radii which exceed theoretical predictions. One suggested explanation for this discrepancy is heat deposited deep inside the atmospheres of these hot Jupiters. We present an analytical model for the evolution of such irradiated, and internally heated, gas giants, and derive scaling laws for their cooling rates and radii. We estimate the Ohmic dissipation resulting from the interaction between the atmospheric winds and the planet's magnetic field, and apply our model to Ohmically heated planets. Our model can account for the observed radii of many inflated planets, but not the most extreme ones. We show that Ohmically heated planets have already reached their equilibrium phase and they no longer contract. We show that it is possible to re-inflate planets, but we confirm that re-heating timescales are longer than cooling times by about a factor of 30.

  9. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Miller, Dwight Peter

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  10. Exact analytical approach for six-degree-of-freedom measurement using image-orientation-change method.

    PubMed

    Tsai, Chung-Yu

    2012-04-01

    An exact analytical approach is proposed for measuring the six-degree-of-freedom (6-DOF) motion of an object using the image-orientation-change (IOC) method. The proposed measurement system comprises two reflector systems, where each system consists of two reflectors and one position sensing detector (PSD). The IOCs of the object in the two reflector systems are described using merit functions determined from the respective PSD readings before and after the motion occurs. The three rotation variables are then determined analytically from the eigenvectors of the corresponding merit functions. Once the three rotation variables are known, the translation equations reduce to a linear form, so the solution for the three translation variables can also be determined analytically. As a result, the motion transformation matrix describing the 6-DOF motion of the object is fully determined. The validity of the proposed approach is demonstrated by means of an illustrative example.

  11. A Meta-Analytic Review of Work-Family Conflict and Its Antecedents

    ERIC Educational Resources Information Center

    Byron, Kristin

    2005-01-01

    This meta-analytic review combines the results of more than 60 studies to help determine the relative effects of work, nonwork, and demographic and individual factors on work interference with family (WIF) and family interference with work (FIW). As expected, work factors related more strongly to WIF, and some nonwork factors were more strongly…

  12. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  13. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level

    PubMed Central

    Savalei, Victoria; Rhemtulla, Mijke

    2017-01-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371

  14. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level.

    PubMed

    Savalei, Victoria; Rhemtulla, Mijke

    2017-08-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data-that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study.

  15. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers has increased tremendously over the past decades. Other technological advances have increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that has accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  16. Time-optimal excitation of maximum quantum coherence: Physical limits and pulse sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Köcher, S. S.; Institute of Energy and Climate Research; Heydenreich, T.

    Here we study the optimum efficiency of the excitation of maximum quantum (MaxQ) coherence using analytical and numerical methods based on optimal control theory. The theoretical limit of the achievable MaxQ amplitude and the minimum time to achieve this limit are explored for a set of model systems consisting of up to five coupled spins. In addition to arbitrary pulse shapes, two simple pulse sequence families of practical interest are considered in the optimizations. Compared to conventional approaches, substantial gains were found both in terms of the achieved MaxQ amplitude and in pulse sequence durations. For a model system, theoretically predicted gains of a factor of three compared to the conventional pulse sequence were experimentally demonstrated. Motivated by the numerical results, two novel analytical transfer schemes were also found: compared to conventional approaches based on non-selective pulses and delays, double-quantum coherence in two-spin systems can be created twice as fast using isotropic mixing and hard spin-selective pulses. It is also proved that in a chain of three weakly coupled spins with the same coupling constants, triple-quantum coherence can be created in a time-optimal fashion using so-called geodesic pulses.

  17. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under the circumstances of dynamic production. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and the parallel computing power of MapReduce, a Bayesian network of factors affecting quality is built based on the prior probability distribution and modified with the posterior probability distribution. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also proved that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
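
    As a rough, generic sketch of the map/reduce pattern that such Bayesian-network parameter learning follows (the field names and records are hypothetical, and the actual system runs on a Hadoop cluster rather than in-process Python), each map task counts state combinations in its data partition and the reduce step merges the counts into a conditional probability table:

      # Hedged sketch: MapReduce-style counting for a Bayesian-network CPT.
      from collections import Counter
      from functools import reduce

      def map_counts(partition):
          # partition: list of discretized quality records (hypothetical fields)
          return Counter((rec["machine"], rec["defect"]) for rec in partition)

      def reduce_counts(c1, c2):
          # merge counts from two partitions
          return c1 + c2

      def cpt(counts):
          # estimate P(defect | machine) from the merged counts
          totals = Counter()
          for (machine, _), n in counts.items():
              totals[machine] += n
          return {(m, d): n / totals[m] for (m, d), n in counts.items()}

      partitions = [
          [{"machine": "M1", "defect": "yes"}, {"machine": "M1", "defect": "no"}],
          [{"machine": "M2", "defect": "no"}, {"machine": "M1", "defect": "no"}],
      ]
      merged = reduce(reduce_counts, map(map_counts, partitions))
      print(cpt(merged))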

  18. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  19. How effects on health equity are assessed in systematic reviews of interventions.

    PubMed

    Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet

    2010-12-08

    Enhancing health equity has now achieved international political importance, with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. The objective of this study was to systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2, 2010: MEDLINE, PsychINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. On October 7, 2010, we searched SCOPUS to identify articles that cited any of the included studies. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22), 2) gap approaches (n=12), and 3) the gradient approach (n=1). Gender or sex was assessed in eight out of 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, and low- and middle-income countries in 14 studies; two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither the analytic nor the applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improved conceptual clarity about the definition of health equity, sufficient detail in describing analytic approaches (including subgroup analyses), and transparent reporting of the judgments required for applicability assessments, in order to assess and report effects on health equity in systematic reviews.

  20. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, corona/lens-based collection provides a new path to achieving high sensitivity. An active-matrix-based analyte collection approach referred to as an "airborne analyte memory chip/recorder" is demonstrated, which captures and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Psychosocial Working Conditions and Suicide Ideation: Evidence From a Cross-Sectional Survey of Working Australians.

    PubMed

    Milner, Allison; Page, Kathryn; Witt, Katrina; LaMontagne, Anthony

    2016-06-01

    This study examined psychosocial working factors such as job control, job demands, job insecurity, supervisor support, and workplace bullying as risk factors for suicide ideation. We used a logistic analytic approach to assess risk factors for thoughts of suicide in a cross-sectional sample of working Australians. Potential predictors included the psychosocial job stressors described above; we also controlled for age, gender, occupational skill level, and psychological distress. We found that workplace bullying or harassment was associated with 1.54 times greater odds of suicide ideation (95% confidence interval 1.64 to 2.05) in the model including psychological distress. Results also suggest that higher job control and job security were associated with lower odds of suicide ideation. These results suggest the need for organizational-level intervention to address psychosocial job stressors, including bullying.

  2. Liquid Metering Centrifuge Sticks (LMCS): A Centrifugal Approach to Metering Known Sample Volumes for Colorimetric Solid Phase Extraction (C-SPE)

    NASA Technical Reports Server (NTRS)

    Gazda, Daniel B.; Schultz, John R.; Clarke, Mark S.

    2007-01-01

    Phase separation is one of the most significant obstacles encountered during the development of analytical methods for water quality monitoring in spacecraft environments. Removing air bubbles from water samples prior to analysis is a routine task on earth; however, in the absence of gravity, this routine task becomes extremely difficult. This paper details the development and initial ground testing of liquid metering centrifuge sticks (LMCS), devices designed to collect and meter a known volume of bubble-free water in microgravity. The LMCS uses centrifugal force to eliminate entrapped air and reproducibly meter liquid sample volumes for analysis with Colorimetric Solid Phase Extraction (C-SPE). C-SPE is a sorption-spectrophotometric platform that is being developed as a potential spacecraft water quality monitoring system. C-SPE utilizes solid phase extraction membranes impregnated with analyte-specific colorimetric reagents to concentrate and complex target analytes in spacecraft water samples. The mass of analyte extracted from the water sample is determined using diffuse reflectance (DR) data collected from the membrane surface and an analyte-specific calibration curve. The analyte concentration can then be calculated from the mass of extracted analyte and the volume of the sample analyzed. Previous flight experiments conducted in microgravity conditions aboard the NASA KC-135 aircraft demonstrated that the inability to collect and meter a known volume of water using a syringe was a limiting factor in the accuracy of C-SPE measurements. Herein, results obtained from ground based C-SPE experiments using ionic silver as a test analyte and either the LMCS or syringes for sample metering are compared to evaluate the performance of the LMCS. These results indicate very good agreement between the two sample metering methods and clearly illustrate the potential of utilizing centrifugal forces to achieve phase separation and metering of water samples in microgravity.
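
    The final calculation described above reduces to inverting the analyte-specific calibration curve and dividing by the metered sample volume; the sketch below assumes a hypothetical linear calibration (slope, intercept and readings are invented placeholders, not values from the C-SPE experiments).

      # Hedged sketch of the C-SPE concentration calculation.
      def analyte_mass_ug(dr_signal, slope=0.8, intercept=0.02):
          # invert a hypothetical linear calibration: signal = slope*mass + intercept
          return (dr_signal - intercept) / slope

      def concentration_ug_per_ml(dr_signal, metered_volume_ml):
          # concentration = extracted analyte mass / metered sample volume
          return analyte_mass_ug(dr_signal) / metered_volume_ml

      print(concentration_ug_per_ml(dr_signal=0.42, metered_volume_ml=1.0))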

  3. Errors in clinical laboratories or errors in laboratory medicine?

    PubMed

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes in pre- and post-examination steps must be minimized to guarantee the total quality of laboratory services.

  4. SAW-Based Phononic Crystal Microfluidic Sensor—Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications

    PubMed Central

    Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V.; Hirsch, Soeren

    2017-01-01

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept. PMID:28946609

  5. SAW-Based Phononic Crystal Microfluidic Sensor-Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications.

    PubMed

    Oseev, Aleksandr; Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V; Hirsch, Soeren

    2017-09-23

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept.

  6. Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors

    PubMed Central

    Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng

    2010-01-01

    Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give reliable and fast results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensors and bioassays. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of the MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronics approaches to increase the sensitivity of microcantilever (MCL) sensors. PMID:20975987

  7. Unsteady fluid flow in a slightly curved pipe: A comparative study of a matched asymptotic expansions solution with a single analytical solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messaris, Gerasimos A. T., E-mail: messaris@upatras.gr; School of Science and Technology, Hellenic Open University, 11 Sahtouri Street, GR 262 22 Patras; Hadjinicolaou, Maria

    The present work is motivated by the fact that blood flow in the aorta and the main arteries is governed by large finite values of the Womersley number α and for such values of α there is not any analytical solution in the literature. The existing numerical solutions, although accurate, give limited information about the factors that affect the flow, whereas an analytical approach has an advantage in that it can provide physical insight to the flow mechanism. Having this in mind, we seek analytical solution to the equations of the fluid flow driven by a sinusoidal pressure gradient in a slightly curved pipe of circular cross section when the Womersley number varies from small finite to infinite values. Initially the equations of motion are expanded in terms of the curvature ratio δ and the resulting linearized equations are solved analytically in two ways. In the first, we match the solution for the main core to that for the Stokes boundary layer. This solution is valid for very large values of α. In the second, we derive a straightforward single solution valid to the entire flow region and for 8 ≤ α < ∞, a range which includes the values of α that refer to the physiological flows. Each solution contains expressions for the axial velocity, the stream function, and the wall stresses and is compared to the analogous forms presented in other studies. The two solutions give identical results to each other regarding the axial flow but differ in the secondary flow and the circumferential wall stress, due to the approximations employed in the matched asymptotic expansion process. The results on the stream function from the second solution are in agreement with analogous results from other numerical solutions. The second solution predicts that the atherosclerotic plaques may develop in any location around the cross section of the aortic wall unlike to the prescribed locations predicted by the first solution. In addition, it gives circumferential wall stresses augmented by approximately 100% with respect to the matched asymptotic expansions, a factor that may contribute jointly with other pathological factors to the faster aging of the arterial system and the possible malfunction of the aorta.

  8. Unsteady fluid flow in a slightly curved pipe: A comparative study of a matched asymptotic expansions solution with a single analytical solution

    NASA Astrophysics Data System (ADS)

    Messaris, Gerasimos A. T.; Hadjinicolaou, Maria; Karahalios, George T.

    2016-08-01

    The present work is motivated by the fact that blood flow in the aorta and the main arteries is governed by large finite values of the Womersley number α and for such values of α there is not any analytical solution in the literature. The existing numerical solutions, although accurate, give limited information about the factors that affect the flow, whereas an analytical approach has an advantage in that it can provide physical insight to the flow mechanism. Having this in mind, we seek analytical solution to the equations of the fluid flow driven by a sinusoidal pressure gradient in a slightly curved pipe of circular cross section when the Womersley number varies from small finite to infinite values. Initially the equations of motion are expanded in terms of the curvature ratio δ and the resulting linearized equations are solved analytically in two ways. In the first, we match the solution for the main core to that for the Stokes boundary layer. This solution is valid for very large values of α. In the second, we derive a straightforward single solution valid to the entire flow region and for 8 ≤ α < ∞, a range which includes the values of α that refer to the physiological flows. Each solution contains expressions for the axial velocity, the stream function, and the wall stresses and is compared to the analogous forms presented in other studies. The two solutions give identical results to each other regarding the axial flow but differ in the secondary flow and the circumferential wall stress, due to the approximations employed in the matched asymptotic expansion process. The results on the stream function from the second solution are in agreement with analogous results from other numerical solutions. The second solution predicts that the atherosclerotic plaques may develop in any location around the cross section of the aortic wall unlike to the prescribed locations predicted by the first solution. In addition, it gives circumferential wall stresses augmented by approximately 100% with respect to the matched asymptotic expansions, a factor that may contribute jointly with other pathological factors to the faster aging of the arterial system and the possible malfunction of the aorta.

  9. Comment on 'Parametrization of Stillinger-Weber potential based on a valence force field model: application to single-layer MoS2 and black phosphorus'.

    PubMed

    Midtvedt, Daniel; Croy, Alexander

    2016-06-10

    We compare the simplified valence-force model for single-layer black phosphorus with the original model and recent ab initio results. Using an analytic approach and numerical calculations, we find that the simplified model yields Young's moduli that are smaller than those of the original model and almost a factor of two smaller than the ab initio results. Moreover, the Poisson ratios are an order of magnitude smaller than values found in the literature.

  10. On the phenomenology of extended Brans-Dicke gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, Nelson A.; Ferreira, Pedro G., E-mail: ndal@roe.ac.uk, E-mail: p.ferreira1@physics.ox.ac.uk

    We introduce a designer approach for extended Brans-Dicke gravity that allows us to obtain the evolution of the scalar field by fixing the Hubble parameter to that of a wCDM model. We obtain analytical approximations for ϕ as a function of the scale factor and use these to build expressions for the effective Newton's constant at the background and at the linear level and the slip between the perturbed Newtonian potentials. By doing so, we are able to explore their dependence on the fundamental parameters of the theory.

  11. What are the correct ρ0(770 ) meson mass and width values?

    NASA Astrophysics Data System (ADS)

    Bartoš, Erik; Dubnička, Stanislav; Liptaj, Andrej; Dubničková, Anna Zuzana; Kamiński, Robert

    2017-12-01

    The accuracy of the Gounaris-Sakurai pion electromagnetic form factor model at the elastic region, in which just the ρ0(770 ) resonance appears, is investigated by the particular analysis of the most accurate P-wave isovector π π scattering phase shift δ11(t ) data, obtained by the Garcia-Martin-Kamiński-Peláez-Yndurain approach, and by an application of the Unitary&Analytic pion electromagnetic structure model to a description of the newest precise data on the e+e-→π+π- process.

  12. Incorporating photon recycling into the analytical drift-diffusion model of high efficiency solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumb, Matthew P.; Naval Research Laboratory, Washington, DC 20375; Steiner, Myles A.

    The analytical drift-diffusion formalism is able to accurately simulate a wide range of solar cell architectures and was recently extended to include those with back surface reflectors. However, as solar cells approach the limits of material quality, photon recycling effects become increasingly important in predicting the behavior of these cells. In particular, the minority carrier diffusion length is significantly affected by photon recycling, with consequences for the solar cell performance. In this paper, we outline an approach to account for photon recycling in the analytical Hovel model and compare analytical model predictions to GaAs-based experimental devices operating close to the fundamental efficiency limit.

  13. SmartAQnet: remote and in-situ sensing of urban air quality

    NASA Astrophysics Data System (ADS)

    Budde, Matthias; Riedel, Till; Beigl, Michael; Schäfer, Klaus; Emeis, Stefan; Cyrys, Josef; Schnelle-Kreis, Jürgen; Philipp, Andreas; Ziegler, Volker; Grimm, Hans; Gratza, Thomas

    2017-10-01

    Air quality and the associated subjective and health-related quality of life are among the important topics of urban life in our time. However, it is very difficult for many cities to take measures to accommodate today's needs concerning e.g. mobility, housing and work, because consistent fine-granular data and information on causal chains are largely missing. This has the potential to change, as both large-scale basic data and new promising measuring approaches are becoming available today. The project "SmartAQnet", funded by the German Federal Ministry of Transport and Digital Infrastructure (BMVI), is based on a pragmatic, data-driven approach, which for the first time combines existing data sets with a networked mobile measurement strategy in the urban space. By connecting open data, such as weather data or development plans, remote sensing of influencing factors, and new mobile measurement approaches, such as participatory sensing with low-cost sensor technology, "scientific scouts" (autonomous, mobile smart dust measurement devices that are auto-calibrated to a high-quality reference instrument within an intelligent monitoring network) and demand-oriented measurements by light-weight UAVs, a novel measuring and analysis concept is created within the model region of Augsburg, Germany. In addition to novel analytics, a prototypical technology stack is planned which, through modern analytics methods and Big Data and IoT technologies, enables application in a scalable way.

  14. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression.

    PubMed

    Beckstead, Jason W

    2012-03-30

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic strategy to isolate, examine, and remove suppression effects has been offered. In this article such an approach, rooted in confirmatory factor analysis theory and employing matrix algebra, is developed. Suppression is viewed as the result of criterion-irrelevant variance operating among predictors. Decomposition of predictor variables into criterion-relevant and criterion-irrelevant components using structural equation modeling permits derivation of regression weights with the effects of criterion-irrelevant variance omitted. Three examples with data from applied research are used to illustrate the approach: the first assesses child and parent characteristics to explain why some parents of children with obsessive-compulsive disorder accommodate their child's compulsions more so than do others, the second examines various dimensions of personal health to explain individual differences in global quality of life among patients following heart surgery, and the third deals with quantifying the relative importance of various aptitudes for explaining academic performance in a sample of nursing students. The approach is offered as an analytic tool for investigators interested in understanding predictor-criterion relationships when complex patterns of intercorrelation among predictors are present and is shown to augment dominance analysis.
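
    A small synthetic illustration (not one of the article's three applied examples) of the suppression phenomenon the approach targets: the suppressor x2 carries almost no criterion-relevant variance, yet adding it to the regression inflates x1's weight by absorbing x1's criterion-irrelevant variance.

      # Hedged sketch: classical suppression with synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      irrelevant = rng.normal(size=n)             # criterion-irrelevant component
      relevant = rng.normal(size=n)               # criterion-relevant component
      x1 = relevant + irrelevant                  # predictor with mixed variance
      x2 = irrelevant + 0.1 * rng.normal(size=n)  # suppressor: mostly irrelevant
      y = relevant + rng.normal(size=n)           # criterion

      def ols(X, y):
          # ordinary least squares; returns slopes (intercept dropped)
          X = np.column_stack([np.ones(len(y)), X])
          return np.linalg.lstsq(X, y, rcond=None)[0][1:]

      print("x1 alone:          ", ols(x1[:, None], y))               # about 0.5
      print("x1 with suppressor:", ols(np.column_stack([x1, x2]), y)) # about [1, -1]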

  15. Using Configural Frequency Analysis as a Person-Centered Analytic Approach with Categorical Data

    ERIC Educational Resources Information Center

    Stemmler, Mark; Heine, Jörg-Henrik

    2017-01-01

    Configural frequency analysis and log-linear modeling are presented as person-centered analytic approaches for the analysis of categorical or categorized data in multi-way contingency tables. Person-centered developmental psychology, based on the holistic interactionistic perspective of the Stockholm working group around David Magnusson and Lars…

  16. Combining CBT and Behavior-Analytic Approaches to Target Severe Emotion Dysregulation in Verbal Youth with ASD and ID

    ERIC Educational Resources Information Center

    Parent, Veronique; Birtwell, Kirstin B.; Lambright, Nathan; DuBard, Melanie

    2016-01-01

    This article presents an individual intervention combining cognitive-behavioral and behavior-analytic approaches to target severe emotion dysregulation in verbal youth with autism spectrum disorder (ASD) concurrent with intellectual disability (ID). The article focuses on two specific individuals who received the treatment within a therapeutic…

  17. Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis

    ERIC Educational Resources Information Center

    Edwards, Jeffrey R.; Lambert, Lisa Schurer

    2007-01-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…

  18. A Simplified, General Approach to Simulating from Multivariate Copula Functions

    Treesearch

    Barry Goodwin

    2012-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses \\probability{...

  19. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    ERIC Educational Resources Information Center

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  20. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  1. An Investigation of First-Year Engineering Student and Instructor Perspectives of Learning Analytics Approaches

    ERIC Educational Resources Information Center

    Knight, David B.; Brozina, Cory; Novoselich, Brian

    2016-01-01

    This paper investigates how first-year engineering undergraduates and their instructors describe the potential for learning analytics approaches to contribute to student success. Results of qualitative data collection in a first-year engineering course indicated that both students and instructors\temphasized a preference for learning analytics…

  2. Thermoelectric phonon-glass electron-crystal via ion beam patterning of silicon

    NASA Astrophysics Data System (ADS)

    Zhu, Taishan; Swaminathan-Gopalan, Krishnan; Stephani, Kelly; Ertekin, Elif

    2018-05-01

    Ion beam irradiation has recently emerged as a versatile approach to functional materials design. We show in this work that patterned defective regions generated by ion beam irradiation of silicon can create a phonon-glass electron-crystal (PGEC), a long-standing goal of thermoelectrics. By controlling the effective diameter of and spacing between the defective regions, molecular dynamics simulations suggest a reduction of the thermal conductivity by a factor of ~20 is achievable. Boltzmann theory shows that the thermoelectric power factor remains largely intact in the damaged material. To facilitate the Boltzmann theory, we derive an analytical model for electron scattering with cylindrical defective regions based on partial-wave analysis. Together we predict a figure of merit of ZT ≈ 0.5 or more at room temperature for optimally patterned geometries of these silicon metamaterials. These findings indicate that nanostructuring of patterned defective regions in crystalline materials is a viable approach to realize a PGEC, and ion beam irradiation could be a promising fabrication strategy.
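
    For context, the figure of merit quoted above combines the power factor and the thermal conductivity in the standard way,

      \[
        ZT \;=\; \frac{S^{2}\sigma\,T}{\kappa},
      \]

      where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature and κ the total thermal conductivity; a roughly 20-fold reduction in κ with a largely intact power factor S²σ is what drives the predicted ZT ≈ 0.5.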

  3. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts of a bicycle, and it is an expensive product due to the high cost of material and of the production process. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to employ in the manufacturing of a composite bicycle crank arm at the early stage of the product development process, in order to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. The most suitable manufacturing process was determined by following the AHP steps, and a consistency test was performed to make sure the judgements were consistent during the comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest priority value (33.6%) among the manufacturing processes considered.
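
    A minimal sketch of the AHP machinery referred to above: the 4x4 pairwise-comparison matrix below is invented for illustration (so its priority vector will not reproduce the study's 33.6% result); the priorities come from the principal eigenvector and the consistency ratio uses the standard random index for n = 4.

      # Hedged AHP sketch with an invented pairwise-comparison matrix.
      import numpy as np

      # rows/columns: RTM, CM, vacuum bag molding, FW (judgements are hypothetical)
      A = np.array([[1.0, 1/3, 2.0, 1.0],
                    [3.0, 1.0, 4.0, 2.0],
                    [0.5, 1/4, 1.0, 0.5],
                    [1.0, 0.5, 2.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                 # priority vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
      ri = 0.90                                # random index for n = 4
      print("priorities:", weights.round(3), " CR:", round(ci / ri, 3))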

  4. Assessing Measurement Invariance for Spanish Sentence Repetition and Morphology Elicitation Tasks.

    PubMed

    Kapantzoglou, Maria; Thompson, Marilyn S; Gray, Shelley; Restrepo, M Adelaida

    2016-04-01

    The purpose of this study was to evaluate evidence supporting the construct validity of two grammatical tasks (sentence repetition, morphology elicitation) included in the Spanish Screener for Language Impairment in Children (Restrepo, Gorin, & Gray, 2013). We evaluated whether the tasks measured the targeted grammatical skills in the same way across predominantly Spanish-speaking children with typical language development and those with primary language impairment. A multiple-group confirmatory factor analytic approach was applied to examine factorial invariance in a sample of 307 predominantly Spanish-speaking children (177 with typical language development; 130 with primary language impairment). The two newly developed grammatical tasks were modeled as measures in a unidimensional confirmatory factor analytic model along with three well-established grammatical measures from the Clinical Evaluation of Language Fundamentals-Fourth Edition, Spanish (Wiig, Semel, & Secord, 2006). Results suggest that both new tasks measured the construct of grammatical skills for both language-ability groups in an equivalent manner. There was no evidence of bias related to children's language status for the Spanish Screener for Language Impairment in Children Sentence Repetition or Morphology Elicitation tasks. Results provide support for the validity of the new tasks as measures of grammatical skills.

  5. Rapid and sensitive determination of tellurium in soil and plant samples by sector-field inductively coupled plasma mass spectrometry.

    PubMed

    Yang, Guosheng; Zheng, Jian; Tagami, Keiko; Uchida, Shigeo

    2013-11-15

    In this work, we report a rapid and highly sensitive analytical method for the determination of tellurium in soil and plant samples using sector-field inductively coupled plasma mass spectrometry (SF-ICP-MS). Soil and plant samples were digested using aqua regia. After appropriate dilution, Te in the soil and plant samples was analyzed directly without any separation or preconcentration. This simple sample preparation approach minimized contamination and loss of Te prior to analysis. The developed analytical method was validated by the analysis of soil/sediment and plant reference materials. Satisfactory detection limits of 0.17 ng g⁻¹ for soil and 0.02 ng g⁻¹ for plant samples were achieved, making the developed method applicable to studying the soil-to-plant transfer factor of Te. This work provides, for the first time, data on the soil-to-plant transfer factor of Te for Japanese samples, which can be used for the estimation of the internal radiation dose from radioactive tellurium due to the Fukushima Daiichi Nuclear Power Plant accident. Copyright © 2013 Elsevier B.V. All rights reserved.
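
    The soil-to-plant transfer factor mentioned above is conventionally defined as a simple concentration ratio (usually on a dry-mass basis),

      \[
        \mathrm{TF} \;=\; \frac{C_{\mathrm{plant}}}{C_{\mathrm{soil}}},
      \]

      which is why comparably low detection limits in both matrices are needed before the factor can be determined reliably.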

  6. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  7. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    PubMed

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

    An analytical method is developed for determining the light distribution at the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. The mirror transformation matrices and a special algorithm for removing void virtual images are created to obtain the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing, and four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and the Gaussian light source. The analytical approach will be useful for designing and selecting hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.

  8. Accurate analytical modeling of junctionless DG-MOSFET by Green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of the junctionless double gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using the Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to the 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations, which are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of the Ids-Vgs curves between the long-channel and short-channel devices. It is observed that the Green's function approach of solving the 2-D Poisson's equation in both the oxide and silicon regions can accurately predict the channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off and subthreshold slope (SS) of both long- and short-channel devices designed with different doping concentrations and higher as well as lower tsi/tox ratios. All the analytical model results are verified through comparisons with TCAD Sentaurus simulation results, and the model matches the TCAD device simulations quite well.
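
    As a hedged illustration of the general technique (the paper's specific multi-zone boundary conditions and Fourier coefficients are not reproduced here), the subthreshold channel potential follows from the 2-D Poisson equation, whose solution can be written with an eigenfunction-expansion Green's function, e.g. for Dirichlet conditions on 0 ≤ x ≤ L:

    ```latex
    \nabla^{2}\psi(x,y) = -\frac{\rho(x,y)}{\varepsilon},
    \qquad
    \psi(x,y) = \int_{\Omega} G(x,y;x',y')\,\frac{\rho(x',y')}{\varepsilon}\,dx'\,dy' \;+\; \text{boundary terms},
    \qquad
    G(x,y;x',y') = \sum_{n=1}^{\infty} \frac{2}{L}\,
      \sin\!\Big(\frac{n\pi x}{L}\Big)\sin\!\Big(\frac{n\pi x'}{L}\Big)\, g_{n}(y,y'),
    ```

    where each one-dimensional kernel g_n(y, y') is built from sinh/cosh solutions of the corresponding equation in the perpendicular direction; a device model of this kind stitches such expansions across the oxide and silicon zones.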

  9. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  10. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    PubMed Central

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  12. The factor structure of the Values in Action Inventory of Strengths (VIA-IS): An item-level exploratory structural equation modeling (ESEM) bifactor analysis.

    PubMed

    Ng, Vincent; Cao, Mengyang; Marsh, Herbert W; Tay, Louis; Seligman, Martin E P

    2017-08-01

    The factor structure of the Values in Action Inventory of Strengths (VIA-IS; Peterson & Seligman, 2004) has not been well established as a result of methodological challenges primarily attributable to a global positivity factor, item cross-loading across character strengths, and questions concerning the unidimensionality of the scales assessing character strengths. We sought to overcome these methodological challenges by applying exploratory structural equation modeling (ESEM) at the item level using a bifactor analytic approach to a large sample of 447,573 participants who completed the VIA-IS with all 240 character strengths items and a reduced set of 107 unidimensional character strength items. It was found that a 6-factor bifactor structure generally held for the reduced set of unidimensional character strength items; these dimensions were justice, temperance, courage, wisdom, transcendence, humanity, and an overarching general factor that is best described as dispositional positivity. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    PubMed

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another hint of its validity and therefore its usefulness. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
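
    The loading-based weighting idea described above can be sketched in a few lines; the data matrix, item count, and weighting rule below are hypothetical stand-ins (using the factor_analyzer package for the exploratory step), not the study's actual indicators or weights.

    ```python
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    # Hypothetical data: 183 hospitals x 20 IT-support indicators (stand-in for the survey items)
    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.random((183, 20)))

    # Exploratory factor analysis with the five factors reported in the study
    fa = FactorAnalyzer(n_factors=5, rotation="varimax")
    fa.fit(X)
    loadings = pd.DataFrame(fa.loadings_, index=X.columns)

    # One simple loading-based scheme: weight each indicator by its strongest
    # absolute loading, then normalize the weights to sum to one
    w = loadings.abs().max(axis=1)
    w = w / w.sum()

    # Workflow composite score per hospital as a weighted sum of its indicators
    composite = X.mul(w, axis=1).sum(axis=1)
    print(composite.head())
    ```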

  14. Implementing a Contributory Scoring Approach for the "GRE"® Analytical Writing Section: A Comprehensive Empirical Investigation. Research Report. ETS RR-17-14

    ERIC Educational Resources Information Center

    Breyer, F. Jay; Rupp, André A.; Bridgeman, Brent

    2017-01-01

    In this research report, we present an empirical argument for the use of a contributory scoring approach for the 2-essay writing assessment of the analytical writing section of the "GRE"® test in which human and machine scores are combined for score creation at the task and section levels. The approach was designed to replace a currently…

  15. Use of multiple colorimetric indicators for paper-based microfluidic devices.

    PubMed

    Dungchai, Wijitar; Chailapakul, Orawon; Henry, Charles S

    2010-08-03

    We report here the use of multiple indicators for a single analyte in paper-based microfluidic devices (microPADs) in an effort to improve the ability to visually discriminate between analyte concentrations. In existing microPADs, a single dye system is used for the measurement of a single analyte. In our approach, devices are designed to simultaneously quantify analytes using multiple indicators for each analyte, improving the accuracy of the assay. The use of multiple indicators for a single analyte allows different indicator colors to be generated over different analyte concentration ranges and improves the ability to visually discriminate between colors. The principle of our devices is based on the oxidation of indicators by hydrogen peroxide produced by oxidase enzymes specific for each analyte. Each indicator reacts at different peroxide concentrations, and therefore analyte concentrations, giving an extended range of operation. To demonstrate the utility of our approach, the mixture of 4-aminoantipyrine and 3,5-dichloro-2-hydroxy-benzenesulfonic acid, o-dianisidine dihydrochloride, potassium iodide, acid black, and acid yellow were chosen as the indicators for simultaneous semi-quantitative measurement of glucose, lactate, and uric acid on a microPAD. Our approach was successfully applied to quantify glucose (0.5-20 mM), lactate (1-25 mM), and uric acid (0.1-7 mM) in clinically relevant ranges. The determination of glucose, lactate, and uric acid in control serum and urine samples was also performed to demonstrate the applicability of this device for biological sample analysis. Finally, results for the multi-indicator and single-indicator systems were compared using untrained readers to demonstrate the improvements in accuracy achieved with the new system. 2010 Elsevier B.V. All rights reserved.

  16. Mechanics of additively manufactured porous biomaterials based on the rhombicuboctahedron unit cell.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-01-01

    Thanks to recent developments in additive manufacturing techniques, it is now possible to fabricate porous biomaterials with arbitrarily complex micro-architectures. Micro-architectures of such biomaterials determine their physical and biological properties, meaning that one could potentially improve the performance of such biomaterials through rational design of micro-architecture. The relationship between the micro-architecture of porous biomaterials and their physical and biological properties has therefore received increasing attention recently. In this paper, we studied the mechanical properties of porous biomaterials made from a relatively unexplored unit cell, namely the rhombicuboctahedron. We derived analytical relationships that relate the micro-architecture of such porous biomaterials, i.e. the dimensions of the rhombicuboctahedron unit cell, to their elastic modulus, Poisson's ratio, and yield stress. Finite element models were also developed to validate the analytical solutions. Analytical and numerical results were compared with experimental data from one of our recent studies. It was found that analytical solutions and numerical results show very good agreement, particularly for smaller values of apparent density. The elastic moduli predicted by analytical and numerical models were also in very good agreement with experimental observations. While in excellent agreement with each other, analytical and numerical models somewhat over-predicted the yield stress of the porous structures as compared to experimental data. As the ratio of the vertical struts to the inclined struts, α, approaches zero or infinity, the rhombicuboctahedron unit cell approaches the octahedron (or truncated cube) and cube unit cells, respectively. For those limits, the analytical solutions presented here were found to approach the analytic solutions obtained for the octahedron, truncated cube, and cube unit cells, meaning that the presented solutions are generalizations of the analytical solutions obtained for several other types of porous biomaterials. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges: a class of complex molecules with the potential for multiple isomers and variable charge states, and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  18. A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.

    PubMed

    Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie

    2017-11-01

    The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Bridging analytical approaches for low-carbon transitions

    NASA Astrophysics Data System (ADS)

    Geels, Frank W.; Berkhout, Frans; van Vuuren, Detlef P.

    2016-06-01

    Low-carbon transitions are long-term multi-faceted processes. Although integrated assessment models have many strengths for analysing such transitions, their mathematical representation requires a simplification of the causes, dynamics and scope of such societal transformations. We suggest that integrated assessment model-based analysis should be complemented with insights from socio-technical transition analysis and practice-based action research. We discuss the underlying assumptions, strengths and weaknesses of these three analytical approaches. We argue that full integration of these approaches is not feasible, because of foundational differences in philosophies of science and ontological assumptions. Instead, we suggest that bridging, based on sequential and interactive articulation of different approaches, may generate a more comprehensive and useful chain of assessments to support policy formation and action. We also show how these approaches address knowledge needs of different policymakers (international, national and local), relate to different dimensions of policy processes and speak to different policy-relevant criteria such as cost-effectiveness, socio-political feasibility, social acceptance and legitimacy, and flexibility. A more differentiated set of analytical approaches thus enables a more differentiated approach to climate policy making.

  20. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  1. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that deriving target analyte concentrations directly from the ratio of the target analyte peak area to the peak area of a corresponding stable-isotope-labelled internal standard [direct isotope dilution analysis (DIDA)] is not inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed by LC-MS/MS in a comparative validation protocol, with cortisol as an exemplary analyte. Accuracy and reproducibility were compared between quantification based on a six-point external calibration function and quantification based merely on the peak area ratio of unlabelled and labelled analyte. Both approaches yielded similar accuracy and reproducibility. For suitable analytes, reliable quantification can therefore be derived directly from the ratio of the peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series. This DIDA approach is of considerable practical importance for LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
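
    The quantification principle being validated reduces to a single ratio; the expression below is a generic isotope-dilution sketch rather than the authors' exact formula.

    ```latex
    C_{\text{analyte}} \;=\; \frac{A_{\text{analyte}}}{A_{\text{IS}}}\;\cdot\; C_{\text{IS}}\;\cdot\;\frac{1}{R},
    ```

    where A denotes the LC-MS/MS peak areas, C_IS is the known concentration of the stable-isotope-labelled internal standard spiked into the sample, and R is a response-factor correction (ideally close to 1 for a co-eluting labelled analogue of the analyte).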

  2. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  3. A Bottom-Up Approach to Understanding Protein Layer Formation at Solid-Liquid Interfaces

    PubMed Central

    Kastantin, Mark; Langdon, Blake B.; Schwartz, Daniel K.

    2014-01-01

    A common goal across different fields (e.g. separations, biosensors, biomaterials, pharmaceuticals) is to understand how protein behavior at solid-liquid interfaces is affected by environmental conditions. Temperature, pH, ionic strength, and the chemical and physical properties of the solid surface, among many factors, can control microscopic protein dynamics (e.g. adsorption, desorption, diffusion, aggregation) that contribute to macroscopic properties like time-dependent total protein surface coverage and protein structure. These relationships are typically studied through a top-down approach in which macroscopic observations are explained using analytical models that are based upon reasonable, but not universally true, simplifying assumptions about microscopic protein dynamics. Conclusions connecting microscopic dynamics to environmental factors can be heavily biased by potentially incorrect assumptions. In contrast, more complicated models avoid several of the common assumptions but require many parameters that have overlapping effects on predictions of macroscopic, average protein properties. Consequently, these models are poorly suited for the top-down approach. Because the sophistication incorporated into these models may ultimately prove essential to understanding interfacial protein behavior, this article proposes a bottom-up approach in which direct observations of microscopic protein dynamics specify parameters in complicated models, which then generate macroscopic predictions to compare with experiment. In this framework, single-molecule tracking has proven capable of making direct measurements of microscopic protein dynamics, but must be complemented by modeling to combine and extrapolate many independent microscopic observations to the macro-scale. The bottom-up approach is expected to better connect environmental factors to macroscopic protein behavior, thereby guiding rational choices that promote desirable protein behaviors. PMID:24484895

  4. Microchip integrating magnetic nanoparticles for allergy diagnosis.

    PubMed

    Teste, Bruno; Malloggi, Florent; Siaugue, Jean-Michel; Varenne, Anne; Kanoufi, Frederic; Descroix, Stéphanie

    2011-12-21

    We report on the development of a simple, easy-to-use microchip dedicated to allergy diagnosis. The microchip combines the advantages of homogeneous immunoassays (species diffusion) and heterogeneous immunoassays (easy separation and preconcentration steps). In vitro allergy diagnosis is based on the quantitation of specific immunoglobulin E (IgE); to this end, we developed and integrated magnetic core-shell nanoparticles (MCSNPs) as an IgE capture nanoplatform in a microdevice, taking advantage of both their magnetic and colloidal properties. Integrating such an immunosupport allows the target analyte (IgE) to be captured in the colloidal phase, which accelerates the capture kinetics because both immunological partners diffuse during the immune reaction. This colloidal approach improves the analyte capture kinetics 1000-fold compared with conventional methods. Moreover, owing to the MCSNPs' magnetic properties and the magnetic chamber we previously developed, the MCSNPs, and therefore the target, can be confined and preconcentrated within the microdevice prior to the detection step. The MCSNP preconcentration factor achieved was about 35,000, which provides high sensitivity and avoids the need for catalytic amplification during detection. The developed microchip offers many advantages: the analytical procedure is fully integrated on-chip, analyses are performed in a short assay time (20 min), sample and reagent consumption is reduced to a few microlitres (5 μL), and a low limit of detection (about 1 ng mL(-1)) can be achieved.

  5. 3-MCPD in food other than soy sauce or hydrolysed vegetable protein (HVP).

    PubMed

    Baer, Ines; de la Calle, Beatriz; Taylor, Philip

    2010-01-01

    This review gives an overview of current knowledge about 3-monochloropropane-1,2-diol (3-MCPD) formation and detection. Although 3-MCPD is often mentioned with regard to soy sauce and acid-hydrolysed vegetable protein (HVP), and much research has been done in that area, the emphasis here is placed on other foods. This contaminant can be found in a great variety of foodstuffs and is difficult to avoid in our daily nutrition. Despite its low concentration in most foods, its carcinogenic properties are of general concern. Its formation is a multivariate problem influenced by factors such as heat, moisture and sugar/lipid content, depending on the type of food and respective processing employed. Understanding the formation of this contaminant in food is fundamental to not only preventing or reducing it, but also developing efficient analytical methods of detecting it. Considering the differences between 3-MCPD-containing foods, and the need to test for the contaminant at different levels of food processing, one would expect a variety of analytical approaches. In this review, an attempt is made to provide an up-to-date list of available analytical methods and to highlight the differences among these techniques. Finally, the emergence of 3-MCPD esters and analytical techniques for them are also discussed here, although they are not the main focus of this review.

  6. A novel literature-based approach to identify genetic and molecular predictors of survival in glioblastoma multiforme: Analysis of 14,678 patients using systematic review and meta-analytical tools.

    PubMed

    Thuy, Matthew N T; Kam, Jeremy K T; Lee, Geoffrey C Y; Tao, Peter L; Ling, Dorothy Q; Cheng, Melissa; Goh, Su Kah; Papachristos, Alexander J; Shukla, Lipi; Wall, Krystal-Leigh; Smoll, Nicolas R; Jones, Jordan J; Gikenye, Njeri; Soh, Bob; Moffat, Brad; Johnson, Nick; Drummond, Katharine J

    2015-05-01

    Glioblastoma multiforme (GBM) has a poor prognosis despite maximal multimodal therapy. Biomarkers of relevance to prognosis, which may also identify treatment targets, are needed. A few hundred genetic and molecular predictors have been implicated in the literature; however, with the exception of IDH1 and O6-MGMT, there is uncertainty regarding their true prognostic relevance. This study analyses reported genetic and molecular predictors of prognosis in GBM. For each, its relationship with univariate overall survival in adults with GBM is described. A systematic search of MEDLINE (1998-July 2010) was performed. Eligible papers studied the effect of any genetic or molecular marker on univariate overall survival in adult patients with histologically diagnosed GBM. Primary outcomes were median survival difference in months and univariate hazard ratios. Analyses included converting 126 Kaplan-Meier curves and 27 raw data sets into primary outcomes. Seventy-four random effects meta-analyses were performed on 39 unique genetic or molecular factors. Objective criteria were designed to classify factors into the categories of clearly prognostic, weakly prognostic, non-prognostic and promising. Included were 304 publications and 174 studies involving 14,678 unique patients from 33 countries. We identified 422 reported genetic and molecular predictors, of which 52 had ≥2 studies. IDH1 mutation and O6-MGMT were classified as clearly prognostic, validating the methodology. High Ki-67/MIB-1 and loss of heterozygosity of chromosome 10/10q were classified as weakly prognostic. Four factors were classified as non-prognostic and 13 factors were classified as promising and worthy of additional investigation. Funnel plot analysis did not identify any evidence of publication bias. This study demonstrates a novel literature-based and meta-analytical approach to maximise the value that can be derived from the plethora of literature reports of molecular and genetic factors in GBM. Caution is advised in over-interpreting the results due to study limitations. Further research to develop this methodology and improvements in study reporting are suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Sinkhole susceptibility mapping using the analytical hierarchy process (AHP) and magnitude-frequency relationships: A case study in Hamadan province, Iran

    NASA Astrophysics Data System (ADS)

    Taheri, Kamal; Gutiérrez, Francisco; Mohseni, Hassan; Raeisi, Ezzat; Taheri, Milad

    2015-04-01

    Since 1989, an increasing number of sinkhole occurrences have been reported in the Kabudar Ahang and Razan-Qahavand subcatchments (KRQ) of Hamadan province, western Iran. The sinkhole-related subsidence phenomenon poses a significant threat to people and human structures, including sensitive facilities like the Hamadan Power Plant. Groundwater over-exploitation from the thick alluvial cover and the underlying cavernous limestone has been identified as the main factor involved in sinkhole development. A sinkhole susceptibility model was produced in a GIS environment applying the analytical hierarchy process (AHP) approach and considering a selection of eight factors, each categorized into five classes: distance to faults (DF), water level decline (WLD), groundwater exploitation (GE), penetration of deep wells into karst bedrock (PKA), distance to deep wells (DDW), groundwater alkalinity (GA), bedrock lithology (BL), and alluvium thickness (AT). Relative weights were preliminarily assigned to each factor and to their different classes through systematic pairwise comparisons based on expert judgment. The resulting sinkhole susceptibility index (SSI) values were then classified into four susceptibility classes: low, moderate, high and very high susceptibility. Subsequently, the model was refined through a trial-and-error process involving changes in the relative weights and iterative evaluation of the prediction capability. Independent evaluation of the final model indicates that 55% and 45% of the subsidence events fall within the very high and high susceptibility zones, respectively. The results of this study show that AHP can be a useful approach for susceptibility assessment if data on the main controlling factors have sufficient accuracy and spatial coverage. The limitations of the model are partly related to the difficulty of gathering data on some important geological factors, due to their hidden nature. The magnitude and frequency relationship constructed from the 41 sinkholes with chronological and morphometric data indicates maximum recurrence intervals of 1.17, 2.14 and 4.18 years for sinkholes with major axial lengths equal to or greater than 10 m, 20 m, and 30 m, respectively.
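
    The AHP weighting step described above can be sketched as follows; the 3 x 3 pairwise-comparison matrix is a hypothetical placeholder for the study's eight factors, and the judgments shown are illustrative only.

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix on Saaty's 1-9 scale
    # (the study compared eight factors such as water level decline and alluvium thickness)
    A = np.array([
        [1.0,   3.0,   5.0],
        [1/3.0, 1.0,   2.0],
        [1/5.0, 1/2.0, 1.0],
    ])

    # Priority weights = principal right eigenvector, normalized to sum to one
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()

    # Saaty consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]  # random index
    cr = ci / ri

    print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
    # A susceptibility index is then a weighted sum of the (classified) factor scores in GIS.
    ```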

  8. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    PubMed

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of applying holistic and analytic rubrics, respectively, as well as analytic rubrics in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods (holistic rubric, analytic rubric, and task-specific checklist) for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by the Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examinations evaluation, and history taking and physical examination to be major factors in clinical performance examinations evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient evaluation.

  9. Effects of Vibrations on Metal Forming Process: Analytical Approach and Finite Element Simulations

    NASA Astrophysics Data System (ADS)

    Armaghan, Khan; Christophe, Giraud-Audine; Gabriel, Abba; Régis, Bigot

    2011-01-01

    Vibration-assisted forming is one of the most recent and beneficial techniques used to improve forming processes. The effects of vibration on metal forming processes can be attributed to two causes. First, the volume effect links the lowering of the yield stress with the influence of vibration on dislocation movement. Second, the surface effect explains the lowering of the effective coefficient of friction by a periodic reduction of the contact area. This work addresses vibration-assisted forming in the viscoplastic domain and analyzes the impact of a change in the vibration waveform. For this purpose, two analytical models were developed for two different vibration waveforms (sinusoidal and triangular). Both models are based on the slice method, which is used to determine the required forming force for the process. The final relationships show that applying a triangular waveform in the forming process is more beneficial than sinusoidal vibration in terms of reducing the forming force. Finite Element Method (FEM) based simulations performed with Forge2008® confirmed the results of the analytical models. The ratio of vibration speed to upper die speed is a critical factor in the reduction of the forming force.

  10. Electrical wave propagation in an anisotropic model of the left ventricle based on analytical description of cardiac architecture.

    PubMed

    Pravdin, Sergey F; Dierckx, Hans; Katsnelson, Leonid B; Solovyova, Olga; Markhasin, Vladimir S; Panfilov, Alexander V

    2014-01-01

    We develop a numerical approach based on our recent analytical model of fiber structure in the left ventricle of the human heart. A special curvilinear coordinate system is proposed to analytically include realistic ventricular shape and myofiber directions. With this anatomical model, electrophysiological simulations can be performed on a rectangular coordinate grid. We apply our method to study the effect of fiber rotation and electrical anisotropy of cardiac tissue (i.e., the ratio of the conductivity coefficients along and across the myocardial fibers) on wave propagation using the ten Tusscher-Panfilov (2006) ionic model for human ventricular cells. We show that fiber rotation increases the speed of cardiac activation and attenuates the effects of anisotropy. Our results show that the fiber rotation in the heart is an important factor underlying cardiac excitation. We also study scroll wave dynamics in our model and show the drift of a scroll wave filament whose velocity depends non-monotonically on the fiber rotation angle; the period of scroll wave rotation decreases with an increase of the fiber rotation angle; an increase in anisotropy may cause the breakup of a scroll wave, similar to the mother rotor mechanism of ventricular fibrillation.

  11. ACCELERATING MR PARAMETER MAPPING USING SPARSITY-PROMOTING REGULARIZATION IN PARAMETRIC DIMENSION

    PubMed Central

    Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey

    2013-01-01

    MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes smoothness of signal evolution in the parametric dimension within a compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches: image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not well suited to model-based reconstruction because of complex signal models or when signal deviations from the expected analytical model exist. PMID:23213053
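
    A hedged sketch of the kind of objective such a parametric-dimension regularization implies (the symbols are generic; the authors' exact encoding operator and sparsifying transform are not reproduced here):

    ```latex
    \hat{x} \;=\; \arg\min_{x}\; \big\lVert E\,x - y \big\rVert_2^2 \;+\; \lambda\,\big\lVert \Phi_p\, x \big\rVert_1,
    ```

    where E is the undersampled parallel-imaging encoding operator, y the acquired k-space data, and Φ_p a transform that sparsifies the smooth signal evolution along the parametric (e.g., flip-angle) dimension.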

  12. Multiple Interactive Pollutants in Water Quality Trading

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Lence, Barbara J.; Shamsai, Abolfazl

    2008-10-01

    Efficient environmental management calls for the consideration of multiple pollutants, for which two main types of transferable discharge permit (TDP) program have been described: separate permits that manage each pollutant individually in separate markets, with each permit based on the quantity of the pollutant or its environmental effects, and weighted-sum permits that aggregate several pollutants as a single commodity to be traded in a single market. In this paper, we perform a mathematical analysis of TDP programs for multiple pollutants that jointly affect the environment (i.e., interactive pollutants) and demonstrate the practicality of this approach for cost-efficient maintenance of river water quality. For interactive pollutants, the relative weighting factors are functions of the water quality impacts, marginal damage function, and marginal treatment costs at optimality. We derive the optimal set of weighting factors required by this approach for important scenarios for multiple interactive pollutants and propose using an analytical elasticity of substitution function to estimate damage functions for these scenarios. We evaluate the applicability of this approach using a hypothetical example that considers two interactive pollutants. We compare the weighted-sum permit approach for interactive pollutants with individual permit systems and TDP programs for multiple additive pollutants. We conclude by discussing practical considerations and implementation issues that result from the application of weighted-sum permit programs.

  13. Reduction of Simulation Times for High-Q Structures using the Resonance Equation

    DOE PAGES

    Hall, Thomas Wesley; Bandaru, Prabhakar R.; Rees, Daniel Earl

    2015-11-17

    Simulating the steady state performance of high quality factor (Q) resonant RF structures is computationally difficult for structures more than a few wavelengths in size because of the long times (on the order of ~0.1 ms) required to achieve steady state compared with the maximum time step that can be used in the simulation (typically on the order of ~1 ps). This paper presents analytical and computational approaches that can be used to accelerate the simulation of the steady state performance of such structures. The basis of the proposed approach is the use of a larger amplitude signal at the beginning to achieve steady state earlier relative to the nominal input signal. The methodology for finding the necessary input signal is discussed in detail, and the validity of the approach is evaluated.

  14. Numerical solution of the full potential equation using a chimera grid approach

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1995-01-01

    A numerical scheme utilizing a chimera zonal grid approach for solving the full potential equation in two spatial dimensions is described. Within each grid zone a fully-implicit approximate factorization scheme is used to advance the solution one iteration. This is followed by the explicit advance of all common zonal grid boundaries using a bilinear interpolation of the velocity potential. The presentation is highlighted with numerical results simulating the flow about a two-dimensional, nonlifting, circular cylinder. For this problem, the flow domain is divided into two parts: an inner portion covered by a polar grid and an outer portion covered by a Cartesian grid. Both incompressible and compressible (transonic) flow solutions are included. Comparisons made with an analytic solution as well as single grid results indicate that the chimera zonal grid approach is a viable technique for solving the full potential equation.

  15. CrowdHEALTH: Holistic Health Records and Big Data Analytics for Health Policy Making and Personalized Health.

    PubMed

    Kyriazis, Dimosthenis; Autexier, Serge; Brondino, Iván; Boniface, Michael; Donat, Lucas; Engen, Vegard; Fernandez, Rafael; Jimenez-Peris, Ricardo; Jordan, Blanca; Jurak, Gregor; Kiourtis, Athanasios; Kosmidis, Thanos; Lustrek, Mitja; Maglogiannis, Ilias; Mantas, John; Martinez, Antonio; Mavrogiorgou, Argyro; Menychtas, Andreas; Montandon, Lydia; Nechifor, Cosmin-Septimiu; Nifakos, Sokratis; Papageorgiou, Alexandra; Patino-Martinez, Marta; Perez, Manuel; Plagianakos, Vassilis; Stanimirovic, Dalibor; Starc, Gregor; Tomson, Tanja; Torelli, Francesco; Traver-Salcedo, Vicente; Vassilacopoulos, George; Wajid, Usman

    2017-01-01

    Today's rich digital information environment is characterized by the multitude of data sources providing information that has not yet reached its full potential in eHealth. The aim of the presented approach, namely CrowdHEALTH, is to introduce a new paradigm of Holistic Health Records (HHRs) that include all health determinants. HHRs are transformed into HHR clusters that capture the clinical, social and human context of population segments and, as a result, collective knowledge about different factors. The proposed approach also seamlessly integrates big data technologies across the complete data path, providing Data as a Service (DaaS) to health ecosystem stakeholders as well as to policy makers, towards a "health in all policies" approach. Cross-domain co-creation of policies is made feasible through a rich toolkit provided on top of the DaaS, incorporating mechanisms for causal and risk analysis and for the compilation of predictions.

  16. Hyperspectral image reconstruction for x-ray fluorescence tomography

    DOE PAGES

    Gürsoy, Doǧa; Biçer, Tekin; Lanzirotti, Antonio; ...

    2015-01-01

    A penalized maximum-likelihood estimation is proposed to perform hyperspectral (spatio-spectral) image reconstruction for X-ray fluorescence tomography. The approach minimizes a Poisson-based negative log-likelihood of the observed photon counts, and uses a penalty term that has the effect of encouraging local continuity of model parameter estimates in both spatial and spectral dimensions simultaneously. The performance of the reconstruction method is demonstrated with experimental data acquired from a seed of Arabidopsis thaliana collected at the 13-ID-E microprobe beamline at the Advanced Photon Source. The resulting element distribution estimates with the proposed approach show significantly better reconstruction quality than conventional analytical inversion approaches, and allow for a high data compression factor which can reduce data acquisition times remarkably. In particular, this technique provides the capability to tomographically reconstruct full energy dispersive spectra without being compromised by the reconstruction artifacts that impact the interpretation of results.
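
    A generic form of the penalized Poisson maximum-likelihood objective described above is shown below as a sketch; the specific forward model and penalty used for the beamline data are not reproduced here.

    ```latex
    \hat{\theta} \;=\; \arg\min_{\theta}\; \sum_{i}\Big[\,\bar{y}_i(\theta) - y_i \log \bar{y}_i(\theta)\,\Big]
    \;+\; \beta \sum_{j \sim k} \phi\!\left(\theta_j - \theta_k\right),
    ```

    where y_i are the measured photon counts, \bar{y}_i(θ) the expected counts under the forward model, and the penalty couples neighbouring parameters j ~ k in both the spatial and spectral dimensions simultaneously.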

  17. FACTOR ANALYTIC MODELS OF CLUSTERED MULTIVARIATE DATA WITH INFORMATIVE CENSORING

    EPA Science Inventory

    This paper describes a general class of factor analytic models for the analysis of clustered multivariate data in the presence of informative missingness. We assume that there are distinct sets of cluster-level latent variables related to the primary outcomes and to the censorin...

  18. Examining the Relations between Executive Function, Math, and Literacy during the Transition to Kindergarten: A Multi-Analytic Approach

    ERIC Educational Resources Information Center

    Schmitt, Sara A.; Geldhof, G. John; Purpura, David J.; Duncan, Robert; McClelland, Megan M.

    2017-01-01

    The present study explored the bidirectional and longitudinal associations between executive function (EF) and early academic skills (math and literacy) across 4 waves of measurement during the transition from preschool to kindergarten using 2 complementary analytical approaches: cross-lagged panel modeling and latent growth curve modeling (LGCM).…

  19. Communication between Participants and Non-Participants in Analytical Capacity Building Projects: Management Advice to Family Farms in Benin

    ERIC Educational Resources Information Center

    Rouchouse, Marine; Faysse, Nicolas; De Romemont, Aurelle; Moumouni, Ismail; Faure, Guy

    2015-01-01

    Purpose: Approaches to build farmers' analytical capacities are said to trigger wide-ranging changes. This article reports on the communication process between participants and non-participants in one such approach, related to the technical and management skills learned by participants and the changes these participants subsequently made, and the…

  20. Analysis of a Nonlinear Oscillator with CAS through Analytical, Numerical, and Qualitative Approaches: A Prototype for Teaching

    ERIC Educational Resources Information Center

    López-García, Jeanett; Jiménez Zamudio, Jorge Javier

    2017-01-01

    It is very common to find in contemporary literature of Differential Equations, the need to incorporate holistically in teaching and learning the three different approaches: analytical, qualitative, and numerical, for continuous dynamical systems. However, nowadays, in some Bachelor of Science that includes only one course in differential…

  1. Observability during planetary approach navigation

    NASA Technical Reports Server (NTRS)

    Bishop, Robert H.; Burkhart, P. Daniel; Thurman, Sam W.

    1993-01-01

    The objective of the research is to develop an analytic technique to predict the relative navigation capability of different Earth-based radio navigation measurements. In particular, the problem is to determine the relative ability of geocentric range and Doppler measurements to detect the effects of the target planet gravitational attraction on the spacecraft during the planetary approach and near-encounter mission phases. A complete solution to the two-dimensional problem has been developed. Relatively simple analytic formulas are obtained for range and Doppler measurements which describe the observability content of the measurement data along the approach trajectories. An observability measure is defined which is based on the observability matrix for nonlinear systems. The results show good agreement between the analytic observability analysis and the computational batch processing method.

  2. Analytic thinking promotes religious disbelief.

    PubMed

    Gervais, Will M; Norenzayan, Ara

    2012-04-27

    Scientific interest in the cognitive underpinnings of religious belief has grown in recent years. However, to date, little experimental research has focused on the cognitive processes that may promote religious disbelief. The present studies apply a dual-process model of cognitive processing to this problem, testing the hypothesis that analytic processing promotes religious disbelief. Individual differences in the tendency to analytically override initially flawed intuitions in reasoning were associated with increased religious disbelief. Four additional experiments provided evidence of causation, as subtle manipulations known to trigger analytic processing also encouraged religious disbelief. Combined, these studies indicate that analytic processing is one factor (presumably among several) that promotes religious disbelief. Although these findings do not speak directly to conversations about the inherent rationality, value, or truth of religious beliefs, they illuminate one cognitive factor that may influence such discussions.

  3. Risk Factors Predicting Infectious Lactational Mastitis: Decision Tree Approach versus Logistic Regression Analysis.

    PubMed

    Fernández, Leónides; Mediano, Pilar; García, Ricardo; Rodríguez, Juan M; Marín, María

    2016-09-01

    Objectives Lactational mastitis frequently leads to a premature abandonment of breastfeeding; its development has been associated with several risk factors. This study aims to use a decision tree (DT) approach to establish the main risk factors involved in mastitis and to compare its performance for predicting this condition with a stepwise logistic regression (LR) model. Methods Data from 368 cases (breastfeeding women with mastitis) and 148 controls were collected by a questionnaire about risk factors related to medical history of mother and infant, pregnancy, delivery, postpartum, and breastfeeding practices. The performance of the DT and LR analyses was compared using the area under the receiver operating characteristic (ROC) curve. Sensitivity, specificity and accuracy of both models were calculated. Results Cracked nipples, antibiotics and antifungal drugs during breastfeeding, infant age, breast pumps, familial history of mastitis and throat infection were significant risk factors associated with mastitis in both analyses. Bottle-feeding and milk supply were related to mastitis for certain subgroups in the DT model. The areas under the ROC curves were similar for LR and DT models (0.870 and 0.835, respectively). The LR model had better classification accuracy and sensitivity than the DT model, but the last one presented better specificity at the optimal threshold of each curve. Conclusions The DT and LR models constitute useful and complementary analytical tools to assess the risk of lactational infectious mastitis. The DT approach identifies high-risk subpopulations that need specific mastitis prevention programs and, therefore, it could be used to make the most of public health resources.
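
    A minimal sketch of this kind of head-to-head comparison using scikit-learn follows; the feature matrix of binary risk factors and the class balance are hypothetical stand-ins for the study's questionnaire data.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score

    # Hypothetical data: 516 women (368 cases, 148 controls), 10 binary risk factors
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(516, 10))
    y = np.concatenate([np.ones(368), np.zeros(148)])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    dt = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

    # Compare discrimination via the area under the ROC curve, as in the study
    auc_lr = roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1])
    auc_dt = roc_auc_score(y_te, dt.predict_proba(X_te)[:, 1])
    print(f"logistic regression AUC = {auc_lr:.2f}, decision tree AUC = {auc_dt:.2f}")
    ```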

  4. A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.

    PubMed

    Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I

    2017-09-01

    Prescription drug expenditures represent a significant component of health care costs in Canada, with estimates of $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. The objectives were to identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each individual driver. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time, including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
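
    A minimal two-factor Laspeyres decomposition, with hypothetical prices and quantities for three drugs, illustrates the principle of holding the other factor at its base-period value; the paper's multi-factorial framework layers drug-mix sub-effects on top of this basic scheme.

    ```python
    import numpy as np

    # Hypothetical base-period (0) and current-period (t) prices and quantities for three drugs
    p0 = np.array([10.0, 25.0, 5.0]);  q0 = np.array([1000, 400, 2500])
    pt = np.array([12.0, 24.0, 6.0]);  qt = np.array([1100, 500, 2400])

    total_change = (pt * qt).sum() - (p0 * q0).sum()

    # Laspeyres-style effects: change one factor, hold the other at its base-period value
    price_effect  = ((pt - p0) * q0).sum()   # price change valued at base-period quantities
    volume_effect = ((qt - q0) * p0).sum()   # quantity change valued at base-period prices
    residual      = total_change - price_effect - volume_effect  # cross (interaction) term

    print(price_effect, volume_effect, residual, total_change)
    ```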

  5. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  6. Features Students Really Expect from Learning Analytics

    ERIC Educational Resources Information Center

    Schumacher, Clara; Ifenthaler, Dirk

    2016-01-01

    In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…

  7. Accurate mass measurements and their appropriate use for reliable analyte identification.

    PubMed

    Godfrey, A Ruth; Brenton, A Gareth

    2012-09-01

    Accurate mass instrumentation is becoming increasingly available to non-expert users, and these data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification has been described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.

  8. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stays. The economic burden associated with sepsis is substantial. With advances in medicine, there are now aggressive, goal-oriented treatments that can help these patients. If we were able to predict which patients are at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
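
    A hypothetical sketch of the multi-modal idea (not the authors' analysis): fit a regression, a decision tree and a clustering to the same synthetic data and compare which variables each method highlights.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.preprocessing import StandardScaler
      from sklearn.tree import DecisionTreeClassifier

      X, y = make_classification(n_samples=400, n_features=8, n_informative=4, random_state=1)
      Xs = StandardScaler().fit_transform(X)

      lr = LogisticRegression(max_iter=1000).fit(Xs, y)
      dt = DecisionTreeClassifier(max_depth=3, random_state=1).fit(Xs, y)
      km = KMeans(n_clusters=2, n_init=10, random_state=1).fit(Xs)

      def top3(scores):
          return np.argsort(np.abs(scores))[::-1][:3]

      print("regression:", top3(lr.coef_[0]))                                      # largest |coefficients|
      print("tree:      ", top3(dt.feature_importances_))                          # largest importances
      print("clustering:", top3(km.cluster_centers_[0] - km.cluster_centers_[1]))  # largest centroid separation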

  9. A hybrid approach to urine drug testing using high-resolution mass spectrometry and select immunoassays.

    PubMed

    McMillin, Gwendolyn A; Marin, Stephanie J; Johnson-Davis, Kamisha L; Lawlor, Bryan G; Strathmann, Frederick G

    2015-02-01

    The major objective of this research was to propose a simplified approach for evaluating medication adherence in chronic pain management patients, using liquid chromatography time-of-flight (TOF) mass spectrometry performed in parallel with select homogeneous enzyme immunoassays (HEIAs); we called it a "hybrid" approach to urine drug testing. The hybrid approach was defined based on anticipated positivity rates, availability of commercial reagents for HEIAs, and assay performance, particularly analytical sensitivity and specificity for the drug(s) of interest. Subsequent to implementation of the hybrid approach, time to result was compared with that observed with other urine drug testing approaches. Opioids, benzodiazepines, zolpidem, amphetamine-like stimulants, and methylphenidate metabolite were detected by TOF mass spectrometry to maximize specificity and sensitivity for these 37 drug analytes. Barbiturates, cannabinoid metabolite, carisoprodol, cocaine metabolite, ethyl glucuronide, methadone, phencyclidine, propoxyphene, and tramadol were detected by HEIAs that performed adequately and/or for which positivity rates were very low. Time to result was significantly reduced compared with the traditional approach. The hybrid approach to urine drug testing provides a simplified and analytically specific testing process that minimizes the need for secondary confirmation. Copyright© by the American Society for Clinical Pathology.

  10. Does size matter? Study of performance of pseudo-ELISAs based on molecularly imprinted polymer nanoparticles prepared for analytes of different sizes.

    PubMed

    Cáceres, C; Canfarotta, F; Chianella, I; Pereira, E; Moczko, E; Esen, C; Guerreiro, A; Piletska, E; Whitcombe, M J; Piletsky, S A

    2016-02-21

    The aim of this work is to evaluate whether the size of the analyte used as template for the synthesis of molecularly imprinted polymer nanoparticles (nanoMIPs) can affect their performance in pseudo-enzyme linked immunosorbent assays (pseudo-ELISAs). A nanoMIP-based pseudo-ELISA for vancomycin (1449.3 g mol(-1)) was successfully demonstrated earlier. In the present investigation, the following analytes were selected: horseradish peroxidase (HRP, 44 kDa), cytochrome C (Cyt C, 12 kDa), biotin (244.31 g mol(-1)) and melamine (126.12 g mol(-1)). NanoMIPs with a similar composition for all analytes were synthesised by persulfate-initiated polymerisation in water. In addition, core-shell nanoMIPs coated with polyethylene glycol (PEG) and imprinted for melamine were produced in organic media and tested. The polymerisation of the nanoparticles was carried out using a solid-phase approach with the corresponding template immobilised on glass beads. The performance of the nanoMIPs used as replacements for antibodies in direct pseudo-ELISA (for the enzymes) and in competitive pseudo-ELISA (for the smaller analytes) was investigated. The competitive mode relies on competition for binding to the nanoparticles between the free analyte and the corresponding analyte-HRP conjugate. The results revealed that the best performance was obtained for nanoMIPs synthesised in aqueous media for the larger analytes. This approach was also successful for biotin but failed completely for the smallest template, melamine. This problem was solved using nanoMIPs prepared by UV polymerisation in organic media with a PEG shell. This study demonstrates that the preparation of nanoMIPs by the solid-phase approach can produce materials with high affinity and the potential to replace antibodies in ELISA tests for both large and small analytes. This makes the technology versatile and applicable to practically any target analyte and diagnostic field.

  11. I know what you did last summer (and it was not CBT): a factor analytic model of international psychotherapeutic practice in the eating disorders.

    PubMed

    Tobin, David L; Banker, Judith D; Weisberg, Laura; Bowers, Wayne

    2007-12-01

    Although several studies have shown that eating disorders clinicians do not generally use treatment manuals, findings regarding what they do use have typically been vague, or closely linked to a particular theoretical approach. Our goal was to identify what eating disorder clinicians do with their patients in a more theoretically neutral context. We also sought to describe an empirically defined approach to psychotherapeutic practice as defined by clinicians via factor analysis. A survey developed for this study was administered to 265 clinicians recruited online and at regional and international meetings for eating disorders professionals. Only 6% of respondents reported they adhered closely to treatment manuals and 98% of the respondents indicated they used both behavioral and dynamically informed interventions. Factor analysis of clinicians' use of 32 therapeutic strategies suggested seven dimensions: Psychodynamic Interventions, Coping Skills Training, Family History, CBT, Contracts, Therapist Disclosure, and Patient Feelings. The findings of this study suggest that most clinicians use a wide array of eating disorder treatment interventions drawn from empirically supported treatments, such as CBT-BN, and from treatments that have no randomized controlled trial support. Factor analysis suggested theoretically linked dimensions of treatment, but also dimensions that are common across models. (c) 2007 by Wiley Periodicals, Inc.
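
    As a rough illustration of the factor-analytic step, the sketch below extracts seven dimensions from a random stand-in for the 32-item clinician survey using scikit-learn's FactorAnalysis; the ratings, and therefore the factors, are not those of the study.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      ratings = rng.integers(1, 6, size=(265, 32)).astype(float)   # 265 clinicians x 32 strategies (fake)

      fa = FactorAnalysis(n_components=7, random_state=0).fit(ratings)
      loadings = fa.components_.T                                  # 32 items x 7 factors
      for item in range(5):                                        # strongest-loading factor for a few items
          print(f"item {item}: factor {np.argmax(np.abs(loadings[item]))}")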

  12. Thermodynamics of Newman-Unti-Tamburino charged spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, Robert; Department of Physics, University of Waterloo, 200 University Avenue West, Waterloo, Ontario, N2L 3G1; Stelea, Cristian

    We discuss and compare at length the results of two methods used recently to describe the thermodynamics of Taub-Newman-Unti-Tamburino (NUT) solutions in a de Sitter background. In the first approach (C approach), one deals with an analytically continued version of the metric while in the second approach (R approach), the discussion is carried out using the unmodified metric with Lorentzian signature. No analytic continuation is performed on the coordinates and/or the parameters that appear in the metric. We find that the results of both these approaches are completely equivalent modulo analytic continuation and we provide the exact prescription that relates the results in both methods. The extension of these results to the AdS/flat cases aims to give a physical interpretation of the thermodynamics of NUT-charged spacetimes in the Lorentzian sector. We also briefly discuss the higher-dimensional spaces and note that, analogous with the absence of hyperbolic NUTs in AdS backgrounds, there are no spherical Taub-NUT-dS solutions.

  13. Proceedings of the Workshop on Change of Representation and Problem Reformulation

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.

    1992-01-01

    The proceedings of the third Workshop on Change of Representation and Problem Reformulation are presented. In contrast to the first two workshops, this workshop was focused on analytic or knowledge-based approaches, as opposed to statistical or empirical approaches called 'constructive induction'. The organizing committee believes that there is a potential for combining analytic and inductive approaches at a future date. However, it became apparent at the previous two workshops that the communities pursuing these different approaches are currently interested in largely non-overlapping issues. The constructive induction community has been holding its own workshops, principally in conjunction with the machine learning conference. While this workshop is more focused on analytic approaches, the organizing committee has made an effort to include more application domains. We have greatly expanded beyond our origins in the machine learning community. Participants in this workshop come from the full spectrum of AI application domains including planning, qualitative physics, software engineering, knowledge representation, and machine learning.

  14. Data analytics approach to create waste generation profiles for waste management and collection.

    PubMed

    Niska, Harri; Serkkola, Ari

    2018-04-30

    Extensive monitoring data on waste generation are increasingly collected in order to implement cost-efficient and sustainable waste management operations. In addition, geospatial data from different registries of society are being opened for free use. Novel data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as a basis for waste management and collection. In this paper, a data-driven approach based on the self-organizing map (SOM) and the k-means algorithm is developed for creating a set of waste generation type profiles. The approach is demonstrated using the extensive container-level waste weighing data collected in the metropolitan area of Helsinki, Finland. The results obtained highlight the potential of advanced data analytic approaches in producing more detailed waste generation information, e.g. as a basis for tailored feedback services for waste producers and for the planning and optimization of waste collection and recycling. Copyright © 2018 Elsevier Ltd. All rights reserved.
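
    A minimal sketch of the SOM-plus-k-means profiling idea, assuming the third-party minisom package is installed; the container-level weighing data are replaced by random numbers, so the resulting profiles are purely illustrative.

      import numpy as np
      from minisom import MiniSom
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      weighings = rng.gamma(shape=2.0, scale=30.0, size=(500, 6))   # fake per-container weighing features

      som = MiniSom(8, 8, weighings.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
      som.train_random(weighings, 2000)

      # Cluster the SOM codebook vectors into a handful of waste generation type profiles.
      codebook = som.get_weights().reshape(-1, weighings.shape[1])
      profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit(codebook)

      def profile_of(x):
          i, j = som.winner(x)                 # best-matching SOM unit for one container
          return profiles.labels_[i * 8 + j]

      print([profile_of(x) for x in weighings[:10]])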

  15. Dynamics of Biomarkers in Relation to Aging and Mortality

    PubMed Central

    Arbeev, Konstantin G.; Ukraintseva, Svetlana V.; Yashin, Anatoliy I.

    2016-01-01

    Contemporary longitudinal studies collect repeated measurements of biomarkers allowing one to analyze their dynamics in relation to mortality, morbidity, or other health-related outcomes. Rich and diverse data collected in such studies provide opportunities to investigate how various socioeconomic, demographic, behavioral and other variables can interact with biological and genetic factors to produce differential rates of aging in individuals. In this paper, we review some recent publications investigating dynamics of biomarkers in relation to mortality, which use single biomarkers as well as cumulative measures combining information from multiple biomarkers. We also discuss the analytical approach of stochastic process models, which conceptualize several aging-related mechanisms in the structure of the model and allow evaluating “hidden” characteristics of aging-related changes indirectly from available longitudinal data on biomarkers and follow-up on mortality or onset of diseases, taking into account other relevant factors (both genetic and non-genetic). We also discuss an extension of the approach, which considers ranges of “optimal values” of biomarkers rather than a single optimal value as in the original model. We discuss practical applications of the approach to single biomarkers and cumulative measures, highlighting that the potential of applications to cumulative measures is still largely underused. PMID:27138087

  16. Metabonomics of ageing - Towards understanding metabolism of a long and healthy life.

    PubMed

    Martin, Francois-Pierre J; Montoliu, Ivan; Kussmann, Martin

    2017-07-01

    Systems biology approaches have been increasingly employed in clinical studies to enhance our understanding of the role of genetics, environmental factors and their interactions on nutritional, health and disease status. Amongst the new omics technologies, metabonomics has emerged as a robust platform to capture metabolic and nutritional requirements by enabling, in a minimally invasive fashion, the monitoring of a wide range of biochemical compounds. Their variations reflect comprehensively the various molecular regulatory processes, which are tightly controlled and under the influence of genetics, diet, gut microbiota and other environmental factors. They are providing key insights into complex metabolic phenomena as well as into differences and specificities at individual and population level. The aim of this review is to evaluate promising metabolic insights towards understanding metabolism of a long and healthy life from pre-clinical and clinical metabonomics studies. We will also discuss analytical approaches to enable data integration, with an emphasis on the longitudinal component. Herein, we will illustrate current examples, challenges and perspectives in the applications of metabonomics monitoring and modelling approaches in the context of healthy ageing research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Photoprotective Strategies of Mediterranean Plants in Relation to Morphological Traits and Natural Environmental Pressure: A Meta-Analytical Approach

    PubMed Central

    Fernández-Marín, Beatriz; Hernández, Antonio; Garcia-Plazaola, Jose I.; Esteban, Raquel; Míguez, Fátima; Artetxe, Unai; Gómez-Sagasti, Maria T.

    2017-01-01

    Despite its small geographic extent, the Mediterranean Basin is characterized by exceptional plant biodiversity. The adaptive responses of this biocoenosis are delineated by an unusual temporal dissociation over the year between the optimal temperature for growth and water availability. This generates a combination of two environmental stress factors: a period of summer drought, variable in length and intensity, and the occurrence of mild to cold winters. Both abiotic factors trigger the generation of (photo)oxidative stress, and plants orchestrate an arsenal of structural, physiological, biochemical, and molecular mechanisms to withstand such environmental injuries. In the last two decades an important effort has been made to characterize the adaptive morphological and ecophysiological traits behind plant survival strategies, with an eye to predicting how they will respond to future climatic changes. In the present work, we have compiled data from 89 studies following a meta-analytical approach with the aim of assessing the composition and plasticity of photosynthetic pigments and low-molecular-weight antioxidants (tocopherols, glutathione, and ascorbic acid) of wild Mediterranean plant species. The influence of internal plant and leaf factors on such composition, together with the stress responsiveness, was also analyzed. This approach enabled us to obtain data from 73 species of the Mediterranean flora, with the genus Quercus being the most frequently studied. The main highlights of the present analysis are: (i) the types of photoprotective mechanisms do not differ between Mediterranean plants and other floras, but the former show higher plasticity indexes; (ii) α-tocopherol among the antioxidants and the violaxanthin-cycle pigments show the highest responsiveness to environmental factors; (iii) both winter and drought stresses induce overnight retention of de-epoxidised violaxanthin-cycle pigments; (iv) this retention correlates with depressions of Fv/Fm; and (v) contrary to what could be expected, mature leaves showed higher accumulation of hydrophilic antioxidants than young leaves, and sclerophyllous leaves higher biochemical photoprotective demand than membranous leaves. In a global climatic change scenario, the plasticity of their photoprotective mechanisms will likely benefit Mediterranean species relative to oceanic ones. Nevertheless, deeper research on Mediterranean-type ecoregions other than the Mediterranean Basin will be needed to fully understand the photoprotection strategies of this extremely biodiverse floristic biome: the Mediterranean ecosystem. PMID:28674548

  18. Technosocial Predictive Analytics in Support of Naturalistic Decision Making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Malone, Elizabeth L.

    2009-06-23

    A main challenge we face in fostering sustainable growth is to anticipate outcomes through predictive and proactive analysis across domains as diverse as energy, security, the environment, health and finance in order to maximize opportunities, influence outcomes and counter adversities. The goal of this paper is to present new methods for anticipatory analytical thinking which address this challenge through the development of a multi-perspective approach to predictive modeling as the core of a creative decision-making process. This approach is uniquely multidisciplinary in that it strives to create decision advantage through the integration of human and physical models, and it leverages knowledge management and visual analytics to support creative thinking by facilitating the achievement of interoperable knowledge inputs and enhancing the user’s cognitive access. We describe a prototype system which implements this approach and exemplify its functionality with reference to a use case in which predictive modeling is paired with analytic gaming to support collaborative decision-making in the domain of agricultural land management.

  19. Bioinspired Methodology for Artificial Olfaction

    PubMed Central

    Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve

    2008-01-01

    Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
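
    The hierarchical, coarse-to-fine idea can be illustrated with a hypothetical two-stage classifier: a first model assigns the general chemical class and a class-specific second model then names the analyte. The data, labels and choice of random forests are assumptions of this sketch, not the sensor-array method itself.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      X, fine = make_classification(n_samples=600, n_features=12, n_informative=8,
                                    n_classes=4, n_clusters_per_class=1, random_state=0)
      coarse = (fine < 2).astype(int)   # pretend fine classes 0-1 share a coarse class (e.g. "hydrocarbon")

      stage1 = RandomForestClassifier(random_state=0).fit(X, coarse)   # general chemical class first
      stage2 = {c: RandomForestClassifier(random_state=0).fit(X[coarse == c], fine[coarse == c])
                for c in (0, 1)}                                       # then the specific analyte within it

      def classify(x):
          c = stage1.predict([x])[0]
          return c, stage2[c].predict([x])[0]

      print(classify(X[0]))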

  20. Complex index of refraction estimation from degree of polarization with diffuse scattering consideration.

    PubMed

    Zhan, Hanyu; Voelz, David G; Cho, Sang-Yeon; Xiao, Xifeng

    2015-11-20

    The estimation of the refractive index from optical scattering off a target's surface is an important task for remote sensing applications. Optical polarimetry is an approach that shows promise for refractive index estimation. However, this estimation often relies on polarimetric models that are limited to specular targets involving single surface scattering. Here, an analytic model is developed for the degree of polarization (DOP) associated with reflection from a rough surface that includes the effect of diffuse scattering. A multiplicative factor is derived to account for the diffuse component and evaluation of the model indicates that diffuse scattering can significantly affect the DOP values. The scattering model is used in a new approach for refractive index estimation from a series of DOP values that involves jointly estimating n, k, and ρ_d with a nonlinear equation solver. The approach is shown to work well with simulation data and additive noise. When applied to laboratory-measured DOP values, the approach produces significantly improved index estimation results relative to reference values.

  1. Multivariate Protein Signatures of Pre-Clinical Alzheimer's Disease in the Alzheimer's Disease Neuroimaging Initiative (ADNI) Plasma Proteome Dataset

    PubMed Central

    Johnstone, Daniel; Milward, Elizabeth A.; Berretta, Regina; Moscato, Pablo

    2012-01-01

    Background Recent Alzheimer's disease (AD) research has focused on finding biomarkers to identify disease at the pre-clinical stage of mild cognitive impairment (MCI), allowing treatment to be initiated before irreversible damage occurs. Many studies have examined brain imaging or cerebrospinal fluid but there is also growing interest in blood biomarkers. The Alzheimer's Disease Neuroimaging Initiative (ADNI) has generated data on 190 plasma analytes in 566 individuals with MCI, AD or normal cognition. We conducted independent analyses of this dataset to identify plasma protein signatures predicting pre-clinical AD. Methods and Findings We focused on identifying signatures that discriminate cognitively normal controls (n = 54) from individuals with MCI who subsequently progress to AD (n = 163). Based on p value, apolipoprotein E (APOE) showed the strongest difference between these groups (p = 2.3×10−13). We applied a multivariate approach based on combinatorial optimization ((α,β)-k Feature Set Selection), which retains information about individual participants and maintains the context of interrelationships between different analytes, to identify the optimal set of analytes (signature) to discriminate these two groups. We identified 11-analyte signatures achieving values of sensitivity and specificity between 65% and 86% for both MCI and AD groups, depending on whether APOE was included and other factors. Classification accuracy was improved by considering “meta-features,” representing the difference in relative abundance of two analytes, with an 8-meta-feature signature consistently achieving sensitivity and specificity both over 85%. Generating signatures based on longitudinal rather than cross-sectional data further improved classification accuracy, returning sensitivities and specificities of approximately 90%. Conclusions Applying these novel analysis approaches to the powerful and well-characterized ADNI dataset has identified sets of plasma biomarkers for pre-clinical AD. While studies of independent test sets are required to validate the signatures, these analyses provide a starting point for developing a cost-effective and minimally invasive test capable of diagnosing AD in its pre-clinical stages. PMID:22485168
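
    A hypothetical sketch of the "meta-feature" idea: augment analyte levels with pairwise differences in relative abundance before classification. The cohort, analytes and labels below are random stand-ins, and plain logistic regression replaces the (α,β)-k feature set selection used in the study.

      from itertools import combinations
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n, p = 217, 20                                  # fake cohort size and analyte subset
      X = rng.normal(size=(n, p))                     # stand-in analyte levels
      y = rng.integers(0, 2, size=n)                  # stand-in labels (control vs MCI-converter)

      meta = np.column_stack([X[:, i] - X[:, j] for i, j in combinations(range(p), 2)])
      X_aug = np.hstack([X, meta])                    # original features plus difference meta-features

      print(cross_val_score(LogisticRegression(max_iter=2000), X_aug, y, cv=5).mean())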

  2. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  3. Technology advancement for integrative stem cell analyses.

    PubMed

    Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi

    2014-12-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among the individual constituents of a given population. The problems associated with such a blanket, population-based approach to stem cell analysis and characterization underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for integrating information such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.

  4. A multiplexed analysis approach identifies new association of inflammatory proteins in patients with overactive bladder

    PubMed Central

    Ma, Emily; Vetter, Joel; Bliss, Laura; Lai, H. Henry; Mysorekar, Indira U.

    2016-01-01

    Overactive bladder (OAB) is a common debilitating bladder condition with unknown etiology and limited diagnostic modalities. Here, we explored a novel high-throughput and unbiased multiplex approach with cellular and molecular components in a well-characterized patient cohort to identify biomarkers that could be reliably used to distinguish OAB from controls or provide insights into underlying etiology. As a secondary analysis, we determined whether this method could discriminate between OAB and other chronic bladder conditions. We analyzed plasma samples from healthy volunteers (n = 19) and patients diagnosed with OAB, interstitial cystitis/bladder pain syndrome (IC/BPS), or urinary tract infections (UTI; n = 51) for proinflammatory, chemokine, cytokine, angiogenesis, and vascular injury factors using Meso Scale Discovery (MSD) analysis and urinary cytological analysis. Wilcoxon rank-sum tests were used to perform univariate and multivariate comparisons between patient groups (controls, OAB, IC/BPS, and UTI). Multivariate logistic regression models were fit for each MSD analyte on 1) OAB patients and controls, 2) OAB and IC/BPS patients, and 3) OAB and UTI patients. Age, race, and sex were included as independent variables in all multivariate analysis. Receiver operating characteristic (ROC) curves were generated to determine the diagnostic potential of a given analyte. Our findings demonstrate that five analytes, i.e., interleukin 4, TNF-α, macrophage inflammatory protein-1β, serum amyloid A, and Tie2 can reliably differentiate OAB relative to controls and can be used to distinguish OAB from the other conditions. Together, our pilot study suggests a molecular imbalance in inflammatory proteins may contribute to OAB pathogenesis. PMID:27029431

  5. Grating-coupled surface plasmons on InSb: a versatile platform for terahertz plasmonic sensing (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Talbayev, Diyar; Zhou, Jiangfeng; Lin, Shuai; Bhattarai, Khagendra

    2017-05-01

    Detection and identification of molecular materials based on their THz frequency vibrational resonances remains an open technological challenge. The need for such technology is illustrated by its potential uses in explosives detection (e.g., RDX) or identification of large biomolecules based on their THz-frequency vibrational fingerprints. The prevailing approaches to THz sensing often rely on a form of waveguide spectroscopy, either utilizing geometric waveguides, such as metallic parallel plate, or plasmonic waveguides made of structured metallic surfaces with sub-wavelength corrugation. The sensitivity of waveguide-based sensing devices is derived from the long (1 cm or longer) propagation and interaction distance of the THz wave with the analyte. We have demonstrated that thin InSb layers with metallic gratings can support high quality factor "true" surface plasmon (SP) resonances that can be used for THz plasmonic sensing. We find two strong SP absorption resonances in normal-incidence transmission and investigate their dispersion relations, dependence on InSb thickness, and the spatial distribution of the electric field. The sensitivity of this approach relies on the frequency shift of the SP resonance when the dielectric function changes in the immediate vicinity of the sensor, in the region of deeply sub-wavelength thickness. Our computational modeling indicates that the sensor sensitivity can exceed 0.25 THz per refractive index unit. One of the SP resonances also exhibits a splitting when tuned in resonance with a vibrational mode of an analyte, which could lead to new sensing modalities for the detection of THz vibrational features of the analyte.

  6. Lack of grading agreement among international hemostasis external quality assessment programs

    PubMed Central

    Olson, John D.; Jennings, Ian; Meijer, Piet; Bon, Chantal; Bonar, Roslyn; Favaloro, Emmanuel J.; Higgins, Russell A.; Keeney, Michael; Mammen, Joy; Marlar, Richard A.; Meley, Roland; Nair, Sukesh C.; Nichols, William L.; Raby, Anne; Reverter, Joan C.; Srivastava, Alok; Walker, Isobel

    2018-01-01

    Laboratory quality programs rely on internal quality control and external quality assessment (EQA). EQA programs provide unknown specimens for the laboratory to test. The laboratory's result is compared with those of other (peer) laboratories performing the same test. EQA programs assign target values using a variety of statistical tools, and a performance assessment of 'pass' or 'fail' is made. EQA provider members of the international organization for external quality assurance in thrombosis and hemostasis took part in a study to compare the outcome of performance analysis using the same data set of laboratory results. Eleven EQA organizations using eight different analytical approaches participated. Data for a normal and a prolonged activated partial thromboplastin time (aPTT) and a normal and a reduced factor VIII (FVIII) from 218 laboratories were sent to the EQA providers, who analyzed the data set using their method of evaluation for aPTT and FVIII, determining the performance for each laboratory record in the data set. Providers also summarized their statistical approach to assignment of target values and laboratory performance. Each laboratory record in the data set was graded pass/fail by all EQA providers for each of the four analytes. There was a lack of agreement in pass/fail grading among EQA programs. Discordance in the grading was 17.9 and 11% of normal and prolonged aPTT results, respectively, and 20.2 and 17.4% of normal and reduced FVIII results, respectively. All EQA programs in this study employed statistical methods compliant with the International Organization for Standardization (ISO) standard ISO 13528, yet the evaluation of laboratory results for all four analytes showed remarkable grading discordance. PMID:29232255
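
    The discordance figures above amount to counting, per laboratory record, whether all providers issued the same pass/fail grade. A small hypothetical calculation of that quantity:

      import numpy as np

      rng = np.random.default_rng(0)
      n_labs, n_providers = 218, 11
      grades = rng.random((n_labs, n_providers)) > 0.1        # True = pass (fabricated grades)

      discordant = np.any(grades != grades[:, [0]], axis=1)   # providers do not all agree on this record
      print(f"discordance: {discordant.mean():.1%} of records")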

  7. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  8. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  9. A guide for measurement of circulating metabolic hormones in rodents: Pitfalls during the pre-analytical phase

    PubMed Central

    Bielohuby, Maximilian; Popp, Sarah; Bidlingmaier, Martin

    2012-01-01

    Researchers analyse hormones to draw conclusions from changes in hormone concentrations observed under specific physiological conditions and to elucidate mechanisms underlying their biological variability. It is, however, frequently overlooked that also circumstances occurring after collection of biological samples can significantly affect the hormone concentrations measured, owing to analytical and pre-analytical variability. Whereas the awareness for such potential confounders is increasing in human laboratory medicine, there is sometimes limited consensus about the control of these factors in rodent studies. In this guide, we demonstrate how such factors can affect reliability and consequent interpretation of the data from immunoassay measurements of circulating metabolic hormones in rodent studies. We also compare the knowledge about such factors in rodent studies to recent recommendations established for biomarker studies in humans and give specific practical recommendations for the control of pre-analytical conditions in metabolic studies in rodents. PMID:24024118

  10. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., carbon capture and storage (CCS), requires an integrated risk assessment approach that considers the coupled processes involved, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational cost of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then implemented directly into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
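
    A heavily simplified, hypothetical sketch of the coupling idea: fit the void-ratio response from the single base simulation as a function of the reference pressure at the fault base, then reuse that fit to update porosity and permeability inside the flow simulator. The numbers and the permeability scaling law are placeholders, not the authors' parametrization.

      import numpy as np

      # Pretend output of the one preliminary coupled base simulation.
      p_ref = np.linspace(10e6, 30e6, 20)                # reference pressure at the fault base [Pa]
      void_ratio = 0.20 + 1.5e-9 * (p_ref - 10e6)        # fabricated void-ratio response

      coeff = np.polyfit(p_ref, void_ratio, deg=2)       # approximating function e(p_ref)

      def update_fault_properties(p, k0=1e-15, e0=0.20):
          """Porosity/permeability update driven only by the reference pore pressure."""
          e = np.polyval(coeff, p)
          phi = e / (1.0 + e)
          k = k0 * (e / e0) ** 3                         # simple cubic scaling, assumed for illustration
          return phi, k

      print(update_fault_properties(25e6))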

  11. MARS approach for global sensitivity analysis of differential equation models with applications to dynamics of influenza infection.

    PubMed

    Lee, Yeonok; Wu, Hulin

    2012-01-01

    Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta-model, based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
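
    The paper evaluates the VCE analytically from the MARS basis functions; the sketch below instead estimates the same first-order measure, Var(E[Y|X_i])/Var(Y), numerically, with a generic surrogate model standing in for MARS and a toy function standing in for the ODE output.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(4000, 3))                       # three uncertain inputs (fake)
      y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=4000)

      meta = GradientBoostingRegressor(random_state=0).fit(X, y)   # cheap surrogate of the model output

      def first_order_index(i, n_outer=200, n_inner=500):
          """Estimate Var(E[Y | X_i]) / Var(Y) using the surrogate."""
          cond_means = []
          for xi in rng.uniform(-1, 1, n_outer):
              Xs = rng.uniform(-1, 1, size=(n_inner, 3))
              Xs[:, i] = xi                                        # condition on X_i = xi
              cond_means.append(meta.predict(Xs).mean())
          return np.var(cond_means) / np.var(meta.predict(X))

      print([round(first_order_index(i), 2) for i in range(3)])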

  12. Prediction of Floor Water Inrush: The Application of GIS-Based AHP Vulnerable Index Method to Donghuantuo Coal Mine, China

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Liu, Yuanzhang; Liu, Donghai; Zhou, Wanfang

    2011-09-01

    Floor water inrush represents a geohazard that can pose a significant threat to safe operations, for instance in coal mines in China and elsewhere. Its occurrence is controlled by many factors, and the processes are often not amenable to mathematical expression. To evaluate the water inrush risk, the paper proposes a vulnerability index approach that couples the analytic hierarchy process (AHP) and a geographic information system (GIS). The detailed procedure for using this innovative approach is shown in a case study in China (Donghuantuo Coal Mine). The powerful spatial data analysis functions of GIS were used to establish a thematic layer for each of the six factors that control water inrush, and the contribution weight of each factor was determined with the AHP method. The established AHP evaluation model was used to determine the threshold value for each risk level from a histogram of the water inrush vulnerability index. As a result, the mine area was divided into five regions with different vulnerability levels, which serve as general guidelines for mine operations. The prediction results were further corroborated with the actual mining data, and the evaluation result is satisfactory.
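
    The AHP step can be illustrated generically: factor weights are taken from the principal eigenvector of a pairwise comparison matrix and checked for consistency before the weighted GIS layers are combined. The comparison values below are invented, not those of the Donghuantuo study.

      import numpy as np

      # Hypothetical pairwise comparisons for three controlling factors (Saaty's 1-9 scale).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                      # normalized contribution weights

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)          # consistency index
      cr = ci / 0.58                                # random index for n = 3
      print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))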

  13. Circuit Design Features of a Stable Two-Cell System.

    PubMed

    Zhou, Xu; Franklin, Ruth A; Adler, Miri; Jacox, Jeremy B; Bailis, Will; Shyer, Justin A; Flavell, Richard A; Mayo, Avi; Alon, Uri; Medzhitov, Ruslan

    2018-02-08

    Cell communication within tissues is mediated by multiple paracrine signals including growth factors, which control cell survival and proliferation. Cells and the growth factors they produce and receive constitute a circuit with specific properties that ensure homeostasis. Here, we used computational and experimental approaches to characterize the features of cell circuits based on growth factor exchange between macrophages and fibroblasts, two cell types found in most mammalian tissues. We found that the macrophage-fibroblast cell circuit is stable and robust to perturbations. Analytical screening of all possible two-cell circuit topologies revealed the circuit features sufficient for stability, including environmental constraint and negative-feedback regulation. Moreover, we found that cell-cell contact is essential for the stability of the macrophage-fibroblast circuit. These findings illustrate principles of cell circuit design and provide a quantitative perspective on cell interactions. Copyright © 2018 Elsevier Inc. All rights reserved.
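
    A toy, hypothetical two-cell circuit (not the published model) with the two ingredients named above, a carrying capacity acting as the environmental constraint and degradation providing negative feedback, shows how such a circuit can settle at a stable non-zero state:

      import numpy as np
      from scipy.integrate import solve_ivp

      def circuit(t, z, K=1e6, beta=8.0, d=1.0):
          m, f = z                                            # two interacting cell populations
          dm = m * (beta * f / (f + K) * (1 - m / K) - d)     # growth driven by the partner's signal
          df = f * (beta * m / (m + K) * (1 - f / K) - d)     # carrying capacity K, death rate d
          return [dm, df]

      sol = solve_ivp(circuit, (0, 50), [5e5, 5e5])
      print(sol.y[:, -1])   # converges to a stable non-zero fixed point; low initial
                            # populations instead collapse to zero (bistability)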

  14. The analytical and numerical approaches to the theory of the Moon's librations: Modern analysis and results

    NASA Astrophysics Data System (ADS)

    Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.

    2017-11-01

    Observing physical librations of celestial bodies such as the Moon represents one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying the lunar physical librations (LPhL) data. In this article LPhL simulation methods of assessing viscoelastic and dissipative properties of the lunar body and lunar core parameters, whose existence has been recently confirmed during the reprocessing of seismic data from the "Apollo" space missions, are described. Much attention is paid to the physical interpretation of the free librations phenomenon and the methods for its determination. In the paper the practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. In the paper an efficiency analysis of two approaches to LPhL theory is conducted: the numerical and the analytical one. It has been shown that in lunar investigation both approaches complement each other in various aspects: the numerical approach provides the high accuracy of the theory which is required for the proper processing of modern observations, while the analytical approach allows one to comprehend the essence of the phenomena in the lunar rotation and to predict and interpret new effects in the observations of lunar body and lunar core parameters.

  15. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  16. Reflections on Klein's radical notion of phantasy and its implications for analytic practice.

    PubMed

    Blass, Rachel B

    2017-06-01

    Analysts may incorporate many of Melanie Klein's important contributions (e.g., on preoedipal dynamics, envy, and projective identification) without transforming their basic analytic approach. In this paper I argue that adopting the Kleinian notion of unconscious phantasy is transformative. While it is grounded in Freud's thinking and draws out something essential to his work, this notion of phantasy introduces a radical change that defines Kleinian thinking and practice and significantly impacts the analyst's basic clinical approach. This impact and its technical implications in the analytic situation are illustrated and discussed. Copyright © 2017 Institute of Psychoanalysis.

  17. Gene-Environment Interplay in Common Complex Diseases: Forging an Integrative Model—Recommendations From an NIH Workshop

    PubMed Central

    Bookman, Ebony B.; McAllister, Kimberly; Gillanders, Elizabeth; Wanke, Kay; Balshaw, David; Rutter, Joni; Reedy, Jill; Shaughnessy, Daniel; Agurs-Collins, Tanya; Paltoo, Dina; Atienza, Audie; Bierut, Laura; Kraft, Peter; Fallin, M. Daniele; Perera, Frederica; Turkheimer, Eric; Boardman, Jason; Marazita, Mary L.; Rappaport, Stephen M.; Boerwinkle, Eric; Suomi, Stephen J.; Caporaso, Neil E.; Hertz-Picciotto, Irva; Jacobson, Kristen C.; Lowe, William L.; Goldman, Lynn R.; Duggal, Priya; Gunnar, Megan R.; Manolio, Teri A.; Green, Eric D.; Olster, Deborah H.; Birnbaum, Linda S.

    2011-01-01

    Although it is recognized that many common complex diseases are a result of multiple genetic and environmental risk factors, studies of gene-environment interaction remain a challenge and have had limited success to date. Given the current state-of-the-science, NIH sought input on ways to accelerate investigations of gene-environment interplay in health and disease by inviting experts from a variety of disciplines to give advice about the future direction of gene-environment interaction studies. Participants of the NIH Gene-Environment Interplay Workshop agreed that there is a need for continued emphasis on studies of the interplay between genetic and environmental factors in disease and that studies need to be designed around a multifaceted approach to reflect differences in diseases, exposure attributes, and pertinent stages of human development. The participants indicated that both targeted and agnostic approaches have strengths and weaknesses for evaluating main effects of genetic and environmental factors and their interactions. The unique perspectives represented at the workshop allowed the exploration of diverse study designs and analytical strategies, and conveyed the need for an interdisciplinary approach including data sharing, and data harmonization to fully explore gene-environment interactions. Further, participants also emphasized the continued need for high-quality measures of environmental exposures and new genomic technologies in ongoing and new studies. PMID:21308768

  18. Suggestions toward Some Discourse-Analytic Approaches to Text Difficulty: With Special Reference to "T-Unit Configuration" in the Textual Unfolding

    ERIC Educational Resources Information Center

    Lotfipour-Saedi, Kazem

    2015-01-01

    This paper represents some suggestions towards discourse-analytic approaches for ESL/EFL education, with the focus on identifying the textual forms which can contribute to the textual difficulty. Textual difficulty/comprehensibility, rather than being purely text-based or reader-dependent, is certainly a matter of interaction between text and…

  19. Patterns of Work and Family Involvement among Single and Dual Earner Couples: Two Competing Analytical Approaches.

    ERIC Educational Resources Information Center

    Yogev, Sara; Brett, Jeanne

    This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…

  20. An Analytics-Based Approach to Managing Cognitive Load by Using Log Data of Learning Management Systems and Footprints of Social Media

    ERIC Educational Resources Information Center

    Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru

    2015-01-01

    Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…

  1. An analytical approach to thermal modeling of Bridgman type crystal growth: One dimensional analysis. Computer program users manual

    NASA Technical Reports Server (NTRS)

    Cothran, E. K.

    1982-01-01

    The computer program written in support of a one-dimensional analytical approach to thermal modeling of Bridgman-type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first-time user.

  2. A note on a simplified and general approach to simulating from multivariate copula functions

    Treesearch

    Barry K. Goodwin

    2013-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...
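
    For the common special case of a Gaussian copula, simulation reduces to drawing correlated normals, mapping them to uniforms, and applying the inverse CDFs of the desired marginals. The sketch below shows that generic recipe, not the specific procedure proposed in the note.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      corr = np.array([[1.0, 0.6],
                       [0.6, 1.0]])

      z = rng.multivariate_normal(mean=[0, 0], cov=corr, size=10_000)   # correlated normals
      u = stats.norm.cdf(z)                                             # copula sample on the unit square

      # Apply whatever marginals are of interest via their inverse CDFs (quantile functions).
      x1 = stats.gamma(a=2.0, scale=1.5).ppf(u[:, 0])
      x2 = stats.lognorm(s=0.5).ppf(u[:, 1])
      print(np.corrcoef(x1, x2)[0, 1])   # the dependence survives the marginal transforms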

  3. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters must be input into the finite element analysis (FEA) model during the preliminary design process and optimization, an equivalent method was developed to analyze the mechanical...

  4. Development of An Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used by Non-EPA Decision Makers (Final Contractor Report)

    EPA Science Inventory

    EPA announced the availability of the final contractor report entitled, Development of an Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used By Non EPA Decision Makers. This contractor report analyzed how ...

  5. Development of a validated liquid chromatographic method for quantification of sorafenib tosylate in the presence of stress-induced degradation products and in biological matrix employing analytical quality by design approach.

    PubMed

    Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder

    2018-05-01

    The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to delineate the analytical target profile and the critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters using response surface methodology was carried out employing a two-factor, three-level, 13-run face-centered cubic design. A method operable design region providing optimum method performance was earmarked using numerical and graphical optimization. The optimum method employed a mobile phase composition consisting of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Method validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug per se was found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation, with products at retention times of 3.35, 3.65, 4.20 and 5.67 min. The absence of any significant change in the retention time of SFN and degradation products formed under different stress conditions confirmed the selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.
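
    For reference, the coded layout of a two-factor, three-level, 13-run face-centred design (4 factorial, 4 axial and 5 centre runs) can be written out directly; the two columns are placeholders for the coded mobile-phase ratio and flow rate.

      import itertools
      import numpy as np

      factorial = np.array(list(itertools.product([-1, 1], repeat=2)))   # 4 corner runs
      axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]])               # 4 face-centred axial runs
      centre = np.zeros((5, 2))                                          # 5 centre-point replicates

      design = np.vstack([factorial, axial, centre])
      print(design.shape)   # (13, 2) runs in coded units -1, 0, +1
      print(design)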

  6. Anaerobic microbial dehalogenation of organohalides-state of the art and remediation strategies.

    PubMed

    Nijenhuis, Ivonne; Kuntze, Kevin

    2016-04-01

    Contamination of groundwater with halogenated organics, its remediation, and understanding of the microbial reactions involved still pose a challenge. Over the last years, research on anaerobic microbial dehalogenation has advanced in many respects, providing information about the reaction, the physiology of the microorganisms, and approaches to investigate the activity of microorganisms in situ. Recently published crystal structures of reductive dehalogenases (Rdh), heterologous expression systems, and advanced analytical, proteomic and stable isotope approaches allow addressing the overall reaction as well as the specific enzymes and co-factors involved in anaerobic microbial dehalogenation. In addition to Dehalococcoides spp., Dehalobacter and Dehalogenimonas strains have been recognized as important and versatile organohalide respirers. Together, these advances provide perspectives for integrated concepts allowing in situ biodegradation to be improved and monitored. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Novel predictive models for metabolic syndrome risk: a "big data" analytic approach.

    PubMed

    Steinberg, Gregory B; Church, Bruce W; McCall, Carol J; Scott, Adam B; Kalis, Brian P

    2014-06-01

    We applied a proprietary "big data" analytic platform--Reverse Engineering and Forward Simulation (REFS)--to dimensions of metabolic syndrome extracted from a large data set compiled from Aetna's databases for 1 large national customer. Our goals were to accurately predict subsequent risk of metabolic syndrome and its various factors on both a population and individual level. The study data set included demographic, medical claim, pharmacy claim, laboratory test, and biometric screening results for 36,944 individuals. The platform reverse-engineered functional models of systems from diverse and large data sources and provided a simulation framework for insight generation. The platform interrogated data sets from the results of 2 Comprehensive Metabolic Syndrome Screenings (CMSSs) as well as complete coverage records; complete data from medical claims, pharmacy claims, and lab results for 2010 and 2011; and responses to health risk assessment questions. The platform predicted subsequent risk of metabolic syndrome, both overall and by risk factor, on population and individual levels, with ROC/AUC varying from 0.80 to 0.88. We demonstrated that improving waist circumference and blood glucose yielded the largest benefits on subsequent risk and medical costs. We also showed that adherence to prescribed medications and, particularly, adherence to routine scheduled outpatient doctor visits, reduced subsequent risk. The platform generated individualized insights using available heterogeneous data within 3 months. The accuracy and short time to insight with this type of analytic platform allowed Aetna to develop targeted, cost-effective care management programs for individuals with or at risk for metabolic syndrome.

  8. Air-assisted liquid-liquid microextraction using floating organic droplet solidification for simultaneous extraction and spectrophotometric determination of some drugs in biological samples through chemometrics methods

    NASA Astrophysics Data System (ADS)

    Farahmand, Farnaz; Ghasemzadeh, Bahar; Naseri, Abdolhossein

    2018-01-01

    An air-assisted liquid-liquid microextraction applying the solidification of a floating organic droplet method (AALLME-SFOD), coupled with a multivariate calibration method, namely partial least squares (PLS), was introduced for the fast and easy determination of Atenolol (ATE), Propranolol (PRO) and Carvedilol (CAR) in biological samples via a spectrophotometric approach. The analytes were extracted from neutral aqueous solution into 1-dodecanol as the organic solvent, using AALLME. In this approach a low-density solvent with a melting point close to room temperature was applied as the extraction solvent. The emulsion was formed by repeatedly pulling in and pushing out the aqueous sample solution and extraction solvent mixture with a 10-mL glass syringe ten times. After centrifugation, the extractant droplet could be simply collected from the aqueous sample by solidifying it at a temperature below its melting point. In the next step, the analytes were back-extracted simultaneously into an acidic aqueous solution. Derringer and Suich multi-response optimization was utilized to simultaneously optimize the parameters for the three analytes. This method incorporates the benefits of AALLME and of dispersive liquid-liquid microextraction with solidification of floating organic droplets (DLLME-SFOD). Calibration graphs under optimized conditions were linear in the range of 0.30-6.00, 0.32-2.00 and 0.30-1.40 μg mL⁻¹ for ATE, CAR and PRO, respectively. Other analytical parameters were obtained as follows: enrichment factors (EFs) were found to be 11.24, 16.55 and 14.90, and limits of detection (LODs) were determined to be 0.09, 0.10 and 0.08 μg mL⁻¹ for ATE, CAR and PRO, respectively. The proposed method requires neither a highly toxic chlorinated solvent for extraction nor an organic dispersive solvent; hence, it is more environmentally friendly.

  9. Air-assisted liquid-liquid microextraction using floating organic droplet solidification for simultaneous extraction and spectrophotometric determination of some drugs in biological samples through chemometrics methods.

    PubMed

    Farahmand, Farnaz; Ghasemzadeh, Bahar; Naseri, Abdolhossein

    2018-01-05

    An air-assisted liquid-liquid microextraction applying the solidification of a floating organic droplet method (AALLME-SFOD), coupled with a multivariate calibration method, namely partial least squares (PLS), was introduced for the fast and easy determination of Atenolol (ATE), Propranolol (PRO) and Carvedilol (CAR) in biological samples via a spectrophotometric approach. The analytes were extracted from neutral aqueous solution into 1-dodecanol as the organic solvent, using AALLME. In this approach a low-density solvent with a melting point close to room temperature was applied as the extraction solvent. The emulsion was formed by repeatedly pulling in and pushing out the aqueous sample solution and extraction solvent mixture with a 10-mL glass syringe ten times. After centrifugation, the extractant droplet could be simply collected from the aqueous sample by solidifying it at a temperature below its melting point. In the next step, the analytes were back-extracted simultaneously into an acidic aqueous solution. Derringer and Suich multi-response optimization was utilized to simultaneously optimize the parameters for the three analytes. This method incorporates the benefits of AALLME and of dispersive liquid-liquid microextraction with solidification of floating organic droplets (DLLME-SFOD). Calibration graphs under optimized conditions were linear in the range of 0.30-6.00, 0.32-2.00 and 0.30-1.40 μg mL⁻¹ for ATE, CAR and PRO, respectively. Other analytical parameters were obtained as follows: enrichment factors (EFs) were found to be 11.24, 16.55 and 14.90, and limits of detection (LODs) were determined to be 0.09, 0.10 and 0.08 μg mL⁻¹ for ATE, CAR and PRO, respectively. The proposed method requires neither a highly toxic chlorinated solvent for extraction nor an organic dispersive solvent; hence, it is more environmentally friendly. Copyright © 2017 Elsevier B.V. All rights reserved.
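    Resolving three analytes with overlapping spectra by PLS can be sketched as below. The simulated Gaussian "pure" spectra, concentration ranges and component count are assumptions for illustration, not the authors' data.

    ```python
    # Minimal PLS sketch for simultaneous spectrophotometric determination of
    # three analytes from overlapping spectra. All spectra are simulated.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    wavelengths = np.linspace(220, 320, 101)
    # Hypothetical Gaussian "pure component" spectra standing in for ATE, CAR, PRO
    pure = np.stack([np.exp(-((wavelengths - c) / 12.0) ** 2) for c in (245, 265, 285)])

    C_train = rng.uniform([0.3, 0.32, 0.3], [6.0, 2.0, 1.4], size=(30, 3))    # µg/mL, assumed ranges
    A_train = C_train @ pure + rng.normal(0, 0.002, (30, len(wavelengths)))   # Beer-Lambert mixing + noise

    pls = PLSRegression(n_components=3).fit(A_train, C_train)

    C_test = np.array([[2.0, 1.0, 0.7]])
    A_test = C_test @ pure + rng.normal(0, 0.002, (1, len(wavelengths)))
    print("predicted concentrations (µg/mL):", pls.predict(A_test).round(2))
    ```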

  10. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
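    A simple signal-balance argument makes the origin of the nonlinearity explicit (an illustrative derivation under the stated assumption, not necessarily the exact calibration function fitted by the authors): if a fraction α of the analyte signal falls in the internal-standard channel, the measured area ratio is

    \[ R \;=\; \frac{S_{A}}{S_{IS} + \alpha S_{A}} \;=\; \frac{k_{A} C_{A}}{k_{IS} C_{IS} + \alpha k_{A} C_{A}}, \]

    which is hyperbolic rather than linear in the analyte concentration \(C_{A}\) and saturates at \(1/\alpha\) as the analyte/internal-standard ratio grows. Fitting this form with an experimentally determined α (plus, if needed, a second constant for the reverse contribution of the internal standard to the analyte channel) restores accurate quantitation, matching the behavior described in the abstract.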

  11. A multidomain approach to understanding risk for underage drinking: converging evidence from 5 data sets.

    PubMed

    Jones, Damon E; Feinberg, Mark E; Cleveland, Michael J; Cooper, Brittany Rhoades

    2012-11-01

    We examined the independent and combined influence of major risk and protective factors on youths' alcohol use. Five large data sets provided similar measures of alcohol use and risk or protective factors. We carried out analyses within each data set, separately for boys and girls in 8th and 10th grades. We included interaction and curvilinear predictive terms in final models if results were robust across data sets. We combined results using meta-analytic techniques. Individual, family, and peer risk factors and a community protective factor moderately predicted youths' alcohol use. Family and school protective factors did not predict alcohol use when combined with other factors. Youths' antisocial attitudes were more strongly associated with alcohol use for those also reporting higher levels of peer or community risk. For certain risk factors, the association with alcohol use varied across different risk levels. Efforts toward reducing youths' alcohol use should be based on robust estimates of the relative influence of risk and protective factors across adolescent environment domains. Public health advocates should focus on context (e.g., community factors) as a strategy for curbing underage alcohol use.
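    Combining per-data-set estimates with standard meta-analytic weighting can be sketched as follows (a generic fixed-effect, inverse-variance combination; the coefficients and standard errors are placeholders, not the study's estimates).

    ```python
    # Generic fixed-effect (inverse-variance) combination of per-data-set estimates.
    # The coefficients and standard errors below are hypothetical placeholders.
    import numpy as np

    beta = np.array([0.21, 0.18, 0.25, 0.30, 0.22])   # risk-factor coefficient in each data set
    se   = np.array([0.05, 0.04, 0.06, 0.07, 0.05])   # corresponding standard errors

    w = 1.0 / se**2                                    # inverse-variance weights
    beta_pooled = np.sum(w * beta) / np.sum(w)
    se_pooled   = np.sqrt(1.0 / np.sum(w))
    print(f"pooled estimate = {beta_pooled:.3f} +/- {se_pooled:.3f}")
    ```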

  12. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
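    For orientation, the two calibration families at issue can be stated compactly in their standard textbook forms (not reproduced from the paper): the classical model regresses the measured response vector on known pure-component contributions, whereas the inverse model regresses the concentration of interest directly on the responses,

    \[ \text{CLS:}\quad \mathbf{r} = \mathbf{S}\mathbf{c} + \mathbf{e}, \qquad \hat{\mathbf{c}} = (\mathbf{S}^{\mathsf{T}}\mathbf{S})^{-1}\mathbf{S}^{\mathsf{T}}\mathbf{r}; \qquad \text{ILS:}\quad c_{k} = \mathbf{r}^{\mathsf{T}}\mathbf{b} + e, \qquad \hat{\mathbf{b}} = \mathbf{R}^{+}\mathbf{c}_{k}, \]

    where \(\mathbf{S}\) holds pure-component responses and \(\mathbf{R}\) is the calibration response matrix (with pseudoinverse \(\mathbf{R}^{+}\)). Measurement error therefore enters the two estimators in different places, which is the root of the discrepant prediction behavior relative to the net analyte signal discussed above.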

  13. Analytical and Computational Properties of Distributed Approaches to MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Historical evolution of engineering disciplines and the complexity of the MDO problem suggest that disciplinary autonomy is a desirable goal in formulating and solving MDO problems. We examine the notion of disciplinary autonomy and discuss the analytical properties of three approaches to formulating and solving MDO problems that achieve varying degrees of autonomy by distributing the problem along disciplinary lines. Two of the approaches, Optimization by Linear Decomposition and Collaborative Optimization, are based on bi-level optimization and reflect what we call a structural perspective. The third approach, Distributed Analysis Optimization, is a single-level approach that arises from what we call an algorithmic perspective. The main conclusion of the paper is that disciplinary autonomy may come at a price: in the bi-level approaches, the system-level constraints introduced to relax the interdisciplinary coupling and enable disciplinary autonomy can cause analytical and computational difficulties for optimization algorithms. The single-level alternative we discuss affords a more limited degree of autonomy than that of the bi-level approaches, but without the computational difficulties of the bi-level methods. Key Words: Autonomy, bi-level optimization, distributed optimization, multidisciplinary optimization, multilevel optimization, nonlinear programming, problem integration, system synthesis

  14. A Structural and Correlational Analysis of Two Common Measures of Personal Epistemology

    ERIC Educational Resources Information Center

    Laster, Bonnie Bost

    2010-01-01

    Scope and Method of Study: The current inquiry is a factor analytic study which utilizes first and second order factor analytic methods to examine the internal structures of two measurements of personal epistemological beliefs: the Schommer Epistemological Questionnaire (SEQ) and Epistemic Belief Inventory (EBI). The study also examines the…

  15. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  16. Bayes and empirical Bayes methods for reduced rank regression models in matched case-control studies.

    PubMed

    Satagopan, Jaya M; Sen, Ananda; Zhou, Qin; Lan, Qing; Rothman, Nathaniel; Langseth, Hilde; Engel, Lawrence S

    2016-06-01

    Matched case-control studies are popular designs used in epidemiology for assessing the effects of exposures on binary traits. Modern studies increasingly enjoy the ability to examine a large number of exposures in a comprehensive manner. However, several risk factors often tend to be related in a nontrivial way, undermining efforts to identify the risk factors using standard analytic methods due to inflated type-I errors and possible masking of effects. Epidemiologists often use data reduction techniques by grouping the prognostic factors using a thematic approach, with themes deriving from biological considerations. We propose shrinkage-type estimators based on Bayesian penalization methods to estimate the effects of the risk factors using these themes. The properties of the estimators are examined using extensive simulations. The methodology is illustrated using data from a matched case-control study of polychlorinated biphenyls in relation to the etiology of non-Hodgkin's lymphoma. © 2015, The International Biometric Society.

  17. Introducing Text Analytics as a Graduate Business School Course

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2011-01-01

    Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…

  18. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
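    The core AHP computation behind such a framework, deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking judgment consistency, can be sketched as follows; the comparison matrix shown is hypothetical, not the paper's inspection criteria.

    ```python
    # Minimal AHP sketch: priority weights from a pairwise comparison matrix
    # via the principal eigenvector, plus Saaty's consistency ratio.
    # The 3x3 comparison matrix below is hypothetical.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])          # pairwise judgments on Saaty's 1-9 scale

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # priority weights

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
    print("weights:", w.round(3), " CR =", round(ci / ri, 3))
    ```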

  19. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  20. Learning Analytics for Online Discussions: Embedded and Extracted Approaches

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Zhao, Yuting; Hausknecht, Simone Nicole

    2014-01-01

    This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and…

  1. A Functional Analytic Approach to Group Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, Luc

    2009-01-01

    This article provides a particular view on the use of Functional Analytical Psychotherapy (FAP) in a group therapy format. This view is based on the author's experiences as a supervisor of Functional Analytical Psychotherapy Groups, including groups for women with depression and groups for chronic pain patients. The contexts in which this approach…

  2. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  3. Factorization and resummation: A new paradigm to improve gravitational wave amplitudes

    NASA Astrophysics Data System (ADS)

    Nagar, Alessandro; Shah, Abhay

    2016-11-01

    We introduce a new resummed analytical form of the post-Newtonian (PN), factorized, multipolar amplitude corrections fℓm of the effective-one-body (EOB) gravitational waveform of spinning, nonprecessing, circularized, coalescing black hole binaries (BBHs). This stems from the following two-step paradigm: (i) the factorization of the orbital (spin-independent) terms in fℓm; (ii) the resummation of the residual spin (or orbital) factors. We find that resumming the residual spin factor by taking its inverse resummed (iResum) is an efficient way to obtain amplitudes that are more accurate in the strong-field, fast-velocity regime. The performance of the method is illustrated on the ℓ = 2 and m = (1, 2) waveform multipoles, both for a test mass orbiting around a Kerr black hole and for comparable-mass BBHs. In the first case, the iResum fℓm's are much closer to the corresponding "exact" functions (obtained by numerically solving the Teukolsky equation) up to the light ring than the nonresummed ones, especially when the black-hole spin is nearly extremal. The iResum paradigm is also more efficient than including higher post-Newtonian terms (up to 20PN order): the resummed 5PN information per se yields rather good numerical or analytical agreement at the last stable orbit and a well-controlled behavior up to the light ring. For comparable-mass binaries (including the highest PN-order information available, 3.5PN), comparing EOB with numerical relativity (NR) data shows that the EOB/NR fractional disagreement at merger, without NR calibration of the EOB waveform, is generically reduced by iResum, from the 40% of the usual approach to just a few percent. This suggests that EOBNR waveform models for coalescing BBHs may be improved by using iResum amplitudes.
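    The resummation step can be summarized schematically (notation assumed for illustration rather than quoted from the paper): if the residual spin factor is known as a truncated PN series, iResum replaces it by the reciprocal of the Taylor-truncated inverse,

    \[ \hat f^{S}_{\ell m}(x) = 1 + \sum_{n=1}^{N} c_{n} x^{n} \;\longrightarrow\; \bar f^{S}_{\ell m}(x) = \Big( T_{N}\big[\, 1/\hat f^{S}_{\ell m}(x) \,\big] \Big)^{-1}, \]

    where \(T_{N}\) denotes truncation at the same PN order, so the resummed factor agrees with the original series through that order while remaining better behaved in the strong-field, fast-velocity regime.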

  4. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  5. Analytical and numerical treatment of drift-tearing modes in plasma slab

    NASA Astrophysics Data System (ADS)

    Mirnov, V. V.; Hegna, C. C.; Sovinec, C. R.; Howell, E. C.

    2016-10-01

    Two-fluid corrections to linear tearing modes include 1) diamagnetic drifts that reduce the growth rate and 2) electron and ion decoupling on short scales that can lead to fast reconnection. We have recently developed an analytical model that includes effects 1) and 2) together with the important contribution of finite electron parallel thermal conduction. Both tendencies 1) and 2) are confirmed by an approximate analytic dispersion relation that is derived using a perturbative approach in small ion-sound gyroradius ρs. This approach is only valid at the beginning of the transition from the collisional to semi-collisional regimes. Further analytical and numerical work is performed to cover the full interval of ρs connecting these two limiting cases. Growth rates are computed from the analytic theory with a shooting method. They match the resistive MHD regime and the dispersion relations known at asymptotically large ion-sound gyroradius. A comparison between this analytical treatment and linear numerical simulations using the NIMROD code with cold ions and hot electrons in a plasma slab is reported. The material is based on work supported by the U.S. DOE and NSF.

  6. Consolidation & Factors Influencing Sintering Process in Polymer Powder Based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Sagar, M. B.; Elangovan, K.

    2017-08-01

    Additive Manufacturing (AM) is a two-decade-old technology in which parts are built layer by layer directly from a CAD template. Over the years, AM techniques have changed the way parts are fabricated, with greater geometric intricacy and custom-made features as the aim. Commercially, polymers, metals, ceramics and metal-polymer composites are in practice; polymers in particular have raised expectations for AM and are considered part of a next industrial revolution. The growing trend in polymer applications motivates the study of their feasibility and properties. Laser sintering, heat sintering and inhibition sintering are the most successful AM techniques for polymers, yet remain the least widely applied. The presentation covers selective sintering of polymer powders and lists commercially available polymer materials. Significant factors for effective processing, and analytical approaches to assess them, are discussed.

  7. Two Approaches in the Lunar Libration Theory: Analytical vs. Numerical Methods

    NASA Astrophysics Data System (ADS)

    Petrova, Natalia; Zagidullin, Arthur; Nefediev, Yurii; Kosulin, Valerii

    2016-10-01

    Observation of the physical libration of the Moon and other celestial bodies is one of the astronomical methods for remotely evaluating the internal structure of a celestial body without expensive space experiments. A review of the results obtained from physical libration studies is presented in the report. The main emphasis is placed on the description of successful lunar laser ranging for libration determination and on methods of simulating the physical libration. As a result, the viscoelastic and dissipative properties of the lunar body and the parameters of the lunar core were estimated. The core's existence was confirmed by the recent reprocessing of seismic data from the Apollo missions. Attention is paid to the physical interpretation of the phenomenon of free libration and methods of its determination. A significant part of the report is devoted to the practical application of the most accurate analytical tables of lunar libration to date, built by comprehensive analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421 [1]. In general, the outline of the report reflects the effectiveness of two approaches in libration theory, the numerical and the analytical solution. It is shown that the two approaches complement each other in the study of the Moon: the numerical approach provides the high accuracy of the theory necessary for adequate treatment of modern high-accuracy observations, while the analytical approach reveals the essence of the various manifestations in lunar rotation and allows new effects in observations of the physical libration to be predicted and interpreted [2]. [1] Rambaux, N., J. G. Williams, 2011, The Moon's physical librations and determination of their free modes, Celest. Mech. Dyn. Astron., 109, 85-100. [2] Petrova, N., A. Zagidullin, Yu. Nefediev, 2014, Analysis of long-periodic variations of lunar libration parameters on the basis of analytical theory, The Russian-Japanese Workshop, 20-25 October, Tokyo (Mitaka) - Mizusawa, Japan.

  8. Recently published analytical methods for determining alcohol in body materials : alcohol countermeasures literature review

    DOT National Transportation Integrated Search

    1974-10-01

    The author has brought the review of published analytical methods for determining alcohol in body materials up to date. The review deals with analytical methods for alcohol in blood and other body fluids and tissues; breath alcohol methods; factors ...

  9. Statically determined slip-line field solution for the axial forming force estimation in the radial-axial ring rolling process

    NASA Astrophysics Data System (ADS)

    Quagliato, Luca; Berti, Guido A.

    2017-10-01

    In this paper, a statically determined slip-line solution algorithm is proposed for the calculation of the axial forming force in the radial-axial ring rolling process for flat rings. The developed solution is implemented in an Excel spreadsheet for the construction of the slip-line field and the calculation of the pressure factor to be used in the force model. The comparison between the analytical solution and the authors' FE simulations shows that the developed model supersedes previous literature models and proves the reliability of the proposed approach.

  10. Study of diatomic molecules. 2: Intensities. [optical emission spectroscopy of ScO

    NASA Technical Reports Server (NTRS)

    Femenias, J. L.

    1978-01-01

    The theory of perturbations, giving the diatomic effective Hamiltonian, is used for calculating actual molecular wave functions and the intensity factors involved in transitions between states arising from Hund's coupling cases a, b, intermediate a-b, and c tendency. The Herman and Wallis corrections are derived, without any knowledge of the analytical expressions of the wave functions, and generalized to transitions between electronic states of any symmetry and multiplicity. A general method for studying perturbed intensities is presented using primarily modern spectroscopic numerical approaches. The method is used in the study of the ScO optical emission spectrum.

  11. A semi-empirical approach for the chemoviscosity modeling of reactive resin system

    NASA Technical Reports Server (NTRS)

    Hou, T. H.; Bai, J. M.

    1988-01-01

    A new analytical model for simulating the chemoviscosity of a thermosetting resin is presented. The model is developed on the basis of the Williams-Landel-Ferry (WLF, 1955) polymer rheology theory for thermoplastic materials, modified to account for the effect of reaction time by introducing a relationship between the glass transition temperature and the degree of cure of the resin system. Theoretical predictions of the chemoviscosity profiles under dynamic curing conditions are shown to compare favorably with the experimental data obtained on the Hercules 3501-6 resin system cured under seven isothermal conditions.
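    The structure of such a model can be written out explicitly (the generic WLF form with a cure-dependent glass transition temperature; the constants and the specific Tg(α) relationship fitted for Hercules 3501-6 are not reproduced here):

    \[ \ln\frac{\eta(T,\alpha)}{\eta_{g}} \;=\; \frac{-C_{1}\,\big[T - T_{g}(\alpha)\big]}{C_{2} + T - T_{g}(\alpha)}, \]

    where α is the degree of cure, so that advancing cure raises \(T_{g}(\alpha)\) and drives the viscosity up even at constant temperature, reproducing the chemoviscosity rise observed during cure.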

  12. Spatial derivatives of flow quantities behind curved shocks of all strengths

    NASA Technical Reports Server (NTRS)

    Darden, C. M.

    1984-01-01

    Explicit formulas in terms of shock curvature are developed for the spatial derivatives of flow quantities behind a curved shock in two-dimensional inviscid steady flow. Factors which render the equations indeterminate as the shock strength approaches zero have been cancelled analytically, so that the formulas are valid for shocks of any strength. An application of the method is shown in the solution of shock coalescence when nonaxisymmetric effects are felt through derivatives in the circumferential direction. The solution of this problem requires flow derivatives behind the shock in both the axial and radial directions.

  13. Comparison of adjoint and analytical Bayesian inversion methods for constraining Asian sources of carbon monoxide using satellite (MOPITT) measurements of CO columns

    NASA Astrophysics Data System (ADS)

    Kopacz, Monika; Jacob, Daniel J.; Henze, Daven K.; Heald, Colette L.; Streets, David G.; Zhang, Qiang

    2009-02-01

    We apply the adjoint of an atmospheric chemical transport model (GEOS-Chem CTM) to constrain Asian sources of carbon monoxide (CO) with 2° × 2.5° spatial resolution using Measurement of Pollution in the Troposphere (MOPITT) satellite observations of CO columns in February-April 2001. Results are compared to the more common analytical method for solving the same Bayesian inverse problem and applied to the same data set. The analytical method is more exact but because of computational limitations it can only constrain emissions over coarse regions. We find that the correction factors to the a priori CO emission inventory from the adjoint inversion are generally consistent with those of the analytical inversion when averaged over the large regions of the latter. The adjoint solution reveals fine-scale variability (cities, political boundaries) that the analytical inversion cannot resolve, for example, in the Indian subcontinent or between Korea and Japan, and some of that variability is of opposite sign which points to large aggregation errors in the analytical solution. Upward correction factors to Chinese emissions from the prior inventory are largest in central and eastern China, consistent with a recent bottom-up revision of that inventory, although the revised inventory also sees the need for upward corrections in southern China where the adjoint and analytical inversions call for downward correction. Correction factors for biomass burning emissions derived from the adjoint and analytical inversions are consistent with a recent bottom-up inventory on the basis of MODIS satellite fire data.
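    Both inversions minimize the same Bayesian cost function and differ only in how the minimum is reached (standard notation, summarized here for orientation):

    \[ J(\mathbf{x}) \;=\; (\mathbf{x}-\mathbf{x}_{a})^{\mathsf{T}}\mathbf{S}_{a}^{-1}(\mathbf{x}-\mathbf{x}_{a}) \;+\; \big(\mathbf{y}-\mathbf{F}(\mathbf{x})\big)^{\mathsf{T}}\mathbf{S}_{\epsilon}^{-1}\big(\mathbf{y}-\mathbf{F}(\mathbf{x})\big). \]

    For a linearized forward model \(\mathbf{F}(\mathbf{x}) = \mathbf{K}\mathbf{x}\), the analytical method evaluates the closed-form minimum \(\hat{\mathbf{x}} = \mathbf{x}_{a} + (\mathbf{K}^{\mathsf{T}}\mathbf{S}_{\epsilon}^{-1}\mathbf{K} + \mathbf{S}_{a}^{-1})^{-1}\mathbf{K}^{\mathsf{T}}\mathbf{S}_{\epsilon}^{-1}(\mathbf{y} - \mathbf{K}\mathbf{x}_{a})\), which requires building \(\mathbf{K}\) explicitly and therefore restricts the state vector to a few coarse regions; the adjoint method instead computes the gradient of J with the adjoint model and minimizes J iteratively, which is what permits correction factors at the fine 2° × 2.5° resolution.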

  14. Effect of injection screen slot geometry on hydraulic conductivity tests

    NASA Astrophysics Data System (ADS)

    Klammler, Harald; Nemer, Bassel; Hatfield, Kirk

    2014-04-01

    Hydraulic conductivity and its spatial variability are important hydrogeological parameters and are typically determined through injection tests at different scales. For injection test interpretation, shape factors are required to account for injection screen geometry. Shape factors act as proportionality constants between hydraulic conductivity and observed ratios of injection flow rate and injection head at steady-state. Existing results for such shape factors assume either an ideal screen (i.e., ignoring effects of screen slot geometry) or infinite screen length (i.e., ignoring effects of screen extremes). In the present work, we investigate the combined effects of circumferential screen slot geometry and finite screen length on injection shape factors. This is done in terms of a screen entrance resistance by solving a steady-state potential flow mixed type boundary value problem in a homogeneous axi-symmetric flow domain using a semi-analytical solution approach. Results are compared to existing analytical solutions for circumferential and longitudinal slots on infinite screens, which are found to be identical. Based on an existing approximation, an expression is developed for a dimensionless screen entrance resistance of infinite screens, which is a function of the relative slot area only. For anisotropic conditions, e.g., when conductivity is smaller in the vertical direction than in the horizontal, screen entrance losses for circumferential slots increase, while they remain unaffected for longitudinal slots. This work is not concerned with investigating the effects of (possibly turbulent) head losses inside the injection device including the passage through the injection slots prior to entering the porous aquifer.
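    In the notation implied above, the shape factor F enters as the proportionality constant between conductivity and the observed flow-to-head ratio (generic steady-state injection relation):

    \[ K \;=\; \frac{Q}{F\,h}, \]

    so any additional entrance resistance caused by the slot geometry lowers the effective F of the real screen; interpreting a test with the ideal-screen shape factor would then bias the inferred hydraulic conductivity low.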

  15. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  16. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  17. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  18. Complete characterization of fourth-order symplectic integrators with extended-linear coefficients.

    PubMed

    Chin, Siu A

    2006-02-01

    The structure of symplectic integrators up to fourth order can be completely and analytically understood when the factorization (split) coefficients are related linearly but with a uniform nonlinear proportional factor. The analytic form of these extended-linear symplectic integrators greatly simplified proofs of their general properties and allowed easy construction of both forward and nonforward fourth-order algorithms with an arbitrary number of operators. Most fourth-order forward integrators can now be derived analytically from this extended-linear formulation without the use of symbolic algebra.
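    For orientation, a standard (non-forward) fourth-order scheme of the general split type discussed here is the Forest-Ruth composition of three second-order leapfrog steps. The sketch below applies it to a harmonic oscillator; it is a textbook example, not the paper's extended-linear family or a forward algorithm.

    ```python
    # Standard fourth-order symplectic integrator (Forest-Ruth / Yoshida
    # triple-jump composition of velocity-Verlet steps) on a harmonic
    # oscillator. Illustrative only; not the extended-linear family above.
    W1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))   # composition weights
    W0 = 1.0 - 2.0 * W1                      # = -2^(1/3) / (2 - 2^(1/3))

    def force(q):
        return -q                            # harmonic oscillator, unit mass and frequency

    def verlet(q, p, h):
        p += 0.5 * h * force(q)
        q += h * p
        p += 0.5 * h * force(q)
        return q, p

    def forest_ruth(q, p, h):
        for w in (W1, W0, W1):               # S4(h) = S2(W1 h) S2(W0 h) S2(W1 h)
            q, p = verlet(q, p, w * h)
        return q, p

    q, p, h = 1.0, 0.0, 0.1
    for _ in range(1000):
        q, p = forest_ruth(q, p, h)
    print(f"energy after 1000 steps: {0.5 * (p * p + q * q):.12f}")   # stays close to 0.5
    ```

    The negative weight W0 is what makes this a nonforward scheme; the forward integrators discussed above avoid such negative substeps by introducing additional (e.g., gradient) operators.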

  19. Molecularly imprinted polymer coupled with dispersive liquid-liquid microextraction and injector port silylation: a novel approach for the determination of 3-phenoxybenzoic acid in complex biological samples using gas chromatography-tandem mass spectrometry.

    PubMed

    Mudiam, Mohana Krishna Reddy; Chauhan, Abhishek; Jain, Rajeev; Dhuriya, Yogesh Kumar; Saxena, Prem Narain; Khanna, Vinay Kumar

    2014-01-15

    A novel analytical approach based on molecularly imprinted solid phase extraction (MISPE) coupled with dispersive liquid-liquid microextraction (DLLME) and injector port silylation (IPS) has been developed for the selective preconcentration, derivatization and analysis of 3-phenoxybenzoic acid (3-PBA) using gas chromatography-tandem mass spectrometry (GC-MS/MS) in complex biological samples such as rat blood and liver. Factors affecting the synthesis of the MIP were evaluated and the best monomer and cross-linker were selected based on binding affinity studies. Various parameters of MISPE, DLLME and IPS were optimized for the selective preconcentration and derivatization of 3-PBA. The developed method offers good linearity over the calibration ranges of 0.02-2.5 ng mg⁻¹ and 7.5-2000 ng mL⁻¹ for liver and blood, respectively. Under optimized conditions, the recovery of 3-PBA in liver and blood samples was found to be in the range of 83-91%. The detection limit was found to be 0.0045 ng mg⁻¹ and 1.82 ng mL⁻¹ in liver and blood, respectively. SRM transitions of 271→227 and 271→197 were selected as the quantifier and qualifier transitions for the 3-PBA derivative. Intra- and inter-day precision for five replicates within a day and over five successive days was found to be less than 8%. The method developed was successfully applied to real samples, i.e. rat blood and tissue, for quantitative evaluation of 3-PBA. The analytical approach developed is rapid, economic, simple and eco-friendly, and possesses immense utility for the analysis of analytes with polar functional groups in complex biological samples by GC-MS/MS. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A fixed size and contrast lesion was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
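    The CHO machinery used in such studies is standard and can be sketched compactly. The channel matrix, image statistics and lesion signal below are synthetic placeholders rather than the study's RS/RO channels or XCAT data.

    ```python
    # Minimal channelized Hotelling observer (CHO) sketch on simulated image data.
    # Channels, backgrounds and signal are synthetic placeholders.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    npix, nchan, ntrain, ntest = 64 * 64, 10, 200, 200

    U = rng.normal(size=(npix, nchan))                 # stand-in channel matrix (real CHOs use RS/RO or Gabor channels)
    signal = np.zeros(npix); signal[2016:2048] = 0.5   # weak lesion profile (arbitrary)

    def images(n, with_signal):
        g = rng.normal(size=(n, npix))                 # white noise backgrounds (correlations omitted for brevity)
        return g + signal if with_signal else g

    v0, v1 = images(ntrain, False) @ U, images(ntrain, True) @ U     # channelized training data
    S = 0.5 * (np.cov(v0.T) + np.cov(v1.T))                          # pooled channel covariance
    w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))                  # Hotelling template

    t0, t1 = images(ntest, False) @ U @ w, images(ntest, True) @ U @ w
    d = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
    print(f"detectability d' = {d:.2f},  AUC ~ {norm.cdf(d / np.sqrt(2)):.3f}")
    ```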
