Sample records for factor analytic approach

  1. Factor-Analytic and Individualized Approaches to Constructing Brief Measures of ADHD Behaviors

    ERIC Educational Resources Information Center

    Volpe, Robert J.; Gadow, Kenneth D.; Blom-Hoffman, Jessica; Feinberg, Adam B.

    2009-01-01

    Two studies were performed to examine a factor-analytic and an individualized approach to creating short progress-monitoring measures from the longer "ADHD-Symptom Checklist-4" (ADHD-SC4). In Study 1, teacher ratings on items of the ADHD:Inattentive (IA) and ADHD:Hyperactive-Impulsive (HI) scales of the ADHD-SC4 were factor analyzed in a normative…

  2. The Structure of Temperament in Preschoolers: A Two-Stage Factor Analytic Approach

    PubMed Central

    Dyson, Margaret W.; Olino, Thomas M.; Durbin, C. Emily; Goldsmith, H. Hill; Klein, Daniel N.

    2012-01-01

    The structure of temperament traits in young children has been the subject of extensive debate, with separate models proposing different trait dimensions. This research has relied almost exclusively on parent-report measures. The present study used an alternative approach, a laboratory observational measure, to explore the structure of temperament in preschoolers. A 2-stage factor analytic approach, exploratory factor analyses (n = 274) followed by confirmatory factor analyses (n = 276), was used. We retrieved an adequately fitting model that consisted of 5 dimensions: Sociability, Positive Affect/Interest, Dysphoria, Fear/Inhibition, and Constraint versus Impulsivity. This solution overlaps with, but is also distinct from, the major models derived from parent-report measures. PMID:21859196
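    The two-stage idea, exploration on one subsample and confirmation on the other, can be sketched with scikit-learn on synthetic data (the study's ratings are not public, and scikit-learn has no true CFA, so held-out likelihood stands in for the confirmatory fit):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in for the ratings: 550 children x 12 items generated
# from 3 latent dimensions (the paper retained 5 from richer data).
loadings = rng.normal(size=(12, 3))
scores = rng.normal(size=(550, 3))
X = scores @ loadings.T + 0.5 * rng.normal(size=(550, 12))

# Stage 1: exploratory fits on the first half of the sample.
explore, confirm = X[:274], X[274:]
best_k, best_ll = None, -np.inf
for k in range(1, 7):
    fa = FactorAnalysis(n_components=k, random_state=0).fit(explore)
    # Stage 2 proxy: score each model on the held-out half; a true CFA
    # would instead fix the stage-1 loading pattern and test its fit.
    ll = fa.score(confirm)
    if ll > best_ll:
        best_k, best_ll = k, ll

print(best_k)  # number of factors favored by held-out likelihood
```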

  3. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  4. Significance Testing in Confirmatory Factor Analytic Models.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; Hocevar, Dennis

    Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…

  5. Proposal of a risk-factor-based analytical approach for integrating occupational health and safety into project risk evaluation.

    PubMed

    Badri, Adel; Nadeau, Sylvie; Gbodossou, André

    2012-09-01

Excluding occupational health and safety (OHS) from project management is no longer acceptable. Numerous industrial accidents have exposed the ineffectiveness of conventional risk evaluation methods as well as negligence of risk factors having a major impact on the health and safety of workers and nearby residents. The lack of reliable and complete evaluations from the beginning of a project generates bad decisions that could end up threatening the very existence of an organization. This article supports a systematic approach to the evaluation of OHS risks and proposes a new procedure based on the number of risk factors identified and their relative significance. A new concept called risk factor concentration, along with weighting of risk factor categories as contributors to undesirable events, is used in the analytical hierarchy process multi-criteria comparison model with Expert Choice© software. A case study is used to illustrate the various steps of the risk evaluation approach and the quick and simple integration of OHS at an early stage of a project. The approach allows continual reassessment of criteria over the course of the project or when new data are acquired. It was thus possible to differentiate the OHS risks from the risk of a drop in quality in the case of the factory expansion project. Copyright © 2011 Elsevier Ltd. All rights reserved.
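    The analytical hierarchy process step can be illustrated with a minimal numpy sketch; the pairwise comparison matrix and the three risk-factor categories below are invented for illustration, not taken from the case study:

```python
import numpy as np

# Pairwise comparisons of three hypothetical OHS risk-factor categories on
# Saaty's 1-9 scale: A[i, j] = importance of category i relative to j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority weights = principal right eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CI = (lambda_max - n)/(n - 1); RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(w.round(3), round(cr, 3))  # CR below 0.1 is conventionally acceptable
```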

  6. Psychometric Structure of a Comprehensive Objective Structured Clinical Examination: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Volkan, Kevin; Simon, Steven R.; Baker, Harley; Todres, I. David

    2004-01-01

    Problem Statement and Background: While the psychometric properties of Objective Structured Clinical Examinations (OSCEs) have been studied, their latent structures have not been well characterized. This study examines a factor analytic model of a comprehensive OSCE and addresses implications for measurement of clinical performance. Methods: An…

  7. Resilience: A Meta-Analytic Approach

    ERIC Educational Resources Information Center

    Lee, Ji Hee; Nam, Suk Kyung; Kim, A-Reum; Kim, Boram; Lee, Min Young; Lee, Sang Min

    2013-01-01

    This study investigated the relationship between psychological resilience and its relevant variables by using a meta-analytic method. The results indicated that the largest effect on resilience was found to stem from the protective factors, a medium effect from risk factors, and the smallest effect from demographic factors. (Contains 4 tables.)
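    A meta-analytic pooling of this kind is, at its core, a weighted average of per-study effect sizes; a minimal random-effects (DerSimonian-Laird) sketch with invented effect sizes, not the review's data:

```python
import numpy as np

y = np.array([0.70, 0.25, 0.62, 0.15, 0.48])   # per-study effect sizes
v = np.array([0.02, 0.03, 0.01, 0.04, 0.02])   # per-study sampling variances

# Cochran's Q from the fixed-effect fit, then the DL between-study variance.
w_fixed = 1 / v
q = np.sum(w_fixed * (y - np.sum(w_fixed * y) / w_fixed.sum()) ** 2)
df = len(y) - 1
c = w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)          # between-study variance

w = 1 / (v + tau2)                     # random-effects weights
pooled = np.sum(w * y) / w.sum()
se = np.sqrt(1 / w.sum())
print(round(pooled, 3), round(se, 3))
```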

  8. Managing knowledge business intelligence: A cognitive analytic approach

    NASA Astrophysics Data System (ADS)

    Surbakti, Herison; Ta'a, Azman

    2017-10-01

The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies are strongly associated with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors, and turns them into the most valuable assets when an efficient system is run under BI tactics and technologies. Predictive analytics, however, is grounded in the field of BI, and extracting tacit knowledge as a new source for BI analysis is a major challenge. Advanced analytic methods that address a diverse data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This remains a major challenge, and this paper aims to elaborate on it in this initial work.

  9. An analytical approach for the calculation of stress-intensity factors in transformation-toughened ceramics

    NASA Astrophysics Data System (ADS)

    Müller, W. H.

    1990-12-01

    Stress-induced transformation toughening in Zirconia-containing ceramics is described analytically by means of a quantitative model: A Griffith crack which interacts with a transformed, circular Zirconia inclusion. Due to its volume expansion, a ZrO2-particle compresses its flanks, whereas a particle in front of the crack opens the flanks such that the crack will be attracted and finally absorbed. Erdogan's integral equation technique is applied to calculate the dislocation functions and the stress-intensity-factors which correspond to these situations. In order to derive analytical expressions, the elastic constants of the inclusion and the matrix are assumed to be equal.

  10. Defining dignity in terminally ill cancer patients: a factor-analytic approach.

    PubMed

    Hack, Thomas F; Chochinov, Harvey Max; Hassard, Thomas; Kristjanson, Linda J; McClement, Susan; Harlos, Mike

    2004-10-01

    The construct of 'dignity' is frequently raised in discussions about quality end of life care for terminal cancer patients, and is invoked by parties on both sides of the euthanasia debate. Lacking in this general debate has been an empirical explication of 'dignity' from the viewpoint of cancer patients themselves. The purpose of the present study was to use factor-analytic and regression methods to analyze dignity data gathered from 213 cancer patients having less than 6 months to live. Patients rated their sense of dignity, and completed measures of symptom distress and psychological well-being. The results showed that although the majority of patients had an intact sense of dignity, there were 99 (46%) patients who reported at least some, or occasional loss of dignity, and 16 (7.5%) patients who indicated that loss of dignity was a significant problem. The exploratory factor analysis yielded six primary factors: (1) Pain; (2) Intimate Dependency; (3) Hopelessness/Depression; (4) Informal Support Network; (5) Formal Support Network; and (6) Quality of Life. Subsequent regression analyses of modifiable factors produced a final two-factor (Hopelessness/Depression and Intimate Dependency) model of statistical significance. These results provide empirical support for the dignity model, and suggest that the provision of end of life care should include methods for treating depression, fostering hope, and facilitating functional independence. Copyright 2004 John Wiley & Sons, Ltd.

11. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered in nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results to the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.

12. ESTIMATING UNCERTAINTIES IN FACTOR ANALYTIC MODELS

    EPA Science Inventory

    When interpreting results from factor analytic models as used in receptor modeling, it is important to quantify the uncertainties in those results. For example, if the presence of a species on one of the factors is necessary to interpret the factor as originating from a certain ...

  13. Critical Factors in Data Governance for Learning Analytics

    ERIC Educational Resources Information Center

    Elouazizi, Noureddine

    2014-01-01

    This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…

  14. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on a medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained by both pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe the efforts made to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  15. Bridging analytical approaches for low-carbon transitions

    NASA Astrophysics Data System (ADS)

    Geels, Frank W.; Berkhout, Frans; van Vuuren, Detlef P.

    2016-06-01

    Low-carbon transitions are long-term multi-faceted processes. Although integrated assessment models have many strengths for analysing such transitions, their mathematical representation requires a simplification of the causes, dynamics and scope of such societal transformations. We suggest that integrated assessment model-based analysis should be complemented with insights from socio-technical transition analysis and practice-based action research. We discuss the underlying assumptions, strengths and weaknesses of these three analytical approaches. We argue that full integration of these approaches is not feasible, because of foundational differences in philosophies of science and ontological assumptions. Instead, we suggest that bridging, based on sequential and interactive articulation of different approaches, may generate a more comprehensive and useful chain of assessments to support policy formation and action. We also show how these approaches address knowledge needs of different policymakers (international, national and local), relate to different dimensions of policy processes and speak to different policy-relevant criteria such as cost-effectiveness, socio-political feasibility, social acceptance and legitimacy, and flexibility. A more differentiated set of analytical approaches thus enables a more differentiated approach to climate policy making.

  16. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  17. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  18. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization, prolonged intensive care unit (ICU) and hospital stay. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive goal oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could be used to provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together could be used to provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about the respective predictive capabilities, while considering their clinical significance.
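    The comparison of approaches can be mimicked on synthetic data (the ICU data themselves are not available): do a regression and a decision tree single out the same predictors?

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Synthetic cohort: 8 candidate variables, 3 genuinely informative ones
# placed in the first columns (shuffle=False keeps them there).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=1)

logit = LogisticRegression(max_iter=1000).fit(X, y)
tree = DecisionTreeClassifier(max_depth=4, random_state=1).fit(X, y)

# Top-3 variables by each method: absolute coefficients vs. importances.
top_logit = set(np.argsort(np.abs(logit.coef_[0]))[-3:])
top_tree = set(np.argsort(tree.feature_importances_)[-3:])
print(sorted(top_logit), sorted(top_tree))  # overlapping, not always identical
```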

  19. Methods for Estimating Uncertainty in Factor Analytic Solutions

    EPA Science Inventory

    The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DI...
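    The classical bootstrap (BS) idea can be sketched with scikit-learn's NMF as a stand-in for PMF: refit the factor model on resampled rows and take the spread of the recovered profiles as the uncertainty estimate. The two source profiles below are invented:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Toy receptor-modeling setup: 200 samples x 6 chemical species generated
# from two non-negative source profiles plus noise.
profiles = np.array([[5.0, 1.0, 0.0, 2.0, 0.0, 1.0],
                     [0.0, 2.0, 4.0, 0.0, 3.0, 1.0]])
contrib = rng.uniform(0, 1, size=(200, 2))
X = contrib @ profiles + rng.uniform(0, 0.1, size=(200, 6))

# Classical bootstrap: refit on resampled rows, record normalized profiles.
boot = []
for _ in range(30):
    idx = rng.integers(0, len(X), len(X))
    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    model.fit(X[idx])
    H = model.components_
    H = H / H.sum(axis=1, keepdims=True)      # remove NMF's scale ambiguity
    boot.append(H[np.argsort(-H[:, 0])])      # crude factor alignment

spread = np.std(np.stack(boot), axis=0)       # per-element uncertainty
print(spread.max().round(3))
```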

  20. Development of an Analytical Method for Dibutyl Phthalate Determination Using Surrogate Analyte Approach

    PubMed Central

    Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad

    2017-01-01

Dibutyl phthalate (DBP) is a phthalic acid ester and is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenges in DBP analyses are the contamination of even analytical grade organic solvents with this compound and the lack of a true blank matrix to construct the calibration line. Using the standard addition method or artificial matrices reduces the precision and accuracy of the results. In this study a surrogate analyte approach, based on using a deuterium-labeled analyte (DBP-d4) to construct the calibration line, was applied to determine DBP in hexane samples. PMID:28496469
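    The surrogate-analyte idea reduces to calibrating on the labeled isotopologue, which has no background in the solvent, and reading the unlabeled response off that line. A sketch with invented numbers, assuming equal response factors for DBP and DBP-d4:

```python
import numpy as np

conc_d4 = np.array([0.5, 1.0, 2.0, 5.0, 10.0])     # spiked DBP-d4, ug/mL
resp_d4 = np.array([0.52, 1.01, 2.05, 5.1, 10.2])  # instrument response

# Calibration line fitted on the labeled surrogate only.
slope, intercept = np.polyfit(conc_d4, resp_d4, 1)

sample_resp = 3.1                                  # unlabeled DBP response
dbp_conc = (sample_resp - intercept) / slope       # read off the d4 line
print(round(dbp_conc, 2))
```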

  1. Separating method factors and higher order traits of the Big Five: a meta-analytic multitrait-multimethod approach.

    PubMed

    Chang, Luye; Connelly, Brian S; Geeza, Alexis A

    2012-02-01

Though most personality researchers now recognize that ratings of the Big Five are not orthogonal, the field has been divided about whether these trait intercorrelations are substantive (i.e., driven by higher order factors) or artifactual (i.e., driven by correlated measurement error). We used a meta-analytic multitrait-multirater study to estimate trait correlations after common method variance was controlled. Our results indicated that common method variance substantially inflates trait correlations, and, once controlled, correlations among the Big Five became relatively modest. We then evaluated whether two different theories of higher order factors could account for the pattern of Big Five trait correlations. Our results did not support Rushton and colleagues' (Rushton & Irwing, 2008; Rushton et al., 2009) proposed general factor of personality, but Digman's (1997) α and β metatraits (relabeled by DeYoung, Peterson, and Higgins (2002) as Stability and Plasticity, respectively) produced a viable fit. However, our models showed considerable overlap between Stability and Emotional Stability and between Plasticity and Extraversion, raising the question of whether these metatraits are redundant with their dominant Big Five traits. This pattern of findings was robust when we included only studies whose observers were intimately acquainted with targets. Our results underscore the importance of using a multirater approach to studying personality and the need to separate the causes and outcomes of higher order metatraits from those of the Big Five. We discuss the implications of these findings for the array of research fields in which personality is studied.

  2. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

Nifedipine (NIF) is a photo-labile drug that degrades easily on exposure to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a curvature (center point) was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors examined for their effect on the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Altering the MPC significantly affected retention time. Furthermore, an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1–16 µg/mL and showed good linearity, precision and accuracy, and was efficient, with an analysis time within 10 min.
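    A coded 2² factorial with a center point yields the main effects, the interaction, and a curvature check directly from the four corner runs; the responses below are invented for illustration, not the paper's data:

```python
import numpy as np

# Coded design: A = mobile phase composition, B = flow rate,
# y = tailing factor (hypothetical responses).
#                  A     B     y
runs = np.array([[-1.0, -1.0, 1.8],
                 [ 1.0, -1.0, 1.6],
                 [-1.0,  1.0, 1.4],
                 [ 1.0,  1.0, 1.1],
                 [ 0.0,  0.0, 1.5]])   # center point
A, B, y = runs[:4, 0], runs[:4, 1], runs[:4, 2]

effect_A = (y * A).sum() / 2           # main effect of A
effect_B = (y * B).sum() / 2           # main effect of B (negative: higher
                                       # flow rate lowers tailing here)
effect_AB = (y * A * B).sum() / 2      # interaction
curvature = y.mean() - runs[4, 2]      # factorial mean minus center point
print(effect_A, effect_B, effect_AB, round(curvature, 3))
```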

  3. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Knowing the shapes of PDFs, and using experimental data, different optimization schemes can be applied in order to evaluate probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10⁴). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled as a constrained stochastic process. Constrained systems are quite common, which makes the method useful for various applications.
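    The simplest exactly solvable case, two competing exponential events, shows what the analytical replacement buys: the relative frequency of event 1 is k1/(k1 + k2), a result a Monte Carlo run only approaches after many samples:

```python
import random

random.seed(7)

# Two competing events with exponential waiting times at rates k1 and k2.
k1, k2 = 2.0, 3.0
analytic = k1 / (k1 + k2)        # exact relative frequency of event 1

# Gillespie-style estimate: draw both waiting times, realize the earlier one.
n, hits = 20000, 0
for _ in range(n):
    t1 = random.expovariate(k1)
    t2 = random.expovariate(k2)
    if t1 < t2:
        hits += 1
mc = hits / n

print(round(analytic, 3), round(mc, 3))  # the two estimates should be close
```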

  4. Taxometric and Factor Analytic Models of Anxiety Sensitivity among Youth: Exploring the Latent Structure of Anxiety Psychopathology Vulnerability

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Stewart, Sherry; Comeau, Nancy

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), a well-established affect-sensitivity individual difference factor, among youth by employing taxometric and factor analytic approaches in an integrative manner. Taxometric analyses indicated that AS, as indexed by the Child Anxiety Sensitivity…

  5. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
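    The standard-addition methodology mentioned here fits a line through the responses of spiked aliquots of the real matrix and extrapolates back to the x-axis; the x-intercept magnitude is the endogenous level. A sketch with invented numbers:

```python
import numpy as np

spiked = np.array([0.0, 5.0, 10.0, 20.0])   # analyte added to aliquots, uM
signal = np.array([4.1, 6.0, 8.2, 12.1])    # measured response

# Fit the addition line; extrapolation to zero signal gives -C_endogenous.
slope, intercept = np.polyfit(spiked, signal, 1)
endogenous = intercept / slope              # x-intercept magnitude
print(round(endogenous, 2))
```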

  6. Problematic eating behaviors among bariatric surgical candidates: a psychometric investigation and factor analytic approach.

    PubMed

    Gelinas, Bethany L; Delparte, Chelsea A; Wright, Kristi D; Hart, Regan

    2015-01-01

Psychological factors (e.g., anxiety, depression) are routinely assessed in bariatric pre-surgical programs, as high levels of psychopathology are consistently related to poor program outcomes (e.g., failure to lose significant weight pre-surgery, weight regain post-surgery). Behavioral factors related to poor program outcomes, and ways in which behavioral and psychological factors interact, have received little attention in bariatric research and practice. Potentially problematic behavioral factors are queried by Section H of the Weight and Lifestyle Inventory (WALI-H), in which respondents indicate the relevance of certain eating behaviors to obesity. A factor analytic investigation of the WALI-H serves to improve the way in which this assessment tool is interpreted and used among bariatric surgical candidates, and subsequent moderation analyses serve to demonstrate potential compounding influences of psychopathology on eating behavior factors. Bariatric surgical candidates (n = 362) completed several measures of psychopathology and the WALI-H. Item responses from the WALI-H were subjected to principal axis factoring with oblique rotation. Results revealed a three-factor model including: (1) eating in response to negative affect, (2) overeating/desirability of food, and (3) eating in response to positive affect/social cues. All three behavioral factors of the WALI-H were significantly associated with measures of depression and anxiety. Moderation analyses revealed that depression did not moderate the relationship between anxiety and any eating behavior factor. Although single forms of psychopathology are related to eating behaviors, the combination of psychopathology does not appear to influence these problematic behaviors. Recommendations for pre-surgical assessment and treatment of bariatric surgical candidates are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  9. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  10. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is confirmed by computer simulations, showing nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
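
    The similarity score at the heart of this approach can be illustrated in a few lines (a sketch only; the ECG-like test shape, noise level, and variable names are ours, not the paper's):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two shape vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
reference = np.sin(t)                                       # reference shape (stand-in for an ECG cycle)
observed = reference + 0.3 * rng.standard_normal(t.size)    # same shape plus measurement noise
unrelated = np.cos(t) + 0.3 * rng.standard_normal(t.size)   # a different shape plus noise

score = cosine_similarity(reference, observed)    # stays high despite the noise
anomaly = cosine_similarity(reference, unrelated) # near zero for an unrelated shape
```

The paper's contribution is then an analytical distribution for `score` as a function of the noise level alone, from which a noise-adaptive decision threshold (rather than a fixed one) can be derived.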

  11. Teaching Analytical Chemistry to Pharmacy Students: A Combined, Iterative Approach

    ERIC Educational Resources Information Center

    Masania, Jinit; Grootveld, Martin; Wilson, Philippe B.

    2018-01-01

    Analytical chemistry has often been a difficult subject to teach in a classroom or lecture-based context. Numerous strategies for overcoming the inherently practical-based difficulties have been suggested, each with differing pedagogical theories. Here, we present a combined approach to tackling the problem of teaching analytical chemistry, with…

  12. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes.

    PubMed

    Del Piero, Larissa B; Saxbe, Darby E; Margolin, Gayla

    2016-06-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and non-linear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes

    PubMed Central

    Del Piero, Larissa B.; Saxbe, Darby E.; Margolin, Gayla

    2016-01-01

    Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize the diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found to successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and nonlinear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are made for evaluating gender and biological markers of development beyond chronological age. PMID:27038840

  14. Analytical and Computational Properties of Distributed Approaches to MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Historical evolution of engineering disciplines and the complexity of the MDO problem suggest that disciplinary autonomy is a desirable goal in formulating and solving MDO problems. We examine the notion of disciplinary autonomy and discuss the analytical properties of three approaches to formulating and solving MDO problems that achieve varying degrees of autonomy by distributing the problem along disciplinary lines. Two of the approaches, Optimization by Linear Decomposition and Collaborative Optimization, are based on bi-level optimization and reflect what we call a structural perspective. The third approach, Distributed Analysis Optimization, is a single-level approach that arises from what we call an algorithmic perspective. The main conclusion of the paper is that disciplinary autonomy may come at a price: in the bi-level approaches, the system-level constraints introduced to relax the interdisciplinary coupling and enable disciplinary autonomy can cause analytical and computational difficulties for optimization algorithms. The single-level alternative we discuss affords a more limited degree of autonomy than that of the bi-level approaches, but without the computational difficulties of the bi-level methods. Key Words: Autonomy, bi-level optimization, distributed optimization, multidisciplinary optimization, multilevel optimization, nonlinear programming, problem integration, system synthesis

  15. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications back to a common theoretic model of the soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together, we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  16. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
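
    The statistical equivalence testing mentioned above is commonly implemented as a two one-sided tests (TOST) procedure. The sketch below is a generic, simplified illustration with made-up potency data and an assumed equivalence margin, not the statistical protocol or acceptance criteria of the paper:

```python
import numpy as np
from scipy import stats

def tost_pvalue(x, y, margin):
    """Two one-sided t-tests (TOST) for mean equivalence within +/- margin.

    Returns the larger of the two one-sided p-values; equivalence is
    concluded when this value falls below the chosen alpha.
    """
    nx, ny = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / nx + np.var(y, ddof=1) / ny)
    df = nx + ny - 2  # simple df choice; a Welch correction is an alternative
    p_lo = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
    p_hi = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
    return max(p_lo, p_hi)

rng = np.random.default_rng(42)
originator = rng.normal(100.0, 2.0, 50)   # hypothetical CQA measurements per lot
biosimilar = rng.normal(100.5, 2.0, 50)   # small, clinically irrelevant shift
shifted = rng.normal(105.0, 2.0, 50)      # shift larger than the margin

p_equiv = tost_pvalue(originator, biosimilar, margin=3.0)  # small: equivalence
p_not = tost_pvalue(originator, shifted, margin=3.0)       # large: no equivalence
```

Note how TOST inverts the usual testing logic: the burden of proof is on demonstrating similarity, which is why the number of unique originator lots (and hence statistical power) matters so much in the abstract above.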

  17. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation describes the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework that is essentially a self-organizing graph database. An xGraph instance provides both the analytics as well as the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable the seismic community's support in further development of its capabilities.

  18. A Factor Analytic and Regression Approach to Functional Age: Potential Effects of Race.

    ERIC Educational Resources Information Center

    Colquitt, Alan L.; And Others

    Factor analysis and multiple regression are two major approaches used to look at functional age, which takes account of the extensive variation in the rate of physiological and psychological maturation throughout life. To examine the role of racial or cultural influences on the measurement of functional age, a battery of 12 tests concentrating on…

  19. Sample Size and Power Estimates for a Confirmatory Factor Analytic Model in Exercise and Sport: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying

    2011-01-01

    Monte Carlo methods can be used in data analytic situations (e.g., validity studies) to make decisions about sample size and to estimate power. The purpose of using Monte Carlo methods in a validity study is to improve the methodological approach within a study where the primary focus is on construct validity issues and not on advancing…

  20. Universal analytical scattering form factor for shell-, core-shell, or homogeneous particles with continuously variable density profile shape.

    PubMed

    Foster, Tobias

    2011-09-01

    A novel analytical and continuous density distribution function with a widely variable shape is reported and used to derive an analytical scattering form factor that allows us to universally describe the scattering from particles with the radial density profile of homogeneous spheres, shells, or core-shell particles. Composed of the sum of two Fermi-Dirac distribution functions, the shape of the density profile can be altered continuously from step-like via Gaussian-like or parabolic to asymptotically hyperbolic by varying a single "shape parameter", d. Using this density profile, the scattering form factor can be calculated numerically. An analytical form factor can be derived using an approximate expression for the original Fermi-Dirac distribution function. This approximation is accurate for sufficiently small rescaled shape parameters, d/R (R being the particle radius), up to values of d/R ≈ 0.1, and thus captures step-like, Gaussian-like, and parabolic as well as asymptotically hyperbolic profile shapes. It is expected that this form factor is particularly useful in a model-dependent analysis of small-angle scattering data since the applied continuous and analytical function for the particle density profile can be compared directly with the density profile extracted from the data by model-free approaches like the generalized inverse Fourier transform method. © 2011 American Chemical Society
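
    The kind of profile described can be sketched as follows. This is a plausible parameterization for illustration only; the paper's exact functional form, sign conventions, and parameter names may differ:

```python
import numpy as np

def density_profile(r, R=1.0, d=0.02, r_core=0.0):
    """Radial density built from the difference of two Fermi-Dirac terms.

    The outer term switches the density off around r = R with softness d
    (small d -> step-like homogeneous sphere; larger d -> smoother,
    Gaussian-like or parabolic profiles).  A positive r_core adds a second
    Fermi-Dirac term that carves out the interior, giving a shell.
    """
    rho = 1.0 / (1.0 + np.exp((r - R) / d))
    if r_core > 0.0:
        rho -= 1.0 / (1.0 + np.exp((r - r_core) / d))
    return rho

sphere_center = density_profile(0.0)             # ~1 inside a homogeneous sphere
outside = density_profile(2.0)                   # ~0 outside the particle
shell_cavity = density_profile(0.0, r_core=0.5)  # ~0 inside a shell's cavity
shell_wall = density_profile(0.75, r_core=0.5)   # ~1 within the shell wall
```

With such a profile in hand, the numerical form factor follows from the standard spherically symmetric Fourier transform of rho(r), which is what the analytical approximation in the abstract replaces for small d/R.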

  1. Microemulsification: an approach for analytical determinations.

    PubMed

    Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L

    2014-09-16

    We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance, taking into account parameters such as precision, linearity, robustness, and accuracy. This approach relies on the effect of the analyte content on the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). This phenomenon was expressed by the minimum volume fraction of amphiphile required to form a microemulsion (Φ(ME)), which was the analytical signal of the method. Thus, the measurements can be taken by visually monitoring the transition of the dispersions from cloudy to transparent during microemulsification, like a titration. It bypasses the use of electric energy. The studies performed were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameter values for the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The procedures of microemulsification were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of a micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation, that is, based on visual analyses of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of the ME, showing the flexibility of the developed method. The linear range was fairly broad, with limits of linearity up to 70.00% water in ethanol. For monoethylene glycol in

  2. Accurate analytical modeling of junctionless DG-MOSFET by green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of the junctionless double gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using a Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to the 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations that are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of Ids-Vgs curves between the long-channel and short-channel devices. It is observed that the Green's function approach to solving the 2-D Poisson's equation in both the oxide and silicon regions can accurately predict channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off and subthreshold slope (SS) of both long- and short-channel devices designed with different doping concentrations and higher as well as lower tsi/tox ratios. All the analytical model results are verified through comparisons with TCAD Sentaurus simulation results. It is observed that the model matches quite well with TCAD device simulations.

  3. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  4. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  5. A confirmatory factor analytic validation of the Tinnitus Handicap Inventory.

    PubMed

    Kleinstäuber, Maria; Frank, Ina; Weise, Cornelia

    2015-03-01

    Because the postulated three-factor structure of the internationally widely used Tinnitus Handicap Inventory (THI) had not yet been confirmed by a confirmatory factor analytic approach, this was the central aim of the current study. From a clinical setting, N=373 patients with chronic tinnitus completed the THI and further questionnaires assessing tinnitus-related and psychological variables. In order to analyze the psychometric properties of the THI, confirmatory factor analysis (CFA) and correlational analyses were conducted. CFA provided statistically significant support for a better fit of the data to the hypothesized three-factor structure (RMSEA=.049, WRMR=1.062, CFI=.965, TLI=.961) than to a general factor model (RMSEA=.062, WRMR=1.258, CFI=.942, TLI=.937). The calculation of Cronbach's alpha as an indicator of internal consistency revealed satisfactory values (.80-.91), with the exception of the catastrophic subscale (.65). High positive correlations of the THI and its subscales with other measures of tinnitus distress, anxiety, and depression, high negative correlations with tinnitus acceptance, moderate positive correlations with anxiety sensitivity, sleeping difficulties, and tinnitus loudness, and small correlations with the Big Five personality dimensions confirmed construct validity. Results show that the THI is a highly reliable and valid measure of tinnitus-related handicap. In contrast to the results of previous exploratory analyses, the current findings speak for a three-factor rather than a unifactorial structure. Future research is needed to replicate this result in different tinnitus populations. Copyright © 2015 Elsevier Inc. All rights reserved.
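
    Cronbach's alpha, the internal-consistency index reported above, can be computed directly from an item-response matrix. This is a generic sketch with simulated ratings, not the THI data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
trait = rng.normal(0.0, 1.0, 500)                         # shared latent severity
items = trait[:, None] + rng.normal(0.0, 1.0, (500, 8))   # 8 noisy indicators of it
alpha = cronbach_alpha(items)                             # high: items cohere

noise_alpha = cronbach_alpha(rng.normal(0.0, 1.0, (500, 8)))  # near zero: no shared trait
```

Alpha rises with the number of items and their average inter-item correlation, which is one reason short subscales (like the 5-item catastrophic subscale above) tend to show lower values.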

  6. Analytical Approaches to Verify Food Integrity: Needs and Challenges.

    PubMed

    Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M

    2016-09-01

    A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing as well the food supply chain and future requirements for more effectively mitigating food fraud. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood of fraudsters penetrating the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and possibly sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is evident already today, for instance, in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food

  7. Two Approaches in the Lunar Libration Theory: Analytical vs. Numerical Methods

    NASA Astrophysics Data System (ADS)

    Petrova, Natalia; Zagidullin, Arthur; Nefediev, Yurii; Kosulin, Valerii

    2016-10-01

    Observation of the physical libration of the Moon and other celestial bodies is one of the astronomical methods to remotely evaluate the internal structure of a celestial body without using expensive space experiments. A review of the results obtained from physical libration studies is presented in the report. The main emphasis is placed on the description of successful lunar laser ranging for libration determination and on the methods of simulating the physical libration. As a result, estimates of the viscoelastic and dissipative properties of the lunar body and of the lunar core parameters were obtained. The core's existence was confirmed by the recent reprocessing of seismic data from the Apollo missions. Attention is paid to the physical interpretation of the phenomenon of free libration and methods of its determination. A significant part of the report is devoted to describing the practical application of the most accurate analytical tables of lunar libration to date, built by comprehensive analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421 [1]. In general, the basic outline of the report reflects the effectiveness of two approaches in libration theory: numerical and analytical solution. It is shown that the two approaches complement each other in studying the Moon in different aspects: the numerical approach provides the high accuracy of theory necessary for adequate treatment of modern high-accuracy observations, while the analytic approach allows one to see the essence of the various manifestations in the lunar rotation and to predict and interpret new effects in observations of physical libration [2]. [1] Rambaux, N., J. G. Williams, 2011, The Moon's physical librations and determination of their free modes, Celest. Mech. Dyn. Astron., 109, 85-100. [2] Petrova N., A. Zagidullin, Yu. Nefediev. Analysis of long-periodic variations of lunar libration parameters on the basis of

  8. Analytic Couple Modeling Introducing Device Design Factor, Fin Factor, Thermal Diffusivity Factor, and Inductance Factor

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    A set of convenient thermoelectric device solutions have been derived in order to capture a number of factors which are previously only resolved with numerical techniques. The concise conversion efficiency equations derived from governing equations provide intuitive and straightforward design guidelines. These guidelines allow for better device design without requiring detailed numerical modeling. The analytical modeling accounts for factors such as i) variable temperature boundary conditions, ii) lateral heat transfer, iii) temperature-variable material properties, and iv) transient operation. New dimensionless parameters, similar to the figure of merit, are introduced, including the device design factor, fin factor, thermal diffusivity factor, and inductance factor. These new device factors allow for the straightforward description of phenomena otherwise captured only with numerical work. As an example, a device design factor of 0.38, which accounts for the thermal resistance of the hot and cold shoes, can be used to calculate a conversion efficiency of 2.28, while the ideal conversion efficiency based on the figure of merit alone would be 6.15. Likewise, an ideal couple with an efficiency of 6.15 will be reduced to 5.33 when lateral heat is accounted for with a fin factor of 1.0.

  9. Accounting for context in studies of health inequalities: a review and comparison of analytic approaches.

    PubMed

    Schempf, Ashley H; Kaufman, Jay S

    2012-10-01

    A common epidemiologic objective is to evaluate the contribution of residential context to individual-level disparities by race or socioeconomic position. We reviewed analytic strategies to account for the total contribution (from both observed and unobserved factors) of environmental context to health inequalities, including conventional fixed effects (FE) and hybrid FE implemented within a random effects (RE) or a marginal model. To illustrate the results and limitations of the various analytic approaches to accounting for the total contextual component of health disparities, we used data on births nested within neighborhoods as an applied example of evaluating neighborhood confounding of racial disparities in gestational age at birth, including both a continuous and a binary outcome. Ordinary and RE models provided disparity estimates that can be substantially biased in the presence of neighborhood confounding. Both FE and hybrid FE models can account for cluster-level confounding and provide disparity estimates unconfounded by neighborhood, with the latter having greater flexibility in allowing estimation of neighborhood-level effects and intercept/slope variability when implemented in a RE specification. Given the range of models that can be implemented in a hybrid approach and the frequent goal of accounting for contextual confounding, this approach should be used more often. Published by Elsevier Inc.
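
    The contrast between a naive estimate, a within-cluster (FE) estimate, and the hybrid specification can be sketched with simulated data. The data-generating process and variable names below are ours, chosen only to show the mechanics (an exposure correlated with an unobserved cluster effect):

```python
import numpy as np

rng = np.random.default_rng(1)
n_clusters, n_per = 50, 20
cluster = np.repeat(np.arange(n_clusters), n_per)
u = rng.normal(0.0, 1.0, n_clusters)                        # unobserved neighborhood effect
x = rng.normal(0.0, 1.0, n_clusters * n_per) + u[cluster]   # exposure correlated with context
y = 0.5 * x + 2.0 * u[cluster] + rng.normal(0.0, 1.0, x.size)  # true within-effect = 0.5

def fit(cols, y):
    """OLS coefficients via least squares (intercept prepended)."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

naive = fit([x], y)[1]                      # ordinary OLS: confounded by cluster context

x_mean = (np.bincount(cluster, weights=x) / n_per)[cluster]
y_mean = (np.bincount(cluster, weights=y) / n_per)[cluster]
within = fit([x - x_mean], y - y_mean)[1]   # conventional FE: demean within clusters

# Hybrid specification: enter the cluster-demeaned exposure and its cluster
# mean as separate regressors; the demeaned coefficient reproduces the FE
# (within) estimate while leaving cluster-level terms available to model.
hybrid = fit([x - x_mean, x_mean], y)[1]
```

Because the demeaned exposure is orthogonal to its cluster mean, the hybrid coefficient matches the FE estimate exactly, while the specification can still be embedded in an RE model to estimate neighborhood-level effects, which is the flexibility the abstract highlights.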

  10. An analytically based numerical method for computing view factors in real urban environments

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at the horizontal and vertical surfaces is presented for application to real urban morphology; it is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against analytical sky-view factor estimates for ideal street canyon geometries, showing good accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
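    The validation step described above can be illustrated in two dimensions: at the center of the floor of an infinitely long street canyon of height h and width w, the sky-view factor has the closed form cos(arctan(2h/w)), which a cosine-weighted Monte Carlo ray cast should reproduce. This is the classic idealized-canyon result, not the paper's full 3-D method.

```python
import math
import numpy as np

def svf_center_analytic(h, w):
    """Sky-view factor at the centre of the floor of an infinitely long
    symmetric street canyon (2-D closed form)."""
    return math.cos(math.atan(2.0 * h / w))  # = w / sqrt(w**2 + 4*h**2)

def svf_center_monte_carlo(h, w, n=200_000, seed=1):
    """Cosine-weighted ray casting from the floor centre: a ray reaches
    the sky iff its zenith angle is small enough to clear both walls."""
    rng = np.random.default_rng(seed)
    # Sample zenith angle theta in [-pi/2, pi/2] with pdf cos(theta)/2:
    theta = np.arcsin(2.0 * rng.random(n) - 1.0)
    theta_max = math.atan((w / 2.0) / h)  # walls block larger angles
    return float(np.mean(np.abs(theta) < theta_max))

print(svf_center_analytic(1.0, 1.0), svf_center_monte_carlo(1.0, 1.0))
```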

  11. Concentrations of metals in tissues of lowbush blueberry (Vaccinium angustifolium) near a copper-nickel smelter at Sudbury, Ontario, Canada: A factor analytic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagatto, G.; Shorthouse, J.D.; Crowder, A.A.

    1993-10-01

    Ecosystems damaged by emissions from the copper-nickel smelters of Inco and Falconbridge Ltd. near Sudbury, Ontario, Canada have provided a unique opportunity to study the effects of metal particulates and sulphur dioxide fumigations on plant and animal communities. The most infamous terrain in the Sudbury region is nearest the smelters (two active and one closed), where nearly all vegetation has been destroyed and soils eroded and contaminated. However, over the past twenty years, some species of plants have developed a tolerance to polluted soils and some denuded lands have been naturally and artificially revegetated. Furthermore, a series of unique anthropogenic forests have developed away from the smelters. Several studies on the accumulation of metals in plant tissues indicate that the levels of metals are usually highest closest to the smelters. Consequently, several studies have reported high correlations between plant concentrations of certain metals and distance from the source of pollution. However, tissue metal burdens are not always correlated with distance from the emission source, suggesting that other biological and physico-chemical factors may influence tissue metal burdens in the Sudbury habitat. The present study provides information on the metal burdens in another plant, lowbush blueberry, growing both near and away from the smelters. This study assesses the apparent influence of the Sudbury smelting operations on plant tissue burdens of five additional elements, along with copper and nickel, by using a factor analytic approach. This approach allows determination of the underlying factors that govern tissue metal burdens in a polluted environment and helps refine the future direction of research in the Sudbury ecosystem. 12 refs., 2 tabs.

  12. Exploring the Different Trajectories of Analytical Thinking Ability Factors: An Application of the Second-Order Growth Curve Factor Model

    ERIC Educational Resources Information Center

    Saengprom, Narumon; Erawan, Waraporn; Damrongpanit, Suntonrapot; Sakulku, Jaruwan

    2015-01-01

    The purposes of this study were 1) to compare analytical thinking ability by testing the same sets of students five times, and 2) to develop and verify whether the analytical thinking ability of students corresponds to a second-order growth curve factor model. The samples were 1,093 eighth-grade students. The results revealed that 1) analytical thinking ability scores…

  13. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations

    PubMed Central

    Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.

    2015-01-01

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076

  14. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations.

    PubMed

    Mohanasubha, R; Chandrasekar, V K; Senthilvelan, M; Lakshmanan, M

    2015-04-08

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle-Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples.

  15. The predictive accuracy of analytical formulas and semiclassical approaches for α decay half-lives of superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Zhao, T. L.; Bao, X. J.; Guo, S. Q.

    2018-02-01

    Systematic calculations of α decay half-lives are performed using three analytical formulas and two semiclassical approaches. For the three analytical formulas, the experimental α decay half-lives and Qα values of the 66 reference nuclei have been used to obtain the coefficients. We need only four adjustable parameters to describe α decay half-lives for even-even, odd-A, and odd-odd nuclei. Comparison between the calculated values from ten analytical formulas and experimental data shows that the new universal decay law (NUDL) formula is the most accurate one for reproducing the experimental α decay half-lives of the superheavy nuclei (SHN). Meanwhile, it is found that the experimental α decay half-lives of SHN are well reproduced by the Royer formula, although it contains many parameters. The results show that the NUDL formula and the generalized liquid drop model (GLDM2) with consideration of the preformation factor give fairly equivalent results for the superheavy nuclei.
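    The coefficient-fitting step described above can be sketched with a much simpler two-parameter Geiger-Nuttall-type relation, log10(T1/2) = a·Z/√Q + b, fitted by least squares to synthetic data. This is not the NUDL or Royer formula (whose functional forms and coefficients are not given in the abstract); it only illustrates how such coefficients are obtained from reference nuclei.

```python
import numpy as np

# Synthetic "reference nuclei": charge numbers and Q_alpha values (invented).
rng = np.random.default_rng(2)
Z = rng.integers(84, 120, size=40).astype(float)
Q = rng.uniform(4.0, 12.0, size=40)  # MeV

# Generate half-lives from a Geiger-Nuttall-type law plus scatter:
a_true, b_true = 1.6, -30.0
logT = a_true * Z / np.sqrt(Q) + b_true + rng.normal(0.0, 0.3, size=40)

# Least-squares fit of the two coefficients:
A = np.column_stack([Z / np.sqrt(Q), np.ones_like(Z)])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, logT, rcond=None)
print(a_fit, b_fit)  # recovered coefficients close to the generating values
```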

  16. Stability analysis of magnetized neutron stars - a semi-analytic approach

    NASA Astrophysics Data System (ADS)

    Herbrik, Marlene; Kokkotas, Kostas D.

    2017-04-01

    We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about the stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme to polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields, testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies, whose spectrum of model systems we extend by lifting former simplifications.

  17. Correlations and analytical approaches to co-evolving voter models

    NASA Astrophysics Data System (ADS)

    Ji, M.; Xu, C.; Choi, C. W.; Hui, P. M.

    2013-11-01

    The difficulty in formulating analytical treatments of co-evolving networks is studied in light of the Vazquez-Eguíluz-San Miguel voter model (VM) and a modified VM (MVM) that introduces a random mutation of opinion as a noise in the VM. The density of active links, which are links that connect nodes of opposite opinions, is shown to be highly sensitive to both the degree k of a node and the number of active links n among the neighbors of a node. We test the validity of the formalism of analytical approaches and show explicitly that the assumptions behind the commonly used homogeneous pair approximation scheme in formulating a mean-field theory are the source of the theory's failure, owing to the strong correlations among k, n, and n². An improved approach that incorporates spatial correlation with the nearest neighbors explicitly and a random approximation for the next-nearest neighbors is formulated for the VM and the MVM, and it gives better agreement with the simulation results. We introduce an empirical approach that quantifies the correlations more accurately and gives results in good agreement with the simulation results. The work clarifies why simple mean-field theory fails and sheds light on how to analyze the correlations in the dynamic equations that are often generated in co-evolving processes.

  18. Transcription factor-based biosensors enlightened by the analyte

    PubMed Central

    Fernandez-López, Raul; Ruiz, Raul; de la Cruz, Fernando; Moncalián, Gabriel

    2015-01-01

    Whole cell biosensors (WCBs) have multiple applications for environmental monitoring, detecting a wide range of pollutants. WCBs depend critically on the sensitivity and specificity of the transcription factor (TF) used to detect the analyte. We describe the mechanism of regulation and the structural and biochemical properties of TF families that are used, or could be used, for the development of environmental WCBs. Focusing on the chemical nature of the analyte, we review TFs that respond to aromatic compounds (XylS-AraC, XylR-NtrC, and LysR), metal ions (MerR, ArsR, DtxR, Fur, and NikR) or antibiotics (TetR and MarR). Analyzing the structural domains involved in DNA recognition, we highlight the similarities in the DNA binding domains (DBDs) of these TF families. In contrast to the DBDs, the wide range of analytes detected by TFs results in a diversity of structures at the effector binding domain. The modular architecture of TFs opens the possibility of engineering TFs with hybrid DNA and effector specificities. Yet, the lack of a crisp correlation between structural domains and specific functions makes this a challenging task. PMID:26191047

  19. Transcription factor-based biosensors enlightened by the analyte.

    PubMed

    Fernandez-López, Raul; Ruiz, Raul; de la Cruz, Fernando; Moncalián, Gabriel

    2015-01-01

    Whole cell biosensors (WCBs) have multiple applications for environmental monitoring, detecting a wide range of pollutants. WCBs depend critically on the sensitivity and specificity of the transcription factor (TF) used to detect the analyte. We describe the mechanism of regulation and the structural and biochemical properties of TF families that are used, or could be used, for the development of environmental WCBs. Focusing on the chemical nature of the analyte, we review TFs that respond to aromatic compounds (XylS-AraC, XylR-NtrC, and LysR), metal ions (MerR, ArsR, DtxR, Fur, and NikR) or antibiotics (TetR and MarR). Analyzing the structural domains involved in DNA recognition, we highlight the similarities in the DNA binding domains (DBDs) of these TF families. In contrast to the DBDs, the wide range of analytes detected by TFs results in a diversity of structures at the effector binding domain. The modular architecture of TFs opens the possibility of engineering TFs with hybrid DNA and effector specificities. Yet, the lack of a crisp correlation between structural domains and specific functions makes this a challenging task.

  20. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  1. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  2. Cryogenic parallel, single phase flows: an analytical approach

    NASA Astrophysics Data System (ADS)

    Eichhorn, R.

    2017-02-01

    Managing the cryogenic flows inside a state-of-the-art accelerator cryomodule has become a demanding endeavour: In order to build highly efficient modules, all heat transfers are usually intercepted at various temperatures. For a multi-cavity module, operated at 1.8 K, this requires intercepts at 4 K and at 80 K at different locations with sometimes strongly varying heat loads which for simplicity reasons are operated in parallel. This contribution will describe an analytical approach, based on optimization theories.

  3. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    PubMed

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

    To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children age 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels were statistically significant in the test groups [25(OH)D <10, <15 and <20 ng/mL] compared with the reference group [25(OH)D >25 ng/mL]. ANCOVA also showed that only children with severe vitamin D deficiency [25(OH)D <10 ng/mL] had significantly higher parathyroid hormone levels (Δ = 15; P = .0334). Hockey stick model regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
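    The "hockey stick" regression used above can be sketched as a grid search over candidate breakpoints, fitting a line that slopes below the breakpoint and is flat above it. The data below are synthetic; only the 27 ng/mL SBP breakpoint is taken from the abstract, and the parameterisation is one common choice, not necessarily the study's exact model.

```python
import numpy as np

def fit_hockey_stick(x, y, candidate_breaks):
    """Fit y = b0 + b1 * min(x - c, 0): sloped below breakpoint c, flat
    above it. Returns the least-squares best (sse, c, [b0, b1])."""
    best = None
    for c in candidate_breaks:
        X = np.column_stack([np.ones_like(x), np.minimum(x - c, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ beta) ** 2))
        if best is None or sse < best[0]:
            best = (sse, c, beta)
    return best

# Hypothetical data: SBP rises as 25(OH)D falls below ~27 ng/mL.
rng = np.random.default_rng(3)
vitd = rng.uniform(5.0, 40.0, size=150)                        # ng/mL
sbp = 110.0 - 1.0 * np.minimum(vitd - 27.0, 0.0) + rng.normal(0, 3.0, 150)

sse, c_hat, beta_hat = fit_hockey_stick(vitd, sbp, np.arange(10.0, 35.0, 0.5))
print(c_hat, beta_hat)  # breakpoint near 27; negative slope below it
```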

  4. Cocontraction of pairs of antagonistic muscles: analytical solution for planar static nonlinear optimization approaches.

    PubMed

    Herzog, W; Binding, P

    1993-11-01

    It has been stated in the literature that static, nonlinear optimization approaches cannot predict coactivation of pairs of antagonistic muscles; however, numerical solutions of such approaches have predicted coactivation of pairs of one-joint and multijoint antagonists. Analytical support for either finding is not available in the literature for systems containing more than one degree of freedom. The purpose of this study was to investigate analytically the possibility of cocontraction of pairs of antagonistic muscles using a static nonlinear optimization approach for a multidegree-of-freedom, two-dimensional system. Analytical solutions were found using the Karush-Kuhn-Tucker conditions, which were necessary and sufficient for optimality in this problem. The results show that cocontraction of pairs of one-joint antagonistic muscles is not possible, whereas cocontraction of pairs of multijoint antagonists is. These findings suggest that cocontraction of pairs of antagonistic muscles may be an "efficient" way to accomplish many movement tasks.
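    The one-joint result above can be sketched with a minimal static nonlinear optimization: minimize a cubed muscle-stress cost for an agonist/antagonist pair subject to a required joint moment. Moment arms, PCSAs, and the cost exponent are invented for illustration; the point is that the optimum drives the one-joint antagonist's force to zero, i.e. no cocontraction is predicted.

```python
import numpy as np
from scipy.optimize import minimize

r = np.array([0.05, -0.04])   # moment arms (m); antagonist acts negatively
pcsa = np.array([10.0, 8.0])  # physiological cross-sectional areas (cm^2)
M_req = 2.0                   # required net joint moment (N m)

# Cubed-stress objective, a common choice in static optimization studies:
objective = lambda F: np.sum((F / pcsa) ** 3)
cons = {"type": "eq", "fun": lambda F: r @ F - M_req}

res = minimize(objective, x0=[10.0, 1.0], bounds=[(0.0, None)] * 2,
               constraints=[cons], method="SLSQP")
F_opt = res.x
print(F_opt)  # antagonist force driven to ~0: no one-joint cocontraction
```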

  5. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that the pilot acceptance level, or opinion rating, of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development, and validation.

  6. Analytical approach on the stiffness of MR fluid filled spring

    NASA Astrophysics Data System (ADS)

    Sikulskyi, Stanislav; Kim, Daewon

    2017-04-01

    A solid mechanical spring generally exhibits uniform stiffness. This paper studies a mechanical spring filled with magnetorheological (MR) fluid to achieve controllable stiffness. The hollow spring filled with MR fluid is subjected to a controlled magnetic field in order to change the viscosity of the MR fluid and thereby the overall stiffness of the spring. The MR fluid is modeled as a Bingham viscoplastic linear material. The goal of this research is to study the feasibility of such a spring system by analytically computing the effects of the MR fluid on overall spring stiffness. For this purpose, spring mechanics and MR fluid behavior are studied to increase the accuracy of the analysis. Numerical simulations are also performed to motivate assumptions that simplify the calculations in the analytical part. The accuracy of the present approach is validated by comparing the analytical results to previously reported experimental results. Overall stiffness variations of the spring are also discussed for different spring designs.

  7. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of the equivalence acceptance criterion and quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of the test product must fall in the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on the Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two-one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed.
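    The central claim, that correlated reference lots make S_R underestimate σ_R and so shrink the 1.5σ_R margin, can be checked with a small simulation. AR(1) correlation and all parameter values below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma, rho, n_lots, n_sim = 1.0, 0.6, 10, 5000

def ar1_lots():
    """One batch of reference lots following a stationary AR(1) process
    with marginal standard deviation `sigma` and lag-1 correlation `rho`."""
    e = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), size=n_lots)
    x = np.empty(n_lots)
    x[0] = rng.normal(0.0, sigma)
    for t in range(1, n_lots):
        x[t] = rho * x[t - 1] + e[t]
    return x

# Distribution of the sample SD across simulated batches of reference lots:
s_r = np.array([ar1_lots().std(ddof=1) for _ in range(n_sim)])
print(f"mean S_R = {s_r.mean():.3f} vs true sigma_R = {sigma}")
# mean S_R falls noticeably below sigma_R, shrinking the 1.5*sigma_R margin
```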

  8. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Onchorynchus mykiss): reaction norm and factor analytic models.

    PubMed

    Sae-Lim, Panya; Komen, Hans; Kause, Antti; Mulder, Han A

    2014-02-26

    Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. 
Day*Degree and photoperiod were identified as environmental

  9. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Onchorynchus mykiss): reaction norm and factor analytic models

    PubMed Central

    2014-01-01

    Background Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Methods Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. Results The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. Conclusions Day*Degree and

  10. An analytic approach for the study of pulsar spindown

    NASA Astrophysics Data System (ADS)

    Chishtie, F. A.; Zhang, Xiyang; Valluri, S. R.

    2018-07-01

    In this work we develop an analytic approach to study pulsar spindown. We use the monopolar spindown model by Alvarez and Carramiñana (2004 Astron. Astrophys. 414 651–8), which assumes an inverse linear law of magnetic field decay of the pulsar, to extract an all-order formula for the spindown parameters using the Taylor series representation of Jaranowski et al (1998 Phys. Rev. D 58 6300). We further extend the analytic model to incorporate the quadrupole term that accounts for the emission of gravitational radiation, and obtain expressions for the period P and frequency f in terms of transcendental equations. We derive the analytic solution for pulsar frequency spindown in the absence of glitches. We examine the different cases that arise in the analysis of the roots in the solution of the non-linear differential equation for pulsar period evolution. We provide expressions for the spin-down parameters and find that the spindown values are in reasonable agreement with observations. A detection of gravitational waves from pulsars will be the next landmark in the field of multi-messenger gravitational wave astronomy.

  11. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    PubMed

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

An analytical method is developed for studying the light distribution properties at the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. Mirror transformation matrices and a special algorithm for removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing, and four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and of the Gaussian light source. The analytical approach will be useful for designing and selecting hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.

  12. Analytical Approach Validation for the Spin-Stabilized Satellite Attitude

    NASA Technical Reports Server (NTRS)

    Zanardi, Maria Cecilia F. P. S.; Garcia, Roberta Veloso; Kuga, Helio Koiti

    2007-01-01

An analytical approach for spin-stabilized spacecraft attitude prediction is presented for the influence of residual magnetic torques on a satellite in an elliptical orbit. Assuming a quadrupole model for the Earth's magnetic field, an analytical averaging method is applied to obtain the mean residual torque over each orbital period. The orbit mean anomaly is used to compute the average components of the residual torque in the spacecraft body frame reference system. The theory is developed for time variations in the orbital elements, giving rise to many curvature integrals. It is observed that the residual magnetic torque has no component along the spin axis. The inclusion of this torque in the rotational motion differential equations of a spin-stabilized spacecraft yields conditions for deriving an analytical solution. The solution shows that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and drift of the spacecraft's spin axis. The theory has been applied to the Brazilian spin-stabilized satellites, which are well suited for verification and comparison of the theory with the data generated and processed by the Satellite Control Center of the Brazil National Research Institute. The results show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the performance of the attitude determination system of the Satellite Control Center of the Brazil National Research Institute.

  13. FACTOR ANALYTIC MODELS OF CLUSTERED MULTIVARIATE DATA WITH INFORMATIVE CENSORING

    EPA Science Inventory

    This paper describes a general class of factor analytic models for the analysis of clustered multivariate data in the presence of informative missingness. We assume that there are distinct sets of cluster-level latent variables related to the primary outcomes and to the censorin...

  14. An approach for environmental risk assessment of engineered nanomaterials using Analytical Hierarchy Process (AHP) and fuzzy inference rules.

    PubMed

    Topuz, Emel; van Gestel, Cornelis A M

    2016-01-01

The use of Engineered Nanoparticles (ENPs) in consumer products is relatively new, and there is a need to conduct environmental risk assessment (ERA) to evaluate their impacts on the environment. However, alternative approaches are required for the ERA of ENPs because of the huge gap in data and knowledge compared to conventional pollutants, and because of their unique properties, which make it difficult to apply existing approaches. This study proposes an ERA approach for ENPs that integrates the Analytical Hierarchy Process (AHP) and fuzzy inference models, which provide a systematic evaluation of risk factors and reduce uncertainty about the data and information, respectively. Risk is assumed to be the combination of occurrence likelihood, exposure potential, and toxic effects in the environment. A hierarchy was established to evaluate the sub-factors of these components. Evaluation was made with fuzzy numbers to reduce uncertainty and to incorporate expert judgements. The overall score of each component was combined through fuzzy inference rules based on expert judgements. The proposed approach reports the risk class and its membership degree, such as Minor (0.7); the results are therefore precise and helpful for determining risk management strategies. Moreover, priority weights, calculated by comparing the risk factors according to their importance for the risk, enable users to understand which factor most affects the risk. The proposed approach was applied to Ag (two nanoparticles with different coatings) and TiO2 nanoparticles in different case studies. The results verified the proposed benefits of the approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
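As a rough sketch of the AHP step described above, the snippet below derives priority weights from a pairwise comparison matrix using the common geometric-mean approximation. The 3x3 matrix compares the three risk components named in the abstract, but its entries are hypothetical, not taken from the study.

```python
import math

# Hypothetical pairwise comparisons: occurrence likelihood vs. exposure
# potential vs. toxic effects (Saaty-style 1-9 scale, reciprocal matrix).
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_weights(matrix):
    """Priority weights: row geometric means, normalized to sum to 1."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

weights = ahp_weights(comparisons)  # largest weight = most important factor
```

The eigenvector method is the textbook alternative; the geometric-mean approximation is used here only to keep the sketch dependency-free (`math.prod` requires Python 3.8+).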

  15. Data analytics approach to create waste generation profiles for waste management and collection.

    PubMed

    Niska, Harri; Serkkola, Ari

    2018-04-30

Extensive monitoring data on waste generation are increasingly collected in order to implement cost-efficient and sustainable waste management operations. In addition, geospatial data from different registries of society are becoming freely available. Novel data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as a basis for waste management and collection. In this paper, a data-driven approach based on the self-organizing map (SOM) and the k-means algorithm is developed for creating a set of waste generation type profiles. The approach is demonstrated using extensive container-level waste weighing data collected in the metropolitan area of Helsinki, Finland. The results highlight the potential of advanced data analytic approaches in producing more detailed waste generation information, e.g., as a basis for tailored feedback services for waste producers and for the planning and optimization of waste collection and recycling. Copyright © 2018 Elsevier Ltd. All rights reserved.
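The profiling step can be illustrated with a stripped-down stand-in: a minimal one-dimensional k-means grouping hypothetical container weights into two profiles. The real pipeline first projects the data through a SOM; that stage is omitted here, and all numbers are invented.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal 1-D k-means; a toy stand-in for the SOM + k-means pipeline."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical weekly container weights (kg) from two waste-generation types.
container_weights = [12, 14, 13, 15, 80, 78, 82, 79]
profiles = kmeans(container_weights, 2)  # one centroid per profile
```

Each centroid then serves as a "waste generation type profile" against which individual containers can be compared.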

  16. Determination of immersion factors for radiance sensors in marine and inland waters: a semi-analytical approach using refractive index approximation

    NASA Astrophysics Data System (ADS)

    Dev, Pravin J.; Shanmugam, P.

    2016-05-01

Underwater radiometers are generally calibrated in air using a standard source. Immersion factors are required for these radiometers to account for the change in in-water measurements relative to in-air measurements due to the different refractive index of the medium. The immersion factors previously determined for the RAMSES series of commercial radiometers manufactured by TriOS are applicable to clear oceanic waters. In typical inland and turbid productive coastal waters, these experimentally determined immersion factors yield significantly large errors in water-leaving radiances (Lw) and hence remote sensing reflectances (Rrs). To overcome this limitation, a semi-analytical method based on the refractive index approximation is proposed in this study, with the aim of obtaining reliable Lw and Rrs from RAMSES radiometers for turbid and productive waters within coastal and inland water environments. We also briefly show the effects of pure-water immersion factors (Ifw) and the newly derived If on Lw and Rrs for clear and turbid waters. Remaining issues beyond the immersion factor coefficients, such as transmission and the air-water and water-air Fresnel reflectances, are also discussed.

  17. An analytical approach to separate climate and human contributions to basin streamflow variability

    NASA Astrophysics Data System (ADS)

    Li, Changbin; Wang, Liuming; Wanrui, Wang; Qi, Jiaguo; Linshan, Yang; Zhang, Yuan; Lei, Wu; Cui, Xia; Wang, Peng

    2018-04-01

Climate variability and anthropogenic regulation are two interwoven factors in the ecohydrologic system of large basins. Understanding the roles that these two factors play under various hydrologic conditions is of great significance for basin hydrology and sustainable water utilization. In this study, we present an analytical approach, based on coupling the water balance method with the Budyko hypothesis, to derive effectiveness coefficients (ECs) of climate change, as a way to disentangle the contributions of climate change and human activities to the variability of river discharge under different hydro-transitional situations. The climate-dominated streamflow change (ΔQc) from the EC approach was compared with estimates from the elasticity method and the sensitivity index. The results suggest that the EC approach is valid and applicable for hydrologic studies at the large basin scale. Analyses of various scenarios revealed that the contributions of climate change and human activities to river discharge variation differed among the regions of the study area. Over the past several decades, climate change dominated hydro-transitions from dry to wet, while human activities played the key role in the reduction of streamflow during wet-to-dry periods. The remarkable decline of discharge upstream was mainly due to human interventions, although climate contributed more to increasing runoff during dry periods in the semi-arid downstream. The induced effectiveness on streamflow changes indicated contribution ratios of 49% for climate and 51% for human activities at the basin scale from 1956 to 2015. This simple, mathematically derived approach, together with the case example of temporal segmentation and spatial zoning, could help in understanding the variation of river discharge in more detail at the large basin scale against the background of climate change and human regulation.
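The attribution logic reduces to simple arithmetic once the climate-driven change is estimated: the human share is the residual of the observed total change. The numbers below are illustrative placeholders chosen to mirror the reported 49%/51% split, not values from the study.

```python
# Hypothetical basin-scale streamflow changes (units arbitrary, e.g. 1e8 m^3):
dQ_total = -10.0    # observed change in discharge over the study period
dQ_climate = -4.9   # climate-dominated change, e.g. from an EC-style estimate
dQ_human = dQ_total - dQ_climate   # residual attributed to human activities

climate_share = dQ_climate / dQ_total   # ~0.49
human_share = dQ_human / dQ_total       # ~0.51
```

The two shares sum to one by construction, which is why such attribution studies report complementary percentages.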

  18. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.

  19. Multidimensional assessment of awareness in early-stage dementia: a cluster analytic approach.

    PubMed

    Clare, Linda; Whitaker, Christopher J; Nelis, Sharon M; Martyr, Anthony; Markova, Ivana S; Roth, Ilona; Woods, Robert T; Morris, Robin G

    2011-01-01

Research on awareness in dementia has yielded variable and inconsistent associations between awareness and other factors. This study examined awareness using a multidimensional approach and applied cluster analytic techniques to identify associations between level of awareness and other variables. Participants were 101 individuals with early-stage dementia (PwD) and their carers. Explicit awareness was assessed at 3 levels: performance monitoring in relation to memory; evaluative judgement in relation to memory, everyday activities, and socio-emotional functioning; and metacognitive reflection in relation to the experience and impact of the condition. Implicit awareness was assessed with an emotional Stroop task. Different measures of explicit awareness were related only to a limited extent. Cluster analysis yielded 3 groups with differing degrees of explicit awareness; these groups showed no differences in implicit awareness. Lower explicit awareness was associated with greater age, lower MMSE scores, poorer recall and naming scores, lower anxiety, and greater carer stress. Multidimensional assessment offers a more robust approach to classifying PwD according to level of awareness and hence to examining correlates and predictors of awareness. Copyright © 2011 S. Karger AG, Basel.

  20. Analytic study of orbiter landing profiles

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1981-01-01

A broad survey of possible orbiter landing configurations was made with the specific goal of defining boundaries for the landing task. The results suggest that the center of the corridor between the marginal and routine boundaries represents a more or less optimal preflare condition for regular operations. The various constraints used to define the boundaries are based largely on qualitative judgements from earlier flight experience with the X-15 and lifting body research aircraft. The results should serve as useful background for expanding and validating landing simulation programs. The analytic approach offers a particular advantage in identifying trends due to the systematic variation of factors such as vehicle weight, load factor, approach speed, and aim point. Limitations, such as a constant load factor during the flare and a fixed gear deployment time interval, can be removed by increasing the flexibility of the computer program. This analytic definition of orbiter landing profiles may suggest additional studies, including more configurations or more comparisons of landing profiles within and beyond the corridor boundaries.

  1. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  2. An analytic approach to cyber adversarial dynamics

    NASA Astrophysics Data System (ADS)

    Sweeney, Patrick; Cybenko, George

    2012-06-01

To date, cyber security investment by both the government and commercial sectors has been largely driven by the myopic best response of players to the actions of their adversaries and their perception of the adversarial environment. However, current work in applying traditional game theory to cyber operations typically assumes that games exist with prescribed moves, strategies, and payoffs. This paper presents an analytic approach to characterizing the more realistic cyber adversarial metagame that we believe is being played. Examples show that understanding the dynamic metagame provides opportunities to exploit an adversary's anticipated attack strategy. A dynamic version of a graph-based attack-defend game is introduced, and a simulation shows how an optimal strategy can be selected for success in the dynamic environment.

  3. The areal reduction factor: A new analytical expression for the Lazio Region in central Italy

    NASA Astrophysics Data System (ADS)

    Mineo, C.; Ridolfi, E.; Napolitano, F.; Russo, F.

    2018-05-01

For the study and modeling of hydrological phenomena, in both urban and rural areas, a proper estimation of the areal reduction factor (ARF) is crucial. In this paper, we estimated the ARF from observed rainfall data as the ratio between the average rainfall occurring over a specific area and the point rainfall. We then compared the obtained ARF values with some of the most widespread empirical approaches in the literature, which are used when rainfall observations are not available. The results highlight that the literature formulations can lead to a substantial over- or underestimation of the ARF estimated from observed data. These findings can have severe consequences, especially in the design of hydraulic structures, where empirical formulations are extensively applied. The aim of this paper is to present a new analytical relationship, with an explicit dependence on rainfall duration and area, that better represents the ARF-area trend over the study area. The analytical curve presented here can be applied to estimate ARF values for design purposes. The test study area is the Lazio Region (central Italy).
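The observed-data definition of the ARF quoted above is just a ratio of areal-average to point rainfall; a minimal sketch with hypothetical gauge totals (all numbers invented):

```python
# Hypothetical storm totals (mm) at gauges spread over the area; the
# "point rainfall" is taken here as the maximum gauge value.
gauge_totals = [42.0, 38.5, 35.0, 30.5, 28.0]
areal_average = sum(gauge_totals) / len(gauge_totals)
point_rainfall = max(gauge_totals)

arf = areal_average / point_rainfall  # < 1: averaging over area dilutes the peak
```

Empirical ARF formulas used in design practice parameterize this ratio as a function of area and storm duration; the paper's point is that such formulas can diverge substantially from ratios computed from observations.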

  4. An Analytical Dynamics Approach to the Control of Mechanical Systems

    NASA Astrophysics Data System (ADS)

    Mylapilli, Harshavardhan

A novel approach to the control of nonlinear mechanical systems is presented in this study. The approach is inspired by recent results in analytical dynamics that deal with the theory of constrained motion. The control requirements on the dynamical system are viewed from an analytical dynamics perspective, and the theory of constrained motion is used to recast these control requirements as constraints on the dynamical system. Explicit closed-form expressions for the generalized nonlinear control forces are obtained by using the fundamental equation of mechanics. The control so obtained is optimal at each instant of time and causes the constraints to be exactly satisfied. No linearizations or approximations of the nonlinear dynamical system are made, and no a priori structure is imposed on the nonlinear controller. Three examples dealing with highly nonlinear, complex dynamical systems chosen from diverse areas of discrete and continuum mechanics are presented to demonstrate the control approach. The first example deals with the energy control of underactuated inhomogeneous nonlinear lattices (or chains), the second with the synchronization of the motion of multiple coupled slave gyros with that of a master gyro, and the final example with the control of incompressible hyperelastic rubber-like thin cantilever beams. Numerical simulations accompanying these examples show the ease, simplicity, and efficacy with which the control methodology can be applied and the accuracy with which the desired control objectives can be met.

  5. The Identification and Significance of Intuitive and Analytic Problem Solving Approaches Among College Physics Students

    ERIC Educational Resources Information Center

    Thorsland, Martin N.; Novak, Joseph D.

    1974-01-01

    Described is an approach to assessment of intuitive and analytic modes of thinking in physics. These modes of thinking are associated with Ausubel's theory of learning. High ability in either intuitive or analytic thinking was associated with success in college physics, with high learning efficiency following a pattern expected on the basis of…

  6. Exact analytical approach for six-degree-of-freedom measurement using image-orientation-change method.

    PubMed

    Tsai, Chung-Yu

    2012-04-01

An exact analytical approach is proposed for measuring the six-degree-of-freedom (6-DOF) motion of an object using the image-orientation-change (IOC) method. The proposed measurement system comprises two reflector systems, each consisting of two reflectors and one position sensing detector (PSD). The IOCs of the object in the two reflector systems are described using merit functions determined from the respective PSD readings taken before and after the motion occurs. The three rotation variables are then determined analytically from the eigenvectors of the corresponding merit functions. Once the three rotation variables are determined, the translation equations reduce to a linear form, so the solution for the three translation variables can also be determined analytically. As a result, the motion transformation matrix describing the 6-DOF motion of the object is fully determined. The validity of the proposed approach is demonstrated by means of an illustrative example.

  7. The Prioritization of Clinical Risk Factors of Obstructive Sleep Apnea Severity Using Fuzzy Analytic Hierarchy Process

    PubMed Central

    Maranate, Thaya; Pongpullponsak, Adisak; Ruttanaumpawan, Pimon

    2015-01-01

Recently, there has been a shortage of sleep laboratories that can accommodate patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes, particularly in patients with severe obstructive sleep apnea (OSA). For this reason, prioritization of the polysomnography (PSG) queue should be based on disease severity. To date, there have been conflicting data on whether clinical information can predict OSA severity. A total of 1,042 suspected OSA patients underwent a diagnostic PSG study at the Siriraj Sleep Center during 2010-2011. In all, 113 variables were obtained from sleep questionnaires and anthropometric measurements. The 19 groups of clinical risk factors, consisting of 42 variables, were categorized by OSA severity. This study aimed to rank these factors by employing a Fuzzy Analytic Hierarchy Process approach based on a normalized weight vector. The results revealed that the first-ranked clinical risk factor in the Severe, Moderate, Mild, and No OSA groups was nighttime symptoms. The overall sensitivity/specificity of the approach for these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that urgent PSG appointments should take into account the clinical risk factors of the Severe OSA group. In addition, screening Mild from No OSA patients in a sleep center setting using symptoms during sleep is also recommended (sensitivity = 87.12% and specificity = 72.22%). PMID:26221183

  8. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145

  9. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  10. Pre-analytical Factors Influence Accuracy of Urine Spot Iodine Assessment in Epidemiological Surveys.

    PubMed

    Doggui, Radhouene; El Ati-Hellal, Myriam; Traissac, Pierre; El Ati, Jalila

    2018-03-26

Urinary iodine concentration (UIC) is commonly used to assess the iodine status of subjects in epidemiological surveys. As pre-analytical factors are an important source of measurement error and studies of this phase are scarce, our objective was to assess the influence of urine sampling conditions on UIC, i.e., whether the child ate breakfast or not, the urine void rank of the day, and the time span between the last meal and urine collection. A nationwide, two-stage, stratified, cross-sectional study including 1560 children (6-12 years) was performed in 2012. UIC was determined by the Sandell-Kolthoff method. Pre-analytical factors were assessed from children's mothers by questionnaire. Associations between iodine status and pre-analytical factors were adjusted for one another and for socio-economic characteristics by multivariate linear and multinomial regression models (RPR: relative prevalence ratio). Skipping breakfast prior to morning urine sampling decreased UIC by 40 to 50 μg/L, and the proportion of UIC < 100 μg/L was higher among children who had skipped breakfast (RPR = 3.2[1.0-10.4]). In unadjusted analyses, UIC was lower among children sampled more than 5 h after their last meal. UIC decreased with the rank of the urine void (e.g., first vs. second, P < 0.001); the proportion of UIC < 100 μg/L was also greater among 4th-rank samples (vs. second, RPR = 2.1[1.1-4.0]). Subjects' breakfast status and urine void rank should be accounted for when assessing iodine status. Providing recommendations to standardize pre-analytical factors is a key step toward improving the accuracy and comparability of survey results for assessing iodine status from spot urine samples. These recommendations should be evaluated by future research.

  11. Satellite Orbit Under Influence of a Drag - Analytical Approach

    NASA Astrophysics Data System (ADS)

    Martinović, M. M.; Šegan, S. D.

    2017-12-01

This report studies changes in the orbital elements of artificial Earth satellites under the influence of atmospheric drag. In order to make the results applicable to many future cases, an analytical interpretation of the orbital element perturbations is given via useful, but very long, expressions. The development is based on the TD88 air density model, recently upgraded with some additional terms. Some expressions and formulae were developed with the computer algebra system Mathematica and tested in hypothetical cases. The results are in good agreement with an iterative (numerical) approach.

  12. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by the dependence of their response on radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple, and completely analytical algorithm for the calculation of the relative TL efficiency as a function of ion charge Z and energy E is presented. The detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. The calculated efficiency values were then implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field were performed and verified against experimental data.

  13. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2014-01-01

Monitoring the misuse of drugs and the abuse of substances and methods that potentially or evidently improve athletic performance by means of analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, several of which present a temptation for sportsmen and women due to the assumed or attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing published between October 2012 and September 2013 is summarized and reviewed, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.

  14. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Factor Analytic Approach to Transitive Text Mining using Medline Descriptors

    NASA Astrophysics Data System (ADS)

    Stegmann, J.; Grohmann, G.

Matrix decomposition methods were applied to examples of noninteractive literature sets sharing implicit relations. Document-by-term matrices were created from downloaded PubMed literature sets, the terms being the Medical Subject Headings (MeSH descriptors) assigned to the documents. The loadings of the factors derived from singular value or eigenvalue matrix decomposition were sorted by absolute value and then inspected for the positions of terms relevant to the discovery of hidden connections. It was found that only a small number of factors had to be screened to find key terms in close proximity, separated by only a few terms.
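The decomposition step can be sketched in miniature: build a tiny document-by-term matrix, form the Gram matrix A^T A, and extract the leading factor's term loadings by power iteration, a pure-Python stand-in for the SVD/eigendecomposition the paper uses. The matrix and the MeSH-like term names below are invented for illustration, not the study's data.

```python
# Toy document-by-term matrix (rows = documents, columns = terms).
terms = ["Fish Oils", "Platelet Aggregation", "Raynaud Disease", "Blood Viscosity"]
A = [
    [1, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]

def leading_term_loadings(A, iters=200):
    """Dominant eigenvector of G = A^T A via power iteration."""
    n = len(A[0])
    G = [[sum(row[i] * row[j] for row in A) for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(G[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

loadings = leading_term_loadings(A)
# Sort terms by absolute loading, as the abstract describes.
ranked = sorted(zip(terms, loadings), key=lambda t: -abs(t[1]))
```

Terms that load strongly on the same factor despite never co-occurring in a single document are the candidates for "hidden connections" in transitive (Swanson-style) literature-based discovery.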

  16. ASVCP quality assurance guidelines: control of general analytical factors in veterinary laboratories.

    PubMed

    Flatland, Bente; Freeman, Kathy P; Friedrichs, Kristen R; Vap, Linda M; Getzy, Karen M; Evans, Ellen W; Harr, Kendal E

    2010-09-01

    Owing to lack of governmental regulation of veterinary laboratory performance, veterinarians ideally should demonstrate a commitment to self-monitoring and regulation of laboratory performance from within the profession. In response to member concerns about quality management in veterinary laboratories, the American Society for Veterinary Clinical Pathology (ASVCP) formed a Quality Assurance and Laboratory Standards (QAS) committee in 1996. This committee recently published updated and peer-reviewed Quality Assurance Guidelines on the ASVCP website. The Quality Assurance Guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports on 1) general analytic factors for veterinary laboratory performance and comparisons, 2) hematology and hemostasis, and 3) clinical chemistry, endocrine assessment, and urinalysis. This report documents recommendations for control of general analytical factors within veterinary clinical laboratories and is based on section 2.1 (Analytical Factors Important In Veterinary Clinical Pathology, General) of the newly revised ASVCP QAS Guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimum guidelines for quality assurance and quality control for veterinary laboratory testing. It is hoped that these guidelines will provide a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. ©2010 American Society for Veterinary Clinical Pathology.

  17. Gravity Field Recovery from the Cartwheel Formation by the Semi-analytical Approach

    NASA Astrophysics Data System (ADS)

    Li, Huishu; Reubelt, Tilo; Antoni, Markus; Sneeuw, Nico; Zhong, Min; Zhou, Zebing

    2016-04-01

    Past and current gravimetric satellite missions have contributed substantially to our knowledge of the Earth's gravity field. Nevertheless, several geoscience disciplines push for even higher requirements on accuracy, homogeneity, and temporal and spatial resolution of the Earth's gravity field. Apart from better instruments or new observables, alternative satellite formations could improve the signal and error structure. With respect to other methods, one significant advantage of the semi-analytical approach is its effective pre-mission error assessment for gravity field missions. The semi-analytical approach builds a linear analytical relationship between the Fourier spectrum of the observables and the spherical harmonic spectrum of the gravity field. The spectral link between observables and gravity field parameters is given by the transfer coefficients, which constitute the observation model. In connection with a stochastic model, it can be used for pre-mission error assessment of gravity field missions. The cartwheel formation is formed by two satellites on elliptic orbits in the same plane. The time-dependent ranging will be considered in the transfer coefficients via convolution, including the series expansion of the eccentricity functions. The transfer coefficients are applied to assess the error patterns caused by different orientations of the cartwheel for range-rate and range acceleration. This work will present the isotropy and magnitude of the formal errors of the gravity field coefficients for different orientations of the cartwheel.

  18. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches.

    PubMed

    Van Liew, Charles; Santoro, Maya S; Edwards, Larissa; Kang, Jeremy; Cronan, Terry A

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring [PAF], and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was preferable, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed.
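
    Of the retention criteria listed, Horn's parallel analysis carried the decision. A minimal sketch of PA follows, with synthetic data standing in for the WCQ item responses; the item count and three-factor structure below are invented assumptions, and only the sample size mirrors the study.

```python
# Horn's parallel analysis (PA): retain factors whose observed eigenvalues
# exceed the mean eigenvalues of random data with the same dimensions.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_items = 501, 12                      # 501 matches the sample; 12 items are invented
latent = rng.normal(size=(n_obs, 3))          # three synthetic underlying factors
data = latent @ rng.normal(size=(3, n_items)) + rng.normal(size=(n_obs, n_items))

def eigenvalues(x):
    """Descending eigenvalues of the correlation matrix of x."""
    return np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]

observed = eigenvalues(data)

# Mean eigenvalues over repeated random datasets of identical shape.
random_eigs = np.mean(
    [eigenvalues(rng.normal(size=(n_obs, n_items))) for _ in range(100)],
    axis=0,
)

n_retain = int(np.sum(observed > random_eigs))
print("factors retained by parallel analysis:", n_retain)
```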

  19. Assessing the Structure of the Ways of Coping Questionnaire in Fibromyalgia Patients Using Common Factor Analytic Approaches

    PubMed Central

    Edwards, Larissa; Kang, Jeremy

    2016-01-01

    The Ways of Coping Questionnaire (WCQ) is a widely used measure of coping processes. Despite its use in a variety of populations, there has been concern about the stability and structure of the WCQ across different populations. This study examines the factor structure of the WCQ in a large sample of individuals diagnosed with fibromyalgia. The participants were 501 adults (478 women) who were part of a larger intervention study. Participants completed the WCQ at their 6-month assessment. Foundational factoring approaches were performed on the data (i.e., maximum likelihood factoring [MLF], iterative principal factoring [IPF], principal axis factoring [PAF], and principal components factoring [PCF]) with oblique oblimin rotation. Various criteria were evaluated to determine the number of factors to be extracted, including Kaiser's rule, scree plot visual analysis, 5 and 10% unique variance explained, 70 and 80% communal variance explained, and Horn's parallel analysis (PA). It was concluded that the 4-factor PAF solution was preferable, based on PA extraction and the fact that this solution minimizes nonvocality and multivocality. The present study highlights the need for more research focused on defining the limits of the WCQ and the degree to which population-specific and context-specific subscale adjustments are needed. PMID:28070160

  20. A Confirmatory Approach to Examining the Factor Structure of the Strengths and Difficulties Questionnaire (SDQ): A Large Scale Cohort Study

    ERIC Educational Resources Information Center

    Niclasen, Janni; Skovgaard, Anne Mette; Andersen, Anne-Marie Nybo; Somhovd, Mikael Julius; Obel, Carsten

    2013-01-01

    The aim of this study was to examine the factor structure of the Strengths and Difficulties Questionnaire (SDQ) using a Structural Confirmatory Factor Analytic approach. The Danish translation of the SDQ was distributed to 71,840 parents and teachers of 5-7 and 10-12-year-old boys and girls from four large scale cohorts. Three theoretical models…

  1. Clarivate Analytics: Continued Omnia vanitas Impact Factor Culture.

    PubMed

    Teixeira da Silva, Jaime A; Bernès, Sylvain

    2018-02-01

    This opinion paper takes aim at an error made recently by Clarivate Analytics in which it sent out an email that congratulated academics for becoming exclusive members of academia's most cited elite, the Highly Cited Researchers (HCRs). However, that email was sent out to an undisclosed number of non-HCRs, who were offered an apology shortly after, through a bulk mail, which tried to downplay the importance of the error, all the while praising the true HCRs. When Clarivate Analytics senior management was contacted, the company declined to offer an indication of the number of academics who had been contacted and erroneously awarded the HCR status. We believe that this regrettable blunder, together with the opacity offered by the company, reinforces the corporate attitude about the value of the journal impact factor (JIF), and what it represents, namely a marketing tool that is falsely used to equate citations with quality, worth, or influence. The continued commercialization of metrics such as the JIF is at the heart of their use to assess the "quality" of a researcher, their work, or a journal, and contributes to a great extent to driving scientific activities towards a futile endeavor.

  2. Mitigating Sports Injury Risks Using Internet of Things and Analytics Approaches.

    PubMed

    Wilkerson, Gary B; Gupta, Ashish; Colston, Marisa A

    2018-03-12

    Sport injuries restrict participation, impose a substantial economic burden, and can have persisting adverse effects on health-related quality of life. The effective use of the Internet of Things (IoT), when combined with analytics approaches, can improve player safety through identification of injury risk factors that can be addressed by targeted risk-reduction training activities. Use of IoT devices can facilitate highly efficient quantification of relevant functional capabilities prior to sport participation, which could substantially advance the prevailing sport injury management paradigm. This study introduces a framework for using sensor-derived IoT data to supplement other data for objective estimation of each individual college football player's level of injury risk, which is an approach to injury prevention that has not been previously reported. A cohort of 45 NCAA Division I-FCS college players provided data in the form of self-ratings of persisting effects of previous injuries and a single-leg postural stability test. Instantaneous change in body mass acceleration (jerk) during the test was quantified by a smartphone accelerometer, with data wirelessly transmitted to a secure cloud server. Injuries sustained from the beginning of practice sessions until the end of the 13-game season were documented, along with the number of games played by each athlete. Results demonstrate a strong prediction model. Our approach may have strong relevance to the estimation of injury risk for other physically demanding activities. Clearly, there is great potential for improvement of injury prevention initiatives through identification of individual athletes who possess elevated injury risk and targeted interventions. © 2018 Society for Risk Analysis.
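
    The jerk quantity mentioned above is simply the time derivative of acceleration. A hedged sketch of how it might be computed from accelerometer samples follows; the sampling rate, synthetic signal, and RMS summary statistic are illustrative assumptions, not the study's actual pipeline.

```python
# Jerk = instantaneous change in acceleration, estimated here by numerical
# differentiation of a synthetic accelerometer trace.
import numpy as np

fs = 100.0                              # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
accel = np.sin(2 * np.pi * 1.5 * t)     # synthetic body-mass acceleration, m/s^2

jerk = np.gradient(accel, 1 / fs)       # central-difference derivative, m/s^3

# One plausible summary statistic: root-mean-square jerk over the trial.
rms_jerk = np.sqrt(np.mean(jerk ** 2))
print(f"RMS jerk: {rms_jerk:.2f} m/s^3")
```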

  3. Food risk perceptions, gender, and individual differences in avoidance and approach motivation, intuitive and analytic thinking styles, and anxiety.

    PubMed

    Leikas, Sointu; Lindeman, Marjaana; Roininen, Katariina; Lähteenmäki, Liisa

    2007-03-01

    Risks appear to be perceived in two different ways, affectively and rationally. Finnish adult internet users were contacted via e-mail and asked to fill out an internet questionnaire consisting of questions about food risks and measures of avoidance and approach motivation, analytic and intuitive information-processing style, trait anxiety, and gender, in order to find out (1) whether food risks are perceived two-dimensionally, (2) how individual differences in motivation, information processing, and anxiety are associated with the different dimensions of food risk perceptions, and (3) whether gender moderates these associations. The data were analyzed by factor, correlation, and regression analyses. Three factors emerged: risk scariness, risk likelihood, and risks of cardiovascular disease. Personality and gender x personality interactions predicted food risk perceptions. Results showed that food risk perceptions generally form two dimensions, scariness and likelihood, but that this may depend on the nature of the risk. In addition, results imply that individuals with high avoidance motivation perceive food risks as scarier and more likely than others, and that individuals with an analytic information-processing style perceive food risks as less likely than others. Trait anxiety seems to be associated with higher food risk perceptions only among men.

  4. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  5. New vistas in refractive laser beam shaping with an analytic design approach

    NASA Astrophysics Data System (ADS)

    Duerr, Fabian; Thienpont, Hugo

    2014-05-01

    Many commercial, medical and scientific applications of the laser have been developed since its invention. Some of these applications require a specific beam irradiance distribution to ensure optimal performance. Often, it is possible to apply geometrical methods to design laser beam shapers. This common design approach is based on the ray mapping between the input plane and the output beam. Geometric ray mapping designs with two plano-aspheric lenses have been thoroughly studied in the past. Even though analytic expressions for various ray mapping functions do exist, the surface profiles of the lenses are still calculated numerically. In this work, we present an alternative novel design approach that allows direct calculation of the rotationally symmetric lens profiles described by analytic functions. Starting from the example of a basic beam expander, a set of functional differential equations is derived from Fermat's principle. This formalism allows calculating the exact lens profiles described by Taylor series coefficients up to very high orders. To demonstrate the versatility of this new approach, two further cases are solved: a Gaussian to flat-top irradiance beam shaping system, and a beam shaping system that generates a more complex dark-hollow Gaussian (donut-like) irradiance profile with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of all calculated solutions and indicate the potential of this design approach for refractive beam shaping applications.

  6. Factors Affecting Higher Order Thinking Skills of Students: A Meta-Analytic Structural Equation Modeling Study

    ERIC Educational Resources Information Center

    Budsankom, Prayoonsri; Sawangboon, Tatsirin; Damrongpanit, Suntorapot; Chuensirimongkol, Jariya

    2015-01-01

    The purpose of the research is to develop and identify the validity of factors affecting higher order thinking skills (HOTS) of students. The thinking skills can be divided into three types: analytical, critical, and creative thinking. This analysis is done by applying the meta-analytic structural equation modeling (MASEM) based on a database of…

  7. The analytical and numerical approaches to the theory of the Moon's librations: Modern analysis and results

    NASA Astrophysics Data System (ADS)

    Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.

    2017-11-01

    Observing physical librations of celestial bodies and the Moon represents one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying the lunar physical librations (LPhL) data. In this article, LPhL simulation methods of assessing viscoelastic and dissipative properties of the lunar body and lunar core parameters, whose existence has been recently confirmed during the seismic data reprocessing of the "Apollo" space mission, are described. Much attention is paid to physical interpretation of the free librations phenomenon and the methods for its determination. In the paper, the practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. In the paper an efficiency analysis of two approaches to LPhL theory is conducted: the numerical and the analytical one. It has been shown that in lunar investigation both approaches complement each other in various aspects: the numerical approach provides the high accuracy required for proper processing of modern observations, while the analytical approach allows one to comprehend the essence of the phenomena in lunar rotation and to predict and interpret new effects in observations of the lunar body and lunar core parameters.

  8. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  9. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Miller, Dwight Peter

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the most influential PSFs by consensus. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  10. Comparison of three methods for wind turbine capacity factor estimation.

    PubMed

    Ditkovich, Y; Kuperman, A

    2014-01-01

    Three approaches to calculating the capacity factor of fixed-speed wind turbines are reviewed and compared using a case study. The first, "quasiexact," approach utilizes discrete raw wind data (in histogram form) and the manufacturer-provided turbine power curve (also in discrete form) to numerically calculate the capacity factor. The second, "analytic," approach employs a continuous probability distribution function fitted to the wind data, as well as a continuous turbine power curve resulting from double polynomial fitting of the manufacturer-provided power curve data. The latter approach, while an approximation, can be solved analytically, thus providing valuable insight into the aspects affecting the capacity factor. Moreover, several other wind turbine performance metrics may be derived from the analytical approach. The third, "approximate," approach, valid for Rayleigh winds only, employs a nonlinear approximation of the capacity factor versus average wind speed curve, requiring only the rated power and rotor diameter of the turbine. It is shown that the results obtained by the three approaches are very close, confirming the validity of the analytically derived approximations, which may be used for wind turbine performance evaluation.
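
    The "quasiexact" discrete computation described above can be sketched compactly: sum the power-curve output weighted by wind-speed bin probabilities, then divide by rated power. The power curve and Rayleigh wind parameters below are invented for illustration; only the general approach is taken from the abstract.

```python
# Capacity factor = expected power output / rated power, computed from a
# discretized wind-speed distribution and a turbine power curve.
import numpy as np

v = np.arange(0, 26, dtype=float)        # wind-speed bins, m/s

# Simplified power curve (kW): cut-in 3 m/s, rated at 12 m/s, cut-out 25 m/s.
rated = 2000.0
power = np.where(v < 3, 0.0,
        np.where(v < 12, rated * ((v - 3) / 9.0) ** 3,
        np.where(v <= 25, rated, 0.0)))

# Rayleigh wind distribution with a 7 m/s mean, discretized into bin masses.
v_mean = 7.0
cdf = 1.0 - np.exp(-np.pi * v**2 / (4.0 * v_mean**2))
prob = np.diff(cdf, append=1.0)          # bin probabilities; they sum to 1

capacity_factor = float(np.sum(prob * power) / rated)
print(f"capacity factor ~ {capacity_factor:.2f}")
```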

  11. Using Configural Frequency Analysis as a Person-Centered Analytic Approach with Categorical Data

    ERIC Educational Resources Information Center

    Stemmler, Mark; Heine, Jörg-Henrik

    2017-01-01

    Configural frequency analysis and log-linear modeling are presented as person-centered analytic approaches for the analysis of categorical or categorized data in multi-way contingency tables. Person-centered developmental psychology, based on the holistic interactionistic perspective of the Stockholm working group around David Magnusson and Lars…
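
    Configural frequency analysis is mechanically simple; below is a hedged, self-contained sketch of the textbook procedure with invented data, not code from the article. Each configuration (cell) of a contingency table is tested against its expected frequency under independence; cells observed significantly more often than expected are flagged as "types" (the mirror-image "antitype" test on the lower tail is omitted for brevity).

```python
# Configural frequency analysis (CFA) on a 2x2 table: binomial test of each
# configuration's observed count against its independence-model expectation.
from itertools import product
from math import comb

observed = {(0, 0): 40, (0, 1): 10, (1, 0): 10, (1, 1): 40}  # invented data
n = sum(observed.values())

# Marginal probabilities of each variable's levels.
p_a = [sum(f for (a, _), f in observed.items() if a == lvl) / n for lvl in (0, 1)]
p_b = [sum(f for (_, b), f in observed.items() if b == lvl) / n for lvl in (0, 1)]

def binom_sf(k, n, p):
    """Upper-tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

types = []
for a, b in product((0, 1), repeat=2):
    p = p_a[a] * p_b[b]              # expected cell probability under independence
    if binom_sf(observed[(a, b)], n, p) < 0.05 / len(observed):  # Bonferroni
        types.append((a, b))
        print(f"type {(a, b)}: observed {observed[(a, b)]}, expected {n * p:.1f}")
```

    Here both concordant cells land far above their expected count of 25, so they emerge as types.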

  12. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based, Graduate-Level Analytical Chemistry Course

    NASA Astrophysics Data System (ADS)

    Toh, Chee-Seng

    2007-04-01

    A research-focused approach is described for a nonlaboratory-based graduate-level module on analytical chemistry. The approach utilizes commonly practiced activities carried out in active research laboratories, in particular, activities involving logging of ideas and thoughts, journal clubs, proposal writing, classroom participation and discussions, and laboratory tours. This approach was adapted without compromising the course content and results suggest possible adaptation and implementation in other graduate-level courses.

  13. Quantifying Acoustic Impacts on Marine Mammals and Sea Turtles: Methods and Analytical Approach for Phase III Training and Testing

    DTIC Science & Technology

    2017-06-16

    Sarah A. Blackstock; Joseph O...

    This report covers the Navy's Phase III Study Areas as described in each Environmental Impact Statement/Overseas Environmental Impact Statement and describes the methods…

  14. A Functional Analytic Approach to Group Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, Luc

    2009-01-01

    This article provides a particular view on the use of Functional Analytical Psychotherapy (FAP) in a group therapy format. This view is based on the author's experiences as a supervisor of Functional Analytical Psychotherapy Groups, including groups for women with depression and groups for chronic pain patients. The contexts in which this approach…

  15. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  16. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  17. SAW-Based Phononic Crystal Microfluidic Sensor—Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications

    PubMed Central

    Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V.; Hirsch, Soeren

    2017-01-01

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept. PMID:28946609

  18. SAW-Based Phononic Crystal Microfluidic Sensor-Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications.

    PubMed

    Oseev, Aleksandr; Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V; Hirsch, Soeren

    2017-09-23

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept.

  19. Comparison of maximum runup through analytical and numerical approaches for different fault parameters estimates

    NASA Astrophysics Data System (ADS)

    Kanoglu, U.; Wronna, M.; Baptista, M. A.; Miranda, J. M. A.

    2017-12-01

    The one-dimensional analytical runup theory, in combination with near-shore synthetic waveforms, is a promising tool for tsunami rapid early warning systems. Its application in realistic cases with complex bathymetry and initial wave conditions from inverse modelling has shown that maximum runup values can be estimated reasonably well. In this study we generate simplified bathymetric domains that resemble realistic near-shore features. We investigate the accuracy of the analytical runup formulae under variation of fault source parameters and near-shore bathymetric features. To do this, we systematically vary the fault plane parameters to compute the initial tsunami wave condition. Subsequently, we use the initial conditions to run the numerical tsunami model on a coupled system of four nested grids and compare the results to the analytical estimates. Varying the dip angle of the fault plane showed that the analytical estimates differ by less than 10% for angles of 5-45 degrees in a simple bathymetric domain. These results show that the use of analytical formulae for fast runup estimates constitutes a very promising approach in a simple bathymetric domain and might be implemented in hazard mapping and early warning.
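
    As a concrete example of the kind of one-dimensional analytical runup expression discussed above, the sketch below uses Synolakis' (1987) runup law for a nonbreaking solitary wave on a plane beach. The parameter values are invented, and this is offered as an illustration of the genre, not the formula used in the study.

```python
# Analytical runup law for a nonbreaking solitary wave on a plane beach
# (Synolakis, 1987): R/d = 2.831 * sqrt(cot(beta)) * (H/d)**(5/4).
import math

def solitary_runup(H, d, beta_deg):
    """Maximum runup R (same units as d): H = offshore wave height,
    d = offshore depth, beta_deg = beach slope angle in degrees."""
    cot_beta = 1.0 / math.tan(math.radians(beta_deg))
    return d * 2.831 * math.sqrt(cot_beta) * (H / d) ** 1.25

R = solitary_runup(H=1.0, d=50.0, beta_deg=5.0)   # illustrative values
print(f"analytical runup estimate: {R:.2f} m")
```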

  20. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges; perhaps most notable are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure, or the addition or removal of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  1. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  2. Health Monitoring of a Rotating Disk Using a Combined Analytical-Experimental Approach

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Lekki, John D.; Baaklini, George Y.

    2009-01-01

Rotating disks undergo rigorous mechanical loading conditions that make them subject to a variety of failure mechanisms leading to structural deformities and cracking. During operation, periodic loading fluctuations and other related factors cause fractures and hidden internal cracks that can only be detected via noninvasive health monitoring and/or nondestructive evaluation. These evaluations go further to inspect material discontinuities and other irregularities that have grown to become critical defects that can lead to failure. Hence, the objective of this work is to conduct a collective analytical and experimental study to present a well-rounded structural assessment of a rotating disk by means of a health monitoring approach and to appraise the capabilities of an in-house rotor spin system. The analyses utilized the finite element method to analyze the disk with and without an induced crack at different loading levels, with rotational speeds ranging from 3000 up to 10 000 rpm. A parallel experiment was conducted to spin the disk at the desired speeds in an attempt to correlate the experimental findings with the analytical results. The testing involved spin experiments that covered the rotor in both damaged and undamaged (i.e., notched and unnotched) states. Damaged disks had artificially induced through-thickness flaws in the web region ranging from 2.54 to 5.08 cm (1 to 2 in.) in length. This study aims to identify defects greater than 1.27 cm (0.5 in.), applying available means of structural health monitoring and nondestructive evaluation, and to document failure mechanisms experienced by the rotor system under typical turbine engine operating conditions.

  3. Semi-Analytic Reconstruction of Flux in Finite Volume Formulations

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2006-01-01

Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction for the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm but reconstruction time is approximately a factor of four larger than for Roe's method. Though both are second-order accurate schemes, Roe's method approaches a grid-converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws including effects of thermochemical nonequilibrium in the Navier-Stokes equations is developed.
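For context, the Roe upwind flux used as the comparison baseline above can be sketched for the scalar Burgers' equation, where the Roe-averaged wave speed is simply the arithmetic mean of the left and right states. This is a minimal textbook illustration of Roe's method, not the paper's semi-analytic reconstruction.

```python
import numpy as np

def roe_flux_burgers(uL, uR):
    """Roe's approximate Riemann flux for Burgers' equation, f(u) = u^2/2.

    For this flux function the Roe-averaged wave speed is (uL + uR)/2, and
    the numerical flux is the central average minus upwind dissipation.
    """
    a = 0.5 * (uL + uR)                       # Roe-averaged wave speed
    fL, fR = 0.5 * uL**2, 0.5 * uR**2
    return 0.5 * (fL + fR) - 0.5 * np.abs(a) * (uR - uL)

def step(u, dx, dt):
    """One first-order finite-volume update on a periodic grid."""
    F = roe_flux_burgers(u, np.roll(u, -1))   # flux at each cell's right face
    return u - dt / dx * (F - np.roll(F, 1))  # conservative flux difference
```

Because the update is written as a difference of interface fluxes, the scheme is conservative: the sum of `u` over a periodic grid is unchanged by `step`, up to rounding.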

  4. Beyond Engagement Analytics: Which Online Mixed-Data Factors Predict Student Learning Outcomes?

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2017-01-01

    This mixed-method study focuses on online learning analytics, a research area of importance. Several important student attributes and their online activities are examined to identify what seems to work best to predict higher grades. The purpose is to explore the relationships between student grade and key learning engagement factors using a large…

  5. Surrogate analyte approach for quantitation of endogenous NAD(+) in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    PubMed

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD(+) in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. (13)C5-NAD(+) was used as the surrogate analyte for authentic analyte, NAD(+). The standard curve ranging from 0.250 to 25.0μg/mL in acidified human blood for (13)C5-NAD(+) was fitted to a 1/x(2) weighted linear regression model. The LC-MS/MS response between surrogate analyte and authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD(+) concentration from the (13)C5-NAD(+) standard curve since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. Average extraction recovery of (13)C5-NAD(+) was 94.6% across the curve range. Matrix factor was 0.99 for both high and low QC indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD(+) in 29 male and 21 female human subjects. This assay was also used to study the circadian effect of endogenous level of NAD(+) in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
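The 1/x²-weighted linear regression used for the standard curve above can be sketched as an explicit weighted least-squares fit. The standard concentrations below match the stated curve range (0.250 to 25.0 µg/mL), but the instrument responses are invented for illustration.

```python
import numpy as np

def wls_line(x, y, w):
    """Weighted least-squares fit y ~ a + b*x, minimizing sum w*(y - a - b*x)^2."""
    W = np.sum(w)
    xbar = np.sum(w * x) / W
    ybar = np.sum(w * y) / W
    b = np.sum(w * (x - xbar) * (y - ybar)) / np.sum(w * (x - xbar) ** 2)
    a = ybar - b * xbar
    return a, b

# Calibration standards spanning the stated range; responses are made up
conc = np.array([0.25, 0.5, 2.5, 10.0, 25.0])   # µg/mL
resp = np.array([0.26, 0.49, 2.55, 9.8, 25.4])  # hypothetical peak-area ratios
a, b = wls_line(conc, resp, w=1.0 / conc**2)    # 1/x^2 weighting
back = (resp - a) / b                           # back-calculated concentrations
```

The 1/x² weighting gives the low-concentration standards the same relative influence as the high ones, which is why it is the common choice when accuracy is judged as percent error across a wide curve range.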

  6. Tensor scale: An analytic approach with efficient computation and applications☆

    PubMed Central

    Xu, Ziyue; Saha, Punam K.; Dasgupta, Soura

    2015-01-01

    Scale is a widely used notion in computer vision and image understanding that evolved in the form of scale-space theory where the key idea is to represent and analyze an image at various resolutions. Recently, we introduced a notion of local morphometric scale referred to as “tensor scale” using an ellipsoidal model that yields a unified representation of structure size, orientation and anisotropy. In the previous work, tensor scale was described using a 2-D algorithmic approach and a precise analytic definition was missing. Also, the application of tensor scale in 3-D using the previous framework is not practical due to high computational complexity. In this paper, an analytic definition of tensor scale is formulated for n-dimensional (n-D) images that captures local structure size, orientation and anisotropy. Also, an efficient computational solution in 2- and 3-D using several novel differential geometric approaches is presented and the accuracy of results is experimentally examined. Also, a matrix representation of tensor scale is derived facilitating several operations including tensor field smoothing to capture larger contextual knowledge. Finally, the applications of tensor scale in image filtering and n-linear interpolation are presented and the performance of their results is examined in comparison with respective state-of-art methods. Specifically, the performance of tensor scale based image filtering is compared with gradient and Weickert’s structure tensor based diffusive filtering algorithms. Also, the performance of tensor scale based n-linear interpolation is evaluated in comparison with standard n-linear and windowed-sinc interpolation methods. PMID:26236148

  7. Analytic halo approach to the bispectrum of galaxies in redshift space

    NASA Astrophysics Data System (ADS)

    Yamamoto, Kazuhiro; Nan, Yue; Hikage, Chiaki

    2017-02-01

    We present an analytic formula for the galaxy bispectrum in redshift space on the basis of the halo approach description with the halo occupation distribution of central galaxies and satellite galaxies. This work is an extension of a previous work on the galaxy power spectrum, which illuminated the significant contribution of satellite galaxies to the higher multipole spectrum through the nonlinear redshift space distortions of their random motions. Behaviors of the multipoles of the bispectrum are compared with results of numerical simulations assuming a halo occupation distribution of the low-redshift (LOWZ) sample of the Sloan Digital Sky Survey (SDSS) III baryon oscillation spectroscopic survey (BOSS) survey. Also presented are analytic approximate formulas for the multipoles of the bispectrum, which is useful to understanding their characteristic properties. We demonstrate that the Fingers of God effect is quite important for the higher multipoles of the bispectrum in redshift space, depending on the halo occupation distribution parameters.

  8. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2015-01-01

    Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as most recent information about physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration, necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). In reference of the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.

    2010-08-04

Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  10. Faculty Workload: An Analytical Approach

    ERIC Educational Resources Information Center

    Dennison, George M.

    2012-01-01

Recent discussions of practices in higher education have tended toward muck-raking and self-styled exposure of cynical self-indulgence by faculty and administrators at the expense of students and their families, as usually occurs during periods of economic duress, rather than toward analytical studies designed to foster understanding. This article…

  11. A GRAPHICAL DIAGNOSTIC METHOD FOR ASSESSING THE ROTATION IN FACTOR ANALYTICAL MODELS OF ATMOSPHERIC POLLUTION. (R831078)

    EPA Science Inventory

    Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...

  12. An Investigation of First-Year Engineering Student and Instructor Perspectives of Learning Analytics Approaches

    ERIC Educational Resources Information Center

    Knight, David B.; Brozina, Cory; Novoselich, Brian

    2016-01-01

This paper investigates how first-year engineering undergraduates and their instructors describe the potential for learning analytics approaches to contribute to student success. Results of qualitative data collection in a first-year engineering course indicated that both students and instructors emphasized a preference for learning analytics…

  13. An algebraic approach to the analytic bootstrap

    DOE PAGES

    Alday, Luis F.; Zhiboedov, Alexander

    2017-04-27

We develop an algebraic approach to the analytic bootstrap in CFTs. By acting with the Casimir operator on the crossing equation we map the problem of doing large spin sums to any desired order to the problem of solving a set of recursion relations. We compute corrections to the anomalous dimension of large spin operators due to the exchange of a primary and its descendants in the crossed channel and show that this leads to a Borel-summable expansion. Here, we analyse higher order corrections to the microscopic CFT data in the direct channel and its matching to infinite towers of operators in the crossed channel. We apply this method to the critical O(N) model. At large N we reproduce the first few terms in the large spin expansion of the known two-loop anomalous dimensions of higher spin currents in the traceless symmetric representation of O(N) and make further predictions. At small N we present the results for the truncated large spin expansion series of anomalous dimensions of higher spin currents.

  14. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  15. Latent structure of the Wisconsin Card Sorting Test: a confirmatory factor analytic study.

    PubMed

    Greve, Kevin W; Stickle, Timothy R; Love, Jeffrey M; Bianchini, Kevin J; Stanford, Matthew S

    2005-05-01

    The present study represents the first large scale confirmatory factor analysis of the Wisconsin Card Sorting Test (WCST). The results generally support the three factor solutions reported in the exploratory factor analysis literature. However, only the first factor, which reflects general executive functioning, is statistically sound. The secondary factors, while likely reflecting meaningful cognitive abilities, are less stable except when all subjects complete all 128 cards. It is likely that having two discontinuation rules for the WCST has contributed to the varied factor analytic solutions reported in the literature and early discontinuation may result in some loss of useful information. Continued multivariate research will be necessary to better clarify the processes underlying WCST performance and their relationships to one another.

  16. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  17. Chemical imaging of drug delivery systems with structured surfaces-a combined analytical approach of confocal raman microscopy and optical profilometry.

    PubMed

    Kann, Birthe; Windbergs, Maike

    2013-04-01

Confocal Raman microscopy is an analytical technique with a steadily increasing impact in the field of pharmaceutics, as the instrumental setup allows for nondestructive visualization of component distribution within drug delivery systems. Here, attention is mainly focused on classic solid carrier systems like tablets, pellets, or extrudates. Due to the opacity of these systems, Raman analysis is restricted either to exterior surfaces or cross sections. As Raman spectra are only recorded from one focal plane at a time, the sample is usually altered to create a smooth and even surface. However, this manipulation can lead to misinterpretation of the analytical results. Here, we present a trendsetting approach to overcome these analytical pitfalls with a combination of confocal Raman microscopy and optical profilometry. By acquiring a topography profile of the sample area of interest prior to Raman spectroscopy, the profile height information allowed the focal plane to be leveled to the sample surface for each spectrum acquisition. We first demonstrated the basic principle of this complementary approach in a case study using a tilted silica wafer. In a second step, we successfully adapted the two techniques to investigate an extrudate and a lyophilisate as two exemplary solid drug carrier systems. Component distribution analysis with the novel analytical approach was hampered neither by the curvature of the cylindrical extrudate nor by the highly structured surface of the lyophilisate. Therefore, the combined analytical approach bears great potential to be implemented in diversified fields of pharmaceutical sciences.

  18. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  19. A Visual Analytics Approach for Station-Based Air Quality Data

    PubMed Central

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-01-01

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support. PMID:28029117

  20. A Visual Analytics Approach for Station-Based Air Quality Data.

    PubMed

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-12-24

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support.

  1. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
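The conventional product-term procedure that the abstract critiques, adding a predictor-by-moderator cross-product to the regression and testing its coefficient, can be sketched as follows (data and coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)   # predictor
m = rng.normal(size=n)   # moderator
# Simulated truth: the effect of x on y depends on m (slope = 1 + 0.5*m)
y = 1.0 + (1.0 + 0.5 * m) * x + 0.3 * m + rng.normal(scale=0.1, size=n)

# Design matrix with intercept, x, m, and the cross-product term x*m
X = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta ≈ [1.0, 1.0, 0.3, 0.5]; the x*m coefficient is the moderation test
```

The paper's point is that this product-term test captures only one narrow form of moderation; the example shows the mechanics of the common approach, not the authors' proposed framework.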

  2. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  3. Comparison of Three Methods for Wind Turbine Capacity Factor Estimation

    PubMed Central

    Ditkovich, Y.; Kuperman, A.

    2014-01-01

    Three approaches to calculating capacity factor of fixed speed wind turbines are reviewed and compared using a case study. The first “quasiexact” approach utilizes discrete wind raw data (in the histogram form) and manufacturer-provided turbine power curve (also in discrete form) to numerically calculate the capacity factor. On the other hand, the second “analytic” approach employs a continuous probability distribution function, fitted to the wind data as well as continuous turbine power curve, resulting from double polynomial fitting of manufacturer-provided power curve data. The latter approach, while being an approximation, can be solved analytically thus providing a valuable insight into aspects, affecting the capacity factor. Moreover, several other merits of wind turbine performance may be derived based on the analytical approach. The third “approximate” approach, valid in case of Rayleigh winds only, employs a nonlinear approximation of the capacity factor versus average wind speed curve, only requiring rated power and rotor diameter of the turbine. It is shown that the results obtained by employing the three approaches are very close, enforcing the validity of the analytically derived approximations, which may be used for wind turbine performance evaluation. PMID:24587755
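The "analytic" approach can be sketched by integrating an assumed power curve against a Rayleigh wind-speed distribution parameterized by the mean wind speed. The cut-in, rated, and cut-out speeds and the cubic ramp below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def capacity_factor(v_mean, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Capacity factor of an idealized turbine under Rayleigh winds.

    Assumed power curve: zero below cut-in, cubic ramp from cut-in to rated
    speed, rated power up to cut-out, zero above. The capacity factor is the
    expected power (normalized by rated power) under the Rayleigh pdf.
    """
    v = np.linspace(0.0, 40.0, 4001)
    # Rayleigh pdf expressed in terms of the mean wind speed v_mean
    pdf = (np.pi * v / (2 * v_mean**2)) * np.exp(-np.pi * v**2 / (4 * v_mean**2))
    p = np.zeros_like(v)
    ramp = (v >= v_in) & (v < v_rated)
    p[ramp] = (v[ramp]**3 - v_in**3) / (v_rated**3 - v_in**3)
    p[(v >= v_rated) & (v <= v_out)] = 1.0
    integrand = p * pdf
    dv = v[1] - v[0]
    return float(np.sum(0.5 * (integrand[:-1] + integrand[1:])) * dv)  # trapezoid
```

Writing the capacity factor this way makes the dependence on average wind speed explicit, which is what the paper's third, "approximate" approach exploits for Rayleigh winds.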

  4. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indictors that include pavement distress,...

  5. An analytic approach to optimize tidal turbine fields

    NASA Astrophysics Data System (ADS)

    Pelz, P.; Metzler, M.

    2013-12-01

Motivated by global warming due to CO2 emissions, various technologies for harvesting energy from renewable sources are being developed. Hydrokinetic turbines are applied to surface watercourses or tidal flows to generate electrical energy. Since the available power of a hydrokinetic turbine is proportional to its projected cross-sectional area, fields of turbines are installed to scale shaft power. Each hydrokinetic turbine of a field can be considered as a disk actuator. In [1], the first author derives the optimal operating point for hydropower in an open channel. The present paper concerns a 0-dimensional model of a disk actuator in an open-channel flow with bypass, as a special case of [1]. Based on the energy equation, the continuity equation and the momentum balance, an analytical approach is made to calculate the coefficient of performance for hydrokinetic turbines with bypass flow as a function of the turbine head and the ratio of turbine width to channel width.
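For reference, the classic unbounded actuator-disk result (the Betz limit) gives the baseline power coefficient that the paper's open-channel model with bypass generalizes; the sketch below is only this textbook case, not the paper's derivation.

```python
import numpy as np

def cp_actuator_disk(a):
    """Power coefficient of an ideal actuator disk with axial induction a.

    Classic unbounded-flow result: Cp = 4a(1-a)^2, maximized at a = 1/3,
    giving Cp = 16/27 (the Betz limit). Open-channel blockage and bypass
    effects, the subject of the paper, modify this baseline.
    """
    return 4.0 * a * (1.0 - a) ** 2

a = np.linspace(0.0, 0.5, 501)
cp = cp_actuator_disk(a)
a_opt = a[np.argmax(cp)]   # ≈ 1/3
cp_max = cp.max()          # ≈ 16/27 ≈ 0.593
```

In a confined channel the effective limit exceeds 16/27 because the blockage accelerates the flow through the rotor, which is why the bypass ratio enters the paper's 0-dimensional model.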

  6. Constraints on the [Formula: see text] form factor from analyticity and unitarity.

    PubMed

    Ananthanarayan, B; Caprini, I; Kubis, B

    Motivated by the discrepancies noted recently between the theoretical calculations of the electromagnetic [Formula: see text] form factor and certain experimental data, we investigate this form factor using analyticity and unitarity in a framework known as the method of unitarity bounds. We use a QCD correlator computed on the spacelike axis by operator product expansion and perturbative QCD as input, and exploit unitarity and the positivity of its spectral function, including the two-pion contribution that can be reliably calculated using high-precision data on the pion form factor. From this information, we derive upper and lower bounds on the modulus of the [Formula: see text] form factor in the elastic region. The results provide a significant check on those obtained with standard dispersion relations, confirming the existence of a disagreement with experimental data in the region around [Formula: see text].

  7. Ohmic Inflation of Hot Jupiters: an Analytical Approach

    NASA Astrophysics Data System (ADS)

    Ginzburg, Sivan; Sari, Re'em

    2015-12-01

Many giant exoplanets in close orbits have observed radii which exceed theoretical predictions. One suggested explanation for this discrepancy is heat deposited deep inside the atmospheres of these hot Jupiters. We present an analytical model for the evolution of such irradiated and internally heated gas giants, and derive scaling laws for their cooling rates and radii. We estimate the Ohmic dissipation resulting from the interaction between the atmospheric winds and the planet's magnetic field, and apply our model to Ohmically heated planets. Our model can account for the observed radii of many inflated planets, but not the most extreme ones. We show that Ohmically heated planets have already reached their equilibrium phase and no longer contract. We show that it is possible to re-inflate planets, but we confirm that re-heating timescales are longer than cooling timescales by about a factor of 30.

  8. Different analytical approaches in assessing antibacterial activity and the purity of commercial lysozyme preparations for dairy application.

    PubMed

    Brasca, Milena; Morandi, Stefano; Silvetti, Tiziana; Rosi, Veronica; Cattaneo, Stefano; Pellegrino, Luisa

    2013-05-21

Hen egg-white lysozyme (LSZ) is currently used in the food industry to limit the proliferation of spoilage lactic acid bacteria in the production of wine and beer, and to inhibit butyric acid fermentation in hard and extra hard cheeses (late blowing) caused by the outgrowth of clostridial spores. The aim of this work was to evaluate how the enzyme activity in commercial preparations correlates with the enzyme concentration and can be affected by the presence of process-related impurities. Different analytical approaches, including turbidimetric assay, SDS-PAGE and HPLC, were used to analyse 17 commercial preparations of LSZ marketed in different countries. The HPLC method adopted by ISO allowed the true LSZ concentration to be determined with accuracy. The turbidimetric assay was the most suitable method to evaluate LSZ activity, whereas SDS-PAGE allowed the presence of other egg proteins, which are potential allergens, to be detected. The analytical results showed that the purity of commercially available enzyme preparations can vary significantly, and evidenced the effectiveness of combining different analytical approaches in this type of control.

  9. Disentangling WTP per QALY data: different analytical approaches, different answers.

    PubMed

    Gyrd-Hansen, Dorte; Kjaer, Trine

    2012-03-01

    A large random sample of the Danish general population was asked to value health improvements by way of both the time trade-off elicitation technique and willingness-to-pay (WTP) using contingent valuation methods. The data demonstrate a high degree of heterogeneity across respondents in their relative valuations on the two scales. This has implications for data analysis. We show that the estimates of WTP per QALY are highly sensitive to the analytical strategy. For both open-ended and dichotomous choice data we demonstrate that choice of aggregated approach (ratios of means) or disaggregated approach (means of ratios) affects estimates markedly as does the interpretation of the constant term (which allows for disproportionality across the two scales) in the regression analyses. We propose that future research should focus on why some respondents are unwilling to trade on the time trade-off scale, on how to interpret the constant value in the regression analyses, and on how best to capture the heterogeneity in preference structures when applying mixed multinomial logit. Copyright © 2011 John Wiley & Sons, Ltd.
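The sensitivity to aggregation described above, a ratio of means versus a mean of ratios, can be illustrated on synthetic data in which per-QALY valuations vary with the size of the health gain. All numbers below are invented; the point is only that the two estimators diverge under heterogeneity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# Hypothetical QALY gains valued by respondents
qaly = rng.uniform(0.01, 0.2, size=n)
# Hypothetical WTP: smaller gains are valued at higher per-QALY rates,
# with multiplicative lognormal noise across respondents
wtp = 30000.0 * qaly ** (-0.3) * rng.lognormal(sigma=0.5, size=n) * qaly

ratio_of_means = wtp.mean() / qaly.mean()   # aggregated approach
mean_of_ratios = (wtp / qaly).mean()        # disaggregated approach
# Under this heterogeneity, mean_of_ratios exceeds ratio_of_means
```

If per-QALY rates were instead constant across respondents, the two estimators would converge to the same value; the gap is driven entirely by the correlation between the rate and the size of the gain.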

  10. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
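
    The consumer's-risk problem described here can be illustrated with a small simulation. The naive rule below, which checks accuracy and precision separately, is a simplified stand-in for traditional criteria (not any method from the paper); with few replicates it still accepts a method whose true total error is unsuitable at an appreciable rate:

```python
import random
import statistics

def validate_naive(measurements, true_value, acc_limit=2.0, cv_limit=3.0):
    """Naive rule: accept if mean bias (%) and CV (%) each pass separately."""
    m = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    bias_pct = abs(m - true_value) / true_value * 100
    cv_pct = sd / m * 100
    return bias_pct <= acc_limit and cv_pct <= cv_limit

def consumers_risk(true_bias_pct, true_cv_pct, n=6, trials=20000, seed=1):
    """Fraction of validation runs that accept a method whose true bias
    and CV both exceed the stated limits (an unsuitable method)."""
    rng = random.Random(seed)
    true_value = 100.0
    mu = true_value * (1 + true_bias_pct / 100)
    sd = mu * true_cv_pct / 100
    accepted = 0
    for _ in range(trials):
        runs = [rng.gauss(mu, sd) for _ in range(n)]
        if validate_naive(runs, true_value):
            accepted += 1
    return accepted / trials

# A method sitting just past both limits is still accepted fairly often
# with only n = 6 replicates: the uncontrolled consumer's risk.
risk = consumers_risk(true_bias_pct=2.5, true_cv_pct=3.5)
print(f"acceptance rate of an unsuitable method: {risk:.3f}")
```

    Controlling this acceptance rate, rather than the separate accuracy and precision errors, is the motivation for total-error-based tests such as the one proposed in the paper.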

  11. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    NASA Astrophysics Data System (ADS)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  12. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  13. Spatial Correlation Of Streamflows: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Betterle, A.; Schirmer, M.; Botter, G.

    2016-12-01

    The interwoven space and time variability of climate and landscape properties results in the complex and non-linear hydrological response of streamflow dynamics. Understanding how the meteorological and morphological characteristics of catchments affect the similarity/dissimilarity of streamflow time series at their outlets represents a scientific challenge with applications in water resources management, ecological studies and regionalization approaches aimed at predicting streamflows in ungauged areas. In this study, we establish an analytical approach to estimate the spatial correlation of daily streamflows at two arbitrary locations within a given hydrologic district or river basin at seasonal and annual time scales. The method is based on a stochastic description of the coupled streamflow dynamics at the outlets of two catchments. The framework aims to express the correlation of daily streamflows at two locations along a river network as a function of a limited number of physical parameters characterizing the main underlying hydrological drivers, which include climate conditions, precipitation regime and catchment drainage rates. The proposed method portrays how heterogeneity of climate and landscape features affects the spatial variability of flow regimes along river systems. In particular, we show that the frequency and intensity of synchronous effective rainfall events in the relevant contributing catchments are the main drivers of the spatial correlation of daily discharge, whereas only pronounced differences in the drainage rates of the two basins have a significant effect on the streamflow correlation. The topological arrangement of the two outlets also influences the underlying streamflow correlation, as we show that nested catchments tend to maximize the spatial correlation of flow regimes.
The application of the method to a set of catchments in the South-Eastern US suggests the potential of the proposed tool for the characterization of spatial connections of flow regimes in the

  14. Analytic game—theoretic approach to ground-water extraction

    NASA Astrophysics Data System (ADS)

    Loáiciga, Hugo A.

    2004-09-01

    The roles of cooperation and non-cooperation in the sustainable exploitation of a jointly used groundwater resource have been quantified mathematically using an analytical game-theoretic formulation. Cooperative equilibrium arises when ground-water users respect water-level constraints and consider mutual impacts, which allows them to derive economic benefits from ground-water indefinitely, that is, to achieve sustainability. This work shows that cooperative equilibrium can be obtained from the solution of a quadratic programming problem. For cooperative equilibrium to hold, however, enforcement must be effective. Otherwise, according to the commonized costs-privatized profits paradox, there is a natural tendency towards non-cooperation and non-sustainable aquifer mining, of which overdraft is a typical symptom. Non-cooperative behavior arises when at least one ground-water user neglects the externalities of his adopted ground-water pumping strategy. In this instance, water-level constraints may be violated in a relatively short time and the economic benefits from ground-water extraction fall below those obtained with cooperative aquifer use. One example illustrates the game theoretic approach of this work.
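
    The quadratic-programming character of the cooperative equilibrium can be illustrated with a toy two-user model. The quadratic benefit function and the joint pumping constraint below are assumptions for illustration, not the paper's calibration:

```python
def cooperative_pumping(a, b, Q):
    """Maximize total benefit sum(a*q_i - b*q_i**2) for two identical
    ground-water users subject to a water-level constraint q1 + q2 <= Q.
    The objective is strictly concave, so the optimum is unique; by
    symmetry the constrained optimum splits the allowable total equally."""
    q_myopic = a / (2 * b)          # each user's unconstrained optimum
    if 2 * q_myopic <= Q:           # constraint inactive
        q = q_myopic
    else:                           # constraint binds: share Q
        q = Q / 2
    benefit_each = a * q - b * q**2
    return q, benefit_each

# Hypothetical numbers: steep demand, tight sustainable-yield constraint.
q, benefit = cooperative_pumping(a=10.0, b=0.5, Q=12.0)
print(q, benefit)   # each cooperator pumps 6.0 instead of the myopic 10.0
```

    Non-cooperation in this toy setting corresponds to each user pumping the myopic 10.0 units, violating the constraint, which is the overdraft symptom described in the abstract.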

  15. Do different data analytic approaches generate discrepant findings when measuring mother-infant HPA axis attunement?

    PubMed

    Bernard, Nicola K; Kashy, Deborah A; Levendosky, Alytia A; Bogat, G Anne; Lonstein, Joseph S

    2017-03-01

    Attunement between mothers and infants in their hypothalamic-pituitary-adrenal (HPA) axis responsiveness to acute stressors is thought to benefit the child's emerging physiological and behavioral self-regulation, as well as their socioemotional development. However, there is no universally accepted definition of attunement in the literature, which appears to have resulted in inconsistent statistical analyses for determining its presence or absence, and contributed to discrepant results. We used a series of data analytic approaches, some previously used in the attunement literature and others not, to evaluate the attunement between 182 women and their 1-year-old infants in their HPA axis responsivity to acute stress. Cortisol was measured in saliva samples taken from mothers and infants before and twice after a naturalistic laboratory stressor (infant arm restraint). The results of the data analytic approaches were mixed, with some analyses suggesting attunement while others did not. The strengths and weaknesses of each statistical approach are discussed, and an analysis using a cross-lagged model that considered both time and interactions between mother and infant appeared the most appropriate. Greater consensus in the field about the conceptualization and analysis of physiological attunement would be valuable in order to advance our understanding of this phenomenon. © 2016 Wiley Periodicals, Inc.
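
    As a rough sketch of the cross-lagged idea the authors favor, the snippet below simulates dyadic cortisol data with known stability and cross-partner paths and recovers them by ordinary least squares. The two-wave structure and all coefficients are assumptions for illustration, not the study's model or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 182                                    # number of dyads, as in the study

# Simulated (log-scale) cortisol at two time points with known paths.
mother_t1 = rng.normal(size=n)
infant_t1 = 0.3 * mother_t1 + rng.normal(scale=0.9, size=n)
infant_t2 = (0.5 * infant_t1               # stability path
             + 0.25 * mother_t1            # cross-lagged "attunement" path
             + rng.normal(scale=0.8, size=n))

# OLS for one cross-lagged equation: infant_t2 ~ 1 + infant_t1 + mother_t1
X = np.column_stack([np.ones(n), infant_t1, mother_t1])
coef, *_ = np.linalg.lstsq(X, infant_t2, rcond=None)
intercept, stability, cross_lag = coef
print(f"stability path: {stability:.2f}, cross-lagged path: {cross_lag:.2f}")
```

    A nonzero cross-lagged coefficient, over and above the partner's own stability path, is the kind of evidence for attunement that a simple correlation of concurrent cortisol levels cannot provide.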

  16. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of the analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of the sensor and its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analytes in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo a relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in

  18. Relativistic algorithm for time transfer in Mars missions under IAU Resolutions: an analytic approach

    NASA Astrophysics Data System (ADS)

    Pan, Jun-Yang; Xie, Yi

    2015-02-01

    With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends some previous works by including the effects of the propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model might be used in an onboard computer, whose capability to perform calculations is limited. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station and the spacecraft dominate the outcomes of the relativistic corrections to the model.

  19. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  20. Analytical approach for collective diffusion: One-dimensional lattice with the nearest neighbor and the next nearest neighbor lateral interactions

    NASA Astrophysics Data System (ADS)

    Tarasenko, Alexander

    2018-01-01

    Diffusion of particles adsorbed on a homogeneous one-dimensional lattice is investigated using a theoretical approach and MC simulations. The analytical dependencies calculated in the framework of the approach are tested against the numerical data. The perfect coincidence of the data obtained by these different methods demonstrates the correctness of the approach, which is based on the theory of the non-equilibrium statistical operator.
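
    A minimal Metropolis Monte Carlo simulation of the kind used to test such analytical dependencies might look as follows. This sketch includes nearest-neighbor interactions only (the paper also treats next-nearest-neighbor couplings), and all parameters are invented:

```python
import math
import random

def simulate_lattice(L=200, coverage=0.5, J=0.5, steps=100_000, seed=7):
    """Metropolis hop dynamics for hard-core particles on a 1D ring with
    nearest-neighbor interaction energy J (in units of kT; J > 0 means
    repulsion). A particle hops to an empty adjacent site with
    probability min(1, exp(-dE))."""
    rng = random.Random(seed)
    n_part = int(L * coverage)
    occ = [1] * n_part + [0] * (L - n_part)
    rng.shuffle(occ)

    def nn_energy(i):
        # Interaction of a particle at site i with its two ring neighbors.
        return J * (occ[(i - 1) % L] + occ[(i + 1) % L])

    hops = 0
    for _ in range(steps):
        i = rng.randrange(L)
        j = (i + rng.choice((-1, 1))) % L
        if occ[i] == 1 and occ[j] == 0:
            e_before = nn_energy(i)
            occ[i] = 0                      # tentatively move the particle
            e_after = nn_energy(j)
            if e_after <= e_before or rng.random() < math.exp(e_before - e_after):
                occ[j] = 1                  # accept the hop
                hops += 1
            else:
                occ[i] = 1                  # reject: put it back
    return occ, hops / steps

occ, hop_rate = simulate_lattice()
print(f"hop rate: {hop_rate:.3f}, coverage: {sum(occ) / len(occ):.2f}")
```

    Tracking mean-square displacement of tagged density fluctuations in such a simulation yields the collective diffusion coefficient against which the analytical dependencies can be checked.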

  1. Optimal starting conditions for the rendezvous maneuver: Analytical and computational approach

    NASA Astrophysics Data System (ADS)

    Ciarcia, Marco

    by the optimal trajectory. For the guidance trajectory, because of the replacement of the variable thrust direction of the powered subarc with a constant thrust direction, the optimal control problem degenerates into a mathematical programming problem with a relatively small number of degrees of freedom, more precisely: three for case (i) time-to-rendezvous free and two for case (ii) time-to-rendezvous given. In particular, we consider the rendezvous between the Space Shuttle (chaser) and the International Space Station (target). Once a given initial distance SS-to-ISS is preselected, the present work supplies not only the best initial conditions for the rendezvous trajectory, but simultaneously the corresponding final conditions for the ascent trajectory. In Part B, an analytical solution of the Clohessy-Wiltshire equations is presented (i) neglecting the change of the spacecraft mass due to the fuel consumption and (ii) assuming that the thrust is finite, that is, the trajectory includes powered subarcs flown with maximum thrust and coasting subarcs flown with zero thrust. Then, employing the found analytical solution, we study the rendezvous problem under the assumption that the initial separation coordinates and initial separation velocities are free except for the requirement that the initial chaser-to-target distance is given. The main contribution of Part B is the development of analytical solutions for the powered subarcs, an important extension of the analytical solutions already available for the coasting subarcs. One consequence is that the entire optimal trajectory can be described analytically. Another consequence is that the optimal control problems degenerate into mathematical programming problems. A further consequence is that, vis-a-vis the optimal control formulation, the mathematical programming formulation reduces the CPU time by a factor of order 1000. Key words. 
Space trajectories, rendezvous, optimization, guidance, optimal control, calculus of

  2. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    NASA Astrophysics Data System (ADS)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to make it a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  3. Traceability of 'Limone di Siracusa PGI' by a multidisciplinary analytical and chemometric approach.

    PubMed

    Amenta, M; Fabroni, S; Costa, C; Rapisarda, P

    2016-11-15

    Food traceability is increasingly relevant with respect to safety, quality and typicality issues. Lemon fruits grown in a typical lemon-growing area of southern Italy (Siracusa) have been awarded the PGI (Protected Geographical Indication) recognition as 'Limone di Siracusa'. Due to its distinctive characteristics, consumers have an increasing interest in this product. The detection of potential fraud could be improved by using tools that link the composition of this product to its typical features. This study used a wide range of analytical techniques, including conventional techniques and analytical approaches such as spectral (NIR spectra), multi-elemental (Fe, Zn, Mn, Cu, Li, Sr) and isotopic ((13)C/(12)C, (18)O/(16)O) marker investigations, joined with multivariate statistical analyses, such as PLS-DA (Partial Least Squares Discriminant Analysis) and LDA (Linear Discriminant Analysis), to implement a traceability system to verify the authenticity of 'Limone di Siracusa' production. The results demonstrated a very good geographical discrimination rate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Learning Analytics for Online Discussions: Embedded and Extracted Approaches

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Zhao, Yuting; Hausknecht, Simone Nicole

    2014-01-01

    This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and…

  5. Analytical slave-spin mean-field approach to orbital selective Mott insulators

    NASA Astrophysics Data System (ADS)

    Komijani, Yashar; Kotliar, Gabriel

    2017-09-01

    We use the slave-spin mean-field approach to study particle-hole symmetric one- and two-band Hubbard models in the presence of Hund's coupling interaction. By analytical analysis of the Hamiltonian, we show that the locking of the two orbitals vs orbital selective Mott transition can be formulated within a Landau-Ginzburg framework. By applying the slave-spin mean field to impurity problems, we are able to make a correspondence between impurity and lattice. We also consider the stability of the orbital selective Mott phase to the hybridization between the orbitals and study the limitations of the slave-spin method for treating interorbital tunnelings in the case of multiorbital Bethe lattices with particle-hole symmetry.

  6. Analytical Chemistry: A Literary Approach

    NASA Astrophysics Data System (ADS)

    Lucy, Charles A.

    2000-04-01

    The benefits of incorporating real-world examples of chemistry into lectures and lessons are reflected by the recent inclusion of the Teaching with Problems and Case Studies column in this Journal. However, these examples lie outside the experience of many students, and so much of the impact of "real-world" examples is lost. This paper provides an anthology of references to analytical chemistry techniques from history, popular fiction, and film. Such references are amusing to both instructor and student. Further, the fictional descriptions can serve as a focal point for discussions of a technique's true capabilities and limitations.

  7. Risk Factors Predicting Infectious Lactational Mastitis: Decision Tree Approach versus Logistic Regression Analysis.

    PubMed

    Fernández, Leónides; Mediano, Pilar; García, Ricardo; Rodríguez, Juan M; Marín, María

    2016-09-01

    Objectives Lactational mastitis frequently leads to a premature abandonment of breastfeeding; its development has been associated with several risk factors. This study aims to use a decision tree (DT) approach to establish the main risk factors involved in mastitis and to compare its performance for predicting this condition with a stepwise logistic regression (LR) model. Methods Data from 368 cases (breastfeeding women with mastitis) and 148 controls were collected by a questionnaire about risk factors related to medical history of mother and infant, pregnancy, delivery, postpartum, and breastfeeding practices. The performance of the DT and LR analyses was compared using the area under the receiver operating characteristic (ROC) curve. Sensitivity, specificity and accuracy of both models were calculated. Results Cracked nipples, antibiotics and antifungal drugs during breastfeeding, infant age, breast pumps, familial history of mastitis and throat infection were significant risk factors associated with mastitis in both analyses. Bottle-feeding and milk supply were related to mastitis for certain subgroups in the DT model. The areas under the ROC curves were similar for the LR and DT models (0.870 and 0.835, respectively). The LR model had better classification accuracy and sensitivity than the DT model, but the latter presented better specificity at the optimal threshold of each curve. Conclusions The DT and LR models constitute useful and complementary analytical tools to assess the risk of lactational infectious mastitis. The DT approach identifies high-risk subpopulations that need specific mastitis prevention programs and, therefore, it could be used to make the most of public health resources.
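
    The ROC-curve comparison reported above uses the standard area under the curve, which equals the rank-based Mann-Whitney probability. A self-contained sketch with invented risk scores (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count one half)."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted mastitis risks for cases vs controls from two
# models; numbers invented for illustration.
lr_cases, lr_controls = [0.9, 0.8, 0.7, 0.6], [0.5, 0.4, 0.6, 0.2]
dt_cases, dt_controls = [0.9, 0.7, 0.6, 0.5], [0.5, 0.5, 0.3, 0.2]
print(auc(lr_cases, lr_controls), auc(dt_cases, dt_controls))
```

    Comparing models by AUC in this way summarizes discrimination across all thresholds, while sensitivity and specificity (as reported in the abstract) depend on the threshold chosen on each curve.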

  8. Ram Pressure Stripping Made Easy: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Köppen, J.; Jáchym, P.; Taylor, R.; Palouš, J.

    2018-06-01

    The removal of gas by ram pressure stripping of galaxies is treated by a purely kinematic description. The solution has two asymptotic limits: if the duration of the ram pressure pulse exceeds the period of vertical oscillations perpendicular to the galactic plane, the commonly used quasi-static criterion of Gunn & Gott is obtained which uses the maximum ram pressure that the galaxy has experienced along its orbit. For shorter pulses the outcome depends on the time-integrated ram pressure. This parameter pair fully describes the gas mass fraction that is stripped from a given galaxy. This approach closely reproduces results from SPH simulations. We show that typical galaxies follow a very tight relation in this parameter space corresponding to a pressure pulse length of about 300 Myr. Thus, the Gunn & Gott criterion provides a good description for galaxies in larger clusters. Applying the analytic description to a sample of 232 Virgo galaxies from the GoldMine database, we show that the ICM provides indeed the ram pressures needed to explain the deficiencies. We also can distinguish current and past strippers, including objects whose stripping state was unknown.
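
    The quasi-static Gunn & Gott criterion invoked here is simple enough to evaluate directly. The snippet below checks it for roughly Virgo-like numbers; all values are illustrative assumptions, not taken from the paper:

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
PC = 3.086e16          # m

def msun_per_pc2(x):
    """Convert a surface density from M_sun/pc^2 to kg/m^2."""
    return x * M_SUN / PC**2

def gunn_gott_strips(rho_icm, v, sigma_star, sigma_gas):
    """Quasi-static Gunn & Gott (1972) criterion: gas is stripped where
    the ram pressure rho_icm * v^2 exceeds the gravitational restoring
    pressure 2*pi*G*Sigma_star*Sigma_gas (all SI units)."""
    ram = rho_icm * v**2
    restoring = 2 * math.pi * G * sigma_star * sigma_gas
    return ram > restoring, ram, restoring

# Illustrative, roughly cluster-like values (assumed):
stripped, ram, restoring = gunn_gott_strips(
    rho_icm=1.67e-24,                 # ~1e-3 protons per cm^3
    v=1.5e6,                          # 1500 km/s orbital speed
    sigma_star=msun_per_pc2(50),
    sigma_gas=msun_per_pc2(10),
)
print(stripped, ram, restoring)
```

    For short pressure pulses, the abstract's point is that this maximum-pressure test should be replaced by a condition on the time-integrated ram pressure.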

  9. The analytical approach to optimization of active region structure of quantum dot laser

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Omelchenko, A. V.; Maximov, M. V.

    2014-10-01

    Using the analytical approach introduced in our previous papers, we analyse the possibilities of optimizing the size and structure of the active region of semiconductor quantum dot lasers emitting via ground-state optical transitions. It is shown that there are an optimal cavity length and number of QD layers in the laser active region which allow one to obtain a lasing spectrum of a given width at the minimum injection current. The laser efficiency corresponding to the injection current optimized by the cavity length is practically equal to its maximum value.

  10. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    PubMed

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
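
    The net analyte signal underlying these figures of merit is the part of the analyte's signal orthogonal to the space spanned by the interferents. A minimal first-order (vector) sketch with invented spectra; the multiway extensions discussed in the abstract go beyond this:

```python
import numpy as np

# Hypothetical spectra at 6 wavelengths (invented numbers).
analyte = np.array([0.1, 0.5, 0.9, 0.7, 0.3, 0.1])
Z = np.array([                       # columns = interferent spectra
    [0.8, 0.6, 0.4, 0.2, 0.1, 0.0],
    [0.0, 0.1, 0.3, 0.6, 0.8, 0.9],
]).T

# Net analyte signal: project out everything the interferents can span,
# nas = (I - Z Z^+) s, with Z^+ the Moore-Penrose pseudoinverse.
P_orth = np.eye(Z.shape[0]) - Z @ np.linalg.pinv(Z)
nas = P_orth @ analyte

sensitivity = np.linalg.norm(nas)                     # ||nas||
selectivity = sensitivity / np.linalg.norm(analyte)   # ||nas|| / ||s||
print(f"sensitivity: {sensitivity:.3f}, selectivity: {selectivity:.3f}")
```

    Selectivity near 1 means the interferents overlap little with the analyte; the abstract's point is that this clean picture does not extend directly to multianalyte or third-order multiway calibration.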

  12. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Approach of Decision Making Based on the Analytic Hierarchy Process for Urban Landscape Management

    NASA Astrophysics Data System (ADS)

    Srdjevic, Zorica; Lakicevic, Milena; Srdjevic, Bojan

    2013-03-01

    This paper proposes a two-stage group decision making approach to urban landscape management and planning supported by the analytic hierarchy process. The proposed approach combines an application of the consensus convergence model and the weighted geometric mean method. The application of the proposed approach is shown on a real urban landscape planning problem with a park-forest in Belgrade, Serbia. Decision makers were policy makers, i.e., representatives of several key national and municipal institutions, and experts coming from different scientific fields. As a result, the most suitable management plan from the set of plans is recognized. It includes both native vegetation renewal in degraded areas of park-forest and continued maintenance of its dominant tourism function. Decision makers included in this research consider the approach to be transparent and useful for addressing landscape management tasks. The central idea of this paper can be understood in a broader sense and easily applied to other decision making problems in various scientific fields.

  14. Approach of decision making based on the analytic hierarchy process for urban landscape management.

    PubMed

    Srdjevic, Zorica; Lakicevic, Milena; Srdjevic, Bojan

    2013-03-01

    This paper proposes a two-stage group decision making approach to urban landscape management and planning supported by the analytic hierarchy process. The proposed approach combines an application of the consensus convergence model and the weighted geometric mean method. The application of the proposed approach is shown on a real urban landscape planning problem with a park-forest in Belgrade, Serbia. Decision makers were policy makers, i.e., representatives of several key national and municipal institutions, and experts coming from different scientific fields. As a result, the most suitable management plan from the set of plans is recognized. It includes both native vegetation renewal in degraded areas of park-forest and continued maintenance of its dominant tourism function. Decision makers included in this research consider the approach to be transparent and useful for addressing landscape management tasks. The central idea of this paper can be understood in a broader sense and easily applied to other decision making problems in various scientific fields.
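    The two records above aggregate individual judgments with the weighted geometric mean method of the analytic hierarchy process. A minimal sketch of that aggregation step is shown below; the function name and inputs are illustrative, not taken from the paper, and the decision-maker weights would in practice come from the consensus convergence model.

    ```python
    import numpy as np

    def weighted_geometric_mean(priority_vectors, weights):
        """Aggregate individual AHP priority vectors into one group vector
        via the weighted geometric mean, then renormalize to sum to 1."""
        P = np.asarray(priority_vectors, dtype=float)  # shape (n_decision_makers, n_alternatives)
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                                # normalize decision-maker weights
        g = np.prod(P ** w[:, None], axis=0)           # element-wise weighted geometric mean
        return g / g.sum()                             # group priority vector, sums to 1
    ```

    For example, two equally weighted decision makers with priorities `[0.4, 0.6]` and `[0.6, 0.4]` aggregate to the balanced group vector `[0.5, 0.5]`.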

  15. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantic-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform comprises rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. 
In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data

  16. An Analytical Thermal Model for Autonomous Soaring Research

    NASA Technical Reports Server (NTRS)

    Allen, Michael

    2006-01-01

    A viewgraph presentation describing an analytical thermal model used to enable research on autonomous soaring for a small UAV aircraft is given. The topics include: 1) Purpose; 2) Approach; 3) SURFRAD Data; 4) Convective Layer Thickness; 5) Surface Heat Budget; 6) Surface Virtual Potential Temperature Flux; 7) Convective Scaling Velocity; 8) Other Calculations; 9) Yearly trends; 10) Scale Factors; 11) Scale Factor Test Matrix; 12) Statistical Model; 13) Updraft Strength Calculation; 14) Updraft Diameter; 15) Updraft Shape; 16) Smoothed Updraft Shape; 17) Updraft Spacing; 18) Environment Sink; 19) Updraft Lifespan; 20) Autonomous Soaring Research; 21) Planned Flight Test; and 22) Mixing Ratio.

  17. Patterns of Work and Family Involvement among Single and Dual Earner Couples: Two Competing Analytical Approaches.

    ERIC Educational Resources Information Center

    Yogev, Sara; Brett, Jeanne

    This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…

  18. Simplified analytical model and balanced design approach for light-weight wood-based structural panel in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2016-01-01

    This paper presents a simplified analytical model and balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters are required as input for finite element analysis (FEA) models during the preliminary design and optimization process, an equivalent method was developed to analyze the mechanical...

  19. A Big Data and Learning Analytics Approach to Process-Level Feedback in Cognitive Simulations.

    PubMed

    Pecaric, Martin; Boutis, Kathy; Beckstead, Jason; Pusic, Martin

    2017-02-01

    Collecting and analyzing large amounts of process data for the purposes of education can be considered a big data/learning analytics (BD/LA) approach to improving learning. However, in the education of health care professionals, the application of BD/LA is limited to date. The authors discuss the potential advantages of the BD/LA approach for the process of learning via cognitive simulations. Using the lens of a cognitive model of radiograph interpretation with four phases (orientation, searching/scanning, feature detection, and decision making), they reanalyzed process data from a cognitive simulation of pediatric ankle radiography where 46 practitioners from three expertise levels classified 234 cases online. To illustrate the big data component, they highlight the data available in a digital environment (time-stamped, click-level process data). Learning analytics were illustrated using algorithmic computer-enabled approaches to process-level feedback. For each phase, the authors were able to identify examples of potentially useful BD/LA measures. For orientation, the trackable behavior of re-reviewing the clinical history was associated with increased diagnostic accuracy. For searching/scanning, evidence of skipping views was associated with an increased false-negative rate. For feature detection, heat maps overlaid on the radiograph can provide a metacognitive visualization of common novice errors. For decision making, the measured influence of sequence effects can reflect susceptibility to bias, whereas computer-generated path maps can provide insights into learners' diagnostic strategies. In conclusion, the augmented collection and dynamic analysis of learning process data within a cognitive simulation can improve feedback and prompt more precise reflection on a novice clinician's skill development.

  20. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    NASA Astrophysics Data System (ADS)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting edge science problems, yet specific skills are required to work with these data. The Earth analytics education program, a core component of Earth Lab at the University of Colorado - Boulder, is building a data intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data intensive instruction combined with domain science learning to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted online search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching that includes both synchronous in-person teaching and active classroom hands-on learning, combined with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  1. Maternal and infant activity: Analytic approaches for the study of circadian rhythm.

    PubMed

    Thomas, Karen A; Burr, Robert L; Spieker, Susan

    2015-11-01

    The study of infant and mother circadian rhythm entails choice of instruments appropriate for use in the home environment as well as selection of analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72 h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of 24-h period (cosinor magnitude and R2, NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and additionally the effect size (eta2) for change over time evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and

  2. Maternal and infant activity: Analytic approaches for the study of circadian rhythm

    PubMed Central

    Thomas, Karen A.; Burr, Robert L.; Spieker, Susan

    2015-01-01

    The study of infant and mother circadian rhythm entails choice of instruments appropriate for use in the home environment as well as selection of analytic approach that characterizes circadian rhythm. While actigraphy monitoring suits the needs of home study, limited studies have examined mother and infant rhythm derived from actigraphy. Among this existing research a variety of analyses have been employed to characterize 24-h rhythm, reducing ability to evaluate and synthesize findings. Few studies have examined the correspondence of mother and infant circadian parameters for the most frequently cited approaches: cosinor, non-parametric circadian rhythm analysis (NPCRA), and autocorrelation function (ACF). The purpose of this research was to examine analytic approaches in the study of mother and infant circadian activity rhythm. Forty-three healthy mother and infant pairs were studied in the home environment over a 72 h period at infant age 4, 8, and 12 weeks. Activity was recorded continuously using actigraphy monitors and mothers completed a diary. Parameters of circadian rhythm were generated from cosinor analysis, NPCRA, and ACF. The correlation among measures of rhythm center (cosinor mesor, NPCRA mid level), strength or fit of 24-h period (cosinor magnitude and R2, NPCRA amplitude and relative amplitude (RA)), phase (cosinor acrophase, NPCRA M10 and L5 midpoint), and rhythm stability and variability (NPCRA interdaily stability (IS) and intradaily variability (IV), ACF) was assessed, and additionally the effect size (eta2) for change over time evaluated. Results suggest that cosinor analysis, NPCRA, and autocorrelation provide several comparable parameters of infant and maternal circadian rhythm center, fit, and phase. IS and IV were strongly correlated with the 24-h cycle fit. The circadian parameters analyzed offer separate insight into rhythm and differing effect size for the detection of change over time. Findings inform selection of analysis and
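    The cosinor parameters named in the two records above (mesor, amplitude, acrophase) come from a least-squares fit of a 24-h cosine to the activity series. A minimal sketch of that fit follows; the function name is illustrative and the actigraphy data are assumed to be sampled at known clock times.

    ```python
    import numpy as np

    def cosinor_fit(t_hours, activity, period=24.0):
        """Single-component cosinor: activity ~ mesor + A*cos(omega*(t - acrophase)).

        Rewritten as the linear model y = M + b*cos(omega*t) + c*sin(omega*t),
        which can be solved by ordinary least squares.
        Returns (mesor, amplitude, acrophase_in_hours).
        """
        omega = 2 * np.pi / period
        X = np.column_stack([np.ones_like(t_hours),
                             np.cos(omega * t_hours),
                             np.sin(omega * t_hours)])
        mesor, b, c = np.linalg.lstsq(X, activity, rcond=None)[0]
        amplitude = np.hypot(b, c)                         # sqrt(b^2 + c^2)
        acrophase = (np.arctan2(c, b) / omega) % period    # time of peak, in hours
        return mesor, amplitude, acrophase
    ```

    On a noiseless synthetic series with mesor 10, amplitude 3, and peak at hour 14, the fit recovers all three parameters.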

  3. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  4. Soft x-ray continuum radiation transmitted through metallic filters: an analytical approach to fast electron temperature measurements.

    PubMed

    Delgado-Aparicio, L; Tritz, K; Kramer, T; Stutman, D; Finkenthal, M; Hill, K; Bitter, M

    2010-10-01

    A new set of analytic formulas describes the transmission of soft x-ray continuum radiation through a metallic foil for its application to fast electron temperature measurements in fusion plasmas. This novel approach shows good agreement with numerical calculations over a wide range of plasma temperatures in contrast with the solutions obtained when using a transmission approximated by a single-Heaviside function [S. von Goeler et al., Rev. Sci. Instrum. 70, 599 (1999)]. The new analytic formulas can improve the interpretation of the experimental results and thus contribute in obtaining fast temperature measurements in between intermittent Thomson scattering data.

  5. Imaging MALDI MS of Dosed Brain Tissues Utilizing an Alternative Analyte Pre-extraction Approach

    NASA Astrophysics Data System (ADS)

    Quiason, Cristine M.; Shahidi-Latham, Sheerin K.

    2015-06-01

    Matrix-assisted laser desorption ionization (MALDI) imaging mass spectrometry has been adopted in the pharmaceutical industry as a useful tool to detect xenobiotic distribution within tissues. A unique sample preparation approach for MALDI imaging has been described here for the extraction and detection of cobimetinib and clozapine, which were previously undetectable in mouse and rat brain using a single matrix application step. Employing a combination of a buffer wash and a cyclohexane pre-extraction step prior to standard matrix application, the xenobiotics were successfully extracted and detected with an 8 to 20-fold gain in sensitivity. This alternative approach for sample preparation could serve as an advantageous option when encountering difficult to detect analytes.

  6. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
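    As background to the design type discussed in the record above: a classic two-level fractional factorial is built by taking a full factorial in a few base factors and defining the remaining columns as products of base columns. The sketch below is the textbook two-level construction only, not the generalized multilevel design of the Article, and the function name is illustrative.

    ```python
    import itertools

    def fractional_factorial(k_base, generators):
        """Two-level fractional factorial in coded units (-1/+1).

        Builds a full 2^k_base factorial in the base factors, then appends one
        generated column per generator, each the product of the listed base
        columns. E.g. generators=[(0, 1)] adds C = A*B, yielding a 2^(3-1) design.
        """
        design = []
        for base in itertools.product((-1, 1), repeat=k_base):
            extra = []
            for g in generators:
                prod = 1
                for i in g:
                    prod *= base[i]   # defining relation, e.g. C = A*B
                extra.append(prod)
            design.append(list(base) + extra)
        return design
    ```

    Negating a generated column yields the complementary half-fraction, which is the two-level analogue of the fold-over mentioned in the abstract.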

  7. A semi-analytical refrigeration cycle modelling approach for a heat pump hot water heater

    NASA Astrophysics Data System (ADS)

    Panaras, G.; Mathioulakis, E.; Belessiotis, V.

    2018-04-01

    The use of heat pump systems in applications like the production of hot water or space heating makes the modelling of the underlying processes important, both for evaluating the performance of existing systems and for design purposes. The proposed semi-analytical model offers the opportunity to estimate the performance of a heat pump system producing hot water without using detailed geometrical or any performance data. This is important, as for many commercial systems the type and characteristics of the involved subcomponents can hardly be detected, thus not allowing the implementation of more analytical approaches or the exploitation of the manufacturers' catalogue performance data. The analysis addresses the issues related to the development of the models of the subcomponents involved in the studied system. Issues not thoroughly discussed in the existing literature, such as the refrigerant mass inventory when an accumulator is present, are examined effectively.

  8. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  9. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between PPCPs and endocrine disrupting compounds (EDCs), and between both of these and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  10. A geovisual analytic approach to understanding geo-social relationships in the international trade network.

    PubMed

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly 'balkanized' (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above.
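    The record above validates patterns observed through interactive geovisual analytics with quantitative indicators and a Monte-Carlo approach. A generic permutation-test sketch of that idea follows; the function name is illustrative and the statistic is a stand-in, since the paper's two indicators are domain-specific.

    ```python
    import numpy as np

    def monte_carlo_pvalue(x, y, statistic, n_iter=999, seed=None):
        """One-sided Monte Carlo permutation test.

        The p-value is the fraction of label-shuffled datasets whose statistic
        is at least as extreme as the observed one, with an add-one correction
        so the estimate is never exactly zero.
        """
        rng = np.random.default_rng(seed)
        observed = statistic(x, y)
        null = np.array([statistic(x, rng.permutation(y)) for _ in range(n_iter)])
        return (1 + np.sum(null >= observed)) / (n_iter + 1)
    ```

    A small p-value indicates the observed pattern is unlikely to arise from a random pairing of the two variables, which is the logic used to support the paper's two-part hypothesis.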

  11. A Geovisual Analytic Approach to Understanding Geo-Social Relationships in the International Trade Network

    PubMed Central

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M.

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly ‘balkanized’ (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above. PMID:24558409

  12. A Modern Approach to College Analytical Chemistry.

    ERIC Educational Resources Information Center

    Neman, R. L.

    1983-01-01

    Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…

  13. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2010-04-01

    The annual update of the list of prohibited substances and doping methods as issued by the World Anti-Doping Agency (WADA) allows the implementation of most recent considerations of performance manipulation and emerging therapeutics into human sports doping control programmes. The annual banned-substance review for human doping controls critically summarizes recent innovations in analytical approaches that support the efforts of convicting cheating athletes by improved or newly established methods that focus on known as well as newly outlawed substances and doping methods. In the current review, literature published between October 2008 and September 2009 reporting on new and/or enhanced procedures and techniques for doping analysis, as well as aspects relevant to the doping control arena, was considered to complement the 2009 annual banned-substance review.

  14. Teaching dermatoscopy of pigmented skin tumours to novices: comparison of analytic vs. heuristic approach.

    PubMed

    Tschandl, P; Kittler, H; Schmid, K; Zalaudek, I; Argenziano, G

    2015-06-01

    There are two strategies to approach the dermatoscopic diagnosis of pigmented skin tumours, namely the verbal-based analytic and the more visual-global heuristic method. It is not known whether one or the other is more efficient in teaching dermatoscopy. To compare two teaching methods in short-term training of dermatoscopy to medical students. Fifty-seven medical students in the last year of the curriculum were given a 1-h lecture based on either the heuristic or the analytic teaching of dermatoscopy. Before and after this session, they were shown the same 50 lesions and asked to diagnose them and rate the chance of malignancy. Test lesions consisted of melanomas, basal cell carcinomas, nevi, seborrhoeic keratoses, benign vascular tumours and dermatofibromas. Performance measures were diagnostic accuracy regarding malignancy, as measured by the area under receiver operating characteristic curves (range: 0-1), as well as per cent correct diagnoses (range: 0-100%). Diagnostic accuracy and per cent correct diagnoses increased by +0.21 and +32.9% (heuristic teaching) and +0.19 and +35.7% (analytic teaching), respectively (P for all <0.001). Neither diagnostic accuracy (P = 0.585) nor per cent correct diagnoses (P = 0.298) differed between the two groups. Short-term training of dermatoscopy to medical students allows significant improvement in diagnostic abilities. Choosing a heuristic or analytic method does not influence this effect in short training using common pigmented skin lesions. © 2014 European Academy of Dermatology and Venereology.
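    The diagnostic-accuracy figures in the record above are areas under receiver operating characteristic curves computed from malignancy ratings. One standard way to obtain the AUC from rating data is the rank-sum (Mann-Whitney) identity, sketched below; this is an illustrative implementation, not the study's analysis code.

    ```python
    def roc_auc(scores, labels):
        """AUC via the Mann-Whitney identity: the probability that a randomly
        chosen positive case (label 1, e.g. malignant) is scored higher than a
        randomly chosen negative case (label 0), counting ties as one half."""
        pos = [s for s, l in zip(scores, labels) if l == 1]
        neg = [s for s, l in zip(scores, labels) if l == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))
    ```

    An AUC of 0.5 corresponds to chance-level ratings and 1.0 to perfect separation of malignant from benign lesions, matching the 0-1 range quoted in the abstract.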

  15. Analysis of a Nonlinear Oscillator with CAS through Analytical, Numerical, and Qualitative Approaches: A Prototype for Teaching

    ERIC Educational Resources Information Center

    López-García, Jeanett; Jiménez Zamudio, Jorge Javier

    2017-01-01

    It is very common to find in contemporary literature of Differential Equations, the need to incorporate holistically in teaching and learning the three different approaches: analytical, qualitative, and numerical, for continuous dynamical systems. However, nowadays, in some Bachelor of Science that includes only one course in differential…

  16. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  17. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic-contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  18. Analytical approach to the multi-state lasing phenomenon in quantum dot lasers

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Omelchenko, A. V.; Maximov, M. V.

    2013-03-01

    We introduce an analytical approach to describe the multi-state lasing phenomenon in quantum dot lasers. We show that the key parameter is the hole-to-electron capture rate ratio. If it is lower than a certain critical value, the complete quenching of ground-state lasing takes place at high injection levels. At higher values of the ratio, the model predicts saturation of the ground-state power. This explains the diversity of experimental results and their contradiction to the conventional rate equation model. Recently found enhancement of ground-state lasing in p-doped samples and temperature dependence of the ground-state power are also discussed.

  19. Novel predictive models for metabolic syndrome risk: a "big data" analytic approach.

    PubMed

    Steinberg, Gregory B; Church, Bruce W; McCall, Carol J; Scott, Adam B; Kalis, Brian P

    2014-06-01

We applied a proprietary "big data" analytic platform--Reverse Engineering and Forward Simulation (REFS)--to dimensions of metabolic syndrome extracted from a large data set compiled from Aetna's databases for 1 large national customer. Our goals were to accurately predict subsequent risk of metabolic syndrome and its various factors on both a population and individual level. The study data set included demographic, medical claim, pharmacy claim, laboratory test, and biometric screening results for 36,944 individuals. The platform reverse-engineered functional models of systems from diverse and large data sources and provided a simulation framework for insight generation. The platform interrogated data sets from the results of 2 Comprehensive Metabolic Syndrome Screenings (CMSSs) as well as complete coverage records; complete data from medical claims, pharmacy claims, and lab results for 2010 and 2011; and responses to health risk assessment questions. The platform predicted subsequent risk of metabolic syndrome, both overall and by risk factor, on population and individual levels, with ROC/AUC varying from 0.80 to 0.88. We demonstrated that improving waist circumference and blood glucose yielded the largest benefits on subsequent risk and medical costs. We also showed that adherence to prescribed medications and, particularly, adherence to routine scheduled outpatient doctor visits, reduced subsequent risk. The platform generated individualized insights using available heterogeneous data within 3 months. The accuracy and short time to insight with this type of analytic platform allowed Aetna to develop targeted cost-effective care management programs for individuals with or at risk for metabolic syndrome.
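The REFS platform itself is proprietary, but the ROC/AUC figures reported above can be illustrated with a minimal sketch of how that metric is computed from predicted risk scores, using the rank-sum (Mann-Whitney) identity. All data values below are invented for illustration.

```python
# Illustrative only: computes ROC AUC from predicted risk scores and observed
# outcomes via the Mann-Whitney identity; the data are made up.

def roc_auc(scores, labels):
    """AUC = P(score of a random positive > score of a random negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted metabolic-syndrome risks and observed outcomes
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    0,   1,   0]
print(round(roc_auc(scores, labels), 3))  # → 0.75
```

A value of 0.5 corresponds to chance-level discrimination, 1.0 to perfect ranking; the 0.80-0.88 range reported above sits well into the useful region.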

  20. VAST Challenge 2016: Streaming Visual Analytics

    DTIC Science & Technology

    2016-10-25

    understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time...received. Mini-Challenge 1: Design Challenge Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia...Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries for what constitutes a visual analytics system , and to

  1. Factors Affecting the Location of Road Emergency Bases in Iran Using Analytical Hierarchy Process (AHP).

    PubMed

    Bahadori, Mohammadkarim; Hajebrahimi, Ahmad; Alimohammadzadeh, Khalil; Ravangard, Ramin; Hosseini, Seyed Mojtaba

    2017-10-01

To identify and prioritize factors affecting the location of road emergency bases in Iran using the Analytical Hierarchy Process (AHP). This was a mixed-method (quantitative-qualitative) study conducted in 2016. The participants included professionals and experts in the field of pre-hospital and road emergency services working in the Health Deputy of the Iran Ministry of Health and Medical Education, who were selected using a purposive sampling method. First, the factors affecting the location of road emergency bases in Iran were identified through a literature review and interviews with the experts. Then, the identified factors were scored and prioritized based on the professionals' and experts' viewpoints using the analytic hierarchy process (AHP) technique and its related pairwise-comparison questionnaire. The collected data were analyzed using MAXQDA 10.0 software for the answers given to the open question and Expert Choice 10.0 software to determine the weights and priorities of the identified factors. The results showed that eight factors were effective in locating the road emergency bases in Iran from the viewpoints of the studied professionals and experts: in order of importance, distance from the next base, region population, topography and geographical situation of the region, volume of road traffic, existence of amenities (water, electricity, gas, etc.) and proximity to a village, accident-prone sites, university ownership of the base site, and proximity to a toll-house. Among these eight factors, "distance from the next base" and "region population" were the most important, with weights markedly higher than those of the other factors.
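The AHP weighting step described above (the study used Expert Choice with eight factors) can be sketched in a few lines: the factor weights are the principal eigenvector of the pairwise-comparison matrix, and a consistency ratio checks the judgments. The 3x3 matrix below is invented for illustration.

```python
# A minimal AHP sketch: weights via power iteration on a hypothetical
# pairwise-comparison matrix, plus Saaty's consistency ratio.

def ahp_weights(M, iters=200):
    """Principal-eigenvector weights of a pairwise comparison matrix."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max and consistency ratio (Saaty's random index for n=3 is 0.58)
    v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] / w[i] for i in range(n)) / n
    cr = ((lam - n) / (n - 1)) / 0.58
    return w, cr

# e.g. factor A judged 3x as important as B and 5x as important as C
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(M)
print([round(x, 2) for x in w], round(cr, 3))
```

A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.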

  2. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, one common method for searching the critical slip surface is the Genetic Algorithm (GA), while the method to calculate the slope safety factor is Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method, like the finite element method, is only approximate. This paper proposes a new way to determine the minimum slope safety factor: the safety factor is determined with an analytical solution, while the critical slip surface is searched with a Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method. The Genetic-Traversal Random Method uses random picking to implement mutation. A computer program was developed to automate the Genetic-Traversal Random search. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods. The obtained minimum safety factor is, however, very close to the lower-bound solutions of the slope safety factor given by the ANSYS software. PMID:24782679
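The paper's analytical solution is not reproduced in the abstract; as context, here is a minimal sketch of the Fellenius (ordinary) method of slices that it is compared against. The slice data and soil parameters below are invented for illustration.

```python
# Fellenius' ordinary method of slices for a circular slip surface:
#   FS = sum(c*l + W*cos(alpha)*tan(phi)) / sum(W*sin(alpha))
# with cohesion c, friction angle phi, slice weight W, base angle alpha,
# and base length l. All numbers are hypothetical.

import math

def fellenius_fs(slices, c, phi_deg):
    """Safety factor as the ratio of resisting to driving moments."""
    tan_phi = math.tan(math.radians(phi_deg))
    resisting = sum(c * l + W * math.cos(a) * tan_phi for W, a, l in slices)
    driving = sum(W * math.sin(a) for W, a, l in slices)
    return resisting / driving

# three hypothetical slices: (weight kN/m, base inclination rad, base length m)
slices = [(120.0, math.radians(10), 2.1),
          (180.0, math.radians(25), 2.3),
          (90.0,  math.radians(40), 2.6)]
print(round(fellenius_fs(slices, c=15.0, phi_deg=20.0), 2))
```

A safety factor above 1 indicates the slip surface is nominally stable; the critical slip surface is the one minimizing this value over candidate circles.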

  3. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  4. BOOK REVIEW Analytical and Numerical Approaches to Mathematical Relativity

    NASA Astrophysics Data System (ADS)

    Stewart, John M.

    2007-08-01

The 319th Wilhelm-and-Else-Heraeus Seminar 'Mathematical Relativity: New Ideas and Developments' took place in March 2004. Twelve of the invited speakers have expanded their one hour talks into the papers appearing in this volume, preceded by a foreword by Roger Penrose. The first group consists of four papers on 'differential geometry and differential topology'. Paul Ehrlich opens with a very witty review of global Lorentzian geometry, which caused this reviewer to think more carefully about how he uses the adjective 'generic'. Robert Low addresses the issue of causality with a description of the 'space of null geodesics' and a tentative proposal for a new definition of causal boundary. The underlying review of global Lorentzian geometry is continued by Antonio Masiello, looking at variational approaches (actually valid for more general semi-Riemannian manifolds). This group concludes with a very clear review of pp-wave spacetimes from José Flores and Miguel Sánchez. (This reviewer was delighted to see a reproduction of Roger Penrose's seminal (1965) picture of null geodesics in plane wave spacetimes which attracted him into the subject.) Robert Beig opens the second group 'analytic methods and differential equations' with a brief but careful discussion of symmetric (regular) hyperbolicity for first (second) order systems, respectively, of partial differential equations. His description is peppered with examples, many specific to relativistic continuum mechanics. There follows a succinct review of linear elliptic boundary value problems with applications to general relativity from Sergio Dain. The numerous examples he provides are thought-provoking. The 'standard cosmological model' has been well understood for three quarters of a century. However, recent observations suggest that the expansion in our Universe may be accelerating. Alan Rendall provides a careful discussion of the changes, both mathematical and physical, to the standard model which might be needed

  5. Analytical and quasi-Bayesian methods as development of the iterative approach for mixed radiation biodosimetry.

    PubMed

    Słonecka, Iwona; Łukasik, Krzysztof; Fornalski, Krzysztof W

    2018-06-04

The present paper proposes two methods of calculating the components of the dose absorbed by the human body after exposure to a mixed neutron and gamma radiation field. The article presents a novel analytical approach that replaces the common iterative method, thus reducing the calculation time. It also shows a possibility of estimating the neutron and gamma doses when their ratio in the mixed beam is not precisely known.
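For context, the classic iterative dose-partition scheme that the paper derives an analytical replacement for can be sketched as follows. The calibration coefficients, observed aberration yield, and gamma-to-neutron dose ratio below are invented; the neutron yield is taken as linear in dose and the gamma yield as linear-quadratic, which is the conventional assumption in biodosimetry.

```python
# Sketch of the standard iterative neutron/gamma dose partition: attribute the
# whole observed yield Y to neutrons, infer the gamma dose from the assumed
# dose ratio, subtract the implied gamma yield, and repeat until convergence.

def partition_doses(Y, alpha_n, alpha_g, beta_g, g_over_n, tol=1e-10):
    """Iteratively split an observed yield Y into neutron and gamma doses."""
    Dn = Y / alpha_n                      # step 1: all yield from neutrons
    for _ in range(1000):
        Dg = g_over_n * Dn                # step 2: gamma dose via known ratio
        Yg = alpha_g * Dg + beta_g * Dg**2
        Dn_new = (Y - Yg) / alpha_n       # step 3: re-estimate neutron dose
        if abs(Dn_new - Dn) < tol:
            return Dn_new, g_over_n * Dn_new
        Dn = Dn_new
    return Dn, g_over_n * Dn

Dn, Dg = partition_doses(Y=0.5, alpha_n=0.8, alpha_g=0.02,
                         beta_g=0.06, g_over_n=1.0)
print(round(Dn, 3), round(Dg, 3))
```

The iteration is a contraction for realistic coefficient values, so it converges in a handful of steps; the paper's analytical form removes the loop entirely.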

  6. Psychopathy Factor Interactions and Co-Occurring Psychopathology: Does Measurement Approach Matter?

    PubMed Central

    Hunt, Elizabeth; Bornovalova, Marina A.; Kimonis, Eva R.; Lilienfeld, Scott O.; Poythress, Norman G.

    2014-01-01

    The two dimensions of psychopathy as operationalized by various measurement tools show differential associations with psychopathology; however, evidence suggests that the statistical interaction of Factor 1 (F1) and Factor 2 (F2) may be important in understanding associations with psychopathology. Findings regarding the interactive effects of F1 and F2 are mixed, as both potentiating and protective effects have emerged. Moreover, approaches to measuring F1 (e.g. clinical interview versus self-report) are based on different conceptualizations of F1, which may influence the interactive effects. The current study aims to 1) elucidate the influence of F1 and F2 on psychopathology by using both variable-centered and person-centered approaches and 2) determine if the measurement of F1 influences the interactive effects of F1 and F2 by comparing the strength of interactive effects across F1 measures in a sample of over 1,500 offenders. Across analytic methods, there were very few cases in which F1 statistically influenced the association between F2 and psychopathology, such that F1 failed to evidence either potentiating or protective effects on F2. Furthermore, the conceptualization of F1 across psychopathy measures did not impact the interactive effects of F1 and F2. These findings suggest that F2 is probably driving the relations between psychopathy and other forms of psychopathology, and that F1 may play less of a role in interacting with F2 than previously believed. PMID:25580612

  7. Substrate mass transfer: analytical approach for immobilized enzyme reactions

    NASA Astrophysics Data System (ADS)

    Senthamarai, R.; Saibavani, T. N.

    2018-04-01

In this paper, the boundary value problem in immobilized enzyme reactions is formulated and an approximate expression for the substrate concentration without external mass transfer resistance is presented. He's variational iteration method is used to obtain approximate analytical solutions of a nonlinear differential equation containing a nonlinear term related to the enzymatic reaction. The resulting analytical solution for the dimensionless substrate concentration profile is discussed in terms of the dimensionless reaction parameters α and β.
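The abstract does not reproduce the governing equation; the usual dimensionless form of this problem (a planar slab with Michaelis-Menten kinetics) is u'' = αu/(1 + βu) with u'(0) = 0 and u(1) = 1. As a numerical cross-check of the kind of analytical solution described, here is a minimal finite-difference sketch; the parameter values are invented.

```python
# Solves u'' = alpha*u/(1 + beta*u), u'(0)=0, u(1)=1 (dimensionless
# Michaelis-Menten reaction-diffusion in a slab) by Gauss-Seidel sweeps
# over a finite-difference grid. Parameters alpha, beta are hypothetical.

def substrate_profile(alpha, beta, n=100, sweeps=10000):
    h = 1.0 / n
    u = [1.0] * (n + 1)                    # initial guess; u[n] = 1 is fixed
    for _ in range(sweeps):
        u[0] = u[1]                        # zero-flux boundary, u'(0) = 0
        for i in range(1, n):
            f = alpha * u[i] / (1.0 + beta * u[i])
            u[i] = 0.5 * (u[i - 1] + u[i + 1] - h * h * f)
    return u

u = substrate_profile(alpha=5.0, beta=1.0)
print(round(u[0], 3))   # substrate concentration at the sealed face
```

The profile decreases monotonically from the exposed surface (u = 1) toward the sealed face, with larger α (faster reaction relative to diffusion) depleting the interior more strongly.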

  8. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
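The acid-number arithmetic behind TAN-style titrations such as ASTM D664 (and, by extension, the modified AMTAN) is simple enough to sketch; the sample figures below are invented for illustration.

```python
# Acid number in mg KOH per g of sample:
#   TAN = 56.1 * V_titrant(mL) * N(mol/L) / m_sample(g)
# where 56.1 g/mol is the molar mass of KOH. Numbers are hypothetical.

def acid_number(v_ml, normality, sample_g):
    return 56.1 * v_ml * normality / sample_g

# e.g. 2.5 mL of 0.1 N KOH to neutralize a 1.4 g bio-oil aliquot
print(round(acid_number(2.5, 0.1, 1.4), 1))  # → 10.0
```

The AMTAN modification changes the carrier solvent, not this underlying calculation, which is why its results remain comparable to the conventional ASTM method.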

  9. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

By considering the current regulatory requirements for analytical method development, a reversed-phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage forms has been optimized using the analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile, and a risk assessment for method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump and a photodiode array detector was used in this work. The experiments were planned using a central composite design, which could save time, reagents and other resources. Sigma Tech software was used to plan and analyze the experimental observations and to obtain a quadratic process model. The process model was used to predict the retention time. The retention times predicted from the contour diagram were verified experimentally and agreed with the actual data. The optimized method used a flow rate of 1.2 ml/min and a mobile phase of methanol and 0.2% triethylamine in water at 85:15, % v/v, with pH adjusted to 6.5. The method was validated and verified for the targeted method performance, robustness and system suitability during method transfer. PMID:26997704

  10. A new approach to analytic, non-perturbative and gauge-invariant QCD

    NASA Astrophysics Data System (ADS)

    Fried, H. M.; Grandou, T.; Sheu, Y.-M.

    2012-11-01

    Following a previous calculation of quark scattering in eikonal approximation, this paper presents a new, analytic and rigorous approach to the calculation of QCD phenomena. In this formulation a basic distinction between the conventional "idealistic" description of QCD and a more "realistic" description is brought into focus by a non-perturbative, gauge-invariant evaluation of the Schwinger solution for the QCD generating functional in terms of the exact Fradkin representations of Green's functional G(x,y|A) and the vacuum functional L[A]. Because quarks exist asymptotically only in bound states, their transverse coordinates can never be measured with arbitrary precision; the non-perturbative neglect of this statement leads to obstructions that are easily corrected by invoking in the basic Lagrangian a probability amplitude which describes such transverse imprecision. The second result of this non-perturbative analysis is the appearance of a new and simplifying output called "Effective Locality", in which the interactions between quarks by the exchange of a "gluon bundle"-which "bundle" contains an infinite number of gluons, including cubic and quartic gluon interactions-display an exact locality property that reduces the several functional integrals of the formulation down to a set of ordinary integrals. It should be emphasized that "non-perturbative" here refers to the effective summation of all gluons between a pair of quark lines-which may be the same quark line, as in a self-energy graph-but does not (yet) include a summation over all closed-quark loops which are tied by gluon-bundle exchange to the rest of the "Bundle Diagram". As an example of the power of these methods we offer as a first analytic calculation the quark-antiquark binding potential of a pion, and the corresponding three-quark binding potential of a nucleon, obtained in a simple way from relevant eikonal scattering approximations. A second calculation, analytic, non-perturbative and gauge

  11. Analytical approach to an integrate-and-fire model with spike-triggered adaptation

    NASA Astrophysics Data System (ADS)

    Schwalger, Tilo; Lindner, Benjamin

    2015-12-01

The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with an adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
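The class of model analyzed above can be illustrated with a minimal Euler-Maruyama simulation of a leaky integrate-and-fire neuron with a spike-triggered adaptation current under weak noise. All parameter values below are invented for illustration and are not taken from the paper.

```python
# Adaptive LIF sketch: dv/dt = mu - v - a + noise, da/dt = -a/tau_a;
# when v crosses threshold 1, a spike is recorded, v resets to 0, and the
# adaptation variable a jumps by a fixed increment. Parameters are made up.

import random

def simulate_adaptive_lif(T=200.0, dt=0.01, mu=1.5, noise=0.05,
                          tau_a=5.0, jump=0.3, seed=1):
    rng = random.Random(seed)
    v, a = 0.0, 0.0
    spikes = []
    t = 0.0
    while t < T:
        v += dt * (mu - v - a) + noise * (dt ** 0.5) * rng.gauss(0, 1)
        a += dt * (-a / tau_a)
        if v >= 1.0:                # threshold crossing: spike and reset
            spikes.append(t)
            v = 0.0
            a += jump               # spike-triggered adaptation increment
        t += dt
    return spikes

spikes = simulate_adaptive_lif()
print(len(spikes))
```

Because each spike increments the adaptation current, the interspike intervals lengthen until the mean adaptation level balances the drive, producing the tonic firing regime whose stationary densities the paper derives analytically.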

  12. Combining CBT and Behavior-Analytic Approaches to Target Severe Emotion Dysregulation in Verbal Youth with ASD and ID

    ERIC Educational Resources Information Center

    Parent, Veronique; Birtwell, Kirstin B.; Lambright, Nathan; DuBard, Melanie

    2016-01-01

    This article presents an individual intervention combining cognitive-behavioral and behavior-analytic approaches to target severe emotion dysregulation in verbal youth with autism spectrum disorder (ASD) concurrent with intellectual disability (ID). The article focuses on two specific individuals who received the treatment within a therapeutic…

  13. A novel approach to piecewise analytic agricultural machinery path reconstruction

    NASA Astrophysics Data System (ADS)

    Wörz, Sascha; Mederle, Michael; Heizinger, Valentin; Bernhardt, Heinz

    2017-12-01

Before analysing machinery operation in fields, one must cope with the problem that the GPS signals of receivers mounted on the machines contain measurement noise and are time-discrete, and that the underlying physical system describing the positions, axial and absolute velocities, angular rates and angular orientation of the operating machines over the whole working time is unknown. This research work presents a new three-dimensional mathematical approach using kinematic relations based on control variables, namely Euler angular velocities and angles, and a discrete target control problem in which the state control function is given by the sum of squared residuals involving the state and control variables. The resulting physical system yields a noise-free and piecewise analytic representation of the positions, velocities, angular rates and angular orientation. It can be used for further detailed study and analysis of the problem of why agricultural vehicles operate in practice as they do.

  14. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

In the field of sensors that target the detection of airborne analytes, Corona/lens-based collection provides a new path to achieving high sensitivity. An active-matrix-based analyte collection approach referred to as an "airborne analyte memory chip/recorder" is demonstrated, which takes and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  16. Parametrizations of three-body hadronic B - and D -decay amplitudes in terms of analytic and unitary meson-meson form factors

    NASA Astrophysics Data System (ADS)

    Boito, D.; Dedonder, J.-P.; El-Bennich, B.; Escribano, R.; Kamiński, R.; Leśniak, L.; Loiseau, B.

    2017-12-01

We introduce parametrizations of hadronic three-body B and D weak decay amplitudes that can be readily implemented in experimental analyses and are a sound alternative to the simplistic and widely used sum of Breit-Wigner-type amplitudes, also known as the isobar model. These parametrizations can be particularly useful in the interpretation of CP asymmetries in the Dalitz plots. They are derived from previous calculations based on a quasi-two-body factorization approach in which two-body hadronic final-state interactions are fully taken into account in terms of unitary S- and P-wave ππ, πK, and KK̄ form factors. These form factors can be determined rigorously, fulfilling fundamental properties of quantum field-theory amplitudes such as analyticity and unitarity, and are in agreement with the low-energy behavior predicted by effective theories of QCD. They are derived from sets of coupled-channel equations using T-matrix elements constrained by experimental meson-meson phase shifts and inelasticities, chiral symmetry, and asymptotic QCD. We provide explicit amplitude expressions for the decays B± → π+π−π±, B → Kπ+π−, B± → K+K−K±, D+ → π−π+π+, D+ → K−π+π+, and D0 → KS0π+π−, for which we have shown in previous studies that this approach is phenomenologically successful; in addition, we provide expressions for the D0 → KS0K+K− decay. Other three-body hadronic channels can be parametrized likewise.

  17. Analytic thinking promotes religious disbelief.

    PubMed

    Gervais, Will M; Norenzayan, Ara

    2012-04-27

    Scientific interest in the cognitive underpinnings of religious belief has grown in recent years. However, to date, little experimental research has focused on the cognitive processes that may promote religious disbelief. The present studies apply a dual-process model of cognitive processing to this problem, testing the hypothesis that analytic processing promotes religious disbelief. Individual differences in the tendency to analytically override initially flawed intuitions in reasoning were associated with increased religious disbelief. Four additional experiments provided evidence of causation, as subtle manipulations known to trigger analytic processing also encouraged religious disbelief. Combined, these studies indicate that analytic processing is one factor (presumably among several) that promotes religious disbelief. Although these findings do not speak directly to conversations about the inherent rationality, value, or truth of religious beliefs, they illuminate one cognitive factor that may influence such discussions.

  18. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.

  19. Enhancement in the sensitivity of microfluidic enzyme-linked immunosorbent assays through analyte preconcentration.

    PubMed

    Yanagisawa, Naoki; Dutta, Debashis

    2012-08-21

    In this Article, we describe a microfluidic enzyme-linked immunosorbent assay (ELISA) method whose sensitivity can be substantially enhanced through preconcentration of the target analyte around a semipermeable membrane. The reported preconcentration has been accomplished in our current work via electrokinetic means allowing a significant increase in the amount of captured analyte relative to nonspecific binding in the trapping/detection zone. Upon introduction of an enzyme substrate into this region, the rate of generation of the ELISA reaction product (resorufin) was observed to increase by over a factor of 200 for the sample and 2 for the corresponding blank compared to similar assays without analyte trapping. Interestingly, in spite of nonuniformities in the amount of captured analyte along the surface of our analysis channel, the measured fluorescence signal in the preconcentration zone increased linearly with time over an enzyme reaction period of 30 min and at a rate that was proportional to the analyte concentration in the bulk sample. In our current study, the reported technique has been shown to reduce the smallest detectable concentration of the tumor marker CA 19-9 and Blue Tongue Viral antibody by over 2 orders of magnitude compared to immunoassays without analyte preconcentration. When compared to microwell based ELISAs, the reported microfluidic approach not only yielded a similar improvement in the smallest detectable analyte concentration but also reduced the sample consumption in the assay by a factor of 20 (5 μL versus 100 μL).
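The figures reported above imply a simple back-of-envelope gain: preconcentration raises the sample's product-generation rate by roughly 200x but the blank's by only about 2x, so the net signal-to-blank ratio improves roughly 100-fold, consistent with the reported two-order-of-magnitude drop in the smallest detectable concentration.

```python
# Arithmetic taken directly from the abstract's reported enhancement factors.
sample_gain = 200.0   # rate increase for the sample with analyte trapping
blank_gain = 2.0      # rate increase for the corresponding blank
net_improvement = sample_gain / blank_gain
print(net_improvement)   # → 100.0
```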

  20. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  1. Numerical and analytical approaches to an advection-diffusion problem at small Reynolds number and large Péclet number

    NASA Astrophysics Data System (ADS)

    Fuller, Nathaniel J.; Licata, Nicholas A.

    2018-05-01

    Obtaining a detailed understanding of the physical interactions between a cell and its environment often requires information about the flow of fluid surrounding the cell. Cells must be able to effectively absorb and discard material in order to survive. Strategies for nutrient acquisition and toxin disposal, which have been evolutionarily selected for their efficacy, should reflect knowledge of the physics underlying this mass transport problem. Motivated by these considerations, in this paper we discuss the results from an undergraduate research project on the advection-diffusion equation at small Reynolds number and large Péclet number. In particular, we consider the problem of mass transport for a Stokesian spherical swimmer. We approach the problem numerically and analytically through a rescaling of the concentration boundary layer. A biophysically motivated first-passage problem for the absorption of material by the swimming cell demonstrates quantitative agreement between the numerical and analytical approaches. We conclude by discussing the connections between our results and the design of smart toxin disposal systems.

  2. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    PubMed

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  3. Evaluation of the risk factors associated with rectal neuroendocrine tumors: a big data analytic study from a health screening center.

    PubMed

    Pyo, Jeung Hui; Hong, Sung Noh; Min, Byung-Hoon; Lee, Jun Haeng; Chang, Dong Kyung; Rhee, Poong-Lyul; Kim, Jae Jun; Choi, Sun Kyu; Jung, Sin-Ho; Son, Hee Jung; Kim, Young-Ho

    2016-12-01

    Rectal neuroendocrine tumor (NET) is the most common NET in Asia. The risk factors associated with rectal NETs are unclear because of the overall low incidence rate of these tumors and the associated difficulty in conducting large epidemiological studies on rare cases. The aim of this study was to exploit the benefits of big data analytics to assess the risk factors associated with rectal NET. A retrospective case-control study was conducted, including 102 patients with histologically confirmed rectal NETs and 52,583 healthy controls who underwent screening colonoscopy at the Center for Health Promotion of the Samsung Medical Center in Korea between January 2002 and December 2012. Information on different risk factors was collected and logistic regression analysis applied to identify predictive factors. Four factors were significantly associated with rectal NET: higher levels of cholesterol [odds ratio (OR) = 1.007, 95 % confidence interval (CI), 1.001-1.013, p = 0.016] and ferritin (OR = 1.502, 95 % CI, 1.167-1.935, p = 0.002), presence of metabolic syndrome (OR = 1.768, 95 % CI, 1.071-2.918, p = 0.026), and family history of cancer among first-degree relatives (OR = 1.664, 95 % CI, 1.019-2.718, p = 0.042). The findings of our study demonstrate the benefits of using big data analytics for research and clinical risk factor studies. Specifically, in this study, this analytical method was applied to identify higher levels of serum cholesterol and ferritin, metabolic syndrome, and family history of cancer as factors that may explain the increasing incidence and prevalence of rectal NET.
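
    The odds ratios reported above come from a standard logistic regression. As a minimal sketch of how such odds ratios are obtained (using synthetic data and illustrative predictor names, not the study's records), the coefficients of a fitted model are exponentiated to give the per-unit odds ratio:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
# Hypothetical predictors standing in for the study's risk factors:
cholesterol = rng.normal(200, 30, n)        # mg/dL
metabolic_syndrome = rng.integers(0, 2, n)  # 0/1 indicator
# Simulate case status with known positive effects of both predictors.
logit = -3 + 0.007 * cholesterol + 0.57 * metabolic_syndrome
p = 1 / (1 + np.exp(-logit))
case = rng.random(n) < p

X = np.column_stack([cholesterol, metabolic_syndrome])
model = LogisticRegression(C=1e6, max_iter=2000).fit(X, case)  # near-unpenalized fit
odds_ratios = np.exp(model.coef_[0])  # OR per unit increase in each predictor
print(dict(zip(["cholesterol", "metabolic_syndrome"], odds_ratios)))
```

    An odds ratio slightly above 1 per mg/dL of cholesterol, as in the abstract's OR = 1.007, corresponds to a small per-unit effect that compounds over a clinically meaningful range.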

  4. Modern Analytical Chemistry in the Contemporary World

    ERIC Educational Resources Information Center

    Šíma, Jan

    2016-01-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of the sorcery where analytical chemists working as modern wizards handle magical black boxes able to provide fascinating results. However, this approach is evidently improper and misleading. Therefore, the position of modern analytical chemistry among…

  5. A Decision Analytic Approach to Exposure-Based Chemical Prioritization

    PubMed Central

    Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.

    2013-01-01

    The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
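
    One common decision-analytic aggregation for this kind of prioritization is a weighted sum of criterion scores; the sketch below uses invented chemicals, criteria, and weights purely for illustration, and the ExpoCast framework itself may integrate criteria differently:

```python
import numpy as np

# Hypothetical screening criteria for four chemicals (rows), scored 0-1:
# columns = production volume, persistence, use-pattern proximity to consumers
scores = np.array([
    [0.9, 0.2, 0.7],   # chemical A
    [0.3, 0.8, 0.4],   # chemical B
    [0.6, 0.6, 0.9],   # chemical C
    [0.1, 0.1, 0.2],   # chemical D
])
weights = np.array([0.5, 0.2, 0.3])    # expert-judgment weights, summing to 1

priority = scores @ weights            # weighted-sum exposure index
ranking = np.argsort(priority)[::-1]   # highest exposure potential first
for i in ranking:
    print("ABCD"[i], round(priority[i], 2))
```

    The ranking, not the absolute index, is what feeds targeted testing; the weights are where expert judgment enters the analysis.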

  6. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support unstructured or semi-structured data, as opposed to traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. The aim of this talk is to provide the information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and frame of reference for a systematic approach. The talk will provide insights into big data analytics methods in the context of science within various communities, and will offer different views of how correlation-based and causality-based approaches provide complementary methods.

  7. Examining the Relations between Executive Function, Math, and Literacy during the Transition to Kindergarten: A Multi-Analytic Approach

    ERIC Educational Resources Information Center

    Schmitt, Sara A.; Geldhof, G. John; Purpura, David J.; Duncan, Robert; McClelland, Megan M.

    2017-01-01

    The present study explored the bidirectional and longitudinal associations between executive function (EF) and early academic skills (math and literacy) across 4 waves of measurement during the transition from preschool to kindergarten using 2 complementary analytical approaches: cross-lagged panel modeling and latent growth curve modeling (LCGM).…

  8. Assessing vocational outcome expectancy in individuals with serious mental illness: a factor-analytic approach.

    PubMed

    Iwanaga, Kanako; Umucu, Emre; Wu, Jia-Rung; Yaghmaian, Rana; Lee, Hui-Ling; Fitzgerald, Sandra; Chan, Fong

    2017-07-04

    Self-determination theory (SDT) and self-efficacy theory (SET) can be used to conceptualize self-determined motivation to engage in mental health and vocational rehabilitation (VR) services and to predict recovery. To incorporate SDT and SET as a framework for vocational recovery, developing and validating SDT/SET measures in vocational rehabilitation is warranted. Outcome expectancy is an important SDT/SET variable affecting rehabilitation engagement and recovery. The purpose of this study was to validate the Vocational Outcome Expectancy Scale (VOES) for use within the SDT/SET vocational recovery framework. One hundred and twenty-four individuals with serious mental illness (SMI) participated in this study. The measurement structure of the VOES was evaluated using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Both EFA and CFA results supported a two-factor structure: (a) positive outcome expectancy, and (b) negative outcome expectancy. The internal consistency reliability coefficients for both factors were acceptable. In addition, positive outcome expectancy correlated more strongly than negative outcome expectancy with other SDT/SET constructs, in the expected directions. The VOES is a brief, reliable and valid instrument for assessing vocational outcome expectancy in individuals with SMI that can be integrated into SDT/SET as a vocational rehabilitation engagement and recovery model in psychiatric rehabilitation.
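
    The factor-analytic workflow described above can be illustrated with a small simulation. The sketch below uses synthetic item responses, illustrative factor names, and scikit-learn's exploratory FactorAnalysis (not the authors' CFA software); it checks that a two-factor model fits clearly better than a one-factor model when the data really have two latent dimensions:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n = 600
# Simulate item responses driven by two latent factors, e.g. positive
# and negative outcome expectancy (names are illustrative only).
pos, neg = rng.normal(size=(2, n))
loadings = np.array([
    [0.8, 0.0], [0.7, 0.1], [0.9, 0.0],   # items loading on factor 1
    [0.0, 0.8], [0.1, 0.7], [0.0, 0.9],   # items loading on factor 2
])
X = np.column_stack([pos, neg]) @ loadings.T + 0.4 * rng.normal(size=(n, 6))

fa1 = FactorAnalysis(n_components=1).fit(X)
fa2 = FactorAnalysis(n_components=2).fit(X)
# With a genuine two-factor structure, the two-factor model should score
# a higher average log-likelihood than the one-factor model.
print(fa1.score(X), fa2.score(X))
```

    In a full EFA/CFA design, the EFA sample suggests the structure and a separate CFA sample tests it; the model-comparison logic shown here is the same.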

  9. How Does Social Inequality Continue to Influence Young People's Trajectories through the Apprenticeship Pathway System in South Africa? An Analytical Approach

    ERIC Educational Resources Information Center

    Kruss, Glenda; Wildschut, Angelique

    2016-01-01

    The paper contributes by proposing an analytical approach that allows for the identification of patterns of participation in education and training and the labour market, through empirical measurement of the number of transitions and distinct trajectories traversed by groups of individuals. To illustrate the value of the approach, we focus on an…

  10. Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.

    2018-04-01

    A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.

  11. Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains

    DOE PAGES

    Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John

    2015-08-18

    This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.

  12. An analytical approach to thermal modeling of Bridgman type crystal growth: One dimensional analysis. Computer program users manual

    NASA Technical Reports Server (NTRS)

    Cothran, E. K.

    1982-01-01

    The computer program written in support of one dimensional analytical approach to thermal modeling of Bridgman type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first time user.

  13. Cognitive-analytical therapy for a patient with functional neurological symptom disorder-conversion disorder (psychogenic myopia): A case study.

    PubMed

    Nasiri, Hamid; Ebrahimi, Amrollah; Zahed, Arash; Arab, Mostafa; Samouei, Rahele

    2015-05-01

    Functional neurological symptom disorder commonly presents with symptoms and defects of sensory and motor functions. Therefore, it is often mistaken for a medical condition. It is well known that functional neurological symptom disorder is more often caused by psychological factors. There are three main approaches, namely analytical, cognitive and biological, to managing conversion disorder. Any of these approaches can be applied through short-term treatment programs. In this case study, a 12-year-old boy diagnosed with functional neurological symptom disorder (psychogenic myopia) was put under cognitive-analytical treatment. The outcome of this treatment modality proved successful.

  14. Bioinformatics approaches to predict target genes from transcription factor binding data.

    PubMed

    Essebier, Alexandra; Lamprecht, Marnie; Piper, Michael; Bodén, Mikael

    2017-12-01

    Transcription factors regulate gene expression and play an essential role in development by maintaining proliferative states, driving cellular differentiation and determining cell fate. Transcription factors are capable of regulating multiple genes over potentially long distances making target gene identification challenging. Currently available experimental approaches to detect distal interactions have multiple weaknesses that have motivated the development of computational approaches. Although an improvement over experimental approaches, existing computational approaches are still limited in their application, with different weaknesses depending on the approach. Here, we review computational approaches with a focus on data dependency, cell type specificity and usability. With the aim of identifying transcription factor target genes, we apply available approaches to typical transcription factor experimental datasets. We show that approaches are not always capable of annotating all transcription factor binding sites; binding sites should be treated disparately; and a combination of approaches can increase the biological relevance of the set of genes identified as targets. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Quasi-Steady Evolution of Hillslopes in Layered Landscapes: An Analytic Approach

    NASA Astrophysics Data System (ADS)

    Glade, R. C.; Anderson, R. S.

    2018-01-01

    Landscapes developed in layered sedimentary or igneous rocks are common on Earth, as well as on other planets. Features such as hogbacks, exposed dikes, escarpments, and mesas exhibit resistant rock layers adjoining more erodible rock in tilted, vertical, or horizontal orientations. Hillslopes developed in the erodible rock are typically characterized by steep, linear-to-concave slopes or "ramps" mantled with material derived from the resistant layers, often in the form of large blocks. Previous work on hogbacks has shown that feedbacks between weathering and transport of the blocks and underlying soft rock can create relief over time and lead to the development of concave-up slope profiles in the absence of rilling processes. Here we employ an analytic approach, informed by numerical modeling and field data, to describe the quasi-steady state behavior of such rocky hillslopes for the full spectrum of resistant layer dip angles. We begin with a simple geometric analysis that relates structural dip to erosion rates. We then explore the mechanisms by which our numerical model of hogback evolution self-organizes to meet these geometric expectations, including adjustment of soil depth, erosion rates, and block velocities along the ramp. Analytical solutions relate easily measurable field quantities such as ramp length, slope, block size, and resistant layer dip angle to local incision rate, block velocity, and block weathering rate. These equations provide a framework for exploring the evolution of layered landscapes and pinpoint the processes for which we require a more thorough understanding to predict their evolution over time.

  16. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  17. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.

  18. Analytical Assessment of Simultaneous Parallel Approach Feasibility from Total System Error

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2014-01-01

    In a simultaneous paired approach to closely-spaced parallel runways, a pair of aircraft flies in close proximity on parallel approach paths. The aircraft pair must maintain a longitudinal separation within a range that avoids wake encounters and, if one of the aircraft blunders, avoids collision. Wake avoidance defines the rear gate of the longitudinal separation. The lead aircraft generates a wake vortex that, with the aid of crosswinds, can travel laterally onto the path of the trail aircraft. As runway separation decreases, the wake has less distance to traverse to reach the path of the trail aircraft. The total system error of each aircraft further reduces this distance. The total system error is often modeled as a probability distribution function. Therefore, Monte-Carlo simulations are a favored tool for assessing a "safe" rear gate. However, safety for paired approaches typically requires that a catastrophic wake encounter be a rare one-in-a-billion event during normal operation. Using a Monte-Carlo simulation to assert this event rarity with confidence requires a massive number of runs. Such large runs do not lend themselves to rapid turn-around during the early stages of investigation, when the goal is to eliminate the infeasible regions of the solution space and to perform trades among the independent variables in the operational concept. One can employ statistical analysis using simplified models more efficiently to narrow the solution space and identify promising trades for more in-depth investigation using Monte-Carlo simulations. These simple, analytical models not only have to address the uncertainty of the total system error but also the uncertainty in navigation sources used to alert an abort of the procedure. This paper presents a method for integrating total system error, procedure abort rates, avionics failures, and surveillance errors into a statistical analysis that identifies the likely feasible runway separations for simultaneous paired approaches.
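
    The cost argument against brute-force Monte Carlo can be made concrete with a toy Gaussian tail calculation; the numbers below (error standard deviation, lateral clearance) are hypothetical and not values from the paper:

```python
import math

def tail_prob(threshold_m, sigma_m):
    """Analytic upper-tail probability of a zero-mean Gaussian deviation."""
    return 0.5 * math.erfc(threshold_m / (sigma_m * math.sqrt(2)))

sigma = 30.0        # hypothetical lateral total-system-error std dev (m)
threshold = 180.0   # hypothetical lateral clearance to the wake region (m)

p = tail_prob(threshold, sigma)   # a six-sigma event, on the order of 1e-9
# Rough Monte-Carlo runs needed to observe ~100 such events:
runs_needed = 100 / p
print(p, runs_needed)
```

    Resolving a probability near 1e-9 with modest relative error requires on the order of 1e11 simulation runs, whereas the closed-form tail evaluates instantly, which is exactly why analytic screening precedes Monte-Carlo confirmation.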

  19. Analytical Sociology: A Bungean Appreciation

    NASA Astrophysics Data System (ADS)

    Wan, Poe Yu-ze

    2012-10-01

    Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve this goal, analytical sociologists demonstrate an unequivocal focus on the mechanism-based explanation grounded in action theory. In this article I attempt a critical appreciation of analytical sociology from the perspective of Mario Bunge's philosophical system, which I characterize as emergentist systemism. I submit that while the principles of analytical sociology and those of Bunge's approach share a lot in common, the latter brings to the fore the ontological status and explanatory importance of supra-individual actors (as concrete systems endowed with emergent causal powers) and macro-social mechanisms (as processes unfolding in and among social systems), and therefore it does not stipulate that every causal explanation of social facts has to include explicit references to individual-level actors and mechanisms. In this sense, Bunge's approach provides a reasonable middle course between the Scylla of sociological reification and the Charybdis of ontological individualism, and thus serves as an antidote to the untenable "strong program of microfoundations" to which some analytical sociologists are committed.

  20. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Effects of Vibrations on Metal Forming Process: Analytical Approach and Finite Element Simulations

    NASA Astrophysics Data System (ADS)

    Armaghan, Khan; Christophe, Giraud-Audine; Gabriel, Abba; Régis, Bigot

    2011-01-01

    Vibration assisted forming is one of the most recent and beneficial techniques used to improve forming processes. Effects of vibration on metal forming processes can be attributed to two causes. First, the volume effect links the lowering of yield stress to the influence of vibration on dislocation movement. Second, the surface effect explains the lowering of the effective coefficient of friction by periodic reduction of the contact area. This work is related to vibration assisted forming in the viscoplastic domain. The impact of a change in vibration waveform has been analyzed. For this purpose, two analytical models have been developed for two different vibration waveforms (sinusoidal and triangular). These models were developed on the basis of the slice method, which is used to find the required forming force for the process. The final relationships show that application of a triangular waveform in the forming process is more beneficial than sinusoidal vibrations in terms of reduced forming force. Finite Element Method (FEM) based simulations were performed using Forge2008® and confirmed the results of the analytical models. The ratio of vibration speed to upper die speed is a critical factor in the reduction of the forming force.

  2. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  3. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  4. A factor analytic investigation of the Mercy Evaluation of Multiple Sclerosis.

    PubMed

    Merz, Zachary C; Wright, John D; Vander Wal, Jillon S; Gfeller, Jeffrey D

    2018-01-23

    Neurocognitive deficits commonly are an accompanying feature of Multiple Sclerosis (MS). A brief, yet comprehensive neuropsychological battery is desirable for assessing the extent of these deficits. Therefore, the present study examined the validity of the Mercy Evaluation of Multiple Sclerosis (MEMS) for use with the MS population. Archival data from individuals diagnosed with MS (N = 378) by independent neurologists was examined. Cognitive domains assessed included processing speed and attention, learning, and memory, visuospatial, language, and executive functioning. A mean battery index was calculated to provide a general indicator of cognitive impairment within the current sample. Overall performance across participants was found to be in the lower limits of the average range. Results of factor analytic statistical procedures yielded a four-factor solution, accounting for 67% of total variance within the MEMS. Four neurocognitive measures exhibited the highest sensitivity in detecting cognitive impairment, constituting a psychometrically established brief cognitive screening battery, which accounted for 83% of total variance within the mean battery index score. Overall, the results of the current study suggest appropriate construct validity of the MEMS for use with individuals with MS, as well as provide support for previously established cognitive batteries.
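
    The idea of a factor solution "accounting for 67% of total variance" can be sketched with a PCA-style proportion-of-variance calculation on a correlation matrix, a common approximation to the factor-analytic figure. The simulation below uses invented data with four latent abilities, not the MEMS sample:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulate scores on 8 hypothetical neurocognitive measures driven by
# 4 latent abilities, two measures per ability (illustrative only).
latent = rng.normal(size=(500, 4))
loadings = np.kron(np.eye(4), np.ones((2, 1))) * 0.85   # 8 x 4 loading matrix
scores = latent @ loadings.T + 0.5 * rng.normal(size=(500, 8))

corr = np.corrcoef(scores, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
var_explained = eigvals[:4].sum() / eigvals.sum()   # first four components
print(round(var_explained, 2))
```

    With four genuine latent dimensions, the first four eigenvalues capture most of the shared variance, which is the pattern a four-factor solution summarizes.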

  5. Consistent analytic approach to the efficiency of collisional Penrose process

    NASA Astrophysics Data System (ADS)

    Harada, Tomohiro; Ogasawara, Kota; Miyamoto, Umpei

    2016-07-01

    We propose a consistent analytic approach to the efficiency of the collisional Penrose process in the vicinity of a maximally rotating Kerr black hole. We focus on a collision with arbitrarily high center-of-mass energy, which occurs if either of the colliding particles has its angular momentum fine-tuned to the critical value to enter the horizon. We show that if the fine-tuned particle is ingoing on the collision, the upper limit of the efficiency is (2 + √3)(2 − √2) ≈ 2.186, while if the fine-tuned particle is bounced back before the collision, the upper limit is (2 + √3)² ≈ 13.93. Despite earlier claims, the former can be attained for inverse Compton scattering if the fine-tuned particle is massive and starts at rest at infinity, while the latter can be attained for various particle reactions, such as inverse Compton scattering and pair annihilation, if the fine-tuned particle is either massless or highly relativistic at infinity. We discuss the difference between the present and earlier analyses.

  6. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    PubMed

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. The analyses produced two measures of attribution of intent (hostile and neutral), along with three emotion measures focused on negative emotional states, eight response evaluation measures, and four response decision measures covering prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys favored and chose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, assertiveness, and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and appears to be a valuable alternative for evaluating social information processing, even if it is essential to continue investigating its internal and external validity. (c) 2016 APA, all rights reserved.

  7. Analytical Glycobiology at High Sensitivity: Current Approaches and Directions

    PubMed Central

    Novotny, Milos V.; Alley, William R.; Mann, Benjamin F.

    2013-01-01

    This review summarizes the analytical advances made during the last several years in the structural and quantitative determinations of glycoproteins in complex biological mixtures. The main analytical techniques used in the fields of glycomics and glycoproteomics involve different modes of mass spectrometry and their combinations with capillary separation methods such as microcolumn liquid chromatography and capillary electrophoresis. The needs for high-sensitivity measurements have been emphasized in the oligosaccharide profiling used in the field of biomarker discovery through MALDI mass spectrometry. High-sensitivity profiling of both glycans and glycopeptides from biological fluids and tissue extracts has been aided significantly through lectin preconcentration and the uses of affinity chromatography. PMID:22945852

  8. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or to observe generalized document and term relationships in ranked or visual results. We propose a new approach which allows the user to control, or steer, the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  9. Analytical challenges in sports drug testing.

    PubMed

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  10. Topiramate: A Review of Analytical Approaches for the Drug Substance, Its Impurities and Pharmaceutical Formulations.

    PubMed

    Pinto, Eduardo Costa; Dolzan, Maressa Danielli; Cabral, Lucio Mendes; Armstrong, Daniel W; de Sousa, Valéria Pereira

    2016-02-01

    An important step during the development of high-performance liquid chromatography (HPLC) methods for quantitative analysis of drugs is choosing the appropriate detector. High sensitivity, reproducibility, stability, a wide linear range, compatibility with gradient elution, non-destructive detection of the analyte, and a response unaffected by changes in temperature or flow are some of the ideal characteristics of a universal HPLC detector. Topiramate is an anticonvulsant drug mainly used for the treatment of different types of seizures and for the prophylactic treatment of migraine. Because topiramate lacks chromophoric moieties in its structure, different analytical approaches to quantify it by HPLC have been described, such as derivatization with fluorescent or UV-absorbing moieties, conductivity detection, evaporative light scattering detection, refractive index detection, chemiluminescent nitrogen detection and MS detection. Some methods for the determination of topiramate by capillary electrophoresis and gas chromatography have also been published. This systematic review provides a description of the main analytical methods presented in the literature to analyze topiramate in the drug substance and in pharmaceutical formulations. Each of these methods is briefly discussed, with particular attention to the detector used with HPLC. In addition, this article presents a review of the data available regarding topiramate stability, degradation products and impurities. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    NASA Astrophysics Data System (ADS)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach iterates between the two models until the difference between subsequent solutions satisfies a pre-determined termination criterion. Its effectiveness is illustrated by an example, which yields near-optimal results with much shorter solving times than the conventional simulation-based optimization model. The efficacy of the proposed hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also makes it possible to model and solve more realistic problems that incorporate dynamism and uncertainty.
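    The iterate-until-convergence idea can be sketched in a few lines. Everything here is a stand-in: `analytic_design` plays the role of the analytical (e.g. MILP) stage, `simulate` plays the role of the discrete-event simulation, and the demand figures, margin, and tolerance are invented for illustration.

    ```python
    import random

    random.seed(42)

    def analytic_design(demand_estimate):
        """Stand-in for the analytical stage: size the network for a
        deterministic demand estimate with a 20% capacity margin."""
        return round(demand_estimate * 1.2)

    def simulate(capacity):
        """Stand-in for the discrete-event simulation stage: return the
        mean demand observed under uncertainty (in this toy, demand is
        independent of the chosen capacity)."""
        return sum(random.gauss(100, 10) for _ in range(200)) / 200

    demand_estimate = 80.0                          # initial rough guess
    for iteration in range(20):
        capacity = analytic_design(demand_estimate)
        observed = simulate(capacity)
        if abs(observed - demand_estimate) < 1.0:   # termination criterion
            break
        demand_estimate = observed                  # feed simulation back in

    print(iteration, capacity)
    ```

    The loop stops once the analytical model's input assumption and the simulated outcome agree to within the tolerance, mirroring the paper's "difference between subsequent solutions" criterion.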

  12. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, namely the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES runs. Although the model parameters are, in general, functions of h/Lc, the model was found to be capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow over the full canopy domain. Finally, the model is tested against an analytical model based on a linearization approach.

  13. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE PAGES

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik; ...

    2017-10-06

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, namely the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES runs. Although the model parameters are, in general, functions of h/Lc, the model was found to be capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow over the full canopy domain. Finally, the model is tested against an analytical model based on a linearization approach.

  14. An innovative approach to the development of a portable unit for analytical flame characterization in a microgravity environment

    NASA Technical Reports Server (NTRS)

    Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker

    1995-01-01

    The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry suitable for monitoring not only specific species and their kinetics, but it is also important for imaging of flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects like pumping phenomenon that can be modeled via ground level experiments. A primary goal of our work consisted in working out an innovative approach to devising an LIF-based analytical unit suitable for in-space flame characterization. It was decided to follow two approaches in tandem: (1) use the existing laboratory (non-portable) equipment and determine the optimal set of parameters for flames that can be used as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology and concentrate some effort in devising a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at the creation of the portable device for combustion studies in a microgravity environment, which is based on a portable UV tunable solid-state laser for excitation of free radicals normally present in flames in detectable amounts. A systematic approach has allowed us to make a convenient choice of species under investigation, as well as the proper tunable laser system, and also enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.

  15. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  16. Optimal design of piezoelectric transformers: a rational approach based on an analytical model and a deterministic global optimization.

    PubMed

    Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand

    2007-07-01

    This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of analytical models describing the corresponding optimization problem with an exact global optimization software package, IBBA, developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
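    Deterministic global optimization of the kind IBBA performs can be illustrated with a toy interval-style branch-and-bound. The objective, the crude bounding rule, and the tolerance below are invented for illustration and are far simpler than IBBA's interval arithmetic, but the prune/bisect logic is the same.

    ```python
    import heapq
    import math

    def f(x):
        # Toy multimodal objective on [0, 3].
        return (x - 1.3) ** 2 + math.sin(5 * x)

    def lower_bound(a, b):
        """A valid lower bound of f on [a, b] (a crude interval extension)."""
        sq = 0.0 if a <= 1.3 <= b else min((a - 1.3) ** 2, (b - 1.3) ** 2)
        return sq - 1.0  # sin(5x) >= -1 everywhere

    def minimize(a, b, tol=1e-3):
        best = min(f(a), f(b), f((a + b) / 2))       # incumbent upper bound
        heap = [(lower_bound(a, b), a, b)]
        while heap:
            bound, lo, hi = heapq.heappop(heap)
            if bound > best - tol:                   # box cannot beat incumbent
                continue
            mid = (lo + hi) / 2
            best = min(best, f(mid))                 # tighten the incumbent
            if hi - lo > tol:                        # bisect and keep searching
                heapq.heappush(heap, (lower_bound(lo, mid), lo, mid))
                heapq.heappush(heap, (lower_bound(mid, hi), mid, hi))
        return best

    best_val = minimize(0.0, 3.0)
    print(best_val)
    ```

    Because the bound is rigorous, boxes are discarded only when they provably cannot contain a better minimum, which is what makes this family of methods exact rather than heuristic.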

  17. Analytical approaches to determination of carnitine in biological materials, foods and dietary supplements.

    PubMed

    Dąbrowska, Monika; Starek, Małgorzata

    2014-01-01

    l-Carnitine is a vitamin-like amino acid derivative that is an essential factor in fatty acid metabolism as an acyltransferase cofactor and in energy production processes, such as the interconversion in the mechanisms regulating ketogenesis and thermogenesis; it is also used in the therapy of primary and secondary carnitine deficiency, and in other diseases. The determination of carnitine and acyl-carnitines can provide important information about inherited or acquired metabolic disorders, and can serve to monitor the biochemical effect of carnitine therapy. The endogenous carnitine pool in humans is maintained by biosynthesis and absorption of carnitine from the diet. Carnitine has one asymmetric carbon, giving two stereoisomers, d and l, but only the l form has a positive biological effect; thus chiral recognition of l-carnitine enantiomers is extremely important in the biological, chemical and pharmaceutical sciences. To gain more insight into carnitine metabolism and synthesis, a sensitive analysis for determining the concentrations of free carnitine, carnitine esters and carnitine precursors is required. Carnitine has been investigated in many biochemical, pharmacokinetic, metabolic and toxicokinetic studies, and many analytical methods have therefore been developed and published for the determination of carnitine in foods, dietary supplements, pharmaceutical formulations, biological tissues and body fluids. The analytical procedures presented in this review have been validated in terms of basic parameters (linearity, limit of detection, limit of quantitation, sensitivity, accuracy, and precision). This article discusses the impact of different analytical techniques and provides an overview of applications that address a diverse array of pharmaceutical and biological questions and samples. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Thermodynamic efficiency limits of classical and bifacial multi-junction tandem solar cells: An analytical approach

    NASA Astrophysics Data System (ADS)

    Alam, Muhammad Ashraful; Khan, M. Ryyan

    2016-10-01

    Bifacial tandem cells promise to reduce three fundamental losses (above-bandgap loss, below-bandgap loss, and the uncollected light between panels) inherent in classical single-junction photovoltaic (PV) systems. The successive filtering of light through the bandgap cascade and the requirement of current continuity make optimization of tandem cells difficult and accessible only to numerical solution through computer modeling. The challenge is even more complicated for bifacial designs. In this paper, we use an elegantly simple analytical approach to show that the essential physics of the optimization is intuitively obvious, and that deeply insightful results can be obtained with a few lines of algebra. This powerful approach reproduces, as special cases, all of the known results for conventional and bifacial tandem cells and highlights the asymptotic efficiency gain of these technologies.

  19. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on a science technology and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study reports Grade 8 students' analytical thinking and attitudes toward science during teaching and learning about soil and its pollution through a science technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. Teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitudes toward science were assessed during their learning through participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students improved their capability for analytical thinking. They demonstrated characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated them. The paper discusses the implications of these findings for science teaching and learning through STS in Thailand.

  20. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to respond to targeted therapy. Proper selection of the tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS platform often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed, paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  1. Field-driven chiral bubble dynamics analysed by a semi-analytical approach

    NASA Astrophysics Data System (ADS)

    Vandermeulen, J.; Leliaert, J.; Dupré, L.; Van Waeyenberge, B.

    2017-12-01

    Nowadays, field-driven chiral bubble dynamics in the presence of the Dzyaloshinskii-Moriya interaction are a topic of thorough investigation. In this paper, a semi-analytical approach is used to derive equations of motion that express the bubble wall (BW) velocity and the change in in-plane magnetization angle as a function of the micromagnetic parameters of the involved interactions, thereby taking into account the two-dimensional nature of the bubble wall. It is demonstrated that the equations of motion enable an accurate description of the expanding and shrinking convex bubble dynamics, and an expression for the transition field between shrinkage and expansion is derived. In addition, these equations of motion show that the BW velocity depends not only on the driving force but also on the BW curvature. The absolute BW velocity increases for both a shrinking and an expanding bubble, but for different reasons: for expanding bubbles, it is due to the increasing importance of the driving force, while for shrinking bubbles, it is due to the increasing importance of contributions related to the BW curvature. Finally, using this approach we show how the recently proposed magnetic bubblecade memory can operate in the flow regime in the presence of a tilted sinusoidal magnetic field and at greatly reduced bubble sizes compared to the original device prototype.

  2. Learning Approaches, Demographic Factors to Predict Academic Outcomes

    ERIC Educational Resources Information Center

    Nguyen, Tuan Minh

    2016-01-01

    Purpose: The purpose of this paper is to predict academic outcome in math and math-related subjects using learning approaches and demographic factors. Design/Methodology/Approach: ASSIST was used as the instrumentation to measure learning approaches. The study was conducted in the International University of Vietnam with 616 participants. An…

  3. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future.
    Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.
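    The MapReduce pattern the abstract refers to can be shown with a tiny in-memory map/shuffle/reduce that computes per-variable means. The variable names and values are invented stand-ins for MERRA reanalysis records, not real data or the MERRA/AS API.

    ```python
    from collections import defaultdict

    # Toy records standing in for reanalysis cells: (variable, value).
    records = [
        ("T2M", 287.1), ("T2M", 289.4), ("PRECTOT", 2.0e-5),
        ("T2M", 285.0), ("PRECTOT", 3.5e-5),
    ]

    # Map: emit (key, (partial_sum, count)) pairs.
    mapped = [(var, (val, 1)) for var, val in records]

    # Shuffle: group intermediate pairs by key.
    groups = defaultdict(list)
    for key, pair in mapped:
        groups[key].append(pair)

    # Reduce: combine partial sums and counts into per-variable means.
    means = {
        key: sum(s for s, _ in pairs) / sum(c for _, c in pairs)
        for key, pairs in groups.items()
    }
    print(means)
    ```

    Because the reduce step only combines associative partial sums, the same computation parallelizes across storage nodes, which is the point of the storage-based MapReduce approach described above.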

  4. Cotinine analytical workshop report: consideration of analytical methods for determining cotinine in human body fluids as a measure of passive exposure to tobacco smoke.

    PubMed Central

    Watts, R R; Langone, J J; Knight, G J; Lewtas, J

    1990-01-01

    A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812

  5. Photoprotective Strategies of Mediterranean Plants in Relation to Morphological Traits and Natural Environmental Pressure: A Meta-Analytical Approach

    PubMed Central

    Fernández-Marín, Beatriz; Hernández, Antonio; Garcia-Plazaola, Jose I.; Esteban, Raquel; Míguez, Fátima; Artetxe, Unai; Gómez-Sagasti, Maria T.

    2017-01-01

    Despite its small geographic extent, the Mediterranean Basin is characterized by exceptional plant biodiversity. Adaptive responses of this biocoenosis are delineated by an unusual temporal dissociation over the year between optimal temperature for growth and water availability. This dissociation generates the combination of two environmental stress factors: a period of summer drought, variable in length and intensity, and the occurrence of mild to cold winters. Both abiotic factors trigger (photo)oxidative stress, and plants orchestrate an arsenal of structural, physiological, biochemical, and molecular mechanisms to withstand such environmental injuries. In the last two decades an important effort has been made to characterize the adaptive morphological and ecophysiological traits behind plant survival strategies, with an eye to predicting how they will respond to future climatic changes. In the present work, we have compiled data from 89 studies following a meta-analytical approach with the aim of assessing the composition and plasticity of photosynthetic pigments and low-molecular-weight antioxidants (tocopherols, glutathione, and ascorbic acid) in wild Mediterranean plant species. The influence of internal plant and leaf factors on such composition, together with the stress responsiveness, was also analyzed. This approach enabled us to obtain data from 73 species of the Mediterranean flora, with the genus Quercus being the most frequently studied. The main highlights of the present analysis are: (i) the types of photoprotective mechanisms do not differ between Mediterranean plants and other floras, but Mediterranean plants show higher plasticity indexes; (ii) α-tocopherol among the antioxidants and the violaxanthin-cycle pigments show the highest responsiveness to environmental factors; (iii) both winter and drought stresses induce overnight retention of de-epoxidised violaxanthin-cycle pigments; (iv) this retention correlates with depressions of Fv/Fm; and (v) contrary to what

  6. Advances on a Decision Analytic Approach to Exposure-Based Chemical Prioritization.

    PubMed

    Wood, Matthew D; Plourde, Kenton; Larkin, Sabrina; Egeghy, Peter P; Williams, Antony J; Zemba, Valerie; Linkov, Igor; Vallero, Daniel A

    2018-05-11

    The volume and variety of manufactured chemicals is increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. The EPA and the recent signing of the Lautenberg Act have both signaled the need for high-throughput methods to characterize and screen chemicals based on exposure potential, such that more comprehensive toxicity research can be informed. Prior work of Mitchell et al. using multicriteria decision analysis tools to prioritize chemicals for further research is enhanced here, resulting in a high-level chemical prioritization tool for risk-based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define qualitative metrics. © 2018 Society for Risk Analysis.
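    The weighted multicriteria scoring described above, with distinct weight profiles producing slightly different rankings, can be sketched as follows. The chemicals, criteria, scores, and weight profiles are all invented for illustration and are not from the CP_CAT database or the elicitation.

    ```python
    # Hypothetical normalized criterion scores for three chemicals.
    chemicals = {
        "chem_A": {"production_volume": 0.9, "persistence": 0.4, "use_frequency": 0.7},
        "chem_B": {"production_volume": 0.3, "persistence": 0.9, "use_frequency": 0.5},
        "chem_C": {"production_volume": 0.6, "persistence": 0.6, "use_frequency": 0.9},
    }

    # Three hypothetical weight profiles, each summing to 1.
    weight_profiles = {
        "exposure_focused": {"production_volume": 0.5, "persistence": 0.1, "use_frequency": 0.4},
        "hazard_focused":   {"production_volume": 0.2, "persistence": 0.6, "use_frequency": 0.2},
        "balanced":         {"production_volume": 1/3, "persistence": 1/3, "use_frequency": 1/3},
    }

    def rank(weights):
        """Rank chemicals by weighted-sum score, highest priority first."""
        score = lambda crit: sum(weights[c] * crit[c] for c in weights)
        return sorted(chemicals, key=lambda name: score(chemicals[name]), reverse=True)

    for profile, weights in weight_profiles.items():
        print(profile, rank(weights))
    ```

    As in the study, different weight profiles reorder the list only modestly, since all profiles draw on the same underlying criterion scores.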

  7. Human factors/ergonomics implications of big data analytics: Chartered Institute of Ergonomics and Human Factors annual lecture.

    PubMed

    Drury, Colin G

    2015-01-01

    In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA), a term closely related to data mining. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, using BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.

  8. Analytical Approach to Large Deformation Problems of Frame Structures

    NASA Astrophysics Data System (ADS)

    Ohtsuki, Atsumi; Ellyin, Fernand

    In elements used as flexible linking devices and structures, the main characteristic is a fairly large deformation without exceeding the elastic limit of the material. This property is of both analytical and technological interest. Previous studies of large deformation have generally been concerned with a single member (e.g. a cantilever beam, a simply supported beam, etc.). However, there are very few large deformation studies of assembled members such as frames. This paper deals with a square frame with rigid joints, loaded diagonally in either tension or compression by a pair of opposite forces. Analytical solutions for large deformation are obtained in terms of elliptic integrals, and are compared with the experimental data. The agreement is found to be fairly close.

  9. Implementing a Contributory Scoring Approach for the "GRE"® Analytical Writing Section: A Comprehensive Empirical Investigation. Research Report. ETS RR-17-14

    ERIC Educational Resources Information Center

    Breyer, F. Jay; Rupp, André A.; Bridgeman, Brent

    2017-01-01

    In this research report, we present an empirical argument for the use of a contributory scoring approach for the 2-essay writing assessment of the analytical writing section of the "GRE"® test in which human and machine scores are combined for score creation at the task and section levels. The approach was designed to replace a currently…

  10. "Analytical" vector-functions I

    NASA Astrophysics Data System (ADS)

    Todorov, Vladimir Todorov

    2017-12-01

    In this note we try to give a new (or different) approach to the investigation of analytical vector functions. More precisely, a notion of a power x^n, n ∈ ℕ+, of a vector x ∈ ℝ^3 is introduced, which allows us to define an "analytical" function f : ℝ^3 → ℝ^3. Let furthermore f(ξ) = ∑_{n=0}^∞ a_n ξ^n be an analytical function of the real variable ξ. Here we replace the power ξ^n of the number ξ with the power of a vector x ∈ ℝ^3 to obtain a vector "power series" f(x) = ∑_{n=0}^∞ a_n x^n. We investigate some properties of the vector series as well as some applications of this idea. Note that an "analytical" vector function does not depend on any basis, which may be useful in research into some problems in physics.

  11. An analytical approach to Sr isotope ratio determination in Lambrusco wines for geographical traceability purposes.

    PubMed

    Durante, Caterina; Baschieri, Carlo; Bertacchini, Lucia; Bertelli, Davide; Cocchi, Marina; Marchetti, Andrea; Manzini, Daniela; Papotti, Giulia; Sighinolfi, Simona

    2015-04-15

    Geographical origin and authenticity of food are topics of interest for both consumers and producers. Among the different indicators used for traceability studies, the (87)Sr/(86)Sr isotopic ratio has provided excellent results. In this study, two analytical approaches for wine sample pre-treatment, microwave and low temperature mineralisation, were investigated to develop an accurate and precise analytical method for (87)Sr/(86)Sr determination. The two procedures led to comparable results (paired t-test). The precision of the analytical procedure was evaluated by using a control sample (wine sample), processed during each sample batch (calculated Relative Standard Deviation, RSD%, equal to 0.002%). Lambrusco PDO (Protected Designation of Origin) wines coming from four different vintages (2009, 2010, 2011 and 2012) were pre-treated according to the best procedure and their isotopic values were compared with isotopic data coming from (i) soils of their territory of origin and (ii) wines obtained from the same grape varieties cultivated in different districts. The obtained results showed no significant variability among the different vintages of wines, and perfect agreement between the isotopic ranges of the soils and wines was observed. Nevertheless, the investigated indicator was not powerful enough to discriminate between similar products. In this regard, it is worth noting that more soil samples, as well as wines coming from different districts, will be considered to obtain more trustworthy results. Copyright © 2014 Elsevier Ltd. All rights reserved.
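
    The batch-to-batch precision check reported above (RSD% of a control sample) can be sketched in a few lines. The isotope-ratio values below are invented for illustration; only the RSD% calculation itself is standard.

```python
import statistics

# Hypothetical repeated 87Sr/86Sr measurements of one control wine sample
# across analysis batches (values are illustrative, not from the paper).
control_ratios = [0.70891, 0.70893, 0.70890, 0.70892, 0.70891]

mean = statistics.mean(control_ratios)
sd = statistics.stdev(control_ratios)     # sample standard deviation
rsd_percent = 100 * sd / mean             # relative standard deviation, %

print(f"mean = {mean:.6f}, RSD% = {rsd_percent:.4f}")
```

    An RSD% at the thousandths-of-a-percent level, as in the paper, reflects the very tight precision required for isotope-ratio traceability work.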

  12. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  13. Analytical approaches for the detection of emerging therapeutics and non-approved drugs in human doping controls.

    PubMed

    Thevis, Mario; Schänzer, Wilhelm

    2014-12-01

    The number and diversity of potentially performance-enhancing substances is continuously growing, fueled by new pharmaceutical developments but also by the inventiveness and, at the same time, unscrupulousness of black-market (designer) drug producers and providers. In terms of sports drug testing, this situation necessitates reactive as well as proactive research and expansion of the analytical armamentarium to ensure timely, adequate, and comprehensive doping controls. This review summarizes literature published over the past 5 years on new drug entities, discontinued therapeutics, and 'tailored' compounds classified as doping agents according to the regulations of the World Anti-Doping Agency, with particular attention to analytical strategies enabling their detection in human blood or urine. Among these compounds, low- and high-molecular mass substances of peptidic (e.g. modified insulin-like growth factor-1, TB-500, hematide/peginesatide, growth hormone releasing peptides, AOD-9604, etc.) and non-peptidic (selective androgen receptor modulators, hypoxia-inducible factor stabilizers, siRNA, S-107 and ARM036/aladorian, etc.) as well as inorganic (cobalt) nature are considered and discussed in terms of specific requirements originating from physicochemical properties, concentration levels, metabolism, and their amenability for chromatographic-mass spectrometric or alternative detection methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. An Analytic Approach to Projectile Motion in a Linear Resisting Medium

    ERIC Educational Resources Information Center

    Stewart, Sean M.

    2006-01-01

    The time of flight, range and the angle which maximizes the range of a projectile in a linear resisting medium are expressed in analytic form in terms of the recently defined Lambert W function. From the closed-form solutions a number of results characteristic to the motion of the projectile in a linear resisting medium are analytically confirmed,…
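
    The closed-form time of flight via the Lambert W function can be sketched numerically. The parameter values are arbitrary, and the symbols (drag rate k, launch speed v0) are our own notation for the standard linear-drag result, with drag force −m·k·v.

```python
import numpy as np
from scipy.special import lambertw

g, k, v0 = 9.81, 0.1, 20.0      # gravity (m/s^2), drag rate (1/s), launch speed (m/s)

A = 1.0 + k * v0 / g            # dimensionless parameter
# The landing condition A*(1 - exp(-k*T)) = k*T has the nontrivial root
# k*T = A + W0(-A * exp(-A)) on the principal Lambert-W branch:
T = (A + lambertw(-A * np.exp(-A), 0).real) / k

def y(t):
    """Vertical position for linear drag, launched from the ground."""
    return (v0 + g / k) * (1 - np.exp(-k * t)) / k - g * t / k

print(f"time of flight T = {T:.4f} s, y(T) = {y(T):.2e}")
```

    The flight time is slightly shorter than the drag-free value 2·v0/g, as expected for a resisting medium.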

  15. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  16. Analytical caustic surfaces

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first- and the second-fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  17. Trace metal speciation in natural waters: Computational vs. analytical

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    1996-01-01

    Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. 
Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various
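
    The "computational" side of the comparison proposed above can be illustrated with a minimal chemical-model speciation calculation: one metal M, one ligand L, and a single 1:1 complex ML with conditional stability constant K. The constants and total concentrations below are invented for illustration.

```python
import math

# Mass balances: M_T = [M] + [ML], L_T = [L] + [ML], with [ML] = K*[M]*[L].
K = 1e6      # conditional stability constant (L/mol), illustrative
M_T = 1e-7   # total metal (mol/L)
L_T = 1e-6   # total ligand (mol/L)

# Substitution yields a quadratic in the free-metal concentration [M]:
#   K*[M]^2 + (1 + K*L_T - K*M_T)*[M] - M_T = 0
a = K
b = 1.0 + K * L_T - K * M_T
c = -M_T
M_free = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)   # positive root
ML = M_T - M_free
frac_free = M_free / M_T

print(f"free metal fraction = {frac_free:.3f}")
```

    Analytical speciation techniques would, in the proposed approach, be compared against exactly this kind of computed free-metal fraction.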

  18. Features Students Really Expect from Learning Analytics

    ERIC Educational Resources Information Center

    Schumacher, Clara; Ifenthaler, Dirk

    2016-01-01

    In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…

  19. Suggestions toward Some Discourse-Analytic Approaches to Text Difficulty: With Special Reference to "T-Unit Configuration" in the Textual Unfolding

    ERIC Educational Resources Information Center

    Lotfipour-Saedi, Kazem

    2015-01-01

    This paper represents some suggestions towards discourse-analytic approaches for ESL/EFL education, with the focus on identifying the textual forms which can contribute to the textual difficulty. Textual difficulty/comprehensibility, rather than being purely text-based or reader-dependent, is certainly a matter of interaction between text and…

  20. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    PubMed

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  1. Heterogeneity of emotional and interpersonal difficulties in alcohol-dependence: A cluster analytic approach.

    PubMed

    Maurage, Pierre; Timary, Philippe de; D'Hondt, Fabien

    2017-08-01

    Emotional and interpersonal impairments have been widely reported in alcohol-dependence, and their role in its development and maintenance is well established. However, earlier studies have focused exclusively on group comparisons between healthy controls and alcohol-dependent individuals, considering the latter a homogeneous population. The variability of socio-emotional profiles in this disorder thus remains totally unexplored. The present study used a cluster analytic approach to explore the heterogeneity of affective and social disorders in alcohol-dependent individuals. 296 recently detoxified alcohol-dependent patients were first compared with 246 matched healthy controls regarding self-reported emotional (i.e. alexithymia) and social (i.e. interpersonal problems) difficulties. Then, a cluster analysis was performed, focusing on the alcohol-dependent sample, to explore the presence of differential patterns of socio-emotional deficits and their links with demographic, psychopathological and alcohol-related variables. The group comparison between alcohol-dependent individuals and controls clearly confirmed that emotional and interpersonal difficulties constitute a key factor in alcohol-dependence. However, the cluster analysis identified five subgroups of alcohol-dependent individuals, presenting distinct combinations of alexithymia and interpersonal problems ranging from a total absence of reported impairment to generalized socio-emotional difficulties. Alcohol-dependent individuals should thus no longer be considered a unitary group regarding their affective and interpersonal difficulties, but rather a population encompassing a wide variety of socio-emotional profiles. Future experimental studies on emotional and social variables should therefore go beyond mere group comparisons to explore this heterogeneity, and prevention programs proposing an individualized evaluation and rehabilitation of these deficits should be promoted. Copyright © 2017
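
    The cluster-analytic idea can be sketched with a plain k-means (Lloyd) loop on synthetic stand-ins for the two self-report dimensions. Everything here is invented for illustration (two well-separated subgroups rather than the study's five, and k-means rather than the paper's unspecified clustering pipeline).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic z-scored scores on two illustrative dimensions
# (alexithymia, interpersonal problems): two simulated subgroups.
low = rng.normal(loc=[-1.0, -1.0], scale=0.3, size=(50, 2))   # little impairment
high = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(50, 2))    # generalized difficulties
X = np.vstack([low, high])

k = 2
# Deterministic initialization at the data's bounding-box corners.
centers = np.array([X.min(axis=0), X.max(axis=0)])
for _ in range(50):
    # Assign each individual to the nearest center.
    labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

print("cluster sizes:", np.bincount(labels, minlength=k))
```

    With real data, the number of clusters would be chosen by fit indices rather than fixed in advance, and each recovered profile would then be related to demographic and alcohol-related variables.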

  2. Analytic structure of the S-matrix for singular quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camblong, Horacio E.; Epele, Luis N.; Fanchiotti, Huner

    2015-06-15

    The analytic structure of the S-matrix of singular quantum mechanics is examined within a multichannel framework, with primary focus on its dependence on a parameter (Ω) that determines the boundary conditions. Specifically, a characterization is given in terms of salient mathematical and physical properties governing its behavior. These properties involve unitarity and associated current-conserving Wronskian relations, time-reversal invariance, and Blaschke factorization. The approach leads to an interpretation of effective nonunitary solutions in singular quantum mechanics and their determination from the unitary family.

  3. General analytical approach for sound transmission loss analysis through a thick metamaterial plate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oudich, Mourad; Zhou, Xiaoming; Badreddine Assouar, M., E-mail: Badreddine.Assouar@univ-lorraine.fr

    We report theoretically and numerically on the sound transmission loss performance through a thick plate-type acoustic metamaterial made of spring-mass resonators attached to the surface of a homogeneous elastic plate. Two general analytical approaches based on plane wave expansion were developed to calculate both the sound transmission loss through the metamaterial plate (thick and thin) and its band structure. The first one can be applied to thick plate systems to study the sound transmission for any normal or oblique incident sound pressure. The second approach gives the metamaterial dispersion behavior to describe the vibrational motions of the plate, which helps to understand the physics behind sound radiation through air by the structure. Computed results show that high sound transmission loss up to 72 dB at 2 kHz is reached with a thick metamaterial plate while only 23 dB can be obtained for a simple homogeneous plate with the same thickness. Such plate-type acoustic metamaterial can be a very effective solution for high performance sound insulation and structural vibration shielding in the very low-frequency range.
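
    As a baseline for numbers like the homogeneous-plate figure quoted above, the textbook normal-incidence mass law gives the transmission loss of a bare plate. The plate parameters below are illustrative and not taken from the paper.

```python
import math

rho0, c0 = 1.21, 343.0        # air density (kg/m^3) and sound speed (m/s)
rho_plate, h = 1190.0, 0.01   # illustrative acrylic-like plate, 10 mm thick
m_surf = rho_plate * h        # surface mass density (kg/m^2)

def mass_law_tl(f):
    """Normal-incidence mass-law transmission loss, in dB."""
    return 20 * math.log10(math.pi * f * m_surf / (rho0 * c0))

tl_2k = mass_law_tl(2000.0)
print(f"mass-law TL at 2 kHz: {tl_2k:.1f} dB")
```

    The mass law rises by about 6 dB per octave; resonant metamaterial plates beat it near the resonator tuning frequency, which is the effect the paper quantifies.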

  4. Consistent approach to describing aircraft HIRF protection

    NASA Technical Reports Server (NTRS)

    Rimbey, P. R.; Walen, D. B.

    1995-01-01

    The high intensity radiated fields (HIRF) certification process as currently implemented is comprised of an inconsistent combination of factors that tend to emphasize worst case scenarios in assessing commercial airplane certification requirements. By examining these factors which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and methods of measuring equipment susceptibilities, activities leading to an approach to appraising airplane vulnerability to HIRF are proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum. Additional tools such as statistical methods must be adopted to arrive at more realistic requirements to reflect commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.

  5. Modern analytical chemistry in the contemporary world

    NASA Astrophysics Data System (ADS)

    Šíma, Jan

    2016-12-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment are emphasized. The importance of analytical method validation in order to obtain accurate and precise results is highlighted. Invalid results are not only useless; they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed; it should be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. Indeed, the schooling of analytical chemistry should closely connect theory and practice.

  6. Correlation of the Capacity Factor in Vesicular Electrokinetic Chromatography with the Octanol:Water Partition Coefficient for Charged and Neutral Analytes

    PubMed Central

    Razak, J. L.; Cutak, B. J.; Larive, C. K.; Lunte, C. E.

    2008-01-01

    Purpose The aim of this study was to develop a method based upon electrokinetic chromatography (EKC) using oppositely charged surfactant vesicles as a buffer modifier to estimate hydrophobicity (log P) for a range of neutral and charged compounds. Methods Vesicles were formed from cetyltrimethylammonium bromide (CTAB) and sodium n-octyl sulfate (SOS). The size and polydispersity of the vesicles were characterized by electron microscopy, dynamic light scattering, and pulsed-field gradient NMR (PFG-NMR). PFG-NMR was also used to determine if ion-pairing between cationic analytes and free SOS monomer occurred. The CTAB/SOS vesicles were used as a buffer modifier in capillary electrophoresis (CE). The capacity factor (log k′) was calculated by determining the mobility of the analytes both in the presence and absence of vesicles. Log k′ was determined for 29 neutral and charged analytes. Results There was a linear relationship between the log of capacity factor (log k′) and octanol/water partition coefficient (log P) for both neutral and basic species at pH 6.0, 7.3, and 10.2. This indicated that interaction between the cation and vesicle was dominated by hydrophobic forces. At pH 4.3, the log k′ values for the least hydrophobic basic analytes were higher than expected, indicating that electrostatic attraction as well as hydrophobic forces contributed to the overall interaction between the cation and vesicle. Anionic compounds could not be evaluated using this system. Conclusion Vesicular electrokinetic chromatography (VEKC) using surfactant vesicles as buffer modifiers is a promising method for the estimation of hydrophobicity. PMID:11336344
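
    The calibration at the heart of this method, a linear relationship between log k′ and log P, can be sketched with an ordinary least-squares fit. The data pairs below are invented for illustration; in the actual method, log k′ is computed from mobilities measured with and without vesicles.

```python
import numpy as np
from scipy import stats

log_p = np.array([0.5, 1.0, 1.8, 2.5, 3.2, 4.0])       # known octanol/water log P
log_k = np.array([-1.1, -0.6, 0.1, 0.8, 1.4, 2.2])     # hypothetical measured log k'

fit = stats.linregress(log_p, log_k)
print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.2f}, r={fit.rvalue:.3f}")

# Invert the calibration: estimate log P of a new analyte from its log k'.
log_p_est = (0.5 - fit.intercept) / fit.slope
print(f"estimated log P for log k' = 0.5: {log_p_est:.2f}")
```

    A high correlation over calibration standards is what justifies reading hydrophobicity off measured retention, which is the paper's conclusion for neutral and basic analytes.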

  7. A Factorization Approach to the Linear Regulator Quadratic Cost Problem

    NASA Technical Reports Server (NTRS)

    Milman, M. H.

    1985-01-01

    A factorization approach to the linear regulator quadratic cost problem is developed. This approach makes some new connections between optimal control, factorization, Riccati equations and certain Wiener-Hopf operator equations. Applications of the theory to systems describable by evolution equations in Hilbert space and differential delay equations in Euclidean space are presented.
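
    For context, the linear regulator quadratic cost problem in its familiar algebraic Riccati form can be solved in a few lines (the paper's contribution is an alternative factorization route to the same object). The double-integrator system and cost matrices below are illustrative.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])     # double integrator
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                  # state cost
R = np.array([[1.0]])          # control cost

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal feedback gain, u = -K x

# The algebraic Riccati residual should vanish at the solution:
residual = A.T @ P + P @ A - P @ B @ np.linalg.solve(R, B.T) @ P + Q
print("gain K =", K, " residual norm =", np.linalg.norm(residual))
```

    For this particular system the optimal gain has the known closed form K = [1, √3], a convenient sanity check on any alternative (e.g. factorization-based) solution method.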

  8. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  9. Nursing students' attitudes towards information and communication technology: an exploratory and confirmatory factor analytic approach.

    PubMed

    Lee, Jung Jae; Clarke, Charlotte L

    2015-05-01

    The aim of this study was to develop and psychometrically test a shortened version of the Information Technology Attitude Scales for Health, in the investigation of nursing students with clinical placement experiences. Nurses and nursing students need to develop high levels of competency in information and communication technology. However, they encounter significant barriers in the use of the technology. Although some instruments have been developed to measure factors that influence nurses' attitudes towards technology, their validity is questionable and few studies have been developed to test the attitudes of nursing students in particular. A cross-sectional survey design was used. The Information Technology Attitude Scales for Health was used to collect data from October 2012-December 2012. A panel of experts reviewed the content of the instrument and a pilot study was conducted. Following this, a total of 508 nursing students, who were engaged in clinical placements, were recruited from six universities in South Korea. Exploratory and confirmatory factor analyses were performed and reliability and construct validity were assessed. The resulting instrument consisted of 19 items across four factors. Reliability of the four factors was acceptable and the validity was supported. The instrument was shown to be both valid and reliable for measuring nursing students' attitudes towards technology, thus adding to current understanding of this aspect. Through these measurements, nursing educators and students are able to be more reflexive about their attitudes and can thus seek to develop them positively. © 2015 John Wiley & Sons Ltd.
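
    The exploratory step of a two-stage factor-analytic workflow can be sketched with synthetic data: generate item scores driven by known latent factors, then count retainable factors from the eigenvalues of the item correlation matrix (Kaiser criterion). Everything below is invented for illustration; the real study used 19 items loading on four factors, not the two simulated here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two latent "attitude" factors, each driving four observed items.
n, items_per_factor = 500, 4
f1 = rng.normal(size=(n, 1))
f2 = rng.normal(size=(n, 1))
noise = rng.normal(scale=0.6, size=(n, 2 * items_per_factor))
X = np.hstack([f1.repeat(items_per_factor, 1),
               f2.repeat(items_per_factor, 1)]) + noise

corr = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int((eigvals > 1.0).sum())    # Kaiser criterion: eigenvalues > 1
print("eigenvalues:", np.round(eigvals, 2), "-> retained factors:", n_factors)
```

    In a full analysis this exploratory solution would then be tested on a held-out sample with a confirmatory factor model, mirroring the EFA-then-CFA design of the study.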

  10. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  11. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    PubMed

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    In modern analytical chemistry researchers pursue novel materials to meet analytical challenges such as improvements in sensitivity, selectivity, and detection limit. Metal-organic frameworks (MOFs) are an emerging class of microporous materials, and their unusual properties such as high surface area, good thermal stability, uniform structured nanoscale cavities, and the availability of in-pore functionality and outer-surface modification are attractive for diverse analytical applications. This Account summarizes our research on the analytical applications of MOFs ranging from sampling to chromatographic separation. MOFs have been either directly used or engineered to meet the demands of various analytical applications. Bulk MOFs with microsized crystals are convenient sorbents for direct application to in-field sampling and solid-phase extraction. Quartz tubes packed with MOF-5 have shown excellent stability, adsorption efficiency, and reproducibility for in-field sampling and trapping of atmospheric formaldehyde. The 2D copper(II) isonicotinate packed microcolumn has demonstrated large enhancement factors and good shape- and size-selectivity when applied to on-line solid-phase extraction of polycyclic aromatic hydrocarbons in water samples. We have explored the molecular sieving effect of MOFs for the efficient enrichment of peptides with simultaneous exclusion of proteins from biological fluids. These results show promise for the future of MOFs in peptidomics research. Moreover, nanosized MOFs and engineered thin films of MOFs are promising materials as novel coatings for solid-phase microextraction. We have developed an in situ hydrothermal growth approach to fabricate thin films of MOF-199 on etched stainless steel wire for solid-phase microextraction of volatile benzene homologues with large enhancement factors and wide linearity. Their high thermal stability and easy-to-engineer nanocrystals make MOFs attractive as new stationary phases to fabricate MOF

  12. Analytical optical scattering in clouds

    NASA Technical Reports Server (NTRS)

    Phanord, Dieudonne D.

    1989-01-01

    An analytical optical model for scattering of light due to lightning by clouds of different geometry is being developed. The self-consistent approach and the equivalent medium concept of Twersky were used to treat the case corresponding to outside illumination. Thus, with knowledge of the bulk parameters, the resulting multiple scattering problem is transformed into scattering by a single obstacle in isolation. Based on the size parameter of a typical water droplet relative to the incident wavelength, the problem for the single scatterer equivalent to the distribution of cloud particles can be solved by either Mie or Rayleigh scattering theory. The supercomputing code of Wiscombe can be used immediately to produce results that can be compared to the Monte Carlo computer simulation for outside incidence. A fairly reasonable inverse approach using the solution of the outside illumination case was proposed to model analytically the situation for point sources located inside the optically thick cloud. Its mathematical details are still being investigated. When finished, it will provide scientists with an enhanced capability to study more realistic clouds. For testing purposes, the direct approach to the inside illumination of clouds by lightning is under consideration. An analytical solution for the cubic cloud is expected soon. For cylindrical or spherical clouds, preliminary results are needed for scattering by bounded obstacles above or below a penetrable surface interface.

  13. Exact mode volume and Purcell factor of open optical systems

    NASA Astrophysics Data System (ADS)

    Muljarov, E. A.; Langbein, W.

    2016-12-01

    The Purcell factor quantifies the change of the radiative decay of a dipole in an electromagnetic environment relative to free space. Designing this factor is at the heart of photonics technology, striving to develop ever smaller or less lossy optical resonators. The Purcell factor can be expressed using the electromagnetic eigenmodes of the resonators, introducing the notion of a mode volume for each mode. This approach allows an analytic treatment, reducing the Purcell factor and other observables to sums over eigenmode resonances. Calculating the mode volumes requires a correct normalization of the modes. We introduce an exact normalization of modes, not relying on perfectly matched layers. We present an analytic theory of the Purcell effect based on this exact mode normalization and the resulting effective mode volume. We use a homogeneous dielectric sphere in vacuum, which is analytically solvable, to exemplify these findings. We furthermore verify the applicability of the normalization to numerically determined modes of a finite dielectric cylinder.
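
    The quantity at the center of this record has a compact closed form worth making concrete. The sketch below uses the conventional textbook expression F_P = (3/4π²)(λ/n)³(Q/V), not the exact mode normalization derived in the paper, and the cavity numbers are purely illustrative.

```python
import math

def purcell_factor(wavelength, n, q_factor, mode_volume):
    """Textbook Purcell factor F_P = (3 / (4*pi^2)) * (lambda/n)^3 * Q / V.

    This is the conventional closed form, not the exact normalization
    derived in the paper above; all values used below are illustrative.
    """
    return (3.0 / (4.0 * math.pi ** 2)) * (wavelength / n) ** 3 * q_factor / mode_volume

# A cavity with Q = 1000 and a mode volume of one cubic wavelength:
wavelength = 1.0e-6  # 1 um, arbitrary example
f_p = purcell_factor(wavelength, n=1.0, q_factor=1000.0,
                     mode_volume=(wavelength / 1.0) ** 3)
print(round(f_p))  # -> 76, i.e. 3Q / (4*pi^2)
```

For a mode volume of one cubic wavelength the factor reduces to 3Q/(4π²), which is why enhancement is driven by the ratio Q/V rather than by either quantity alone.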

  14. Override the controversy: Analytic thinking predicts endorsement of evolution.

    PubMed

    Gervais, Will M

    2015-09-01

    Despite overwhelming scientific consensus, popular opinions regarding evolution are starkly divided. In the USA, for example, nearly one in three adults espouses a literal and recent divine creation account of human origins. Plausibly, resistance to scientific conclusions regarding the origins of species, like much resistance to other scientific conclusions (Bloom & Weisberg, 2007), gains support from reliably developing intuitions. Intuitions about essentialism, teleology, agency, and order may combine to make creationism potentially more cognitively attractive than evolutionary concepts. However, dual process approaches to cognition recognize that people can often analytically override their intuitions. Two large studies (total N=1324) found consistent evidence that a tendency to engage in analytic thinking predicted endorsement of evolution, even controlling for relevant demographic, attitudinal, and religious variables. Meanwhile, exposure to religion predicted reduced endorsement of evolution. Cognitive style is one factor among many affecting opinions on the origin of species. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  16. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  17. Evaluating Effective Teaching in College Level Economics Using Student Ratings of Instruction: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Agbetsiafa, Douglas

    2010-01-01

    This paper explores the factors that affect students' evaluation of economic instruction using a sample of 1300 completed rating instruments at a comprehensive four-year mid-western public university. The study uses factor analysis to determine the validity and reliability of the evaluation instrument in assessing instructor or course…

  18. Annual banned-substance review: Analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans

    2018-01-01

    Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Bimodal fuzzy analytic hierarchy process (BFAHP) for coronary heart disease risk assessment.

    PubMed

    Sabahi, Farnaz

    2018-04-04

    Rooted deeply in medical multiple criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in the fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), which augments fuzzy numbers with two aspects of knowledge, probability and validity, to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, we compute the fuzzy probability of risk factors using the Bayesian formulation. In the BFAHP approach, these fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix. We then aggregate fuzzy probabilities and fuzzy validities in a pairwise manner for each risk factor and each alternative. BFAHP decides between affected and not affected by ranking high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients from Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner can increase confidence in the results and be clinically useful, especially in the face of incomplete information, when compared with actual results. Applied to CHD risk assessment on this dataset, the proposed BFAHP yields a high accuracy rate, above 85%, for correct prediction. In addition, this paper finds that the risk factors of diastolic blood pressure in men and high-density lipoprotein in women are more important in CHD than other risk factors. Copyright © 2018 Elsevier Inc. All rights reserved.
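
    The reciprocal comparison matrix mentioned above is the core AHP data structure. The sketch below shows only the classical crisp step of extracting priority weights from such a matrix via the row geometric-mean method; the bimodal fuzzy aggregation of BFAHP itself is not reproduced, and the comparison values are hypothetical.

```python
import math

def ahp_weights(matrix):
    """Priority weights from a reciprocal pairwise comparison matrix
    using the row geometric-mean method (a common crisp-AHP step;
    BFAHP's fuzzy probability/validity aggregation is not shown here)."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparisons of three risk factors (a_ij = importance of i over j;
# note each a_ji = 1 / a_ij, the reciprocal property)
comparisons = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = ahp_weights(comparisons)
print([round(w, 3) for w in weights])  # -> [0.571, 0.286, 0.143]
```

Because this example matrix is perfectly consistent (a_ik = a_ij * a_jk), the geometric-mean weights coincide with the principal-eigenvector weights; for inconsistent expert judgments the two methods can differ slightly.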

  20. Scalable non-negative matrix tri-factorization.

    PubMed

    Čopar, Andrej; Žitnik, Marinka; Zupan, Blaž

    2017-01-01

    Matrix factorization is a well-established pattern discovery tool that has seen numerous applications in biomedical data analytics, such as gene expression co-clustering, patient stratification, and gene-disease association mining. Matrix factorization learns a latent data model that takes a data matrix and transforms it into a latent feature space enabling generalization, noise removal and feature discovery. However, factorization algorithms are numerically intensive, and hence there is a pressing challenge to scale current algorithms to work with large datasets. Our focus in this paper is matrix tri-factorization, a popular method that is not limited by the assumption of standard matrix factorization about data residing in one latent space. Matrix tri-factorization solves this by inferring a separate latent space for each dimension in a data matrix, and a latent mapping of interactions between the inferred spaces, making the approach particularly suitable for biomedical data mining. We developed a block-wise approach for latent factor learning in matrix tri-factorization. The approach partitions a data matrix into disjoint submatrices that are treated independently and fed into a parallel factorization system. An appealing property of the proposed approach is its mathematical equivalence with serial matrix tri-factorization. In a study on large biomedical datasets we show that our approach scales well on multi-processor and multi-GPU architectures. On a four-GPU system we demonstrate that our approach can be more than 100-times faster than its single-processor counterpart. A general approach for scaling non-negative matrix tri-factorization is proposed. The approach is especially useful for parallel matrix factorization implemented in a multi-GPU environment. We expect the new approach will be useful in emerging procedures for latent factor analysis, notably for data integration, where many large data matrices need to be collectively factorized.
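
    As a point of reference for the block-wise method described above, the serial baseline it parallelizes can be sketched as multiplicative updates for non-negative tri-factorization X ≈ U S Vᵀ under the Frobenius loss. This is a generic reference implementation, not the authors' GPU code.

```python
import numpy as np

def nmtf(X, k1, k2, iters=200, eps=1e-9, seed=0):
    """Non-negative matrix tri-factorization X ~ U S V^T via multiplicative
    updates on the Frobenius loss. Serial reference sketch only; the paper's
    contribution is a block-wise, multi-GPU variant of this computation."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = rng.random((n, k1))
    S = rng.random((k1, k2))
    V = rng.random((m, k2))
    for _ in range(iters):
        # Each factor is scaled by the ratio of the (+) and (-) gradient parts,
        # which keeps all entries non-negative.
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
    return U, S, V

X = np.abs(np.random.default_rng(1).random((30, 20)))
U, S, V = nmtf(X, k1=5, k2=4)
err = np.linalg.norm(X - U @ S @ V.T) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

Note how U and V span separate latent spaces for rows and columns, with S mapping the interactions between them, which is the property the abstract highlights over standard two-factor NMF.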

  1. Light-emitting diodes for analytical chemistry.

    PubMed

    Macka, Mirek; Piasecki, Tomasz; Dasgupta, Purnendu K

    2014-01-01

    Light-emitting diodes (LEDs) are playing increasingly important roles in analytical chemistry, from the final analysis stage to photoreactors for analyte conversion to actual fabrication of and incorporation in microdevices for analytical use. The extremely fast turn-on/off rates of LEDs have made possible simple approaches to fluorescence lifetime measurement. Although they are increasingly being used as detectors, their wavelength selectivity as detectors has rarely been exploited. From their first proposed use for absorbance measurement in 1970, LEDs have been used in analytical chemistry in too many ways to make a comprehensive review possible. Hence, we critically review here the more recent literature on their use in optical detection and measurement systems. Cloudy as our crystal ball may be, we express our views on the future applications of LEDs in analytical chemistry: The horizon will certainly become wider as LEDs in the deep UV with sufficient intensity become available.

  2. E-Education Applications: Human Factors and Innovative Approaches

    ERIC Educational Resources Information Center

    Ghaoui, Claude, Ed.

    2004-01-01

    "E-Education Applications: Human Factors and Innovative Approaches" enforces the need to take multi-disciplinary and/or inter-disciplinary approaches, when solutions for e-education (or online-, e-learning) are introduced. By focusing on the issues that have impact on the usability of e-learning, the book specifically fills-in a gap in this area,…

  3. Synthesizing evidence on complex interventions: how meta-analytical, qualitative, and mixed-method approaches can contribute.

    PubMed

    Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda

    2013-11-01

    Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from

  4. An Analytics-Based Approach to Managing Cognitive Load by Using Log Data of Learning Management Systems and Footprints of Social Media

    ERIC Educational Resources Information Center

    Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru

    2015-01-01

    Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…

  5. A Meta-Analytic Approach to Examining the Correlation Between Religion/Spirituality and Mental Health in Cancer

    PubMed Central

    Salsman, John M.; Pustejovsky, James E.; Jim, Heather S.L.; Munoz, Alexis R.; Merluzzi, Thomas V.; George, Login; Park, Crystal L.; Danhauer, Suzanne C.; Sherman, Allen C.; Snyder, Mallory A.; Fitchett, George

    2015-01-01

    Purpose Religion and spirituality (R/S) are patient-centered factors and often resources for managing the emotional sequelae of the cancer experience. Studies investigating the relationship between R/S (e.g., beliefs, experiences, coping) and mental health (e.g., depression, anxiety, well-being) in cancer have used very heterogeneous measures, with correspondingly inconsistent results. A meaningful synthesis of these findings has been lacking; thus, the purpose of this study was to conduct a meta-analysis of the research on R/S and mental health. Methods Four electronic databases were systematically reviewed and 2,073 abstracts met initial selection criteria. Reviewer pairs applied standardized coding schemes to extract correlational indices of the relationship between R/S and mental health. A total of 617 effect sizes from 148 eligible studies were synthesized using meta-analytic generalized estimating equations; subgroup analyses were performed to examine moderators of effects. Results The estimated mean correlation (Fisher z) was 0.19 (95% CI 0.16–0.23), which varied as a function of R/S dimension: affective, z=0.38 (95% CI 0.33-0.43); behavioral, z=0.03 (95% CI -0.02-0.08); cognitive, z=0.10 (95% CI 0.06-0.14); and ‘other,’ z=0.08 (95% CI 0.03-0.13). Aggregate, study-level demographic and clinical factors were not predictive of the relationship between R/S and mental health. There was little indication of publication or reporting biases. Conclusions The relationship between R/S and mental health is generally a positive one. The strength of that relationship is modest and varies as a function of R/S dimensions and mental health domains assessed. Identification of optimal R/S measures and more sophisticated methodological approaches are needed to advance research. PMID:26258536

  6. A meta-analytic approach to examining the correlation between religion/spirituality and mental health in cancer.

    PubMed

    Salsman, John M; Pustejovsky, James E; Jim, Heather S L; Munoz, Alexis R; Merluzzi, Thomas V; George, Login; Park, Crystal L; Danhauer, Suzanne C; Sherman, Allen C; Snyder, Mallory A; Fitchett, George

    2015-11-01

    Religion and spirituality (R/S) are patient-centered factors and often are resources for managing the emotional sequelae of the cancer experience. Studies investigating the correlation between R/S (eg, beliefs, experiences, coping) and mental health (eg, depression, anxiety, well being) in cancer have used very heterogeneous measures and have produced correspondingly inconsistent results. A meaningful synthesis of these findings has been lacking; thus, the objective of this review was to conduct a meta-analysis of the research on R/S and mental health. Four electronic databases were systematically reviewed, and 2073 abstracts met initial selection criteria. Reviewer pairs applied standardized coding schemes to extract indices of the correlation between R/S and mental health. In total, 617 effect sizes from 148 eligible studies were synthesized using meta-analytic generalized estimating equations, and subgroup analyses were performed to examine moderators of effects. The estimated mean correlation (Fisher z) was 0.19 (95% confidence interval [CI], 0.16-0.23), which varied as a function of R/S dimensions: affective R/S (z = 0.38; 95% CI, 0.33-0.43), behavioral R/S (z = 0.03; 95% CI, -0.02-0.08), cognitive R/S (z = 0.10; 95% CI, 0.06-0.14), and 'other' R/S (z = 0.08; 95% CI, 0.03-0.13). Aggregate, study-level demographic and clinical factors were not predictive of the relation between R/S and mental health. There was little indication of publication or reporting biases. The correlation between R/S and mental health generally was positive. The strength of that correlation was modest and varied as a function of the R/S dimensions and mental health domains assessed. The identification of optimal R/S measures and more sophisticated methodological approaches are needed to advance research. © 2015 American Cancer Society.
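
    The pooled effects above are reported on the Fisher z scale; back-transforming to the familiar correlation scale is simply r = tanh(z), applied to the point estimate and the CI endpoints alike. A minimal sketch using the values quoted in the abstract:

```python
import math

def fisher_z_to_r(z):
    """Back-transform a Fisher z effect size to a correlation, r = tanh(z)."""
    return math.tanh(z)

# Pooled estimate and 95% CI reported in the abstract above (z scale)
z, lo, hi = 0.19, 0.16, 0.23
r, r_lo, r_hi = (fisher_z_to_r(v) for v in (z, lo, hi))
print(f"r = {r:.3f} (95% CI {r_lo:.3f}-{r_hi:.3f})")  # -> r = 0.188 (95% CI 0.159-0.226)
```

For effects this small the z and r scales nearly coincide (tanh is close to the identity near zero), which is why the back-transformed correlation is still about 0.19.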

  7. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
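
    A classical special case of the exponential-family view described above is NMF under the KL (Poisson) divergence, the natural choice for count data with signal-dependent noise. The sketch below shows the well-known Lee-Seung multiplicative updates for that special case, not the paper's general quasi-likelihood framework.

```python
import numpy as np

def nmf_kl(X, k, iters=200, eps=1e-9, seed=0):
    """NMF minimizing the KL (Poisson) divergence between X and W @ H,
    using the classical Lee-Seung multiplicative updates. This is one
    member of the exponential family covered by the framework above."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    ones = np.ones_like(X)
    for _ in range(iters):
        W *= ((X / (W @ H + eps)) @ H.T) / (ones @ H.T + eps)
        H *= (W.T @ (X / (W @ H + eps))) / (W.T @ ones + eps)
    return W, H

# Count-valued data, for which Poisson noise is the natural model
X = np.random.default_rng(2).poisson(5.0, size=(40, 25)).astype(float)
W, H = nmf_kl(X, k=6)
```

Swapping the divergence here (Frobenius, KL, Itakura-Saito, ...) corresponds to swapping the assumed noise distribution, which is exactly the unification the quasi-likelihood framing makes explicit.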

  8. Gravity field error analysis for pendulum formations by a semi-analytical approach

    NASA Astrophysics Data System (ADS)

    Li, Huishu; Reubelt, Tilo; Antoni, Markus; Sneeuw, Nico

    2017-03-01

    Many geoscience disciplines push for ever higher requirements on accuracy, homogeneity and time- and space-resolution of the Earth's gravity field. Apart from better instruments or new observables, alternative satellite formations could improve the signal and error structure compared to Grace. One possibility to increase the sensitivity and isotropy by adding cross-track information is a pair of satellites flying in a pendulum formation. This formation contains two satellites which have different ascending nodes and arguments of latitude, but have the same orbital height and inclination. In this study, the semi-analytical approach for efficient pre-mission error assessment is presented, and the transfer coefficients of range, range-rate and range-acceleration gravitational perturbations are derived analytically for the pendulum formation considering a set of opening angles. The new challenge is the time variations of the opening angle and the range, leading to temporally variable transfer coefficients. This is solved by Fourier expansion of the sine/cosine of the opening angle and the central angle. The transfer coefficients are further applied to assess the error patterns which are caused by different orbital parameters. The simulation results indicate that a significant improvement in accuracy and isotropy is obtained for small and medium initial opening angles of single polar pendulums, compared to Grace. The optimal initial opening angles are 45° and 15° for accuracy and isotropy, respectively. For a Bender configuration, which is constituted by a polar Grace and an inclined pendulum in this paper, the behaviour of results is dependent on the inclination (prograde vs. retrograde) and on the relative baseline orientation (left or right leading). The simulation for a sun-synchronous orbit shows better results for the left leading case.

  9. NEUROTROPHIC FACTORS IN COMBINATORIAL APPROACHES FOR SPINAL CORD REGENERATION

    PubMed Central

    McCall, Julianne; Weidner, Norbert; Blesch, Armin

    2012-01-01

    Axonal regeneration is inhibited by a plethora of different mechanisms in the adult central nervous system (CNS). While neurotrophic factors have been shown to stimulate axonal growth in numerous animal models of nervous system injury, a lack of suitable growth substrates, an insufficient activation of neuron-intrinsic regenerative programs and extracellular inhibitors of regeneration limit the efficacy of neurotrophic factor delivery for anatomical and functional recovery after spinal cord injury. Thus, growth-stimulating factors will likely have to be combined with other treatment approaches to tap into the full potential of growth factor therapy for axonal regeneration. In addition, the temporal and spatial distribution of growth factors have to be tightly controlled to achieve biologically active concentrations, to allow for the chemotropic guidance of axons and to prevent adverse effects related to the widespread distribution of neurotrophic factors. Here, we will review the rationale for combinatorial treatments in axonal regeneration and summarize some recent progress in promoting axonal regeneration in the injured CNS using such approaches. PMID:22526621

  10. Experienced quality factors: qualitative evaluation approach to audiovisual quality

    NASA Astrophysics Data System (ADS)

    Jumisko-Pyykkö, Satu; Häkkinen, Jukka; Nyman, Göte

    2007-02-01

    Subjective evaluation is used to identify impairment factors of multimedia quality. The final quality is often formulated via quantitative experiments, but this approach has its constraints, as subjects' quality interpretations, experiences, and quality evaluation criteria are disregarded. To identify these quality evaluation factors, this study qualitatively examined the criteria participants used to evaluate audiovisual video quality. A semi-structured interview was conducted with 60 participants after a subjective audiovisual quality evaluation experiment. The assessment compared several relatively low audio-video bitrate ratios with five different television contents on a mobile device. In the analysis, methodological triangulation (grounded theory, Bayesian networks, and correspondence analysis) was applied to approach the qualitative quality. The results showed that the most important evaluation criteria were factors of visual quality, contents, factors of audio quality, usefulness (followability), and audiovisual interaction. Several relations between the quality factors and similarities between the contents were identified. As a methodological recommendation, content- and usage-related factors need to be further examined to improve quality evaluation experiments.

  11. The Development of Verbal and Visual Working Memory Processes: A Latent Variable Approach

    ERIC Educational Resources Information Center

    Koppenol-Gonzalez, Gabriela V.; Bouwmeester, Samantha; Vermunt, Jeroen K.

    2012-01-01

    Working memory (WM) processing in children has been studied with different approaches, focusing on either the organizational structure of WM processing during development (factor analytic) or the influence of different task conditions on WM processing (experimental). The current study combined both approaches, aiming to distinguish verbal and…

  12. Experimental/analytical approaches to modeling, calibrating and optimizing shaking table dynamics for structural dynamic applications

    NASA Astrophysics Data System (ADS)

    Trombetti, Tomaso

    This thesis presents an Experimental/Analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g's, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by a 407 MTS controller which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques. The power spectral density of the system input and the cross power spectral
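
    The displacement control loop described above is built around a PID core. The sketch below is a generic discrete PID tracking a first-order plant, with hypothetical gains and plant parameters; it does not model the feedforward and differential-pressure terms of the MTS controller.

```python
class PID:
    """Minimal discrete PID controller. Illustrative only: the 407 MTS
    controller above adds feedforward and differential-pressure terms
    that are not modeled here, and all gains below are hypothetical."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt          # accumulate I term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a unit setpoint with a first-order plant dy/dt = (u - y) / tau
dt, tau = 0.01, 0.5
pid, y = PID(kp=2.0, ki=1.0, kd=0.0, dt=dt), 0.0
for _ in range(3000):
    u = pid.update(1.0, y)
    y += dt * (u - y) / tau
print(f"final output: {y:.3f}")  # settles near 1.0 thanks to integral action
```

The integral term is what removes the steady-state tracking error; in a physical shaking table, the feedforward path mentioned in the abstract additionally improves tracking of fast reference motions that pure feedback would lag behind.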

  13. Growth of nano-dots on the grazing incidence mirror surface under FEL irradiation: analytic approach to modeling

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, I. V.; Buzmakov, A. V.; Siewert, F.; Tiedtke, K.; Störmer, M.; Samoylova, L.; Sinn, H.

    2017-05-01

    A simple analytic equation is deduced to explain a new physical phenomenon detected experimentally: growth of nano-dots (40-55 nm diameter, 8-13 nm height, 9.4 dots/μm² surface density) on a grazing incidence mirror surface after three years of irradiation by the free electron laser FLASH (5-45 nm wavelength, 3 degrees grazing incidence angle). The growth model is based on the assumption that the growth of nano-dots is caused by polymerization of incoming hydrocarbon molecules under the action of incident photons directly or of photoelectrons knocked out from the mirror surface. The key feature of our approach is that we take into account the radiation intensity variation near the mirror surface in an explicit form, because the polymerization probability is proportional to it. We demonstrate that this simple analytic approach explains all phenomena observed in experiment and predicts new effects. In particular, we show that nano-dot growth depends crucially on the grazing angle of the incoming beam and its intensity: growth of nano-dots is observed only within intervals of grazing angle and radiation intensity that are bounded from above and below. A decrease in the grazing angle by only 1 degree (from 3 to 2 degrees) may result in strong suppression of nano-dot growth and its total disappearance. Similarly, a decrease in the radiation intensity by several times (replacement of a free electron laser by a synchrotron) also makes nano-dot growth disappear.

  14. Writing analytic element programs in Python.

    PubMed

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
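
    The superposition idea described in the article can be made concrete with a stripped-down object-oriented sketch: each element contributes to a discharge potential, and the model sums the contributions. This simplifies the article's actual design (no complex potentials or multiple inheritance), and all class names and values are illustrative.

```python
import math

class Element:
    """Base class: every analytic element contributes to the discharge potential."""
    def potential(self, x, y):
        raise NotImplementedError

class Well(Element):
    """Extraction well with discharge Q at (xw, yw): Phi = Q/(2*pi) * ln(r)."""
    def __init__(self, xw, yw, Q):
        self.xw, self.yw, self.Q = xw, yw, Q

    def potential(self, x, y):
        r = math.hypot(x - self.xw, y - self.yw)
        return self.Q / (2.0 * math.pi) * math.log(r)

class UniformFlow(Element):
    """Uniform background flow in the x direction."""
    def __init__(self, qx):
        self.qx = qx

    def potential(self, x, y):
        return -self.qx * x

class Model:
    """Superposition: the total potential is the sum over all elements.
    New element types can be added without touching existing code, since
    the model relies only on the Element.potential interface."""
    def __init__(self, elements):
        self.elements = elements

    def potential(self, x, y):
        return sum(e.potential(x, y) for e in self.elements)

model = Model([Well(0.0, 0.0, Q=100.0), UniformFlow(qx=2.0)])
print(f"potential at (1, 0): {model.potential(1.0, 0.0):.2f}")  # prints -2.00
```

The extensibility the article emphasizes shows up even in this toy version: adding, say, a line-sink element means writing one new subclass with a `potential` method, with no changes to `Model` or the existing elements.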

  15. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for the development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  16. Offshore drilling effects in Brazilian SE marine sediments: a meta-analytical approach.

    PubMed

    Dore, Marina Pereira; Farias, Cássia; Hamacher, Cláudia

    2017-01-01

The exploration and production of oil and gas reserves often result in drill-cutting accumulations on the seafloor adjacent to drill locations. In this study, the detection of drilling influence on marine sediments was performed by meta-analytical comparison between data from pre- and post-drilling surveys undertaken in the offshore Campos Basin, southeast Brazil. In addition to this overall appraisal of the geochemical variables, a multivariate assessment considering only the post-drilling data was performed. Among the variables (fines content, carbonates, total organic carbon, barium, chromium, copper, iron, manganese, nickel, lead, vanadium, zinc, and total petroleum hydrocarbons), only barium, copper, and hydrocarbons were related to drilling impacts. In relation to the point of discharge, relatively elevated levels in the post-drilling campaigns were observed preferentially up to 500 m in the northeast and southwest directions, associated with the predominant direction of the Brazil Current. The other concentrations distributed in the surroundings seem to indicate the dilution and dispersion of drilling waste promoted by meteoceanographic factors.

  17. Analytical approach to calculation of response spectra from seismological models of ground motion

    USGS Publications Warehouse

    Safak, Erdal

    1988-01-01

    An analytical approach to calculate response spectra from seismological models of ground motion is presented. Seismological models have three major advantages over empirical models: (1) they help in an understanding of the physics of earthquake mechanisms, (2) they can be used to predict ground motions for future earthquakes and (3) they can be extrapolated to cases where there are no data available. As shown with this study, these models also present a convenient form for the calculation of response spectra, by using the methods of random vibration theory, for a given magnitude and site conditions. The first part of the paper reviews the past models for ground motion description, and introduces the available seismological models. Then, the random vibration equations for the spectral response are presented. The nonstationarity, spectral bandwidth and the correlation of the peaks are considered in the calculation of the peak response.
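
The random-vibration step, obtaining an rms response from a spectral description and scaling it by a peak factor to estimate the expected peak, can be sketched as follows. This is a generic illustration (Davenport's peak-factor formula for a stationary Gaussian process), not the paper's specific nonstationary, bandwidth-corrected treatment; the flat test PSD is an assumed input.

```python
import math

def rms_from_psd(freqs, psd):
    """Variance = trapezoidal integral of a one-sided response PSD over
    frequency; the rms response is its square root."""
    var = 0.0
    for i in range(len(freqs) - 1):
        var += 0.5 * (psd[i] + psd[i + 1]) * (freqs[i + 1] - freqs[i])
    return math.sqrt(var)

def peak_factor(nu, T):
    """Davenport's expected peak factor for a stationary Gaussian process
    with mean up-crossing rate nu (Hz) over duration T (s)."""
    x = math.sqrt(2.0 * math.log(nu * T))
    return x + 0.5772 / x

# Assumed flat response PSD of level 2.0 between 1 and 5 Hz:
rms = rms_from_psd([1.0, 5.0], [2.0, 2.0])         # sqrt(8) ~ 2.83
expected_peak = peak_factor(nu=3.0, T=10.0) * rms  # spectral-ordinate estimate
```

Repeating this for a family of oscillator frequencies, each with its own transfer function applied to the seismological source spectrum, yields the response spectrum.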

  18. An analytical approach to obtaining JWL parameters from cylinder tests

    NASA Astrophysics Data System (ADS)

    Sutton, B. D.; Ferguson, J. W.; Hodgson, A. N.

    2017-01-01

    An analytical method for determining parameters for the JWL Equation of State from cylinder test data is described. This method is applied to four datasets obtained from two 20.3 mm diameter EDC37 cylinder tests. The calculated pressure-relative volume (p-Vr) curves agree with those produced by hydro-code modelling. The average calculated Chapman-Jouguet (CJ) pressure is 38.6 GPa, compared to the model value of 38.3 GPa; the CJ relative volume is 0.729 for both. The analytical pressure-relative volume curves produced agree with the one used in the model out to the commonly reported expansion of 7 relative volumes, as do the predicted energies generated by integrating under the p-Vr curve. The calculated energy is within 1.6% of that predicted by the model.

  19. An Analytical Approach to Obtaining JWL Parameters from Cylinder Tests

    NASA Astrophysics Data System (ADS)

    Sutton, Ben; Ferguson, James

    2015-06-01

    An analytical method for determining parameters for the JWL equation of state (EoS) from cylinder test data is described. This method is applied to four datasets obtained from two 20.3 mm diameter EDC37 cylinder tests. The calculated parameters and pressure-volume (p-V) curves agree with those produced by hydro-code modelling. The calculated Chapman-Jouguet (CJ) pressure is 38.6 GPa, compared to the model value of 38.3 GPa; the CJ relative volume is 0.729 for both. The analytical pressure-volume curves produced agree with the one used in the model out to the commonly reported expansion of 7 relative volumes, as do the predicted energies generated by integrating under the p-V curve. The calculated and model energies are 8.64 GPa and 8.76 GPa respectively.
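
The quantities discussed, the pressure along the JWL isentrope and the energy obtained by integrating under the p-V curve, can be sketched numerically. The JWL coefficients below are illustrative literature-style values for a TNT-like explosive, not the EDC37 parameters determined in the paper.

```python
import math

def jwl_pressure(v, A=371.2, B=3.231, C=1.045, R1=4.15, R2=0.95, w=0.30):
    """Pressure (GPa) on the JWL principal isentrope at relative volume v:
    p = A*exp(-R1*v) + B*exp(-R2*v) + C / v**(1 + w).
    Coefficients are illustrative TNT-like values, NOT the EDC37 fit."""
    return A * math.exp(-R1 * v) + B * math.exp(-R2 * v) + C / v ** (1.0 + w)

def expansion_energy(v_end=7.0, n=20000):
    """Energy released per unit initial volume: trapezoidal integral of
    p dV from v = 1 out to the commonly reported 7 relative volumes."""
    h = (v_end - 1.0) / n
    total = 0.5 * (jwl_pressure(1.0) + jwl_pressure(v_end))
    for i in range(1, n):
        total += jwl_pressure(1.0 + i * h)
    return h * total

p_initial = jwl_pressure(1.0)  # pressure at the initial (unexpanded) volume
energy = expansion_energy()    # GPa x relative volume ~ GJ/m^3
```

Fitting A, B, C, R1, R2 and ω so that this curve reproduces the measured cylinder-wall expansion is the inverse problem the analytical method in the abstract addresses.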

  20. Stabilization and robustness of non-linear unity-feedback system - Factorization approach

    NASA Technical Reports Server (NTRS)

    Desoer, C. A.; Kabuli, M. G.

    1988-01-01

    The paper is a self-contained discussion of a right factorization approach in the stability analysis of the nonlinear continuous-time or discrete-time, time-invariant or time-varying, well-posed unity-feedback system S1(P, C). It is shown that a well-posed stable feedback system S1(P, C) implies that P and C have right factorizations. In the case where C is stable, P has a normalized right-coprime factorization. The factorization approach is used in stabilization and simultaneous stabilization results.

  1. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  2. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  3. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can be used to positively impact analytical reasoning and decision making in medical education by presenting variables in forms that enhance human perception and cognition of complex curriculum data. The positive results derived from our small-scale evaluation of a medical curriculum signal the need to extend this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  4. Analytic strategies to evaluate the association of time-varying exposures to HIV-related outcomes: Alcohol consumption as an example.

    PubMed

    Cook, Robert L; Kelso, Natalie E; Brumback, Babette A; Chen, Xinguang

    2016-01-01

As persons with HIV are living longer, there is a growing need to investigate factors associated with chronic disease, rate of disease progression and survivorship. Many risk factors for this high-risk population change over time, such as participation in treatment, alcohol consumption and drug abuse. Longitudinal datasets are increasingly available, particularly clinical data that contain multiple observations of health exposures and outcomes over time. Several analytic options are available for the assessment of longitudinal data; however, it can be challenging to choose the appropriate analytic method for specific combinations of research questions and types of data. The purpose of this review is to help researchers choose the appropriate methods to analyze longitudinal data, using alcohol consumption as an example of a time-varying exposure variable. When selecting the optimal analytic method, one must consider aspects of the exposure (e.g. timing, pattern, and amount) and the outcome (fixed or time-varying), while also minimizing bias. In this article, we describe several analytic approaches for longitudinal data, including developmental trajectory analysis, generalized estimating equations, and mixed-effects models. For each analytic strategy, we describe situations in which the method is appropriate and provide an example that demonstrates its use. Clinical data related to alcohol consumption and HIV are used to illustrate these methods.

  5. Analytical approaches to optimizing system "Semiconductor converter-electric drive complex"

    NASA Astrophysics Data System (ADS)

    Kormilicin, N. V.; Zhuravlev, A. M.; Khayatov, E. S.

    2018-03-01

In the electric drives of the machine-building industry, the problem of optimizing the drive in terms of mass and size indicators is acute. The article offers analytical methods that ensure the minimization of the mass of a multiphase semiconductor converter. In multiphase electric drives, the phase-current waveform that makes the best use of the active materials of the "semiconductor converter-electric drive complex" differs from the sinusoidal form. It is shown that under certain restrictions on the phase-current waveform, an analytical solution can be obtained. In particular, if one assumes the shape of the phase current to be rectangular, the optimal shape of the control actions will depend on the width of the interpolar gap. In the general case, the proposed algorithm can be used to solve the problem under consideration by numerical methods.

  6. Dynamic behaviour of a planar micro-beam loaded by a fluid-gap: Analytical and numerical approach in a high frequency range, benchmark solutions

    NASA Astrophysics Data System (ADS)

    Novak, A.; Honzik, P.; Bruneau, M.

    2017-08-01

    Miniaturized vibrating MEMS devices, active (receivers or emitters) or passive devices, and their use for either new applications (hearing, meta-materials, consumer devices,…) or metrological purposes under non-standard conditions, are involved today in several acoustic domains. More in-depth characterisation than the classical ones available until now are needed. In this context, the paper presents analytical and numerical approaches for describing the behaviour of three kinds of planar micro-beams of rectangular shape (suspended rigid or clamped elastic planar beam) loaded by a backing cavity or a fluid-gap, surrounded by very thin slits, and excited by an incident acoustic field. The analytical approach accounts for the coupling between the vibrating structure and the acoustic field in the backing cavity, the thermal and viscous diffusion processes in the boundary layers in the slits and the cavity, the modal behaviour for the vibrating structure, and the non-uniformity of the acoustic field in the backing cavity which is modelled in using an integral formulation with a suitable Green's function. Benchmark solutions are proposed in terms of beam motion (from which the sensitivity, input impedance, and pressure transfer function can be calculated). A numerical implementation (FEM) is handled against which the analytical results are tested.

  7. Orbital-optimized MP2.5 and its analytic gradients: Approaching CCSD(T) quality for noncovalent interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozkaya, Uğur, E-mail: ugur.bozkaya@atauni.edu.tr; Center for Computational Molecular Science and Technology, School of Chemistry and Biochemistry, and School of Computational Science and Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332; Sherrill, C. David

    2014-11-28

Orbital-optimized MP2.5 [or simply "optimized MP2.5," OMP2.5, for short] and its analytic energy gradients are presented. The cost of the presented method is as much as that of coupled-cluster singles and doubles (CCSD) [O(N^6) scaling] for energy computations. However, for analytic gradient computations the OMP2.5 method is only half as expensive as CCSD because there is no need to solve λ2-amplitude equations for OMP2.5. The performance of the OMP2.5 method is compared with that of the standard second-order Møller–Plesset perturbation theory (MP2), MP2.5, CCSD, and coupled-cluster singles and doubles with perturbative triples (CCSD(T)) methods for equilibrium geometries, hydrogen transfer reactions between radicals, and noncovalent interactions. For bond lengths of both closed- and open-shell molecules, the OMP2.5 method improves upon MP2.5 and CCSD by 38%–43% and 31%–28%, respectively, with Dunning's cc-pCVQZ basis set. For complete basis set (CBS) predictions of hydrogen transfer reaction energies, the OMP2.5 method exhibits a substantially better performance than MP2.5, providing a mean absolute error of 1.1 kcal mol^-1, which is more than 10 times lower than that of MP2.5 (11.8 kcal mol^-1); compared to MP2 (14.6 kcal mol^-1) there is a more than 12-fold reduction in errors. For noncovalent interaction energies (at CBS limits), the OMP2.5 method maintains the very good performance of MP2.5 for closed-shell systems, and for open-shell systems it significantly outperforms MP2.5 and CCSD and approaches CCSD(T) quality. The MP2.5 errors decrease by a factor of 5 when the optimized orbitals are used for open-shell noncovalent interactions, and compared to CCSD there is a more than 3-fold reduction in errors. Overall, the present application results indicate that the OMP2.5 method is very promising for open-shell noncovalent interactions and other chemical systems with difficult electronic structures.

  8. Factorization approach to superintegrable systems: Formalism and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Á., E-mail: angelb@ubu.es; Herranz, F. J., E-mail: fjherranz@ubu.es; Kuru, Ş., E-mail: kuru@science.ankara.edu.tr

    2017-03-15

    The factorization technique for superintegrable Hamiltonian systems is revisited and applied in order to obtain additional (higher-order) constants of the motion. In particular, the factorization approach to the classical anisotropic oscillator on the Euclidean plane is reviewed, and new classical (super) integrable anisotropic oscillators on the sphere are constructed. The Tremblay–Turbiner–Winternitz system on the Euclidean plane is also studied from this viewpoint.

  9. Solving a layout design problem by analytic hierarchy process (AHP) and data envelopment analysis (DEA) approach

    NASA Astrophysics Data System (ADS)

    Tuzkaya, Umut R.; Eser, Arzum; Argon, Goner

    2004-02-01

Today, the growing amount of waste due to the fast consumption rate of products has started irreversible environmental pollution and damage. A considerable part of this waste is packaging material. With the realization of this fact, various waste policies have taken important steps. Here we consider a firm for which waste aluminum constitutes the majority of its raw materials. In order to achieve a profitable recycling process, the plant layout should be well designed. In this study, we propose a two-step approach involving the Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) to solve facility layout design problems. A case example is considered to demonstrate the results achieved.
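
The AHP step of such an approach derives criterion weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch, assuming a hypothetical three-criterion judgment matrix (the criteria and judgments are invented for illustration, not taken from the case study):

```python
def ahp_weights(M, iters=500):
    """Priority weights as the normalized principal eigenvector of a
    pairwise comparison matrix, found by power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgments for three layout criteria on Saaty's 1-9 scale:
# material flow vs. flexibility = 3, material flow vs. noise = 5, etc.
# Reciprocals fill the lower triangle.
M = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     3.0],
     [1.0 / 5, 1.0 / 3, 1.0]]
weights = ahp_weights(M)  # roughly [0.64, 0.26, 0.10]
```

The resulting weights would then feed the DEA step, which scores candidate layouts against these weighted criteria.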

  10. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Integrated Care and Connected Health Approaches Leveraging Personalised Health through Big Data Analytics.

    PubMed

    Maglaveras, Nicos; Kilintzis, Vassilis; Koutkias, Vassilis; Chouvarda, Ioanna

    2016-01-01

Integrated care and connected health are two fast-evolving concepts that have the potential to leverage personalised health. On the one hand, the restructuring of care models and the implementation of new systems and integrated care programs providing coaching and advanced intervention possibilities enable medical decision support and personalized healthcare services. On the other hand, the connected health ecosystem builds the means to follow and support citizens via personal health systems in their everyday activities and, thus, gives rise to an unprecedented wealth of data. These approaches are leading to a deluge of complex data, as well as to new types of interactions with and among users of the healthcare ecosystem. The main challenges concern the data layer, the information layer, and the output of information processing and analytics. In all of the above-mentioned layers, the primary concern is quality, both of data and of information, which increases the need for filtering mechanisms. Especially in the data layer, the big biodata management and analytics ecosystem is evolving; telemonitoring is a step forward for leveraging data quality, with numerous challenges still left to address, partly due to the large number of micro- and nano-sensors and technologies available today, as well as the heterogeneity in the users' backgrounds and data sources. This leads to new R&D pathways concerning biomedical information processing and management, as well as to the design of new intelligent decision support systems (DSS) and interventions for patients. In this paper, we illustrate these issues through exemplar research targeting chronic patients, illustrating the current status and trends in personal health systems (PHS) within the integrated care and connected care world.

  12. Authentication of Kalix (N.E. Sweden) vendace caviar using inductively coupled plasma-based analytical techniques: evaluation of different approaches.

    PubMed

    Rodushkin, I; Bergman, T; Douglas, G; Engström, E; Sörlin, D; Baxter, D C

    2007-02-05

Different analytical approaches for origin differentiation between vendace and whitefish caviars from brackish and fresh waters were tested using inductively coupled plasma double-focusing sector field mass spectrometry (ICP-SFMS) and multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). These approaches involve identifying differences in elemental concentrations or sample-specific variations in isotopic composition (Sr and Os). Concentrations of 72 elements were determined by ICP-SFMS following microwave-assisted digestion in vendace and whitefish caviar samples from Sweden (from both brackish and fresh water), Finland and the USA, as well as in unprocessed vendace roe and salt used in caviar production. This data set allows identification of elements whose contents in caviar can be affected by salt addition as well as by contamination during production and packaging. Long-term method reproducibility was assessed for all analytes based on replicate caviar preparations/analyses, and variations in element concentrations in caviar from different harvests were evaluated. The greatest utility for differentiation was demonstrated for elements with varying concentrations between brackish and fresh waters (e.g. As, Br, Sr). Elemental ratios, specifically Sr/Ca, Sr/Mg and Sr/Ba, are especially useful for authentication of vendace caviar processed from brackish-water roe, due to the significant differences between caviar from different sources, limited between-harvest variations and relatively high concentrations in samples, allowing precise determination by modern analytical instrumentation. Variations in the 87Sr/86Sr ratio for vendace caviar from different harvests (on the order of 0.05-0.1%) are at least 10-fold smaller than the differences between caviar processed from brackish- and freshwater roe. Hence, Sr isotope ratio measurements (either by ICP-SFMS or by MC-ICP-MS) have great potential for origin differentiation. On the contrary, it was impossible to

  13. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  14. Mobility spectrum analytical approach for intrinsic band picture of Ba(FeAs)2

    NASA Astrophysics Data System (ADS)

    Huynh, K. K.; Tanabe, Y.; Urata, T.; Heguri, S.; Tanigaki, K.; Kida, T.; Hagiwara, M.

    2014-09-01

Unconventional high-temperature superconductivity as well as three-dimensional bulk Dirac-cone quantum states arising from the unique d-orbital topology comprise an intriguing research area in physics. Here we apply a special analytical approach using a mobility spectrum, in which the carrier number is conveniently described as a function of mobility without any hypothesis about either the types or the numbers of carriers, to the interpretation of the longitudinal and transverse electric transport of high-quality single-crystal Ba(FeAs)2 over a wide range of magnetic fields. We show that the majority carriers are accommodated in large parabolic hole and electron pockets with very different topology as well as remarkably different mobility spectra, whereas the minority carriers reside in Dirac quantum states with the largest mobility, as high as 70,000 cm^2 (V s)^-1. The deduced mobility spectra are discussed and compared to the reported sophisticated first-principles band calculations.

  15. Analytical Psychology: A Review of a Theoretical Approach and Its Application to Counseling.

    ERIC Educational Resources Information Center

    Ziff, Katherine K.

    Analytical psychology is a field supported by training centers, specially trained analysts, and a growing body of literature. While it receives much recognition, it remains mostly outside the mainstream of counseling and counselor education. This document presents a brief history of analytical psychology and how it has been revisited and renamed…

  16. Oral Motor Abilities Are Task Dependent: A Factor Analytic Approach to Performance Rate.

    PubMed

    Staiger, Anja; Schölderle, Theresa; Brendel, Bettina; Bötzel, Kai; Ziegler, Wolfram

    2017-01-01

    Measures of performance rates in speech-like or volitional nonspeech oral motor tasks are frequently used to draw inferences about articulation rate abnormalities in patients with neurologic movement disorders. The study objective was to investigate the structural relationship between rate measures of speech and of oral motor behaviors different from speech. A total of 130 patients with neurologic movement disorders and 130 healthy subjects participated in the study. Rate data was collected for oral reading (speech), rapid syllable repetition (speech-like), and rapid single articulator movements (nonspeech). The authors used factor analysis to determine whether the different rate variables reflect the same or distinct constructs. The behavioral data were most appropriately captured by a measurement model in which the different task types loaded onto separate latent variables. The data on oral motor performance rates show that speech tasks and oral motor tasks such as rapid syllable repetition or repetitive single articulator movements measure separate traits.
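
A quick way to see why a single latent variable may not suffice is to check how much variance the first principal axis of the rate-measure correlation matrix explains. The correlation matrix below is hypothetical (strong within-task-type correlations, weak across task types), not the study's data:

```python
import math

def first_eigen_share(R, iters=300):
    """Fraction of total variance along the first principal axis of a
    correlation matrix R (power iteration; total variance equals n for
    a correlation matrix)."""
    n = len(R)
    v = [1.0] * n
    for _ in range(iters):
        v = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    lam = sum(v[i] * sum(R[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam / n

# Hypothetical pattern: two speech-rate measures correlate 0.8 with each
# other, two nonspeech measures correlate 0.8 with each other, but the
# task types correlate only 0.2 across -- a two-factor-like structure.
R = [[1.0, 0.8, 0.2, 0.2],
     [0.8, 1.0, 0.2, 0.2],
     [0.2, 0.2, 1.0, 0.8],
     [0.2, 0.2, 0.8, 1.0]]
share = first_eigen_share(R)  # ~0.55: one factor leaves ~45% unexplained
```

A pattern like this is what leads factor analysis to prefer a model with separate latent variables per task type, as the study reports.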

  17. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  18. An analytical approach to the rise velocity of periodic bubble trains in non-Newtonian fluids.

    PubMed

    Frank, X; Li, H Z; Funfschilling, D

    2005-01-01

The present study aims at providing insight into the acceleration mechanism of a bubble chain rising in shear-thinning viscoelastic fluids. The experimental investigation by Particle Image Velocimetry (PIV), birefringence visualisation and rheological simulation shows that two aspects are central to bubble interactions in such media: the creation of stress by the passage of bubbles, and its relaxation due to the fluid's memory, forming an evanescent corridor of reduced viscosity. Interactions between bubbles were taken into account mainly through a linear superposition of the stress evolution behind each bubble. An analytical approach together with rheological considerations was developed to compute the rise velocity of a bubble chain as a function of the injection period and bubble volume. The model predictions compare satisfactorily with the experimental investigation.
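
The superposition idea, in which each passing bubble leaves behind a stress that relaxes over the fluid's memory time and the contributions of successive bubbles add linearly, can be sketched as follows. The single-exponential relaxation and the coupling constant alpha are simplifying assumptions for illustration, not the paper's rheological model.

```python
import math

def residual_stress(period_s, relax_time_s, n_bubbles=50):
    """Sum of the stresses left by previous bubbles at the moment a new
    bubble passes, each assumed to relax as exp(-t / relax_time) and
    added linearly (the superposition hypothesis)."""
    return sum(math.exp(-k * period_s / relax_time_s)
               for k in range(1, n_bubbles + 1))

def rise_velocity(v_single, period_s, relax_time_s, alpha=0.5):
    """Illustrative velocity model: residual stress lowers the apparent
    viscosity in the corridor, speeding up the next bubble. alpha is an
    assumed coupling constant, not from the paper."""
    return v_single * (1.0 + alpha * residual_stress(period_s, relax_time_s))

# Shorter injection periods leave more un-relaxed stress -> faster rise.
v_fast = rise_velocity(0.1, period_s=0.5, relax_time_s=2.0)
v_slow = rise_velocity(0.1, period_s=5.0, relax_time_s=2.0)
```

The qualitative trend, acceleration of closely spaced bubbles relative to an isolated one, matches the mechanism the abstract describes.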

  19. Moving Aerospace Structural Design Practice to a Load and Resistance Factor Approach

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.; Raju, Ivatury S.

    2016-01-01

    Aerospace structures are traditionally designed using the factor of safety (FOS) approach. The limit load on the structure is determined and the structure is then designed for FOS times the limit load - the ultimate load. Probabilistic approaches utilize distributions for loads and strengths. Failures are predicted to occur in the region of intersection of the two distributions. The load and resistance factor design (LRFD) approach judiciously combines these two approaches by intensive calibration studies on loads and strength to result in structures that are efficient and reliable. This paper discusses these three approaches.
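
The probabilistic element, failure predicted where the load and resistance distributions intersect, reduces for independent normal variables to a closed-form probability. A minimal sketch (the 1.4 factor and the 10% coefficients of variation are illustrative assumptions, not requirements from the paper):

```python
import math

def failure_probability(mu_load, sd_load, mu_res, sd_res):
    """P(resistance < load) for independent normal load and resistance:
    failures occur in the overlap region of the two distributions."""
    margin_mean = mu_res - mu_load
    margin_sd = math.hypot(sd_res, sd_load)
    z = margin_mean / margin_sd  # reliability index of the safety margin
    return 0.5 * (1.0 + math.erf(-z / math.sqrt(2.0)))

# FOS-style sizing: mean resistance set to 1.4x the limit load, with
# assumed 10% scatter on both load and strength.
pf = failure_probability(mu_load=100.0, sd_load=10.0, mu_res=140.0, sd_res=14.0)
```

LRFD calibration runs this kind of calculation in reverse: load and resistance factors are tuned so that the resulting failure probability meets a target reliability across the design space.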

  20. An analytical approach to the CMB polarization in a spatially closed background

    NASA Astrophysics Data System (ADS)

    Niazy, Pedram; Abbassi, Amir H.

    2018-03-01

The scalar-mode polarization of the cosmic microwave background is derived in a spatially closed universe from the Boltzmann equation using the line-of-sight integral method. The EE and TE multipole coefficients have been extracted analytically by making some tolerable approximations, such as treating the evolution of perturbations hydrodynamically and assuming a sudden transition from opacity to transparency at the time of last scattering. As the major advantage of analytic expressions, C^S_EE,ℓ and C^S_TE,ℓ explicitly show the dependencies on baryon density ΩB, matter density ΩM, curvature ΩK, primordial spectral index ns, primordial power spectrum amplitude As, optical depth τ_reion, recombination width σ_t and recombination time t_L. Using a realistic set of cosmological parameters taken from a fit to Planck data, the closed-universe EE and TE power spectra in the scalar mode are compared with numerical results from the CAMB code and with the latest observational data. The analytic results agree with the numerical ones on large and moderate scales. The peak positions are in good agreement with the numerical results on these scales, while the peak heights agree to within 20%, owing to the approximations made in these derivations. Several interesting properties of CMB polarization are also revealed by the analytic spectra.

  1. The effect of inclined soil layers on surface vibration from underground railways using a semi-analytical approach

    NASA Astrophysics Data System (ADS)

    Jones, S.; Hunt, H.

    2009-08-01

    Ground vibration due to underground railways is a significant source of disturbance for people living or working near subways. The numerical models used to predict vibration levels have inherent uncertainty which must be understood to give confidence in the predictions. A semi-analytical approach is developed herein to investigate the effect of soil layering on the surface vibration of a halfspace, where both soil properties and layer inclination angles are varied. The study suggests that both the material properties and the inclination angle of the layers have a significant effect (±10 dB) on the surface vibration response.

  2. Analytical, anthropometric and dietary factors associated with the development of fibrosis in patients with nonalcoholic fatty liver disease.

    PubMed

    Gómez de la Cuesta, Sara; Aller de la Fuente, Rocío; Tafur Sánchez, Carla; Izaola, Olatz; García Sánchez, Concepción; Mora, Natalia; González Hernández, Jose Manuel; de Luis Román, Daniel

    2018-05-01

    A prolonged non-alcoholic steatohepatitis (NASH) condition can lead to advanced stages of liver disease and the development of hepatocellular carcinoma. The aim was to evaluate analytical, anthropometric and dietary factors associated with the presence of fibrosis, as this is the factor that most influences survival and evolution. Seventy-six patients with liver biopsy-diagnosed non-alcoholic fatty liver disease (NAFLD) were included. Biopsies were scored according to the NASH criteria of Kleiner. Analytical, anthropometric and dietary (survey) parameters were obtained. The NAFLD Fibrosis Score (NAFLD-FS), a non-invasive fibrosis index, was assessed for each patient. Leptin, adiponectin, resistin and TNF-alpha serum levels were determined. Fifty-six patients were male (73.7%) and the mean age was 44.5 ± 11.3 years (range, 19-68). Thirty-nine patients (51.3%) (F1-F2: 84.6%; F3-4: 15.4%) had fibrosis in the liver biopsy. Seventeen females (85%) had fibrosis versus 22 males (39%), which was statistically significant by univariate analysis (p < 0.01). Patients with advanced fibrosis were older, with lower platelet counts, lower serum albumin, greater homeostatic model assessment insulin resistance (HOMA-IR), a lower dietary lipid percentage, higher serum leptin levels and higher NAFLD-FS values. This index had a negative predictive value of 98% and a positive predictive value of 60% for the detection of fibrosis. Variables independently associated with fibrosis (logistic regression) included male gender (protective factor) (0.09, 95% CI 0.01-0.7; p < 0.05) and HOMA-IR (1.7, 95% CI 1.03-2.79; p < 0.05). Gender and HOMA-IR were the only independent factors associated with fibrosis. NAFLD-FS could be considered as an accurate scoring system to rule out advanced fibrosis.

  3. An analytical approach to the forensic identification of different classes of new psychoactive substances (NPSs) in seized materials.

    PubMed

    Strano Rossi, Sabina; Odoardi, Sara; Gregori, Adolfo; Peluso, Giuseppe; Ripani, Luigi; Ortar, Giorgio; Serpelloni, Giovanni; Romolo, Francesco Saverio

    2014-09-15

    New psychoactive substances (NPSs) are rapidly spreading worldwide, and forensic laboratories are often requested to identify new substances for which no reference standards or analytical data are available. This article describes an analytical approach that was adopted in Italy by a few collaborative centres of the Italian Early Warning System for Drugs, which has contributed many alerts for the identification of different classes of NPSs in the last 24 months. Seized crystals and powders were initially analysed via single quadrupole gas chromatography/mass spectrometry (GC/MS), followed by liquid chromatography/high-resolution mass spectrometry (LC/HRMS) in the positive electrospray ionisation (ESI) mode at 100,000 full width at half maximum resolution (FWHM) without fragmentation to elucidate the elemental compositions of unknown molecules. Different fragmentation voltages during LC/HRMS were applied to study the accurate masses of the obtained characteristic fragments. Nuclear magnetic resonance (NMR) analyses were performed to identify specific isomers when necessary. Some interesting examples of unknown NPSs from seizures later identified in our laboratories are reported, with special focus on those cases where analytical standards were not available during analyses. These cases include cathinones, such as 3-methylmethcathinone (3-MMC), methylone, bk-MBDB (butylone), 4-methylethcathinone (4-MEC), flephedrone, methylenedioxypyrovalerone (MDPV) and pentedrone, methoxetamine, apinaca or AKB48, benzydamine, meta-chlorophenylpiperazine (m-CPP), 5-MeO-N,N-dialkyl tryptamines, such as 5-MeO-DALT and 5-MeOMIPT, benzofurans, such as 6-APB and 4-APB, and diphenidine (identified for the first time in Europe). The identification of NPSs in confiscated materials was successfully achieved via GC/MS coupled with LC/HRMS and, in a few cases, NMR analyses. The availability of GC/MS libraries is of great assistance in the identification of new drugs. 

  4. An analytical hierarchy process-based study on the factors affecting legislation on plastic bags in the USA.

    PubMed

    Li, Zhongguo; Zhao, Fu

    2017-08-01

    Annually, a large number of used plastic shopping bags are released into the environment, posing significant threats to public health and wildlife. Owing to these concerns, many local, regional, and national governments around the world have passed legislation to ban or restrict the use of plastic shopping bags. However, in the USA only 18 states have approved plastic bag bans/fees, and even within these states the regulations do not cover all cities or counties. Many factors could affect the development and implementation of these regulations. This article employs an analytical hierarchy process to analyse the factors that could impact the enactment of plastic bag regulations. Five impact factors are identified based on statistical data: geographical location, industry interests, cost of living, level of economic development, and educational level of the population. The weights of the five impact factors are determined, and it is found that the likelihood of banning or restricting plastic bags follows a consistent pattern across states.
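As a sketch of the analytical hierarchy process used in studies like this one, the snippet below derives factor weights from a pairwise comparison matrix via the standard geometric-mean approximation of the principal eigenvector and checks Saaty's consistency ratio. The judgment values are invented for illustration and are not the paper's.

```python
import numpy as np

# Hypothetical 1-9 scale pairwise comparisons among five factors
# (rows/columns in the same order as the list of factor names).
factors = ["location", "industry interests", "cost of living",
           "economic development", "education"]
A = np.array([
    [1,   3,   5,   3,   5],
    [1/3, 1,   3,   1,   3],
    [1/5, 1/3, 1,   1/3, 1],
    [1/3, 1,   3,   1,   3],
    [1/5, 1/3, 1,   1/3, 1],
], dtype=float)

# Geometric-mean approximation of the principal eigenvector.
g = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = g / g.sum()

# Consistency ratio: lambda_max from A @ w; random index RI = 1.12
# for a 5x5 matrix. CR < 0.1 indicates acceptably consistent judgments.
lam = float((A @ weights / weights).mean())
CI = (lam - A.shape[0]) / (A.shape[0] - 1)
CR = CI / 1.12
```

With these (made-up) judgments, "location" dominates the weight vector, and the consistency ratio stays well below the conventional 0.1 threshold.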

  5. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  6. A Transformational Approach to Slip-Slide Factoring

    ERIC Educational Resources Information Center

    Steckroth, Jeffrey

    2015-01-01

    In this "Delving Deeper" article, the author introduces the slip-slide method for solving Algebra 1 mathematics problems. This article compares the traditional method approach of trial and error to the slip-slide method of factoring. Tools that used to be taken for granted now make it possible to investigate relationships visually,…

  7. A fit-for-purpose approach to analytical sensitivity applied to a cardiac troponin assay: time to escape the 'highly-sensitive' trap.

    PubMed

    Ungerer, Jacobus P J; Pretorius, Carel J

    2014-04-01

    Highly-sensitive cardiac troponin (cTn) assays are being introduced into the market. In this study we argue that the classification of cTn assays into sensitive and highly-sensitive is flawed and recommend a more appropriate way to characterize the analytical sensitivity of cTn assays. The raw data of 2252 cardiac troponin I (cTnI) tests performed in duplicate with a 'sensitive' assay were extracted and used to calculate the cTnI levels in all samples, including those below the 'limit of detection' (LoD) that were censored. Duplicate results were used to determine analytical imprecision. We show that cTnI can be quantified in all samples, including those with levels below the LoD, and that the actual margins of error decrease as concentrations approach zero. The dichotomous classification of cTn assays into sensitive and highly-sensitive is theoretically flawed; characterizing analytical sensitivity as a continuous variable, based on imprecision at 0 and at the 99th percentile cut-off, would be more appropriate.
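The duplicate-based imprecision estimate described above can be sketched as follows. The data here are simulated (the study's 2252 real pairs are not available), with an assumed constant analytical SD; the within-pair SD is then recovered from the duplicate differences with Dahlberg's formula:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated duplicate cTnI results (arbitrary units).
true_levels = rng.lognormal(mean=1.0, sigma=1.0, size=500)
noise_sd = 0.5  # assumed analytical SD, constant across the range
pairs = true_levels[:, None] + rng.normal(0.0, noise_sd, size=(500, 2))

# Within-pair analytical SD from duplicates (Dahlberg's formula):
# SD = sqrt( sum(d_i^2) / (2 n) ), where d_i is the difference
# between the two members of duplicate pair i.
d = pairs[:, 0] - pairs[:, 1]
sd_within = np.sqrt(np.sum(d**2) / (2 * len(d)))
```

The estimate recovers the injected analytical SD without any reference to a limit of detection, which is the point of treating sensitivity as a continuous imprecision profile rather than a cut-off.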

  8. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  9. An analytics approach to designing patient centered medical homes.

    PubMed

    Ajorlou, Saeede; Shams, Issac; Yang, Kai

    2015-03-01

    Recently, the patient-centered medical home (PCMH) model has become a popular team-based approach focused on delivering more streamlined care to patients. In current practices of medical homes, a clinically based prediction frame is recommended because it can help match the portfolio capacity of PCMH teams with the actual load generated by a set of patients. Without such balance in clinical supply and demand, issues such as excessive under- and over-utilization of physicians, long waiting times for receiving the appropriate treatment, and non-continuity of care will eliminate many advantages of the medical home strategy. In this paper, using a hierarchical generalized linear model with multivariate responses, we develop a clinical workload prediction model for care portfolio demands in a Bayesian framework. The model allows for heterogeneous variances and unstructured covariance matrices for nested random effects that arise through complex hierarchical care systems. We show that using a multivariate approach substantially enhances the precision of workload predictions at both primary and non-primary care levels. We also demonstrate that care demands depend not only on patient demographics but also on other utilization factors, such as length of stay. Our analysis of recent data from the Veterans Health Administration further indicates that risk adjustment for patient health conditions can considerably improve the predictive power of the model.

  10. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  11. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    PubMed Central

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  12. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    PubMed

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  13. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axes, multi-factor stacked bar charts and interactive semi-automated filtering for input and output data, together with automatic sensitivity analysis. This guides users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. In the first example, our Visual Analytics approach enabled the analyst to observe how element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered to be at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. In the second example, our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen.

  14. Recursive mass matrix factorization and inversion: An operator approach to open- and closed-chain multibody dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Kreutz, K.

    1988-01-01

    This report advances a linear operator approach for analyzing the dynamics of systems of joint-connected rigid bodies. It is established that the mass matrix M for such a system can be factored as M = (I + HφL)D(I + HφL)^T. This yields an immediate inversion M^-1 = (I - HψL)^T D^-1 (I - HψL), where H and φ are given by known link geometric parameters, and L, ψ and D are obtained recursively by a spatial discrete-step Kalman filter and by the corresponding Riccati equation associated with this filter. The factors (I + HφL) and (I - HψL) are lower triangular matrices which are inverses of each other, and D is a diagonal matrix. This factorization and inversion of the mass matrix lead to recursive algorithms for forward dynamics based on spatially recursive filtering and smoothing. The primary motivation for advancing the operator approach is to provide a better means to formulate, analyze and understand spatial recursions in multibody dynamics. This is achieved because the linear operator notation allows manipulation of the equations of motion within a very high-level analytical framework (a spatial operator algebra) that is easy to understand and use. Detailed lower-level recursive algorithms can readily be obtained by inspection from the expressions involving spatial operators. The report consists of two main sections. In Part 1, the problem of serial chain manipulators is analyzed and solved. Extensions to a closed-chain system formed by multiple manipulators moving a common task object are contained in Part 2. To retain ease of exposition, only these two types of multibody systems are considered; however, the same methods can easily be applied to arbitrary multibody systems formed by a collection of joint-connected rigid bodies.
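The structure of this factorization can be checked numerically. Below, a random unit lower-triangular matrix stands in for (I + HφL) and a positive diagonal for D; these matrices are illustrative stand-ins, not derived from any particular manipulator model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# U plays the role of (I + H*phi*L): unit lower triangular.
U = np.tril(rng.normal(size=(n, n)), k=-1) + np.eye(n)
# D: positive diagonal, as required for a valid mass matrix.
D = np.diag(rng.uniform(1.0, 3.0, size=n))

M = U @ D @ U.T                      # M = (I+HφL) D (I+HφL)^T
V = np.linalg.inv(U)                 # plays the role of (I - HψL)
M_inv = V.T @ np.linalg.inv(D) @ V   # M^-1 = (I-HψL)^T D^-1 (I-HψL)

check = M @ M_inv                    # should be the identity
```

The point of the operator result is that U, V and D need never be formed explicitly; they emerge from the recursive Kalman filter sweep, but the algebraic identities above are what make the O(n) forward-dynamics recursions valid.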

  15. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. Taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for refining the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.

  16. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. Taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for refining the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  17. Demographic Factors, Personality, and Ability as Predictors of Learning Approaches

    ERIC Educational Resources Information Center

    Xie, Qiuzhi; Zhang, Li-fang

    2015-01-01

    This study investigated the extent to which learning approaches can be accounted for by personal factors (i.e., demographics, ability, and personality). The participants were 443 students in a university in mainland China. The Revised Two-factor Study Process Questionnaire, the NEO Five-Factor Inventory-3, and the short form of Raven's Advanced…

  18. Integrating bioassays and analytical chemistry as an improved approach to support safety assessment of food contact materials.

    PubMed

    Veyrand, Julien; Marin-Kuan, Maricel; Bezencon, Claudine; Frank, Nancy; Guérin, Violaine; Koster, Sander; Latado, Hélia; Mollergues, Julie; Patin, Amaury; Piguet, Dominique; Serrant, Patrick; Varela, Jesus; Schilter, Benoît

    2017-10-01

    Food contact materials (FCM) contain chemicals which can migrate into food and result in human exposure. Although it is mandatory to ensure that migration does not endanger human health, there is still no consensus on how to pragmatically assess the safety of FCM, since traditional approaches would require extensive toxicological and analytical testing, which is expensive and time consuming. Recently, the combination of bioassays, analytical chemistry and risk assessment has been promoted as a new paradigm to identify toxicologically relevant molecules and address safety issues. However, there has been debate on the actual value of bioassays in that framework. In the present work, an FCM anticipated to release the endocrine-active chemical 4-nonylphenol (4NP) was used as a model. In a migration study, the leaching of 4NP was confirmed by LC-MS/MS and GC-MS. This was correlated with an increase in both estrogenic and anti-androgenic activities as measured with bioassays. A standard risk assessment indicated that, according to the food intake scenario applied, the level of 4NP measured was lower than, close to, or slightly above the acceptable daily intake. Altogether these results show that bioassays can reveal the presence of an endocrine-active chemical in a real-case FCM migration study. The levels reported were relevant for safety assessment. In addition, this work also highlighted that bioactivity measured in a migrate does not necessarily represent a safety issue. In conclusion, together with analytics, bioassays contribute to identifying toxicologically relevant molecules leaching from FCM and enable improved safety assessment.

  19. Penetrating the Fog: Analytics in Learning and Education

    ERIC Educational Resources Information Center

    Siemens, George; Long, Phil

    2011-01-01

    Attempts to imagine the future of education often emphasize new technologies--ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that people cannot actually touch or see: "big data and analytics." Learning analytics is still in…

  20. Gradient retention prediction of acid-base analytes in reversed phase liquid chromatography: a simplified approach for acetonitrile-water mobile phases.

    PubMed

    Andrés, Axel; Rosés, Martí; Bosch, Elisabeth

    2014-11-28

    In previous work, a two-parameter model to predict the chromatographic retention of ionizable analytes in gradient mode was proposed. However, the procedure required some preliminary experimental work to obtain a suitable description of the pKa change with mobile phase composition. In the present study this preliminary experimental work has been simplified: the analyte pKa values are calculated through equations whose coefficients vary depending on the functional group. This new approach also required further simplifications regarding the retention of the fully neutral and fully ionized species. After the simplifications were applied, new predicted values were obtained and compared with the previously acquired experimental data. The simplified model gave reasonably good predictions while saving a significant amount of time and resources. Copyright © 2014 Elsevier B.V. All rights reserved.
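The role of pKa in such retention models can be illustrated with the standard textbook expression for the retention factor of a monoprotic acid as a mole-fraction weighted average of its neutral and ionized forms. This is a generic relation, not the authors' full gradient model, and the numbers below are invented:

```python
def retention_factor(pH, pKa, k_neutral, k_ionized):
    """Retention factor of a monoprotic acid, weighted by the
    fractions of the neutral (HA) and ionized (A-) species."""
    frac_ionized = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return (1.0 - frac_ionized) * k_neutral + frac_ionized * k_ionized

# Hypothetical acid: pKa 4.5, well-retained neutral form (k = 10),
# nearly unretained ionized form (k = 0.5).
k_low = retention_factor(pH=2.5, pKa=4.5, k_neutral=10.0, k_ionized=0.5)
k_mid = retention_factor(pH=4.5, pKa=4.5, k_neutral=10.0, k_ionized=0.5)
k_high = retention_factor(pH=6.5, pKa=4.5, k_neutral=10.0, k_ionized=0.5)
```

Because retention drops steeply as the pH crosses the pKa, any error in the predicted pKa shift with acetonitrile content propagates directly into the retention prediction, which is why the paper's pKa simplification is the critical step.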

  1. Analyzing Response Times in Tests with Rank Correlation Approaches

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jorg-Tobias

    2013-01-01

    It is common practice to log-transform response times before analyzing them with standard factor analytical methods. However, sometimes the log-transformation is not capable of linearizing the relation between the response times and the latent traits. Therefore, a more general approach to response time analysis is proposed in the current…

  2. Introducing Text Analytics as a Graduate Business School Course

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2011-01-01

    Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…

  3. A Factor Graph Approach to Automated GO Annotation.

    PubMed

    Spetale, Flavio E; Tapia, Elizabeth; Krsticevic, Flavia; Roda, Fernando; Bulacio, Pilar

    2016-01-01

    As the volume of genomic data grows, computational methods become essential for providing a first glimpse of gene annotations. Automated Gene Ontology (GO) annotation methods based on hierarchical ensemble classification techniques are particularly interesting when interpretability of annotation results is a main concern. In these methods, raw GO-term predictions computed by base binary classifiers are leveraged by checking the consistency of predefined GO relationships. Both formal leveraging strategies, with a main focus on annotation precision, and heuristic alternatives, with a main focus on scalability issues, have been described in the literature. In this contribution, a factor graph approach to the hierarchical ensemble formulation of the automated GO annotation problem is presented. In this formal framework, a core factor graph is first built based on the GO structure and then enriched to take into account the noisy nature of GO-term predictions. Hence, starting from raw GO-term predictions, an iterative message passing algorithm between nodes of the factor graph is used to compute marginal probabilities of target GO-terms. Evaluations on Saccharomyces cerevisiae, Arabidopsis thaliana and Drosophila melanogaster protein sequences from the GO Molecular Function domain showed significant improvements over competing approaches, even when protein sequences were naively characterized by their physicochemical and secondary structure properties or when loose, noisy annotation datasets were considered. Based on these promising results, and using Arabidopsis thaliana annotation data, we extend our approach to the identification of the most promising molecular function annotations for a set of proteins of unknown function in Solanum lycopersicum.
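A minimal sketch of the consistency-enforcing idea (not the paper's algorithm): on a toy two-term parent-child GO chain, a factor that forbids an annotated child under an unannotated parent (the true-path rule) reshapes raw classifier scores into consistent marginals. The scores are invented, and exact enumeration is used here since message passing on a tree yields the same marginals:

```python
from itertools import product

# Raw P(term = 1) from hypothetical base binary classifiers.
raw = {"parent": 0.40, "child": 0.90}

def consistency(parent, child):
    """True-path factor: an annotated child with an unannotated
    parent has zero probability; all other states are allowed."""
    return 0.0 if (child == 1 and parent == 0) else 1.0

# Exact marginals over the 4 joint states of (parent, child).
Z = 0.0
marg = {"parent": 0.0, "child": 0.0}
for p, c in product((0, 1), repeat=2):
    w = ((raw["parent"] if p else 1 - raw["parent"])
         * (raw["child"] if c else 1 - raw["child"])
         * consistency(p, c))
    Z += w
    marg["parent"] += w * p
    marg["child"] += w * c
marg = {k: v / Z for k, v in marg.items()}
```

The consistency factor pulls the parent's belief up and the child's down until the child is never more probable than its parent, which is the behavior the factor graph enforces across the whole GO hierarchy.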

  4. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations made to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted to control the most important consumer's risks involved, at two levels, in analytical decisions within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
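The equivalence-based comparison can be sketched with a two one-sided tests (TOST) procedure. The data and the ±2% margin below are invented for illustration and are not Sanofi's actual acceptance criterion; a normal approximation replaces the t distribution to keep the sketch dependency-free:

```python
from math import erf, sqrt
from statistics import mean, stdev

# Hypothetical assay results (% of label claim) at each laboratory.
sending = [99.1, 100.2, 99.8, 100.5, 99.6, 100.1,
           99.9, 100.4, 99.7, 100.0, 100.3, 99.5]
receiving = [99.5, 100.4, 100.1, 100.8, 99.9, 100.5,
             100.2, 100.6, 100.0, 100.3, 100.7, 99.8]
delta = 2.0  # assumed equivalence margin, in % of label claim

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

n1, n2 = len(sending), len(receiving)
diff = mean(receiving) - mean(sending)
sp = sqrt(((n1 - 1) * stdev(sending) ** 2 +
           (n2 - 1) * stdev(receiving) ** 2) / (n1 + n2 - 2))
se = sp * sqrt(1 / n1 + 1 / n2)

# Two one-sided tests: both nulls (diff <= -delta and diff >= +delta)
# must be rejected to conclude the laboratories are equivalent.
p_lower = 1.0 - norm_cdf((diff + delta) / se)
p_upper = norm_cdf((diff - delta) / se)
p_tost = max(p_lower, p_upper)
equivalent = p_tost < 0.05
```

Unlike a difference test, TOST places the burden of proof on demonstrating similarity, which directly controls the receiving laboratory's consumer risk described in the abstract.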

  5. Theory of ground state factorization in quantum cooperative systems.

    PubMed

    Giampaolo, Salvatore M; Adesso, Gerardo; Illuminati, Fabrizio

    2008-05-16

    We introduce a general analytic approach to the study of factorization points and factorized ground states in quantum cooperative systems. The method allows us to determine rigorously the existence, location, and exact form of separable ground states in a large variety of, generally nonexactly solvable, spin models belonging to different universality classes. The theory applies to translationally invariant systems, irrespective of spatial dimensionality, and for spin-spin interactions of arbitrary range.

  6. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input that drives the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study and assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study, based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma, was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally done. Results for all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, both qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space, rather than a single condition, for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved

  7. New-generation bar adsorptive microextraction (BAμE) devices for a better eco-user-friendly analytical approach-Application for the determination of antidepressant pharmaceuticals in biological fluids.

    PubMed

    Ide, A H; Nogueira, J M F

    2018-05-10

    The present contribution aims to design new-generation bar adsorptive microextraction (BAμE) devices that promote an innovative and much more user-friendly analytical approach. The novel BAμE devices were prepared in-house with smaller dimensions, using flexible nylon-based supports (7.5 × 1.0 mm) coated with convenient sorbents (≈0.5 mg). This advance allows effective microextraction and back-extraction ('only a single liquid desorption step') stages, as well as enhanced interfacing with the instrumental systems dedicated to routine analysis. To evaluate these improvements, four antidepressant agents (bupropion, citalopram, amitriptyline and trazodone) were used as model compounds in aqueous media combined with liquid chromatography (LC) systems. By using an N-vinylpyrrolidone-based polymer phase, good selectivity and efficiency were obtained. Assays performed on 25 mL spiked aqueous samples yielded average recoveries ranging from 67.8 ± 12.4% (bupropion) to 88.3 ± 12.1% (citalopram) under optimized experimental conditions. The analytical performance also showed convenient precision (RSD < 12%) and detection limits (50 ng L⁻¹), as well as linear dynamic ranges (160-2000 ng L⁻¹) with suitable determination coefficients (r² > 0.9820). The application of the proposed analytical approach to biological fluids showed negligible matrix effects when using the standard addition methodology. From the data obtained, the new-generation BAμE devices presented herein provide an innovative and robust analytical cycle, are simple to prepare, cost-effective, user-friendly and compatible with current LC autosampler systems. Furthermore, the novel devices were designed to be disposable and used with negligible amounts of organic solvents (100 μL) during back-extraction, in compliance with green analytical chemistry principles. In short, the new-generation BAμE devices showed to be

  8. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    PubMed

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate the conditioning-related properties of our originally formulated statistical model-based iterative approach to the image-reconstruction-from-projections problem and, in this manner, to demonstrate the superiority of this approach over those recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of measured signals obtained from an X-ray computed tomography system with parallel-beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology which is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. ASVCP quality assurance guidelines: control of preanalytical, analytical, and postanalytical factors for urinalysis, cytology, and clinical chemistry in veterinary laboratories.

    PubMed

    Gunn-Christie, Rebekah G; Flatland, Bente; Friedrichs, Kristen R; Szladovits, Balazs; Harr, Kendal E; Ruotsalo, Kristiina; Knoll, Joyce S; Wamsley, Heather L; Freeman, Kathy P

    2012-03-01

    In December 2009, the American Society for Veterinary Clinical Pathology (ASVCP) Quality Assurance and Laboratory Standards committee published the updated and peer-reviewed ASVCP Quality Assurance Guidelines on the Society's website. These guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports: (1) general analytical factors for veterinary laboratory performance and comparisons; (2) hematology, hemostasis, and crossmatching; and (3) clinical chemistry, cytology, and urinalysis. This particular report is one of 3 reports and documents recommendations for control of preanalytical, analytical, and postanalytical factors related to urinalysis, cytology, and clinical chemistry in veterinary laboratories and is adapted from sections 1.1 and 2.2 (clinical chemistry), 1.3 and 2.5 (urinalysis), 1.4 and 2.6 (cytology), and 3 (postanalytical factors important in veterinary clinical pathology) of these guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimal guidelines for quality assurance and quality control for veterinary laboratory testing and a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. © 2012 American Society for Veterinary Clinical Pathology.

  10. Responding to obesity in Brazil: understanding the international and domestic politics of policy reform through a nested analytic approach to comparative analysis.

    PubMed

    Gómez, Eduardo J

    2015-02-01

    Why do governments pursue obesity legislation? And is the case of Brazil unique compared with other nations when considering the politics of policy reform? Using a nested analytic approach to comparative research, I found that theoretical frameworks accounting for why nations implement obesity legislation were not supported with cross-national statistical evidence. I then turned to the case of Brazil's response to obesity at three levels of government, national, urban, and rural, to propose alternative hypotheses for why nations pursue obesity policy. The case of Brazil suggests that the reasons that governments respond are different at these three levels. International forces, historical institutions, and social health movements were factors that prompted national government responses. At the urban and rural government levels, receiving federal financial assistance and human resource support appeared to be more important. The case of Brazil suggests that the international and domestic politics of responding to obesity are highly complex and that national and subnational political actors have different perceptions and interests when pursuing obesity legislation. Copyright © 2015 by Duke University Press.

  11. Fluence correction factors for graphite calorimetry in a low-energy clinical proton beam: I. Analytical and Monte Carlo simulations.

    PubMed

    Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A

    2013-05-21

    The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed using water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, kfl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity, and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons, the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths, predominantly due to the contributions from alpha particles, and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by kfl = 0.9964 + 0.0024·z_w-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by kfl = 0.9947 + 0.0024·z_w-eq with a relative standard uncertainty of 0.3%. These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence
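    The two fitted expressions quoted in the abstract are simple linear functions of the water-equivalent depth z_w-eq. As a minimal sketch using only the coefficients reported above (the stated uncertainty bands are not modeled), they can be evaluated directly:

    ```python
    # Fluence correction factors from the abstract's two linear fits:
    #   kfl = 0.9964 + 0.0024 * z_weq  (from fluence distributions)
    #   kfl = 0.9947 + 0.0024 * z_weq  (from a ratio of calculated doses)
    # z_weq is the water-equivalent depth.

    def kfl_from_fluence(z_weq: float) -> float:
        """Correction factor derived from fluence distributions differential in energy."""
        return 0.9964 + 0.0024 * z_weq

    def kfl_from_dose(z_weq: float) -> float:
        """Correction factor derived from a ratio of calculated doses."""
        return 0.9947 + 0.0024 * z_weq

    if __name__ == "__main__":
        for z in (0.0, 1.0, 2.0, 3.0):
            print(z, round(kfl_from_fluence(z), 4), round(kfl_from_dose(z), 4))
    ```

    At the surface (z_w-eq = 0) the fluence-derived factor is 0.9964, rising by 0.24% per unit of water-equivalent depth, consistent with the near-unity values described in the abstract.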

  12. A terahertz performance of hybrid single walled CNT based amplifier with analytical approach

    NASA Astrophysics Data System (ADS)

    Kumar, Sandeep; Song, Hanjung

    2018-01-01

    This work focuses on the terahertz performance of a hybrid single-walled carbon nanotube (CNT) based amplifier, proposed for soil-parameter measurement applications. The proposed circuit topology provides a hybrid structure which achieves a wide impedance bandwidth of 0.33 THz within the range of 1.07 THz to 1.42 THz, a fractional bandwidth of 28%. The single-walled RF CNT network realizes the proposed goal and proves its ability to resonate at 1.25 THz, as shown by an analytical approach. Moreover, an RF microstrip transmission line radiator used as a compensator in the circuit topology achieves more than 30 dB of gain. A proper methodology is chosen for achieving circuit-level stability in order to obtain the desired optimal conditions. The fundamental approach optimizes the matched impedance condition at (50+j0) Ω and the noise variation with the impact of series resistances for the proposed hybrid circuit topology, and demonstrates the accuracy of the performance parameters at the circuit level. The chip was fabricated in a commercial 45 nm RF CMOS process, yielding promising results consistent with simulation. Additionally, power measurement analysis achieves a highest output power of 26 dBm with a power-added efficiency of 78%. The minimum noise figure achieved, from 0.6 dB down to 0.4 dB, is an outstanding result for a circuit topology in the terahertz range. The chip area of the hybrid circuit is 0.65 mm² and the power consumption is 9.6 mW.

  13. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace concentrations of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Characteristics, Properties and Analytical Methods of Amoxicillin: A Review with Green Approach.

    PubMed

    de Marco, Bianca Aparecida; Natori, Jéssica Sayuri Hisano; Fanelli, Stefany; Tótoli, Eliane Gandolpho; Salgado, Hérida Regina Nunes

    2017-05-04

    Bacterial infections are the second leading cause of global mortality. Considering this fact, it is extremely important to study antimicrobial agents. Amoxicillin is an antimicrobial agent that belongs to the class of penicillins; it has bactericidal activity and is widely used in the Brazilian health system. In the literature, several analytical methods are found for the identification and quantification of this penicillin; these are essential for its quality control, which ensures the maintenance of product characteristics, therapeutic efficacy and patient safety. Thus, this study presents a brief literature review on amoxicillin and the analytical methods developed for the analysis of this drug in official and scientific papers. The major analytical methods found were high-performance liquid chromatography (HPLC), ultra-performance liquid chromatography (U-HPLC), capillary electrophoresis, iodometry and diffuse reflectance infrared Fourier transform spectroscopy. It is essential to note that most of the developed methods use toxic and hazardous solvents, which makes it necessary for industries and researchers to develop environmentally friendly techniques that provide enhanced benefits to the environment and staff.

  15. Analytical modeling of light transport in scattering materials with strong absorption.

    PubMed

    Meretska, M L; Uppu, R; Vissenberg, G; Lagendijk, A; Ijzerman, W L; Vos, W L

    2017-10-02

    We have investigated the transport of light through slabs that both scatter and strongly absorb, a situation that occurs in diverse application fields ranging from biomedical optics and powder technology to solid-state lighting. In particular, we study the transport of light in the visible wavelength range between 420 and 700 nm through silicone plates filled with YAG:Ce³⁺ phosphor particles, which even re-emit absorbed light at different wavelengths. We measure the total transmission, the total reflection, and the ballistic transmission of light through these plates. We obtain average single-particle properties, namely the scattering cross-section σ_s, the absorption cross-section σ_a, and the anisotropy factor µ, using an analytical approach, the P3 approximation to the radiative transfer equation. We verify the extracted transport parameters using Monte-Carlo simulations of the light transport. Our approach fully describes the light propagation in phosphor diffuser plates that are used in white LEDs and that reveal a strong absorption (L/l_a > 1) up to L/l_a = 4, where L is the slab thickness and l_a is the absorption mean free path. In contrast, the widely used diffusion theory fails to describe this parameter range. Our approach is a suitable analytical tool for industry, since it provides a fast yet accurate determination of key transport parameters, and since it introduces predictive power into the design process of white light-emitting diodes.

  16. Technosocial Predictive Analytics in Support of Naturalistic Decision Making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Malone, Elizabeth L.

    2009-06-23

    A main challenge we face in fostering sustainable growth is to anticipate outcomes through predictive and proactive analysis across domains as diverse as energy, security, the environment, health and finance, in order to maximize opportunities, influence outcomes and counter adversities. The goal of this paper is to present new methods for anticipatory analytical thinking which address this challenge through the development of a multi-perspective approach to predictive modeling at the core of a creative decision-making process. This approach is uniquely multidisciplinary in that it strives to create decision advantage through the integration of human and physical models, and it leverages knowledge management and visual analytics to support creative thinking by facilitating the achievement of interoperable knowledge inputs and enhancing the user's cognitive access. We describe a prototype system which implements this approach and exemplify its functionality with reference to a use case in which predictive modeling is paired with analytic gaming to support collaborative decision-making in the domain of agricultural land management.

  17. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying video analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to automatically detect a "Visual Event", and EventBrowser, which serves to display and peruse the "Visual Details" captured at the "Visual Event". To deal with both open-architecture and closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  18. Anisotropies in the cosmic microwave background: an analytic approach

    NASA Astrophysics Data System (ADS)

    Hu, Wayne; Sugiyama, Naoshi

    1995-05-01

    We introduce a conceptually simple yet powerful analytic method which traces the structure of cosmic microwave background anisotropies to better than 5%-10% in temperature fluctuations on all scales. It is applicable to any model in which the gravitational potential is known and last scattering is sufficiently early. Moreover, it recovers and explains the presence of the 'Doppler peaks' at degree scales as driven acoustic oscillations of the photon-baryon fluid. We treat in detail such subtleties as the time dependence of the gravitational driving force, anisotropic stress from the neutrino quadrupole, and damping during the recombination process, again all from an analytic standpoint. We apply this formalism to the standard cold dark matter model to gain physical insight into the anisotropies, including the dependence of the peak locations and heights on cosmological parameters such as Omega_b and h. Furthermore, the ionization history controls damping due to the finite thickness of the last scattering surface, which is in fact mainly caused by photon diffusion. In addition to being a powerful probe into the nature of anisotropies, this treatment can be used in place of the standard Boltzmann code where 5%-10% accuracy in temperature fluctuations is satisfactory and/or speed is essential. Equally importantly, it can be used as a portable standard by which numerical codes can be tested and compared.

  19. Including scattering within the room acoustics diffusion model: An analytical approach.

    PubMed

    Foy, Cédric; Picaut, Judicaël; Valeau, Vincent

    2016-10-01

    Over the last 20 years, a statistical acoustic model has been developed to predict the reverberant sound field in buildings. This model is based on the assumption that the propagation of the reverberant sound field follows a transport process and, as an approximation, a diffusion process that can be easily solved numerically. This model, initially designed and validated for rooms with purely diffuse reflections, is extended in the present study to mixed reflections, with a proportion of specular and diffuse reflections defined by a scattering coefficient. The proposed mathematical developments lead to an analytical expression for the diffusion constant that is a function not only of the scattering coefficient but also of the absorption coefficient of the walls. The results obtained with this extended diffusion model are then compared with the classical diffusion model, as well as with a sound-particle tracing approach considering mixed wall reflections. The comparison shows good agreement for long rooms with uniform low absorption (α = 0.01) and uniform scattering. For a larger absorption (α = 0.1), the agreement is moderate, due to the fact that the proposed expression for the diffusion coefficient does not vary spatially. In addition, the proposed model is for now limited to uniform diffusion and should be extended in the future to more general cases.

  20. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method with regression by partial least squares is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. The adequate selection of spectral regions proved to have an important effect on the number of factors. In order to simultaneously quantify both hydrochlorides among excipients, the spectral region between 250 and 290 nm was selected. Recovery for the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
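    The abstract's method is partial least squares regression; as an illustrative stand-in (not the authors' procedure), the simplest linear multivariate calibration, ordinary least squares against synthetic, hypothetical pure-component spectra, shows how two heavily overlapped bands can still be quantified simultaneously:

    ```python
    import numpy as np

    # Minimal sketch with synthetic data: two analytes whose absorption bands
    # overlap heavily in the 250-290 nm window mentioned in the abstract.
    # Band positions, widths, and concentrations below are invented for
    # illustration only.
    wl = np.linspace(250, 290, 81)                 # wavelengths, nm
    s1 = np.exp(-((wl - 268) / 8) ** 2)            # pure spectrum, analyte 1
    s2 = np.exp(-((wl - 274) / 8) ** 2)            # pure spectrum, analyte 2 (overlapping)
    S = np.column_stack([s1, s2])                  # calibration matrix

    c_true = np.array([0.60, 0.35])                # "unknown" concentrations
    rng = np.random.default_rng(0)
    mixture = S @ c_true + 1e-4 * rng.normal(size=wl.size)  # measured mixture spectrum

    # Resolve both concentrations from the single overlapped spectrum.
    c_est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
    print(np.round(c_est, 3))
    ```

    Both concentrations are recovered from one mixture spectrum without any physical separation; PLS extends this idea by building the calibration from measured standards and a reduced number of latent factors rather than from pure-component spectra.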

  1. A Factor Graph Approach to Automated GO Annotation

    PubMed Central

    Spetale, Flavio E.; Tapia, Elizabeth; Krsticevic, Flavia; Roda, Fernando; Bulacio, Pilar

    2016-01-01

    As the volume of genomic data grows, computational methods become essential for providing a first glimpse into gene annotations. Automated Gene Ontology (GO) annotation methods based on hierarchical ensemble classification techniques are particularly interesting when interpretability of annotation results is a main concern. In these methods, raw GO-term predictions computed by base binary classifiers are leveraged by checking the consistency of predefined GO relationships. Both formal leveraging strategies, with a main focus on annotation precision, and heuristic alternatives, with a main focus on scalability issues, have been described in the literature. In this contribution, a factor graph approach to the hierarchical ensemble formulation of the automated GO annotation problem is presented. In this formal framework, a core factor graph is first built based on the GO structure and then enriched to take into account the noisy nature of GO-term predictions. Hence, starting from raw GO-term predictions, an iterative message passing algorithm between nodes of the factor graph is used to compute marginal probabilities of target GO-terms. Evaluations on Saccharomyces cerevisiae, Arabidopsis thaliana and Drosophila melanogaster protein sequences from the GO Molecular Function domain showed significant improvements over competing approaches, even when protein sequences were naively characterized by their physicochemical and secondary structure properties or when loose noisy annotation datasets were considered. Based on these promising results and using Arabidopsis thaliana annotation data, we extend our approach to the identification of the most promising molecular function annotations for a set of proteins of unknown function in Solanum lycopersicum. PMID:26771463
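    A toy illustration of the hierarchical-consistency idea (not the paper's exact model or code): with two GO terms where the child implies the parent (the true-path rule), raw classifier scores can be treated as noisy evidence and marginals computed by exact enumeration over consistent states, which is what iterative message passing on a factor graph approximates for real GO structures:

    ```python
    from itertools import product

    # Hypothetical base-classifier scores for two GO terms; the edge encodes
    # the true-path rule: if the child term holds, the parent must hold too.
    raw = {"parent": 0.4, "child": 0.9}
    edges = [("parent", "child")]            # (parent, child)

    terms = list(raw)
    weights = {}
    for state in product([0, 1], repeat=len(terms)):
        s = dict(zip(terms, state))
        if any(s[c] and not s[p] for p, c in edges):
            continue                          # drop hierarchy-inconsistent states
        w = 1.0
        for t in terms:
            w *= raw[t] if s[t] else 1 - raw[t]
        weights[tuple(state)] = w

    # Normalize and read off per-term marginal probabilities.
    z = sum(weights.values())
    marginals = {t: sum(w for st, w in weights.items() if st[i]) / z
                 for i, t in enumerate(terms)}
    print({t: round(m, 3) for t, m in marginals.items()})
    ```

    A confidently predicted child pulls the weak parent score up, while the weak parent pulls the child down, exactly the kind of mutual leveraging of raw predictions the abstract describes.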

  2. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.

  3. Accurate mass measurements and their appropriate use for reliable analyte identification.

    PubMed

    Godfrey, A Ruth; Brenton, A Gareth

    2012-09-01

    Accurate mass instrumentation is becoming increasingly available to non-expert users. These data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification is described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.

  4. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist

    NASA Astrophysics Data System (ADS)

    Reveil, Mardochee; Sorg, Victoria C.; Cheng, Emily R.; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O.

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.

  5. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist.

    PubMed

    Reveil, Mardochee; Sorg, Victoria C; Cheng, Emily R; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.
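    For context, the ideal van der Pauw relation (point contacts on a uniform thin sample) is exp(-πR_A/R_s) + exp(-πR_B/R_s) = 1, where R_A and R_B are the two measured four-terminal resistances and R_s is the sheet resistance; the correction factors tabulated in the paper adjust results obtained under this ideal assumption. A minimal numerical solver (a sketch, not the authors' code) can be written with bisection:

    ```python
    import math

    def sheet_resistance(r_a: float, r_b: float) -> float:
        """Solve exp(-pi*Ra/Rs) + exp(-pi*Rb/Rs) = 1 for Rs by bisection.

        Ideal van der Pauw case only; non-ideal conditions (finite contacts,
        finite thickness, asymmetric placement) require correction factors
        such as those tabulated in the paper.
        """
        f = lambda rs: (math.exp(-math.pi * r_a / rs)
                        + math.exp(-math.pi * r_b / rs) - 1.0)
        lo, hi = 1e-9, 1e9                    # bracket: f(lo) < 0 < f(hi)
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if f(mid) > 0.0:                  # f increases monotonically in Rs
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    # Symmetric case Ra = Rb = R has the closed form Rs = pi*R/ln(2).
    print(round(sheet_resistance(10.0, 10.0), 3))
    ```

    Bisection is chosen over Newton's method here because the bracket guarantees convergence for any physical (positive) resistance pair.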

  6. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to address the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops, including "-omics" approaches and target-based approaches, are critically discussed. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  7. A Resource-Constrained Approach to Implementing Analytics in an Institution of Higher Education: An Experience Report

    ERIC Educational Resources Information Center

    Buerck, John P.; Mudigonda, Srikanth P.

    2014-01-01

Academic analytics and learning analytics have been increasingly adopted by academic institutions of higher learning for improving student performance and retention. While several studies have reported the implementation details and the successes of specific analytics initiatives, relatively few studies exist in the literature that describe the…

  8. An Overview of Learning Analytics

    ERIC Educational Resources Information Center

    Clow, Doug

    2013-01-01

    Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical…

  9. Analytical approach for the fractional differential equations by using the extended tanh method

    NASA Astrophysics Data System (ADS)

    Pandir, Yusuf; Yildirim, Ayse

    2018-07-01

In this study, we consider analytical solutions of the space-time fractional foam drainage equation, the nonlinear Korteweg-de Vries equation with time- and space-fractional derivatives, and the time-fractional reaction-diffusion equation using the extended tanh method. The fractional derivatives are defined in the modified Riemann-Liouville sense. As a result, various exact analytical solutions are obtained, consisting of trigonometric function solutions, kink-shaped soliton solutions and new exact solitary wave solutions.
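In its standard form (sketched here from the general literature, not reproduced from the paper), the extended tanh method first reduces the fractional PDE to an ordinary differential equation through a fractional complex transform consistent with the modified Riemann-Liouville derivative, then expands the solution in positive and negative powers of a tanh variable:

```latex
% Fractional complex transform (modified Riemann-Liouville setting):
\xi = k x - \frac{\lambda\, t^{\alpha}}{\Gamma(1+\alpha)}, \qquad u(x,t) = U(\xi)

% Extended tanh ansatz: a finite series in Y and its inverse powers,
% with N fixed by balancing the highest-order derivative term
% against the strongest nonlinearity
U(\xi) = a_0 + \sum_{i=1}^{N} \left( a_i Y^{i} + b_i Y^{-i} \right),
\qquad Y = \tanh(\mu \xi), \qquad \frac{dY}{d\xi} = \mu\,(1 - Y^{2})
```

Substituting the ansatz and collecting powers of Y yields an algebraic system for the coefficients a_i, b_i and the wave parameters; the symbols above follow common usage and are not the paper's specific notation.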

  10. The common risk factor approach: a rational basis for promoting oral health.

    PubMed

    Sheiham, A; Watt, R G

    2000-12-01

Conventional oral health education is neither effective nor efficient. Many oral health programmes are developed and implemented in isolation from other health programmes. This often leads, at best, to a duplication of effort or, at worst, to conflicting messages being delivered to the public. In addition, oral health programmes tend to concentrate on individual behaviour change and largely ignore the influence of socio-political factors as the key determinants of health. Based upon the general principles of health promotion, this paper presents a rationale for an alternative approach to oral health policy. The common risk factor approach addresses risk factors common to many chronic conditions within the context of the wider socio-environmental milieu. Oral health is determined by diet, hygiene, smoking, alcohol use, stress and trauma. As these causes are common to a number of other chronic diseases, adopting a collaborative approach is more rational than one that is disease specific. The common risk factor approach can be implemented in a variety of ways. Food policy development and the Health Promoting Schools initiative are used as examples of effective ways of promoting oral health.

  11. Genome-wide analytical approaches for reverse metabolic engineering of industrially relevant phenotypes in yeast.

    PubMed

    Oud, Bart; van Maris, Antonius J A; Daran, Jean-Marc; Pronk, Jack T

    2012-03-01

    Successful reverse engineering of mutants that have been obtained by nontargeted strain improvement has long presented a major challenge in yeast biotechnology. This paper reviews the use of genome-wide approaches for analysis of Saccharomyces cerevisiae strains originating from evolutionary engineering or random mutagenesis. On the basis of an evaluation of the strengths and weaknesses of different methods, we conclude that for the initial identification of relevant genetic changes, whole genome sequencing is superior to other analytical techniques, such as transcriptome, metabolome, proteome, or array-based genome analysis. Key advantages of this technique over gene expression analysis include the independency of genome sequences on experimental context and the possibility to directly and precisely reproduce the identified changes in naive strains. The predictive value of genome-wide analysis of strains with industrially relevant characteristics can be further improved by classical genetics or simultaneous analysis of strains derived from parallel, independent strain improvement lineages. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  12. An analytical design approach for self-powered active lateral secondary suspensions for railway vehicles

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Li, Hong; Zhang, Jiye; Mei, TX

    2015-10-01

In this paper, an analytical design approach for the development of self-powered active suspensions is investigated and applied to optimise the control system design for an active lateral secondary suspension for railway vehicles. The conditions for energy balance are analysed and the relationship between ride quality improvement and energy consumption is discussed in detail. Modal skyhook control is applied to analyse the energy consumption of this suspension by separating its dynamics into the lateral and yaw modes, and, based on a simplified model, the average power consumption of the actuators is computed in the frequency domain using the power spectral density of the lateral alignment of track irregularities. The impact of control gains and key actuator parameters on performance, both for vibration suppression and for energy recovery/storage, is then analysed. Computer simulation is used to verify the derived energy balance condition and to demonstrate that improved ride comfort is achieved by this self-powered active suspension without any external power supply.
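As a toy illustration of the skyhook idea discussed above (not the paper's modal controller or vehicle model), the sketch below simulates a single-degree-of-freedom body on a spring excited by a sinusoidal "track" input, with an actuator force proportional to the absolute body velocity; all parameter values are invented for illustration.

```python
import math

def simulate(skyhook_c, steps=20000, dt=1e-3):
    """RMS displacement of a 1-DOF body under sinusoidal base excitation.

    Skyhook control: the actuator force opposes the body's *absolute*
    velocity, F = -c_sky * v, as if a damper were anchored to the sky.
    Integration is semi-implicit (symplectic) Euler for stability.
    """
    m, k = 500.0, 2.0e4          # illustrative mass (kg) and stiffness (N/m)
    x, v = 0.0, 0.0              # body displacement (m) and velocity (m/s)
    sum_sq = 0.0
    for n in range(steps):
        t = n * dt
        base = 0.01 * math.sin(2 * math.pi * 1.5 * t)  # lateral track input (m)
        force = -k * (x - base) - skyhook_c * v        # spring + skyhook actuator
        v += (force / m) * dt
        x += v * dt
        sum_sq += x * x
    return math.sqrt(sum_sq / steps)
```

Comparing `simulate(0.0)` with `simulate(5000.0)` shows the skyhook force suppressing the lateral response; a self-powered design would additionally require the recovered energy to cover this actuation on average.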

  13. Genome-wide analytical approaches for reverse metabolic engineering of industrially relevant phenotypes in yeast

    PubMed Central

    Oud, Bart; Maris, Antonius J A; Daran, Jean-Marc; Pronk, Jack T

    2012-01-01

    Successful reverse engineering of mutants that have been obtained by nontargeted strain improvement has long presented a major challenge in yeast biotechnology. This paper reviews the use of genome-wide approaches for analysis of Saccharomyces cerevisiae strains originating from evolutionary engineering or random mutagenesis. On the basis of an evaluation of the strengths and weaknesses of different methods, we conclude that for the initial identification of relevant genetic changes, whole genome sequencing is superior to other analytical techniques, such as transcriptome, metabolome, proteome, or array-based genome analysis. Key advantages of this technique over gene expression analysis include the independency of genome sequences on experimental context and the possibility to directly and precisely reproduce the identified changes in naive strains. The predictive value of genome-wide analysis of strains with industrially relevant characteristics can be further improved by classical genetics or simultaneous analysis of strains derived from parallel, independent strain improvement lineages. PMID:22152095

  14. Modeling Semantic Emotion Space Using a 3D Hypercube-Projection: An Innovative Analytical Approach for the Psychology of Emotions

    PubMed Central

    Trnka, Radek; Lačev, Alek; Balcar, Karel; Kuška, Martin; Tavel, Peter

    2016-01-01

    The widely accepted two-dimensional circumplex model of emotions posits that most instances of human emotional experience can be understood within the two general dimensions of valence and activation. Currently, this model is facing some criticism, because complex emotions in particular are hard to define within only these two general dimensions. The present theory-driven study introduces an innovative analytical approach working in a way other than the conventional, two-dimensional paradigm. The main goal was to map and project semantic emotion space in terms of mutual positions of various emotion prototypical categories. Participants (N = 187; 54.5% females) judged 16 discrete emotions in terms of valence, intensity, controllability and utility. The results revealed that these four dimensional input measures were uncorrelated. This implies that valence, intensity, controllability and utility represented clearly different qualities of discrete emotions in the judgments of the participants. Based on this data, we constructed a 3D hypercube-projection and compared it with various two-dimensional projections. This contrasting enabled us to detect several sources of bias when working with the traditional, two-dimensional analytical approach. Contrasting two-dimensional and three-dimensional projections revealed that the 2D models provided biased insights about how emotions are conceptually related to one another along multiple dimensions. The results of the present study point out the reductionist nature of the two-dimensional paradigm in the psychological theory of emotions and challenge the widely accepted circumplex model. PMID:27148130

  15. Science and the Nonscience Major: Addressing the Fear Factor in the Chemical Arena Using Forensic Science

    ERIC Educational Resources Information Center

    Labianca, Dominick A.

    2007-01-01

    This article describes an approach to minimizing the "fear factor" in a chemistry course for the nonscience major, and also addresses relevant applications to other science courses, including biology, geology, and physics. The approach emphasizes forensic science and affords students the opportunity to hone their analytical skills in an…

  16. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  17. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range of 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
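The abstract's uncertainty bookkeeping follows the usual rule that independent variance components add in quadrature; the sketch below applies that rule to back out the pre-analytical CV from the total and analytical CVs (the quadrature assumption and function names are mine, not the authors').

```python
import math

def pre_analytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV (%) from total and analytical CVs, assuming
    independent components: CV_T^2 = CV_pre^2 + CV_a^2."""
    return math.sqrt(cv_total ** 2 - cv_analytical ** 2)

def uncertainty_interval(cv_total, k=2.0):
    """Half-width of the ~95% uncertainty interval, +/- k * CV_T with k = 2."""
    return k * cv_total
```

For example, a total CV of 30% with a 10% analytical CV implies a pre-analytical CV of about 28%, matching the abstract's observation that the pre-analytical component dominates.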

  18. An analytical approach for the simulation of flow in a heterogeneous confined aquifer with a parameter zonation structure

    NASA Astrophysics Data System (ADS)

    Huang, Ching-Sheng; Yeh, Hund-Der

    2016-11-01

This study introduces an analytical approach to estimate the drawdown induced by well extraction in a heterogeneous confined aquifer with an irregular outer boundary. The aquifer domain is divided into a number of zones according to the zonation method for representing the spatial distribution of a hydraulic parameter field. The lateral boundary of the aquifer can be considered under the Dirichlet, Neumann or Robin condition at different parts of the boundary. Flow across the interface between two zones satisfies the continuities of drawdown and flux. Source points, each of which has an unknown volumetric rate representing the boundary effect on the drawdown, are allocated around the boundary of each zone. The solution for drawdown in each zone is expressed as a series in terms of the Theis equation with unknown volumetric rates from the source points. The rates are then determined based on the aquifer boundary conditions and the continuity requirements. The aquifer drawdown estimated by the present approach agrees well with a finite element solution developed with the Mathematica function NDSolve. Compared with existing numerical approaches, the present approach has the merit of directly computing the drawdown at any given location and time, and therefore takes much less computing time to obtain the required results in engineering applications.
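The building block of the approach, superposing Theis-equation contributions from point sources, can be sketched as follows. The well function W(u) is evaluated from its standard series expansion; a homogeneous single zone is assumed, so none of the paper's zonation or continuity machinery appears here, and all names are illustrative.

```python
import math

EULER_GAMMA = 0.5772156649015329

def well_function(u, terms=60):
    """Theis well function W(u) = E1(u) via its convergent series
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!),
    accurate for the small u typical of field problems."""
    s = -EULER_GAMMA - math.log(u)
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n
        s += sign * u ** n / (n * fact)
        sign = -sign
    return s

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q/(4*pi*T) * W(u), with u = r^2 * S / (4*T*t)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

def superposed_drawdown(sources, T, S, x, y, t):
    """Sum Theis contributions from point sources [(xi, yi, Qi), ...],
    mimicking how source-point contributions superpose within a zone."""
    return sum(theis_drawdown(Q, T, S, math.hypot(x - xi, y - yi), t)
               for (xi, yi, Q) in sources)
```

The paper's method would instead solve for the unknown source rates so that the boundary and interface conditions are honored; here the rates are simply given.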

  19. Women's Career Success: A Factor Analytic Study of Contributing Factors.

    ERIC Educational Resources Information Center

    Gaskill, LuAnn Ricketts

    1991-01-01

A survey of 466 women employed in retailing yielded 205 responses, identifying (1) factors influencing the success and advancement of women in retailing and (2) how those factors differ for women in upper versus middle positions. Upper-level executives placed more importance on ambition and abilities; midlevel executives credited opportunity and…

  20. Novel approaches against epidermal growth factor receptor tyrosine kinase inhibitor resistance

    PubMed Central

    Heydt, Carina; Michels, Sebastian; Thress, Kenneth S.; Bergner, Sven; Wolf, Jürgen; Buettner, Reinhard

    2018-01-01

    Background The identification and characterization of molecular biomarkers has helped to revolutionize non-small-cell lung cancer (NSCLC) management, as it transitions from target-focused to patient-based treatment, centered on the evolving genomic profile of the individual. Determination of epidermal growth factor receptor (EGFR) mutation status represents a critical step in the diagnostic process. The recent emergence of acquired resistance to “third-generation” EGFR tyrosine kinase inhibitors (TKIs) via multiple mechanisms serves to illustrate the important influence of tumor heterogeneity on prognostic outcomes in patients with NSCLC. Design This literature review examines the emergence of TKI resistance and the course of disease progression and, consequently, the clinical decision-making process in NSCLC. Results Molecular markers of acquired resistance, of which T790M and HER2 or MET amplifications are the most common, help to guide ongoing treatment past the point of progression. Although tissue biopsy techniques remain the gold standard, the emergence of liquid biopsies and advances in analytical techniques may eventually allow “real-time” monitoring of tumor evolution and, in this way, help to optimize targeted treatment approaches. Conclusions The influence of inter- and intra-tumor heterogeneity on resistance mechanisms should be considered when treating patients using resistance-specific therapies. New tools are necessary to analyze changes in heterogeneity and clonal composition during drug treatment. The refinement and standardization of diagnostic procedures and increased accessibility to technology will ultimately help in personalizing the management of NSCLC. PMID:29632655

  1. Performance of a proportion-based approach to meta-analytic moderator estimation: results from Monte Carlo simulations.

    PubMed

    Aguirre-Urreta, Miguel I; Ellis, Michael E; Sun, Wenying

    2012-03-01

This research investigates the performance of a proportion-based approach to meta-analytic moderator estimation through a series of Monte Carlo simulations. This approach is most useful when the moderating potential of a categorical variable has not been recognized in primary research and heterogeneous groups have thus been pooled together as a single sample. Alternative scenarios representing different distributions of group proportions are examined, along with varying numbers of studies, subjects per study, and correlation combinations. Our results suggest that the approach is largely unbiased in its estimation of the magnitude of between-group differences and performs well with regard to statistical power and type I error. In particular, the average percentage bias of the estimated correlation for the reference group is positive and largely negligible, in the 0.5-1.8% range; the average percentage bias of the difference between correlations is also minimal, in the -0.1-1.2% range. Further analysis suggests that both biases decrease as the magnitude of the underlying difference increases, as the number of subjects in each simulated primary study increases, and as the number of simulated studies in each meta-analysis increases. The bias was most evident when the number of subjects and the number of studies were smallest (80 and 36, respectively). A sensitivity analysis that examines performance in scenarios down to 12 studies and 40 primary subjects is also included. This research is the first to thoroughly examine the adequacy of the proportion-based approach. Copyright © 2012 John Wiley & Sons, Ltd.
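A minimal Monte Carlo of the pooling scenario the authors study: two hidden subgroups with different true correlations are analyzed as one sample. The setup below (standard-normal variables, Cholesky-style pair generation) is an illustrative assumption, not the authors' simulation design.

```python
import math
import random
import statistics

def bivariate_sample(n, rho, rng):
    """n (x, y) pairs with population correlation rho via the Cholesky trick:
    y = rho * z1 + sqrt(1 - rho^2) * z2 for independent standard normals."""
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        pairs.append((z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2))
    return pairs

def pearson_r(pairs):
    """Sample Pearson correlation of a list of (x, y) pairs."""
    xs, ys = zip(*pairs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

def pooled_r(n_a, rho_a, n_b, rho_b, reps=200, seed=1):
    """Average correlation when two hidden subgroups are pooled as one sample."""
    rng = random.Random(seed)
    rs = [pearson_r(bivariate_sample(n_a, rho_a, rng) +
                    bivariate_sample(n_b, rho_b, rng))
          for _ in range(reps)]
    return statistics.fmean(rs)
```

With equal subgroup sizes and unit variances, the pooled correlation lands near the proportion-weighted value (here about 0.4 for subgroup correlations of 0.2 and 0.6), which is the regularity the proportion-based estimator exploits.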

  2. Sequential Multiplex Analyte Capturing for Phosphoprotein Profiling*

    PubMed Central

    Poetz, Oliver; Henzler, Tanja; Hartmann, Michael; Kazmaier, Cornelia; Templin, Markus F.; Herget, Thomas; Joos, Thomas O.

    2010-01-01

    Microarray-based sandwich immunoassays can simultaneously detect dozens of proteins. However, their use in quantifying large numbers of proteins is hampered by cross-reactivity and incompatibilities caused by the immunoassays themselves. Sequential multiplex analyte capturing addresses these problems by repeatedly probing the same sample with different sets of antibody-coated, magnetic suspension bead arrays. As a miniaturized immunoassay format, suspension bead array-based assays fulfill the criteria of the ambient analyte theory, and our experiments reveal that the analyte concentrations are not significantly changed. The value of sequential multiplex analyte capturing was demonstrated by probing tumor cell line lysates for the abundance of seven different receptor tyrosine kinases and their degree of phosphorylation and by measuring the complex phosphorylation pattern of the epidermal growth factor receptor in the same sample from the same cavity. PMID:20682761

  3. Human factors systems approach to healthcare quality and patient safety

    PubMed Central

    Carayon, Pascale; Wetterneck, Tosha B.; Rivera-Rodriguez, A. Joy; Hundt, Ann Schoofs; Hoonakker, Peter; Holden, Richard; Gurses, Ayse P.

    2013-01-01

    Human factors systems approaches are critical for improving healthcare quality and patient safety. The SEIPS (Systems Engineering Initiative for Patient Safety) model of work system and patient safety is a human factors systems approach that has been successfully applied in healthcare research and practice. Several research and practical applications of the SEIPS model are described. Important implications of the SEIPS model for healthcare system and process redesign are highlighted. Principles for redesigning healthcare systems using the SEIPS model are described. Balancing the work system and encouraging the active and adaptive role of workers are key principles for improving healthcare quality and patient safety. PMID:23845724

  4. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

In this paper we mainly study the existence of analytic normalizations and the normal forms of finite-dimensional complete analytically integrable dynamical systems. More precisely, we prove that any complete analytically integrable diffeomorphism F(x) = Bx + f(x) in (C^n, 0), with B having no eigenvalue of modulus 1 and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. We also prove that any complete analytically integrable differential system x˙ = Ax + f(x) in (C^n, 0), with A having nonzero eigenvalues and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytically integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytically integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. They also improve the results in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the result in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  5. Factors affecting the surgical approach and timing of bilateral adrenalectomy.

    PubMed

    Lan, Billy Y; Taskin, Halit E; Aksoy, Erol; Birsen, Onur; Dural, Cem; Mitchell, Jamie; Siperstein, Allan; Berber, Eren

    2015-07-01

Laparoscopic adrenalectomy has gained widespread acceptance. However, the optimal surgical approach to laparoscopic bilateral adrenalectomy has not been clearly defined. The aim of this study is to analyze the patient and intraoperative factors affecting the feasibility and outcome of different surgical approaches, in order to define an algorithm for bilateral adrenalectomy. Between 2000 and 2013, all patients who underwent bilateral adrenalectomy at a single institution were selected for retrospective analysis. Patient factors, surgical approach, operative outcomes, and complications were analyzed. From 2000 to 2013, 28 patients underwent bilateral adrenalectomy. Patient diagnoses included Cushing's disease (n = 19), pheochromocytoma (n = 7), and adrenal metastasis (n = 2). Of these 28 patients, successful laparoscopic adrenalectomy was performed in all but 2. Twenty-three of the 26 adrenalectomies were completed in a single stage, while three were performed as a staged approach, due to deterioration in intraoperative respiratory status in two patients and patient body habitus in one. Of the adrenalectomies completed using the minimally invasive approach, a posterior retroperitoneal (PR) approach was performed in 17 patients and a lateral transabdominal (LT) approach in 9 patients. Patients who underwent the LT approach had higher BMI, larger tumor size, and other concomitant intraabdominal pathology. Hospital stay for laparoscopic adrenalectomy was 3.5 days, compared to 5 and 12 days for the two open cases. There was no 30-day hospital mortality, and 5 patients in the entire cohort had minor complications. A minimally invasive operation is feasible in 93% of patients undergoing bilateral adrenalectomy, with 65% of adrenalectomies performed using the PR approach. Indications for the LT approach include morbid obesity, tumor size >6 cm, and other concomitant intraabdominal pathology. Single-stage adrenalectomies are feasible in most patients, with prolonged operative…

  6. Comparison of analytical and numerical approaches for CT-based aberration correction in transcranial passive acoustic imaging

    NASA Astrophysics Data System (ADS)

    Jones, Ryan M.; Hynynen, Kullervo

    2016-01-01

    Computed tomography (CT)-based aberration corrections are employed in transcranial ultrasound both for therapy and imaging. In this study, analytical and numerical approaches for calculating aberration corrections based on CT data were compared, with a particular focus on their application to transcranial passive imaging. Two models were investigated: a three-dimensional full-wave numerical model (Connor and Hynynen 2004 IEEE Trans. Biomed. Eng. 51 1693-706) based on the Westervelt equation, and an analytical method (Clement and Hynynen 2002 Ultrasound Med. Biol. 28 617-24) similar to that currently employed by commercial brain therapy systems. Trans-skull time delay corrections calculated from each model were applied to data acquired by a sparse hemispherical (30 cm diameter) receiver array (128 piezoceramic discs: 2.5 mm diameter, 612 kHz center frequency) passively listening through ex vivo human skullcaps (n  =  4) to emissions from a narrow-band, fixed source emitter (1 mm diameter, 516 kHz center frequency). Measurements were taken at various locations within the cranial cavity by moving the source around the field using a three-axis positioning system. Images generated through passive beamforming using CT-based skull corrections were compared with those obtained through an invasive source-based approach, as well as images formed without skull corrections, using the main lobe volume, positional shift, peak sidelobe ratio, and image signal-to-noise ratio as metrics for image quality. For each CT-based model, corrections achieved by allowing for heterogeneous skull acoustical parameters in simulation outperformed the corresponding case where homogeneous parameters were assumed. Of the CT-based methods investigated, the full-wave model provided the best imaging results at the cost of computational complexity. These results highlight the importance of accurately modeling trans-skull propagation when calculating CT-based aberration corrections

  7. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. Several analytical and numerical computer models have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
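As an example of the "analytical screening" end of the spectrum described above, the sketch below evaluates the closed-form steady-state solution of 1D advection-dispersion with first-order biodecay in a semi-infinite column; this is a standard textbook solution, not one of the chapter's models, and the names are illustrative.

```python
import math

def steady_state_concentration(x, c0, v, D, lam):
    """Steady-state 1D advection-dispersion with first-order decay:
    D c'' - v c' - lam * c = 0, with c(0) = c0 and c bounded downstream.
    The bounded root of the characteristic equation gives
    c(x) = c0 * exp(x * (v - sqrt(v^2 + 4*lam*D)) / (2*D))."""
    r = (v - math.sqrt(v * v + 4.0 * lam * D)) / (2.0 * D)
    return c0 * math.exp(r * x)
```

Setting `lam = 0` recovers a flat profile, and larger decay rates steepen the exponential decline; a numerical model becomes necessary once the geometry, chemistry, or boundary conditions outgrow this closed form.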

  8. Analytics that Inform the University: Using Data You Already Have

    ERIC Educational Resources Information Center

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making at the University of Central Florida. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  9. Replica Analysis for Portfolio Optimization with Single-Factor Model

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.

  10. Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Miele, David B.; Wigfield, Allan

    2014-01-01

    The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…

  11. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. Contributions of new data, methods, case studies, and instrumentation, as well as new interpretations and developments of existing ones, relating to analytical and/or environmental chemistry are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.

  12. Common and Specific Factors Approaches to Home-Based Treatment: I-FAST and MST

    ERIC Educational Resources Information Center

    Lee, Mo Yee; Greene, Gilbert J.; Fraser, J. Scott; Edwards, Shivani G.; Grove, David; Solovey, Andrew D.; Scott, Pamela

    2013-01-01

    Objectives: This study examined the treatment outcomes of integrated families and systems treatment (I-FAST), a moderated common factors approach, in reference to multisystemic therapy (MST), an established specific factors approach, for treating at-risk children and adolescents and their families in an intensive community-based setting. Method:…

  13. Do we need sustainability as a new approach in human factors and ergonomics?

    PubMed

    Zink, Klaus J; Fischer, Klaus

    2013-01-01

    The International Ergonomics Association Technical Committee 'Human Factors and Sustainable Development' was established to contribute to a broad discourse about opportunities and risks resulting from current societal 'mega-trends' and their impacts on the interactions among humans and other elements of a system, e.g. in work systems. This paper focuses on the underlying key issues: how do the sustainability paradigm and human factors/ergonomics interplay and interact, and is sustainability necessary as a new approach for our discipline? Based on a discussion of the sustainability concept, some general principles for designing new approaches and enhancing existing approaches of human factors and ergonomics regarding their orientation towards sustainability are proposed. The increasing profile of sustainability on the international stage presents new opportunities for human factors/ergonomics. Positioning of the sustainability paradigm within human factors/ergonomics is discussed. Approaches to incorporating sustainability in the design of work systems are considered.

  14. Headspace versus direct immersion solid phase microextraction in complex matrixes: investigation of analyte behavior in multicomponent mixtures.

    PubMed

    Gionfriddo, Emanuela; Souza-Silva, Érica A; Pawliszyn, Janusz

    2015-08-18

    This work aims to investigate the behavior of analytes in complex mixtures and matrixes with the use of solid-phase microextraction (SPME). Various factors that influence analyte uptake, such as coating chemistry, extraction mode, the physicochemical properties of analytes, and matrix complexity, were considered. At first, an aqueous system containing analytes bearing different hydrophobicities, molecular weights, and chemical functionalities was investigated by using commercially available liquid and solid porous coatings. The differences in the mass transfer mechanisms resulted in a more pronounced occurrence of coating saturation in headspace mode. Conversely, direct immersion extraction minimizes the occurrence of artifacts related to coating saturation and provides enhanced extraction of polar compounds. In addition, matrix-compatible PDMS-modified solid coatings, characterized by a new morphology that avoids coating fouling, were compared to their nonmodified analogues. The obtained results indicate that PDMS-modified coatings reduce artifacts associated with coating saturation, even in headspace mode. This factor, coupled with their matrix compatibility, makes the use of direct SPME very practical as a quantification approach and the best choice for metabolomics studies where wide coverage is intended. To further understand analyte uptake in a system where additional interactions occur due to matrix components, ex vivo and in vivo sampling conditions were simulated using a starch matrix model, with the aim of mimicking plant-derived materials. Our results corroborate the fact that matrix handling can affect analyte/matrix equilibria, with consequent release of high concentrations of previously bound hydrophobic compounds, potentially leading to coating saturation. Direct immersion SPME limited the occurrence of these artifacts, which confirms the suitability of SPME for in vivo applications. These findings shed light on the implementation of in…

  15. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been an immense amount of visibility of doping issues on the international stage over the past 12 months with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods continuously being updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  17. Assessing Saudi medical students learning approach using the revised two-factor study process questionnaire.

    PubMed

    Shaik, Shaffi Ahamed; Almarzuqi, Ahmed; Almogheer, Rakan; Alharbi, Omar; Jalal, Abdulaziz; Alorainy, Majed

    2017-08-17

    To assess learning approaches of 1st, 2nd, and 3rd-year medical students using the revised two-factor study process questionnaire, and to assess the reliability and validity of the questionnaire. This cross-sectional study was conducted at the College of Medicine, Riyadh, Saudi Arabia in 2014. The revised two-factor study process questionnaire (R-SPQ-2F) was completed by 610 medical students of both genders, from the foundation (first year), central nervous system (second year), and medicine and surgery (third year) courses. The study process was evaluated by computing mean scores of the two study approaches (deep and surface) using Student's t-test and one-way analysis of variance. The internal consistency and construct validity of the questionnaire were assessed using Cronbach's α and factor analysis. The mean score of the deep approach was significantly higher than that of the surface approach among participants (t(770) = 7.83, p < 0.001) for the four courses. The mean scores of the deep approach were significantly higher among participants with a higher grade point average (F(2,768) = 13.31, p = 0.001) and among those reporting more study hours (F(2,768) = 20.08, p = 0.001). The Cronbach's α value of 0.70 indicates good internal consistency of the questionnaire used. Factor analysis confirms the two factors (deep and surface approaches) of the R-SPQ-2F. The deep approach to learning was the primary approach among 1st, 2nd and 3rd-year King Saud University medical students. This study confirms the reliability and validity of the revised two-factor study process questionnaire. Medical educators could use the results of such studies to make required changes in the curriculum.
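
    The internal-consistency statistic reported in the entry above, Cronbach's α, is straightforward to compute. A minimal sketch follows, using an invented response matrix (rows = respondents, columns = items) rather than the study's data.

```python
# Cronbach's alpha for a k-item scale:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
from statistics import variance

def cronbach_alpha(rows):
    k = len(rows[0])                       # number of items
    items = list(zip(*rows))               # column-wise item scores
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])  # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

responses = [                              # invented Likert-type answers
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
    [4, 4, 5, 4],
]
alpha = cronbach_alpha(responses)
assert 0 < alpha <= 1   # values of about 0.70 or above are conventionally read as acceptable
```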

  18. Understanding suicide risk within the Research Domain Criteria (RDoC) framework: A meta-analytic review.

    PubMed

    Glenn, Catherine R; Kleiman, Evan M; Cha, Christine B; Deming, Charlene A; Franklin, Joseph C; Nock, Matthew K

    2018-01-01

    The field is in need of novel and transdiagnostic risk factors for suicide. The National Institute of Mental Health's Research Domain Criteria (RDoC) provides a framework that may help advance research on suicidal behavior. We conducted a meta-analytic review of existing prospective risk and protective factors for suicidal thoughts and behaviors (ideation, attempts, and deaths) that fall within one of the five RDoC domains or relate to a prominent suicide theory. Predictors were selected from a database of 4,082 prospective risk and protective factors for suicide outcomes. A total of 460 predictors met inclusion criteria for this meta-analytic review and most examined risk (vs. protective) factors for suicidal thoughts and behaviors. The overall effect of risk factors was statistically significant, but relatively small, in predicting suicide ideation (weighted mean odds ratio: wOR = 1.72; 95% CI: 1.59-1.87), suicide attempt (wOR = 1.66 [1.57-1.76]), and suicide death (wOR = 1.41 [1.24-1.60]). Across all suicide outcomes, most risk factors related to the Negative Valence Systems domain, although effect sizes were of similar magnitude across RDoC domains. This study demonstrated that the RDoC framework provides a novel and promising approach to suicide research; however, relatively few studies of suicidal behavior fit within this framework. Future studies must go beyond the "usual suspects" of suicide risk factors (e.g., mental disorders, sociodemographics) to understand the processes that combine to lead to this deadly outcome. © 2017 Wiley Periodicals, Inc.
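
    Pooled effect sizes of the kind quoted above (weighted mean odds ratios with confidence intervals) are conventionally computed on the log scale with inverse-variance weights. The sketch below shows that standard fixed-effect computation with invented (OR, SE) pairs; it is not the review's code or data.

```python
import math

def weighted_mean_or(effects):
    """effects: list of (odds_ratio, standard_error_of_log_or) pairs."""
    weights = [1 / se ** 2 for _, se in effects]       # inverse-variance weights
    log_ors = [math.log(or_) for or_, _ in effects]
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

studies = [(1.9, 0.20), (1.5, 0.10), (1.7, 0.25)]      # invented (OR, SE) pairs
wor, ci = weighted_mean_or(studies)
assert ci[0] < wor < ci[1]   # the point estimate lies inside its 95% CI
```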

  19. Chemical clocks, oscillations, and other temporal effects in analytical chemistry: oddity or viable approach?

    PubMed

    Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L

    2018-05-31

    Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, somewhat unusual, analytical strategy, which relies on the measurement of the time interval between the occurrences of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction, in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, were disclosed in the past few years. They are very attractive from the fundamental point of view, but so far only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.

  20. A new method for constructing analytic elements for groundwater flow.

    NASA Astrophysics Data System (ADS)

    Strack, O. D.

    2007-12-01

    The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41, 2/1005, 2003, by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons: first, the existence of the elementary solutions is required, and second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure to be presented generalizes this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach will be applied, along with numerical examples, to the modified Helmholtz equation and the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.

  1. Cogeneration Technology Alternatives Study (CTAS). Volume 2: Analytical approach

    NASA Technical Reports Server (NTRS)

    Gerlaugh, H. E.; Hall, E. W.; Brown, D. H.; Priestley, R. R.; Knightly, W. F.

    1980-01-01

    Various advanced energy conversion systems were compared with each other and with current technology systems in terms of their savings in fuel energy, costs, and emissions, both in individual plants and at a national level. The ground rules established by NASA and the assumptions made by the General Electric Company in performing this cogeneration technology alternatives study are presented. The analytical methodology employed is described in detail and is illustrated with numerical examples, together with a description of the computer program used in calculating over 7000 energy conversion system-industrial process applications. For Vol. 1, see 80N24797.

  2. Single-diffractive production of dijets within the kt-factorization approach

    NASA Astrophysics Data System (ADS)

    Łuszczak, Marta; Maciuła, Rafał; Szczurek, Antoni; Babiarz, Izabela

    2017-09-01

    We discuss single-diffractive production of dijets. The cross section is calculated within the resolved-Pomeron picture, for the first time in the kt-factorization approach, neglecting the transverse momentum of the Pomeron. We use Kimber-Martin-Ryskin unintegrated parton (gluon, quark, antiquark) distributions in both the proton and the Pomeron or subleading Reggeon. The unintegrated parton distributions are calculated based on conventional mmht2014nlo parton distribution functions in the proton and on the H1 Collaboration diffractive parton distribution functions used previously in analyses of the diffractive structure function and dijets at HERA. For comparison, we present results of calculations performed within the collinear-factorization approach. Our results are close to those obtained in the next-to-leading-order approach. The calculation is (must be) supplemented by the so-called gap survival factor, which may, in general, depend on kinematical variables. We try to describe the existing data from the Tevatron and make detailed predictions for possible LHC measurements. Several differential distributions are calculated. The Ē_T, η̄ and x̄_P distributions are compared with the Tevatron data. A reasonable agreement is obtained for the first two distributions. The last one requires introducing a gap survival factor that depends on kinematical variables. We discuss how the phenomenological dependence on one kinematical variable may influence the dependence on other variables such as Ē_T and η̄. Several distributions for the LHC are shown.

  3. A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.

    PubMed

    Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair

    2018-05-15

    Service perception analysis is crucial for understanding both user experiences and network quality, as well as for the maintenance and optimization of mobile networks. Given the rapid development of the mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluating and optimizing users' service perceptions has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze users' OTT service perception from the end-user's point of view rather than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQIs) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed to identify the major factors that impact the perceptional degradation of the web browsing service, as well as the significance of their contributions. Finally, a case study is presented to show the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.

  4. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems: the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they use to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for the evaluation of analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in these studies were designed. More research and refinement are needed by the visual analytics community to provide additional evaluation guidelines for different types of visual analytic environments.

  5. Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Chan, Wai

    2005-01-01

    Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…

  6. Analytical and numerical treatment of drift-tearing modes in plasma slab

    NASA Astrophysics Data System (ADS)

    Mirnov, V. V.; Hegna, C. C.; Sovinec, C. R.; Howell, E. C.

    2016-10-01

    Two-fluid corrections to linear tearing modes include 1) diamagnetic drifts that reduce the growth rate and 2) electron and ion decoupling on short scales that can lead to fast reconnection. We have recently developed an analytical model that includes effects 1) and 2) and the important contribution from finite electron parallel thermal conduction. Both tendencies 1) and 2) are confirmed by an approximate analytic dispersion relation that is derived using a perturbative approach in small ion-sound gyroradius ρs. This approach is only valid at the beginning of the transition from the collisional to semi-collisional regimes. Further analytical and numerical work is performed to cover the full interval of ρs connecting these two limiting cases. Growth rates are computed from analytic theory with a shooting method. They match the resistive MHD regime and the dispersion relations known at asymptotically large ion-sound gyroradius. A comparison between this analytical treatment and linear numerical simulations using the NIMROD code with cold ions and hot electrons in a plasma slab is reported. The material is based on work supported by the U.S. DOE and NSF.

  7. Enhancing the Interpretive Reading and Analytical Writing of Mainstreamed English Learners in Secondary School: Results from a Randomized Field Trial Using a Cognitive Strategies Approach

    ERIC Educational Resources Information Center

    Olson, Carol Booth; Kim, James S.; Scarcella, Robin; Kramer, Jason; Pearson, Matthew; van Dyk, David A.; Collins, Penny; Land, Robert E.

    2012-01-01

    In this study, 72 secondary English teachers from the Santa Ana Unified School District were randomly assigned to participate in the Pathway Project, a cognitive strategies approach to teaching interpretive reading and analytical writing, or to a control condition involving typical district training focusing on teaching content from the textbook.…

  8. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at levels of 0.1% and above (ICH Q3B(R2), 2006). For some impurities, this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others, it may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities; in addition, inorganic (metal residues) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Analytical study of the heat loss attenuation by clothing on thermal manikins under radiative heat loads.

    PubMed

    Den Hartog, Emiel A; Havenith, George

    2010-01-01

    For wearers of protective clothing in radiation environments, no quantitative guidelines are available for the effect of a radiative heat load on heat exchange. Under the European Union-funded project ThermProtect, an analytical effort was defined to address the issue of radiative heat load while wearing protective clothing. As much information has become available within the ThermProtect project from thermal manikin experiments in thermal radiation environments, these sets of experimental data are used to verify the analytical approach. The analytical approach provided a good prediction of the heat loss in the manikin experiments; 95% of the variance was explained by the model. The model has not yet been validated at high radiative heat loads and neglects some physical properties of the radiation emissivity. Still, the analytical approach is pragmatic and may be useful for practical implementation in protective clothing standards for moderate thermal radiation environments.
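
    The radiative heat load discussed above is governed by the Stefan-Boltzmann law. The sketch below is a back-of-the-envelope illustration of the net flux onto a clothed surface, with an invented emissivity and invented temperatures; it is not the ThermProtect model.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(eps, t_source, t_surface):
    """Net radiative flux q = eps * sigma * (T_source^4 - T_surface^4)."""
    return eps * SIGMA * (t_source ** 4 - t_surface ** 4)

# Hypothetical radiant source at 500 K facing a clothing surface at 310 K.
q = net_radiative_flux(eps=0.95, t_source=500.0, t_surface=310.0)
assert q > 0  # net flux flows toward the cooler clothing surface
```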

  10. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in programming languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  11. A multi-analytical approach for the characterization of powders from the Pompeii archaeological site.

    PubMed

    Canevali, Carmen; Gentile, Paolo; Orlandi, Marco; Modugno, Francesca; Lucejko, Jeannette Jacqueline; Colombini, Maria Perla; Brambilla, Laura; Goidanich, Sara; Riedo, Chiara; Chiantore, Oscar; Baraldi, Pietro; Baraldi, Cecilia; Gamberini, Maria Cristina

    2011-10-01

    Nine black powders found in Pompeii houses in three different types of bronze vessels (cylindrical theca atramentaria, unguentaries, and aryballoi) were characterized in order to assess a correspondence between the composition and the type of vessel and, possibly, to verify whether these powders were inks. For the compositional characterization, a multi-analytical approach was adopted, which involved the use of scanning electron microscopy-energy dispersive X-ray analysis, Fourier-transform infrared spectroscopy, Raman spectroscopy, X-ray diffraction, electron paramagnetic resonance spectroscopy, thermogravimetric analysis, gas chromatography coupled with mass spectrometry (GC/MS), and pyrolysis GC/MS. Powders contained in cylindrical theca atramentaria form a homogeneous group, and their organic and inorganic compositions suggest that they were writing inks, while powders contained in unguentaries and aryballoi could have had several different uses, including writing inks and cosmetics. Furthermore, the composition profile of the powders found in cylindrical cases shows that, in 79 AD in Pompeii, carbon-based inks were still used for writing, and iron gall inks had not yet been introduced.

  12. TrajGraph: A Graph-Based Visual Analytics Approach to Studying Urban Network Centralities Using Taxi Trajectory Data.

    PubMed

    Huang, Xiaoke; Zhao, Ye; Yang, Jing; Zhang, Chong; Ma, Chao; Ye, Xinyue

    2016-01-01

    We propose TrajGraph, a new visual analytics method for studying urban mobility patterns by integrating graph modeling and visual analysis with taxi trajectory data. A special graph is created to store and manifest real traffic information recorded by taxi trajectories over city streets. It conveys urban transportation dynamics which can be discovered by applying graph analysis algorithms. To support interactive, multiscale visual analytics, a graph partitioning algorithm is applied to create region-level graphs which are smaller than the original street-level graph. Graph centralities, including PageRank and betweenness, are computed to characterize the time-varying importance of different urban regions. The centralities are visualized by three coordinated views: a node-link graph view, a map view and a temporal information view. Users can interactively examine the importance of streets to discover and assess city traffic patterns. We have implemented a fully working prototype of this approach and evaluated it using massive taxi trajectories of Shenzhen, China. TrajGraph's capability in revealing the importance of city streets was evaluated by comparing the calculated centralities with the subjective evaluations from a group of drivers in Shenzhen. Feedback from a domain expert was collected. The effectiveness of the visual interface was evaluated through a formal user study. We also present several examples and a case study to demonstrate the usefulness of TrajGraph in urban transportation analysis.
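
    One of the centralities named in the entry above, PageRank, can be sketched in a few lines by power iteration. The toy "street" graph below is hypothetical; this is a generic illustration of the measure, not the TrajGraph implementation.

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on a dict of adjacency lists."""
    nodes = list(adj)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, outs in adj.items():
            share = damping * rank[n] / len(outs)  # split rank over out-links
            for m in outs:
                new[m] += share
        rank = new
    return rank

# Toy directed street graph; every node has at least one out-link.
streets = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pr = pagerank(streets)
assert max(pr, key=pr.get) == "C"  # C is linked from every other node
```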

  13. Development of An Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used by Non-EPA Decision Makers (Final Contractor Report)

    EPA Science Inventory

    EPA announced the availability of the final contractor report entitled, Development of an Analytic Approach to Determine How Environmental Protection Agency’s Integrated Risk Information System (IRIS) Is Used By Non EPA Decision Makers. This contractor report analyzed how ...

  14. An Analytic Hierarchy Process-based Method to Rank the Critical Success Factors of Implementing a Pharmacy Barcode System.

    PubMed

    Alharthi, Hana; Sultana, Nahid; Al-Amoudi, Amjaad; Basudan, Afrah

    2015-01-01

    Pharmacy barcode scanning is used to reduce errors during the medication dispensing process. However, this technology has rarely been used in hospital pharmacies in Saudi Arabia. This article describes the barriers to successful implementation of a barcode scanning system in Saudi Arabia. A literature review was conducted to identify the relevant critical success factors (CSFs) for a successful dispensing barcode system implementation. Twenty-eight pharmacists from a local hospital in Saudi Arabia were interviewed to obtain their perception of these CSFs. In this study, planning (process flow issues and training requirements), resistance (fear of change, communication issues, and negative perceptions about technology), and technology (software, hardware, and vendor support) were identified as the main barriers. The analytic hierarchy process (AHP), one of the most widely used tools for decision making in the presence of multiple criteria, was used to compare and rank these identified CSFs. The results of this study suggest that resistance barriers have a greater impact than planning and technology barriers. In particular, fear of change is the most critical factor, and training is the least critical factor.
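
    The AHP ranking step described above reduces to deriving a priority vector from a pairwise comparison matrix; a common shortcut is the geometric-mean method. The Saaty-scale judgments below are illustrative only, not the study's data.

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix via row geometric means."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Rows/columns: resistance, planning, technology (invented 1-9 scale judgments).
pairwise = [
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
]
weights = ahp_weights(pairwise)
assert weights[0] == max(weights)  # resistance ranks as the top barrier
```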

  15. Effect of primary and secondary parameters on analytical estimation of effective thermal conductivity of two phase materials using unit cell approach

    NASA Astrophysics Data System (ADS)

    S, Chidambara Raja; P, Karthikeyan; Kumaraswamidhas, L. A.; M, Ramu

    2018-05-01

    Most thermal design systems involve two-phase materials, and analysis of such systems requires a detailed understanding of the thermal characteristics of the two-phase material. This article develops a geometry-dependent unit cell approach model that considers the effects of all primary parameters (conductivity ratio and concentration) and secondary parameters (geometry, contact resistance, natural convection, Knudsen effects, and radiation) for estimating the effective thermal conductivity of two-phase materials. The analytical equations are formulated using an isotherm approach for 2-D and 3-D spatially periodic media. The developed models are validated against standard models and are suitable for all kinds of operating conditions. The results show substantial improvement over existing models and are in good agreement with experimental data.
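
    As context for unit-cell models generally: any effective-conductivity estimate must lie between the classical series and parallel (Wiener) bounds. A minimal sketch, with illustrative helper functions rather than the paper's model:

```python
def k_parallel(k1, k2, v2):
    """Upper (parallel) bound: phases side by side along the heat flow."""
    return (1.0 - v2) * k1 + v2 * k2

def k_series(k1, k2, v2):
    """Lower (series) bound: phases layered across the heat flow."""
    return 1.0 / ((1.0 - v2) / k1 + v2 / k2)

# Example: 50/50 mix of a 1 W/mK fluid and a 10 W/mK solid.
upper = k_parallel(1.0, 10.0, 0.5)
lower = k_series(1.0, 10.0, 0.5)
```

    Geometry-dependent unit-cell models such as the one described above interpolate between these two limits according to the microstructure.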

  16. Achieving Cost Reduction Through Data Analytics.

    PubMed

    Rocchio, Betty Jo

    2016-10-01

    The reimbursement structure of the US health care system is shifting from a volume-based system to a value-based system. Adopting a comprehensive data analytics platform has become important to health care facilities, in part to navigate this shift. Hospitals generate plenty of data, but actionable analytics are necessary to help personnel interpret and apply data to improve practice. Perioperative services is an important revenue-generating department for hospitals, and each perioperative service line requires a tailored approach to be successful in managing outcomes and controlling costs. Perioperative leaders need to prepare to use data analytics to reduce variation in supplies, labor, and overhead. Mercy, based in Chesterfield, Missouri, adopted a perioperative dashboard that helped perioperative leaders collaborate with surgeons and perioperative staff members to organize and analyze health care data, which ultimately resulted in significant cost savings. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  17. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent it. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small not to prevent harmonisation of reference intervals. Evidence-based approaches, including the determination of analytical bias using commutable material, are necessary when seeking to harmonise reference intervals.

  18. Do the Critical Success Factors from Learning Analytics Predict Student Outcomes?

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2016-01-01

    This article starts with a detailed literature review of recent studies that focused on using learning analytics software or learning management system data to determine the nature of any relationships between online student activity and their academic outcomes within university-level business courses. The article then describes how data was…

  19. Internal and external factors affecting photosynthetic pigment composition in plants: a meta-analytical approach.

    PubMed

    Esteban, Raquel; Barrutia, Oihana; Artetxe, Unai; Fernández-Marín, Beatriz; Hernández, Antonio; García-Plazaola, José Ignacio

    2015-04-01

    Photosynthetic pigment composition has been a major study target in plant ecophysiology during the last three decades. Although more than 2000 papers have been published, a comprehensive evaluation of the responses of photosynthetic pigment composition to environmental conditions is not yet available. After an extensive survey, we compiled data from 525 papers including 809 species (subkingdom Viridiplantae) in which pigment composition was described. A meta-analysis was then conducted to assess the ranges of photosynthetic pigment content. Calculated frequency distributions of pigments were compared with those expected from the theoretical pigment composition. Responses to environmental factors were also analysed. The results revealed that lutein and xanthophyll cycle pigments (VAZ) were highly responsive to the environment, emphasizing the high phenotypic plasticity of VAZ, whereas neoxanthin was very stable. The present meta-analysis supports the existence of relatively narrow limits for pigment ratios and also supports the presence of a pool of free 'unbound' VAZ. Results from this study provide highly reliable ranges of photosynthetic pigment contents as a framework for future research on plant pigments. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  20. Learning Analytics: Insights into the Natural Learning Behavior of Our Students

    ERIC Educational Resources Information Center

    Becker, Bernd

    2013-01-01

    The migration from traditional classrooms to online learning environments is in full effect. In the midst of these changes, a new approach to learning analytics needs to be considered. Learning analytics refers to the process of collecting and studying usage data in order to make instructional decisions that will support student success. In…

  1. An Analytic Network Process approach for the environmental aspect selection problem — A case study for a hand blender

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bereketli Zafeirakopoulos, Ilke, E-mail: ibereketli@gsu.edu.tr; Erol Genevois, Mujde, E-mail: merol@gsu.edu.tr

    Life Cycle Assessment is a tool to assess, in a systematic way, the environmental aspects of a product, their potential environmental impacts, and the resources used throughout the product's life cycle. It is widely accepted as one of the most powerful tools to support decision-making in ecodesign and sustainable production, used to identify the most problematic parts and life cycle phases of a product and to project future improvements. However, since Life Cycle Assessment is a cost- and time-intensive method, companies tend not to carry out a full version of it, except for large corporate ones. Especially for small and medium-sized enterprises, which do not have sufficient budget for, or knowledge of, sustainable production and ecodesign approaches, focusing only on the most important potential environmental aspect is unavoidable. In this direction, finding the right environmental aspect to work on is crucial for companies. In this study, a multi-criteria decision-making methodology, the Analytic Network Process, is proposed to select the most relevant environmental aspect. The proposed methodology aims to provide a simplified environmental assessment to producers. It is applied to a hand blender, a member of the Electrical and Electronic Equipment family. The decision criteria for the environmental aspects and their relations of dependence are defined. The evaluation is made with the Analytic Network Process in order to treat the interdependencies among the criteria realistically. The results are computed via the Super Decisions software. Finally, it is observed that the procedure is completed in less time, with less data, at lower cost, and in a less subjective way than conventional approaches. - Highlights: • We present a simplified environmental assessment methodology to support LCA. • ANP is proposed to select the most relevant environmental aspect. • ANP deals well with the interdependencies between

  2. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    ERIC Educational Resources Information Center

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  3. Analytical functions for beta and gamma absorbed fractions of iodine-131 in spherical and ellipsoidal volumes.

    PubMed

    Mowlavi, Ali Asghar; Fornasier, Maria Rossa; Mirzaei, Mohammd; Bregant, Paola; de Denaro, Mario

    2014-10-01

    The beta and gamma absorbed fractions in organs and tissues are important key factors in radionuclide internal dosimetry based on the Medical Internal Radiation Dose (MIRD) approach. The aim of this study is to find suitable analytical functions for the beta and gamma absorbed fractions in spherical and ellipsoidal volumes with a uniform distribution of the iodine-131 radionuclide. The MCNPX code was used to calculate the energy absorbed from the beta and gamma rays of iodine-131 uniformly distributed inside different ellipsoids and spheres, and the absorbed fractions were then evaluated. We found the fit parameters of a suitable analytical function for the beta absorbed fraction, depending on a generalized radius for the ellipsoid based on the radius of a sphere, and a linear fit function for the gamma absorbed fraction. The analytical functions obtained by fitting the Monte Carlo data can be used to obtain the absorbed fractions of iodine-131 beta and gamma rays for any volume of the thyroid lobe. Moreover, our results for the spheres are in good agreement with the results of MIRD and other scientific literature.
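
    The fitting step can be illustrated with synthetic data. The saturating functional form, the parameter value, and the noise level below are all invented for illustration; the paper's actual fit functions differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def phi_beta(R, k):
    # Hypothetical saturating form: the absorbed fraction rises toward 1
    # as the generalized radius R grows and the beta range becomes small
    # relative to the volume.
    return R / (R + k)

R = np.linspace(0.5, 3.0, 20)       # generalized radius, cm
rng = np.random.default_rng(0)
# Fake "Monte Carlo" output: true k = 0.4 plus small noise.
mc_data = phi_beta(R, 0.4) + rng.normal(0.0, 0.005, R.size)

(k_fit,), cov = curve_fit(phi_beta, R, mc_data, p0=[1.0])
```

    Once fitted, the closed-form function replaces further transport simulations for intermediate volumes, which is the practical appeal of the approach.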

  4. A Numerical-Analytical Approach to Modeling the Axial Rotation of the Earth

    NASA Astrophysics Data System (ADS)

    Markov, Yu. G.; Perepelkin, V. V.; Rykhlova, L. V.; Filippova, A. S.

    2018-04-01

    A model for the non-uniform axial rotation of the Earth is studied using a celestial-mechanical approach and numerical simulations. The application of an approximate model containing a small number of parameters to predict variations of the axial rotation velocity of the Earth over short time intervals is justified. This approximate model is obtained by averaging variable parameters that are subject to small variations due to non-stationarity of the perturbing factors. The model is verified and compared with predictions over a long time interval published by the International Earth Rotation and Reference Systems Service (IERS).

  5. Impact of Pre-analytic Blood Sample Collection Factors on Metabolomics.

    PubMed

    Townsend, Mary K; Bao, Ying; Poole, Elizabeth M; Bertrand, Kimberly A; Kraft, Peter; Wolpin, Brian M; Clish, Clary B; Tworoger, Shelley S

    2016-05-01

    Many epidemiologic studies are using metabolomics to discover markers of carcinogenesis. However, limited data are available on the influence of pre-analytic blood collection factors on metabolite measurement. We quantified 166 metabolites in archived plasma from 423 Health Professionals Follow-up Study and Nurses' Health Study participants using liquid chromatography-tandem mass spectrometry (LC-MS). We compared multivariable-adjusted geometric mean metabolite LC-MS peak areas across fasting time, season of blood collection, and time of day of blood collection categories. The majority of metabolites (160 of 166 metabolites) had geometric mean peak areas that were within 15% comparing samples donated after fasting 9 to 12 versus ≥13 hours; greater differences were observed in samples donated after fasting ≤4 hours. Metabolite peak areas generally were similar across season of blood collection, although levels of certain metabolites (e.g., bile acids and purines/pyrimidines) tended to be different in the summer versus winter months. After adjusting for fasting status, geometric mean peak areas for bile acids and vitamins, but not other metabolites, differed by time of day of blood collection. Fasting, season of blood collection, and time of day of blood collection were not important sources of variability in measurements of most metabolites in our study. However, considering blood collection variables in the design or analysis of studies may be important for certain specific metabolites, particularly bile acids, purines/pyrimidines, and vitamins. These results may be useful for investigators formulating analysis plans for epidemiologic metabolomics studies, including determining which metabolites to a priori exclude from analyses. Cancer Epidemiol Biomarkers Prev; 25(5); 823-9. ©2016 AACR. ©2016 American Association for Cancer Research.

  6. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    PubMed

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined which factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of applying holistic and analytic rubrics, respectively, and of analytic rubrics in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods (holistic rubric, analytic rubric, and task-specific checklist) for each student. The relationships among the scores on the three evaluation methods were analyzed using Pearson's correlation. Inter-rater agreement was analyzed by the kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examination evaluation, and history taking and physical examination to be major factors in clinical performance examination evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient

  7. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, in which known patterns can be mined for; with a human in the loop, it also brings in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive

  8. Fast analytical spectral filtering methods for magnetic resonance perfusion quantification.

    PubMed

    Reddy, Kasireddy V; Mitra, Abhishek; Yalavarthy, Phaneendra K

    2016-08-01

    Deconvolution in perfusion weighted imaging (PWI) plays an important role in quantifying MR perfusion parameters, and the application of PWI to stroke and brain tumor studies has become standard clinical practice. The standard approaches for this deconvolution are oscillatory-limited singular value decomposition (oSVD) and frequency domain deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of MR perfusion data. In this work, two fast deconvolution methods, analytical Fourier filtering and analytical Showalter spectral filtering, are proposed. Through systematic evaluation, the proposed methods are shown to be computationally efficient and quantitatively accurate compared to FDD and oSVD.
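
    The frequency-domain idea can be sketched with synthetic, noise-free curves. All signal shapes, the time grid, and the spectral threshold below are illustrative, not the proposed filters themselves:

```python
import numpy as np

dt = 1.0
t = np.arange(0, 20, dt)
aif = (t + 0.5) * np.exp(-t / 2)     # arterial input function (toy)
residue = np.exp(-t / 4)             # true tissue residue function (toy)

# Tissue concentration as a circular convolution: C = dt * (AIF * R).
conc = dt * np.fft.ifft(np.fft.fft(aif) * np.fft.fft(residue)).real

# Spectral deconvolution: divide by the AIF spectrum, zeroing bins
# where |F_aif| is tiny (those bins amplify noise in real data).
F_a = np.fft.fft(aif)
keep = np.abs(F_a) > 1e-6 * np.abs(F_a).max()
filt = np.where(keep, 1.0 / F_a, 0.0)
est = np.fft.ifft(np.fft.fft(conc) * filt).real / dt
```

    With noise-free data the residue function is recovered essentially exactly; the practical differences between filtering schemes appear once noise forces a trade-off between stability and fidelity.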

  9. Toward an Empirical Multidimensional Structure of Anhedonia, Reward Sensitivity, and Positive Emotionality: An Exploratory Factor Analytic Study.

    PubMed

    Olino, Thomas M; McMakin, Dana L; Forbes, Erika E

    2016-11-20

    Positive emotionality, anhedonia, and reward sensitivity share motivational and experiential elements of approach motivation and pleasure. Earlier work has examined the interrelationships among these constructs from measures of extraversion. More recently, the Research Domain Criteria introduced the Positive Valence Systems as a primary dimension to better understand psychopathology. However, the suggested measures tapping this construct have not yet been integrated within the structural framework of personality, even at the level of self-report. Thus, this study conducted exploratory factor and exploratory bifactor analyses on 17 different dimensions relevant to approach motivation, spanning anhedonia, behavioral activation system functioning, and positive emotionality. Convergent validity of these dimensions is tested by examining associations with depressive symptoms. Relying on multiple indices of fit, our preferred model included a general factor along with specific factors of affiliation, positive emotion, assertiveness, and pleasure seeking. These factors demonstrated different patterns of association with depressive symptoms. We discuss the plausibility of this model and highlight important future directions for work on the structure of a broad Positive Valence Systems construct. © The Author(s) 2016.
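
    A toy version of the exploratory step: simulate items driven by a single latent approach-motivation factor, then count factors with the eigenvalue-greater-than-one rule. The loadings, sample size, and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_items = 500, 6

# One latent factor drives all six items, plus item-specific noise.
latent = rng.normal(size=(n_obs, 1))
loadings = np.array([[0.8, 0.7, 0.75, 0.6, 0.65, 0.7]])
X = latent @ loadings + rng.normal(scale=0.5, size=(n_obs, n_items))

# Eigenvalues of the item correlation matrix; the Kaiser criterion
# retains factors whose eigenvalue exceeds 1.
corr = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int((eigvals > 1.0).sum())
```

    Real exploratory factor analysis adds rotation and fit indices, as in the study's multi-model comparison, but the eigenstructure of the correlation matrix is the common starting point.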

  10. Analyzing the factors influencing the success of postgraduates in achieving graduate on time (GOT) using analytic hierarchy process (AHP)

    NASA Astrophysics Data System (ADS)

    Chin, Wan Yung; Ch'ng, Chee Keong; Jamil, Jastini Mohd.; Shaharanee, Izwan Nizal Mohd.

    2017-11-01

    In the globalization era, education plays an important role in educating and preparing individuals to face the demands and challenges of the 21st century. This has contributed to an increase in the number of individuals pursuing studies in Doctor of Philosophy (Ph.D) programs. However, the ability of Ph.D students to meet the four-year Graduate on Time (GOT) target stipulated by the university has become a major concern of students, institutions, and government. Therefore, the main objective of this study is to investigate the factors that influence Ph.D students at Universiti Utara Malaysia (UUM) in achieving GOT. Through a review of previous research, six factors (student, financial, supervisor, skills, project, and institution factors) were identified as the domain factors that influence Ph.D students in achieving GOT. The level of importance of each factor was ranked by experts from three graduate schools using the Analytic Hierarchy Process (AHP) technique. This study contributes significantly to the understanding of the factors affecting Ph.D students at UUM in achieving GOT. In addition, this study can also help the university in planning and assisting Ph.D students to accomplish GOT in the future.

  11. Alcohol expectancy multiaxial assessment: a memory network-based approach.

    PubMed

    Goldman, Mark S; Darkes, Jack

    2004-03-01

    Despite several decades of activity, alcohol expectancy research has yet to merge measurement approaches with developing memory theory. This article offers an expectancy assessment approach built on a conceptualization of expectancy as an information processing network. The authors began with multidimensional scaling models of expectancy space, which served as heuristics to suggest confirmatory factor analytic dimensional models for entry into covariance structure predictive models. It is argued that this approach permits a relatively thorough assessment of the broad range of potential expectancy dimensions in a format that is very flexible in terms of instrument length and specificity versus breadth of focus. ((c) 2004 APA, all rights reserved)

  12. Analytical treatment of the deformation behavior of EUVL masks during electrostatic chucking

    NASA Astrophysics Data System (ADS)

    Brandstetter, Gerd; Govindjee, Sanjay

    2012-03-01

    A new analytical approach is presented to predict mask deformation during electrostatic chucking in next-generation extreme ultraviolet lithography (EUVL). Given an arbitrary profile measurement of the mask and chuck non-flatness, this method has been developed as an alternative to time-consuming finite element simulations for overlay error correction algorithms. We consider the feature transfer of each harmonic component in the profile shapes via linear elasticity theory and demonstrate analytically how high spatial frequencies are filtered. The method is compared to presumably more accurate finite element simulations and has been tested successfully in an overlay error compensation experiment, in which the y-component of the residual error could be reduced by a factor of 2. As a side outcome, the formulation provides a tool to estimate the critical pin size and pitch such that the distortion on the mask front side remains within given tolerances. For a numerical example, we find that pin pitches of less than 5 mm will result in a mask pattern distortion of less than 1 nm if the chucking pressure is below 30 kPa.
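
    The per-harmonic filtering idea can be sketched with a toy transfer function. The exponential decay with spatial frequency, the plate thickness, and the profile below are illustrative; the paper derives the actual transfer behavior from linear elasticity theory:

```python
import numpy as np

# Toy non-flatness profile over the mask width: one long-wavelength
# bow plus a short-wavelength ripple (all values hypothetical, in mm).
L = 152.0                                  # mask width, mm
x = np.linspace(0, L, 256, endpoint=False)
profile = 50e-6 * np.sin(2 * np.pi * x / L) \
        + 5e-6 * np.sin(2 * np.pi * 20 * x / L)

# Decompose into spatial harmonics and attenuate each one by a
# transfer factor that decays with spatial frequency k (toy low-pass:
# features much shorter than the plate thickness barely transfer).
k = np.fft.rfftfreq(x.size, d=L / x.size) * 2 * np.pi   # rad/mm
h = 6.35                                   # plate thickness, mm
transfer = np.exp(-k * h)
filtered = np.fft.irfft(np.fft.rfft(profile) * transfer, n=x.size)
```

    In this sketch the fundamental bow transfers mostly intact while the 20th harmonic is suppressed by orders of magnitude, which is the qualitative behavior the analytical treatment quantifies.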

  13. ANALYTIC MODELING OF STARSHADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cash, Webster

    2011-09-01

    External occulters, otherwise known as starshades, have been proposed as a solution to one of the highest-priority yet technically vexing problems facing astrophysics: the direct imaging and characterization of terrestrial planets around other stars. New apodization functions, developed over the past few years, now enable starshades of just a few tens of meters diameter to occult central stars so efficiently that the orbiting exoplanets can be revealed and other high-contrast imaging challenges addressed. In this paper, an analytic approach to the analysis of these apodization functions is presented. It is used to develop a tolerance analysis suitable for use in designing practical starshades. The results provide a mathematical basis for understanding starshades and a quantitative approach to setting tolerances.

  14. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria for use as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation on Cosmetic Products (EC 1223/2009): colorants, preservatives, and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their ability to satisfy current regulatory requirements. Cosmetic legislation is frequently updated; consequently, analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in sample pretreatment and extraction and in the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require sample pretreatment before analysis. In recent years, research on this aspect has tended toward green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. 
Copyright © 2016

  15. Learning Analytics for 21st Century Competencies

    ERIC Educational Resources Information Center

    Buckingham Shum, Simon; Crick, Ruth Deakin

    2016-01-01

    Many educational institutions are shifting their teaching and learning towards equipping students with knowledge, skills, and dispositions that prepare them for lifelong learning, in a complex and uncertain world. These have been termed "21st century competencies." Learning analytics (LA) approaches in general offer different kinds of…

  16. E-Learning Personalization Using Triple-Factor Approach in Standard-Based Education

    NASA Astrophysics Data System (ADS)

    Laksitowening, K. A.; Santoso, H. B.; Hasibuan, Z. A.

    2017-01-01

    E-Learning can be a tool for monitoring the learning process and progress towards a targeted competency. Process and progress can differ from one learner to another, since every learner may have a different learning type. The learning type itself can be identified by taking into account learning style, motivation, and knowledge ability. This study explores personalization of the learning type based on the Triple-Factor Approach. Considering that the factors in the Triple-Factor Approach are dynamic, the personalization system needs to accommodate the changes that may occur. Based on this issue, this study proposes personalization that dynamically guides learner progression through the stages of the learning process. The personalization is implemented in the form of interventions that trigger learners to access learning contents and discussion forums more often, as well as to improve their level of knowledge ability based on their current learning type.

  17. A factor analytical study of tinnitus complaint behaviour.

    PubMed

    Jakes, S C; Hallam, R S; Chambers, C; Hinchcliffe, R

    1985-01-01

    Two separate factor analyses were conducted on various self-rated complaints about tinnitus and related neuro-otological symptoms, together with audiometric measurements of tinnitus 'intensity' (masking level and loudness matching levels). Two general tinnitus complaint factors were identified: 'intrusiveness of tinnitus' and 'distress due to tinnitus'. Three specific tinnitus complaint factors were also found: 'sleep disturbance', 'medication use', and 'interference with passive auditory entertainments'. Other neuro-otological symptoms and the audiometric measures did not load on these factors. An exception was provided by loudness matches at 1 kHz, which had a small loading on the 'intrusiveness of tinnitus' factor. Self-rated loudness had a high loading on this factor. Otherwise, loudness (either self-rated or determined by loudness matching) was unrelated to the complaint dimensions. The clinical implications of the multifactorial nature of tinnitus complaint behaviour are considered.

  18. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts, and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools, such as significance tests, absolute acceptance criteria, and equivalence tests, are thoroughly described and compared in detail with examples. Significance tests should be avoided: the success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. The ethical dimension of analytical psychology.

    PubMed

    Barreto, Marco Heleno

    2018-04-01

    The centrality of the ethical dimension in Carl Gustav Jung's analytical psychology is demonstrated through careful reference to fundamental moments in the Jungian text. Tracking Jung's statements about the primacy of the 'moral function' (or 'moral factor') in the cure of neurosis as well as in the process of individuation, the ethical nature of the psychotherapeutic praxis proposed by Jung is highlighted. This allows us to see the ethical aspect of psychological conflicts, and thus to understand better why individuation can be seen as a 'moral achievement'. Finally, the intelligible ethical structure of Jungian psychotherapeutic praxis is exposed. © 2018, The Society of Analytical Psychology.

  20. Single-analyte to multianalyte fluorescence sensors

    NASA Astrophysics Data System (ADS)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that a high degree of predictability now exists. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor which displays some fluorescence modulation when the analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays that attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.
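
    A minimal sketch of how distinct RGB patterns might be mapped back to analytes, assuming a nearest-centroid comparison against calibration responses; the centroid values and analyte labels below are hypothetical illustrations, not measured data.

```python
import numpy as np

# Hypothetical mean RGB responses (0-255) of a functionalized bead to
# reference analytes, as would be recorded during calibration.
REFERENCE = {
    "citrate": np.array([200.0, 80.0, 40.0]),
    "IP3":     np.array([60.0, 180.0, 90.0]),
    "blank":   np.array([120.0, 120.0, 120.0]),
}

def identify(rgb):
    """Return the reference analyte whose RGB centroid is nearest."""
    rgb = np.asarray(rgb, float)
    return min(REFERENCE, key=lambda k: np.linalg.norm(rgb - REFERENCE[k]))

print(identify([195, 85, 45]))   # near the citrate centroid
```

    Real arrays would also use response magnitude for quantification; this only covers identification.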

  1. The Impact of the One to One Laptop Initiative on High School Students' Academic Performance in Algebra I and English I--A Meta-Analytic Approach

    ERIC Educational Resources Information Center

    Dennis, Quincita

    2014-01-01

    This study examined the effectiveness of using laptops to teach and deliver instruction to students. The meta-analytic approach was employed to compare the means of End-of Course Test scores from North Carolina one-to-one high schools during the traditional teaching period and the laptop teaching period in order to determine if there are…

  2. Dual-domain mass-transfer parameters from electrical hysteresis: theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    USGS Publications Warehouse

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-01-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated “effective” parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.
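
    The first-order mobile/less-mobile exchange that produces this hysteresis can be sketched numerically. This toy model assumes a linear petrophysical relation between concentrations and conductivities; the porosity and rate values are invented for illustration.

```python
import numpy as np

alpha = 0.05                     # exchange rate coefficient [1/s] (hypothetical)
theta_m, theta_lm = 0.25, 0.10   # mobile / less-mobile porosities (hypothetical)
dt, n = 1.0, 2000

# injected tracer: mobile concentration rises, plateaus, then falls
t = np.arange(n) * dt
c_m = np.clip(np.minimum(t / 300.0, (1500.0 - t) / 300.0), 0.0, 1.0)

# first-order mass transfer into the less-mobile domain (explicit Euler)
c_lm = np.zeros(n)
for i in range(1, n):
    c_lm[i] = c_lm[i - 1] + dt * alpha * (c_m[i - 1] - c_lm[i - 1])

sigma_f = c_m                               # fluid conductivity ~ mobile conc.
sigma_b = theta_m * c_m + theta_lm * c_lm   # simple linear petrophysical model

# hysteresis: at the same fluid conductivity, bulk conductivity is lower on
# the rising limb (less-mobile domain still filling) than on the falling limb
i_up = np.argmin(abs(sigma_f[:750] - 0.5))         # rising-limb crossing of 0.5
i_dn = 750 + np.argmin(abs(sigma_f[750:] - 0.5))   # falling-limb crossing
print(f"bulk at sigma_f=0.5: rising {sigma_b[i_up]:.3f}, falling {sigma_b[i_dn]:.3f}")
```

    The gap between the two limbs at a given fluid conductivity is what the graphical method reads off to separate the paired porosities and the rate coefficient.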

  3. Dual-domain mass-transfer parameters from electrical hysteresis: Theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    NASA Astrophysics Data System (ADS)

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-10-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated "effective" parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.

  4. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  5. Quantifying cyanobacterial phycocyanin concentration in turbid productive waters: a quasi-analytical approach

    USDA-ARS?s Scientific Manuscript database

    In this research, we present a novel technique to monitor cyanobacterial algal bloom using remote sensing measurements. We have used a multi-band quasi analytical algorithm that determines phytoplankton absorption coefficients, aF('), from above-surface remote sensing reflectance, Rrs('). In situ da...

  6. Factors promoting marine invasions: A chemoecological approach

    PubMed Central

    Mollo, Ernesto; Gavagnin, Margherita; Carbone, Marianna; Castelluccio, Francesco; Pozone, Ferdinando; Roussis, Vassilios; Templado, José; Ghiselin, Michael T.; Cimino, Guido

    2008-01-01

    The Mediterranean Sea is losing its biological distinctiveness, and the same phenomenon is occurring in other seas. This lends urgency to a better understanding of the factors that affect marine biological invasions. A chemoecological approach is proposed here to define biotic conditions that promote biological invasions in terms of enemy escape and resource opportunities. Research has focused on the secondary metabolite composition of three exotic sea slugs found in Greece that have most probably entered the Mediterranean basin by Lessepsian migration, an exchange that contributes significantly to Mediterranean biodiversity. We have found toxic compounds with significant activity as feeding deterrents both in the cephalaspidean Haminoea cyanomarginata and in the nudibranch Melibe viridis. These findings led us to propose aposematism in the former case and dietary autonomy in producing defensive metabolites in the latter, as factors predisposing to the migration. In the third mollusk investigated, the anaspidean Syphonota geographica, the topic of marine invasions has been approached through a study of its feeding biology. The identification of the same compounds from both the viscera of each individual, separately analyzed, and their food, the seagrass Halophila stipulacea, implies a dietary dependency. The survival of S. geographica in the Mediterranean seems to be related to the presence of H. stipulacea. The initial invasion of this exotic pest would seem to have paved the way for the subsequent invasion of a trophic specialist that takes advantage of niche opportunities. PMID:18337492

  7. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the…

  8. Evaluation Criteria for Micro-CAI: A Psychometric Approach

    PubMed Central

    Wallace, Douglas; Slichter, Mark; Bolwell, Christine

    1985-01-01

    The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
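
    Item reliability of the kind reported here is commonly summarized with Cronbach's alpha; a minimal sketch follows, with made-up ratings (the abstract does not give the study's data or exact reliability statistics).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# hypothetical 5-point ratings: 6 students x 4 correlated scale items
scores = np.array([[4, 5, 4, 5],
                   [2, 2, 3, 2],
                   [5, 4, 5, 4],
                   [3, 3, 3, 4],
                   [1, 2, 1, 2],
                   [4, 4, 5, 4]])
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

    Values above roughly 0.7-0.8 are usually read as "adequate subscale reliability" in this literature.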

  9. Biochemical Applications in the Analytical Chemistry Lab

    ERIC Educational Resources Information Center

    Strong, Cynthia; Ruttencutter, Jeffrey

    2004-01-01

    An HPLC and a UV-visible spectrophotometer are identified as instruments that help incorporate more biologically relevant experiments into the course, increasing students' understanding of selected biochemistry topics and enhancing their ability to apply an analytical approach to biochemical problems. The experiment teaches…

  10. The Anxiety Sensitivity Index--Revised: Confirmatory Factor Analyses, Structural Invariance in Caucasian and African American Samples, and Score Reliability and Validity

    ERIC Educational Resources Information Center

    Arnau, Randolph C.; Broman-Fulks, Joshua J.; Green, Bradley A.; Berman, Mitchell E.

    2009-01-01

    The most commonly used measure of anxiety sensitivity is the 36-item Anxiety Sensitivity Index--Revised (ASI-R). Exploratory factor analyses have produced several different factors structures for the ASI-R, but an acceptable fit using confirmatory factor analytic approaches has only been found for a 21-item version of the instrument. We evaluated…

  11. Human factors engineering approaches to patient identification armband design.

    PubMed

    Probst, C Adam; Wolf, Laurie; Bollini, Mara; Xiao, Yan

    2016-01-01

    The task of patient identification is performed many times each day by nurses and other members of the care team. Armbands are used for both direct verification and barcode scanning during patient identification. Armbands and information layout are critical to reducing patient identification errors and dangerous workarounds. We report the effort at two large, integrated healthcare systems that employed human factors engineering approaches to the information layout design of new patient identification armbands. The different methods used illustrate potential pathways to obtain standardized armbands across healthcare systems that incorporate human factors principles. By extension, how the designs have been adopted provides examples of how to incorporate human factors engineering into key clinical processes. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. A Learning Analytics Approach to Investigating Factors Affecting EFL Students' Oral Performance in a Flipped Classroom

    ERIC Educational Resources Information Center

    Lin, Chi-Jen; Hwang, Gwo-Jen

    2018-01-01

    Flipped classrooms have been widely adopted and discussed by school teachers and researchers in the past decade. However, few studies have been conducted to formally evaluate the effectiveness of flipped classrooms in terms of improving EFL students' English oral presentation, not to mention investigating factors affecting their flipped learning…

  13. Amplitudes of doping striations: comparison of numerical calculations and analytical approaches

    NASA Astrophysics Data System (ADS)

    Jung, T.; Müller, G.

    1997-02-01

    Transient, axisymmetric numerical calculations of the heat and species transport including convection were performed for a simplified vertical gradient freeze (Bridgman) process with bottom seeding for GaAs. Periodical oscillations were superimposed onto the transient heater temperature profile. The amplitudes of the resulting oscillations of the growth rate and the dopant concentration (striations) in the growing crystals are compared with the predictions of analytical models.

  14. Application of analytical redundancy management to Shuttle crafts. [computerized simulation of microelectronic implementation

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Tabak, D.

    1979-01-01

    The study involves the bank-of-filters approach to analytical redundancy management, since this is amenable to microelectronic implementation. Attention is given to a study of the UD factorized filter to determine if it gives more accurate estimates than the standard Kalman filter when the data processing word size is reduced. It is reported that, as the word size is reduced, the effect of modeling error dominates the performance of both filters. However, the UD filter is shown to maintain a slight advantage in tracking performance. It is concluded that, because of the UD filter's stability in the serial processing mode, it remains the leading candidate for microelectronic implementation.
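
    The UD factorized filter stores the covariance P as U D Uᵀ, with U unit upper triangular and D diagonal, which keeps updates numerically stable at short word lengths. A sketch of the factorization itself (not the full filter), assuming Bierman's backward-running algorithm; the example matrix is invented.

```python
import numpy as np

def ud_factorize(P):
    """Factor a symmetric positive-definite P as U @ D @ U.T, with U unit
    upper triangular and D diagonal, working from the last column backwards."""
    P = np.array(P, float)
    n = P.shape[0]
    U = np.eye(n)
    D = np.zeros(n)
    for j in range(n - 1, -1, -1):
        D[j] = P[j, j] - np.sum(D[j + 1:] * U[j, j + 1:] ** 2)
        for i in range(j):
            U[i, j] = (P[i, j] - np.sum(D[j + 1:] * U[i, j + 1:] * U[j, j + 1:])) / D[j]
    return U, np.diag(D)

P = np.array([[4.0, 2.0, 1.0],
              [2.0, 3.0, 0.5],
              [1.0, 0.5, 2.0]])
U, D = ud_factorize(P)
print(np.allclose(U @ D @ U.T, P))  # True
```

    Because D stays diagonal, the filter never squares small numbers into a full covariance, which is why reduced word size hurts it less.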

  15. A Deep Learning Approach to on-Node Sensor Data Analytics for Mobile or Wearable Devices.

    PubMed

    Ravi, Daniele; Wong, Charence; Lo, Benny; Yang, Guang-Zhong

    2017-01-01

    The increasing popularity of wearable devices in recent years means that a diverse range of physiological and functional data can now be captured continuously for applications in sports, wellbeing, and healthcare. This wealth of information requires efficient methods of classification and analysis where deep learning is a promising technique for large-scale data analytics. While deep learning has been successful in implementations that utilize high-performance computing platforms, its use on low-power wearable devices is limited by resource constraints. In this paper, we propose a deep learning methodology, which combines features learned from inertial sensor data together with complementary information from a set of shallow features to enable accurate and real-time activity classification. The design of this combined method aims to overcome some of the limitations present in a typical deep learning framework where on-node computation is required. To optimize the proposed method for real-time on-node computation, spectral domain preprocessing is used before the data are passed onto the deep learning framework. The classification accuracy of our proposed deep learning approach is evaluated against state-of-the-art methods using both laboratory and real world activity datasets. Our results show the validity of the approach on different human activity datasets, outperforming other methods, including the two methods used within our combined pipeline. We also demonstrate that the computation times for the proposed method are consistent with the constraints of real-time on-node processing on smartphones and a wearable sensor platform.
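
    The spectral-domain preprocessing step can be sketched as folding an FFT magnitude spectrum into a few log-energy bins before classification; the window length, bin count, and signal below are hypothetical, not the paper's configuration.

```python
import numpy as np

def spectral_features(window, n_bins=8):
    """Fold the magnitude spectrum of one sensor window into a few
    log-energy bins, a compact input for an on-node classifier."""
    window = np.asarray(window, float)
    spec = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    bins = np.array_split(spec, n_bins)
    return np.log1p([np.sum(b ** 2) for b in bins])

# hypothetical 2 s accelerometer window at 50 Hz: 2 Hz motion plus noise
fs = 50.0
t = np.arange(int(2 * fs)) / fs
window = np.sin(2 * np.pi * 2.0 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
feats = spectral_features(window)
print(feats.round(2))
```

    Reducing each window to a handful of features before the network is what keeps the computation within on-node budgets.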

  16. An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.

    PubMed

    Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A

    2000-05-01

    The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps and the MRI scan time several-fold. The original approach utilized numerical gradient-descent fitting with the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transforms is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to a large number of detectors. The method is demonstrated on a phantom and humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.
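
    The Fourier → Hilbert → inverse-Fourier step rests on the FFT construction of the analytic signal, whose magnitude strips a slowly varying phase from a harmonic. A sketch of that construction only (not the full SMASH reconstruction), with an invented test signal:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: zero negative frequencies, double positives."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

# a spatial harmonic carrying a slowly varying (position-dependent) phase error
x = np.linspace(0, 1, 256, endpoint=False)
signal = np.cos(2 * np.pi * 8 * x + 0.8 * np.sin(2 * np.pi * x))
z = analytic_signal(signal)
envelope = np.abs(z)   # phase variation removed: envelope is ~1 everywhere
print(envelope.min().round(3), envelope.max().round(3))
```

    The phase of z, rather than being discarded as here, is what ASP corrects across coils so the composite harmonics stay pure.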

  17. The Analytical Limits of Modeling Short Diffusion Timescales

    NASA Astrophysics Data System (ADS)

    Bradshaw, R. W.; Kent, A. J.

    2016-12-01

    Chemical and isotopic zoning in minerals is widely used to constrain the timescales of magmatic processes such as magma mixing and crystal residence via diffusion modeling. Forward modeling of diffusion relies on fitting diffusion profiles to measured compositional gradients. However, an individual measurement is essentially an average composition for a segment of the gradient defined by the spatial resolution of the analysis. Thus there is the potential for the analytical spatial resolution to limit the timescales that can be determined for an element of given diffusivity, particularly where the scale of the gradient approaches that of the measurement. Here we use a probabilistic modeling approach to investigate the effect of analytical spatial resolution on estimated timescales from diffusion modeling. Our method investigates how accurately the age of a synthetic diffusion profile can be obtained by modeling an "unknown" profile derived from discrete sampling of the synthetic compositional gradient at a given spatial resolution. We also include the effects of analytical uncertainty and the position of measurements relative to the diffusion gradient. We apply this method to the spatial resolutions of common microanalytical techniques (LA-ICP-MS, SIMS, EMP, NanoSIMS). Our results confirm that for a given diffusivity, higher spatial resolution gives access to shorter timescales, and that each analytical spacing has a minimum timescale, below which it overestimates the timescale. For example, for Ba diffusion in plagioclase at 750 °C, timescales are accurate (within 20%) above 10, 100, 2,600, and 71,000 years at 0.3, 1, 5, and 25 μm spatial resolution, respectively. For Sr diffusion in plagioclase at 750 °C, timescales are accurate above 0.02, 0.2, 4, and 120 years at the same spatial resolutions. Our results highlight the importance of selecting appropriate analytical techniques to estimate accurate diffusion-based timescales.
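
    The resolution effect described here can be reproduced in a toy version of the experiment: box-average a synthetic step-diffusion profile at a coarse analytical spacing, then refit it. The diffusion model, grid, and numbers below are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy.special import erf
from scipy.optimize import curve_fit

def profile(x, Dt):
    """Concentration across an initially sharp step after diffusion time t."""
    return 0.5 * (1 + erf(x / (2 * np.sqrt(Dt))))

# "true" profile on a fine grid, then box-averaged over a wide beam
x_fine = np.linspace(-50, 50, 5001)   # positions in um
Dt_true = 1.0                         # D*t in um^2 (a short diffusion time)
c_fine = profile(x_fine, Dt_true)

spacing = 5.0                         # analytical spacing / beam size in um
centers = np.arange(-47.5, 48, spacing)
c_meas = np.array([c_fine[np.abs(x_fine - c) <= spacing / 2].mean() for c in centers])

Dt_fit, _ = curve_fit(profile, centers, c_meas, p0=[1.0], bounds=(1e-6, 100))
print(f"true Dt = {Dt_true}, fitted Dt = {Dt_fit[0]:.1f}")  # beam averaging inflates Dt
```

    When the true diffusion length is smaller than the beam, the fit returns roughly the beam-limited floor rather than the true timescale, which is the overestimation the abstract reports.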

  18. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    PubMed

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages, such as providing electronic services and facilitating tasks. Assessment of service-providing systems is therefore a way to improve the quality of such systems, including e-commerce, e-government, e-banking, and e-learning. This study was aimed at evaluating the electronic services of the Isfahan University of Medical Sciences website in order to propose solutions to improve them, and at ranking those solutions based on the factors that enhance the quality of electronic services, using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of the electronic services. The assessment of propositions was based on the Aqual model, and they were prioritized using the AHP approach. AHP was chosen because it directly incorporates experts' judgments into the model and leads to more objective results in analyzing and prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions, employing non-parametric tests and the AHP approach with Expert Choice software. The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and detail of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve quality were collected, and the best solutions were selected using the Expert Choice software. According to the results, the solution "controlling and improving the process for handling user complaints" is of the utmost importance, and the authorities should implement it on the website and place great importance on keeping this process up to date.
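
    The AHP prioritization step reduces to extracting the principal eigenvector of a pairwise comparison matrix; a sketch follows, with an invented 3x3 comparison matrix (the study's actual expert judgments are not published in the abstract).

```python
import numpy as np

def ahp_priorities(A):
    """Priority weights from a pairwise comparison matrix via the principal
    eigenvector, plus Saaty's consistency ratio (random index for small n)."""
    A = np.asarray(A, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    ci = (vals[k].real - n) / (n - 1)            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    cr = ci / ri if ri else 0.0
    return w, cr

# hypothetical pairwise comparisons of three improvement solutions
# (complaint handling vs. information detail vs. design attractiveness)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities(A)
print(w.round(3), f"CR = {cr:.3f}")   # CR < 0.1 means acceptably consistent
```

    Tools such as Expert Choice automate exactly this computation over a hierarchy of criteria rather than a single matrix.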

  19. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, which can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882

  20. Analytical approaches to the determination of spin-dependent parton distribution functions at NNLO approximation

    NASA Astrophysics Data System (ADS)

    Salajegheh, Maral; Nejad, S. Mohammad Moosavi; Khanpour, Hamzeh; Tehrani, S. Atashbar

    2018-05-01

    In this paper, we present the SMKA18 analysis, a first attempt to extract the set of next-to-next-to-leading-order (NNLO) spin-dependent parton distribution functions (spin-dependent PDFs) and their uncertainties, determined through the Laplace transform technique and the Jacobi polynomial approach. Using the Laplace transformation, we present an analytical solution for the spin-dependent Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution equations at NNLO approximation. The results are extracted using a wide range of proton g1p(x, Q2), neutron g1n(x, Q2), and deuteron g1d(x, Q2) spin-dependent structure function data sets, including the most recent high-precision measurements from COMPASS16 experiments at CERN, which are playing an increasingly important role in global spin-dependent fits. Uncertainties have been carefully estimated using standard Hessian error propagation. We compare our results with the available spin-dependent inclusive deep inelastic scattering data sets and with other results for the spin-dependent PDFs in the literature. The results obtained for the spin-dependent PDFs as well as the spin-dependent structure functions are presented at both small and large values of x.
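
    The Laplace-transform technique named above can be indicated schematically. For a single (nonsinglet, leading-order) splitting function, transforming in y = ln(1/x) turns the DGLAP convolution into an ordinary product, so the Q² evolution has a closed form; this is an illustrative simplification of the NNLO coupled-equation case treated in the paper.

```latex
% Schematic: with y = ln(1/x), the DGLAP convolution
%   \partial g_1 / \partial \ln Q^2 = (\alpha_s/2\pi) [P \otimes g_1]
% becomes multiplicative under a Laplace transform in y:
\frac{\partial \hat g_1(s,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}\,\hat P(s)\,\hat g_1(s,Q^2)
\quad\Longrightarrow\quad
\hat g_1(s,Q^2) = \hat g_1(s,Q_0^2)\,
  \exp\!\left[\int_{\ln Q_0^2}^{\ln Q^2}
  \frac{\alpha_s(\tau)}{2\pi}\,\hat P(s)\,\mathrm{d}\tau\right]
```

    The analytical solution in s-space is then inverted numerically (or via the Jacobi polynomial expansion) to recover g1(x, Q²).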

  1. Factors associated with the income distribution of full-time physicians: a quantile regression approach.

    PubMed

    Shih, Ya-Chen Tina; Konrad, Thomas R

    2007-10-01

    Physician income is generally high, but quite variable; hence, physicians have divergent perspectives regarding health policy initiatives and market reforms that could affect their incomes. We investigated factors underlying the distribution of income within the physician population. Full-time physicians (N=10,777) from the restricted version of the 1996-1997 Community Tracking Study Physician Survey (CTS-PS), 1996 Area Resource File, and 1996 health maintenance organization penetration data. We conducted separate analyses for primary care physicians (PCPs) and specialists. We employed least squares and quantile regression models to examine factors associated with physician incomes at the mean and at various points of the income distribution, respectively. We accounted for the complex survey design of the CTS-PS data using appropriate weighted procedures and explored endogeneity using an instrumental variables method. We detected widespread and subtle effects of many variables on physician incomes at different points (10th, 25th, 75th, and 90th percentiles) in the distribution that were undetected when employing regression estimations focusing on only the means or medians. Our findings show that the effects of managed care penetration are demonstrable at the mean of specialist incomes, but are more pronounced at higher levels. Conversely, a gender gap in earnings occurs at all levels of income of both PCPs and specialists, but is more pronounced at lower income levels. The quantile regression technique offers an analytical tool to evaluate policy effects beyond the means. A longitudinal application of this approach may enable health policy makers to identify winners and losers among segments of the physician workforce and assess how market dynamics and health policy initiatives affect the overall physician income distribution over various time intervals.
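
    Quantile regression minimizes the asymmetric "pinball" (check) loss instead of squared error, so different quantiles can have different slopes. A self-contained sketch with synthetic income-like data (the CTS-PS data are restricted), fitting the 10th- and 90th-percentile slopes:

```python
import numpy as np
from scipy.optimize import minimize

def quantile_regression(x, y, q):
    """Fit y ~ a + b*x at quantile q by minimizing the pinball (check) loss."""
    def loss(beta):
        r = y - (beta[0] + beta[1] * x)
        return np.sum(np.where(r >= 0, q * r, (q - 1) * r))
    return minimize(loss, x0=[y.mean(), 0.0], method="Nelder-Mead").x

# synthetic "incomes": the spread (not just the mean) grows with experience,
# so the 90th-percentile slope exceeds the 10th-percentile slope
rng = np.random.default_rng(0)
exper = rng.uniform(0, 30, 800)
income = 100 + 3 * exper + (1 + 0.5 * exper) * rng.normal(size=exper.size)
b10 = quantile_regression(exper, income, 0.10)
b90 = quantile_regression(exper, income, 0.90)
print(f"slope at q=0.10: {b10[1]:.2f}, at q=0.90: {b90[1]:.2f}")
```

    A least-squares fit of the same data would report only the single mean slope, hiding exactly the tail differences the study exploits.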

  2. Application of person-centered analytic methodology in longitudinal research: exemplars from the Women's Health Initiative Clinical Trial data.

    PubMed

    Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea

    2014-02-01

    Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.
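
    A crude person-centered grouping can be sketched by clustering whole trajectories rather than pooling observations. The plain k-means below is a stand-in for the latent-trajectory models actually used in such studies, and the fatigue trajectories are simulated, not WHI data.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means over whole trajectories: each row is one person."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # keep a center unchanged if its cluster happens to be empty
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

# hypothetical energy/fatigue scores at 4 visits: stable-high, stable-low,
# and declining trajectory classes, 40 women each, with noise
rng = np.random.default_rng(1)
base = np.vstack([np.tile(t, (40, 1)) for t in
                  ([80, 80, 79, 80], [40, 41, 40, 39], [75, 65, 55, 45])])
scores = base + rng.normal(0, 3, base.shape)
labels, centers = kmeans(scores, k=3)
print(np.sort(centers[:, 0]).round(0))   # initial-visit means of the classes
```

    Unlike variable-centered models, the recovered groups can then be used as predictors of outcomes such as hospitalization, as in the article.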

  3. Weighting of the data and analytical approaches may account for differences in overcoming the inadequate representativeness of the respondents to the third wave of a cohort study.

    PubMed

    Taylor, Anne W; Dal Grande, Eleonora; Grant, Janet; Appleton, Sarah; Gill, Tiffany K; Shi, Zumin; Adams, Robert J

    2013-04-01

    Attrition in cohort studies can cause the data to be nonreflective of the original population. Although of little concern if intragroup comparisons are being made or cause and effect assessed, the assessment of bias was undertaken in this study so that intergroup or descriptive analyses could be undertaken. The North West Adelaide Health Study is a chronic disease and risk factor cohort study undertaken in Adelaide, South Australia. In the original wave (1999), clinical and self-report data were collected from 4,056 adults. In the third wave (2008-2010), 2,710 adults were still actively involved. Comparisons were made against two other data sources: Australian Bureau of Statistics Estimated Residential Population and a regular conducted chronic disease and risk factor surveillance system. Comparisons of demographics (age, sex, area, education, work status, and income) proved to be statistically significantly different. In addition, smoking status, body mass index, and general health status were statistically significant from the comparison group. No statistically significant differences were found for alcohol risk. Although the third wave of this cohort study is not representative of the broader population on the variables assessed, weighting of the data and analytical approaches can account for differences. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Using Learning Analytics for Preserving Academic Integrity

    ERIC Educational Resources Information Center

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…

  5. A Systemic Approach to Implementing a Protective Factors Framework

    ERIC Educational Resources Information Center

    Parsons, Beverly; Jessup, Patricia; Moore, Marah

    2014-01-01

    The leadership team of the national Quality Improvement Center on Early Childhood ventured into the frontiers of deep change in social systems by funding four research projects. The purpose of the research projects was to learn about implementing a protective factors approach with the goal of reducing the likelihood of child abuse and neglect. In…

  6. Deuteron electromagnetic form factors with the light-front approach

    NASA Astrophysics Data System (ADS)

    Sun, Bao-dong; Dong, Yu-bing

    2017-01-01

    The electromagnetic form factors and low-energy observables of the deuteron are studied with the help of the light-front approach, where the deuteron is regarded as a weakly bound state of a proton and a neutron. Both the S- and D-wave interaction vertices among the deuteron, proton, and neutron are taken into account. Moreover, regularization functions are also introduced. In our calculations, the vertex and regularization functions are employed to simulate the momentum distribution inside the deuteron. Our numerical results show that the light-front approach can roughly reproduce the deuteron electromagnetic form factors, namely the charge G0, magnetic G1, and quadrupole G2 form factors, in the low Q2 region. The important effect of the D-wave vertex on G2 is also addressed. Supported by the National Natural Science Foundation of China (10975146, 11475192). Support from the Sino-German CRC 110 “Symmetries and the Emergence of Structure in QCD” project is also appreciated. YBD thanks FAPESP grant 2011/11973-4 for funding his visit to ICTP-SAIFR.

  7. The role of analytical science in natural resource decision making

    NASA Astrophysics Data System (ADS)

    Miller, Alan

    1993-09-01

    There is a continuing debate about the proper role of analytical (positivist) science in natural resource decision making. Two diametrically opposed views are evident, arguing for and against a more extended role for scientific information. The debate takes on a different complexion if one recognizes that certain kinds of problem, referred to here as “wicked” or “trans-science” problems, may not be amenable to the analytical process. Indeed, the mistaken application of analytical methods to trans-science problems may not only be a waste of time and money but also serve to hinder policy development. Since many environmental issues are trans-science in nature, it follows that alternatives to analytical science need to be developed. In this article, the issues involved in the debate are clarified by examining the impact of the use of analytical methods in a particular case, the spruce budworm controversy in New Brunswick. The article ends with some suggestions about a “holistic” approach to the problem.

  8. Analysis of the extracts of Isatis tinctoria by new analytical approaches of HPLC, MS and NMR.

    PubMed

    Zhou, Jue; Qu, Fan

    2011-01-01

    The methods of extraction, separation and analysis of alkaloids and indole glucosinolates (GLs) of Isatis tinctoria were reviewed. Different analytical approaches such as High-Pressure Liquid Chromatography (HPLC), Liquid Chromatography with Electrospray Ionization Mass Spectrometry (LC/ESI/MS), Electrospray Ionization Time-Of-Flight Mass Spectrometry (ESI-TOF-MS), and Nuclear Magnetic Resonance (NMR) were used to validate and identify these constituents. These methods provide rapid separation, identification and quantitative measurement of the alkaloids and GLs of Isatis tinctoria. By connecting different detectors to HPLC, such as PDA, ELSD, and ESI- and APCI-MS in positive and negative ion modes, complicated compounds could be detected with at least two independent detection modes. The molecular formula can be derived in a second step from ESI-TOF-MS data. But for some constituents, UV and MS cannot provide sufficient structural identification. After peak purification by semi-preparative HPLC, NMR can be used as a complementary method.

  9. Analytical approaches for the characterization and quantification of nanoparticles in food and beverages.

    PubMed

    Mattarozzi, Monica; Suman, Michele; Cascio, Claudia; Calestani, Davide; Weigel, Stefan; Undas, Anna; Peters, Ruud

    2017-01-01

    Estimating consumer exposure to nanomaterials (NMs) in food products and predicting their toxicological properties are necessary steps in the assessment of the risks of this technology. To this end, analytical methods have to be available to detect, characterize and quantify NMs in food and materials related to food, e.g. food packaging and biological samples following metabolization of food. The challenge for the analytical sciences is that the characterization of NMs requires chemical as well as physical information. This article offers a comprehensive analysis of methods available for the detection and characterization of NMs in food and related products. Special attention was paid to the crucial role of sample preparation methods since these have been partially neglected in the scientific literature so far. The currently available instrumental methods are grouped as fractionation, counting and ensemble methods, and their advantages and limitations are discussed. We conclude that much progress has been made over the last 5 years but that many challenges still exist. Future perspectives and priority research needs are pointed out. Graphical abstract: two possible analytical strategies for the sizing and quantification of nanoparticles: asymmetric flow field-flow fractionation with multiple detectors (allows the determination of true size and a mass-based particle size distribution), and single-particle inductively coupled plasma mass spectrometry (allows the determination of a spherical-equivalent diameter of the particle and a number-based particle size distribution).

  10. Analytic hierarchy process-based approach for selecting a Pareto-optimal solution of a multi-objective, multi-site supply-chain planning problem

    NASA Astrophysics Data System (ADS)

    Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi

    2017-07-01

    The current manufacturing environment has changed from the traditional single plant to a multi-site supply chain in which multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
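    As a rough illustration of the selection step, the sketch below computes AHP priority weights from a pairwise-comparison matrix (using the geometric-mean approximation to the principal eigenvector) and uses them to rank a handful of hypothetical Pareto-optimal plans. The criteria, judgments, and scores are invented for illustration and are not taken from the article.

```python
# Sketch: AHP priority weights from a pairwise-comparison matrix, then a
# weighted ranking of hypothetical Pareto-optimal plans. All numbers are
# illustrative, not the article's data.
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector via the geometric-mean method."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Decision maker's pairwise judgments for cost, quality, satisfaction
# on Saaty's 1-9 scale (hypothetical: cost dominates).
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)

# Normalized scores of three Pareto-optimal plans on each criterion
# (higher is better; the cost criterion is already inverted).
plans = {"A": [0.9, 0.5, 0.4], "B": [0.6, 0.8, 0.7], "C": [0.3, 0.9, 0.9]}
ranking = sorted(plans, key=lambda p: -sum(w * s for w, s in zip(weights, plans[p])))
print(weights, ranking)  # plan "A" wins because cost carries most weight
```

    A full AHP application would also compute the consistency ratio of the judgment matrix and reject matrices whose ratio exceeds roughly 0.1.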

  11. An integrated analytical approach for characterizing an organic residue from an archaeological glass bottle recovered in Pompeii (Naples, Italy).

    PubMed

    Ribechini, Erika; Modugno, Francesca; Baraldi, Cecilia; Baraldi, Pietro; Colombini, Maria Perla

    2008-01-15

    Within the framework of an Italian research project aimed at studying organic residues found in archaeological objects from the Roman period, the chemical composition of the contents of several glass vessels recovered from archaeological sites from the Vesuvian area (Naples, Italy) was investigated. In particular, this paper deals with the study of an organic material found in a glass bottle from the archaeological site of Pompeii using a multi-analytical approach, including FT-IR, direct exposure mass spectrometry (DE-MS) and GC-MS techniques. The overall results suggest the occurrence of a lipid material of vegetable origin. The hypothesis that the native lipid material had been subjected to a chemical transformation procedure before being used is presented and discussed.

  12. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  13. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio, and has made progress within single channels while excluding text. Visual analytics has focused on user interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is combining multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  14. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

    A new era is dawning in which vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
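    Since the authors use Map-Reduce as their running illustration of data analytics, a minimal single-process sketch of the paradigm may help fix the idea: records are mapped to key-value pairs, the pairs are shuffled into groups by key, and each group is reduced to a result. Real deployments distribute each phase across cloud nodes; the toy records below are invented for illustration.

```python
# Sketch: the Map-Reduce paradigm collapsed into one process
# (map -> shuffle -> reduce) as a word count over toy records.
from collections import defaultdict

def map_phase(record):
    # Emit a (key, 1) pair for every word in one input record.
    return [(word, 1) for word in record.lower().split()]

def shuffle(pairs):
    # Group intermediate pairs by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Combine all values observed for one key.
    return key, sum(values)

records = ["big data analytics", "cloud data analytics", "data"]
intermediate = [pair for rec in records for pair in map_phase(rec)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)  # -> {'big': 1, 'data': 3, 'analytics': 2, 'cloud': 1}
```

    The performance questions the paper studies (data placement, resource allocation, migration) arise precisely because the map, shuffle, and reduce phases above run on different machines against data spread across the cloud.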

  15. Analytic Strategies of Streaming Data for eHealth.

    PubMed

    Yoon, Sunmoo

    2016-01-01

    New analytic strategies for streaming big data from wearable devices and social media are emerging in eHealth. Finding meaningful patterns in big data is challenging because researchers have difficulty processing large volumes of streaming data with traditional processing applications. This introductory 180-minute tutorial offers hands-on instruction on analytics (e.g., topic modeling, social network analysis) of streaming data. The tutorial aims to provide practical strategies for reducing dimensionality, using examples of big data, and will highlight strategies for incorporating domain experts and a comprehensive approach to streaming social media data.
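    One concrete dimensionality-reduction strategy of the kind such a tutorial might cover is the hashing trick: each incoming message is folded into a fixed-size count vector, so memory does not grow with the stream's vocabulary. The sketch below is a generic illustration, not the tutorial's actual material; the vector dimension and example messages are arbitrary.

```python
# Sketch: the "hashing trick" for streaming text. Each message becomes a
# fixed-size vector regardless of how many distinct words the stream
# eventually contains, so per-message memory stays O(DIM).
import hashlib

DIM = 16  # fixed vector size, independent of vocabulary growth

def hash_vector(text, dim=DIM):
    vec = [0] * dim
    for token in text.lower().split():
        # Deterministic hash of the token, mapped into one of `dim` buckets.
        digest = hashlib.md5(token.encode()).digest()
        vec[int.from_bytes(digest[:4], "big") % dim] += 1
    return vec

# Process a stream one message at a time (messages are invented examples).
stream = ["new analytics for ehealth", "streaming data analytics", "ehealth data"]
vectors = [hash_vector(msg) for msg in stream]
print(vectors[0])
```

    Hash collisions merge unrelated words into one bucket; in practice the dimension is chosen large enough (e.g. 2^18 in text classification) that collisions have little effect on downstream models.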

  16. Towards an analytical framework for tailoring supercontinuum generation.

    PubMed

    Castelló-Lurbe, David; Vermeulen, Nathalie; Silvestre, Enrique

    2016-11-14

    A fully analytical toolbox for supercontinuum generation relying on scenarios without pulse splitting is presented. Furthermore, starting from the new insights provided by this formalism about the physical nature of direct and cascaded dispersive wave emission, a unified description of this radiation in both normal and anomalous dispersion regimes is derived. Previously unidentified physics of broadband spectra reported in earlier works is successfully explained on this basis. Finally, a foundry-compatible few-millimeters-long silicon waveguide allowing octave-spanning supercontinuum generation pumped at telecom wavelengths in the normal dispersion regime is designed, hence showcasing the potential of this new analytical approach.

  17. Current projects in Pre-analytics: where to go?

    PubMed

    Sapino, Anna; Annaratone, Laura; Marchiò, Caterina

    2015-01-01

    The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into common European databases. At the European level, standardisation of pre-analytical steps is just beginning, and issues regarding bio-specimen collection and management are still debated. A joint public-private project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy, with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.

  18. A contrarian view of the five-factor approach to personality description.

    PubMed

    Block, J

    1995-03-01

    The 5-factor approach (FFA) to personality description has been represented as a comprehensive and compelling rubric for assessment. In this article, various misgivings about the FFA are delineated. The algorithmic method of factor analysis may not provide dimensions that are incisive. The "discovery" of the five factors may be influenced by unrecognized constraints on the variable sets analyzed. Lexical analyses are based on questionable conceptual and methodological assumptions, and have achieved uncertain results. The questionnaire version of the FFA has not demonstrated the special merits and sufficiencies of the five factors settled upon. Serious uncertainties have arisen in regard to the claimed 5-factor structure and the substantive meanings of the factors. Some implications of these problems are drawn.

  19. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  20. Real social analytics: A contribution towards a phenomenology of a digital world.

    PubMed

    Couldry, Nick; Fotopoulou, Aristea; Dickens, Luke

    2016-03-01

    This article argues against the assumption that agency and reflexivity disappear in an age of 'algorithmic power' (Lash 2007). Following the suggestions of Beer (2009), it proposes that, far from disappearing, new forms of agency and reflexivity around the embedding in everyday practice of not only algorithms but also analytics more broadly are emerging, as social actors continue to pursue their social ends but mediated through digital interfaces: this is the consequence of many social actors now needing their digital presence, regardless of whether they want this, to be measured and counted. The article proposes 'social analytics' as a new topic for sociology: the sociological study of social actors' uses of analytics not for the sake of measurement itself (or to make profit from measurement) but in order to fulfil better their social ends through an enhancement of their digital presence. The article places social analytics in the context of earlier debates about categorization, algorithmic power, and self-presentation online, and describes in detail a case study with a UK community organization which generated the social analytics approach. The article concludes with reflections on the implications of this approach for further sociological fieldwork in a digital world. © London School of Economics and Political Science 2016.