Sample records for simple analytical framework

  1. Facile room-temperature solution-phase synthesis of a spherical covalent organic framework for high-resolution chromatographic separation.

    PubMed

    Yang, Cheng-Xiong; Liu, Chang; Cao, Yi-Meng; Yan, Xiu-Ping

    2015-08-07

    A simple and facile room-temperature solution-phase synthesis was developed to fabricate a spherical covalent organic framework with large surface area, good solvent stability and high thermostability for high-resolution chromatographic separation of diverse important industrial analytes including alkanes, cyclohexane and benzene, α-pinene and β-pinene, and alcohols with high column efficiency and good precision.

  2. Branch length estimation and divergence dating: estimates of error in Bayesian and maximum likelihood frameworks.

    PubMed

    Schwartz, Rachel S; Mueller, Rachel L

    2010-01-11

    Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. Our reanalysis of empirical data demonstrates the magnitude of effects of Bayesian branch length misestimation on divergence date estimates. Because the length of branches for empirical datasets can be estimated most reliably in an ML framework when branches are <1 substitution/site and datasets are ≥1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.
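
    This is not the paper's Bayesian/ML pipeline, but a minimal sketch, under the simple Jukes-Cantor (JC69) substitution model, of why longer branches are estimated with more error from finite alignments; all names and parameter choices below are my own.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def jc69_p_diff(branch_length):
        """Probability that a site differs across a branch under JC69."""
        return 0.75 * (1.0 - np.exp(-4.0 * branch_length / 3.0))

    def jc69_estimate(p_hat):
        """Invert JC69: branch length from the observed fraction of differing sites."""
        return -0.75 * np.log(1.0 - 4.0 * p_hat / 3.0)

    n_sites = 1000  # alignment length, matching the paper's ~1 kb guideline
    for true_bl in [0.1, 0.5, 1.0, 1.5]:
        p_hat = rng.binomial(n_sites, jc69_p_diff(true_bl), size=2000) / n_sites
        p_hat = np.clip(p_hat, None, 0.7499)  # keep the log argument positive
        est = jc69_estimate(p_hat)
        print(f"true={true_bl:.2f}  mean estimate={est.mean():.2f}  sd={est.std():.2f}")
    ```

    Running this shows the spread of estimates growing sharply beyond ~1 substitution/site, consistent with the caution expressed in the abstract.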

  3. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is commonly performed using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs have benefited from the on-demand computing and storage, scalability, and low maintenance cost offered by the cloud. While biologists and bioinformaticians can execute an analytical job on their personal workstations, it remains challenging to execute the job seamlessly on cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimal effort, and how both the resources and the data required by a job can be managed. An open-source, lightweight framework for executing R scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing cloud resources, data, and job execution. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  4. The Economics and Psychology of Personality Traits

    ERIC Educational Resources Information Center

    Borghans, Lex; Duckworth, Angela Lee; Heckman, James J.; ter Weel, Bas

    2008-01-01

    This paper explores the interface between personality psychology and economics. We examine the predictive power of personality and the stability of personality traits over the life cycle. We develop simple analytical frameworks for interpreting the evidence in personality psychology and suggest promising avenues for future research. The paper…

  5. Student Vulnerability, Agency, and Learning Analytics: An Exploration

    ERIC Educational Resources Information Center

    Prinsloo, Paul; Slade, Sharon

    2016-01-01

    In light of increasing concerns about surveillance, higher education institutions (HEIs) cannot afford a simple paternalistic approach to student data. Very few HEIs have regulatory frameworks in place and/or share information with students regarding the scope of data that may be collected, analyzed, used, and shared. It is clear from literature…

  6. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  7. Multidisciplinary optimization in aircraft design using analytic technology models

    NASA Technical Reports Server (NTRS)

    Malone, Brett; Mason, W. H.

    1991-01-01

    An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle, and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects using much more computationally intensive representations of each technology. To illustrate the approach, an examination of the optimization of a short takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.

  8. A novel analytical description of periodic volume coil geometries in MRI

    NASA Astrophysics Data System (ADS)

    Koh, D.; Felder, J.; Shah, N. J.

    2018-03-01

    MRI volume coils can be represented by equivalent lumped element circuits, and for a variety of these circuit configurations analytical design equations have been presented. The unification of several volume coil topologies results in a two-dimensional gridded equivalent lumped element circuit which comprises the birdcage resonator and its multiple-endring derivatives, but also novel structures like the capacitively coupled ring resonator. The theory section analyzes a general two-dimensional circuit by noting that its current distribution can be decomposed into a longitudinal and an azimuthal dependency. This can be exploited to compare the current distribution with the transfer function of a filter circuit along one direction. The resonances of the transfer function coincide with the resonances of the volume resonator, and the simple analytical solution can be used as a design equation. The proposed framework is verified experimentally against a novel capacitively coupled ring structure which was derived from the general circuit formulation and is proven to exhibit a dominant homogeneous mode. In conclusion, a unified analytical framework is presented that allows determining the resonance frequency of any volume resonator that can be represented by a two-dimensional meshed equivalent circuit.

  9. A simple framework for relating variations in runoff to variations in climatic conditions and catchment properties

    NASA Astrophysics Data System (ADS)

    Roderick, Michael L.; Farquhar, Graham D.

    2011-12-01

    We use the Budyko framework to calculate catchment-scale evapotranspiration (E) and runoff (Q) as a function of two climatic factors, precipitation (P) and evaporative demand (Eo = 0.75 times the pan evaporation rate), and a third parameter that encodes the catchment properties (n) and modifies how P is partitioned between E and Q. This simple theory accurately predicted the long-term evapotranspiration (E) and runoff (Q) for the Murray-Darling Basin (MDB) in southeast Australia. We extend the theory by developing a simple and novel analytical expression for the effects on E and Q of small perturbations in P, Eo, and n. The theory predicts that a 10% change in P, with all else constant, would result in a 26% change in Q in the MDB. Future climate scenarios (2070-2099) derived using Intergovernmental Panel on Climate Change AR4 climate model output highlight the diversity of projections for P (±30%) with a correspondingly large range in projections for Q (±80%) in the MDB. We conclude with a qualitative description about the impact of changes in catchment properties on water availability and focus on the interaction between vegetation change, increasing atmospheric [CO2], and fire frequency. We conclude that the modern version of the Budyko framework is a useful tool for making simple and transparent estimates of changes in water availability.
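
    As a worked illustration of the framework's mechanics (the catchment-parameter form the authors build on is the Mezentsev-Choudhury-Yang curve), the sketch below computes E, Q, and a finite-difference precipitation elasticity of runoff. The numerical values are illustrative stand-ins, not the paper's MDB calibration.

    ```python
    import numpy as np

    def budyko_E(P, Eo, n):
        """Mezentsev-Choudhury-Yang form: E = P*Eo / (P**n + Eo**n)**(1/n)."""
        return P * Eo / (P**n + Eo**n) ** (1.0 / n)

    def runoff(P, Eo, n):
        return P - budyko_E(P, Eo, n)  # long-term water balance: Q = P - E

    P, Eo, n = 480.0, 1350.0, 2.0  # mm/yr, mm/yr, catchment parameter (assumed)
    Q = runoff(P, Eo, n)
    dQ = runoff(1.1 * P, Eo, n) - Q  # response to a +10% precipitation perturbation
    print(f"Q = {Q:.0f} mm/yr; a 10% increase in P changes Q by {100 * dQ / Q:.0f}%")
    ```

    With these stand-in numbers the relative runoff response is roughly three times the relative change in P, the same kind of amplification the abstract reports for the MDB (26% for a 10% change).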

  10. Highly stable aluminum-based metal-organic frameworks as biosensing platforms for assessment of food safety.

    PubMed

    Liu, Chun-Sen; Sun, Chun-Xiao; Tian, Jia-Yue; Wang, Zhuo-Wei; Ji, Hong-Fei; Song, Ying-Pan; Zhang, Shuai; Zhang, Zhi-Hong; He, Ling-Hao; Du, Miao

    2017-05-15

    Two unique immunosensors made of aluminum-based metal-organic frameworks (MOFs), namely 515- and 516-MOFs, with 4,4',4''-nitrilotribenzoic acid (H3NTB) were successfully obtained to efficiently assess food safety. The as-prepared 515- and 516-MOFs exhibited superior thermal and physicochemical stability, high electrochemical activity, and good biocompatibility. Of the two immunosensors, 516-MOF showed the better biosensing ability toward analytes as determined by electrochemical techniques. The developed 516-MOF-based electrochemical biosensor not only demonstrated high sensitivity, with low detection limits of 0.70 and 0.40 pg mL⁻¹ toward vomitoxin and salbutamol, respectively, but also showed good selectivity in the presence of other interferences. Therefore, with the advantages of high sensitivity, good selectivity, and simple operation, this new strategy is believed to exhibit great potential for simple and convenient detection of poisonous and harmful residues in food. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Thermal bending of liquid sheets and jets

    NASA Astrophysics Data System (ADS)

    Brenner, Michael P.; Paruchuri, Srinivas

    2003-11-01

    We present an analytical model for the bending of liquid jets and sheets from temperature gradients, as recently observed by Chwalek et al. [Phys. Fluids 14, L37 (2002)]. The bending arises from a local couple caused by Marangoni forces. The dependence of the bending angle on experimental parameters is presented, in qualitative agreement with reported experiments. The methodology gives a simple framework for understanding the mechanisms for jet and sheet bending.

  12. The Emergence of Compositional Communication in a Synthetic Ethology Framework

    DTIC Science & Technology

    2005-08-12

    "Integrating Language and Cognition: A Cognitive Robotics Approach", invited contribution to IEEE Computational Intelligence Magazine. The first two... papers address the main topic of investigation of the research proposal. In particular, we have introduced a simple structured meaning-signal mapping... Cavalli-Sforza (1982) to investigate analytically the evolution of structured communication codes. Let x ∈ [0,1] be the proportion of individuals in a

  13. Probabilistic inference of ecohydrological parameters using observations from point to satellite scales

    NASA Astrophysics Data System (ADS)

    Bassiouni, Maoya; Higgins, Chad W.; Still, Christopher J.; Good, Stephen P.

    2018-06-01

    Vegetation controls on soil moisture dynamics are challenging to measure and translate into scale- and site-specific ecohydrological parameters for simple soil water balance models. We hypothesize that empirical probability density functions (pdfs) of relative soil moisture or soil saturation encode sufficient information to determine these ecohydrological parameters. Further, these parameters can be estimated through inverse modeling of the analytical equation for soil saturation pdfs, derived from the commonly used stochastic soil water balance framework. We developed a generalizable Bayesian inference framework to estimate ecohydrological parameters consistent with empirical soil saturation pdfs derived from observations at point, footprint, and satellite scales. We applied the inference method to four sites with different land cover and climate assuming (i) an annual rainfall pattern and (ii) a wet season rainfall pattern with a dry season of negligible rainfall. The Nash-Sutcliffe efficiencies of the analytical model's fit to soil observations ranged from 0.89 to 0.99. The coefficient of variation of posterior parameter distributions ranged from < 1 to 15 %. The parameter identifiability was not significantly improved in the more complex seasonal model; however, small differences in parameter values indicate that the annual model may have absorbed dry season dynamics. Parameter estimates were most constrained for scales and locations at which soil water dynamics are more sensitive to the fitted ecohydrological parameters of interest. In these cases, model inversion converged more slowly but ultimately provided better goodness of fit and lower uncertainty. Results were robust using as few as 100 daily observations randomly sampled from the full records, demonstrating the advantage of analyzing soil saturation pdfs instead of time series to estimate ecohydrological parameters from sparse records. Our work combines modeling and empirical approaches in ecohydrology and provides a simple framework to obtain scale- and site-specific analytical descriptions of soil moisture dynamics consistent with soil moisture observations.
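
    The paper inverts the analytical soil-saturation pdf from stochastic soil water balance theory, which is too long to reproduce here. As a stand-in that shows the same inference pattern (fit a parametric pdf to saturation samples by Bayesian inversion), this sketch fits a two-parameter Beta pdf with a brute-force grid posterior; the Beta form and all values are my assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    s_obs = rng.beta(2.0, 5.0, size=100)  # 100 daily soil saturation values in (0, 1)

    # Grid posterior over the two Beta shape parameters, flat prior
    a_grid = np.linspace(0.5, 6.0, 120)
    b_grid = np.linspace(0.5, 10.0, 120)
    loglik = np.array([[stats.beta.logpdf(s_obs, a, b).sum() for b in b_grid]
                       for a in a_grid])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
    print(f"posterior means: a={(A*post).sum():.2f}, b={(B*post).sum():.2f} (truth: 2, 5)")
    ```

    Note how 100 observations already pin down both shape parameters, echoing the abstract's point that roughly 100 daily samples suffice when fitting saturation pdfs rather than time series.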

  14. Firing patterns in the adaptive exponential integrate-and-fire model.

    PubMed

    Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram

    2008-11-01

    For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to recordings of real cortical neurons under step-current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
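
    The two model equations (Brette and Gerstner's adaptive exponential integrate-and-fire) are compact enough to integrate directly. Below is a minimal forward-Euler simulation under step-current stimulation; the parameter values are a tonic-spiking set quoted from memory and should be treated as indicative rather than authoritative.

    ```python
    import numpy as np

    # AdEx parameters (an indicative tonic-spiking set, SI units)
    C, gL, EL = 200e-12, 10e-9, -70e-3          # capacitance, leak, rest
    VT, DT = -50e-3, 2e-3                        # threshold, slope factor
    a, tau_w, b, Vr = 2e-9, 30e-3, 0.0, -58e-3   # adaptation and reset
    I, V_cut = 500e-12, 0.0                      # step current, spike cutoff

    dt, T = 1e-5, 0.4
    V, w, spikes = EL, 0.0, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_cut:              # spike detected: apply the model's reset rule
            spikes.append(step * dt)
            V, w = Vr, w + b
    print(f"{len(spikes)} spikes in {T * 1e3:.0f} ms")
    ```

    Changing a, b, tau_w and Vr moves the model between the firing patterns catalogued in the paper (tonic, adapting, bursting), which is the versatility the abstract emphasizes.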

  15. Iterative categorization (IC): a systematic technique for analysing qualitative data

    PubMed Central

    2016-01-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  16. A Simple Demonstration of Concrete Structural Health Monitoring Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University’s Laboratory for Systems Integrity and Reliability is used to research an effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation data, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.

  17. Metal-Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform.

    PubMed

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T; Ohodnicki, Paul R

    2018-02-23

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal-organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  18. Tidally induced residual current over the Malin Sea continental slope

    NASA Astrophysics Data System (ADS)

    Stashchuk, Nataliya; Vlasenko, Vasiliy; Hosegood, Phil; Nimmo-Smith, W. Alex M.

    2017-05-01

    Tidally induced residual currents generated over shelf-slope topography are investigated analytically and numerically using the Massachusetts Institute of Technology general circulation model. Observational support for the presence of such a slope current was recorded over the Malin Sea continental slope during the 88th cruise of the RRS James Cook in July 2013. A simple analytical formula developed here in the framework of time-averaged shallow water equations has been validated against a fully nonlinear nonhydrostatic numerical solution. A good agreement between analytical and numerical solutions is found for a wide range of input parameters of the tidal flow and bottom topography. In application to the Malin Shelf area, both the numerical model and the analytical solution predicted a northward moving current confined to the slope, with its core located above the 400 m isobath and with vertically averaged maximum velocities up to 8 cm s⁻¹, which is consistent with the in-situ data recorded at three moorings and along cross-slope transects.

  19. Reducing the barriers against analytical epidemiological studies in investigations of local foodborne disease outbreaks in Germany - a starter kit for local health authorities.

    PubMed

    Werber, D; Bernard, H

    2014-02-27

    Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point source FBDO in Germany.

  20. A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes

    PubMed Central

    Ma, Xin; Shen, Jianping

    2017-01-01

    The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094

  1. Tsunami and acoustic-gravity waves in water of constant depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendin, Gali; Stiassnie, Michael

    2013-08-15

    A study of wave radiation by a rather general bottom displacement, in a compressible ocean of otherwise constant depth, is carried out within the framework of a three-dimensional linear theory. Simple analytic expressions for the flow field, at large distance from the disturbance, are derived. Realistic numerical examples indicate that the Acoustic-Gravity waves, which significantly precede the Tsunami, are expected to leave a measurable signature on bottom-pressure records that should be considered for early detection of Tsunami.

  2. Surface-enhanced chiroptical spectroscopy with superchiral surface waves.

    PubMed

    Pellegrini, Giovanni; Finazzi, Marco; Celebrano, Michele; Duò, Lamberto; Biagioni, Paolo

    2018-07-01

    We study the chiroptical properties of one-dimensional photonic crystals supporting superchiral surface waves by introducing a simple formalism based on the Fresnel reflection matrix. We show that the proposed framework provides useful insights on the behavior of all the relevant chiroptical quantities, allowing for a deeper understanding of surface-enhanced chiral sensing platforms based on one-dimensional photonic crystals. Finally, we analyze and discuss the limitations of such platforms as the surface concentration of the target chiral analytes is gradually increased. © 2018 Wiley Periodicals, Inc.

  3. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how approaches to correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community about which approaches are technically and scientifically feasible.

  4. Controlling the light shift of the CPT resonance by modulation technique

    NASA Astrophysics Data System (ADS)

    Tsygankov, E. A.; Petropavlovsky, S. V.; Vaskovskaya, M. I.; Zibrov, S. A.; Velichansky, V. L.; Yakovlev, V. P.

    2017-12-01

    Motivated by recent developments in atomic frequency standards employing the effect of coherent population trapping (CPT), we propose a theoretical framework for the frequency modulation spectroscopy of CPT resonances. Under realistic assumptions we provide simple yet non-trivial analytical formulae for the major spectroscopic signals, such as the CPT resonance line and the in-phase/quadrature responses. We discuss the influence of the light shift and, in particular, derive a simple expression for the displacement of the resonance as a function of modulation index. The performance of the model is checked against numerical simulations; the agreement is good to perfect. The obtained results can be used in more general models accounting for light absorption in a thick optical medium.

  5. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    PubMed

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  6. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  7. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE PAGES

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.; ...

    2018-01-18

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  8. Metal-organic framework based in-syringe solid-phase extraction for the on-site sampling of polycyclic aromatic hydrocarbons from environmental water samples.

    PubMed

    Zhang, Xiaoqiong; Wang, Peiyi; Han, Qiang; Li, Hengzhen; Wang, Tong; Ding, Mingyu

    2018-04-01

    In-syringe solid-phase extraction is a promising sample pretreatment method for the on-site sampling of water samples because of its outstanding advantages of portability, simple operation, short extraction time, and low cost. In this work, a novel in-syringe solid-phase extraction device using metal-organic frameworks as the adsorbent was fabricated for the on-site sampling of polycyclic aromatic hydrocarbons from environmental waters. Trace polycyclic aromatic hydrocarbons were effectively extracted with the self-made device, followed by gas chromatography with mass spectrometry analysis. Owing to the excellent adsorption performance of metal-organic frameworks, the analytes could be completely adsorbed during one adsorption cycle, thus effectively shortening the extraction time. Moreover, the adsorbed analytes could remain stable on the device for at least 7 days, revealing the potential of the self-made device for on-site sampling of degradable compounds in remote regions. The limit of detection ranged from 0.20 to 1.9 ng/L under the optimum conditions. Satisfactory recoveries varying from 84.4 to 104.5% and relative standard deviations below 9.7% were obtained in real sample analysis. The results of this study promote the application of metal-organic frameworks in sample preparation and demonstrate the great potential of in-syringe solid-phase extraction for the on-site sampling of trace contaminants in environmental waters. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Modulational estimate for the maximal Lyapunov exponent in Fermi-Pasta-Ulam chains

    NASA Astrophysics Data System (ADS)

    Dauxois, Thierry; Ruffo, Stefano; Torcini, Alessandro

    1997-12-01

    In the framework of the Fermi-Pasta-Ulam (FPU) model, we show a simple method to give an accurate analytical estimation of the maximal Lyapunov exponent at high energy density. The method is based on the computation of the mean value of the modulational instability growth rates associated with unstable modes. Moreover, we show that the strong stochasticity threshold found in the β-FPU system is closely related to a transition in tangent space, the Lyapunov eigenvector being more localized in space at high energy.

  10. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses

    PubMed Central

    Hernán, Miguel A.; Sauer, Brian C.; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-01-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research. PMID:27237061

  11. Streamflow variability and optimal capacity of run-of-river hydropower plants

    NASA Astrophysics Data System (ADS)

    Basso, S.; Botter, G.

    2012-10-01

    The identification of the capacity of a run-of-river plant which allows for the optimal utilization of the available water resources is a challenging task, mainly because of the inherent temporal variability of river flows. This paper proposes an analytical framework to describe the energy production and the economic profitability of small run-of-river power plants on the basis of the underlying streamflow regime. We provide analytical expressions for the capacity which maximizes the produced energy as a function of the underlying flow duration curve and of minimum environmental flow requirements downstream of the plant intake. Similar analytical expressions are derived for the capacity which maximizes the economic return deriving from construction and operation of a new plant. The analytical approach is applied to a minihydro plant recently proposed in a small Alpine catchment in northeastern Italy, evidencing the potential of the method as a flexible and simple design tool for practical application. The analytical model provides useful insight on the major hydrologic and economic controls (e.g., streamflow variability, energy price, costs) on the optimal plant capacity and helps in identifying policy strategies to reduce the current gap between the economic and energy optimizations of run-of-river plants.
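
    The authors derive closed-form expressions; as a numerical stand-in for the same trade-off, the sketch below scans candidate design discharges against a synthetic flow record, with head, efficiency, environmental flow and a turbine cut-in threshold all assumed by me.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    q = rng.lognormal(mean=0.5, sigma=1.0, size=3650)  # 10 yr of daily flows, m3/s

    rho_g, head, eff, q_env = 9810.0, 50.0, 0.85, 0.5  # N/m3, m, -, m3/s (assumed)
    usable = np.clip(q - q_env, 0.0, None)             # reserve the environmental flow

    capacities = np.linspace(0.5, 15.0, 60)            # candidate design discharges
    energy = []
    for c in capacities:
        cut_in = 0.2 * c                               # turbines idle below ~20% of c
        prod = np.where(usable >= cut_in, np.minimum(usable, c), 0.0)
        energy.append((prod * rho_g * head * eff).sum() * 86400 / 3.6e9)  # MWh
    best = capacities[int(np.argmax(energy))]
    print(f"energy-maximizing design discharge ≈ {best:.1f} m3/s")
    ```

    The optimum is interior precisely because of streamflow variability: larger turbines capture rare high flows but sit idle more often, which is the tension the analytical framework makes explicit.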

  12. Large deviation analysis of a simple information engine

    NASA Astrophysics Data System (ADS)

    Maitland, Michael; Grosskinsky, Stefan; Harris, Rosemary J.

    2015-11-01

    Information thermodynamics provides a framework for studying the effect of feedback loops on entropy production. It has enabled the understanding of novel thermodynamic systems such as the information engine, which can be seen as a modern version of "Maxwell's Dæmon," whereby a feedback controller processes information gained by measurements in order to extract work. Here, we analyze a simple model of such an engine that uses feedback control based on measurements to obtain negative entropy production. We focus on the distribution and fluctuations of the information obtained by the feedback controller. Significantly, our model allows an analytic treatment for a two-state system with exact calculation of the large deviation rate function. These results suggest an approximate technique for larger systems, which is corroborated by simulation data.

  13. Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK

    ERIC Educational Resources Information Center

    Rienties, Bart; Boroowa, Avinash; Cross, Simon; Kubiak, Chris; Mayles, Kevin; Murphy, Sam

    2016-01-01

    There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we will work towards developing a foundation of an Analytics4Action Evaluation Framework (A4AEF) that is…

  14. Stability analysis of magnetized neutron stars - a semi-analytic approach

    NASA Astrophysics Data System (ADS)

    Herbrik, Marlene; Kokkotas, Kostas D.

    2017-04-01

    We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme on polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies whose spectrum of model systems we extend by lifting former simplifications.

  15. Conductivity of graphene in the framework of Dirac model: Interplay between nonzero mass gap and chemical potential

    NASA Astrophysics Data System (ADS)

    Klimchitskaya, G. L.; Mostepanenko, V. M.; Petrov, V. M.

    2017-12-01

    The complete theory of the electrical conductivity of graphene at arbitrary temperature is developed, taking into account the mass-gap parameter and the chemical potential. Both the in-plane and out-of-plane conductivities of graphene are expressed via the components of the polarization tensor in (2+1)-dimensional space-time, analytically continued to the real frequency axis. Simple analytic expressions for both the real and imaginary parts of the conductivity of graphene are obtained at zero and nonzero temperature. They demonstrate an interesting interplay depending on the values of the mass gap and chemical potential. In the local limit, several results obtained earlier using various approximate and phenomenological approaches are reproduced, refined, and generalized. Numerical computations of both the real and imaginary parts of the conductivity of graphene are performed to illustrate the obtained results. The analytic expressions for the conductivity of graphene obtained in this paper can serve as a guide in the comparison between different theoretical approaches and between experiment and theory.

  16. 77 FR 47767 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ... Protection, DHS/CBP-017 Analytical Framework for Intelligence (AFI) System of Records AGENCY: Privacy Office... Homeland Security/U.S. Customs and Border Protection, DHS/CBP-017 Analytical Framework for Intelligence... Analytical Framework for Intelligence (AFI) System of Records" from one or more provisions of the Privacy...

  17. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
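
    The framework itself rests on HBase and Hadoop-style MapReduce; purely as an illustration of the programming pattern it parallelizes (not the authors' code), here is a tiny pure-Python map/shuffle/reduce that averages a variable per spatial cell.

    ```python
    from collections import defaultdict
    from functools import reduce

    # Toy records: (cell_id, value) pairs standing in for gridded geoscience data
    records = [("cell-1", 280.5), ("cell-2", 281.0), ("cell-1", 279.5), ("cell-2", 283.0)]

    # Map: emit (key, (sum, count)) partials that combine associatively
    mapped = [(cell, (value, 1)) for cell, value in records]

    # Shuffle: group partials by key (what the MapReduce runtime does for you)
    groups = defaultdict(list)
    for key, partial in mapped:
        groups[key].append(partial)

    # Reduce: merge partials per key, then finalize each sum/count into a mean
    combine = lambda x, y: (x[0] + y[0], x[1] + y[1])
    means = {}
    for key, parts in groups.items():
        total, count = reduce(combine, parts)
        means[key] = total / count
    print(means)  # {'cell-1': 280.0, 'cell-2': 282.0}
    ```

    Because the map output combines associatively, the same computation distributes across machines unchanged, which is what makes the pattern attractive for multi-dimensional geoscience arrays.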

  18. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  19. Evaluation of generalized degrees of freedom for sparse estimation by replica method

    NASA Astrophysics Data System (ADS)

    Sakata, A.

    2016-12-01

    We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
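
    The "effective fraction of non-zero components" reading agrees with a classical result for the lasso (Zou, Hastie and Tibshirani, 2007): the number of non-zero coefficients is an unbiased estimate of the GDF. The sketch below checks that against the covariance definition df = Σ_i cov(ŷ_i, y_i)/σ² by Monte Carlo; the setup is mine, not the paper's replica calculation.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(3)
    n, p, sigma = 200, 50, 1.0
    X = rng.normal(size=(n, p))            # random Gaussian predictors, as in the paper
    beta = np.zeros(p); beta[:5] = 1.0
    mu = X @ beta

    model = Lasso(alpha=0.1)
    fits, ys = [], []
    for _ in range(400):                   # Monte Carlo over noise realizations
        y = mu + sigma * rng.normal(size=n)
        fits.append(model.fit(X, y).predict(X))
        ys.append(y)
    fits, ys = np.array(fits), np.array(ys)

    # Covariance definition of generalized degrees of freedom
    df_cov = ((fits - fits.mean(0)) * (ys - ys.mean(0))).mean(0).sum() / sigma**2
    df_nz = np.count_nonzero(model.fit(X, mu + sigma * rng.normal(size=n)).coef_)
    print(f"Monte Carlo GDF ≈ {df_cov:.1f}; non-zero coefficients = {df_nz}")
    ```

    The two numbers agree to within Monte Carlo noise, giving a concrete finite-size counterpart to the effective-fraction interpretation in the abstract.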

  20. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics

    PubMed Central

    2017-01-01

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks, with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
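
    Purely as a generic illustration of the rule-based inference step (these are not the authors' rules or parameters), the sketch below flags traffic-volume anomalies when an observation leaves a moving-average prediction band.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    volume = 100 + 10 * rng.normal(size=200)  # monitored traffic volume, toy units
    volume[150:155] += 80                      # injected anomalous burst

    window, k = 20, 3.0                        # history length and band width (rule knobs)
    for t in range(window, len(volume)):
        hist = volume[t - window:t]
        mu, sd = hist.mean(), hist.std()
        if abs(volume[t] - mu) > k * sd:       # inference rule: outside mu ± k*sd
            print(f"t={t}: volume {volume[t]:.0f} outside [{mu - k*sd:.0f}, {mu + k*sd:.0f}]")
    ```

    In the paper's terms, the moving window plays the role of the prediction stage and the threshold comparison the role of a customizable knowledge-inference rule.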

  1. A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals

    NASA Astrophysics Data System (ADS)

    Juul, K. J.; Niordson, C. F.; Nielsen, K. L.; Kysar, J. W.

    2018-03-01

    A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, indentation, etc. The proposed technique offers a simple and efficient method for handling this class of complex problems by avoiding issues related to traditional Lagrangian procedures. Moreover, the proposed technique allows for focusing the mesh in the region of interest. In the present paper, the technique is exploited to analyze the well-known wedge indentation problem of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparison with existing numerical and analytical studies. In this study, the three most common metal crystal structures are investigated, namely the face-centered cubic (FCC), body-centered cubic (BCC), and hexagonal close-packed (HCP) crystal structures, and the stress and slip rate fields around the moving contact point singularity are presented.

  2. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.

    PubMed

    Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier

    2017-10-21

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks, with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.

  3. An Analytical Framework for Studying Small-Number Effects in Catalytic Reaction Networks: A Probability Generating Function Approach to Chemical Master Equations

    PubMed Central

    Nakagawa, Masaki; Togashi, Yuichi

    2016-01-01

    Cell activities primarily depend on chemical reactions, especially those mediated by enzymes, and this has led to these activities being modeled as catalytic reaction networks. Although deterministic ordinary differential equations of concentrations (rate equations) have been widely used for modeling purposes in the field of systems biology, it has been pointed out that these catalytic reaction networks may behave in a way that is qualitatively different from such deterministic representation when the number of molecules for certain chemical species in the system is small. Apart from this, representing these phenomena by simple binary (on/off) systems that omit the quantities would also not be feasible. As recent experiments have revealed the existence of rare chemical species in cells, the importance of being able to model potential small-number phenomena is being recognized. However, most preceding studies were based on numerical simulations, and theoretical frameworks to analyze these phenomena have not been sufficiently developed. Motivated by the small-number issue, this work aimed to develop an analytical framework for the chemical master equation describing the distributional behavior of catalytic reaction networks. For simplicity, we considered networks consisting of two-body catalytic reactions. We used the probability generating function method to obtain the steady-state solutions of the chemical master equation without specifying the parameters. We obtained the time evolution equations of the first- and second-order moments of concentrations, and the steady-state analytical solution of the chemical master equation under certain conditions. These results led to the rank conservation law, the connecting state to the winner-takes-all state, and analysis of 2-molecules M-species systems. A possible interpretation of the theoretical conclusion for actual biochemical pathways is also discussed. PMID:27047384
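
    The paper treats two-body catalytic networks; the smallest master equation with a closed-form generating function is the production-degradation process, whose steady-state PGF G(z) = exp((k/γ)(z-1)) is a Poisson distribution. The sketch below (my own minimal example, not the paper's system) verifies that numerically from the truncated generator matrix.

    ```python
    import numpy as np
    from scipy import stats, linalg

    k, gamma, N = 5.0, 1.0, 60          # production rate, degradation rate, cutoff

    # Generator of the birth-death master equation on states {0, ..., N}
    Q = np.zeros((N + 1, N + 1))        # Q[i, j] = rate of jumping j -> i
    for n in range(N + 1):
        if n < N:
            Q[n + 1, n] += k            # birth: n -> n+1 at rate k
            Q[n, n] -= k
        if n > 0:
            Q[n - 1, n] += gamma * n    # death: n -> n-1 at rate gamma*n
            Q[n, n] -= gamma * n

    # Steady state: null eigenvector of Q (convention dP/dt = Q P)
    w, v = linalg.eig(Q)
    p = np.real(v[:, np.argmin(np.abs(w))])
    p /= p.sum()

    poisson = stats.poisson.pmf(np.arange(N + 1), k / gamma)  # PGF prediction
    print(f"max |numeric - Poisson| = {np.abs(p - poisson).max():.2e}")
    ```

    The PGF method scales this idea up: the master equation becomes an equation for G, and moments follow from derivatives of G at z = 1, which is how the paper obtains its moment-evolution equations.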

  4. Asymptotic theory of time-varying social networks with heterogeneous activity and tie allocation.

    PubMed

    Ubaldi, Enrico; Perra, Nicola; Karsai, Márton; Vezzani, Alessandro; Burioni, Raffaella; Vespignani, Alessandro

    2016-10-24

    The dynamics of social networks is driven by the interplay between diverse mechanisms that still challenge our theoretical and modelling efforts. Amongst them, two are known to play a central role in shaping the networks' evolution, namely the heterogeneous propensity of individuals to i) be socially active and ii) establish new social relationships with their alters. Here, we empirically characterise these two mechanisms in seven real networks describing temporal human interactions in three different settings: scientific collaborations, Twitter mentions, and mobile phone calls. We find that the individuals' social activity and their strategy in choosing ties where to allocate their social interactions can be quantitatively described and encoded in a simple stochastic network modelling framework. The Master Equation of the model can be solved in the asymptotic limit. The analytical solutions provide an explicit description of both the system dynamics and the dynamical scaling laws characterising crucial aspects of the evolution of the networks. The analytical predictions accurately match the empirical observations, thus validating the theoretical approach. Our results provide a rigorous dynamical system framework that can be extended to include other processes shaping social dynamics and to generate data-driven predictions for the asymptotic behaviour of social networks.
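
    A minimal simulation of the tie-allocation mechanism may clarify the modelling framework. The reinforcement form used below, p(new tie | n ties) = (1 + n/c)^(-β), is my assumption based on related activity-driven models, so treat the sketch as schematic rather than as the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def grow_ego_network(events, c=1.0, beta=1.0):
        """Distinct ties of one agent when each interaction goes to a new
        alter with probability (1 + n/c)**(-beta), else to an existing tie."""
        n = 0
        for _ in range(events):
            if rng.random() < (1.0 + n / c) ** (-beta):
                n += 1  # allocate this interaction to a brand-new tie
        return n

    for beta in [0.5, 1.0, 2.0]:
        ties = grow_ego_network(10_000, beta=beta)
        print(f"beta={beta}: {ties} distinct ties after 10,000 interactions")
    ```

    Stronger reinforcement (larger β) concentrates interactions on fewer alters, the kind of heterogeneous tie-allocation strategy the paper quantifies and solves asymptotically.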

  5. Asymptotic theory of time-varying social networks with heterogeneous activity and tie allocation

    NASA Astrophysics Data System (ADS)

    Ubaldi, Enrico; Perra, Nicola; Karsai, Márton; Vezzani, Alessandro; Burioni, Raffaella; Vespignani, Alessandro

    2016-10-01

    The dynamics of social networks is driven by the interplay between diverse mechanisms that still challenge our theoretical and modelling efforts. Amongst them, two are known to play a central role in shaping the networks' evolution, namely the heterogeneous propensity of individuals to i) be socially active and ii) establish new social relationships with their alters. Here, we empirically characterise these two mechanisms in seven real networks describing temporal human interactions in three different settings: scientific collaborations, Twitter mentions, and mobile phone calls. We find that the individuals' social activity and their strategy in choosing ties where to allocate their social interactions can be quantitatively described and encoded in a simple stochastic network modelling framework. The Master Equation of the model can be solved in the asymptotic limit. The analytical solutions provide an explicit description of both the system dynamics and the dynamical scaling laws characterising crucial aspects of the evolution of the networks. The analytical predictions accurately match the empirical observations, thus validating the theoretical approach. Our results provide a rigorous dynamical system framework that can be extended to include other processes shaping social dynamics and to generate data-driven predictions for the asymptotic behaviour of social networks.

  6. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  7. Principal component of explained variance: An efficient and optimal data dimension reduction framework for association studies.

    PubMed

    Turgeon, Maxime; Oualkacha, Karim; Ciampi, Antonio; Miftah, Hanane; Dehghan, Golsa; Zanke, Brent W; Benedet, Andréa L; Rosa-Neto, Pedro; Greenwood, Celia Mt; Labbe, Aurélie

    2018-05-01

    The genomics era has led to an increase in the dimensionality of data collected in the investigation of biological questions. In this context, dimension-reduction techniques can be used to summarise high-dimensional signals into low-dimensional ones, to further test for association with one or more covariates of interest. This paper revisits one such approach, previously known as principal component of heritability and renamed here as principal component of explained variance (PCEV). As its name suggests, the PCEV seeks a linear combination of outcomes in an optimal manner, by maximising the proportion of variance explained by one or several covariates of interest. By construction, this method optimises power; however, due to its computational complexity, it has unfortunately received little attention in the past. Here, we propose a general analytical PCEV framework that builds on the assets of the original method, i.e. conceptually simple and free of tuning parameters. Moreover, our framework extends the range of applications of the original procedure by providing a computationally simple strategy for high-dimensional outcomes, along with exact and asymptotic testing procedures that drastically reduce its computational cost. We investigate the merits of the PCEV using an extensive set of simulations. Furthermore, the use of the PCEV approach is illustrated using three examples taken from the fields of epigenetics and brain imaging.
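
    In the single-covariate linear-model setting, the PCEV direction reduces to the leading generalized eigenvector of the explained scatter matrix against the residual scatter matrix. A minimal sketch of that computation (data and names hypothetical; it requires more samples than outcomes so the residual scatter is positive definite):

        import numpy as np
        from scipy.linalg import eigh

        def pcev(Y, x):
            """Linear combination of the columns of Y (n x p) maximising the
            proportion of variance explained by the covariate x (length n)."""
            Yc = Y - Y.mean(axis=0)
            xc = (x - x.mean()).reshape(-1, 1)
            B = np.linalg.lstsq(xc, Yc, rcond=None)[0]
            fitted = xc @ B
            resid = Yc - fitted
            Vm, Vr = fitted.T @ fitted, resid.T @ resid   # scatter matrices
            vals, vecs = eigh(Vm, Vr)         # generalized symmetric eigenproblem
            return vecs[:, -1], vals[-1]      # leading direction and ratio

        rng = np.random.default_rng(0)
        x = rng.standard_normal(500)
        Y = rng.standard_normal((500, 10))
        Y[:, 3] += 0.8 * x                    # plant signal in one outcome
        w, ratio = pcev(Y, x)
        print("dominant loading on outcome:", int(np.argmax(np.abs(w))))   # -> 3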

  8. Analytical Studies on the Synchronization of a Network of Linearly-Coupled Simple Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Sivaganesh, G.; Arulgnanam, A.; Seethalakshmi, A. N.; Selvaraj, S.

    2018-05-01

    We present explicit generalized analytical solutions for a network of linearly-coupled simple chaotic systems. Analytical solutions are obtained for the normalized state equations of a network of linearly-coupled systems driven by a common chaotic drive system. Two-parameter bifurcation diagrams revealing the various hidden synchronization regions, such as complete, phase, and phase-lag synchronization, are identified using the analytical results. The synchronization dynamics and their stability are studied using phase portraits and the master stability function, respectively. Further, experimental results for linearly-coupled simple chaotic systems are presented to confirm the analytical results. To our knowledge, this is the first analytical report of the synchronization dynamics of a network of chaotic systems.

  9. Modeling electrokinetic flows by consistent implicit incompressible smoothed particle hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Wenxiao; Kim, Kyungjoo; Perego, Mauro

    2017-04-01

    We present an efficient implicit incompressible smoothed particle hydrodynamics (I2SPH) discretization of Navier-Stokes, Poisson-Boltzmann, and advection-diffusion equations subject to Dirichlet or Robin boundary conditions. It is applied to model various two- and three-dimensional electrokinetic flows in simple or complex geometries. The I2SPH's accuracy and convergence are examined via comparison with analytical solutions, grid-based numerical solutions, or empirical models. The new method provides a framework to explore broader applications of SPH in microfluidics and complex fluids with charged objects, such as colloids and biomolecules, in arbitrary complex geometries.

  10. Calculation of the time resolution of the J-PET tomograph using kernel density estimation

    NASA Astrophysics Data System (ADS)

    Raczyński, L.; Wiślicki, W.; Krzemień, W.; Kowalski, P.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B.; Jasińska, B.; Kamińska, D.; Korcyl, G.; Kozik, T.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Pawlik-Niedźwiecka, M.; Niedźwiecki, S.; Pałka, M.; Rudy, Z.; Rundel, O.; Sharma, N. G.; Silarski, M.; Smyrski, J.; Strzelecki, A.; Wieczorek, A.; Zgardzińska, B.; Zieliński, M.; Moskal, P.

    2017-06-01

    In this paper we estimate the time resolution of the J-PET scanner built from plastic scintillators. We incorporate the method of signal processing using the Tikhonov regularization framework and the kernel density estimation method. We obtain simple, closed-form analytical formulae for time resolution. The proposed method is validated using signals registered by means of the single detection unit of the J-PET tomograph built from a 30 cm long plastic scintillator strip. It is shown that the experimental and theoretical results obtained for the J-PET scanner equipped with vacuum tube photomultipliers are consistent.
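
    The Tikhonov-regularized recovery step in this kind of signal processing has the closed form x = (AᵀA + λI)⁻¹ Aᵀ b. A generic sketch of that step, with a toy system matrix and noise level standing in for the J-PET specifics:

        import numpy as np

        def tikhonov(A, b, lam):
            """Closed-form Tikhonov solution x = (A^T A + lam*I)^-1 A^T b."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

        # Toy usage: recover a smooth signal from noisy indirect measurements.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((100, 40))
        x_true = np.sin(np.linspace(0.0, np.pi, 40))
        b = A @ x_true + 0.1 * rng.standard_normal(100)
        x_hat = tikhonov(A, b, lam=1.0)
        print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))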

  11. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses.

    PubMed

    Hernán, Miguel A; Sauer, Brian C; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-11-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Medical ethics: four principles plus attention to scope.

    PubMed

    Gillon, R

    1994-07-16

    The "four principles plus scope" approach provides a simple, accessible, and culturally neutral approach to thinking about ethical issues in health care. The approach, developed in the United States, is based on four common, basic prima facie moral commitments--respect for autonomy, beneficence, nonmaleficence, and justice--plus concern for their scope of application. It offers a common, basic moral analytical framework and a common, basic moral language. Although they do not provide ordered rules, these principles can help doctors and other health care workers to make decisions when reflecting on moral issues that arise at work.

  13. Multiaxis sensing using metal organic frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois

    2017-01-17

    A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.

  14. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  15. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.

  16. An Analytical Time–Domain Expression for the Net Ripple Produced by Parallel Interleaved Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brian B.; Krein, Philip T.

    We apply modular arithmetic and Fourier series to analyze the superposition of N interleaved triangular waveforms with identical amplitudes and duty-ratios. Here, interleaving refers to the condition when a collection of periodic waveforms with identical periods are each uniformly phase-shifted across one period. The main result is a time-domain expression which provides an exact representation of the summed and interleaved triangular waveforms, where the peak amplitude and parameters of the time-periodic component are all specified in closed form. Analysis is general and can be used to study various applications in multi-converter systems. This model is unique not only in that it reveals a simple and intuitive expression for the net ripple, but also in that its derivation via modular arithmetic and Fourier series is distinct from prior approaches. The analytical framework is experimentally validated with a system of three parallel converters under time-varying operating conditions.
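
    The superposition itself is straightforward to reproduce numerically, which is a useful cross-check on any closed-form ripple expression; in the sketch below the amplitude is normalized to one and the duty ratio and number of converters are illustrative.

        import numpy as np

        def triangle(t, D):
            """Unit-amplitude triangle wave with period 1 and rise fraction D."""
            t = t % 1.0
            return np.where(t < D, t / D, (1.0 - t) / (1.0 - D))

        N, D = 3, 0.4                     # three converters, illustrative duty ratio
        t = np.linspace(0.0, 1.0, 10001)
        net = sum(triangle(t - k / N, D) for k in range(N))   # uniform interleaving
        print("peak-to-peak net ripple:", net.max() - net.min())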

  17. A theoretical framework for analyzing the effect of external change on tidal dynamics in estuaries

    NASA Astrophysics Data System (ADS)

    CAI, H.; Savenije, H.; Toffolon, M.

    2013-12-01

    The most densely populated areas of the world are usually located in coastal areas near estuaries. As a result, estuaries are often subject to intense human interventions, such as dredging for navigation, dam construction, and freshwater withdrawal, which in some areas have led to serious deterioration of invaluable ecosystems. Hence it is important to understand the influence of such interventions on tidal dynamics in these areas. In this study, we present one consistent theoretical framework for tidal hydrodynamics, which can be used as a rapid assessment technique that assists policy makers and managers in making considered decisions for the protection and management of the estuarine environment when assessing the effect of human interventions in estuaries. Analytical solutions to the one-dimensional St. Venant equations for the tidal hydrodynamics in convergent unbounded estuaries with negligible river discharge can be cast in the form of a set of four implicit dimensionless equations for phase lag, velocity amplitude, damping, and wave celerity, as a function of two localized parameters describing friction and convergence. This method allows for the comparison of different analytical approaches by rewriting the different solutions in the same format. In this study, classical and more recent formulations are compared, showing the differences and similarities associated with their specific simplifications. The envelope method, which is based on consideration of the dynamics at high water and low water, can be used to derive damping equations that use different friction approximations. This results in as many analytical solutions, and thereby allows one to build a consistent theoretical framework. Analysis of the asymptotic behaviour of the equations shows that an equilibrium tidal amplitude exists, reflecting the balance between friction and channel convergence. The framework is subsequently extended to take into account the effect of river discharge. Hence, the analytical solutions are applicable even in the upstream part of an estuary, where the influence of river discharge is considerable. The proposed analytical solutions are transparent and practical, allowing a quantitative and qualitative assessment of human interventions (e.g., dredging, flow reduction) on tidal dynamics. Moreover, they are rapid assessment techniques that enable users to set up a simple model and to understand the functioning of the system with a minimum of required information. The analytical model is illustrated in three large-scale estuaries significantly influenced by human activities, i.e., the Scheldt estuary in the Netherlands and the Modaomen and Yangtze estuaries in China. In these estuaries, the correspondence with observations is good, which suggests that the proposed model is a useful, realistic, and reliable instrument for quick detection of the effect of human interventions on tidal dynamics and subsequent environmental issues, such as salt intrusion.

  18. Topological determinants of self-sustained activity in a simple model of excitable dynamics on graphs

    PubMed Central

    Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C.; Hütt, Marc-Thorsten

    2017-01-01

    Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience. PMID:28186182
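
    A minimal discrete-time susceptible-excited-refractory (SER) sketch conveys the kind of model meant here; the graph, parameter values, and update-rule details below are illustrative rather than taken from the paper. The relative threshold kappa is the fraction of excited neighbours required to excite a node.

        import random

        N, p, kappa, steps = 200, 0.05, 0.25, 30
        nbrs = {i: set() for i in range(N)}        # Erdos-Renyi random graph
        for i in range(N):
            for j in range(i + 1, N):
                if random.random() < p:
                    nbrs[i].add(j); nbrs[j].add(i)

        state = {i: "S" for i in range(N)}         # S / E / R per node
        state[0] = "E"                             # single initial excitation
        counts = []
        for _ in range(steps):
            nxt = {}
            for i in range(N):
                if state[i] == "E":
                    nxt[i] = "R"                   # excited -> refractory
                elif state[i] == "R":
                    nxt[i] = "S"                   # refractory -> susceptible
                else:
                    deg = len(nbrs[i])
                    excited = sum(state[j] == "E" for j in nbrs[i])
                    nxt[i] = "E" if deg and excited / deg >= kappa else "S"
            state = nxt
            counts.append(sum(s == "E" for s in state.values()))
        print("excited nodes per step:", counts)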

  19. Topological determinants of self-sustained activity in a simple model of excitable dynamics on graphs.

    PubMed

    Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C; Hütt, Marc-Thorsten

    2017-02-10

    Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience.

  20. Topological determinants of self-sustained activity in a simple model of excitable dynamics on graphs

    NASA Astrophysics Data System (ADS)

    Fretter, Christoph; Lesne, Annick; Hilgetag, Claus C.; Hütt, Marc-Thorsten

    2017-02-01

    Simple models of excitable dynamics on graphs are an efficient framework for studying the interplay between network topology and dynamics. This topic is of practical relevance to diverse fields, ranging from neuroscience to engineering. Here we analyze how a single excitation propagates through a random network as a function of the excitation threshold, that is, the relative amount of activity in the neighborhood required for the excitation of a node. We observe that two sharp transitions delineate a region of sustained activity. Using analytical considerations and numerical simulation, we show that these transitions originate from the presence of barriers to propagation and the excitation of topological cycles, respectively, and can be predicted from the network topology. Our findings are interpreted in the context of network reverberations and self-sustained activity in neural systems, which is a question of long-standing interest in computational neuroscience.

  1. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    USGS Publications Warehouse

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.

  2. Mechanical model of giant photoexpansion in a chalcogenide glass and the role of photofluidity

    NASA Astrophysics Data System (ADS)

    Buisson, Manuel; Gueguen, Yann; Laniel, Romain; Cantoni, Christopher; Houizot, Patrick; Bureau, Bruno; Sangleboeuf, Jean-Christophe; Lucas, Pierre

    2017-07-01

    An analytical model is developed to describe the phenomenon of giant photoexpansion in chalcogenide glasses. The proposed micro-mechanical model is based on the description of photoexpansion as a new type of eigenstrain, i.e. a deformation analogous to thermal expansion induced without external forces. In this framework, it is the viscoelastic flow induced by photofluidity which enables the conversion of the self-equilibrated stress into giant photoexpansion. This simple approach yields good fits to experimental data and demonstrates, for the first time, that the photoinduced viscous flow actually enhances the giant photoexpansion or the giant photocontraction, as has been suggested in the literature. Moreover, it highlights that the shear relaxation time due to photofluidity controls the expansion kinetics. This model is the first step towards describing giant photoexpansion from the point of view of mechanics and it provides the framework for investigating this phenomenon via numerical simulations.

  3. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
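
    The notional product at the heart of such frameworks, risk = threat x vulnerability x consequence, aggregates naturally over a portfolio. A toy sketch of that first-order calculation (asset names and all numbers invented):

        # Per-asset annual threat likelihood, vulnerability P(success | attempt),
        # and consequence in loss units; every value here is invented.
        assets = {
            "substation":  {"threat": 0.02, "vuln": 0.6, "conseq": 100.0},
            "data_center": {"threat": 0.05, "vuln": 0.3, "conseq": 250.0},
            "warehouse":   {"threat": 0.10, "vuln": 0.8, "conseq": 10.0},
        }
        for name, a in assets.items():
            a["risk"] = a["threat"] * a["vuln"] * a["conseq"]
            print(f"{name}: expected loss {a['risk']:.2f}")
        print("portfolio risk:", sum(a["risk"] for a in assets.values()))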

  4. A classical regression framework for mediation analysis: fitting one model to estimate mediation effects.

    PubMed

    Saunders, Christina T; Blume, Jeffrey D

    2017-10-26

    Mediation analysis explores the degree to which an exposure's effect on an outcome is diverted through a mediating variable. We describe a classical regression framework for conducting mediation analyses in which estimates of causal mediation effects and their variance are obtained from the fit of a single regression model. The vector of changes in exposure pathway coefficients, which we named the essential mediation components (EMCs), is used to estimate standard causal mediation effects. Because these effects are often simple functions of the EMCs, an analytical expression for their model-based variance follows directly. Given this formula, it is instructive to revisit the performance of routinely used variance approximations (e.g., delta method and resampling methods). Requiring the fit of only one model reduces the computation time required for complex mediation analyses and permits the use of a rich suite of regression tools that are not easily implemented on a system of three equations, as would be required in the Baron-Kenny framework. Using data from the BRAIN-ICU study, we provide examples to illustrate the advantages of this framework and compare it with the existing approaches. © The Author 2017. Published by Oxford University Press.
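
    For orientation, the classical product-of-coefficients estimate of the indirect effect and its delta-method (Sobel) variance, which the single-model framework above generalizes, look like this (a textbook sketch, not the EMC estimator itself):

        import numpy as np
        import statsmodels.api as sm

        def sobel_indirect(x, m, y):
            """Indirect effect a*b with its delta-method (Sobel) variance."""
            fit_m = sm.OLS(m, sm.add_constant(x)).fit()                        # mediator model
            fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()  # outcome model
            a, b = fit_m.params[1], fit_y.params[2]
            va, vb = fit_m.bse[1] ** 2, fit_y.bse[2] ** 2
            return a * b, a * a * vb + b * b * va

        rng = np.random.default_rng(1)
        x = rng.standard_normal(500)
        m = 0.5 * x + rng.standard_normal(500)
        y = 0.7 * m + 0.2 * x + rng.standard_normal(500)
        est, var = sobel_indirect(x, m, y)
        print(f"indirect effect {est:.3f} (SE {var ** 0.5:.3f})")   # ~ 0.35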

  5. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework to undertake projects that may address some of the noted deficiencies. By drawing upon the well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments while taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the entire set of FAA investment programs.

  6. Variational method enabling simplified solutions to the linearized Boltzmann equation for oscillatory gas flows

    NASA Astrophysics Data System (ADS)

    Ladiges, Daniel R.; Sader, John E.

    2018-05-01

    Nanomechanical resonators and sensors, operated in ambient conditions, often generate low-Mach-number oscillating rarefied gas flows. Cercignani [C. Cercignani, J. Stat. Phys. 1, 297 (1969), 10.1007/BF01007482] proposed a variational principle for the linearized Boltzmann equation, which can be used to derive approximate analytical solutions of steady (time-independent) flows. Here we extend and generalize this principle to unsteady oscillatory rarefied flows and thus accommodate resonating nanomechanical devices. This includes a mathematical approach that facilitates its general use and allows for systematic improvements in accuracy. This formulation is demonstrated for two canonical flow problems: oscillatory Couette flow and Stokes' second problem. Approximate analytical formulas giving the bulk velocity and shear stress, valid for arbitrary oscillation frequency, are obtained for Couette flow. For Stokes' second problem, a simple system of ordinary differential equations is derived which may be solved to obtain the desired flow fields. Using this framework, a simple and accurate formula is provided for the shear stress at the oscillating boundary, again for arbitrary frequency, which may prove useful in application. These solutions are easily implemented on any symbolic or numerical package, such as Mathematica or MATLAB, facilitating the characterization of flows produced by nanomechanical devices and providing insight into the underlying flow physics.

  7. Modeling electrokinetic flows by consistent implicit incompressible smoothed particle hydrodynamics

    DOE PAGES

    Pan, Wenxiao; Kim, Kyungjoo; Perego, Mauro; ...

    2017-01-03

    In this paper, we present a consistent implicit incompressible smoothed particle hydrodynamics (I2SPH) discretization of Navier–Stokes, Poisson–Boltzmann, and advection–diffusion equations subject to Dirichlet or Robin boundary conditions. It is applied to model various two- and three-dimensional electrokinetic flows in simple or complex geometries. The accuracy and convergence of the consistent I2SPH are examined via comparison with analytical solutions, grid-based numerical solutions, or empirical models. Lastly, the new method provides a framework to explore broader applications of SPH in microfluidics and complex fluids with charged objects, such as colloids and biomolecules, in arbitrary complex geometries.

  8. Diffusion Dynamics and Creative Destruction in a Simple Classical Model

    PubMed Central

    2015-01-01

    The article explores the impact of the diffusion of new methods of production on output and employment growth and income distribution within a Classical one‐sector framework. Disequilibrium paths are studied analytically and in terms of simulations. Diffusion by differential growth affects aggregate dynamics through several channels. The analysis reveals the non‐steady nature of economic change and shows that the adaptation pattern depends both on the innovation's factor‐saving bias and on the extent of the bias, which determines the strength of the selection pressure on non‐innovators. The typology of different cases developed shows various aspects of Schumpeter's concept of creative destruction. PMID:27642192

  9. Penetration and Growth Rates of Mobile Phones in Developing Countries: An Analytical Classification

    PubMed Central

    2010-01-01

    This brief paper uses a simple arithmetic framework to classify and explain the performance of developing countries in closing the absolute digital divide. Four categories are created on the basis of two variables, namely, the penetration and rate of growth of mobile phones. The paper answers questions such as: Which countries do well and badly on both variables? Are the countries in these categories drawn from specific regions or similar income levels or is the distribution more random? How can similar countries from the same region appear in two diametrically opposite categories? What does this imply for policy? PMID:20835391
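
    The two-variable classification is a few lines of code; the cutoffs below are placeholders, since the paper's actual thresholds are not reproduced here.

        # Four-way classification on mobile-phone penetration and growth.
        PEN_CUT, GROWTH_CUT = 50.0, 20.0   # illustrative: per-100 penetration, % growth

        def classify(penetration, growth):
            hi_p, hi_g = penetration >= PEN_CUT, growth >= GROWTH_CUT
            return {(True, True):   "high penetration, fast growth",
                    (True, False):  "high penetration, slow growth",
                    (False, True):  "low penetration, fast growth (catching up)",
                    (False, False): "low penetration, slow growth (falling behind)"}[(hi_p, hi_g)]

        print(classify(75.0, 8.0))
        print(classify(20.0, 35.0))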

  10. Medical ethics: four principles plus attention to scope.

    PubMed Central

    Gillon, R.

    1994-01-01

    The "four principles plus scope" approach provides a simple, accessible, and culturally neutral approach to thinking about ethical issues in health care. The approach, developed in the United States, is based on four common, basic prima facie moral commitments--respect for autonomy, beneficence, nonmaleficence, and justice--plus concern for their scope of application. It offers a common, basic moral analytical framework and a common, basic moral language. Although they do not provide ordered rules, these principles can help doctors and other health care workers to make decisions when reflecting on moral issues that arise at work. PMID:8044100

  11. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries, such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
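
    The flavour of such graph queries is easy to convey with networkx; the mini-schema below (patients, providers, and visit edges) is invented for illustration and is not the 3EG transformation itself.

        import networkx as nx
        from itertools import combinations

        # Invented mini-schema: visit edges between patients and providers.
        G = nx.Graph()
        G.add_edges_from([("pat1", "drA"), ("pat1", "drB"), ("pat2", "drB"),
                          ("pat2", "drC"), ("pat3", "drA"), ("pat3", "drB")])

        # "Shared providers": provider pairs seen by at least one common patient.
        providers = {n for n in G if n.startswith("dr")}
        for a, b in combinations(sorted(providers), 2):
            common = set(G[a]) & set(G[b])      # patients adjacent to both
            if common:
                print(a, b, "share patients:", sorted(common))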

  12. The dynamics of adapting, unregulated populations and a modified fundamental theorem.

    PubMed

    O'Dwyer, James P

    2013-01-06

    A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.

  13. Basic physical processes and reduced models for plasma detachment

    NASA Astrophysics Data System (ADS)

    Stangeby, P. C.

    2018-04-01

    The divertor of a tokamak reactor will have to satisfy a number of critical constraints, the first of which is that the divertor targets not fail due to excessive heating or sputter-erosion. This paramount constraint of target survival defines the operating window for the principal plasma properties at the divertor target, the density n_t and temperature T_t. In particular, T_et < 10 eV is shown to be required. Code and experimental studies show that the pressure–momentum loss by the plasma that occurs along flux tubes in the edge, between the divertor entrance and target, (i) correlates strongly with T_et, and (ii) begins to increase as T_et falls below 10 eV, becoming very strong by 1 eV. The transition between the high-recycling regime and the detached divertor regime has therefore been defined here to occur when T_et < 10 eV. Simple analytic models are developed (i) to relate (T_t, n_t) to the controlling conditions 'upstream', e.g. at the divertor entrance, and (ii) in turn to relate (T_t, n_t) to other important divertor quantities including (a) the required level of radiative cooling in the divertor, and (b) the ion flux to the target in the presence of volumetric loss of particles, momentum, and power in the divertor. The 2 Point Model, 2PM, is a widely used analytic model for relating (T_t, n_t) to the controlling upstream conditions. The 2PM is derived here for various levels of complexity regarding the effects included. Analytic models of divertor detachment provide valuable insight and useful approximations, but more complete modeling requires the use of edge codes such as EDGE2D, SOLPS, SONIC, UEDGE, etc. Edge codes have grown to become quite sophisticated and now constitute, in effect, 'code-experiments' that, just as for actual experiments, can benefit from interpretation in terms of simple conceptual frameworks. 2 Point Model Formatting, 2PMF, of edge code output can provide such a conceptual framework. Methods of applying 2PMF are illustrated here with some examples.
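
    The standard textbook form of the 2PM (parallel heat conduction, pressure balance along the flux tube, and sheath-limited power at the target) can be iterated to a consistent target solution; the input values below are illustrative.

        import numpy as np

        e, m_i = 1.602e-19, 3.34e-27       # elementary charge, deuteron mass [kg]
        kappa0, gamma = 2000.0, 7.0        # Spitzer conduction coeff., sheath coeff.
        n_u, q_par, L = 3e19, 1e8, 50.0    # upstream density [m^-3], q_parallel [W/m^2], connection length [m]

        T_t = 20.0                         # initial target temperature guess [eV]
        for _ in range(50):
            # conduction-limited: T_u^(7/2) = T_t^(7/2) + 7 q L / (2 kappa0)
            T_u = (T_t ** 3.5 + 3.5 * q_par * L / kappa0) ** (2.0 / 7.0)
            # pressure balance 2 n_t T_t = n_u T_u combined with the sheath
            # condition q = gamma e n_t T_t c_st, with c_st = sqrt(2 e T_t / m_i)
            T_t = (m_i / (2.0 * e)) * (2.0 * q_par / (gamma * e * n_u * T_u)) ** 2
        n_t = n_u * T_u / (2.0 * T_t)
        print(f"T_u = {T_u:.1f} eV, T_t = {T_t:.1f} eV, n_t = {n_t:.2e} m^-3")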

  14. Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.

    PubMed

    Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J

    2014-12-02

    The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The following work involves superhydrophobic surfaces that have as a framework deterministic or stochastic silicon pillar arrays formed by lithographic or metal dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥ 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10^-12 M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up uses in real world applications. Finally, a concentration process involving transport and sequestration based on surface treatment selective wicking is demonstrated.

  15. Dual-domain mass-transfer parameters from electrical hysteresis: theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    USGS Publications Warehouse

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-01-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated “effective” parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.
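
    The hysteresis can be reproduced with a linear first-order exchange between mobile and less-mobile domains; in the sketch below the porosities, rate coefficient, and the linear weighting for bulk conductivity are illustrative stand-ins for the petrophysical model.

        import numpy as np

        # Linear-driving-force DDMT: d(C_lm)/dt = alpha * (C_m - C_lm)
        theta_m, theta_lm, alpha = 0.25, 0.10, 0.005   # porosities [-], rate [1/s]
        dt, nsteps = 1.0, 2000
        t = np.arange(nsteps) * dt
        C_m = np.exp(-0.5 * ((t - 500.0) / 120.0) ** 2)   # imposed mobile tracer pulse
        C_lm = np.zeros(nsteps)
        for k in range(nsteps - 1):
            C_lm[k + 1] = C_lm[k] + dt * alpha * (C_m[k] - C_lm[k])

        sigma_f = C_m                                # fluid conductivity tracks C_m
        sigma_b = theta_m * C_m + theta_lm * C_lm    # simple bulk weighting
        # Plotting sigma_b against sigma_f traces a loop: its width reflects the
        # exchange rate alpha and its limb slopes reflect the two porosities.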

  16. Using Extreme Tropical Precipitation Statistics to Constrain Future Climate States

    NASA Astrophysics Data System (ADS)

    Igel, M.; Biello, J. A.

    2017-12-01

    Tropical precipitation is characterized by a rapid growth in mean intensity as the column humidity increases. This behavior is examined in both a cloud resolving model and with high-resolution observations of precipitation and column humidity from CloudSat and AIRS, respectively. The model and the observations exhibit remarkable consistency and suggest a new paradigm for extreme precipitation. We show that the total precipitation can be decomposed into a product of contributions from a mean intensity, a probability of precipitation, and a global PDF of column humidity values. We use the modeling and observational results to suggest simple, analytic forms for each of these functions. The analytic representations are then used to construct a simple expression for the global accumulated precipitation as a function of the parameters of each of the component functions. As the climate warms, extreme precipitation intensity and global precipitation are expected to increase, though at different rates. When these predictions are incorporated into the new analytic expression for total precipitation, predictions for changes due to global warming to the probability of precipitation and the PDF of column humidity can be made. We show that strong constraints can be imposed on the future shape of the PDF of column humidity but that only weak constraints can be set on the probability of precipitation. These are largely imposed by the intensification of extreme precipitation. This result suggests that understanding precisely how extreme precipitation responds to climate warming is critical to predicting other impactful properties of global hydrology. The new framework can also be used to confirm and discount existing theories for shifting precipitation.

  17. Dual-domain mass-transfer parameters from electrical hysteresis: Theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    NASA Astrophysics Data System (ADS)

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-10-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated "effective" parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.

  18. Navigating a Mobile Robot Across Terrain Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Howard, Ayanna; Bon, Bruce

    2003-01-01

    A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
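
    A couple of IF-THEN rules with triangular memberships convey the flavour of such a controller; the membership functions, rule consequents, and defuzzification below are invented for illustration.

        def tri(x, a, b, c):
            """Triangular membership rising from a, peaking at b, falling to c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def steer(obstacle_dist, roughness):
            """Weighted-average defuzzification of two illustrative rules."""
            near = tri(obstacle_dist, 0.0, 0.0, 5.0)   # IF obstacle NEAR THEN turn HARD (1.0)
            rough = tri(roughness, 0.3, 1.0, 1.7)      # IF terrain ROUGH THEN turn MODERATELY (0.5)
            w = near + rough
            return (near * 1.0 + rough * 0.5) / w if w else 0.0

        print(steer(obstacle_dist=2.0, roughness=0.6))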

  19. 77 FR 33683 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security, U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...

  20. Analytical Overview of the European and Russian Qualifications Frameworks with a Focus on Doctoral Degree Level

    ERIC Educational Resources Information Center

    Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena

    2017-01-01

    The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…

  1. Degrees of School Democracy: A Holistic Framework

    ERIC Educational Resources Information Center

    Woods, Philip A.; Woods, Glenys J.

    2012-01-01

    This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…

  2. MetaKTSP: a meta-analytic top scoring pair method for robust cross-study validation of omics prediction analysis.

    PubMed

    Kim, SungHwan; Lin, Chien-Wei; Tseng, George C

    2016-07-01

    Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based algorithm to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation of a single expression profile, performance is usually greatly reduced in cross-study validation (i.e. the prediction model is established in the training study and applied to an independent test study) for all machine learning methods, including TSP. The failure of cross-study validation has largely diminished the potential translational and clinical values of the models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, by averaging TSP scores or by combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods in simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The result showed superior performance of cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases robustness and accuracy of the classification model that will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package MetaKTSP is available online (http://tsenglab.biostat.pitt.edu/software.htm). ctseng@pitt.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
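
    The core TSP score is simple: for each gene pair, the difference between classes in the frequency of one within-sample ordering. A sketch of that score (the meta-analytic extension would average such scores, or combine their P-values, across studies):

        import numpy as np
        from itertools import combinations

        def tsp_scores(X, y):
            """X: samples-by-genes matrix; y: binary labels (0/1).
            Score(i, j) = |P(X_i < X_j | y=0) - P(X_i < X_j | y=1)|."""
            scores = {}
            for i, j in combinations(range(X.shape[1]), 2):
                p0 = np.mean(X[y == 0, i] < X[y == 0, j])
                p1 = np.mean(X[y == 1, i] < X[y == 1, j])
                scores[(i, j)] = abs(p0 - p1)
            return scores

        rng = np.random.default_rng(2)
        X = rng.standard_normal((60, 5))
        y = np.repeat([0, 1], 30)
        X[y == 1, 0] += 2.0                  # rank-shift gene 0 in class 1
        scores = tsp_scores(X, y)
        best = max(scores, key=scores.get)
        print("top scoring pair:", best, "score:", round(scores[best], 2))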

  3. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  4. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
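
    For reference, the continuation at issue is the replica identity (written here in standard form):

        \langle \ln Z \rangle
          = \lim_{n \to 0} \frac{\langle Z^n \rangle - 1}{n}
          = \lim_{n \to 0} \frac{\partial}{\partial n} \ln \langle Z^n \rangle ,

    where ⟨Z^n⟩ is evaluated for integer n and then analytically continued to real n near zero; the validity of that continuation on solvable models is precisely what the paper examines.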

  5. The stationary sine-Gordon equation on metric graphs: Exact analytical solutions for simple topologies

    NASA Astrophysics Data System (ADS)

    Sabirov, K.; Rakhmanov, S.; Matrasulov, D.; Susanto, H.

    2018-04-01

    We consider the stationary sine-Gordon equation on metric graphs with simple topologies. Exact analytical solutions are obtained for different vertex boundary conditions. It is shown that the method can be extended for tree and other simple graph topologies. Applications of the obtained results to branched planar Josephson junctions and Josephson junctions with tricrystal boundaries are discussed.

  6. Customization of ¹³C-MFA strategy according to cell culture system.

    PubMed

    Quek, Lake-Ee; Nielsen, Lars K

    2014-01-01

    ¹³C-MFA is far from being a simple assay for quantifying metabolic activity. It requires considerable up-front experimental planning and familiarity with the cell culture system in question, as well as optimized analytics and adequate computation frameworks. The success of a ¹³C-MFA experiment is ultimately rated by the ability to accurately quantify the flux of one or more reactions of interest. In this chapter, we describe the different ¹³C-MFA strategies that have been developed for the various fermentation or cell culture systems, as well as the limitations of the respective strategies. The strategies are affected by many factors, and the ¹³C-MFA modeling and experimental strategy must be tailored to conditions. The prevailing philosophy in the computation process is that any metabolic processes that produce significant systematic bias in the labeling pattern of the metabolites being measured must be described in the model. It is equally important to plan a labeling strategy by analytical screening or by heuristics.

  7. Susceptibility of a Magnetic Impurity in Weakly Localized Regime

    NASA Astrophysics Data System (ADS)

    Suga, Seiichiro; Kasai, Hideaki; Okiji, Ayao

    1987-12-01

    Interplay between the randomness and the s-d exchange interaction is investigated theoretically in the weakly localized regime through the temperature dependence of the susceptibility. In the first half, the analytic calculations are performed perturbatively in terms of the s-d exchange coupling constant. It is shown that the quantum corrections to the susceptibility form a geometric series and can be summed up as simple formulae within the framework of the most divergent approximation. In the second half, the numerical calculations are performed with the use of the self-consistent ladder approximation. It is shown that the effective Curie constant decreases more rapidly with decreasing temperature than that in the usual Kondo systems.

  8. Soft porous silicone rubbers with ultra-low sound speeds in acoustic metamaterials

    PubMed Central

    Ba, Abdoulaye; Kovalenko, Artem; Aristégui, Christophe; Mondain-Monval, Olivier; Brunet, Thomas

    2017-01-01

    Soft porous silicone rubbers are demonstrated to exhibit extremely low sound speeds of tens of m/s for these dense materials, even for low porosities of the order of a few percent. Our ultrasonic experiments show a sudden drop of the longitudinal sound speed with the porosity, while the transverse sound speed remains constant. For such porous elastomeric materials, we propose simple analytical expressions for these two sound speeds, derived in the framework of Kuster and Toksöz, revealing an excellent agreement between the theoretical predictions and the experimental results for both longitudinal and shear waves. Acoustic attenuation measurements also complete the characterization of these soft porous materials. PMID:28054661

  9. Quantum dressing orbits on compact groups

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Šťovíček, Pavel

    1993-02-01

    The quantum double is shown to imply the dressing transformation on quantum compact groups and the quantum Iwasawa decomposition in the general case. Quantum dressing orbits are described explicitly as *-algebras. The dual coalgebras consisting of differential operators are related to the quantum Weyl elements. Besides, the differential geometry on a quantum leaf allows a remarkably simple construction of irreducible *-representations of the algebras of quantum functions. Representation spaces then consist of analytic functions on classical phase spaces. These representations are also interpreted in the framework of quantization in the spirit of Berezin applied to symplectic leaves on classical compact groups. Convenient “coherent states” are introduced and a correspondence between classical and quantum observables is given.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yazawa, Kazuaki; Shakouri, Ali

    The energy conversion efficiency of today’s thermoelectric generators is significantly lower than that of conventional mechanical engines. Almost all of the existing research is focused on materials to improve the conversion efficiency. Here we propose a general framework to study the cost-efficiency trade-off for thermoelectric power generation. A key factor is the optimization of thermoelectric modules together with their heat source and heat sinks. Full electrical and thermal co-optimization yields a simple analytical expression for optimum design. Based on this model, power output per unit mass can be maximized. We show that the fractional area coverage of thermoelectric elements in a module could play a significant role in reducing the cost of power generation systems.
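
    The electrical half of such a co-optimization is the familiar load-matching calculation; the sketch below (all numbers illustrative) maximizes the power of a thermoelectric leg over the load ratio m = R_load/R_internal at a fixed temperature drop. This is only a sub-problem: the point of the paper's framework is that including the heat source and sink thermal resistances shifts the optimum, which the full co-optimization captures.

        import numpy as np

        S, R, dT = 200e-6, 0.01, 100.0    # Seebeck [V/K], internal resistance [ohm], delta-T [K]
        m = np.linspace(0.1, 5.0, 491)    # load ratio R_load / R_internal
        P = (S * dT) ** 2 * m / (R * (1.0 + m) ** 2)
        print("optimal m ~", m[np.argmax(P)], "; max power [W]:", round(P.max(), 4))
        # With fixed dT the optimum is m = 1 (maximum power transfer).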

  11. REVIEWS OF TOPICAL PROBLEMS: Axisymmetric stationary flows in compact astrophysical objects

    NASA Astrophysics Data System (ADS)

    Beskin, Vasilii S.

    1997-07-01

    A review is presented of the analytical results available for a large class of axisymmetric stationary flows in the vicinity of compact astrophysical objects. The determination of the two-dimensional structure of the poloidal magnetic field (hydrodynamic flow field) faces severe difficulties, due to the complexity of the trans-field equation for stationary axisymmetric flows. However, an approach exists which enables direct problems to be solved even within the balance law framework. This possibility arises when an exact solution to the equation is available and flows close to it are investigated. As a result, with the use of simple model problems, the basic features of supersonic flows past real compact objects are determined.

  12. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  13. The Influence of Intrinsic Framework Flexibility on Adsorption in Nanoporous Materials

    DOE PAGES

    Witman, Matthew; Ling, Sanliang; Jawahery, Sudi; ...

    2017-03-30

    For applications of metal–organic frameworks (MOFs) such as gas storage and separation, flexibility is often seen as a parameter that can tune material performance. In this work we aim to determine the optimal flexibility for the shape selective separation of similarly sized molecules (e.g., Xe/Kr mixtures). To obtain systematic insight into how the flexibility impacts this type of separation, we develop a simple analytical model that predicts a material’s Henry regime adsorption and selectivity as a function of flexibility. We elucidate the complex dependence of selectivity on a framework’s intrinsic flexibility whereby performance is either improved or reduced with increasing flexibility, depending on the material’s pore size characteristics. However, the selectivity of a material with the pore size and chemistry that already maximizes selectivity in the rigid approximation is continuously diminished with increasing flexibility, demonstrating that the globally optimal separation exists within an entirely rigid pore. Molecular simulations show that our simple model predicts performance trends that are observed when screening the adsorption behavior of flexible MOFs. These flexible simulations provide better agreement with experimental adsorption data in a high-performance material that is not captured when modeling this framework as rigid, an approximation typically made in high-throughput screening studies. We conclude that, for shape selective adsorption applications, the globally optimal material will have the optimal pore size/chemistry and minimal intrinsic flexibility even though other nonoptimal materials’ selectivity can actually be improved by flexibility. Equally important, we find that flexible simulations can be critical for correctly modeling adsorption in these types of systems.

  14. The Influence of Intrinsic Framework Flexibility on Adsorption in Nanoporous Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witman, Matthew; Ling, Sanliang; Jawahery, Sudi

    For applications of metal–organic frameworks (MOFs) such as gas storage and separation, flexibility is often seen as a parameter that can tune material performance. In this work we aim to determine the optimal flexibility for the shape selective separation of similarly sized molecules (e.g., Xe/Kr mixtures). To obtain systematic insight into how the flexibility impacts this type of separation, we develop a simple analytical model that predicts a material’s Henry regime adsorption and selectivity as a function of flexibility. We elucidate the complex dependence of selectivity on a framework’s intrinsic flexibility whereby performance is either improved or reduced with increasing flexibility, depending on the material’s pore size characteristics. However, the selectivity of a material with the pore size and chemistry that already maximizes selectivity in the rigid approximation is continuously diminished with increasing flexibility, demonstrating that the globally optimal separation exists within an entirely rigid pore. Molecular simulations show that our simple model predicts performance trends that are observed when screening the adsorption behavior of flexible MOFs. These flexible simulations provide better agreement with experimental adsorption data in a high-performance material that is not captured when modeling this framework as rigid, an approximation typically made in high-throughput screening studies. We conclude that, for shape selective adsorption applications, the globally optimal material will have the optimal pore size/chemistry and minimal intrinsic flexibility even though other nonoptimal materials’ selectivity can actually be improved by flexibility. Equally important, we find that flexible simulations can be critical for correctly modeling adsorption in these types of systems.

  15. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    PubMed

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    Antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures promoting prudent AMU in food animals. Their evaluation is crucial: the AMU/AMR relationship is complex, and a suitable framework to analyze interventions is unavailable. Systems science analysis, depicting variables and their associations, would help integrate mathematics and epidemiology to evaluate the relationship. This would identify informative data and models to evaluate interventions. This National Institute for Mathematical and Biological Synthesis AMR Working Group's report proposes a systems framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria. It could evaluate how AMU (and interventions) impact AMR. We evaluate pharmacokinetic/pharmacodynamic modeling techniques for projecting AMR selection pressure on enteric bacteria, and we study two methods: modeling phenotypic AMR changes in bacteria in the food supply, and evolutionary genotypic analyses determining the molecular changes underlying phenotypic AMR. Systems science analysis integrates the methods, showing how resistance in the food supply is explained by AMU and concurrent factors influencing the whole system. This process is updated with data and techniques to improve prediction and inform improvements for AMU/AMR surveillance. Our proposed framework reflects both the AMR system's complexity and the desire for simple, reliable conclusions.

  16. An Approximate Solution to the Equation of Motion for Large-Angle Oscillations of the Simple Pendulum with Initial Velocity

    ERIC Educational Resources Information Center

    Johannessen, Kim

    2010-01-01

    An analytic approximation of the solution to the differential equation describing the oscillations of a simple pendulum at large angles and with initial velocity is discussed. In the derivation, a sinusoidal approximation has been applied, and an analytic formula for the large-angle period of the simple pendulum is obtained, which also includes…
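
    Approximations of this kind can be benchmarked against the exact large-angle period, which for release from rest at amplitude theta0 is T = 4*sqrt(L/g)*K(sin^2(theta0/2)), with K the complete elliptic integral of the first kind. A minimal sketch of that reference calculation (release from rest only; the initial-velocity case treated in the article is not reproduced here):

```python
import numpy as np
from scipy.special import ellipk

g, L = 9.81, 1.0                          # gravity (m/s^2), pendulum length (m)
T0 = 2 * np.pi * np.sqrt(L / g)           # small-angle period

def T_exact(theta0):
    """Exact period for release from rest at amplitude theta0 (radians)."""
    m = np.sin(theta0 / 2) ** 2           # scipy's ellipk takes the parameter m = k^2
    return 4 * np.sqrt(L / g) * ellipk(m)

for deg in (10, 45, 90, 150):
    theta0 = np.radians(deg)
    print(f"theta0 = {deg:3d} deg: T/T0 = {T_exact(theta0) / T0:.4f}")
```

    At 90 degrees the exact period is already about 18% longer than the small-angle value, which is the regime such analytic approximations aim to capture.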

  17. Authentic Oral Language Production and Interaction in CALL: An Evolving Conceptual Framework for the Use of Learning Analytics within the SpeakApps Project

    ERIC Educational Resources Information Center

    Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine

    2014-01-01

    This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…

  18. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, there are limitations to this approach which cause vertex centric algorithms to under-perform due to poor compute to communication overhead ratio and slow convergence of iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.

  19. Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.

    PubMed

    Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin

    2013-09-01

    It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write the analytics and are not clear on how to make the analytics work in real time on high-velocity data. Our paper focuses on the applications necessary to a healthcare analytics scenario, specifically the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine learning-based categorizer running within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.

  20. Framework for assessing key variable dependencies in loose-abrasive grinding and polishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.S.; Aikens, D.M.; Brown, N.J.

    1995-12-01

    This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.

  1. Semi-analytical model of cross-borehole flow experiments for fractured medium characterization

    NASA Astrophysics Data System (ADS)

    Roubinet, D.; Irving, J.; Day-Lewis, F. D.

    2014-12-01

    The study of fractured rocks is extremely important in a wide variety of research fields where the fractures and faults can represent either rapid access to some resource of interest or potential pathways for the migration of contaminants in the subsurface. Identification of their presence and determination of their properties are critical and challenging tasks that have led to numerous fracture characterization methods. Among these methods, cross-borehole flowmeter analysis aims to evaluate fracture connections and hydraulic properties from vertical-flow-velocity measurements conducted in one or more observation boreholes under forced hydraulic conditions. Previous studies have demonstrated that analysis of these data can provide important information on fracture connectivity, transmissivity, and storativity. Estimating these properties requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. Quantitative analysis of cross-borehole flowmeter experiments, in particular, requires modeling formulations that: (i) can be adapted to a variety of fracture and experimental configurations; (ii) can take into account interactions between the boreholes because their radii of influence may overlap; and (iii) can be readily cast into an inversion framework that allows for not only the estimation of fracture hydraulic properties, but also an assessment of estimation error. To this end, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. Our model addresses the above needs and provides a flexible and computationally efficient semi-analytical framework having strong potential for future adaptation to more complex configurations. The proposed modeling approach is demonstrated in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as in the context of field-data analysis for fracture connectivity and estimation of corresponding hydraulic properties.

  2. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  3. RT-18: Value of Flexibility. Phase 1

    DTIC Science & Technology

    2010-09-25

    During this period, we explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory for a framework that is mathematically consistent, domain independent and applicable under varying information levels. This report presents our advances in developing such a framework.

  4. Analytical and numerical analysis of frictional damage in quasi brittle materials

    NASA Astrophysics Data System (ADS)

    Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.

    2016-07-01

    Frictional sliding and crack growth are two main dissipation processes in quasi-brittle materials. The frictional sliding along closed cracks is the origin of macroscopic plastic deformation, while crack growth induces material damage. The main difficulty of modeling is to account for the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed, but there are so far no analytical solutions, even for simple loading paths, for validating such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi-brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are investigated. The first one involves a coupled friction/damage correction scheme, which is consistent with the coupling nature of the constitutive model. The second one contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupling correction scheme is efficient in guaranteeing systematic numerical convergence.

  5. Fabricating Simple Wax Screen-Printing Paper-Based Analytical Devices to Demonstrate the Concept of Limiting Reagent in Acid- Base Reactions

    ERIC Educational Resources Information Center

    Namwong, Pithakpong; Jarujamrus, Purim; Amatatongchai, Maliwan; Chairam, Sanoe

    2018-01-01

    In this article, a low-cost, simple, and rapid fabrication of paper-based analytical devices (PADs) using a wax screen-printing method is reported here. The acid-base reaction is implemented in the simple PADs to demonstrate to students the chemistry concept of a limiting reagent. When a fixed concentration of base reacts with a gradually…

  6. Sandplay therapy with couples within the framework of analytical psychology.

    PubMed

    Albert, Susan Carol

    2015-02-01

    Sandplay therapy with couples is discussed within an analytical framework. Guidelines are proposed as a means of developing this relatively new area within sandplay therapy, and as a platform to open a wider discussion to bring together sandplay therapy and couple therapy. Examples of sand trays created during couple therapy are also presented to illustrate the transformations during the therapeutic process. © 2015, The Society of Analytical Psychology.

  7. Erratum: A Simple, Analytical Model of Collisionless Magnetic Reconnection in a Pair Plasma

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Zenitani, Seiji; Kuznetsova, Masha; Klimas, Alex

    2011-01-01

    The following describes a list of errata in our paper, "A simple, analytical model of collisionless magnetic reconnection in a pair plasma." It supersedes an earlier erratum. We recently discovered an error in the derivation of the outflow-to-inflow density ratio.

  8. The Role of Wakes in Modelling Tidal Current Turbines

    NASA Astrophysics Data System (ADS)

    Conley, Daniel; Roc, Thomas; Greaves, Deborah

    2010-05-01

    The eventual proper development of arrays of Tidal Current Turbines (TCT) will require a balance which maximizes power extraction while minimizing environmental impacts. Idealized analytical analogues and simple 2-D models are useful tools for investigating questions of a general nature but do not represent a practical tool for application to realistic cases. Some form of 3-D numerical simulation will be required for such applications, and the current project is designed to develop a numerical decision-making tool for use in planning large scale TCT projects. The project is predicated on the use of an existing regional ocean modelling framework (the Regional Ocean Modelling System - ROMS) which is modified to enable the user to account for the effects of TCTs. In such a framework, where mixing processes are highly parametrized, the fidelity of the quantitative results is critically dependent on the parameter values utilized. In light of the early stage of TCT development and the lack of field scale measurements, the calibration of such a model is problematic. In the absence of explicit calibration data sets, the device wake structure has been identified as an efficient feature for model calibration. This presentation will discuss efforts to design an appropriate calibration scheme focused on wake decay; the motivation for this approach, the techniques applied, validation results from simple test cases, and limitations will be presented.

  9. Is harm reduction profitable? An analytical framework for corporate social responsibility based on an epidemic model of addictive consumption.

    PubMed

    Massin, Sophie

    2012-06-01

    This article aims to help resolve the apparent paradox of producers of addictive goods who claim to be socially responsible while marketing a product clearly identified as harmful. It argues that reputation effects are crucial in this issue and that determining whether harm reduction practices are costly or profitable for the producers can help to assess the sincerity of their discourse. An analytical framework based on an epidemic model of addictive consumption that includes a deterrent effect of heavy use on initiation is developed. This framework enables us to establish a clear distinction between a simple responsible discourse and genuine harm reduction practices and, among harm reduction practices, between use reduction practices and micro harm reduction practices. Using simulations based on tobacco sales in France from 1950 to 2008, we explore the impact of three corresponding types of actions: communication on damage, restraining selling practices and development of safer products on total sales and on the social cost. We notably find that restraining selling practices toward light users, that is, preventing light users from escalating to heavy use, can be profitable for the producer, especially at early stages of the epidemic, but that such practices also contribute to increasing the social cost. These results suggest that the existence of a deterrent effect of heavy use on the initiation of the consumption of an addictive good can shed new light on important issues, such as the motivations for corporate social responsibility and the definition of responsible actions in the particular case of harm reduction. Copyright © 2012 Elsevier Ltd. All rights reserved.
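
    The deterrent effect of heavy use on initiation can be mimicked with a toy two-compartment flow model in which the initiation rate is suppressed by the prevalence of heavy use. The structure and all rate constants below are assumptions for illustration, not the calibrated epidemic model of the article:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k_init=0.08, k_esc=0.05, k_quitL=0.03, k_quitH=0.02, deter=4.0):
    """Toy light/heavy user dynamics with a deterrent effect on initiation."""
    L, H = y                                # light and heavy users (population fractions)
    S = 1.0 - L - H                         # non-users
    init = k_init * S / (1.0 + deter * H)   # visible heavy use deters initiation
    return [init - (k_esc + k_quitL) * L,   # light users: initiation in, escalation/quitting out
            k_esc * L - k_quitH * H]        # heavy users: escalation in, quitting out

sol = solve_ivp(rhs, (0.0, 300.0), [0.01, 0.0])
L, H = sol.y[:, -1]
print(f"steady state: light = {L:.3f}, heavy = {H:.3f}")
```

    In such a model, any action that limits escalation to heavy use also relaxes the brake on initiation, which is the feedback that makes "responsible" selling practices potentially profitable for the producer.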

  10. Analytic Frameworks for Assessing Dialogic Argumentation in Online Learning Environments

    ERIC Educational Resources Information Center

    Clark, Douglas B; Sampson, Victor; Weinberger, Armin; Erkens, Gijsbert

    2007-01-01

    Over the last decade, researchers have developed sophisticated online learning environments to support students engaging in dialogic argumentation. This review examines five categories of analytic frameworks for measuring participant interactions within these environments focusing on (1) formal argumentation structure, (2) conceptual quality, (3)…

  11. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
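
    The abstract does not give the authors' 6-parameter model, but the flavor of a simple, broadly applicable analytical range-energy model can be conveyed with the classic two-parameter Bragg-Kleeman rule R = alpha * E**p; the constants below are standard literature values for protons in water, not the paper's fitted parameters:

```python
# Bragg-Kleeman power-law rule for protons in water: a classic two-parameter
# example of a simple analytical stopping/range model (not the authors' model).
alpha, p = 2.2e-3, 1.77          # cm MeV^-p and dimensionless, literature values

def csda_range_cm(E_MeV):
    return alpha * E_MeV ** p

def stopping_MeV_per_cm(E_MeV):
    # S = -dE/dx = (dR/dE)^-1 for the power-law rule
    return E_MeV ** (1.0 - p) / (alpha * p)

for E in (10, 50, 100, 250):
    print(f"{E:4d} MeV: R = {csda_range_cm(E):6.2f} cm, "
          f"S = {stopping_MeV_per_cm(E):5.2f} MeV/cm")
```

    Even this two-parameter rule reproduces proton ranges in water to within a few percent over the therapeutic interval, which illustrates why compact analytical models can compete with numerical integration when speed and memory matter.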

  12. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  13. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  14. Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics

    ERIC Educational Resources Information Center

    Duncan, Ravit Golan; Reiser, Brian J.

    2007-01-01

    In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…

  15. The Illness Narratives of Health Managers: Developing an Analytical Framework

    ERIC Educational Resources Information Center

    Exworthy, Mark

    2011-01-01

    This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…

  16. A Data Protection Framework for Learning Analytics

    ERIC Educational Resources Information Center

    Cormack, Andrew

    2016-01-01

    Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…

  17. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  18. Spiral and never-settling patterns in active systems

    NASA Astrophysics Data System (ADS)

    Yang, X.; Marenduzzo, D.; Marchetti, M. C.

    2014-01-01

    We present a combined numerical and analytical study of pattern formation in an active system where particles align, possess a density-dependent motility, and are subject to a logistic reaction. The model can describe suspensions of reproducing bacteria, as well as polymerizing actomyosin gels in vitro or in vivo. In the disordered phase, we find that motility suppression and growth compete to yield stable or blinking patterns, which, when dense enough, acquire internal orientational ordering to give asters or spirals. We predict these may be observed within chemotactic aggregates in bacterial fluids. In the ordered phase, the reaction term leads to previously unobserved never-settling patterns which can provide a simple framework to understand the formation of motile and spiral patterns in intracellular actin systems.

  19. Cost-efficiency trade-off and the design of thermoelectric power generators.

    PubMed

    Yazawa, Kazuaki; Shakouri, Ali

    2011-09-01

    The energy conversion efficiency of today's thermoelectric generators is significantly lower than that of conventional mechanical engines. Almost all of the existing research is focused on materials to improve the conversion efficiency. Here we propose a general framework to study the cost-efficiency trade-off for thermoelectric power generation. A key factor is the optimization of thermoelectric modules together with their heat source and heat sinks. Full electrical and thermal co-optimization yields a simple analytical expression for optimum design. Based on this model, power output per unit mass can be maximized. We show that the fractional area coverage of thermoelectric elements in a module could play a significant role in reducing the cost of power generation systems.
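
    A crude numerical illustration of the co-optimization idea: model the module plus heat sinks as a thermal divider, sweep the leg length (which sets the module's thermal resistance) and the electrical load ratio, and observe that maximum power lands near thermal impedance matching. The sketch ignores Peltier-heat feedback and uses assumed material values, so it is a caricature of the paper's analytical optimum, not a reproduction:

```python
import numpy as np

# Illustrative constants (not from the paper): Bi2Te3-like couple, 1 cm^2 leg area.
S, rho_e, k = 400e-6, 1e-5, 1.5   # Seebeck (V/K), resistivity (ohm m), conductivity (W/m K)
A = 1e-4                          # leg cross-section, m^2
psi_ext = 5.0                     # combined hot+cold heat-sink resistance, K/W
Ts, Ta = 500.0, 300.0             # heat source / ambient temperatures, K

def power(L, m):
    """Output power for leg length L and load ratio m = R_load / R_internal,
    ignoring Peltier-heat feedback (a crude but transparent simplification)."""
    theta_m = L / (k * A)                           # module thermal resistance, K/W
    R = rho_e * L / A                               # internal electrical resistance, ohm
    dT = (Ts - Ta) * theta_m / (theta_m + psi_ext)  # thermal voltage divider
    return (S * dT) ** 2 * m / (R * (1.0 + m) ** 2)

lengths = np.linspace(1e-4, 5e-3, 500)
P = [max(power(L, m) for m in np.linspace(0.5, 3.0, 51)) for L in lengths]
L_opt = lengths[np.argmax(P)]
print(f"optimal leg length ~ {L_opt * 1e3:.2f} mm; "
      f"theta_m = {L_opt / (k * A):.1f} K/W vs psi_ext = {psi_ext} K/W")
```

    In this simplified setting the optimum falls where the module's thermal resistance matches the external heat-sink resistance, the thermal analogue of electrical load matching.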

  20. Efficient and precise calculation of the b-matrix elements in diffusion-weighted imaging pulse sequences.

    PubMed

    Zubkov, Mikhail; Stait-Gardner, Timothy; Price, William S

    2014-06-01

    Precise NMR diffusion measurements require detailed knowledge of the cumulative dephasing effect caused by the numerous gradient pulses present in most NMR pulse sequences. This effect, which ultimately manifests itself as the diffusion-related NMR signal attenuation, is usually described by the b-value or, in the case of multidirectional diffusion weighting, by the b-matrix, the latter being common in diffusion-weighted NMR imaging. Neglecting some of the gradient pulses introduces an error in the calculated diffusion coefficient reaching in some cases 100% of the expected value. Therefore, ensuring that the b-matrix calculation includes all the known gradient pulses leads to significant error reduction. Calculation of the b-matrix for simple gradient waveforms is rather straightforward, yet it grows cumbersome when complexly shaped and/or numerous gradient pulses are introduced. Making three broad assumptions about the gradient pulse arrangement in a sequence results in an efficient framework for calculation of b-matrices, as well as providing some insight into optimal gradient pulse placement. The framework allows accounting for the diffusion-sensitizing effect of complexly shaped gradient waveforms with modest computational time and power. This is achieved by using the b-matrix elements of the simple unmodified pulse sequence and minimizing the integration of the complexly shaped gradient waveform in the modified sequence. Such re-evaluation of the b-matrix elements retains all the analytical relevance of the straightforward approach, yet at least halves the amount of symbolic integration required. The application of the framework is demonstrated with the evaluation of the expression describing the diffusion-sensitizing effect caused by different bipolar gradient pulse modules. Copyright © 2014 Elsevier Inc. All rights reserved.
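
    For orientation, the textbook special case that any such framework must reproduce is the Stejskal-Tanner result for an ideal rectangular PGSE pair, together with its rank-1 b-matrix for a single gradient direction. A minimal sketch (the paper's general treatment of complexly shaped and bipolar waveforms is not reproduced here):

```python
import numpy as np

gamma = 2.675e8  # proton gyromagnetic ratio, rad s^-1 T^-1

def b_value_pgse(G, delta, Delta):
    """Stejskal-Tanner b-value (s/m^2) for a rectangular PGSE pair:
    gradient amplitude G (T/m), pulse width delta (s), separation Delta (s)."""
    return (gamma * G * delta) ** 2 * (Delta - delta / 3)

def b_matrix_pgse(g_dir, G, delta, Delta):
    """Rank-1 b-matrix for a single diffusion gradient direction g_dir."""
    g = np.asarray(g_dir, float)
    g /= np.linalg.norm(g)
    return b_value_pgse(G, delta, Delta) * np.outer(g, g)

b = b_value_pgse(G=0.3, delta=5e-3, Delta=20e-3)
print(f"b = {b * 1e-6:.0f} s/mm^2")                     # convert s/m^2 -> s/mm^2
print(b_matrix_pgse([1, 1, 0], 0.3, 5e-3, 20e-3) * 1e-6)
```

    Real sequences add imaging and crusher gradients on top of this pair, which is exactly the cross-term bookkeeping that the paper's framework automates.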

  1. Development of a point-kinetic verification scheme for nuclear reactor applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demazière, C., E-mail: demaz@chalmers.se; Dykin, V.; Jareteg, K.

    In this paper, a new method that can be used for checking the proper implementation of time- or frequency-dependent neutron transport models and for verifying their ability to recover some basic reactor physics properties is proposed. This method makes use of the application of a stationary perturbation to the system at a given frequency and extraction of the point-kinetic component of the system response. Even for strongly heterogeneous systems for which an analytical solution does not exist, the point-kinetic component follows, as a function of frequency, a simple analytical form. The comparison between the extracted point-kinetic component and its expected analytical form provides an opportunity to verify and validate neutron transport solvers. The proposed method is tested on two diffusion-based codes, one working in the time domain and the other working in the frequency domain. As long as the applied perturbation has a non-zero reactivity effect, it is demonstrated that the method can be successfully applied to verify and validate time- or frequency-dependent neutron transport solvers. Although the method is demonstrated in the present paper in a diffusion theory framework, higher order neutron transport methods could be verified based on the same principles.
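
    In the simplest setting, the "simple analytical form" that the extracted point-kinetic component should follow is the zero-power point-kinetics transfer function. A one-effective-delayed-group sketch with typical thermal-reactor parameters (assumed values, not the paper's):

```python
import numpy as np

# Zero-power point-kinetics transfer function G0(omega), one effective
# delayed-neutron group; parameter values are typical, not from the paper.
Lambda = 2e-5             # prompt-neutron generation time, s
beta, lam = 0.0065, 0.08  # delayed-neutron fraction and decay constant (1/s)

def G0(omega):
    s = 1j * omega
    return 1.0 / (s * (Lambda + beta / (s + lam)))

for f in (0.01, 0.1, 1.0, 10.0, 100.0):
    w = 2 * np.pi * f
    g = G0(w)
    print(f"f = {f:7.2f} Hz: |G0| = {abs(g):10.2f}, "
          f"phase = {np.degrees(np.angle(g)):7.1f} deg")
```

    The characteristic plateau between the delayed and prompt corner frequencies is the frequency-domain fingerprint the extracted point-kinetic component is checked against.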

  2. Effect of risk perception on epidemic spreading in temporal networks

    NASA Astrophysics Data System (ADS)

    Moinet, Antoine; Pastor-Satorras, Romualdo; Barrat, Alain

    2018-01-01

    Much progress in the understanding of epidemic spreading models has been made thanks to numerous modeling efforts and analytical and numerical studies, considering host populations with very different structures and properties, including complex and temporal interaction networks. Moreover, a number of recent studies have started to go beyond the assumption of an absence of coupling between the spread of a disease and the structure of the contacts on which it unfolds. Models including awareness of the spread have been proposed, to mimic possible precautionary measures taken by individuals that decrease their risk of infection, but have mostly considered static networks. Here, we adapt such a framework to the more realistic case of temporal networks of interactions between individuals. We study the resulting model by analytical and numerical means on both simple models of temporal networks and empirical time-resolved contact data. Analytical results show that the epidemic threshold is not affected by the awareness but that the prevalence can be significantly decreased. Numerical studies on synthetic temporal networks highlight, however, the presence of very strong finite-size effects, resulting in a significant shift of the effective epidemic threshold in the presence of risk awareness. For empirical contact networks, the awareness mechanism also leads to a shift in the effective threshold and to a strong reduction of the epidemic prevalence.
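
    A minimal simulation in this spirit: an SIS process unfolding on an activity-driven temporal network, with one simple awareness rule that reduces susceptibility in proportion to the current prevalence. The model structure and every parameter below are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps = 1000, 300
beta, mu, m = 0.15, 0.02, 3           # infection prob., recovery prob., links per active node
activity = rng.uniform(0.01, 0.2, N)  # node activation probabilities per step
alpha = 0.7                           # awareness: susceptibility reduction factor

infected = np.zeros(N, bool)
infected[rng.choice(N, 20, replace=False)] = True
prevalence = []
for _ in range(steps):
    # Rebuild the instantaneous contact network each step (activity-driven model).
    active = np.flatnonzero(rng.random(N) < activity)
    new_inf = []
    for i in active:
        for j in rng.choice(N, m, replace=False):
            for s, t in ((i, j), (j, i)):  # a contact can transmit either way
                if infected[s] and not infected[t]:
                    # Awareness scales susceptibility with the current prevalence.
                    if rng.random() < beta * (1 - alpha * infected.mean()):
                        new_inf.append(t)
    infected[new_inf] = True
    infected[rng.random(N) < mu] = False  # recoveries (SIS)
    prevalence.append(infected.mean())

print(f"stationary prevalence ~ {np.mean(prevalence[-100:]):.3f}")
```

    Rerunning with alpha = 0 gives the awareness-free baseline, so the prevalence reduction reported above can be reproduced qualitatively by comparing the two runs.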

  3. A simple method for estimating frequency response corrections for eddy covariance systems

    Treesearch

    W. J. Massman

    2000-01-01

    A simple analytical formula is developed for estimating the frequency attenuation of eddy covariance fluxes due to sensor response, path-length averaging, sensor separation, signal processing, and flux averaging periods. Although it is an approximation based on flat terrain cospectra, this analytical formula should have broader applicability than just flat-terrain...
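
    The transfer-function logic behind such corrections can be sketched in a few lines: attenuate a model cospectrum by the instrument's squared frequency response and take the ratio of integrals as the flux attenuation. The Lorentzian cospectral shape and the time constant below are placeholders, not Massman's flat-terrain cospectra:

```python
import numpy as np
from scipy.integrate import quad

def cospectrum(f, fx=0.1):
    """Simple one-parameter model cospectrum (normalized Lorentzian, scale fx in Hz)."""
    return (2.0 / np.pi) * fx / (fx ** 2 + f ** 2)

def H2(f, tau=0.3):
    """Squared first-order instrument response for time constant tau (s)."""
    return 1.0 / (1.0 + (2.0 * np.pi * f * tau) ** 2)

num, _ = quad(lambda f: H2(f) * cospectrum(f), 0.0, np.inf)
den, _ = quad(cospectrum, 0.0, np.inf)
print(f"measured/true flux ratio ~ {num / den:.3f} (correction factor = {den / num:.3f})")
```

    The analytical formula of the paper effectively evaluates this kind of ratio in closed form for a whole chain of response terms, avoiding the numerical integration shown here.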

  4. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics…) to the complex… How well these two components are orchestrated will determine the level of success an organization has in…

  5. Assessing Proposals for New Global Health Treaties: An Analytic Framework.

    PubMed

    Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio

    2015-08-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.

  6. Assessing Proposals for New Global Health Treaties: An Analytic Framework

    PubMed Central

    Røttingen, John-Arne; Frenk, Julio

    2015-01-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties. PMID:26066926

  7. Early Prediction of Reading Comprehension within the Simple View Framework

    ERIC Educational Resources Information Center

    Catts, Hugh W.; Herrera, Sarah; Nielsen, Diane Corcoran; Bridges, Mindy Sittner

    2015-01-01

    The simple view of reading proposes that reading comprehension is the product of word reading and language comprehension. In this study, we used the simple view framework to examine the early prediction of reading comprehension abilities. Using multiple measures for all constructs, we assessed word reading precursors (i.e., letter knowledge,…

  8. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
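
    A stripped-down numerical illustration of the framework's core idea, restricted to first-stage moderated mediation: estimate the moderated a path and the b path by ordinary least squares, then evaluate the conditional indirect effect a(W)*b at chosen moderator values. In practice one would add bootstrap confidence intervals; everything here is simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
X = rng.normal(size=n)                          # predictor
W = rng.normal(size=n)                          # moderator
M = 0.5 * X + 0.4 * X * W + rng.normal(size=n)  # first-stage moderated path a(W)
Y = 0.6 * M + 0.2 * X + rng.normal(size=n)      # mediator effect b plus direct effect

def ols(y, cols):
    """Least-squares coefficients with an intercept prepended."""
    Z = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(Z, y, rcond=None)[0]

a = ols(M, [X, W, X * W])   # [a0, a1 (X), a2 (W), a3 (X*W)]
b = ols(Y, [M, X])          # [b0, b1 (M), c' (X)]
for w in (-1.0, 0.0, 1.0):
    indirect = (a[1] + a[3] * w) * b[1]  # conditional indirect effect a(W) * b
    print(f"W = {w:+.0f}: indirect effect = {indirect:.3f}")
```

    With these simulated coefficients the indirect effect roughly triples from W = 0 to W = 1, which is exactly the kind of path-level moderation the framework is designed to expose.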

  9. Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention

    ERIC Educational Resources Information Center

    West, Deborah; Heath, David; Huijser, Henk

    2016-01-01

    This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…

  10. Organizational Culture and Organizational Effectiveness: A Meta-Analytic Investigation of the Competing Values Framework's Theoretical Suppositions

    ERIC Educational Resources Information Center

    Hartnell, Chad A.; Ou, Amy Yi; Kinicki, Angelo

    2011-01-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial…

  11. Unintended Revelations in History Textbooks: The Precarious Authenticity and Historical Continuity of the Slovak Nation

    ERIC Educational Resources Information Center

    Šulíková, Jana

    2016-01-01

    Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…

  12. A generalized theory of preferential linking

    NASA Astrophysics Data System (ADS)

    Hu, Haibo; Guo, Jinli; Liu, Xuan; Wang, Xiaofan

    2014-12-01

    There are diverse mechanisms driving the evolution of social networks. A key open question in understanding their evolution is: how do various preferential linking mechanisms produce networks with different features? In this paper we first empirically study preferential linking phenomena in an evolving online social network, and find and validate a linear preference. We propose an analyzable model which captures the real growth process of the network and reveals the underlying mechanism dominating its evolution. Furthermore, based on preferential linking we propose a generalized model reproducing the evolution of online social networks, and present unified analytical results describing network characteristics for 27 preference scenarios. We study the mathematical structure of degree distributions and find that within the framework of preferential linking, analytical degree distributions can only be combinations of finitely many kinds of functions, related to rational, logarithmic and inverse tangent functions, and that extremely complex network structure can emerge even for very simple sublinear preferential linking. This work not only provides a verifiable origin for the emergence of various network characteristics in social networks, but also bridges individuals' micro-level behaviors and the global organization of social networks.
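
    How the preference kernel shapes the degree distribution can be demonstrated with a tiny growth simulation: a linear kernel produces a heavy-tailed, BA-like network, while a sublinear kernel yields a far more homogeneous one. This toy is illustrative only, not the authors' generalized 27-scenario model:

```python
import numpy as np

rng = np.random.default_rng(0)

def grow(n_nodes, kernel):
    """Grow a network one node at a time; each newcomer attaches one link to an
    existing node chosen with probability proportional to kernel(degree)."""
    deg = np.ones(2, int)                 # seed: two connected nodes
    for _ in range(n_nodes - 2):
        w = kernel(deg).astype(float)
        target = rng.choice(len(deg), p=w / w.sum())
        deg[target] += 1
        deg = np.append(deg, 1)           # the newcomer arrives with degree 1
    return deg

for name, kern in [("linear (BA-like)", lambda k: k),
                   ("sublinear k^0.5", lambda k: k ** 0.5),
                   ("uniform", lambda k: np.ones_like(k))]:
    deg = grow(5000, kern)
    print(f"{name:18s} max degree = {deg.max():5d}, mean = {deg.mean():.2f}")
```

    The hub sizes differ by orders of magnitude across kernels even though the mean degree is identical, which is the structural sensitivity the analytical results above classify.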

  13. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

    In the [eV; MeV] energy range, modelling of neutron-induced reactions is based on parameterized nuclear reaction models. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Nuclear reactor physicists have called for major breakthroughs to assign proper uncertainties for use in applications. In this paper, mathematical methods developed in the CONRAD code [2] are presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure is presented using analytical or Monte Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account sufficiently early in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to properly take this kind of information into account.

  14. Vortex Core Size in the Rotor Near-Wake

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2003-01-01

    Using a kinetic energy conservation approach, a number of simple analytic expressions are derived for estimating the core size of tip vortices in the near-wake of rotors in hover and axial-flow flight. The influence of thrust, induced power losses, advance ratio, and vortex structure on rotor vortex core size is assessed. Experimental data from the literature are compared to the analytical results derived in this paper. In general, three conclusions can be drawn from the work in this paper. First, the greater the rotor thrust, the larger the vortex core size in the rotor near-wake. Second, the more efficient a rotor is with respect to induced power losses, the smaller the resulting vortex core size. Third, and lastly, vortex core size initially decreases for low axial-flow advance ratios, but for large advance ratios core size asymptotically increases to a nominal upper limit. Insights gained from this work should enable improved modeling of rotary-wing aerodynamics, as well as provide a framework for improved experimental investigations of rotor and advanced propeller wakes.

  15. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  16. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  17. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, support from NOSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computing reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates costs and consequences of poor bridge performance at span- and network-levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.

  18. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  19. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
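
    The superposition idea behind analytic element models can be shown in its simplest instance: one pumping well in uniform background flow, whose complex discharge potential directly yields the downstream stagnation point and the asymptotic capture-zone width. All values are assumptions for illustration, not WhAEM defaults:

```python
import numpy as np

# One pumping well in uniform background flow: the simplest analytic-element
# superposition (illustrative values, not WhAEM defaults).
Q = 500.0        # pumping rate, m^3/d
q0 = 1.0         # ambient discharge (specific discharge x aquifer thickness), m^2/d
zw = 0.0 + 0.0j  # well location in the complex plane

def omega(z):
    """Complex discharge potential: uniform flow in +x plus an extraction well."""
    return -q0 * z + Q / (2.0 * np.pi) * np.log(z - zw)

# Classic consequences of the superposition:
x_stag = Q / (2.0 * np.pi * q0)   # stagnation point downstream of the well
width = Q / q0                    # asymptotic capture-zone width upstream
Om = omega(100.0 + 50.0j)
print(f"stagnation point at x = {x_stag:.1f} m; capture width -> {width:.1f} m")
print(f"Phi = {Om.real:.1f}, Psi = {Om.imag:.1f} m^3/d at z = 100+50j m")
```

    Because the governing equation is linear, adding more wells or line elements is just adding more terms to omega, which is what lets packages of this kind assemble complex capture zones from simple building blocks.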

  20. Expanding Students' Analytical Frameworks through the Study of Graphic Novels

    ERIC Educational Resources Information Center

    Connors, Sean P.

    2015-01-01

    When teachers work with students to construct a metalanguage that they can draw on to describe and analyze graphic novels, and then invite students to apply that metalanguage in the service of composing multimodal texts of their own, teachers broaden students' analytical frameworks. In the process of doing so, teachers empower students. In this…

  1. PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…

  2. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  3. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    NASA Astrophysics Data System (ADS)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular constant-stress elastic elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  4. Computation of rare transitions in the barotropic quasi-geostrophic equations

    NASA Astrophysics Data System (ADS)

    Laurie, Jason; Bouchet, Freddy

    2015-01-01

    We investigate the theoretical and numerical computation of rare transitions in simple geophysical turbulent models. We consider the barotropic quasi-geostrophic and two-dimensional Navier-Stokes equations in regimes where bistability between two coexisting large-scale attractors exists. By means of large deviations and instanton theory with the use of an Onsager-Machlup path integral formalism for the transition probability, we show how one can directly compute the most probable transition path between two coexisting attractors analytically in an equilibrium (Langevin) framework and numerically otherwise. We adapt a class of numerical optimization algorithms known as minimum action methods to simple geophysical turbulent models. We show that by numerically minimizing an appropriate action functional in a large deviation limit, one can predict the most likely transition path for a rare transition between two states. By considering examples where theoretical predictions can be made, we show that the minimum action method successfully predicts the most likely transition path. Finally, we discuss the application and extension of such numerical optimization schemes to the computation of rare transitions observed in direct numerical simulations and experiments and to other, more complex, turbulent systems.
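
    The essence of a minimum action method can be conveyed with a hedged one-dimensional sketch: for an overdamped double-well Langevin model (an assumption standing in for the geophysical models above), the discretized Freidlin-Wentzell/Onsager-Machlup action is minimized over paths pinned at the two attractors, and the minimum should approach the barrier height.

    ```python
    # Hedged 1-D sketch of a minimum action method: overdamped Langevin dynamics
    # dx = -V'(x) dt + sqrt(2*eps) dW in the double well V(x) = (x^2 - 1)^2 / 4.
    # The discretized Freidlin-Wentzell action S = 1/4 * sum (xdot + V'(x))^2 dt
    # is minimized over paths pinned at the two attractors x = -1 and x = +1.
    import numpy as np
    from scipy.optimize import minimize

    def Vp(x):                       # V'(x) for the double well
        return x**3 - x

    T, n = 20.0, 100                 # transition window and grid (assumed)
    dt = T / n
    x0, x1 = -1.0, 1.0               # the two coexisting attractors

    def action(interior):
        x = np.concatenate(([x0], interior, [x1]))
        xdot = np.diff(x) / dt
        xmid = 0.5 * (x[1:] + x[:-1])            # midpoint rule
        return 0.25 * np.sum((xdot + Vp(xmid))**2) * dt

    guess = np.linspace(x0, x1, n + 1)[1:-1]
    res = minimize(action, guess, method="L-BFGS-B")
    print(f"minimized action {res.fun:.3f}; T->inf theory: barrier height 0.25")
    ```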

  5. A nanoscale Zr-based fluorescent metal-organic framework for selective and sensitive detection of hydrogen sulfide

    NASA Astrophysics Data System (ADS)

    Li, Yanping; Zhang, Xin; Zhang, Ling; Jiang, Ke; Cui, Yuanjing; Yang, Yu; Qian, Guodong

    2017-11-01

    Hydrogen sulfide (H2S) has been commonly viewed as a gas signaling molecule in various physiological and pathological processes. However, highly efficient H2S detection remains challenging. Herein, we designed a new robust nano metal-organic framework (MOF) UiO-66-CH=CH2 as a fluorescent probe for rapid, sensitive and selective detection of biological H2S. UiO-66-CH=CH2 was prepared by heating ZrCl4 and 2-vinylterephthalic acid via a simple method. UiO-66-CH=CH2 displayed fluorescence quenching to H2S and kept excellent selectivity in the presence of biologically relevant analytes, especially cysteine and glutathione. This MOF-based probe also exhibited fast response (10 s) and high sensitivity with a detection limit of 6.46 μM, which was within the concentration range of biological H2S in living systems. Moreover, this constructed MOF featured water stability, nanoscale size (20-30 nm) and low toxicity, which made it a promising candidate for biological H2S sensing.

  6. Prediction of the low-velocity distribution from the pore structure in simple porous media

    NASA Astrophysics Data System (ADS)

    de Anna, Pietro; Quaife, Bryan; Biros, George; Juanes, Ruben

    2017-12-01

    The macroscopic properties of fluid flow and transport through porous media are a direct consequence of the underlying pore structure. However, precise relations that characterize flow and transport from the statistics of pore-scale disorder have remained elusive. Here we investigate the relationship between pore structure and the resulting fluid flow and asymptotic transport behavior in two-dimensional geometries of nonoverlapping circular posts. We derive an analytical relationship between the pore throat size distribution f_λ ~ λ^(-β) and the distribution of the low fluid velocities f_u ~ u^(-β/2), based on a conceptual model of porelets (the flow established within each pore throat, here a Hagen-Poiseuille flow). Our model allows us to make predictions, within a continuous-time random-walk framework, for the asymptotic statistics of the spreading of fluid particles along their own trajectories. These predictions are confirmed by high-fidelity simulations of Stokes flow and advective transport. The proposed framework can be extended to other configurations which can be represented as a collection of known flow distributions.

  7. Proxy functions for turbulent transport optimization of stellarators

    NASA Astrophysics Data System (ADS)

    Rorvig, Mordechai; Hegna, Chris; Mynick, Harry; Xanthopoulos, Pavlos

    2012-10-01

    The design freedom of toroidal confinement shaping suggests the possibility of optimizing the magnetic geometry for turbulent transport, particularly in stellarators. The framework for implementing such an optimization was recently established [1] using a proxy function as a measure of the ITG induced turbulent transport associated with a given geometry. Working in the framework of local 3-D equilibrium [2], we investigate the theory and implications of such proxy functions by analyzing the linear instability dependence on curvature and local shear, and the associated quasilinear transport estimates. Simple analytic models suggest the beneficial effect of local shear enters through polarization effects, which can be controlled by field torsion in small net current regimes. We test the proxy functions with local, electrostatic gyrokinetics calculations [3] of ITG modes for experimentally motivated local 3-D equilibria. [1] H. E. Mynick, N. Pomphrey, and P. Xanthopoulos, Phys. Rev. Lett. 105, 095004 (2010). [2] C. C. Hegna, Physics of Plasmas 7, 3921 (2000). [3] F. Jenko, W. Dorland, M. Kotschenreuther, and B. N. Rogers, Physics of Plasmas 7, 1904 (2000).

  8. Functional Genomics Assistant (FUGA): a toolbox for the analysis of complex biological networks

    PubMed Central

    2011-01-01

    Background Cellular constituents such as proteins, DNA, and RNA form a complex web of interactions that regulate biochemical homeostasis and determine the dynamic cellular response to external stimuli. It follows that detailed understanding of these patterns is critical for the assessment of fundamental processes in cell biology and pathology. Representation and analysis of cellular constituents through network principles is a promising and popular analytical avenue towards a deeper understanding of molecular mechanisms in a system-wide context. Findings We present Functional Genomics Assistant (FUGA) - an extensible and portable MATLAB toolbox for the inference of biological relationships, graph topology analysis, random network simulation, network clustering, and functional enrichment statistics. In contrast to conventional differential expression analysis of individual genes, FUGA offers a framework for the study of system-wide properties of biological networks and highlights putative molecular targets using concepts of systems biology. Conclusion FUGA offers a simple and customizable framework for network analysis in a variety of systems biology applications. It is freely available for individual or academic use at http://code.google.com/p/fuga. PMID:22035155

  9. Detection of epistatic effects with logic regression and a classical linear regression model.

    PubMed

    Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata

    2014-02-01

    To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes which cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, the Cockerham approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though a larger number of models has to be considered with the logic regression approach (requiring more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase of power to detect such interactions compared to the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
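
    A minimal numerical sketch of the underlying point (synthetic data, not the authors' software): when the phenotype depends on a Boolean combination of loci, a single logic feature explains more variance than the additive terms alone.

    ```python
    # Synthetic illustration (not the authors' software): a phenotype generated by
    # a Boolean AND of two loci is explained far better by one logic feature than
    # by the corresponding additive model.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    g1 = rng.integers(0, 2, n)                    # binary genotype codes (assumed)
    g2 = rng.integers(0, 2, n)
    y = 2.0 * (g1 & g2) + rng.normal(0, 1, n)     # purely epistatic effect

    def r2(predictors, y):
        X = np.column_stack([np.ones(len(y))] + predictors)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return 1 - (y - X @ beta).var() / y.var()

    print(f"additive model R^2 = {r2([g1, g2], y):.3f}")
    print(f"logic feature  R^2 = {r2([g1 & g2], y):.3f}")
    ```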

  10. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, the interest in more efficient strategies to stimulate population health is increasing. A possibly successful strategy is population management (PM). PM strives to address health needs for the population at risk and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The Care Continuum Alliance (CCA) population health guide, which recently changed its name to Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept on the basis of six subsequent interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods were refined, and a set of indicators was operationalized to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, a qualitative part was added to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. A simple analytical aerodynamic model of Langley Winged-Cone Aerospace Plane concept

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.

    1994-01-01

    A simple three DOF analytical aerodynamic model of the Langley Winged-Cone Aerospace Plane concept is presented in a form suitable for simulation, trajectory optimization, and guidance and control studies. The analytical model is especially suitable for methods based on variational calculus. Analytical expressions are presented for lift, drag, and pitching moment coefficients from subsonic to hypersonic Mach numbers and angles of attack up to +/- 20 deg. This analytical model has break points at Mach numbers of 1.0, 1.4, 4.0, and 6.0. Across these Mach number break points, the lift, drag, and pitching moment coefficients are made continuous but their derivatives are not. There are no break points in angle of attack. The effect of control surface deflection is not considered. The present analytical model compares well with the APAS calculations and wind tunnel test data for most angles of attack and Mach numbers.

  12. Pair-Starved Pulsar Magnetospheres

    NASA Technical Reports Server (NTRS)

    Muslimov, Alex G.; Harding, Alice K.

    2009-01-01

    We propose a simple analytic model for the innermost (within the light cylinder of canonical radius, approx. c/Omega) structure of open-magnetic-field lines of a rotating neutron star (NS) with relativistic outflow of charged particles (electrons/positrons) and arbitrary angle between the NS spin and magnetic axes. We present the self-consistent solution of Maxwell's equations for the magnetic field and electric current in the pair-starved regime where the density of electron-positron plasma generated above the pulsar polar cap is not sufficient to completely screen the accelerating electric field and thus establish the E · B = 0 condition above the pair-formation front up to very high altitudes within the light cylinder. The proposed model may provide a theoretical framework for developing a refined model of the global pair-starved pulsar magnetosphere.

  13. Multi-phase outflows as probes of AGN accretion history

    NASA Astrophysics Data System (ADS)

    Nardini, Emanuele; Zubovas, Kastytis

    2018-05-01

    Powerful outflows with a broad range of properties (such as velocity, ionization, radial scale and mass loss rate) represent a key feature of active galactic nuclei (AGN), even more so now that they have been revealed simultaneously in individual objects. Here we revisit in a simple analytical framework the recent remarkable cases of two ultraluminous infrared quasars, IRAS F11119+3257 and Mrk 231, which allow us to investigate the physical connection between multi-phase AGN outflows across the ladder of distance from the central supermassive black hole (SMBH). We argue that any major deviations from the standard outflow propagation models might encode unique information on the past SMBH accretion history, and briefly discuss how this could help address some controversial aspects of the current picture of AGN feedback.

  14. Efficient generation of sum-of-products representations of high-dimensional potential energy surfaces based on multimode expansions

    NASA Astrophysics Data System (ADS)

    Ziegler, Benjamin; Rauhut, Guntram

    2016-03-01

    The transformation of multi-dimensional potential energy surfaces (PESs) from a grid-based multimode representation to an analytical one is a standard procedure in quantum chemical programs. Within the framework of linear least squares fitting, a simple and highly efficient algorithm is presented, which relies on a direct product representation of the PES and a repeated use of Kronecker products. It shows the same scalings in computational cost and memory requirements as the potfit approach. In comparison to customary linear least squares fitting algorithms, this corresponds to a speed-up and memory saving by several orders of magnitude. Different fitting bases are tested, namely, polynomials, B-splines, and distributed Gaussians. Benchmark calculations are provided for the PESs of a set of small molecules.
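
    The computational core, the factorization of a linear least squares problem over a direct product basis, can be sketched in a few lines (a two-mode toy with an assumed polynomial basis; the implementation described above generalizes this to many modes and other bases):

    ```python
    # Two-mode toy of the direct-product least squares idea: because
    # pinv(B2 (x) B1) = pinv(B2) (x) pinv(B1), the fit factorizes and the full
    # Kronecker matrix never needs to be formed. Basis and grid are assumed.
    import numpy as np

    x1, x2 = np.linspace(-1, 1, 30), np.linspace(-1, 1, 40)
    B1 = np.polynomial.polynomial.polyvander(x1, 5)   # 30 x 6 basis, mode 1
    B2 = np.polynomial.polynomial.polyvander(x2, 4)   # 40 x 5 basis, mode 2
    Y = np.exp(-np.add.outer(x1**2, x2**2))           # grid-based "PES" values

    # Factored solve: C = pinv(B1) Y pinv(B2)^T  (cheap, mode by mode)
    C = np.linalg.pinv(B1) @ Y @ np.linalg.pinv(B2).T

    # Reference solve with the explicit Kronecker matrix (what one avoids)
    K = np.kron(B2, B1)                               # 1200 x 30
    c_ref, *_ = np.linalg.lstsq(K, Y.reshape(-1, order="F"), rcond=None)
    print(np.allclose(C.reshape(-1, order="F"), c_ref))   # True
    ```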

  15. Efficient generation of sum-of-products representations of high-dimensional potential energy surfaces based on multimode expansions.

    PubMed

    Ziegler, Benjamin; Rauhut, Guntram

    2016-03-21

    The transformation of multi-dimensional potential energy surfaces (PESs) from a grid-based multimode representation to an analytical one is a standard procedure in quantum chemical programs. Within the framework of linear least squares fitting, a simple and highly efficient algorithm is presented, which relies on a direct product representation of the PES and a repeated use of Kronecker products. It shows the same scalings in computational cost and memory requirements as the potfit approach. In comparison to customary linear least squares fitting algorithms, this corresponds to a speed-up and memory saving by several orders of magnitude. Different fitting bases are tested, namely, polynomials, B-splines, and distributed Gaussians. Benchmark calculations are provided for the PESs of a set of small molecules.

  16. Vectorlike fermions and Higgs effective field theory revisited

    DOE PAGES

    Chen, Chien-Yi; Dawson, S.; Furlan, Elisabetta

    2017-07-10

    Heavy vectorlike quarks (VLQs) appear in many models of beyond the Standard Model physics. Direct experimental searches require these new quarks to be heavy, ≳ 800-1000 GeV. Here, we perform a global fit of the parameters of simple VLQ models in minimal representations of SU(2)_L to precision data and Higgs rates. One interesting connection between anomalous $Zb\bar{b}$ interactions and Higgs physics in VLQ models is discussed. Finally, we present our analysis in an effective field theory (EFT) framework and show that the parameters of VLQ models are already highly constrained. Exact and approximate analytical formulas for the S and T parameters in the VLQ models we consider are available in the Supplemental Material as Mathematica files.

  17. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations

    NASA Astrophysics Data System (ADS)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-01

    Inferential methods can be used to integrate experimental informations and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  18. The Discounted Method and Equivalence of Average Criteria for Risk-Sensitive Markov Decision Processes on Borel Spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cavazos-Cadena, Rolando, E-mail: rcavazos@uaaan.m; Salem-Silva, Francisco, E-mail: frsalem@uv.m

    2010-04-15

    This note concerns discrete-time controlled Markov chains with Borel state and action spaces. Given a nonnegative cost function, the performance of a control policy is measured by the superior limit risk-sensitive average criterion associated with a constant and positive risk sensitivity coefficient. Within such a framework, the discounted approach is used (a) to establish the existence of solutions for the corresponding optimality inequality, and (b) to show that, under mild conditions on the cost function, the optimal value functions corresponding to the superior and inferior limit average criteria coincide on a certain subset of the state space. The approach of the paper relies on standard dynamic programming ideas and on a simple analytical derivation of a Tauberian relation.

  19. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    PubMed

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

    Inferential methods can be used to integrate experimental informations and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  20. Towards an Analytical Framework for Understanding the Development of a Quality Assurance System in an International Joint Programme

    ERIC Educational Resources Information Center

    Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang

    2017-01-01

    This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…

  1. The Framework of Intervention Engine Based on Learning Analytics

    ERIC Educational Resources Information Center

    Sahin, Muhittin; Yurdugül, Halil

    2017-01-01

    Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…

  2. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
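
    A hedged sketch of the procedure on synthetic data (target, limit, and the injected drift are assumptions for illustration): compute monthly medians of patient results and flag months whose bias exceeds an allowable specification.

    ```python
    # Sketch on synthetic data: monthly medians of patient results checked against
    # an allowable-bias specification. Target, limit, and the injected +2% drift
    # from July onward are all assumptions for illustration.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    days = pd.date_range("2024-01-01", periods=365, freq="D")
    drift = np.where(days.month >= 7, 1.02, 1.00)        # calibration drift
    results = pd.DataFrame({
        "date": days.repeat(40),                          # ~40 patient results/day
        "value": rng.lognormal(np.log(5.0), 0.15, 365 * 40) * np.repeat(drift, 40),
    })

    target = 5.0                  # stable-period median (assumed)
    allowable_bias_pct = 1.5      # desirable spec from biological variation (assumed)

    monthly = results.groupby(results["date"].dt.to_period("M"))["value"].median()
    bias_pct = 100 * (monthly - target) / target
    print(pd.DataFrame({"median": monthly.round(3), "bias_%": bias_pct.round(2),
                        "flag": np.abs(bias_pct) > allowable_bias_pct}))
    ```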

  3. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  4. A simple, analytical, axisymmetric microburst model for downdraft estimation

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.

    1991-01-01

    A simple analytical microburst model was developed for use in estimating vertical winds from horizontal wind measurements. It is an axisymmetric, steady state model that uses shaping functions to satisfy the mass continuity equation and simulate boundary layer effects. The model is defined through four model variables: the radius and altitude of the maximum horizontal wind, a shaping function variable, and a scale factor. The model closely agrees with a high fidelity analytical model and measured data, particularly in the radial direction and at lower altitudes. At higher altitudes, the model tends to overestimate the wind magnitude relative to the measured data.
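
    The shaping-function idea can be sketched as follows: derive the wind components from an axisymmetric stream function so that mass continuity holds exactly, with the shaping chosen so the vertical wind vanishes at the ground. The particular functions and symbols below are illustrative assumptions, not Vicroy's calibrated model.

    ```python
    # Illustrative shaping-function construction (not Vicroy's calibrated model):
    # derive (u_r, u_z) from an axisymmetric stream function so that mass
    # continuity is satisfied identically, with u_z = 0 enforced at the ground.
    import sympy as sp

    r, z = sp.symbols("r z", positive=True)
    R0, Z0, W0 = sp.symbols("R_0 Z_0 W_0", positive=True)   # scales and strength

    # Shaping: Gaussian decay in r, boundary-layer damping in z (assumed forms)
    psi = -sp.Rational(1, 2) * W0 * r**2 * sp.exp(-(r / R0)**2) * (1 - sp.exp(-z / Z0))

    u_r = -sp.diff(psi, z) / r          # radial wind
    u_z = sp.diff(psi, r) / r           # vertical wind

    # Continuity: (1/r) d(r u_r)/dr + du_z/dz must vanish identically
    div = sp.simplify(sp.diff(r * u_r, r) / r + sp.diff(u_z, z))
    print("divergence:", div)                                  # -> 0
    print("u_z at ground:", sp.simplify(u_z.subs(z, 0)))       # -> 0
    print("centerline u_z:", sp.simplify(sp.limit(u_z, r, 0))) # downdraft for z > 0
    ```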

  5. Simple analytical model of a thermal diode

    NASA Astrophysics Data System (ADS)

    Kaushik, Saurabh; Kaushik, Sachin; Marathe, Rahul

    2018-05-01

    Recently, much attention has been given to the manipulation of heat by constructing thermal devices such as thermal diodes, transistors and logic gates. Many of the proposed models have an asymmetry which leads to the desired effect, and the presence of non-linear interactions among the particles is also essential. However, such models often lack analytical understanding. Here we propose a simple, analytically solvable model of a thermal diode. Our model consists of classical spins in contact with multiple heat baths and constant external magnetic fields. Interestingly, the magnetic field is the only parameter required to get the effect of heat rectification.
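
    To make the notion of rectification concrete, here is a textbook-style toy, plainly a different mechanism than the spin model above: two segments in series, one with a strongly temperature-dependent conductance, carry different heat fluxes when the thermal bias is reversed.

    ```python
    # Textbook-style rectifier toy (a different mechanism than the spin model
    # above, shown only to make "thermal diode" concrete): two segments in
    # series, one with strongly temperature-dependent conductance.
    from scipy.optimize import brentq

    def kA(T):  return (T / 100.0)**3     # strongly T-dependent segment (assumed)
    def kB(T):  return 1.0                # ohmic segment (assumed)

    def flux(T_left, T_right):
        """Steady flux J = kA*(T_left - Tm) = kB*(Tm - T_right) through A|B."""
        def residual(Tm):
            return (kA(0.5 * (T_left + Tm)) * (T_left - Tm)
                    - kB(0.5 * (Tm + T_right)) * (Tm - T_right))
        Tm = brentq(residual, min(T_left, T_right), max(T_left, T_right))
        return kB(0.5 * (Tm + T_right)) * (Tm - T_right)

    J_fwd = flux(150.0, 50.0)     # hot bath on the nonlinear side
    J_rev = flux(50.0, 150.0)     # bias reversed
    print(f"J_fwd = {J_fwd:.1f}, J_rev = {J_rev:.1f}, "
          f"rectification = {abs(J_fwd / J_rev):.2f}")
    ```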

  6. Assessment of Children's Psychological Development and Data Analytic Framework in New York City Infant Day Care Study.

    ERIC Educational Resources Information Center

    Golden, Mark

    This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…

  7. Value of Flexibility - Phase 1

    DTIC Science & Technology

    2010-09-25

    weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to...research activities is in developing a coherent value based definition of flexibility that is based on an analytical framework that is mathematically

  8. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453

  9. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  10. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youn, H; Jeon, H; Nam, J

    Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distribution owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. In sequence, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective monitor to estimate a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that a patient's over-exposure during IGRT might be prevented by our framework.
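
    The primary-dose step can be sketched on a toy grid (parallel-beam simplification and assumed attenuation values; the scatter step of the framework is omitted): march each ray through the voxels, attenuate the fluence with Beer's law, and score the locally absorbed fraction.

    ```python
    # Toy primary-dose march (parallel-beam simplification, assumed mu values;
    # the scatter step of the framework is omitted here): attenuate each ray
    # with Beer's law and score the locally absorbed fluence.
    import numpy as np

    nz, nx, dx = 64, 64, 0.5                  # grid and voxel size [cm] (assumed)
    mu = np.full((nz, nx), 0.02)              # background attenuation [1/cm]
    mu[24:40, 28:36] = 0.2                    # "bone-like" insert (assumed)

    dose = np.zeros_like(mu)
    for ix in range(nx):
        phi = 1.0                             # incident fluence per ray (assumed)
        for iz in range(nz):                  # march the ray down one column
            absorbed = phi * (1.0 - np.exp(-mu[iz, ix] * dx))   # Beer's law
            dose[iz, ix] = absorbed           # deposited energy ~ absorbed fluence
            phi -= absorbed
    print(f"transmission through the insert column: "
          f"{np.exp(-mu[:, 30].sum() * dx):.3f}")
    ```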

  11. Estimating Aquifer Properties Using Sinusoidal Pumping Tests

    NASA Astrophysics Data System (ADS)

    Rasmussen, T. C.; Haborak, K. G.; Young, M. H.

    2001-12-01

    We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
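
    A sketch of the kind of closed-form solution involved (the steady-periodic, confined-aquifer analog of the Theis solution; parameter values are assumed, not those of the field site): the drawdown amplitude and phase lag at radius r follow from the Kelvin functions ker and kei.

    ```python
    # Steady-periodic confined-aquifer response to Q(t) = Q0*cos(w*t) at radius r
    # (the sinusoidal analog of the Theis problem): amplitude and phase lag follow
    # from the Kelvin functions ker/kei. Parameters are assumed, not site values.
    import numpy as np
    from scipy.special import ker, kei

    T_aq = 5.0e-4      # transmissivity [m^2/s] (assumed)
    S = 2.0e-4         # storativity [-] (assumed)
    Q0 = 1.0e-3        # pumping amplitude [m^3/s] (assumed)
    w = 2 * np.pi / 3600.0                       # one-hour stimulus period

    for r in (5.0, 10.0, 25.0, 50.0):            # observation radii [m]
        x = r * np.sqrt(w * S / T_aq)
        amp = Q0 / (2 * np.pi * T_aq) * np.hypot(ker(x), kei(x))
        lag = -np.arctan2(kei(x), ker(x)) / w    # time lag [s]
        print(f"r = {r:5.1f} m: amplitude = {amp:.3f} m, lag = {lag:6.1f} s")
    ```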

  12. Analytical method of waste allocation in waste management systems: Concept, method and case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    Waste is no longer a rejected item to dispose of but increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process” defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP used to develop the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.

  13. A novel metal-organic framework composite MIL-101(Cr)@GO as an efficient sorbent in dispersive micro-solid phase extraction coupling with UHPLC-MS/MS for the determination of sulfonamides in milk samples.

    PubMed

    Jia, Xiuna; Zhao, Pan; Ye, Xiu; Zhang, Lianjun; Wang, Ting; Chen, Qinyu; Hou, Xiaohong

    2017-07-01

    As a novel material, metal-organic framework/graphite oxide (MIL-101(Cr)@GO) has great potential for the pretreatment of trace analytes. In the present study, MIL-101(Cr)@GO was synthesized using a solvothermal synthesis method at the nanoscale and was applied as a sorbent in dispersive micro-solid phase extraction (DMSPE) for the enrichment of trace sulfonamides (SAs) from milk samples for the first time. Several experimental parameters including kinds of sorbents, the effect of pH, the amount of MIL-101(Cr)@GO, ionic strength, adsorption time, desorption solvent and desorption time were investigated. Under the optimal conditions, the linear ranges were 0.1-10 μg/L, 0.2-20 μg/L or 0.5-50 μg/L for the analytes, with regression coefficients (r) from 0.9942 to 0.9999. The limits of detection were between 0.012 and 0.145 μg/L. The recoveries ranged from 79.83% to 103.8% with relative standard deviations (RSDs) < 10% (n = 3). MIL-101(Cr)@GO exhibited remarkable advantages compared to MIL-101(Cr), MIL-100(Fe), activated carbon and other sorbent materials used in pretreatment methods. A simple, rapid, sensitive, inexpensive and less solvent consuming method of DMSPE-ultra-high performance liquid chromatography-tandem mass spectrometry (DMSPE-UHPLC-MS/MS) was successfully applied to the pre-concentration and determination of twelve SAs in milk samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.

  15. A Structural Framework for a Near-Minimal Form of Life: Mass and Compositional Analysis of the Helical Mollicute Spiroplasma melliferum BC3

    PubMed Central

    Trachtenberg, Shlomo; Schuck, Peter; Phillips, Terry M.; Andrews, S. Brian; Leapman, Richard D.

    2014-01-01

    Spiroplasma melliferum is a wall-less bacterium with dynamic helical geometry. This organism is geometrically well defined and internally well ordered, and has an exceedingly small genome. Individual cells are chemotactic, polar, and swim actively. Their dynamic helicity can be traced at the molecular level to a highly ordered linear motor (composed essentially of the proteins fib and MreB) that is positioned on a defined helical line along the internal face of the cell’s membrane. Using an array of complementary, informationally overlapping approaches, we have taken advantage of this uniquely simple, near-minimal life-form and its helical geometry to analyze the copy numbers of Spiroplasma’s essential parts, as well as to elucidate how these components are spatially organized to subserve the whole living cell. Scanning transmission electron microscopy (STEM) was used to measure the mass-per-length and mass-per-area of whole cells, membrane fractions, intact cytoskeletons and cytoskeletal components. These local data were fit into whole-cell geometric parameters determined by a variety of light microscopy modalities. Hydrodynamic data obtained by analytical ultracentrifugation allowed computation of the hydration state of whole living cells, for which the relative amounts of protein, lipid, carbohydrate, DNA, and RNA were also estimated analytically. Finally, ribosome and RNA content, genome size and gene expression were also estimated (using stereology, spectroscopy and 2D-gel analysis, respectively). Taken together, the results provide a general framework for a minimal inventory and arrangement of the major cellular components needed to support life. PMID:24586297

  16. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  17. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  18. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  19. Horizontal lifelines - review of regulations and simple design method considering anchorage rigidity.

    PubMed

    Galy, Bertrand; Lan, André

    2018-03-01

    Among the many occupational risks construction workers encounter every day, falling from a height is the most dangerous. The objective of this article is to propose a simple analytical design method for horizontal lifelines (HLLs) that considers anchorage flexibility. The article presents a short review of the standards and regulations/acts/codes concerning HLLs in Canada, the USA and Europe. A static analytical approach is proposed that considers anchorage flexibility. The analytical results are compared with a series of 42 dynamic fall tests and a SAP2000 numerical model. The experimental results show that the analytical method is slightly conservative and overestimates the line tension in most cases, with a maximum of 17%. The static SAP2000 results show a maximum 2.1% difference from the analytical method. The analytical method is accurate enough to safely design HLLs, and quick design abaci are provided to allow the engineer to make quick on-site verifications if needed.
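
    A toy static model in the article's spirit (two straight cable segments, a midspan arrest load, and linear anchor springs; all numbers and the compatibility model are illustrative assumptions, not the article's method) shows the main effect, that softer anchorages reduce line tension:

    ```python
    # Toy static model in the article's spirit (two straight cable segments,
    # midspan arrest load, linear anchor springs; numbers and the compatibility
    # model are illustrative assumptions, not the article's method).
    import numpy as np
    from scipy.optimize import brentq

    L, P, EA = 12.0, 6.0e3, 1.0e6      # span [m], load [N], cable EA [N] (assumed)

    def tension(k):                    # k: anchor stiffness [N/m]
        def residual(d):               # d: midspan sag [m]
            u = P * L / (2 * (2 * d * k + P))   # inward anchor motion (H = k*u)
            a = L / 2 - u                       # deformed half-span
            T = P * np.hypot(a, d) / (2 * d)    # equilibrium at the load point
            return 2 * np.hypot(a, d) - L * (1 + T / EA)   # length compatibility
        d = brentq(residual, 1e-6, L)
        u = P * L / (2 * (2 * d * k + P))
        return P * np.hypot(L / 2 - u, d) / (2 * d)

    for k in (1e12, 1e6, 1e5):         # rigid -> increasingly flexible anchors
        print(f"anchor stiffness {k:8.0e} N/m -> tension {tension(k) / 1e3:5.1f} kN")
    ```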

  20. A hybrid finite element-transfer matrix model for vibroacoustic systems with flat and homogeneous acoustic treatments.

    PubMed

    Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck

    2015-02-01

    Practical vibroacoustic systems involve passive acoustic treatments consisting of highly dissipative media such as poroelastic materials. The numerical modeling of such systems at low to mid frequencies typically relies on substructuring methodologies based on finite element models. Namely, the master subsystems (i.e., structural and acoustic domains) are described by a finite set of uncoupled modes, whereas condensation procedures are typically preferred for the acoustic treatments. However, although accurate, such methodology is computationally expensive when real life applications are considered. A potential reduction of the computational burden could be obtained by approximating the effect of the acoustic treatment on the master subsystems without introducing physical degrees of freedom. To do that, the treatment has to be assumed homogeneous, flat, and of infinite lateral extent. Under these hypotheses, simple analytical tools like the transfer matrix method can be employed. In this paper, a hybrid finite element-transfer matrix methodology is proposed. The impact of the limiting assumptions inherent within the analytical framework are assessed for the case of plate-cavity systems involving flat and homogeneous acoustic treatments. The results prove that the hybrid model can capture the qualitative behavior of the vibroacoustic system while reducing the computational effort.

  1. Modal expansions in periodic photonic systems with material loss and dispersion

    NASA Astrophysics Data System (ADS)

    Wolff, Christian; Busch, Kurt; Mortensen, N. Asger

    2018-03-01

    We study band-structure properties of periodic optical systems composed of lossy and intrinsically dispersive materials. To this end, we develop an analytical framework based on adjoint modes of a lossy periodic electromagnetic system and show how the problem of linearly dependent eigenmodes in the presence of material dispersion can be overcome. We then formulate expressions for the band-structure derivative ∂ω/∂k (complex group velocity) and the local and total density of transverse optical states. Our exact expressions hold for 3D periodic arrays of materials with arbitrary dispersion properties and in general need to be evaluated numerically. They can be generalized to systems with two, one, or no directions of periodicity provided the fields are localized along nonperiodic directions. Possible applications are photonic crystals, metamaterials, metasurfaces composed of highly dispersive materials such as metals or lossless photonic crystals, and metamaterials or metasurfaces strongly coupled to resonant perturbations such as quantum dots or excitons in 2D materials. For illustration purposes, we analytically evaluate our expressions for some simple systems consisting of lossless dielectrics with one sharp Lorentzian material resonance added. By combining several Lorentz poles, this provides an avenue to perturbatively treat quite general material loss bands in photonic crystals.

  2. Analytical model of a corona discharge from a conical electrode under saturation

    NASA Astrophysics Data System (ADS)

    Boltachev, G. Sh.; Zubarev, N. M.

    2012-11-01

    Exact partial solutions are found for the electric field distribution in the outer region of a stationary unipolar corona discharge from an ideal conical needle in the space-charge-limited current mode with allowance for the electric field dependence of the ion mobility. It is assumed that only the very tip of the cone is responsible for the discharge, i.e., that the ionization zone is a point. The solutions are obtained by joining the spherically symmetric potential distribution in the drift space and the self-similar potential distribution in the space-charge-free region. Such solutions are outside the framework of the conventional Deutsch approximation, according to which the space charge insignificantly influences the shape of equipotential surfaces and electric lines of force. The dependence is derived of the corona discharge saturation current on the apex angle of the conical electrode and applied potential difference. A simple analytical model is suggested that describes drift in the point-plane electrode geometry under saturation as a superposition of two exact solutions for the field potential. In terms of this model, the angular distribution of the current density over the massive plane electrode is derived, which agrees well with Warburg's empirical law.

  3. Ethics and Justice in Learning Analytics

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2017-01-01

    The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…

  4. Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives

    ERIC Educational Resources Information Center

    Serafini, Frank

    2010-01-01

    This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…

  5. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with less sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  6. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, with less sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we developed a novel iterative reconstruction approach (DIRECT: direct image reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias versus variance performance to iterative TOF reconstruction with a matched resolution model.
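
    The confidence-weighting idea can be sketched in one dimension (assumed timing resolution and event count): each event contributes to the image along its line of response with a Gaussian TOF kernel rather than uniformly.

    ```python
    # One-dimensional sketch of TOF confidence weighting (assumed timing
    # resolution and counts): each event is backprojected along its line of
    # response with a Gaussian TOF kernel instead of a uniform weight.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.linspace(-200.0, 200.0, 401)   # positions along one LOR [mm]
    sigma = 9.6                           # ~150 ps FWHM timing -> ~9.6 mm (assumed)
    true_pos = 40.0                       # annihilation point [mm] (assumed)

    events = rng.normal(true_pos, sigma, 500)       # TOF position estimates
    bp_tof = sum(np.exp(-0.5 * ((x - e) / sigma)**2) for e in events)

    print(f"TOF-weighted backprojection peaks at x = {x[np.argmax(bp_tof)]:.0f} mm "
          f"(truth {true_pos:.0f} mm); a non-TOF backprojection spreads the same "
          f"counts uniformly along the LOR")
    ```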

  7. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROJECTION - PROJECT SUMMARY

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  8. iSeq: Web-Based RNA-seq Data Analysis and Visualization.

    PubMed

    Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng

    2018-01-01

    Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amount of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .

  9. Analytical determination of space station response to crew motion and design of suspension system for microgravity experiments

    NASA Technical Reports Server (NTRS)

    Liu, F. C.

    1986-01-01

    The objective of this investigation is to determine analytically the acceleration produced by crew motion in an orbiting space station and to define design parameters for the suspension system of microgravity experiments. A simple structural model for simulation of the IOC space station is proposed. Mathematical formulation of this model provides engineers with a simple and direct tool for designing an effective suspension system.

  10. An Analytical State Transition Matrix for Orbits Perturbed by an Oblate Spheroid

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical state transition matrix and its inverse, which include the short period and secular effects of the second zonal harmonic, were developed from the nonsingular PS satellite theory. The fact that the independent variable in the PS theory is not time is in no respect disadvantageous, since any explicit analytical solution must be expressed in the true or eccentric anomaly. This is shown to be the case for the simple conic matrix. The PS theory allows for a concise, accurate, and algorithmically simple state transition matrix. The improvement over the conic matrix ranges from 2 to 4 digits accuracy.

  11. The Strategic Management of Accountability in Nonprofit Organizations: An Analytical Framework.

    ERIC Educational Resources Information Center

    Kearns, Kevin P.

    1994-01-01

    Offers a framework stressing the strategic and tactical choices facing nonprofit organizations and discusses policy and management implications. Claims framework is a useful tool for conducting accountability audits and conceptual foundation for discussions of public policy. (Author/JOW)

  12. Curriculum Innovation for Marketing Analytics

    ERIC Educational Resources Information Center

    Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.

    2018-01-01

    College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…

  13. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.

  14. The RISE Framework: Using Learning Analytics to Automatically Identify Open Educational Resources for Continuous Improvement

    ERIC Educational Resources Information Center

    Bodily, Robert; Nyland, Rob; Wiley, David

    2017-01-01

    The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…
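
    The abstract is truncated, but a RISE-style triage can be pictured as placing each resource on a usage-versus-outcome grid and flagging one quadrant for improvement. The sketch below is a loose illustration of that idea; the quadrant rule, thresholds, and field names are all assumptions rather than the published algorithm.

        # Hedged sketch of a RISE-style triage: flag the "high use, low
        # outcome" quadrant for improvement. All data and rules are invented.
        resources = [
            {"id": "oer-01", "use": 0.9, "outcome": 0.35},
            {"id": "oer-02", "use": 0.2, "outcome": 0.80},
            {"id": "oer-03", "use": 0.8, "outcome": 0.90},
        ]
        use_med = sorted(r["use"] for r in resources)[len(resources) // 2]
        out_med = sorted(r["outcome"] for r in resources)[len(resources) // 2]

        for r in resources:
            if r["use"] >= use_med and r["outcome"] < out_med:
                print(r["id"], "-> candidate for improvement")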

  15. A Demonstration of Concrete Structural Health Monitoring Framework for Degradation due to Alkali-Silica Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Neal, Kyle

Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant that are subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. This report focuses on degradation caused by ASR (alkali-silica reaction). Controlled specimens were prepared to develop accelerated ASR degradation. Several monitoring techniques (thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM)) were used to detect the damage caused by ASR. Heterogeneous data from the multiple techniques were used for damage diagnosis and prognosis, and for quantification of the associated uncertainty using a Bayesian network approach. Additionally, the MapReduce technique was demonstrated with synthetic data. This technique can be used in the future to handle large amounts of observation data obtained from the online monitoring of realistic structures.
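
    The MapReduce demonstration mentioned above is not described in detail; the following single-process sketch merely illustrates the pattern (map readings to key/value pairs, group by key, reduce each group) with invented sensor names and values, and is not the project's code.

        # Generic single-process illustration of the MapReduce pattern
        # applied to synthetic monitoring data.
        from collections import defaultdict

        readings = [("DIC", 0.12), ("NIRAS", 0.40), ("DIC", 0.18), ("VAM", 0.25)]

        # Map phase: emit key/value pairs.
        mapped = [(sensor, value) for sensor, value in readings]

        # Shuffle phase: group values by key.
        groups = defaultdict(list)
        for key, value in mapped:
            groups[key].append(value)

        # Reduce phase: summarize each group (here, the mean).
        summary = {k: sum(v) / len(v) for k, v in groups.items()}
        print(summary)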

  16. On the representability problem and the physical meaning of coarse-grained models

    NASA Astrophysics Data System (ADS)

    Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.; Voth, Gregory A.

    2016-07-01

In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on CG coordinates. Often, in these cases, it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and underlying FG models. Here we present and investigate these representability challenges and illustrate them via the bottom-up conceptual framework for several simple analytically tractable polymer models. The examples give special focus to the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, and also discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons also apply to top-down CG modeling, with crucial implications for simulation at constant pressure and surface tension and for the interpretation of structural and thermodynamic correlations for comparison to experiment.
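
    A minimal example of the FG-to-CG correspondence discussed above is a linear center-of-mass mapping from atoms to beads; the sketch below implements that mapping with invented masses and groupings.

        # Sketch of an FG-to-CG mapping: each CG bead is the mass-weighted
        # mean of its FG atoms. Groups and coordinates are invented.
        import numpy as np

        fg_xyz = np.random.rand(6, 3)                 # six FG atom positions
        masses = np.array([12.0, 1.0, 1.0, 12.0, 1.0, 1.0])
        groups = [[0, 1, 2], [3, 4, 5]]               # FG atoms per CG bead

        def cg_map(xyz, m, groups):
            """Center-of-mass position of each CG bead."""
            return np.array([np.average(xyz[g], axis=0, weights=m[g])
                             for g in groups])

        print(cg_map(fg_xyz, masses, groups))         # two CG bead positions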

  17. Quantifying risks with exact analytical solutions of derivative pricing distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing for the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing characterized by the distribution tail and their associations to value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
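
    For context, the closed-form Vasicek zero-coupon bond price, the textbook benchmark that any such pricing framework must reproduce on average, can be computed in a few lines; the parameter values here are illustrative.

        # Closed-form Vasicek zero-coupon bond price for the short-rate model
        # dr = kappa*(theta - r) dt + sigma dW. Parameter values are toy.
        import numpy as np

        def vasicek_bond(r0, kappa, theta, sigma, tau):
            """P(0, tau): price of a bond paying 1 at maturity tau."""
            B = (1.0 - np.exp(-kappa * tau)) / kappa
            lnA = (theta - sigma**2 / (2.0 * kappa**2)) * (B - tau) \
                  - sigma**2 * B**2 / (4.0 * kappa)
            return np.exp(lnA - B * r0)

        print(vasicek_bond(r0=0.03, kappa=0.5, theta=0.04, sigma=0.01, tau=5.0))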

  18. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...
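
    The superposition idea behind the analytic element method can be sketched in a few lines: closed-form well solutions are added to a uniform background flow to give heads anywhere in the aquifer. The values below are invented, and this is in no way WhAEM code.

        # Toy analytic element superposition: uniform flow plus Thiem wells.
        # All aquifer parameters and well data are invented for illustration.
        import numpy as np

        T = 100.0                       # transmissivity (m^2/d)
        R = 1000.0                      # radius of influence (m)
        wells = [(0.0, 0.0, 500.0)]     # (x, y, pumping rate Q in m^3/d)

        def head(x, y, h0=50.0, gradient=0.001):
            h = h0 - gradient * x                      # uniform flow element
            for xw, yw, Q in wells:                    # superposed well elements
                r = np.hypot(x - xw, y - yw)
                h -= Q / (2.0 * np.pi * T) * np.log(R / max(r, 0.1))
            return h

        print(head(100.0, 50.0))        # head at an observation point (m)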

  19. Theory of precipitation effects on dead cylindrical fuels

    Treesearch

    Michael A. Fosberg

    1972-01-01

    Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...
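
    The analytical piece of such an analysis is typically Crank's series solution for diffusion into an infinite cylinder held at constant surface concentration; a hedged numerical transcription follows, with purely illustrative parameters.

        # Crank's series solution for Fickian diffusion into an infinite
        # cylinder at constant surface concentration (radius scaled to 1).
        # This is a generic textbook result, not the paper's fuel model.
        import numpy as np
        from scipy.special import j0, j1, jn_zeros

        def cylinder_uptake(r, tau, n_terms=50):
            """C/Cs at scaled radius r (0..1) and Fourier time tau = D*t/a^2."""
            alpha = jn_zeros(0, n_terms)             # roots of J0
            terms = (2.0 / (alpha * j1(alpha))) * j0(alpha * r) \
                    * np.exp(-alpha**2 * tau)
            return 1.0 - np.sum(terms)

        print(cylinder_uptake(r=0.0, tau=0.1))       # centerline moisture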

  20. Consistent Yokoya-Chen Approximation to Beamstrahlung(LCC-0010)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peskin, M

    2004-04-22

I reconsider the Yokoya-Chen approximate evolution equation for beamstrahlung and modify it slightly to generate simple, consistent analytical approximations for the electron and photon energy spectra. I compare these approximations to previous ones, and to simulation data.

  1. A coordination theory for intelligent machines

    NASA Technical Reports Server (NTRS)

    Wang, Fei-Yue; Saridis, George N.

    1990-01-01

A formal model for the coordination level of intelligent machines is established. The framework of the coordination level investigated consists of one dispatcher and a number of coordinators. The model, called the coordination structure, is used to describe analytically the information structure and information flow for the coordination activities in the coordination level. Specifically, the coordination structure offers a formalism to (1) describe the task translation of the dispatcher and coordinators; (2) represent the individual process within the dispatcher and coordinators; (3) specify the cooperation and connection among the dispatcher and coordinators; (4) perform the process analysis and evaluation; and (5) provide a control and communication mechanism for the real-time monitoring or simulation of the coordination process. A simple procedure for task scheduling in the coordination structure is presented. The task translation is achieved by a stochastic learning algorithm. The learning process is measured with entropy and its convergence is guaranteed. Finally, a case study of the coordination structure with three coordinators and one dispatcher for a simple intelligent manipulator system illustrates the proposed model, and simulation of the task processes performed on the model verifies the soundness of the theory.
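
    As a hedged illustration of the stochastic-learning-with-entropy idea, the sketch below uses a standard linear reward-inaction automaton with entropy as the convergence measure; this is a common stand-in, not necessarily the authors' exact algorithm.

        # Linear reward-inaction learning over three actions, with entropy
        # tracking convergence. Rates and rewards are invented.
        import numpy as np

        p = np.ones(3) / 3.0                 # action probabilities
        rng = np.random.default_rng(0)

        def entropy(p):
            return -np.sum(p * np.log(p))

        for step in range(200):
            a = rng.choice(3, p=p)
            reward = rng.random() < (0.9 if a == 1 else 0.3)  # action 1 is best
            if reward:                        # reward-inaction update
                lam = 0.05
                p = p - lam * p               # shrink all probabilities...
                p[a] += lam                   # ...then boost the rewarded action
        print(p, entropy(p))                  # entropy falls as p concentrates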

  2. Two additional principles for determining which species to monitor.

    PubMed

    Wilson, Howard B; Rhodes, Jonathan R; Possingham, Hugh P

    2015-11-01

    Monitoring to detect population declines is widespread, but also costly. There is, consequently, a need to optimize monitoring to maximize cost-effectiveness. Here we develop a quantitative decision analysis framework for how to optimally allocate resources for monitoring among species. By keeping the framework simple, we analytically establish two new principles about which species are optimal to monitor for detecting declines: (1) those that lie on the boundary between species being allocated resources for conservation action and species that are not and (2) those with the greatest uncertainty in whether they are declining. These two principles are in addition to other factors that are also important in monitoring decisions, such as complementarity. We demonstrate the efficacy of these principles when other factors are not present, and show how the two principles can be combined. This analysis demonstrates that the most cost-effective species to monitor are ones where the information gained from monitoring is most likely to change the allocation of funds for action, not necessarily the most vulnerable or endangered. We suggest these results are general and apply to all ecological monitoring, not just of biological species: monitoring and information are only valuable when they are likely to change how people act.

  3. Post-transcriptional bursting in genes regulated by small RNA molecules

    NASA Astrophysics Data System (ADS)

    Rodrigo, Guillermo

    2018-03-01

Gene expression programs in living cells are highly dynamic due to spatiotemporal molecular signaling and inherent biochemical stochasticity. Here we study a mechanism based on molecule-to-molecule variability at the RNA level for the generation of bursts of protein production, which can lead to heterogeneity in a cell population. We develop a mathematical framework to show numerically and analytically that genes regulated post-transcriptionally by small RNA molecules can exhibit such bursts due to different states of translation activity (on or off), mostly revealed in a regime of few molecules. We exploit this framework to compare transcriptional and post-transcriptional bursting and also to illustrate how to tune the resulting protein distribution with additional post-transcriptional regulations. Moreover, because RNA-RNA interactions are predictable with an energy model, we define the kinetic constants of on-off switching as functions of the two characteristic free-energy differences of the system, activation and formation, with a nonequilibrium scheme. Overall, post-transcriptional bursting represents a distinctive principle linking gene regulation to gene expression noise, which highlights the importance of the RNA layer beyond the simple information transfer paradigm and significantly contributes to the understanding of the intracellular processes from a first-principles perspective.
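
    A minimal Gillespie-type simulation of the on/off translation activity described above illustrates how bursty protein output arises; the rate constants below are invented, not values from the paper.

        # Minimal Gillespie sketch of on/off translation bursting: protein
        # is produced only in the "on" state. Rate constants are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        k_on, k_off, k_tl, k_deg = 0.1, 0.5, 5.0, 0.05
        state, protein, t = 0, 0, 0.0         # start "off" with no protein

        while t < 500.0:
            rates = np.array([k_on if state == 0 else k_off,   # switch state
                              k_tl if state == 1 else 0.0,     # translate
                              k_deg * protein])                # degrade
            total = rates.sum()
            t += rng.exponential(1.0 / total)
            r = rng.choice(3, p=rates / total)
            if r == 0:
                state = 1 - state
            elif r == 1:
                protein += 1
            else:
                protein -= 1
        print(protein)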

  4. Fractal Folding and Medium Viscoelasticity Contribute Jointly to Chromosome Dynamics

    NASA Astrophysics Data System (ADS)

    Polovnikov, K. E.; Gherardi, M.; Cosentino-Lagomarsino, M.; Tamm, M. V.

    2018-02-01

Chromosomes are key players in cell physiology, and their dynamics provide valuable information about their physical organization. In both prokaryotes and eukaryotes, the short-time motion of chromosomal loci has been described with a Rouse model in a simple or viscoelastic medium. However, little emphasis has been put on the influence of the folded organization of chromosomes on the local dynamics. Clearly, stress propagation, and thus dynamics, must be affected by such organization, but a theory allowing us to extract such information from data, e.g., on two-point correlations, is lacking. Here, we describe a theoretical framework able to answer this general polymer dynamics question. We provide a scaling analysis of the stress-propagation time between two loci at a given arclength distance along the chromosomal coordinate. The results suggest a precise way to assess folding information from the dynamical coupling of chromosome segments. Additionally, we realize this framework in a specific model of a polymer whose long-range interactions are designed to make it fold in a fractal way, immersed in a medium characterized by subdiffusive fractional Langevin motion with a tunable scaling exponent. This allows us to derive explicit analytical expressions for the correlation functions.

  5. Global optimum vegetation rain water use is determined by aridity

    NASA Astrophysics Data System (ADS)

    Good, S. P.; Wang, L.; Caylor, K. K.

    2015-12-01

The amount of rainwater that vegetation is able to transpire directly determines the total productivity of ecosystems, yet broad-scale trends in this sub-component of total evapotranspiration remain unclear. Since its development in the 1970s, the Budyko framework has provided a simple, first-order approach to partitioning total rainfall into runoff and evapotranspiration across climates. However, this classic paradigm provides little insight into the strength of biological mediation (i.e., the transpiration flux) of the hydrologic cycle. Through a minimalist stochastic hydrology model, we analytically extend the classical Budyko framework to predict the magnitude of transpiration relative to total rainfall as a function of ecosystem aridity. Consistent with a synthesis of experimental partitioning studies across climates, this model suggests a peak in the biological contribution to the hydrologic cycle at intermediate moisture values, with both arid and wet climates seeing decreased transpiration:precipitation ratios. Matching observed transpiration:precipitation ratios requires incorporating elevated evaporation at lower canopy cover, due to greater energy availability at the soil surface, and elevated evaporation at higher canopy cover, due to greater interception amounts. This new approach provides a connection between current and future climate regimes, hydrologic flux partitioning, and macro-system ecology.
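
    For reference, the classical Budyko curve that the study extends is a one-line closed form for the evaporative fraction as a function of the aridity index.

        # The Budyko (1974) curve: evaporative fraction E/P as a function of
        # the aridity index phi = PET/P. A standard reference form, not the
        # paper's extended partitioning model.
        import numpy as np

        def budyko(aridity):
            phi = np.asarray(aridity, dtype=float)
            return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

        print(budyko([0.5, 1.0, 2.0, 5.0]))   # E/P rises toward 1 as phi grows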

  6. Toward an Analytic Framework of Interdisciplinary Reasoning and Communication (IRC) Processes in Science

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Sung, Shannon; Zhang, Dongmei

    2015-11-01

Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes (integration, translation, transfer, and transformation) and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.

  7. Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data

    NASA Astrophysics Data System (ADS)

    Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.

    2017-10-01

We present a new open-source framework for the storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. This framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit these rates to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding VSim (particle-in-cell simulation code) and USim (unstructured multi-fluid code) input blocks with appropriate cross-sections.
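
    One calculation such a framework automates is folding a cross section with a Maxwellian distribution to obtain a rate coefficient, k(T) = sqrt(8/(pi*m)) (kT)^(-3/2) Int sigma(E) E exp(-E/kT) dE; the sketch below evaluates this for a toy step-function cross section and is not MUNCHKIN code.

        # Sketch (not MUNCHKIN code): Maxwellian-averaged rate coefficient
        # for a toy electron-impact cross section with a 10 eV threshold.
        import numpy as np

        ME = 9.109e-31                     # electron mass (kg)
        QE = 1.602e-19                     # J per eV

        def rate_coefficient(sigma, T_eV, emax=100.0, n=20000):
            E = np.linspace(1e-6, emax, n)                    # energy grid (eV)
            integrand = sigma(E) * E * np.exp(-E / T_eV)      # m^2 * eV^2 density
            integral = integrand.sum() * (E[1] - E[0]) * QE**2  # m^2 * J^2
            return np.sqrt(8.0 / (np.pi * ME)) * (T_eV * QE) ** -1.5 * integral

        toy_sigma = lambda E: 1e-20 * (E > 10.0)              # step threshold, m^2
        print(rate_coefficient(toy_sigma, T_eV=5.0), "m^3/s")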

  8. Electrocardiographic interpretation skills of cardiology residents: are they competent?

    PubMed

    Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C

    2014-12-01

    Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  9. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    PubMed

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  10. The NIH analytical methods and reference materials program for dietary supplements.

    PubMed

    Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M

    2007-09-01

    Quality of botanical products is a great uncertainty that consumers, clinicians, regulators, and researchers face. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates validation of analytical methods, analytical standards, and reference materials.

  11. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    PubMed

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ∼1000-times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to the LFA analytical performance through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  12. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  13. Principal polynomial analysis.

    PubMed

    Laparra, Valero; Jiménez, Sandra; Tuia, Devis; Camps-Valls, Gustau; Malo, Jesus

    2014-11-01

This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves, instead of straight lines. Contrary to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA shows a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained in closed form. Invertibility is an important advantage over other learning methods, because it permits one to understand the identified features in the input domain, where the data have physical meaning. Moreover, it allows one to evaluate the performance of dimensionality reduction in sensible (input-domain) units. Volume preservation also allows an easy computation of information-theoretic quantities, such as the reduction in multi-information after the transform. Third, the analytical nature of PPA leads to a clear geometrical interpretation of the manifold: it allows the computation of Frenet-Serret frames (local features) and of generalized curvatures at any point of the space. And fourth, the analytical Jacobian allows the computation of the metric induced by the data, thus generalizing the Mahalanobis distance. These properties are demonstrated theoretically and illustrated experimentally. The performance of PPA is evaluated in dimensionality and redundancy reduction, on both synthetic and real datasets from the UCI repository.
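
    A schematic of the core PPA step (modeling what plain PCA would discard as a polynomial function of the leading projection) can be written compactly; the sketch below is a simplified illustration, not the authors' full algorithm.

        # Simplified PPA-style step: fit the residual left by the first
        # principal direction as a polynomial in the projection score.
        # Data are synthetic; this is not the published algorithm in full.
        import numpy as np

        rng = np.random.default_rng(2)
        s = rng.uniform(-2, 2, 300)
        X = np.column_stack([s, 0.5 * s**2]) + 0.05 * rng.standard_normal((300, 2))
        X -= X.mean(axis=0)

        # Leading principal direction via SVD.
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        score = X @ Vt[0]                       # projection onto first PC
        resid = X - np.outer(score, Vt[0])      # what plain PCA would discard

        # Model each residual coordinate as a polynomial in the score.
        coeffs = [np.polyfit(score, resid[:, j], deg=2) for j in range(2)]
        pred = np.column_stack([np.polyval(c, score) for c in coeffs])

        print("residual var, linear PCA :", resid.var())
        print("residual var, polynomial :", (resid - pred).var())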

  14. The Sensitivity of Atmospheric Water Isotopes to Entrainment and Precipitation Efficiency in a Bulk Plume Model of Convection

    NASA Astrophysics Data System (ADS)

    Duan, S.; Wright, J. S.; Romps, D. M.

    2016-12-01

    Atmospheric water isotopes have been proposed as potentially powerful constraints on the physics of convective clouds and parameterizations of convective processes in models. We have previously derived an analytical model of water vapor (H2O) and one of its heavy isotopes (HDO) in convective environments based on a bulk-plume convective water budget in radiative convective equilibrium. This analytical model provides a useful starting point for examining the joint responses of water vapor and its isotopic composition to changes in convective parameters; however, certain idealistic assumptions are required to make the model analytically solvable. Here, we develop a more flexible numerical framework that enables a wider range of model configurations and includes additional isotopic tracers. This model provides a bridge between Rayleigh distillation, which is simple but inflexible, and more complicated convection schemes and cloud resolving models, which are more realistic but also more difficult to perturb and interpret. Application of realistic in-cloud water profiles in our model produces vertical distributions of δD that qualitatively match satellite observations from the Tropospheric Emission Spectrometer (TES). We test the sensitivity of water vapor and its isotopic composition to a wide range of perturbations in the model parameters and their vertical profiles. In this presentation, we focus especially on establishing constraints for convective entrainment and precipitation efficiency. We conclude by discussing the potential application of this model as part of a larger water isotope toolkit for use with offline diagnostics provided by reanalyses and GCMs.
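
    The Rayleigh-distillation end member that the bulk-plume framework bridges from has a simple closed form, R = R0 f^(alpha-1); the sketch below evaluates the corresponding deltaD, with an assumed constant fractionation factor.

        # Rayleigh distillation of HDO in vapor: isotope ratio R = R0 * f**(alpha-1)
        # as a fraction f of vapor remains. The fractionation factor is an
        # assumed constant; real values vary with temperature.
        import numpy as np

        R_VSMOW = 155.76e-6            # D/H ratio of the VSMOW standard
        alpha = 1.08                   # liquid-vapor fractionation (assumed)

        f = np.linspace(1.0, 0.2, 5)   # remaining vapor fraction
        R = R_VSMOW * 0.9 * f**(alpha - 1.0)    # start vapor at deltaD ~ -100
        deltaD = (R / R_VSMOW - 1.0) * 1000.0   # per mil
        print(np.round(deltaD, 1))     # vapor grows more depleted as f falls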

  15. Quasi-Steady Evolution of Hillslopes in Layered Landscapes: An Analytic Approach

    NASA Astrophysics Data System (ADS)

    Glade, R. C.; Anderson, R. S.

    2018-01-01

    Landscapes developed in layered sedimentary or igneous rocks are common on Earth, as well as on other planets. Features such as hogbacks, exposed dikes, escarpments, and mesas exhibit resistant rock layers adjoining more erodible rock in tilted, vertical, or horizontal orientations. Hillslopes developed in the erodible rock are typically characterized by steep, linear-to-concave slopes or "ramps" mantled with material derived from the resistant layers, often in the form of large blocks. Previous work on hogbacks has shown that feedbacks between weathering and transport of the blocks and underlying soft rock can create relief over time and lead to the development of concave-up slope profiles in the absence of rilling processes. Here we employ an analytic approach, informed by numerical modeling and field data, to describe the quasi-steady state behavior of such rocky hillslopes for the full spectrum of resistant layer dip angles. We begin with a simple geometric analysis that relates structural dip to erosion rates. We then explore the mechanisms by which our numerical model of hogback evolution self-organizes to meet these geometric expectations, including adjustment of soil depth, erosion rates, and block velocities along the ramp. Analytical solutions relate easily measurable field quantities such as ramp length, slope, block size, and resistant layer dip angle to local incision rate, block velocity, and block weathering rate. These equations provide a framework for exploring the evolution of layered landscapes and pinpoint the processes for which we require a more thorough understanding to predict their evolution over time.

  16. Forgetfulness can help you win games.

    PubMed

    Burridge, James; Gao, Yu; Mao, Yong

    2015-09-01

    We present a simple game model where agents with different memory lengths compete for finite resources. We show by simulation and analytically that an instability exists at a critical memory length, and as a result, different memory lengths can compete and coexist in a dynamical equilibrium. Our analytical formulation makes a connection to statistical urn models, and we show that temperature is mirrored by the agent's memory. Our simple model of memory may be incorporated into other game models with implications that we briefly discuss.

  17. Simple functionalization method for single conical pores with a polydopamine layer

    NASA Astrophysics Data System (ADS)

    Horiguchi, Yukichi; Goda, Tatsuro; Miyahara, Yuji

    2018-04-01

    Resistive pulse sensing (RPS) is an interesting analytical system in which micro- to nanosized pores are used to evaluate particles or small analytes. Recently, molecular immobilization techniques to improve the performance of RPS have been reported. The problem in functionalization for RPS is that molecular immobilization by chemical reaction is restricted by the pore material type. Herein, a simple functionalization is performed using mussel-inspired polydopamine as an intermediate layer to connect the pore material with functional molecules.

  18. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  19. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  20. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analyzing exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
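
    The two atomic operators named in the abstract can be pictured on a small attribute graph; the data model in the sketch below is invented for illustration and is not the paper's formal algebra.

        # Hedged sketch of selection and aggregation over an attribute graph.
        from collections import defaultdict

        nodes = {1: {"type": "A"}, 2: {"type": "A"}, 3: {"type": "B"}}
        edges = [(1, 3), (2, 3)]

        def select(nodes, edges, pred):
            """Keep nodes satisfying pred and the edges among them."""
            keep = {n: a for n, a in nodes.items() if pred(a)}
            return keep, [(u, v) for u, v in edges if u in keep and v in keep]

        def aggregate(nodes, edges, key):
            """Merge nodes sharing an attribute value into supernodes."""
            group = {n: key(a) for n, a in nodes.items()}
            super_edges = {(group[u], group[v]) for u, v in edges
                           if group[u] != group[v]}
            counts = defaultdict(int)
            for g in group.values():
                counts[g] += 1
            return dict(counts), sorted(super_edges)

        print(select(nodes, edges, lambda a: a["type"] == "A"))
        print(aggregate(nodes, edges, key=lambda a: a["type"]))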

  1. SABRE hyperpolarization enables high-sensitivity 1H and 13C benchtop NMR spectroscopy.

    PubMed

    Richardson, Peter M; Parrott, Andrew J; Semenova, Olga; Nordon, Alison; Duckett, Simon B; Halse, Meghan E

    2018-06-19

Benchtop NMR spectrometers operating with low magnetic fields of 1-2 T at sub-ppm resolution show great promise as analytical platforms that can be used outside the traditional laboratory environment for industrial process monitoring. One current limitation that reduces the uptake of benchtop NMR is the low sensitivity associated with these weak detection fields. Here we demonstrate how para-hydrogen (p-H2) based signal amplification by reversible exchange (SABRE), a simple-to-implement hyperpolarization technique, enhances agent detectability within the environment of a benchtop (1 T) NMR spectrometer so that informative 1H and 13C NMR spectra can be readily recorded for low-concentration analytes. SABRE-derived 1H NMR signal enhancements of up to 17 000-fold, corresponding to 1H polarization levels of P = 5.9%, were achieved for 26 mM pyridine in d4-methanol in a matter of seconds. Comparable enhancement levels can be achieved in both deuterated and protio solvents, but now the SABRE-enhanced analyte signals dominate due to the comparatively weak thermally-polarized solvent response. The SABRE approach also enables the acquisition of 13C NMR spectra of analytes at natural isotopic abundance in a single scan, as evidenced by hyperpolarized 13C NMR spectra of tens of millimolar concentrations of 4-methylpyridine. Now the associated signal enhancement factors are up to 45 500-fold (P = 4.0%) and achieved in just 15 s. Integration of an automated SABRE polarization system with the benchtop NMR spectrometer framework produces renewable and reproducible NMR signal enhancements that can be exploited for the collection of multi-dimensional NMR spectra, exemplified here by a SABRE-enhanced 2D COSY NMR spectrum.
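
    The quoted enhancement factors can be sanity-checked from first principles: thermal 1H polarization at 1 T follows P = tanh(gamma*hbar*B/(2*kB*T)), so a SABRE polarization of 5.9% corresponds to roughly a 17,000-fold gain at room temperature.

        # Back-of-envelope check of the enhancement figure quoted above.
        import math

        gamma_1H = 2.675e8       # 1H gyromagnetic ratio (rad s^-1 T^-1)
        hbar, kB = 1.0546e-34, 1.3807e-23
        B, T = 1.0, 298.0        # benchtop field (T) and temperature (K)

        P_thermal = math.tanh(gamma_1H * hbar * B / (2.0 * kB * T))
        print(P_thermal)                 # ~3.4e-6
        print(0.059 / P_thermal)         # ~17,000-fold enhancement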

  2. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355

  3. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.
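
    The two records above describe the same standard-addition idea; a toy simulation makes the mechanism concrete. All numbers below are invented for illustration.

        # Toy TAD simulation: an endogenous analyte sits below the noise
        # floor, but a known spike lifts the combined signal above it, and
        # comparing spiked sample vs. spiked blank reveals the endogenous
        # contribution. All values are invented.
        import numpy as np

        rng = np.random.default_rng(3)
        k, noise_sd = 10.0, 1.0            # response factor and noise level
        c_endo, c_spike = 0.1, 0.3         # endogenous below LOD; spike ~LOD

        def signal(conc, n=100):
            return k * conc + rng.normal(0.0, noise_sd, n)

        spiked_sample = signal(c_endo + c_spike)
        spiked_blank = signal(c_spike)
        diff = spiked_sample.mean() - spiked_blank.mean()
        print(f"mean signal difference: {diff:.2f} (expect ~{k * c_endo:.2f})")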

  4. Analysis of pulsating spray flames propagating in lean two-phase mixtures with unity Lewis number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicoli, C.; Haldenwang, P.; Suard, S.

    2005-11-01

Pulsating (or oscillatory) spray flames have recently been observed in experiments on two-phase combustion. Numerical studies have pointed out that such front oscillations can be obtained even with very simple models of homogeneous two-phase mixtures, including elementary vaporization schemes. The paper presents an analytical approach within the simple framework of the thermal-diffusive model, which is complemented by a vaporization rate independent of gas temperature, as soon as the latter reaches a certain thermal threshold (θ_v in reduced form). The study involves the Damkoehler number (Da), the ratio of chemical reaction rate to vaporization rate, and the Zeldovich number (Ze) as essential parameters. We use the standard asymptotic method based on matched expansions in terms of 1/Ze. Linear analysis of two-phase flame stability is performed by studying, in the absence of differential diffusive effects (unity Lewis number), the linear growth rate of 2-D perturbations added to steady plane solutions and characterized by wavenumber k in the direction transverse to spreading. A domain of existence is found for the pulsating regime. It corresponds to mixture characteristics often met in air-fuel two-phase systems: low boiling temperature (θ_v << 1), reaction rate not higher than vaporization rate (Da < 1, i.e., small droplets), and activation temperature assumed to be high compared with flame temperature (Ze ≥ 10). Satisfactory comparison with numerical simulations confirms the validity of the analytical approach; in particular, positive growth rates have been found for planar perturbations (k = 0) and for wrinkled fronts (k ≠ 0). Finally, comparison between predicted frequencies and experimental measurements is discussed.

  5. Adsorption-Induced Deformation of Hierarchically Structured Mesoporous Silica—Effect of Pore-Level Anisotropy

    PubMed Central

    2017-01-01

The goal of this work is to understand adsorption-induced deformation of hierarchically structured porous silica exhibiting well-defined cylindrical mesopores. For this purpose, we performed an in situ dilatometry measurement on a calcined and sintered monolithic silica sample during the adsorption of N2 at 77 K. To analyze the experimental data, we extended the adsorption stress model to account for the anisotropy of cylindrical mesopores, i.e., we explicitly derived the adsorption stress tensor components in the axial and radial direction of the pore. For quantitative predictions of stresses and strains, we applied the theoretical framework of Derjaguin, Broekhoff, and de Boer for adsorption in mesopores and two mechanical models of silica rods with axially aligned pore channels: an idealized cylindrical tube model, which can be described analytically, and an ordered hexagonal array of cylindrical mesopores, whose mechanical response to adsorption stress was evaluated by 3D finite element calculations. The adsorption-induced strains predicted by both mechanical models are in good quantitative agreement, making the cylindrical tube the preferable model for adsorption-induced strains due to its simple analytical nature. The theoretical results are compared with the in situ dilatometry data on a hierarchically structured silica monolith composed of a network of mesoporous struts of MCM-41 type morphology. Analyzing the experimental adsorption and strain data with the proposed theoretical framework, we find that the adsorption-induced deformation of the monolithic sample is reasonably described by a superposition of axial and radial strains calculated on the mesopore level. The structural and mechanical parameters obtained from the model are in good agreement with expectations from independent measurements and literature, respectively. PMID:28547995

  6. Determining when to change course in management actions.

    PubMed

    Ng, Chooi Fei; McCarthy, Michael A; Martin, Tara G; Possingham, Hugh P

    2014-12-01

Time is of the essence in conservation biology. To secure the persistence of a species, we need to understand how to balance time spent among different management actions. A new and simple method to test the efficacy of a range of conservation actions is required. Thus, we devised a general theoretical framework to help determine whether to test a new action and when to cease a trial and revert to an existing action if the new action did not perform well. The framework involves constructing a general population model under the different management actions and specifying a management objective. By maximizing the management objective, we could generate an analytical solution that identifies the optimal timing of when to change management action. We applied the analytical solution to the case of the Christmas Island pipistrelle bat (Pipistrelle murrayi), a species for which captive breeding might have prevented its extinction. For this case, we used our model to determine whether to start a captive breeding program and when to stop a captive breeding program and revert to managing the species in the wild, given that the management goal is to maximize the chance of reaching a target wild population size. For the pipistrelle bat, captive breeding was to start immediately and it was desirable to place the species in captivity for the entire management period. The optimal time to revert to managing the species in the wild was driven by several key parameters, including the management goal, management time frame, and the growth rates of the population under different management actions. Knowing when to change management actions can help conservation managers act in a timely fashion to avoid species extinction. © 2014 Society for Conservation Biology.

  7. Integrated atomic layer deposition and chemical vapor reaction for the preparation of metal organic framework coatings for solid-phase microextraction Arrow.

    PubMed

    Lan, Hangzhen; Salmi, Leo D; Rönkkö, Tuukka; Parshintsev, Jevgeni; Jussila, Matti; Hartonen, Kari; Kemell, Marianna; Riekkola, Marja-Liisa

    2018-09-18

New chemical vapor reaction (CVR) and atomic layer deposition (ALD)-conversion methods were utilized for the preparation of metal organic framework (MOF) coatings for solid-phase microextraction (SPME) Arrow for the first time. With a simple and convenient one-step reaction or conversion, four MOF coatings were made by suspending an ALD iron oxide (Fe2O3) or aluminum oxide (Al2O3) film above terephthalic acid (H2BDC) or trimesic acid (H3BTC) vapor. A UIO-66 coating was made by exposing a zirconium (Zr)-BDC film to acetic acid vapor. As the first documented instance of all-gas-phase synthesis of SPME Arrow coatings, preparation parameters including CVR/conversion time and temperature, acetic acid volume, and metal oxide/metal-ligand film thickness were investigated. The optimal coatings exhibited crystalline structures, excellent uniformity, satisfactory thickness (2-7.5 μm), and high robustness (>80 uses). To study the practical usefulness of the coatings for extraction, several analytes with different chemical properties were tested. The Fe-BDC coating was found to be the most selective and sensitive for the determination of benzene-ring-containing compounds due to its highly hydrophobic surface and unsaturated metal sites. The UIO-66 coating was best for small polar, aromatic, and long-chain polar compounds owing to its high porosity. The usefulness of the new coatings was evaluated by gas chromatography-mass spectrometry (GC-MS) determination of several analytes, present in wastewater samples at three levels of concentration, and satisfactory results were achieved. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Adsorption-Induced Deformation of Hierarchically Structured Mesoporous Silica-Effect of Pore-Level Anisotropy.

    PubMed

    Balzer, Christian; Waag, Anna M; Gehret, Stefan; Reichenauer, Gudrun; Putz, Florian; Hüsing, Nicola; Paris, Oskar; Bernstein, Noam; Gor, Gennady Y; Neimark, Alexander V

    2017-06-06

The goal of this work is to understand adsorption-induced deformation of hierarchically structured porous silica exhibiting well-defined cylindrical mesopores. For this purpose, we performed an in situ dilatometry measurement on a calcined and sintered monolithic silica sample during the adsorption of N2 at 77 K. To analyze the experimental data, we extended the adsorption stress model to account for the anisotropy of cylindrical mesopores, i.e., we explicitly derived the adsorption stress tensor components in the axial and radial direction of the pore. For quantitative predictions of stresses and strains, we applied the theoretical framework of Derjaguin, Broekhoff, and de Boer for adsorption in mesopores and two mechanical models of silica rods with axially aligned pore channels: an idealized cylindrical tube model, which can be described analytically, and an ordered hexagonal array of cylindrical mesopores, whose mechanical response to adsorption stress was evaluated by 3D finite element calculations. The adsorption-induced strains predicted by both mechanical models are in good quantitative agreement, making the cylindrical tube the preferable model for adsorption-induced strains due to its simple analytical nature. The theoretical results are compared with the in situ dilatometry data on a hierarchically structured silica monolith composed of a network of mesoporous struts of MCM-41 type morphology. Analyzing the experimental adsorption and strain data with the proposed theoretical framework, we find that the adsorption-induced deformation of the monolithic sample is reasonably described by a superposition of axial and radial strains calculated on the mesopore level. The structural and mechanical parameters obtained from the model are in good agreement with expectations from independent measurements and literature, respectively.

  9. Determinants of participation in prostate cancer screening: A simple analytical framework to account for healthy-user bias

    PubMed Central

    Tabuchi, Takahiro; Nakayama, Tomio; Fukushima, Wakaba; Matsunaga, Ichiro; Ohfuji, Satoko; Kondo, Kyoko; Kawano, Eiji; Fukuhara, Hiroyuki; Ito, Yuri; Oshima, Akira

    2015-01-01

In Japan at present, fecal occult blood testing (FOBT) is recommended for cancer screening while routine population-based prostate-specific antigen (PSA) screening is not. In the future, it may be necessary to increase participation in the former and decrease it in the latter. Our objectives were to explore determinants of PSA-screening participation while simultaneously taking into account factors associated with FOBT. Data were gathered from a cross-sectional study conducted with random sampling of 6191 adults in Osaka city in 2011. Of 3244 subjects (return rate 52.4%), 936 men aged 40–64 years were analyzed using log-binomial regression to explore factors related to PSA-screening participation within 1 year. Only responders to cancer screening, defined as men who participated in either FOBT or PSA testing, were used as the main study subjects. Men who were older (prevalence ratio [PR] [95% confidence interval (CI)] = 2.17 [1.43, 3.28] for 60–64 years compared with 40–49 years), had technical or junior college education (PR [95% CI] = 1.76 [1.19, 2.59] compared with men with high school or less), and followed doctors' recommendations (PR [95% CI] = 1.50 [1.00, 2.26]) were significantly more likely to have PSA screening after multiple variable adjustment among cancer-screening responders. Attenuation in the PRs of hypothesized common factors was observed among cancer-screening responders compared with the usual approach (among total subjects). Using the analytical framework to account for healthy-user bias, we found three factors related to participation in PSA screening, with attenuated associations of common factors. This approach may provide a more sophisticated interpretation of participation in various screenings with different levels of recommendation. PMID:25456306
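
    For readers unfamiliar with log-binomial regression, the sketch below fits a GLM with a binomial family and log link to simulated data, so that exponentiated coefficients are prevalence ratios rather than odds ratios; the data and the single covariate are invented.

        # Log-binomial regression sketch: exp(coefficient) is a prevalence
        # ratio. Data are simulated; this is not the study's dataset.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 936
        older = rng.integers(0, 2, n)                 # e.g., aged 60-64 vs 40-49
        p = 0.15 * np.where(older == 1, 2.0, 1.0)     # true PR = 2.0
        y = rng.binomial(1, p)

        X = sm.add_constant(older.astype(float))
        model = sm.GLM(y, X, family=sm.families.Binomial(
            link=sm.families.links.Log())).fit()
        print(np.exp(model.params[1]))                # estimated PR, ~2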

  10. Multifield stochastic particle production: beyond a maximum entropy ansatz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amin, Mustafa A.; Garcia, Marcos A.G.; Xie, Hong-Yi

    2017-09-01

We explore non-adiabatic particle production for N_f coupled scalar fields in a time-dependent background with stochastically varying effective masses, cross-couplings and intervals between interactions. Under the assumption of weak scattering per interaction, we provide a framework for calculating the typical particle production rates after a large number of interactions. After setting up the framework, for analytic tractability, we consider interactions (effective masses and cross couplings) characterized by series of Dirac-delta functions in time with amplitudes and locations drawn from different distributions. Without assuming that the fields are statistically equivalent, we present closed form results (up to quadratures) for the asymptotic particle production rates for the N_f = 1 and N_f = 2 cases. We also present results for the general N_f > 2 case, but with more restrictive assumptions. We find agreement between our analytic results and direct numerical calculations of the total occupation number of the produced particles, with departures that can be explained in terms of violation of our assumptions. We elucidate the precise connection between the maximum entropy ansatz (MEA) used in Amin and Baumann (2015) and the underlying statistical distribution of the self and cross couplings. We provide and justify a simple to use (MEA-inspired) expression for the particle production rate, which agrees with our more detailed treatment when the parameters characterizing the effective mass and cross-couplings between fields are all comparable to each other. However, deviations are seen when some parameters differ significantly from others. We show that such deviations become negligible for a broad range of parameters when N_f >> 1.

  11. Analytical closed-form solutions to the elastic fields of solids with dislocations and surface stress

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed

    2013-07-01

The concept of eigenstrain is adopted to derive a general analytical framework to solve the elastic field for 3D anisotropic solids with general defects by considering the surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions to the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and those of the complex variable method. The stress fields from this work demonstrate the impact of the surface stress when the size of the nanowire shrinks, an impact that becomes negligible at macroscopic scale. Compared with the power series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.

  12. Experimental Validation of Lightning-Induced Electromagnetic (Indirect) Coupling to Short Monopole Antennas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crull, E W; Brown Jr., C G; Perkins, M P

    2008-07-30

For short monopoles in this low-power case, it has been shown that a simple circuit model is capable of accurate predictions for the shape and magnitude of the antenna response to lightning-generated electric field coupling effects, provided that the elements of the circuit model have accurate values. Numerical EM simulation can be used to provide more accurate values for the circuit elements than the simple analytical formulas, since the analytical formulas are used outside of their region of validity. However, even with the approximate analytical formulas the simple circuit model produces reasonable results, which would improve if more accurate analytical models were used. This report discusses the coupling analysis approaches taken to understand the interaction between a time-varying EM field and a short monopole antenna, within the context of lightning safety for nuclear weapons at DOE facilities. It describes the validation of a simple circuit model using a laboratory study in order to understand the indirect coupling of energy into a part, and the resulting voltage. Results show that in this low-power case, the circuit model predicts peak voltages within approximately 32% using circuit component values obtained from analytical formulas and about 13% using circuit component values obtained from numerical EM simulation. We note that the analytical formulas are used outside of their region of validity. First, the antenna is insulated and not a bare wire, and there are perhaps fringing field effects near the termination of the outer conductor that the formula does not take into account. Also, the effective height formula is for a monopole directly over a ground plane, while in the time-domain measurement setup the monopole is elevated above the ground plane by about 1.5 inches (refer to Figure 5).

  13. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  14. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  15. Strehl ratio: a tool for optimizing optical nulls and singularities.

    PubMed

    Hénault, François

    2015-07-01

    In this paper a set of radial and azimuthal phase functions are reviewed that have a null Strehl ratio, which is equivalent to generating a central extinction in the image plane of an optical system. The study is conducted in the framework of Fraunhofer scalar diffraction, and is oriented toward practical cases where optical nulls or singularities are produced by deformable mirrors or phase plates. The identified solutions reveal unexpected links with the zeros of type-J Bessel functions of integer order. They include linear azimuthal phase ramps giving birth to an optical vortex, azimuthally modulated phase functions, and circular phase gratings (CPGs). It is found in particular that the CPG radiometric efficiency could be significantly improved by the null Strehl ratio condition. Simple design rules for rescaling and combining the different phase functions are also defined. Finally, the described analytical solutions could also serve as starting points for an automated searching software tool.

  16. Embedding dynamical networks into distributed models

    NASA Astrophysics Data System (ADS)

    Innocenti, Giacomo; Paoletti, Paolo

    2015-07-01

    Large networks of interacting dynamical systems are well-known for the complex behaviours they are able to display, even when each node features a quite simple dynamics. Despite examples of such networks being widespread both in nature and in technological applications, the interplay between the local and the macroscopic behaviour, through the interconnection topology, is still not completely understood. Moreover, traditional analytical methods for dynamical response analysis fail because of the intrinsically large dimension of the phase space of the network which makes the general problem intractable. Therefore, in this paper we develop an approach aiming to condense all the information in a compact description based on partial differential equations. By focusing on propagative phenomena, rigorous conditions under which the original network dynamical properties can be successfully analysed within the proposed framework are derived as well. A network of Fitzhugh-Nagumo systems is finally used to illustrate the effectiveness of the proposed method.

  17. Scalets, wavelets and (complex) turning point quantization

    NASA Astrophysics Data System (ADS)

    Handy, C. R.; Brooks, H. A.

    2001-05-01

    Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z2x2 + gx4, the quartic potential, V(x) = x4, and the very interesting and significant non-Hermitian potential V(x) = -(ix)3, recently studied by Bender and Boettcher.

  18. Unravelling daily human mobility motifs

    PubMed Central

    Schneider, Christian M.; Belik, Vitaly; Couronné, Thomas; Smoreda, Zbigniew; González, Marta C.

    2013-01-01

    Human mobility is differentiated by time scales. While the mechanism for long time scales has been studied, the underlying mechanism on the daily scale is still unrevealed. Here, we uncover the mechanism responsible for the daily mobility patterns by analysing the temporal and spatial trajectories of thousands of persons as individual networks. Using the concept of motifs from network theory, we find only 17 unique networks are present in daily mobility and they follow simple rules. These networks, called here motifs, are sufficient to capture up to 90 per cent of the population in surveys and mobile phone datasets for different countries. Each individual exhibits a characteristic motif, which seems to be stable over several months. Consequently, daily human mobility can be reproduced by an analytically tractable framework for Markov chains by modelling periods of high-frequency trips followed by periods of lower activity as the key ingredient. PMID:23658117
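
    A minimal sketch of the motif idea (illustrative only; the paper's data pipeline and motif catalogue are not reproduced here): one day of visited locations is reduced to a directed graph of transitions between distinct places, and days are then grouped into isomorphism classes.

        import networkx as nx

        def daily_motif(locations):
            """Build the directed transition graph for one day's location sequence."""
            g = nx.DiGraph()
            for a, b in zip(locations, locations[1:]):
                if a != b:                      # ignore repeated stays at one place
                    g.add_edge(a, b)
            return g

        def count_motifs(days):
            """Group daily graphs into isomorphism classes and count each class."""
            classes = []                        # list of (representative_graph, count)
            for day in days:
                g = daily_motif(day)
                for i, (rep, n) in enumerate(classes):
                    if nx.is_isomorphic(g, rep):
                        classes[i] = (rep, n + 1)
                        break
                else:
                    classes.append((g, 1))
            return classes

        # Two hypothetical days: home-work-home and home-work-shop-home.
        days = [["H", "W", "H"], ["H", "W", "S", "H"]]
        for rep, n in count_motifs(days):
            print(sorted(rep.edges()), n)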

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeng, Li; Jacobsen, Stein B., E-mail: astrozeng@gmail.com, E-mail: jacobsen@neodymium.harvard.edu

    In the past few years, the number of confirmed planets has grown above 2000. It is clear that they represent a diversity of structures not seen in our own solar system. In addition to very detailed interior modeling, it is valuable to have a simple analytical framework for describing planetary structures. The variational principle is a fundamental principle in physics, entailing that a physical system follows the trajectory that minimizes its action. It is an alternative to the differential-equation formulation of a physical system. Applying the variational principle to the planetary interior can beautifully summarize the set of differential equations into one, which provides some insight into the problem. From this principle, a universal mass–radius relation, an estimate of the error propagation from the equation of state to the mass–radius relation, and a form of the virial theorem applicable to planetary interiors are derived.
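
    For context, a standard result any such framework must reproduce in the polytropic limit (textbook material, not taken from this record): hydrostatic equilibrium and mass continuity,

        \frac{dP}{dr} = -\frac{G\,m(r)\,\rho}{r^2}, \qquad \frac{dm}{dr} = 4\pi r^2 \rho,

    combined with a polytropic equation of state P = K\rho^{1+1/n}, give the scaling

        R \propto M^{(1-n)/(3-n)},

    so that, for example, an n = 1 polytrope has a radius independent of its mass.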

  20. Interaction of two walkers: wave-mediated energy and force.

    PubMed

    Borghesi, Christian; Moukhtar, Julien; Labousse, Matthieu; Eddi, Antonin; Fort, Emmanuel; Couder, Yves

    2014-12-01

    A bouncing droplet, self-propelled by its interaction with the waves it generates, forms a classical wave-particle association called a "walker." Previous works have demonstrated that the dynamics of a single walker is driven by its global surface wave field that retains information on its past trajectory. Here we investigate the energy stored in this wave field for two coupled walkers and how it conveys an interaction between them. For this purpose, we characterize experimentally the "promenade modes" where two walkers are bound and propagate together. Their possible binding distances take discrete values, and the velocity of the pair depends on their mutual binding. The mean parallel motion can be either rectilinear or oscillating. The experimental results are recovered analytically with a simple theoretical framework. A relation between the kinetic energy of the droplets and the total energy of the standing waves is established.

  1. Perspectives on scaling and multiscaling in passive scalar turbulence

    NASA Astrophysics Data System (ADS)

    Banerjee, Tirthankar; Basu, Abhik

    2018-05-01

    We revisit the well-known problem of multiscaling in substances passively advected by homogeneous and isotropic turbulent flows, or passive scalar turbulence. To that end we propose a two-parameter continuum hydrodynamic model for an advected substance concentration θ, parametrized jointly by y and ȳ, which characterize the spatial scaling behavior of the variances of the advecting stochastic velocity and the stochastic additive driving force, respectively. We analyze it within a one-loop dynamic renormalization group method to calculate the multiscaling exponents of the equal-time structure functions of θ. We show how the interplay between the advective velocity and the additive force may lead to simple scaling or multiscaling. In one limit, our results reduce to the well-known results from the Kraichnan model for a passive scalar. Our framework of analysis should be of help for analytical approaches to the still intractable problem of fluid turbulence itself.

  2. Analyzing public health policy: three approaches.

    PubMed

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  3. Effect of deposition rate and NNN interactions on adatoms mobility in epitaxial growth

    NASA Astrophysics Data System (ADS)

    Hamouda, Ajmi B. H.; Mahjoub, B.; Blel, S.

    2017-07-01

    This paper provides a detailed analysis of the surface diffusion problem during epitaxial step-flow growth using a simple theoretical model for the diffusion equation of the adatom concentration. Within this framework, an analytical expression for the adatom mobility as a function of the deposition rate and the next-nearest-neighbor (NNN) interactions is derived and compared with the effective mobility computed from kinetic Monte Carlo (kMC) simulations. For the small step velocities and relatively weak deposition rates commonly used for copper growth, excellent quantitative agreement with the theoretical prediction is found. The effective adatom mobility is shown to decrease exponentially with the NNN interaction strength and to increase roughly linearly with the deposition rate F. The effective step stiffness and the adatom mobility are also shown to be closely related to the concentration of kinks.

  4. Temporal correlation functions of concentration fluctuations: an anomalous case.

    PubMed

    Lubelski, Ariel; Klafter, Joseph

    2008-10-09

    We calculate, within the framework of the continuous time random walk (CTRW) model, multiparticle temporal correlation functions of concentration fluctuations (CCF) in systems that display anomalous subdiffusion. The subdiffusion stems from the nonstationary nature of the CTRW waiting times, which also lead to aging and ergodicity breaking. Due to aging, a system of diffusing particles tends to slow down as time progresses, and therefore, the temporal correlation functions strongly depend on the initial time of measurement. As a consequence, time averages of the CCF differ from ensemble averages, displaying therefore ergodicity breaking. We provide a simple example that demonstrates the difference between these two averages, a difference that might be amenable to experimental tests. We focus on the case of ensemble averaging and assume that the preparation time of the system coincides with the starting time of the measurement. Our analytical calculations are supported by computer simulations based on the CTRW model.

  5. A universal indicator of critical state transitions in noisy complex networked systems

    PubMed Central

    Liang, Junhao; Hu, Yanqing; Chen, Guanrong; Zhou, Tianshou

    2017-01-01

    Critical transition, a phenomenon in which a system shifts suddenly from one state to another, occurs in many real-world complex networks. We propose an analytical framework for exactly predicting the critical transition in a complex networked system subjected to noise effects. Our prediction is based on the characteristic return time of a simple one-dimensional system derived from the original higher-dimensional system. This characteristic time, which can be easily calculated using network data, allows us to systematically separate the respective roles of dynamics, noise and topology in the underlying networked system. We find that noise can either prevent or enhance critical transitions, playing a key role in compensating for network structural defects arising from internal failures, environmental changes, or both. Our analysis of realistic and artificial examples reveals that the characteristic return time is an effective indicator for forecasting the sudden deterioration of complex networks. PMID:28230166

  6. Crystal structure optimisation using an auxiliary equation of state

    NASA Astrophysics Data System (ADS)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron

    2015-11-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
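
    A minimal sketch of the auxiliary-EOS step, assuming the standard third-order Birch-Murnaghan form and invented energy-volume samples (the paper's workflow and data are not reproduced):

        import numpy as np
        from scipy.optimize import curve_fit

        def birch_murnaghan(V, E0, V0, B0, Bp):
            """Third-order Birch-Murnaghan energy-volume equation of state."""
            eta = (V0 / V) ** (2.0 / 3.0)
            return E0 + 9.0 * V0 * B0 / 16.0 * (
                (eta - 1.0) ** 3 * Bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
            )

        # Hypothetical energy-volume samples (volume in A^3, energy in eV):
        V = np.array([36.0, 38.0, 40.0, 42.0, 44.0])
        E = np.array([-8.10, -8.25, -8.30, -8.27, -8.18])

        # Fit the EOS and read off the predicted equilibrium volume V0.
        popt, _ = curve_fit(birch_murnaghan, V, E,
                            p0=[E.min(), V[np.argmin(E)], 1.0, 4.0])
        E0, V0, B0, Bp = popt
        print(f"predicted equilibrium volume: {V0:.2f} A^3")

    With a known EOS, even a single energy point near the minimum constrains V0 once the other parameters are taken from a cheaper functional, which is the refinement strategy the abstract describes.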

  7. Critical length scale controls adhesive wear mechanisms

    PubMed Central

    Aghababaei, Ramin; Warner, Derek H.; Molinari, Jean-Francois

    2016-01-01

    The adhesive wear process remains one of the least understood areas of mechanics. While it has long been established that adhesive wear is a direct result of contacting surface asperities, an agreed-upon understanding of how contacting asperities lead to wear debris particles has remained elusive. This has restricted adhesive wear prediction to empirical models with limited transferability. Here we show that discrepant observations and predictions of two distinct adhesive wear mechanisms can be reconciled into a unified framework. Using atomistic simulations with model interatomic potentials, we reveal a transition in the asperity wear mechanism when contact junctions fall below a critical length scale. A simple analytic model is formulated to predict the transition in both the simulation results and experiments. This new understanding may help expand the use of computer modelling to explore adhesive wear processes and to advance physics-based wear laws without empirical coefficients. PMID:27264270

  8. [Developments in preparation and experimental method of solid phase microextraction fibers].

    PubMed

    Yi, Xu; Fu, Yujie

    2004-09-01

    Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique, which concentrates volatile or nonvolatile compounds from liquid samples or the headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It provides many advantages, such as a wide linear range, low solvent and sample consumption, short analytical times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium and non-equilibrium theory. Novel developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, newly developed fabrication techniques, such as sol-gel, electronic deposition, carbon-based adsorption, and high-temperature epoxy immobilization, are presented. Effects of extraction modes, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. Finally, a brief perspective on SPME is given.

  9. Environmental Stewardship: A Conceptual Review and Analytical Framework.

    PubMed

    Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  10. Environmental Stewardship: A Conceptual Review and Analytical Framework

    NASA Astrophysics Data System (ADS)

    Bennett, Nathan J.; Whitty, Tara S.; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H.

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  11. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistence messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the implementation of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, a single dedicated zookeeper program is used to start the different functions or tasks, i.e., the stompShell programs, needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions and future work of the proposed framework will be presented.
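
    A minimal sketch of how a job might be submitted as a JSON message over STOMP to ActiveMQ, using the stomp.py client (queue name, message fields and credentials are invented for illustration; the framework's actual message schema is not reproduced):

        import json
        import stomp  # stomp.py client; API names per stomp.py v8

        # Hypothetical job description in JSON, the wire format described above.
        job = {
            "task": "process_waveform",
            "machine": "node-01",
            "args": {"input": "/data/run42.mseed", "window_s": 60},
        }

        # Connect to the broker on ActiveMQ's default STOMP port and enqueue the job.
        conn = stomp.Connection([("localhost", 61613)])
        conn.connect("user", "password", wait=True)
        conn.send(destination="/queue/workflow.jobs", body=json.dumps(job),
                  content_type="application/json")
        conn.disconnect()

    A stompShell-like consumer would subscribe to the same queue, parse the JSON body, and launch the named task, which is the pattern the abstract describes.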

  12. Information Tailoring Enhancements for Large Scale Social Data

    DTIC Science & Technology

    2016-03-15

    Work performed within this reporting period: implemented temporal analysis algorithms for advanced analytics in Scraawl, including a backend web service design for temporal analysis and a prototype GUI web service for the Scraawl analytics dashboard; upgraded the Scraawl computational framework to increase…

  13. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  14. Vitamin K

    USDA-ARS?s Scientific Manuscript database

    A wide range of analytical techniques are available for the detection, quantitation, and evaluation of vitamin K in foods. The methods vary from simple to complex depending on extraction, separation, identification and detection of the analyte. Among the extraction methods applied for vitamin K anal...

  15. Many-objective reservoir policy identification and refinement to reduce policy inertia and myopia in water management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P.

    2014-04-01

    This study contributes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key trade-offs between alternative policies for balancing competing demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. We have identified a baseline operating policy for the Conowingo Dam that closely reproduces the dynamics of current releases and flows for the Lower Susquehanna and thus can be used to represent the preference structure guiding current operations. Starting from this baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the trade-offs within the Lower Susquehanna. Our results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the system's reliability in meeting the reservoir's competing demands. Our proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties while also better addressing the trade-offs across the Conowingo Dam's multisector services.

  16. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high-performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.

  17. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level developing trends of those industries and, in turn, provide references for industrial spatial planning. However, the analysis process is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses the standard deviational ellipse (SDE) and the shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use an enterprise registration dataset for Mainland China from 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
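
    A minimal sketch of the two statistics named above: the gravity (mean) center of a set of point locations and its standard deviational ellipse, here obtained from the eigen-decomposition of the coordinate covariance matrix (coordinates are invented; the paper's Spark parallelization is omitted):

        import numpy as np

        def gravity_center(xy):
            """Mean center of an (N, 2) array of point coordinates."""
            return xy.mean(axis=0)

        def standard_deviational_ellipse(xy):
            """Return (center, semi-axes, rotation angle in radians) of the SDE."""
            center = gravity_center(xy)
            cov = np.cov((xy - center).T)               # 2x2 covariance matrix
            eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
            semi_axes = np.sqrt(eigvals)                # std. dev. along each axis
            angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis direction
            return center, semi_axes, angle

        # Hypothetical enterprise coordinates (lon, lat):
        xy = np.array([[114.3, 30.6], [114.4, 30.7], [114.2, 30.5], [114.5, 30.8]])
        center, axes, angle = standard_deviational_ellipse(xy)
        print(center, axes, np.degrees(angle))

    Tracking how the center and ellipse change year by year gives the shifting routes of gravity centers the framework visualizes.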

  18. An Analytical Framework for the Steady State Impact of Carbonate Compensation on Atmospheric CO2

    NASA Astrophysics Data System (ADS)

    Omta, Anne Willem; Ferrari, Raffaele; McGee, David

    2018-04-01

    The deep-ocean carbonate ion concentration impacts the fraction of the marine calcium carbonate production that is buried in sediments. This gives rise to the carbonate compensation feedback, which is thought to restore the deep-ocean carbonate ion concentration on multimillennial timescales. We formulate an analytical framework to investigate the impact of carbonate compensation under various changes in the carbon cycle relevant for anthropogenic change and glacial cycles. Using this framework, we show that carbonate compensation amplifies, by 15-20%, changes in atmospheric CO2 resulting from a redistribution of carbon between the atmosphere and ocean (e.g., due to changes in temperature, salinity, or nutrient utilization). A counterintuitive result emerges when the impact of organic matter burial in the ocean is examined. Organic matter burial first leads to a slight decrease in atmospheric CO2 and an increase in the deep-ocean carbonate ion concentration. Subsequently, enhanced calcium carbonate burial leads to outgassing of carbon from the ocean to the atmosphere, which is quantified by our framework. Results from simulations with a multibox model including the minor acids and bases important for the ocean-atmosphere exchange of carbon are consistent with our analytical predictions. We discuss the potential role of carbonate compensation in glacial-interglacial cycles as an example of how our theoretical framework may be applied.

  19. Closed-form solutions and scaling laws for Kerr frequency combs

    PubMed Central

    Renninger, William H.; Rakich, Peter T.

    2016-01-01

    A single closed-form analytical solution of the driven nonlinear Schrödinger equation is developed, reproducing a large class of the behaviors in Kerr-comb systems, including bright-solitons, dark-solitons, and a large class of periodic wavetrains. From this analytical framework, a Kerr-comb area theorem and a pump-detuning relation are developed, providing new insights into soliton- and wavetrain-based combs along with concrete design guidelines for both. This new area theorem reveals significant deviation from the conventional soliton area theorem, which is crucial to understanding cavity solitons in certain limits. Moreover, these closed-form solutions represent the first step towards an analytical framework for wavetrain formation, and reveal new parameter regimes for enhanced Kerr-comb performance. PMID:27108810
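
    For reference, the "conventional soliton area theorem" mentioned above is the fixed amplitude-width product of the undriven nonlinear Schrödinger soliton (standard result; the paper's driven, closed-form solutions generalize this baseline):

        i\,\psi_t + \tfrac{1}{2}\,\psi_{xx} + |\psi|^2\psi = 0
        \quad\Rightarrow\quad
        \psi(x,t) = \eta\,\mathrm{sech}\big(\eta\,(x - vt)\big)\,
                    e^{\,i\left(vx + \tfrac{1}{2}(\eta^2 - v^2)\,t\right)},

    so the amplitude \eta and inverse width are locked together; it is deviations from this locking in the driven, detuned cavity that the new area theorem quantifies.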

  20. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models have their pros and cons in the accuracy of reflecting realistic antennas and the computational complexity, we propose a new analytical directional antenna model called the iris model to balance the accuracy against the complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model on the network connectivity is accurate, and our iris antenna model can provide a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081

  1. Developing an Analytical Framework for Argumentation on Energy Consumption Issues

    ERIC Educational Resources Information Center

    Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.

    2015-01-01

    In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…

  2. A Theoretical Framework of the Relation between Socioeconomic Status and Academic Achievement of Students

    ERIC Educational Resources Information Center

    Lam, Gigi

    2014-01-01

    A socio-psychological analytical framework will be adopted to illuminate the relation between socioeconomic status and academic achievement. The framework emphasizes incorporating micro-level familial factors into the macro-level factor of the tracking system. Initially, children of poor families always lack a major prerequisite: diminution of cognitive…

  3. European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective

    ERIC Educational Resources Information Center

    Bouder, Annie

    2008-01-01

    Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…

  4. High temperature polymer degradation: Rapid IR flow-through method for volatile quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giron, Nicholas H.; Celina, Mathew C.

    Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or of intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles that are contained in small ampoules after aging exposures. The approach requires identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. The volatiles are flushed out of the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e., the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.
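
    A minimal sketch of the quantification step under stated assumptions: a baseline-corrected, time-resolved IR band area is integrated over the N2 flush and converted to micrograms with a calibration factor obtained from known standards (all names and numbers are invented for illustration):

        import numpy as np

        def quantify_volatile(t, band_area, k_cal):
            """Integrate the analyte band area over the flush and apply calibration.

            t         : time points (s)
            band_area : integrated absorbance of the analyte band at each time
            k_cal     : micrograms per (absorbance * second), from standards
            """
            # Trapezoidal time integration of the absorbance signal.
            integral = np.sum((band_area[1:] + band_area[:-1]) / 2.0 * np.diff(t))
            return k_cal * integral

        t = np.linspace(0, 120, 240)                      # 2-minute flush
        band_area = 0.02 * np.exp(-((t - 30) / 12) ** 2)  # synthetic elution pulse
        print(f"{quantify_volatile(t, band_area, k_cal=1.2e3):.1f} ug CO2 (synthetic)")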

  5. High temperature polymer degradation: Rapid IR flow-through method for volatile quantification

    DOE PAGES

    Giron, Nicholas H.; Celina, Mathew C.

    2017-05-19

    Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or of intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles that are contained in small ampoules after aging exposures. The approach requires identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. The volatiles are flushed out of the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e., the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.

  6. Prediction of down-gradient impacts of DNAPL source depletion using tracer techniques: Laboratory and modeling validation

    NASA Astrophysics Data System (ADS)

    Jawitz, J. W.; Basu, N.; Chen, X.

    2007-05-01

    Interwell application of coupled nonreactive and reactive tracers through aquifer contaminant source zones enables quantitative characterization of aquifer heterogeneity and contaminant architecture. Parameters obtained from tracer tests are presented here in a Lagrangian framework that can be used to predict the dissolution of nonaqueous phase liquid (NAPL) contaminants. Nonreactive tracers are commonly used to provide information about travel time distributions in hydrologic systems. Reactive tracers have more recently been introduced as a tool to quantify the amount of NAPL contaminant present within the tracer swept volume. Our group has extended reactive tracer techniques to also characterize NAPL spatial distribution heterogeneity. By conceptualizing the flow field through an aquifer as a collection of streamtubes, the aquifer hydrodynamic heterogeneities may be characterized by a nonreactive tracer travel time distribution, and NAPL spatial distribution heterogeneity may be similarly described using reactive travel time distributions. The combined statistics of these distributions are used to derive a simple analytical solution for contaminant dissolution. This analytical solution, and the tracer techniques used for its parameterization, were validated both numerically and experimentally. Illustrative applications are presented from numerical simulations using the multiphase flow and transport simulator UTCHEM, and laboratory experiments of surfactant-enhanced NAPL remediation in two-dimensional flow chambers.

  7. Comparison of maximum runup through analytical and numerical approaches for different fault parameters estimates

    NASA Astrophysics Data System (ADS)

    Kanoglu, U.; Wronna, M.; Baptista, M. A.; Miranda, J. M. A.

    2017-12-01

    The one-dimensional analytical runup theory, in combination with near-shore synthetic waveforms, is a promising tool for tsunami rapid early warning systems. Its application to realistic cases with complex bathymetry and initial wave conditions from inverse modelling has shown that maximum runup values can be estimated reasonably well. In this study we generate simplified bathymetry domains that resemble realistic near-shore features. We investigate the sensitivity of the analytical runup formulae to variations in fault source parameters and near-shore bathymetric features. To do this we systematically vary the fault plane parameters to compute the initial tsunami wave condition. Subsequently, we use the initial conditions to run the numerical tsunami model on a coupled system of four nested grids and compare the results to the analytical estimates. Variation of the dip angle of the fault plane showed that the analytical estimates differ by less than 10% for angles of 5-45 degrees in a simple bathymetric domain. These results show that the use of analytical formulae for fast runup estimates constitutes a very promising approach in a simple bathymetric domain and might be implemented in hazard mapping and early warning.
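
    A representative example of such an analytical runup formula is Synolakis' (1987) law for a solitary wave of height H in depth d running up a plane beach of slope angle β (quoted for context; the formulae used in this study may differ in detail):

        \frac{R}{d} = 2.831\,\sqrt{\cot\beta}\,\left(\frac{H}{d}\right)^{5/4},

    where R is the maximum vertical runup.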

  8. Stress Analysis of Beams with Shear Deformation of the Flanges

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul

    1937-01-01

    This report discusses the fundamental action of shear deformation of the flanges on the basis of simplifying assumptions. The theory is developed to the point of giving analytical solutions for simple cases of beams and of skin-stringer panels under axial load. Strain-gage tests on a tension panel and on a beam corresponding to these simple cases are described and the results are compared with analytical results. For wing beams, an approximate method of applying the theory is given. As an alternative, the construction of a mechanical analyzer is advocated.

  9. A framework for analyzing contagion in assortative banking networks

    PubMed Central

    Hurd, Thomas R.; Gleeson, James P.; Melnik, Sergey

    2017-01-01

    We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk. PMID:28231324
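
    To make the "iterated mapping on edge probabilities" idea concrete, here is a generic Gleeson-style sketch for a configuration-model network with a simple threshold response (an illustration of the technique only, not the paper's banking-specific mapping):

        import math

        def poisson_pk(k, z):
            """Poisson degree distribution with mean degree z."""
            return math.exp(-z) * z**k / math.factorial(k)

        def G(u, z, theta, kmax=40):
            """One iteration of the edge-probability map for threshold theta.

            u is the probability that an edge leads to a defaulted node; a node
            of degree k defaults when the defaulted fraction of its neighbors
            reaches theta.
            """
            total = 0.0
            for k in range(1, kmax):
                qk = k * poisson_pk(k, z) / z          # excess-degree weighting
                for m in range(k):                     # defaulted among k-1 others
                    binom = math.comb(k - 1, m)
                    response = 1.0 if m / k >= theta else 0.0
                    total += qk * binom * u**m * (1 - u)**(k - 1 - m) * response
            return total

        # Tiny seed; if G'(0) > 1 (the cascade condition, the analogue of R0 > 1),
        # iteration flows away from u = 0 to a nontrivial fixed point.
        u, z, theta = 1e-3, 5.0, 0.18
        for _ in range(200):
            u = G(u, z, theta)
        print(f"fixed point u* = {u:.4f}")             # u* of order 1: global cascade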

  10. On the representability problem and the physical meaning of coarse-grained models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.

    2016-07-28

    In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on the CG coordinates. Often in these cases it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and the underlying FG models. Here we present and investigate these representability challenges and illustrate them via the bottom-up conceptual framework for several simple, analytically tractable polymer models. The examples give special focus to the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, and also discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons apply to top-down CG modeling as well, with crucial implications for simulation at constant pressure and surface tension and for the interpretation of structural and thermodynamic correlations in comparison to experiment.

  11. A framework for analyzing contagion in assortative banking networks.

    PubMed

    Hurd, Thomas R; Gleeson, James P; Melnik, Sergey

    2017-01-01

    We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk.

  12. Urban Partnership Agreement and Congestion Reduction Demonstration : National Evaluation Framework

    DOT National Transportation Integrated Search

    2008-11-21

    This report provides an analytical framework for evaluating six deployments under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and Congestion Reduction Demonstration (CRD) Programs. The six UPA/CRD sites...

  13. Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches

    ERIC Educational Resources Information Center

    Wagner, Ellen; Longanecker, David

    2016-01-01

    The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…

  14. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  15. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project at NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  16. Evaluation of Copper-1,3,5-benzenetricarboxylate Metal-organic Framework (Cu-MOF) as a Selective Sorbent for Lewis-base Analytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.

    2011-09-01

    The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column that contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.

  17. A Simple Analytical Model for Magnetization and Coercivity of Hard/Soft Nanocomposite Magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihoon; Hong, Yang-Ki; Lee, Woncheol

    Here, we present a simple analytical model to estimate the magnetization (σs) and intrinsic coercivity (Hci) of a hard/soft nanocomposite magnet using the mass fraction. Previously proposed models are based on the volume fraction of the hard phase of the composite, but it is difficult to measure the volume of the hard- or soft-phase material of a composite. We synthesized Sm2Co7/Fe-Co, MnAl/Fe-Co, MnBi/Fe-Co, and BaFe12O19/Fe-Co composites for characterization of their σs and Hci. The experimental results are in good agreement with the present model. Therefore, this analytical model can be extended to predict the maximum energy product (BH)max of a hard/soft composite.

  18. A Simple Analytical Model for Magnetization and Coercivity of Hard/Soft Nanocomposite Magnets

    DOE PAGES

    Park, Jihoon; Hong, Yang-Ki; Lee, Woncheol; ...

    2017-07-10

    Here, we present a simple analytical model to estimate the magnetization (σs) and intrinsic coercivity (Hci) of a hard/soft nanocomposite magnet using the mass fraction. Previously proposed models are based on the volume fraction of the hard phase of the composite, but it is difficult to measure the volume of the hard- or soft-phase material of a composite. We synthesized Sm2Co7/Fe-Co, MnAl/Fe-Co, MnBi/Fe-Co, and BaFe12O19/Fe-Co composites for characterization of their σs and Hci. The experimental results are in good agreement with the present model. Therefore, this analytical model can be extended to predict the maximum energy product (BH)max of a hard/soft composite.

  19. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  20. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  1. Coma dust scattering concepts applied to the Rosetta mission

    NASA Astrophysics Data System (ADS)

    Fink, Uwe; Rinaldi, Giovanna

    2015-09-01

    This paper describes basic concepts, as well as providing a framework, for the interpretation of the light scattered by the dust in a cometary coma as observed by instruments on a spacecraft such as Rosetta. It is shown that the expected optical depths are small enough that single scattering can be applied. Each of the quantities that contribute to the scattered intensity is discussed in detail. Using optical constants of the likely coma dust constituents, olivine, pyroxene and carbon, the scattering properties of the dust are calculated. For the resulting observable scattering intensities, several particle size distributions are considered: a simple power law, power laws with a small-particle cut-off, and log-normal distributions with various parameters. Within the context of a simple outflow model, the standard definition of Afρ for a circular observing aperture is expanded to an equivalent Afρ for an annulus and specific line-of-sight observation. The resulting equivalence between the observed intensity and Afρ is used to predict observable intensities for 67P/Churyumov-Gerasimenko at the spacecraft encounter near 3.3 AU and near perihelion at 1.3 AU. This is done by normalizing particle production rates of various size distributions to agree with observed ground-based Afρ values. Various geometries for the column densities in a cometary coma are considered. The calculations for a simple outflow model are compared with more elaborate Direct Simulation Monte Carlo (DSMC) models to define the limits of applicability of the simpler analytical approach. Thus our analytical approach can be applied to the majority of the Rosetta coma observations, particularly beyond several nuclear radii where the dust is no longer in a collisional environment, without recourse to computer-intensive DSMC calculations for specific cases. In addition to a spherically symmetric 1-dimensional approach, we investigate column densities for the 2-dimensional DSMC model on the day and night side of the comet. Our calculations are also applied to estimates of the dust particle densities and flux, which are useful for the in-situ experiments on Rosetta.
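
    For reference, the standard circular-aperture definition of Afρ (A'Hearn et al. 1984) that the paper generalizes to annuli and specific lines of sight is

    \[ Af\rho = \frac{(2\,r_h\,\Delta)^2}{\rho}\,\frac{F_{\mathrm{com}}}{F_{\odot}}, \]

    where r_h is the heliocentric distance in AU, Δ the comet-observer distance and ρ the aperture radius (both in cm), and F_com/F_⊙ is the ratio of the observed cometary flux to the solar flux at 1 AU in the same bandpass.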

  2. Analyzing Electronic Question/Answer Services: Framework and Evaluations of Selected Services.

    ERIC Educational Resources Information Center

    White, Marilyn Domas, Ed.

    This report develops an analytical framework based on systems analysis for evaluating electronic question/answer or AskA services operated by a wide range of types of organizations, including libraries. Version 1.0 of this framework was applied in June 1999 to a selective sample of 11 electronic question/answer services, which cover a range of…

  3. Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates

    ERIC Educational Resources Information Center

    Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu

    2007-01-01

    In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…

  4. Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting

    DTIC Science & Technology

    2006-12-01

    Building on this background information and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed.

  5. Teacher Identity and Numeracy: Developing an Analytic Lens for Understanding Numeracy Teacher Identity

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2013-01-01

    This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…

  6. Green Framework and Its Role in Sustainable City Development (by Example of Yekaterinburg)

    NASA Astrophysics Data System (ADS)

    Maltseva, A.

    2017-11-01

    The article focuses on the destruction of the city's green framework in Yekaterinburg. A strategy for its recovery by means of a bioactive core, represented by a botanic garden, is proposed, and an analytical framework for tracking changes in the proportion of green territory within the total city area is described.

  7. Integrated corridor management initiative : demonstration phase evaluation - final national evaluation framework.

    DOT National Transportation Integrated Search

    2012-05-01

    This report provides an analytical framework for evaluating the two field deployments under the United States Department of Transportation (U.S. DOT) Integrated Corridor Management (ICM) Initiative Demonstration Phase. The San Diego Interstate 15 cor...

  8. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.

    PubMed

    Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-11-18

    Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. The objective was to design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and can classify the patients generally into 5 different types including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. Mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution.
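
    The bed-capacity figures quoted above can be reproduced, to first order, with a classical loss-queue calculation. The sketch below is our assumption about the underlying arithmetic (an Erlang-B model fed with the abstract's arrival rate, length of stay, and bed count), not the authors' published code:

    ```python
    def erlang_b(offered_load: float, servers: int) -> float:
        """Blocking probability of an M/M/c/c queue via the stable Erlang-B recursion."""
        b = 1.0
        for k in range(1, servers + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    arrival_rate = 4.5                 # patients per day (from the abstract)
    mean_stay = 16.0                   # days (from the abstract)
    beds = 36

    load = arrival_rate * mean_stay    # offered load in Erlangs (72)
    blocking = erlang_b(load, beds)
    occupancy = load * (1.0 - blocking)

    print(f"blocked: {blocking:.0%}, mean occupancy: {occupancy:.1f} beds")
    ```

    With 36 beds this gives roughly half of the arrivals blocked and a mean census close to the 34.9 patients reported; re-running with beds = 90 drives the blocking probability toward zero, consistent with the claim that the expanded unit accommodates all patients.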

  9. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework

    PubMed Central

    McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-01-01

    Background: Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. Objective: To design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods: We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results: We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds and can classify the patients generally into 5 different types including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. Mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions: Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268

  10. Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods

    NASA Astrophysics Data System (ADS)

    Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.

    2012-03-01

    In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which has a higher radiation dose due to its cine scanning technique. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) parameter as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and degrade CT perfusion maps greatly if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is used, and then the model parameters are estimated from a Bayesian formulation of prior smoothness constraints on perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in the combination of an analytical piecewise residual function with a Bayesian framework using a simpler prior spatial constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded an increase in signal-to-noise ratio (SNR) of 78% and a decrease in mean-square error (MSE) of 40% at a low-dose tube current of 43 mA.
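
    The generic form of such an estimator is a maximum a posteriori problem with a quadratic smoothness prior. The snippet below is an illustrative sketch of that form (our simplification; the paper's piecewise residual-function model is more elaborate):

    ```python
    import numpy as np

    def map_estimate(A, y, lam):
        """argmin_x ||y - A x||^2 + lam * ||D x||^2, with D a first-difference
        operator encoding the spatial/temporal smoothness prior."""
        n = A.shape[1]
        D = np.eye(n) - np.eye(n, k=1)   # simple first-difference operator
        return np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)
    ```

    Larger lam trades fidelity to the noisy low-dose data for smoother perfusion parameters, which is how such a method recovers usable maps at 43 mA.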

  11. Distinctive aspects of the evolution of galactic magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yar-Mukhamedov, D., E-mail: danial.su@gmail.com

    2016-11-15

    We perform an in-depth analysis of the evolution of galactic magnetic fields within a semi-analytic galaxy formation and evolution framework, determine various distinctive aspects of the evolution process, and obtain analytic solutions for a wide range of possible evolution scenarios.

  12. Strategic, Analytic and Operational Domains of Information Management.

    ERIC Educational Resources Information Center

    Diener, Richard AV

    1992-01-01

    Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…

  13. Statistics of dislocation pinning at localized obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.

    2014-10-14

    Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron-irradiated type 316 stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
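
    The geometrical-statistics idea can be illustrated with a small calculation (our toy version, with assumed parameter values): for a Poisson field of spheres with number density n_v and size density p(r), the mean number of obstacles intersected by a straight segment of length L is n_v·π·⟨r²⟩·L, and the pinning counts follow Poisson statistics.

    ```python
    import numpy as np

    def mean_pins(n_v, radii, pdf, L):
        """E[# obstacles intersected] = n_v * pi * <r^2> * L for a straight
        segment of length L threading a Poisson field of spheres."""
        r2 = np.sum(pdf * radii**2) / np.sum(pdf)      # discrete <r^2>
        return n_v * np.pi * r2 * L

    radii = np.linspace(0.5e-9, 5e-9, 200)             # nanovoid radii, m (assumed)
    pdf = np.exp(-((radii - 2e-9) / 0.8e-9) ** 2)      # illustrative size distribution
    lam = mean_pins(n_v=1e23, radii=radii, pdf=pdf, L=1e-7)
    print(f"mean pins = {lam:.2f}, P(no pinning) = {np.exp(-lam):.3f}")
    ```

    A mean count well below one corresponds to the rare-pinning regime; as the size distribution shifts with irradiation temperature, the same expression moves the system toward multiple pinning.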

  14. A Learning Framework for Winner-Take-All Networks with Stochastic Synapses.

    PubMed

    Mostafa, Hesham; Cauwenberghs, Gert

    2018-06-01

    Many recent generative models make use of neural networks to transform the probability distribution of a simple low-dimensional noise process into the complex distribution of the data. This raises the question of whether biological networks operate along similar principles to implement a probabilistic model of the environment through transformations of intrinsic noise processes. The intrinsic neural and synaptic noise processes in biological networks, however, are quite different from the noise processes used in current abstract generative networks. This, together with the discrete nature of spikes and local circuit interactions among the neurons, raises several difficulties when using recent generative modeling frameworks to train biologically motivated models. In this letter, we show that a biologically motivated model based on multilayer winner-take-all circuits and stochastic synapses admits an approximate analytical description. This allows us to use the proposed networks in a variational learning setting where stochastic backpropagation is used to optimize a lower bound on the data log likelihood, thereby learning a generative model of the data. We illustrate the generality of the proposed networks and learning technique by using them in a structured output prediction task and a semisupervised learning task. Our results extend the domain of application of modern stochastic network architectures to networks where synaptic transmission failure is the principal noise mechanism.
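
    The "lower bound on the data log likelihood" is, generically, the variational bound (the letter's circuit-specific form differs in its parametrization of q):

    \[ \log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] \;-\; \mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p(z)\big), \]

    and stochastic backpropagation optimizes it with Monte Carlo gradient estimates taken through the stochastic synaptic transmission.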

  15. Interactive Resource Planning—An Anticipative Concept in the Simulation-Based Decision Support System EXPOSIM

    NASA Astrophysics Data System (ADS)

    Leopold-Wildburger, Ulrike; Pickl, Stefan

    2008-10-01

    In our research we intend to use experiments to study human behavior in a simulation environment based on a simple Lotka-Volterra predator-prey ecology. The aim is to study the influence of participants' harvesting strategies and certain personality traits derived from [1] on the outcome in terms of sustainability and economic performance. Such an approach is embedded in a research program which intends to develop and understand interactive resource planning processes. We present the general framework as well as the new decision support system EXPOSIM. The key element is the combination of experimental design, analytical understanding of time-discrete systems (especially Lotka-Volterra systems) and economic performance. In the first part, the general role of laboratory experiments is discussed. The second part summarizes the concept of sustainable development; it is taken from [18]. As we use Lotka-Volterra systems as the basis for our simulations, a theoretical framework is described afterwards. It is possible to determine optimal behavior for those systems. The empirical setting puts the subjects into the position of a decision-maker: they are able to model the environment in such a way that harvesting can be observed. We suggest an experimental setting which might lead to new insights in an anticipatory sense.
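
    A minimal sketch of the kind of time-discrete predator-prey dynamics with harvesting that such an experiment is built on (parameter values and the harvest policy are our illustrative assumptions, not EXPOSIM's):

    ```python
    def step(prey, pred, harvest, a=0.1, b=0.002, c=0.001, m=0.05):
        """One period of a discrete Lotka-Volterra system; harvest is taken
        from the prey stock after natural growth and predation."""
        prey_next = prey + prey * (a - b * pred) - harvest
        pred_next = pred + pred * (c * prey - m)
        return max(prey_next, 0.0), max(pred_next, 0.0)

    prey, pred, yield_total = 100.0, 20.0, 0.0
    for t in range(50):
        harvest = 0.02 * prey              # a simple proportional policy
        prey, pred = step(prey, pred, harvest)
        yield_total += harvest
    print(f"prey={prey:.1f}, predators={pred:.1f}, total harvest={yield_total:.1f}")
    ```

    Comparing the long-run stock levels against the accumulated harvest under different policies is precisely the sustainability-versus-performance trade-off the experiments measure.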

  16. Contaminant attenuation by shallow aquifer systems under steady flow

    NASA Astrophysics Data System (ADS)

    Soltani, S. S.; Cvetkovic, V.

    2017-10-01

    We present a framework for analyzing advection-dominated solute transport and transformation in aquifer systems of boreal catchments that are typically shallow and rest on crystalline bedrock. A methodology is presented for estimating tracer discharge based on particle trajectories from recharge to discharge locations and computing their first passage times assuming that the flow pattern is approximately steady-state. Transformation processes can be included by solving one-dimensional reactive transport with randomized water travel time as the independent variable; the distribution of the travel times incorporates morphological dispersion (due to catchment geometry/topography) as well as macro-dispersion (due to heterogeneity of underlying hydraulic properties). The implementation of the framework is illustrated for the well characterized coastal catchment of Forsmark (Sweden). We find that macro-dispersion has a notable effect on attenuation even though the morphological dispersion is significantly larger. Preferential flow on the catchment scale is found to be considerable with only 5% of the Eulerian velocities contributing to transport over the simulation period of 375 years. Natural attenuation is illustrated as a simple (linear decay) transformation process. Simulated natural attenuation can be estimated analytically reasonably well by using basic hydrological and structural information, the latter being the pathway length distribution and average aquifer depth to the bedrock.
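
    For a linear decay rate k, the travel-time formulation reduces the attenuation estimate to an expectation over the travel-time density f(τ) (our restatement of the generic form the abstract invokes):

    \[ \frac{M_{\mathrm{out}}}{M_{\mathrm{in}}} \;=\; \int_0^{\infty} e^{-k\tau}\, f(\tau)\,\mathrm{d}\tau, \]

    where f(τ) carries both the morphological dispersion set by catchment geometry and the macro-dispersion from hydraulic heterogeneity.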

  17. Programming chemistry in DNA-addressable bioreactors

    PubMed Central

    Fellermann, Harold; Cardelli, Luca

    2014-01-01

    We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. PMID:25121647

  18. Using the RE-AIM framework to evaluate a school-based municipal programme tripling time spent on PE.

    PubMed

    Nielsen, Jonas Vestergaard; Skovgaard, Thomas; Bredahl, Thomas Viskum Gjelstrup; Bugge, Anna; Wedderkopp, Niels; Klakk, Heidi

    2018-06-01

    Documenting the implementation of effective real-world programmes is considered an important step to support the translation of evidence into practice. Thus, the aim of this study was to identify factors influencing the adoption, implementation and maintenance of the Svendborg Project (SP) - an effective real-world programme requiring schools to implement triple the amount of physical education (PE) from pre-school to sixth grade in six primary schools in the municipality of Svendborg, Denmark. SP has been maintained for ten years and scaled up to all municipal schools since it was initiated in 2008. The Reach, Effectiveness, Adoption, Implementation and Maintenance (RE-AIM) framework was applied as an analytic tool through a convergent mixed-method triangulation design. Results show that SP has been implemented with high fidelity and has become an established part of the municipality and school identity. The successful implementation and dissemination of the programme has been enabled through the introduction of a predominantly bottom-up approach combined with simple non-negotiable requirements. The results show that this combination has led to a better fit of programmes to the individual school context while still obtaining high implementation fidelity. Finally, the early integration of research has legitimated and benefitted the programme.

  19. OPC and PSM design using inverse lithography: a nonlinear optimization approach

    NASA Astrophysics Data System (ADS)

    Poonawala, Amyn; Milanfar, Peyman

    2006-03-01

    We propose a novel method for the fast synthesis of low-complexity model-based optical proximity correction (OPC) and phase shift masks (PSM) to improve the resolution and pattern fidelity of optical microlithography. We use the pixel-based mask representation, a continuous function formulation, and gradient-based iterative optimization techniques to solve the above inverse problem. The continuous function formulation allows analytic calculation of the gradient. Pixel-based parametrization provides tremendous liberty in terms of the features possible in the synthesized masks, but also suffers the inherent disadvantage that the masks are very complex and difficult to manufacture. We therefore introduce a regularization framework, a useful tool which provides the flexibility to promote certain desirable properties in the solution. We employ the above framework to ensure that the estimated masks have only two or three (allowable) transmission values and are also comparatively simple and easy to manufacture. The results demonstrate that we are able to bring the CD on target using OPC masks. Furthermore, we were able to boost the contrast of the aerial image using attenuated, strong, and 100% transmission phase shift masks. Our algorithm automatically (and optimally) adds assist-bars, dog-ears, serifs, anti-serifs, and other custom structures best suited for printing the desired pattern.
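
    A toy pixel-based inverse-lithography loop illustrating the ingredients named above: a continuous mask parametrization, a differentiable forward model, and gradient descent on a pattern-fidelity cost. This is our sketch under simplified toy optics (a Gaussian blur standing in for the projection system), not the authors' implementation:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def optimize_mask(target, steps=200, lr=2.0, sigma=2.0, a=25.0, thr=0.3):
        """Gradient descent on ||resist(mask) - target||^2 with mask = sigmoid(theta)."""
        theta = np.zeros_like(target)
        for _ in range(steps):
            mask = 1.0 / (1.0 + np.exp(-theta))              # keeps mask in [0, 1]
            field = gaussian_filter(mask, sigma)              # toy aerial field
            z = 1.0 / (1.0 + np.exp(-a * (field**2 - thr)))   # resist threshold model
            # chain rule; the Gaussian blur is self-adjoint, so H^T = H:
            g = gaussian_filter(2 * (z - target) * z * (1 - z) * a * 2 * field, sigma)
            theta -= lr * g * mask * (1 - mask)               # d(mask)/d(theta)
        return 1.0 / (1.0 + np.exp(-theta))
    ```

    The paper's regularization terms would be added to this cost to push the recovered mask toward two or three allowed transmission values and manufacturable geometry.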

  20. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    PubMed

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to help land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  1. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options

    PubMed Central

    Reed, M.S.; Podesta, G.; Fazey, I.; Geeson, N.; Hessel, R.; Hubacek, K.; Letson, D.; Nainggolan, D.; Prell, C.; Rickenbach, M.G.; Ritsema, C.; Schwilch, G.; Stringer, L.C.; Thomas, A.D.

    2013-01-01

    Experts working on behalf of international development organisations need better tools to help land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change. PMID:25844020

  2. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  3. A Simple Analytic Model for Estimating Mars Ascent Vehicle Mass and Performance

    NASA Technical Reports Server (NTRS)

    Woolley, Ryan C.

    2014-01-01

    The Mars Ascent Vehicle (MAV) is a crucial component in any sample return campaign. In this paper we present a universal model for a two-stage MAV along with the analytic equations and simple parametric relationships necessary to quickly estimate MAV mass and performance. Ascent trajectories can be modeled as two-burn transfers from the surface with appropriate loss estimations for finite burns, steering, and drag. Minimizing lift-off mass is achieved by balancing optimized staging and an optimized path-to-orbit. This model allows designers to quickly find optimized solutions and to see the effects of design choices.
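
    The core of such a model is the two-stage rocket equation with lumped loss terms. The sketch below is a hedged re-creation of that first-order sizing arithmetic; all numbers are illustrative assumptions, not the paper's values:

    ```python
    import math

    G0 = 9.80665                              # m/s^2

    def liftoff_mass(payload, dv_split, isp, inert_frac):
        """Stack mass for a two-stage vehicle, sized top-down with
        Tsiolkovsky's equation and inert mass proportional to propellant."""
        m = payload
        for dv in reversed(dv_split):         # upper stage first
            r = math.exp(dv / (G0 * isp))     # stage mass ratio
            prop = m * (r - 1.0) / (1.0 - inert_frac * (r - 1.0))
            m += prop * (1.0 + inert_frac)    # add propellant plus inert mass
        return m

    dv_total = 4100.0 + 350.0                 # m/s to low Mars orbit + lumped losses (assumed)
    print(liftoff_mass(payload=20.0, dv_split=(dv_total / 2, dv_total / 2),
                       isp=290.0, inert_frac=0.15))
    ```

    Minimizing lift-off mass then amounts to searching over the delta-v split and staging parameters rather than assuming the even split used here.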

  4. Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets

    DOT National Transportation Integrated Search

    2014-01-01

    SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...

  5. Joseph v. Brady: Synthesis Reunites What Analysis Has Divided

    ERIC Educational Resources Information Center

    Thompson, Travis

    2012-01-01

    Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…

  6. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    PubMed

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  7. On the analytical form of the Earth's magnetic attraction expressed as a function of time

    NASA Technical Reports Server (NTRS)

    Carlheim-Gyllenskold, V.

    1983-01-01

    An attempt is made to express the Earth's magnetic attraction in simple analytical form using observations during the 16th to 19th centuries. Observations of the magnetic inclination in the 16th and 17th centuries are discussed.

  8. Medication information leaflets for patients: the further validation of an analytic linguistic framework.

    PubMed

    Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle

    2009-01-01

    While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures, or on healthcare information more widely.

  9. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  10. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  11. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  12. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  13. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    The objective of this paper is to describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  14. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter one (OFIDIA), the data analytics framework is being exploited to provide operational support regarding processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among others, parallel data analysis, metadata management, virtual file system tasks, maps generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.

  15. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here to also reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution-series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution-series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
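
    For orientation, the usual single-particle arithmetic downstream of such an analysis converts each particle event to a mass via the dissolved calibration and transport efficiency, then to an equivalent spherical diameter. The sketch below uses assumed instrument settings and is our illustration, not the authors' protocol:

    ```python
    import math

    def particle_diameter_nm(counts, slope, flow_L_s, dwell_s, eta, rho_g_cm3):
        """counts: background-subtracted event intensity;
        slope: dissolved calibration, counts per dwell per (ug/L);
        eta: nebulization transport efficiency; rho: particle density."""
        mass_g = counts * flow_L_s * dwell_s * eta / slope * 1e-6   # ug -> g
        d_cm = (6.0 * mass_g / (math.pi * rho_g_cm3)) ** (1.0 / 3.0)
        return d_cm * 1e7                                            # cm -> nm

    # a hypothetical Ag event under assumed settings (comes out near 80 nm):
    print(particle_diameter_nm(counts=250, slope=120.0, flow_L_s=5e-6,
                               dwell_s=0.005, eta=0.05, rho_g_cm3=10.49))
    ```

    The dilution-series logic above feeds this calculation: the first dilution fixes the dissolved baseline to subtract, and the optimized particle signal supplies the event intensities.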

  16. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.

  17. Facilitating Multiple Intelligences through Multimodal Learning Analytics

    ERIC Educational Resources Information Center

    Perveen, Ayesha

    2018-01-01

    This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need to facilitate students as…

  18. Phonon scattering in nanoscale systems: lowest order expansion of the current and power expressions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2006-04-01

    We use the non-equilibrium Green's function method to describe the effects of phonon scattering on the conductance of nano-scale devices. Useful and accurate approximations are developed that both provide (i) computationally simple formulas for large systems and (ii) simple analytical models. In addition, the simple models can be used to fit experimental data and provide physical parameters.

  19. A simple and accurate rule-based modeling framework for simulation of autocrine/paracrine stimulation of glioblastoma cell motility and proliferation by L1CAM in 2-D culture.

    PubMed

    Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S

    2017-12-11

    Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) commonly are used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate, yet simple, rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein using freely available NetLogo software. In our model L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro in the laboratory. Our framework can be modified easily to suit the needs of investigators interested in other similar intrinsic or extrinsic stimuli that influence cancer or other cell behavior. This modeling framework of a commonly used experimental motility assay (scratch assay) should be useful to both researchers of cell motility and students in a cell biology teaching laboratory.
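
    A compressed Python analogue of such a rule set (assumed for illustration; the authors' model is written in NetLogo with receptor-level detail) shows the two rules the framework couples, stimulated motility and stimulated division:

    ```python
    import random

    def scratch_assay(n_cells=200, steps=100, stim=1.5, speed=1.0, p_div=0.01):
        """Cells random-walk and divide; 'stim' stands in for L1CAM signalling,
        scaling both motility and proliferation."""
        cells = [(random.uniform(-50.0, 0.0), random.uniform(0.0, 100.0))
                 for _ in range(n_cells)]              # seeded left of the wound
        for _ in range(steps):
            nxt = []
            for x, y in cells:
                x += random.uniform(-speed * stim, speed * stim)
                y += random.uniform(-speed * stim, speed * stim)
                nxt.append((x, y))
                if random.random() < p_div * stim:     # stimulated division
                    nxt.append((x, y))
            cells = nxt
        in_wound = sum(1 for x, _ in cells if x > 0) / len(cells)
        return len(cells), in_wound

    print(scratch_assay())   # final population, fraction that entered the wound
    ```

    Fitting the stimulation factor and base rates against measured wound closure is what anchors such a model to the laboratory data.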

  20. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
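
    A self-contained toy of the Map and Reduce tasks described above (plain Python, no Hadoop; the diagnosis-code example is our invention): map emits key-value pairs, a sort stands in for the shuffle, and reduce aggregates each key group.

    ```python
    from itertools import groupby

    def map_phase(records):
        for rec in records:
            for code in rec["codes"]:
                yield (code, 1)                        # emit (key, value)

    def reduce_phase(pairs):
        pairs = sorted(pairs)                          # "shuffle": group by key
        return {key: sum(v for _, v in grp)
                for key, grp in groupby(pairs, key=lambda p: p[0])}

    records = [{"codes": ["I10", "E11"]}, {"codes": ["I10"]}]
    print(reduce_phase(map_phase(records)))            # {'E11': 1, 'I10': 2}
    ```

    Hadoop distributes exactly these two phases across HDFS data nodes, adding the fault-tolerant replication and batch scheduling described above.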

  1. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.

  2. Keeping It Simple: Can We Estimate Malting Quality Potential Using an Isothermal Mashing Protocol and Common Laboratory Instrumentation?

    USDA-ARS's Scientific Manuscript database

    Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...

  3. Analytical evaluation of current starch methods used in the international sugar industry: Part I

    USDA-ARS's Scientific Manuscript database

    Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...

  4. Metal-Organic Framework Modified Glass Substrate for Analysis of Highly Volatile Chemical Warfare Agents by Paper Spray Mass Spectrometry.

    PubMed

    Dhummakupt, Elizabeth S; Carmany, Daniel O; Mach, Phillip M; Tovar, Trenton M; Ploskonka, Ann M; Demond, Paul S; DeCoste, Jared B; Glaros, Trevor

    2018-03-07

    Paper spray mass spectrometry has been shown to successfully analyze chemical warfare agent (CWA) simulants. However, due to the volatility differences between the simulants and real G-series (i.e., sarin, soman) CWAs, analysis from an untreated paper substrate proved difficult. To extend the analytical lifetime of these G-agents, metal-organic frameworks (MOFs) were successfully integrated onto the paper spray substrates to increase adsorption and desorption. In this study, several MOFs and nanoparticles were tested to extend the analytical lifetimes of sarin, soman, and cyclosarin on paper spray substrates. It was found that the addition of either UiO-66 or HKUST-1 to the paper substrate increased the analytical lifetime of the G-agents from less than 5 min detectability to at least 50 min.

  5. Development of an analytical model for estimating global terrestrial carbon assimilation using a rate-limitation framework

    NASA Astrophysics Data System (ADS)

    Donohue, Randall; Yang, Yuting; McVicar, Tim; Roderick, Michael

    2016-04-01

    A fundamental question in climate and ecosystem science is "how does climate regulate the land surface carbon budget?" To better answer that question, here we develop an analytical model for estimating mean annual terrestrial gross primary productivity (GPP), which is the largest carbon flux over land, based on a rate-limitation framework. Actual GPP (climatological mean from 1982 to 2010) is calculated as a function of the balance between two GPP potentials defined by the climate (i.e., precipitation and solar radiation) and a third parameter that encodes other environmental variables and modifies the GPP-climate relationship. The developed model was tested at three spatial scales using different GPP sources, i.e., (1) observed GPP from 94 flux-sites, (2) modelled GPP (using the model-tree-ensemble approach) at 48654 (0.5 degree) grid-cells and (3) at 32 large catchments across the globe. Results show that the proposed model could account for the spatial GPP patterns, with a root-mean-square error of 0.70, 0.65 and 0.3 g C m-2 d-1 and R2 of 0.79, 0.92 and 0.97 for the flux-site, grid-cell and catchment scales, respectively. This analytical GPP model shares a similar form with the Budyko hydroclimatological model, which opens the possibility of a general analytical framework to analyze the linked carbon-water-energy cycles.
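
    The abstract does not reproduce the model's closed form, so the following is only a hedged sketch of a Budyko-type rate-limitation curve: two climate-defined GPP potentials are blended through a single shape parameter k, standing in for the third, environment-encoding parameter described above. The Choudhury-style mean used here is an assumption, not the paper's actual equation.

      def gpp_estimate(gpp_precip, gpp_radiation, k=2.0):
          """Blend a precipitation-defined and a radiation-defined GPP
          potential (g C m-2 d-1) into an actual-GPP estimate (assumed form)."""
          return (gpp_precip * gpp_radiation) / (
              (gpp_precip**k + gpp_radiation**k) ** (1.0 / k)
          )

      # Actual GPP stays below the smaller (limiting) potential.
      print(round(gpp_estimate(3.0, 5.0), 2))  # 2.57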

  6. Cultural Cleavage and Criminal Justice.

    ERIC Educational Resources Information Center

    Scheingold, Stuart A.

    1978-01-01

    Reviews major theories of criminal justice, proposes an alternative analytic framework which focuses on cultural factors, applies this framework to several cases, and discusses implications of a cultural perspective for rule of law values. Journal available from Office of Publication, Department of Political Science, University of Florida,…

  7. Institutional Racist Melancholia: A Structural Understanding of Grief and Power in Schooling

    ERIC Educational Resources Information Center

    Vaught, Sabina E.

    2012-01-01

    In this article, Sabina Vaught undertakes the theoretical and analytical project of conceptually integrating "Whiteness as property", a key structural framework of Critical Race Theory (CRT), and "melancholia", a framework originally emerging from psychoanalysis. Specifically, Vaught engages "Whiteness as property" as…

  8. Video-Based Analyses of Motivation and Interaction in Science Classrooms

    NASA Astrophysics Data System (ADS)

    Moeller Andersen, Hanne; Nielsen, Birgitte Lund

    2013-04-01

    An analytical framework for examining students' motivation was developed and used for analyses of video excerpts from science classrooms. The framework was developed in an iterative process involving theories on motivation and video excerpts from a 'motivational event' where students worked in groups. Subsequently, the framework was used for an analysis of students' motivation in the whole class situation. A cross-case analysis was carried out illustrating characteristics of students' motivation dependent on the context. This research showed that students' motivation to learn science is stimulated by a range of different factors, with autonomy, relatedness and belonging apparently being the main sources of motivation. The teacher's combined use of questions, uptake and high level evaluation was very important for students' learning processes and motivation, especially students' self-efficacy. By coding and analysing video excerpts from science classrooms, we were able to demonstrate that the analytical framework helped us gain new insights into the effect of teachers' communication and other elements on students' motivation.

  9. A SIMPLE, EFFICIENT SOLUTION OF FLUX-PROFILE RELATIONSHIPS IN THE ATMOSPHERIC SURFACE LAYER

    EPA Science Inventory

    This note describes a simple scheme for analytical estimation of the surface layer similarity functions from state variables. What distinguishes this note from the many previous papers on this topic is that this method is specifically targeted for numerical models where simplici...

  10. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    NASA Astrophysics Data System (ADS)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, yet efficiently managing and analyzing such big data poses a grand challenge to climatologists. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.
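
    The ClimateRDD API itself is not shown in the abstract, but the chunk-pruning pattern it describes (a spatiotemporal index that avoids reading non-matching chunks) can be sketched with stock PySpark SQL. The chunk table and paths below are illustrative.

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("chunk-pruning-sketch").getOrCreate()

      # One row per stored chunk: bounding box in space/time plus a path.
      chunks = spark.createDataFrame(
          [(10.0, 20.0, "2000-01", "hdfs://host/chunks/c0"),
           (30.0, 40.0, "2000-01", "hdfs://host/chunks/c1")],
          ["lat_min", "lat_max", "month", "path"],
      )
      chunks.createOrReplaceTempView("chunk_index")

      # The spatiotemporal index lets a query touch only intersecting chunks.
      hits = spark.sql(
          "SELECT path FROM chunk_index "
          "WHERE month = '2000-01' AND lat_min <= 15.0 AND lat_max >= 15.0"
      )
      hits.show()  # only c0 would be read and decoded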

  11. Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes

    DTIC Science & Technology

    2010-08-01

    interchangeable and useful in a common contextual framework. Currently, both simulations use a common scenario, the same fictitious country, and...culture, legal framework, and institutions. • Incorporate Principles of Good Governance and Respect for Human Rights: Stress accountability and...Preparing for the assessments requires defining the missions to be analyzed; subdividing the mission definitions to provide a framework for analytic work

  12. Using Learning Analytics for Preserving Academic Integrity

    ERIC Educational Resources Information Center

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…

  13. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  14. Translating Learning into Numbers: A Generic Framework for Learning Analytics

    ERIC Educational Resources Information Center

    Greller, Wolfgang; Drachsler, Hendrik

    2012-01-01

    With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning…

  15. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    ERIC Educational Resources Information Center

    Dawson, Shane; Siemens, George

    2014-01-01

    The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…

  16. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the distribution of the datasets - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.

  17. A hierarchical framework for investigating epiphyte assemblages: networks, meta-communities, and scale.

    PubMed

    Burns, K C; Zotz, G

    2010-02-01

    Epiphytes are an important component of many forested ecosystems, yet our understanding of epiphyte communities lags far behind that of terrestrial-based plant communities. This discrepancy is exacerbated by the lack of a theoretical context to assess patterns in epiphyte community structure. We attempt to fill this gap by developing an analytical framework to investigate epiphyte assemblages, which we then apply to a data set on epiphyte distributions in a Panamanian rain forest. On a coarse scale, interactions between epiphyte species and host tree species can be viewed as bipartite networks, similar to pollination and seed dispersal networks. On a finer scale, epiphyte communities on individual host trees can be viewed as meta-communities, or suites of local epiphyte communities connected by dispersal. Similar analytical tools are typically employed to investigate species interaction networks and meta-communities, thus providing a unified analytical framework to investigate coarse-scale (network) and fine-scale (meta-community) patterns in epiphyte distributions. Coarse-scale analysis of the Panamanian data set showed that most epiphyte species interacted with fewer host species than expected by chance. Fine-scale analyses showed that epiphyte species richness on individual trees was lower than null model expectations. Therefore, epiphyte distributions were clumped at both scales, perhaps as a result of dispersal limitations. Scale-dependent patterns in epiphyte species composition were observed. Epiphyte-host networks showed evidence of negative co-occurrence patterns, which could arise from adaptations among epiphyte species to avoid competition for host species, while most epiphyte meta-communities were distributed at random. Application of our "meta-network" analytical framework in other locales may help to identify general patterns in the structure of epiphyte assemblages and their variation in space and time.

  18. Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology.

    PubMed

    Poon, Art F Y

    2015-09-01

    The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this "kernel-ABC" method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
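
    The essence of the ABC step described above is easy to sketch: simulate from the model, compare simulation to observation under a distance, and keep the parameters whose simulations fall within a tolerance. In this minimal Python sketch the tree simulator and the kernel distance are simple stand-ins (a scalar summary and an absolute difference), not Poon's actual tree-shape kernel.

      import random

      def simulate_summary(rate):
          # Stand-in for simulating an epidemic, reconstructing a phylogeny,
          # and summarizing its shape; kernel-ABC compares whole trees here.
          return rate + random.gauss(0.0, 0.1)

      def abc_rejection(observed, prior_draw, distance, eps, n=10000):
          accepted = []
          for _ in range(n):
              theta = prior_draw()
              if distance(simulate_summary(theta), observed) < eps:
                  accepted.append(theta)
          return accepted

      random.seed(1)
      posterior = abc_rejection(
          observed=2.0,
          prior_draw=lambda: random.uniform(0.0, 5.0),
          distance=lambda a, b: abs(a - b),
          eps=0.05,
      )
      # Accepted draws concentrate near the true rate of 2.0.
      print(len(posterior), sum(posterior) / len(posterior))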

  19. Spatiotemporal causal modeling for the management of Dengue Fever

    NASA Astrophysics Data System (ADS)

    Yu, Hwa-Lung; Huang, Tailin; Lee, Chieh-Han

    2015-04-01

    Increasing climatic extremes have caused growing concerns about health effects and disease outbreaks. The association between climate variation and the occurrence of epidemic diseases plays an important role in a country's public health system. Part of the impacts are direct casualties associated with the increasing frequency and intensity of typhoons, the proliferation of disease vectors and the short-term increase of clinic visits for gastro-intestinal discomfort, diarrhea, dermatosis, or psychological trauma. Other impacts come indirectly from the influence of disasters on ecological and socio-economic systems, including changes in air/water quality, living environment and employment conditions. Previous risk assessment studies on dengue fever focus mostly on climatic and non-climatic factors and their association with the vectors' reproduction patterns. The public-health implication may appear simple. Considering seasonal changes and regional differences, however, the causality of the impacts is full of uncertainties. Without further investigation, the underlying dengue fever risk dynamics may not be assessed accurately. The objective of this study is to develop an epistemic framework for assessing dynamic dengue fever risk across space and time. The proposed framework integrates cross-departmental data, including public-health databases, precipitation data over time and various socio-economic data. We explore public-health issues induced by typhoons through literature review and spatiotemporal analytic techniques applied to public health databases. From those data, we identify relevant variables and possible causal relationships, together with their spatiotemporal patterns derived from our proposed spatiotemporal techniques. Eventually, we create a spatiotemporal causal network and a framework for modeling dynamic dengue fever risk.

  20. A Learning-Based Approach for IP Geolocation

    NASA Astrophysics Data System (ADS)

    Eriksson, Brian; Barford, Paul; Sommers, Joel; Nowak, Robert

    The ability to pinpoint the geographic location of IP hosts is compelling for applications such as on-line advertising and network attack diagnosis. While prior methods can accurately identify the location of hosts in some regions of the Internet, they produce erroneous results when the delay or topology measurement on which they are based is limited. The hypothesis of our work is that the accuracy of IP geolocation can be improved through the creation of a flexible analytic framework that accommodates different types of geolocation information. In this paper, we describe a new framework for IP geolocation that reduces it to a machine-learning classification problem. Our methodology considers a set of lightweight measurements from a set of known monitors to a target, and then classifies the location of that target based on the most probable geographic region given probability densities learned from a training set. For this study, we employ a Naive Bayes framework that has low computational complexity and enables additional environmental information to be easily added to enhance the classification process. To demonstrate the feasibility and accuracy of our approach, we test IP geolocation on over 16,000 routers given ping measurements from 78 monitors with known geographic placement. Our results show that the simple application of our method improves geolocation accuracy for over 96% of the nodes identified in our data set, with accuracy on average 70 miles closer to the true geographic location than prior constraint-based geolocation. These results highlight the promise of our method and indicate how future expansion of the classifier can lead to further improvements in geolocation accuracy.
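
    The classification framing above reduces geolocation to a standard supervised-learning task: latency vectors from known monitors in, most probable region out. A minimal sketch with scikit-learn's Gaussian Naive Bayes, using made-up latencies and a two-region grid far coarser than the paper's:

      import numpy as np
      from sklearn.naive_bayes import GaussianNB

      rng = np.random.default_rng(0)
      # Ping latencies (ms) from 3 monitors to training hosts in 2 regions.
      west = rng.normal([20, 60, 90], 5, size=(50, 3))
      east = rng.normal([90, 60, 20], 5, size=(50, 3))
      X = np.vstack([west, east])
      y = np.array(["west"] * 50 + ["east"] * 50)

      model = GaussianNB().fit(X, y)
      print(model.predict([[25, 55, 85]]))  # ['west']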

  1. Retaining Ring Fastener for Solar Panels

    NASA Technical Reports Server (NTRS)

    Wilson, A. H.

    1983-01-01

    Simple articulating linkage secures solar panels into supporting framework. Five element linkage collapses into W-shape for easy placement into framework, then expands to form rectangle of same dimensions as those of panel.

  2. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  3. Framework for Deploying a Virtualized Computing Environment for Collaborative and Secure Data Analytics

    PubMed Central

    Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie

    2016-01-01

    Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build and deploy a virtual desktop infrastructure (VDI) to enable innovation and collaboration and to operate within academic funding structures. It outlines 6 core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP does not succeed through technical excellence and performance alone. Its adoption depends on a collaborative environment where researchers and users plan and evaluate the requirements of all aspects. PMID:27683665

  4. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD [The Energy-Energy Correlation at Next-to-Leading Order in QCD, Analytically]

    DOE PAGES

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav; ...

    2018-03-09

    Here, the energy-energy correlation (EEC) between two detectors in e+e- annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.

  5. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD [The Energy-Energy Correlation at Next-to-Leading Order in QCD, Analytically]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav

    Here, the energy-energy correlation (EEC) between two detectors in e+e- annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.

  6. Metrics for comparing neuronal tree shapes based on persistent homology.

    PubMed

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A; Mitra, Partha; Wang, Yusu

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework.
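
    Only the final vectorization step is sketched below: given a persistence diagram (birth, death pairs of a descriptor function on the neuron tree), produce a fixed-length vector so neurons can be compared as points in Euclidean space. The diagrams are made up, computing them in practice requires a persistence library, and this simple lifetime-based signature is an assumption, not the authors' exact construction.

      import numpy as np

      def persistence_signature(diagram, length=5):
          """Sort lifetimes (death - birth) in descending order and
          pad/trim to a fixed length (an assumed, simple signature)."""
          lifetimes = np.sort([d - b for b, d in diagram])[::-1]
          out = np.zeros(length)
          out[: min(length, lifetimes.size)] = lifetimes[:length]
          return out

      neuron_a = [(0.0, 4.0), (1.0, 2.5), (2.0, 2.2)]  # illustrative
      neuron_b = [(0.0, 3.8), (0.5, 2.0)]
      va = persistence_signature(neuron_a)
      vb = persistence_signature(neuron_b)
      print(np.linalg.norm(va - vb))  # neurons compared as points in R^5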

  7. Metrics for comparing neuronal tree shapes based on persistent homology

    PubMed Central

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A.; Mitra, Partha

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework. PMID:28809960

  8. Deriving Appropriate Educational Program Costs in Illinois.

    ERIC Educational Resources Information Center

    Parrish, Thomas B.; Chambers, Jay G.

    This document describes the comprehensive analytical framework for school finance used by the Illinois State Board of Education to assist policymakers in their decisions about equitable distribution of state aid and appropriate levels of resources to meet the varying educational requirements of differing student populations. This framework, the…

  9. Analyzing Agricultural Technology Systems: A Research Report.

    ERIC Educational Resources Information Center

    Swanson, Burton E.

    The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…

  10. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy relevant outcomes of interest, specifically for our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.

  11. An overview of ethical frameworks in public health: can they be supportive in the evaluation of programs to prevent overweight?

    PubMed Central

    2010-01-01

    Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate to what extent they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in Pubmed. Thirdly, we checked literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761

  12. A simple formula for the effective complex conductivity of periodic fibrous composites with interfacial impedance and applications to biological tissues

    NASA Astrophysics Data System (ADS)

    Bisegna, Paolo; Caselli, Federica

    2008-06-01

    This paper presents a simple analytical expression for the effective complex conductivity of a periodic hexagonal arrangement of conductive circular cylinders embedded in a conductive matrix, with interfaces exhibiting a capacitive impedance. This composite material may be regarded as an idealized model of a biological tissue comprising tubular cells, such as skeletal muscle. The asymptotic homogenization method is adopted, and the corresponding local problem is solved by resorting to Weierstrass elliptic functions. The effectiveness of the present analytical result is proved by convergence analysis and comparison with finite-element solutions and existing models.

  13. From Complex to Simple: Interdisciplinary Stochastic Models

    ERIC Educational Resources Information Center

    Mazilu, D. A.; Zamora, G.; Mazilu, I.

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions…

  14. HIPAA Compliant Wireless Sensing Smartwatch Application for the Self-Management of Pediatric Asthma

    PubMed Central

    Hosseini, Anahita; Buonocore, Chris M.; Hashemzadeh, Sepideh; Hojaiji, Hannaneh; Kalantarian, Haik; Sideris, Costas; Bui, Alex A.T.; King, Christine E.; Sarrafzadeh, Majid

    2018-01-01

    Asthma is the most prevalent chronic disease among pediatric patients, as it is the leading cause of student absenteeism and hospitalization for those under the age of 15. To address the significant need to manage this disease in children, the authors present a mobile health (mHealth) system that determines the risk of an asthma attack through physiological and environmental wireless sensors and representational state transfer application program interfaces (RESTful APIs). The data is sent from wireless sensors to a smartwatch application (app) via a Health Insurance Portability and Accountability Act (HIPAA) compliant cryptography framework, which then sends data to a cloud for real-time analytics. The asthma risk is then sent to the smartwatch and provided to the user via simple graphics for easy interpretation by children. After testing the safety and feasibility of the system in an adult with moderate asthma prior to testing in children, it was found that the analytics model is able to determine the overall asthma risk (high, medium, or low risk) with an accuracy of 80.10±14.13%. Furthermore, the features most important for assessing the risk of an asthma attack were multifaceted, highlighting the importance of continuously monitoring different wireless sensors and RESTful APIs. Future testing of this asthma attack risk prediction system in pediatric asthma patients may lead to an effective self-management asthma program. PMID:29354688

  15. Magnetic porous carbon derived from a metal-organic framework as a magnetic solid-phase extraction adsorbent for the extraction of sex hormones from water and human urine.

    PubMed

    Ma, Ruiyang; Hao, Lin; Wang, Junmin; Wang, Chun; Wu, Qiuhua; Wang, Zhi

    2016-09-01

    An iron-embedded porous carbon material (MIL-53-C) was fabricated by the direct carbonization of MIL-53. The MIL-53-C possesses a high surface area and good magnetic behavior. The structure, morphology, magnetic property, and porosity of the MIL-53-C were studied by scanning electron microscopy, transmission electron microscopy, vibrating sample magnetometry, and N2 adsorption. With the use of MIL-53-C as the magnetic solid-phase extraction adsorbent, a simple and efficient method was developed for the magnetic solid-phase extraction of three hormones from water and human urine samples before high-performance liquid chromatography with UV detection. The developed method exhibits a good linear response in the range of 0.02-100 ng/mL for water and 0.5-100 ng/mL for human urine samples, respectively. The limit of detection (S/N = 3) for the analytes was 0.005-0.01 ng/mL for water samples and 0.1-0.3 ng/mL for human urine samples. The limit of quantification (S/N = 10) for the analytes was in the range of 0.015-0.030 and 0.3-0.9 ng/mL, respectively. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Linear and Order Statistics Combiners for Pattern Classification

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep; Lau, Sonie (Technical Monitor)

    2001-01-01

    Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This chapter provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and order statistics combiners. We first show that, to a first-order approximation, the error rate obtained over and above the Bayes error rate is directly proportional to the variance of the actual decision boundaries around the Bayes optimum boundary. Combining classifiers in output space reduces this variance, and hence reduces the 'added' error. If N unbiased classifiers are combined by simple averaging, the added error rate can be reduced by a factor of N if the individual errors in approximating the decision boundaries are uncorrelated. Expressions are then derived for linear combiners which are biased or correlated, and the effect of output correlations on ensemble performance is quantified. For order statistics based non-linear combiners, we derive expressions that indicate how much the median, the maximum and, in general, the i-th order statistic can improve classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions, and combining in output space. Experimental results on several public domain data sets are provided to illustrate the benefits of combining and to support the analytical results.
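
    The factor-of-N claim for unbiased, uncorrelated classifiers is easy to check numerically. This sketch treats each classifier's decision boundary as an unbiased noisy estimate and compares the variance of a single estimate with that of an N-way average; the numbers are synthetic.

      import numpy as np

      rng = np.random.default_rng(42)
      N, trials = 10, 100_000

      # Boundary estimates around the Bayes optimum (taken as 0.0).
      single = rng.normal(0.0, 1.0, size=trials)
      ensemble = rng.normal(0.0, 1.0, size=(trials, N)).mean(axis=1)

      print(single.var())    # ~1.0
      print(ensemble.var())  # ~1/N = 0.1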

  17. The effects of nonuniform magnetic field strength on density flux and test particle transport in drift wave turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dewhurst, J. M.; Hnat, B.; Dendy, R. O.

    2009-07-15

    The extended Hasegawa-Wakatani equations generate fully nonlinear self-consistent solutions for coupled density $n$ and vorticity $\nabla^2\phi$, where $\phi$ is electrostatic potential, in a plasma with background density inhomogeneity $\kappa = -\partial \ln n_0 / \partial x$ and magnetic field strength inhomogeneity $C = -\partial \ln B / \partial x$. Finite $C$ introduces interchange effects and $\nabla B$ drifts into the framework of drift turbulence through compressibility of the $E \times B$ and diamagnetic drifts. This paper addresses the direct computation of the radial $E \times B$ density flux $\gamma_n = -n \, \partial\phi/\partial y$, tracer particle transport, the statistical properties of the turbulent fluctuations that drive $\gamma_n$ and tracer motion, and analytical underpinnings. Systematic trends emerge in the dependence on $C$ of the skewness of the distribution of pointwise $\gamma_n$ and in the relative phase of density-velocity and density-potential pairings. It is shown how these effects, together with conservation of potential vorticity $\pi = \nabla^2\phi - n + (\kappa - C)x$, account for much of the transport phenomenology. Simple analytical arguments yield a Fickian relation $\gamma_n = (\kappa - C) D_x$ between the radial density flux $\gamma_n$ and the radial tracer diffusivity $D_x$, which is shown to explain key trends in the simulations.

  18. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE PAGES

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei; ...

    2017-03-02

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.

  19. Mixed mechanisms of multi-site phosphorylation

    PubMed Central

    Suwanmajo, Thapanar; Krishnan, J.

    2015-01-01

    Multi-site phosphorylation is ubiquitous in cell biology and has been widely studied experimentally and theoretically. The underlying chemical modification mechanisms are typically assumed to be distributive or processive. In this paper, we study the behaviour of mixed mechanisms that can arise either because phosphorylation and dephosphorylation involve different mechanisms or because phosphorylation and/or dephosphorylation can occur through a combination of mechanisms. We examine a hierarchy of models to assess chemical information processing through different mixed mechanisms, using simulations, bifurcation analysis and analytical work. We demonstrate how mixed mechanisms can show important and unintuitive differences from pure distributive and processive mechanisms, in some cases resulting in monostable behaviour with simple dose-response behaviour, while in other cases generating new behaviour such as oscillations. Our results also suggest patterns of information processing that are relevant as the number of modification sites increases. Overall, our work creates a framework to examine information processing arising from the complexities of multi-site modification mechanisms and their impact on signal transduction. PMID:25972433
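
    A minimal member of such a hierarchy can be written as three mass-action ODEs: a two-site substrate phosphorylated distributively (S0 -> S1 -> S2) but dephosphorylated processively (S2 -> S0 in one encounter), giving one simple mixed mechanism. The rates are made up and enzyme sequestration is ignored for brevity; this is a sketch of the model class, not the authors' exact equations.

      from scipy.integrate import solve_ivp

      def rhs(t, s, k1=1.0, k2=0.8, kp=0.5):
          s0, s1, s2 = s
          return [-k1 * s0 + kp * s2,   # S0: loses to S1, regained from S2
                  k1 * s0 - k2 * s1,    # S1: intermediate phospho-form
                  k2 * s1 - kp * s2]    # S2: fully phosphorylated

      sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.0, 0.0])
      print(sol.y[:, -1])  # steady-state mix of the three phospho-forms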

  20. Mechanism synthesis and 2-D control designs of an active three cable crane

    NASA Technical Reports Server (NTRS)

    Yang, Li-Farn; Mikulas, Martin M., Jr.

    1992-01-01

    A Lunar Crane with a suspension system based on a three cable mechanism is investigated to provide a stable end-effector for hoisting, positioning, and assembling large components during construction and servicing of a Lunar Base. The three cable suspension mechanism consists of a structural framework of three cables pointing to a common point that closely coincides with the suspended payload's center of gravity. The vibrational characteristics of this three cable suspension system are investigated by comparing a simple 2-D symmetric suspension model and a swinging pendulum in terms of their analytical natural frequency equations. A study is also made of actively controlling the dynamics of the crane using two different actuator concepts. Lyapunov-based control algorithms are developed to obtain two regulator-type control laws that provide vibration suppression for both sets of system dynamics. Simulations including initial-value dynamic responses as well as control performance for the two different system dynamics are also presented.

  1. Interpretation of Ground Temperature Anomalies in Hydrothermal Discharge Areas

    NASA Astrophysics Data System (ADS)

    Price, A. N.; Lindsey, C.; Fairley, J. P., Jr.

    2017-12-01

    Researchers have long noted the potential for shallow hydrothermal fluids to perturb near-surface temperatures. Several investigators have made qualitative or semi-quantitative use of elevated surface temperatures; for example, in snowfall calorimetry, or for tracing subsurface flow paths. However, little effort has been expended to develop a quantitative framework connecting surface temperature observations with conditions in the subsurface. Here, we examine an area of shallow subsurface flow at Burgdorf Hot Springs, in the Payette National Forest, north of McCall, Idaho USA. We present a simple analytical model that uses easily-measured surface data to infer the temperatures of laterally-migrating shallow hydrothermal fluids. The model is calibrated using shallow ground temperature measurements and overburden thickness estimates from seismic refraction studies. The model predicts conditions in the shallow subsurface, and suggests that the Biot number may place a more important control on the expression of near-surface thermal perturbations than previously thought. In addition, our model may have application in inferring difficult-to-measure parameters, such as shallow subsurface discharge from hydrothermal springs.

  2. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.

  3. On the spin-axis dynamics of a Moonless Earth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Gongjie; Batygin, Konstantin, E-mail: gli@cfa.harvard.edu

    2014-07-20

    The variation of a planet's obliquity is influenced by the existence of satellites with a high mass ratio. For instance, Earth's obliquity is stabilized by the Moon and would undergo chaotic variations in the Moon's absence. In turn, such variations can lead to large-scale changes in the atmospheric circulation, rendering spin-axis dynamics a central issue for understanding climate. The relevant quantity for dynamically forced climate change is the rate of chaotic diffusion. Accordingly, here we re-examine the spin-axis evolution of a Moonless Earth within the context of a simplified perturbative framework. We present analytical estimates of the characteristic Lyapunov coefficient as well as the chaotic diffusion rate and demonstrate that even in the absence of the Moon, the stochastic change in Earth's obliquity is sufficiently slow to not preclude long-term habitability. Our calculations are consistent with published numerical experiments and illustrate the putative system's underlying dynamical structure in a simple and intuitive manner.

  4. Theoretical study of mixing in liquid clouds – Part 1: Classical concepts

    DOE PAGES

    Korolev, Alexei; Khain, Alex; Pinsky, Mark; ...

    2016-07-28

    The present study considers the final stages of in-cloud mixing in the framework of the classical concepts of homogeneous and extreme inhomogeneous mixing. Simple analytical relationships between basic microphysical parameters were obtained for homogeneous and extreme inhomogeneous mixing based on adiabatic considerations. It was demonstrated that during homogeneous mixing the functional relationships between the moments of the droplet size distribution hold only during the primary stage of mixing. Subsequent random mixing between already mixed parcels and undiluted cloud parcels breaks these relationships. However, during extreme inhomogeneous mixing the functional relationships between the microphysical parameters hold both for primary and subsequent mixing. The obtained relationships can be used to identify the type of mixing from in situ observations. The effectiveness of the developed method was demonstrated using in situ data collected in convective clouds. It was found that for the specific set of in situ measurements the interaction between cloudy and entrained environments was dominated by extreme inhomogeneous mixing.

  5. Optimization of single-base-pair mismatch discrimination in oligonucleotide microarrays

    NASA Technical Reports Server (NTRS)

    Urakawa, Hidetoshi; El Fantroussi, Said; Smidt, Hauke; Smoot, James C.; Tribou, Erik H.; Kelly, John J.; Noble, Peter A.; Stahl, David A.

    2003-01-01

    The discrimination between perfect-match and single-base-pair-mismatched nucleic acid duplexes was investigated by using oligonucleotide DNA microarrays and nonequilibrium dissociation rates (melting profiles). DNA and RNA versions of two synthetic targets corresponding to the 16S rRNA sequences of Staphylococcus epidermidis (38 nucleotides) and Nitrosomonas eutropha (39 nucleotides) were hybridized to perfect-match probes (18-mer and 19-mer) and to a set of probes having all possible single-base-pair mismatches. The melting profiles of all probe-target duplexes were determined in parallel by using an imposed temperature step gradient. We derived an optimum wash temperature for each probe and target by using a simple formula to calculate a discrimination index for each temperature of the step gradient. This optimum corresponded to the output of an independent analysis using a customized neural network program. These results together provide an experimental and analytical framework for optimizing mismatch discrimination among all probes on a DNA microarray.
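
    The abstract does not give the discrimination-index formula, so the following sketch simply assumes it contrasts the perfect-match (PM) signal retained against the mismatch (MM) signal retained at each wash-temperature step; the retained fractions below are illustrative, not measured data.

      def discrimination_index(pm_signal, mm_signal):
          # Assumed form: raw separation between PM and MM retention.
          return pm_signal - mm_signal

      temps = [40, 45, 50, 55, 60]          # wash temperatures (C)
      pm = [0.95, 0.90, 0.80, 0.55, 0.20]   # retained PM fraction
      mm = [0.90, 0.70, 0.35, 0.10, 0.02]   # retained MM fraction

      scores = [discrimination_index(p, m) for p, m in zip(pm, mm)]
      best = temps[max(range(len(temps)), key=scores.__getitem__)]
      print(best)  # wash temperature maximizing the assumed index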

  6. Installed Base as a Facilitator for User-Driven Innovation: How Can User Innovation Challenge Existing Institutional Barriers?

    PubMed Central

    Andersen, Synnøve Thomassen; Jansen, Arild

    2012-01-01

    The paper addresses an ICT-based, user-driven innovation process in the health sector in rural areas in Norway. The empirical base is the introduction of a new model for psychiatric health provision. This model is supported by a technical solution based on mobile phones, aimed at supporting communication between professional health personnel and patients. This innovation was made possible through the use of standard mobile technology rather than more sophisticated systems. The users were heavily involved in the development work. Our analysis shows that the implementation process was made possible by choosing simple, small-scale solutions and by taking the users' needs and premises, rather than advanced technology, as the point of departure. We show that by combining theory on information infrastructures, user-oriented system development, and innovation in a three-layered analytical framework, we can explain the interrelationship between technical, organizational, and health professional factors that made this innovation a success. PMID:23304134

  7. Self-similar Theory of Wind-driven Sea

    NASA Astrophysics Data System (ADS)

    Zakharov, V. E.

    2015-12-01

    More than two dozen field experiments performed in the ocean and on lakes show that the fetch-limited growth of dimensionless energy and dimensionless peak frequency is described by power-law functions of the dimensionless fetch. Moreover, the exponents of these two functions are connected, to good accuracy, by the standard "magic relation" 10q - 2p = 1. Recent massive numerical experiments, as well as experiments in wave tanks, also confirm this magic relation. All these experimental facts can be interpreted in the framework of the following simple theory. The wind-driven sea is described by the "conservative" Hasselmann kinetic equation. The source terms, wind input and white-capping dissipation, play a secondary role in comparison with the nonlinear term Snl that is responsible for the four-wave resonant interaction. This equation has a four-parameter family of self-similar solutions. The magic relation holds for all members of this family. This fact gives strong hope that the development of a self-consistent analytic theory of the wind-driven sea is a realizable task.

  8. Self-consistent approach for neutral community models with speciation

    NASA Astrophysics Data System (ADS)

    Haegeman, Bart; Etienne, Rampal S.

    2010-03-01

    Hubbell's neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event at the birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical species-abundance distributions surprisingly well. More realistic speciation models have been proposed, such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.

  9. Short-ranged memory model with preferential growth

    NASA Astrophysics Data System (ADS)

    Schaigorodsky, Ana L.; Perotti, Juan I.; Almeira, Nahuel; Billoni, Orlando V.

    2018-02-01

    In this work we introduce a variant of the Yule-Simon model for preferential growth by incorporating a finite kernel to model the effects of bounded memory. We characterize the properties of the model combining analytical arguments with extensive numerical simulations. In particular, we analyze the lifetime and popularity distributions by mapping the model dynamics to corresponding Markov chains and branching processes, respectively. These distributions follow power laws with well-defined exponents that are within the range of the empirical data reported in ecologies. Interestingly, by varying the innovation rate, this simple out-of-equilibrium model exhibits many of the characteristics of a continuous phase transition and, around the critical point, it generates time series with power-law popularity, lifetime and interevent time distributions, and nontrivial temporal correlations, such as a bursty dynamics in analogy with the activity of solar flares. Our results suggest that an appropriate balance between innovation and oblivion rates could provide an explanatory framework for many of the properties commonly observed in many complex systems.

  10. Short-ranged memory model with preferential growth.

    PubMed

    Schaigorodsky, Ana L; Perotti, Juan I; Almeira, Nahuel; Billoni, Orlando V

    2018-02-01

    In this work we introduce a variant of the Yule-Simon model for preferential growth by incorporating a finite kernel to model the effects of bounded memory. We characterize the properties of the model combining analytical arguments with extensive numerical simulations. In particular, we analyze the lifetime and popularity distributions by mapping the model dynamics to corresponding Markov chains and branching processes, respectively. These distributions follow power laws with well-defined exponents that are within the range of the empirical data reported in ecologies. Interestingly, by varying the innovation rate, this simple out-of-equilibrium model exhibits many of the characteristics of a continuous phase transition and, around the critical point, it generates time series with power-law popularity, lifetime and interevent time distributions, and nontrivial temporal correlations, such as a bursty dynamics in analogy with the activity of solar flares. Our results suggest that an appropriate balance between innovation and oblivion rates could provide an explanatory framework for many of the properties commonly observed in many complex systems.
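
    A minimal simulation sketch of a Yule-Simon-type process with a bounded memory window, in the spirit of the model described above; the parameter names, window mechanism, and values are illustrative assumptions, not the authors' specification.

```python
import random
from collections import Counter

def bounded_memory_yule_simon(steps=100_000, alpha=0.05, memory=500, seed=1):
    """Yule-Simon-like growth with a finite memory kernel (sketch).

    With probability alpha an innovation (new item) is introduced;
    otherwise an item is copied uniformly from the last `memory`
    entries, so older items are gradually forgotten.
    """
    rng = random.Random(seed)
    sequence = [0]   # start with a single item
    next_id = 1
    for _ in range(steps):
        if rng.random() < alpha:
            sequence.append(next_id)          # innovation: brand-new item
            next_id += 1
        else:
            window = sequence[-memory:]       # preferential copy, bounded memory
            sequence.append(rng.choice(window))
    return sequence

popularity = Counter(bounded_memory_yule_simon())
# For suitable alpha and memory the values of `popularity` follow an
# approximate power law, which is the regime the record analyzes.
```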

  11. Diffuse interface modeling of three-phase contact line dynamics on curved boundaries: A lattice Boltzmann model for large density and viscosity ratios

    NASA Astrophysics Data System (ADS)

    Fakhari, Abbas; Bolster, Diogo

    2017-04-01

    We introduce a simple and efficient lattice Boltzmann method for immiscible multiphase flows, capable of handling large density and viscosity contrasts. The model is based on a diffuse-interface phase-field approach. Within this context we propose a new algorithm for specifying the three-phase contact angle on curved boundaries within the framework of structured Cartesian grids. The proposed method has superior computational accuracy compared with the common approach of approximating curved boundaries with staircases. We test the model by applying it to four benchmark problems: (i) wetting and dewetting of a droplet on a flat surface and (ii) on a cylindrical surface, (iii) multiphase flow past a circular cylinder at an intermediate Reynolds number, and (iv) a droplet falling on hydrophilic and superhydrophobic circular cylinders under differing conditions. Where available, our results show good agreement with analytical solutions and/or existing experimental data, highlighting the strengths of this new approach.

  12. Cracking the chocolate egg problem: polymeric films coated on curved substrates

    NASA Astrophysics Data System (ADS)

    Brun, Pierre-Thomas; Lee, Anna; Marthelot, Joel; Balestra, Gioele; Gallaire, François; Reis, Pedro

    2015-11-01

    Inspired by the traditional chocolate egg recipe, we show that pouring a polymeric solution onto spherical molds yields a simple and robust path for fabricating thin elastic curved shells. The drainage dynamics naturally leads to uniform coatings frozen in time as the polymer cures, which are subsequently peeled off their mold. We show how the polymer curing affects the drainage dynamics and eventually selects the shell thickness and sets its uniformity. To this end, we perform coating experiments using silicone-based elastomers, vinylpolysiloxane (VPS) and polydimethylsiloxane (PDMS). These results are rationalized by combining numerical simulations of the lubrication flow field with a theoretical model of the dynamics, yielding an analytical prediction of the characteristics of the formed shell. In particular, the robustness of the coating technique and its flexibility, two critical features for providing a generic framework for future studies, are shown to be an inherent consequence of the flow field (memory loss). The shell structure is both independent of initial conditions and tailorable by changing a single experimental parameter.

  13. A simple analytical thermo-mechanical model for liquid crystal elastomer bilayer structures

    NASA Astrophysics Data System (ADS)

    Cui, Yun; Wang, Chengjun; Sim, Kyoseung; Chen, Jin; Li, Yuhang; Xing, Yufeng; Yu, Cunjiang; Song, Jizhou

    2018-02-01

    The bilayer structure consisting of thermally responsive liquid crystal elastomers (LCEs) and other polymer materials with stretchable heaters has attracted much attention in applications of soft actuators and soft robots due to its ability to generate large deformations when subjected to heat stimuli. A simple analytical thermo-mechanical model, accounting for the non-uniform temperature/strain distribution along the thickness direction, is established for this type of bilayer structure. The analytical predictions of the temperature and bending curvature radius agree well with finite element analysis and experiments. The influences of the LCE thickness and the heat generation power on the bending deformation of the bilayer structure are fully investigated. It is shown that a thinner LCE layer and a higher heat generation power yield larger bending deformation. These results may help the design of soft actuators and soft robots involving thermally responsive LCEs.

  14. Using Presentation Software to Flip an Undergraduate Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Fitzgerald, Neil; Li, Luisa

    2015-01-01

    An undergraduate analytical chemistry course has been adapted to a flipped course format. Course content was provided by video clips, text, graphics, audio, and simple animations organized as concept maps using the cloud-based presentation platform, Prezi. The advantages of using Prezi to present course content in a flipped course format are…

  15. Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.

    ERIC Educational Resources Information Center

    Gostowski, Rudy

    A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…

  16. Operational Environmental Assessment

    DTIC Science & Technology

    1988-09-01

    Chemistry Branch - Physical Chemistry Branch - Analytical Research Division - Analytical Systems Branch - Methodology Research Branch - Spectroscopy Branch... electromagnetic frequency spectrum and includes radio frequencies, infrared, visible light, ultraviolet, X-rays and gamma rays (in ascending order of...)... Verruculogen, Aflatrem, Picrotoxin, Ciguatoxin. Mycotoxins, Simple Trichothecenes: T-2 Toxin, T-2 Tetraol, Neosolaniol, Nivalenol, Deoxynivalenol, Verrucarol

  17. Numerical Simulation of the Perrin-Like Experiments

    ERIC Educational Resources Information Center

    Mazur, Zygmunt; Grech, Dariusz

    2008-01-01

    A simple model of the random Brownian walk of a spherical mesoscopic particle in viscous liquids is proposed. The model can be solved analytically and simulated numerically. The analytic solution gives the known Einstein-Smoluchowski diffusion law r² = 2Dt, where the diffusion constant D is expressed by the mass and geometry of a…
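
    A minimal numerical sketch of the kind of simulation the record describes (all parameter values are illustrative): generate one-dimensional Gaussian random walks and check that the mean squared displacement grows as r² = 2Dt.

```python
import numpy as np

# One-dimensional Brownian walk (sketch; illustrative parameter values).
D = 1.0e-12          # diffusion constant, m^2/s (assumed)
dt = 1.0e-3          # time step, s
n_steps, n_particles = 10_000, 5_000
rng = np.random.default_rng(0)

# Each step is Gaussian with variance 2*D*dt.
steps = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=(n_particles, n_steps))
x = np.cumsum(steps, axis=1)            # particle trajectories
msd = (x ** 2).mean(axis=0)             # mean squared displacement
t = dt * np.arange(1, n_steps + 1)

# msd/t should approach 2*D, the Einstein-Smoluchowski law quoted above.
print(msd[-1] / t[-1], "vs expected", 2.0 * D)
```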

  18. Quantitative Ultrasound-Assisted Extraction for Trace-Metal Determination: An Experiment for Analytical Chemistry

    ERIC Educational Resources Information Center

    Lavilla, Isela; Costas, Marta; Pena-Pereira, Francisco; Gil, Sandra; Bendicho, Carlos

    2011-01-01

    Ultrasound-assisted extraction (UAE) is introduced to upper-level analytical chemistry students as a simple strategy focused on sample preparation for trace-metal determination in biological tissues. Nickel extraction in seafood samples and quantification by electrothermal atomic absorption spectrometry (ETAAS) are carried out by a team of four…

  19. A simple analytical model of coupled single flow channel over porous electrode in vanadium redox flow battery with serpentine flow channel

    NASA Astrophysics Data System (ADS)

    Ke, Xinyou; Alexander, J. Iwan D.; Prahl, Joseph M.; Savinell, Robert F.

    2015-08-01

    A simple analytical model of a layered system comprised of a single passage of a serpentine flow channel and a parallel underlying porous electrode (or porous layer) is proposed. The analytical model is derived from Navier-Stokes motion in the flow channel and the Darcy-Brinkman model in the porous layer. Continuity of flow velocity and normal stress is applied at the interface between the flow channel and the porous layer. The effects of the inlet volumetric flow rate, the thickness of the flow channel, and the thickness of a typical carbon fiber paper porous layer on the volumetric flow rate within this porous layer are studied. The maximum current density based on the electrolyte volumetric flow rate is predicted and found to be consistent with reported numerical simulations. It is found that, for a mean inlet flow velocity of 33.3 cm s-1, the analytical maximum current density is estimated to be 377 mA cm-2, which compares favorably with the experimental result of ∼400 mA cm-2 reported by others.
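
    Schematically, the coupled problem pairs pressure-driven channel flow with a Darcy-Brinkman porous layer; a fully developed, one-dimensional form written in notation assumed here (u streamwise velocity, μ viscosity, μ_e effective viscosity, K permeability, dp/dx the shared pressure gradient):

```latex
\text{channel: } \mu \frac{d^{2}u}{dy^{2}} = \frac{dp}{dx},
\qquad
\text{porous layer: } \mu_{e} \frac{d^{2}u}{dy^{2}} - \frac{\mu}{K}\,u = \frac{dp}{dx},
```

    with continuity of velocity and normal stress imposed at the channel/porous-layer interface, as the record states.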

  20. Boundary condition determined wave functions for the ground states of one- and two-electron homonuclear molecules

    NASA Astrophysics Data System (ADS)

    Patil, S. H.; Tang, K. T.; Toennies, J. P.

    1999-10-01

    Simple analytical wave functions satisfying appropriate boundary conditions are constructed for the ground states of one- and two-electron homonuclear molecules. Both the asymptotic condition when one electron is far away and the cusp condition when the electron coalesces with a nucleus are satisfied by the proposed wave function. For H2+, the resulting wave function is almost identical to the Guillemin-Zener wave function, which is known to give very good energies. For the two-electron systems H2 and He2++, the additional electron-electron cusp condition is rigorously accounted for by a simple analytic correlation function which has the correct behavior not only for r12→0 and r12→∞ but also for R→0 and R→∞, where r12 is the interelectronic distance and R the internuclear distance. Energies obtained from these simple wave functions agree within 2×10⁻³ a.u. with the results of the most sophisticated variational calculations for all R and for all systems studied. This demonstrates that rather simple physical considerations can be used to derive very accurate wave functions for simple molecules, thereby avoiding laborious numerical variational calculations.

  1. Molecule database framework: a framework for creating database applications with chemical structure search capability

    PubMed Central

    2013-01-01

    Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762

  2. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    PubMed

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.

  3. All-organic microelectromechanical systems integrating specific molecular recognition--a new generation of chemical sensors.

    PubMed

    Ayela, Cédric; Dubourg, Georges; Pellet, Claude; Haupt, Karsten

    2014-09-03

    Cantilever-type all-organic microelectromechanical systems based on molecularly imprinted polymers for specific analyte recognition are used as chemical sensors. They are produced by a simple spray-coating-shadow-masking process. Analyte binding to the cantilever generates a measurable change in its resonance frequency. This allows label-free detection by direct mass sensing of low-molecular-weight analytes at nanomolar concentrations. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption-kinetics rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling microliter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.
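
    A minimal sketch of the kind of well-mixed binding model the record alludes to; the rate constants and concentrations below are illustrative assumptions, not the authors' values. In a well-mixed reactor the bound complex C follows dC/dt = k_on·A·(B_tot − C) − k_off·C, free of the diffusion limitation of standard immunoassays.

```python
# Well-mixed bimolecular binding kinetics (sketch; illustrative parameters).
k_on = 1.0e5       # association rate constant, 1/(M*s)  (assumed)
k_off = 1.0e-3     # dissociation rate constant, 1/s     (assumed)
A = 1.0e-9         # analyte concentration, M (held fixed by continuous sampling)
B_tot = 1.0e-10    # total microsphere binding sites, M  (assumed)

dt, t_end = 0.1, 600.0               # integrate over 10 minutes
C = 0.0                              # bound complex concentration
for _ in range(int(t_end / dt)):     # forward Euler suffices for a sketch
    C += dt * (k_on * A * (B_tot - C) - k_off * C)

print(f"fractional occupancy after {t_end:.0f} s: {C / B_tot:.3f}")
```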

  5. A visual analytic framework for data fusion in investigative intelligence

    NASA Astrophysics Data System (ADS)

    Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David

    2014-05-01

    Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework for developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.

  6. The path dependency theory: analytical framework to study institutional integration. The case of France.

    PubMed

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-06-30

    The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.

  7. Principles of Catholic Social Teaching, Critical Pedagogy, and the Theory of Intersectionality: An Integrated Framework to Examine the Roles of Social Status in the Formation of Catholic Teachers

    ERIC Educational Resources Information Center

    Eick, Caroline Marie; Ryan, Patrick A.

    2014-01-01

    This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…

  8. Competency Analytics Tool: Analyzing Curriculum Using Course Competencies

    ERIC Educational Resources Information Center

    Gottipati, Swapna; Shankararaman, Venky

    2018-01-01

    The applications of learning outcomes and competency frameworks have brought better clarity to engineering programs in many universities. Several frameworks have been proposed to integrate outcomes and competencies into course design, delivery and assessment. However, in many cases, competencies are course-specific and their overall impact on the…

  9. A Methodological Framework to Analyze Stakeholder Preferences and Propose Strategic Pathways for a Sustainable University

    ERIC Educational Resources Information Center

    Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda

    2016-01-01

    Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…

  10. Collaborative Rhetorical Structure: A Discourse Analysis Method for Analyzing Student Collaborative Inquiry via Computer Conferencing

    ERIC Educational Resources Information Center

    Kou, Xiaojing

    2011-01-01

    Various formats of online discussion have proven valuable for enhancing learning and collaboration in distance and blended learning contexts. However, despite their capacity to reveal essential processes in collaborative inquiry, current mainstream analytical frameworks, such as the cognitive presence framework (Garrison, Anderson, & Archer,…

  11. A Cognitive Framework for the Analysis of Online Chemistry Courses

    ERIC Educational Resources Information Center

    Evans, Karen L.; Leinhardt, Gaea

    2008-01-01

    Many students now are receiving instruction in online environments created by universities, museums, corporations, and even students. What features of a given online course contribute to its effectiveness? This paper addresses that query by proposing and applying an analytic framework to five online introductory chemistry courses. Introductory…

  12. Optimizing the Long-Term Retention of Skills: Structural and Analytic Approaches to Skill Maintenance

    DTIC Science & Technology

    1990-08-01

    evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed

  13. Managing Offshore Branch Campuses: An Analytical Framework for Institutional Strategies

    ERIC Educational Resources Information Center

    Shams, Farshid; Huisman, Jeroen

    2012-01-01

    The aim of this article is to develop a framework that encapsulates the key managerial complexities of running offshore branch campuses. In the transnational higher education (TNHE) literature, several managerial ramifications and impediments have been addressed by scholars and practitioners. However, the strands of the literature are highly…

  14. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models.

    PubMed

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-05-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.

  15. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  16. Analytically based forward and inverse models of fluvial landscape evolution during temporally continuous climatic and tectonic variations

    NASA Astrophysics Data System (ADS)

    Goren, Liran; Petit, Carole

    2017-04-01

    Fluvial channels respond to changing tectonic and climatic conditions by adjusting their patterns of erosion and relief. It is therefore expected that by examining these patterns, we can infer the tectonic and climatic conditions that shaped the channels. However, the potential interference between climatic and tectonic signals complicates this inference. Within the framework of the stream power model that describes the incision rate of mountainous bedrock rivers, climate variability has two effects: it influences the erosive power of the river, causing local slope change, and it changes the fluvial response time that controls the rate at which tectonically and climatically induced slope breaks are communicated upstream. Because of this dual role, the fluvial response time during continuous climate change has so far been elusive, which hinders our understanding of environmental signal propagation and preservation in the fluvial topography. An analytic solution of the stream power model under general tectonic and climatic histories gives rise to a new definition of the fluvial response time. The analytic solution offers accurate predictions for landscape evolution that are hard to achieve with classical numerical schemes and thus can be used to validate and evaluate the accuracy of numerical landscape evolution models. The analytic solution, together with the new definition of the fluvial response time, allows inferring either the tectonic history or the climatic history from river long profiles by using simple linear inversion schemes. Analytic study of landscape evolution during periodic climate change reveals that climatic oscillations whose frequency is high (10-100 kyr) with respect to the response time, such as Milankovitch cycles, are not expected to leave significant fingerprints in the upstream reaches of fluvial channels. Linear inversion schemes are applied to the Tinee river tributaries in the southern French Alps, where tributary long profiles are used to recover the incision rate history of the Tinee main trunk. Inversion results show periodic, high incision rate pulses, which are correlated with interglacial episodes. Similar incision rate histories are recovered for the past 100 kyr when assuming either constant climatic conditions or periodic climatic oscillations, in agreement with theoretical predictions.
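
    For reference, the stream power model the record builds on is commonly written as follows (a standard textbook form, not quoted from the paper; E is incision rate, A drainage area, S channel slope, K erodibility, U uplift rate, z elevation):

```latex
E = K A^{m} S^{n},
\qquad
\frac{\partial z}{\partial t} = U(t) - K(t)\, A^{m} \left| \frac{\partial z}{\partial x} \right|^{n},
```

    where climate variability enters through K(t), which both perturbs local slopes and modulates how fast slope breaks migrate upstream, the dual role discussed above.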

  17. Development of an analytical solution for the Budyko watershed parameter in terms of catchment physical features

    NASA Astrophysics Data System (ADS)

    Reaver, N.; Kaplan, D. A.; Jawitz, J. W.

    2017-12-01

    The Budyko hypothesis states that a catchment's long-term water and energy balances are dependent on two relatively easy to measure quantities: rainfall depth and potential evaporation. This hypothesis is expressed as a simple function, the Budyko equation, which allows for the prediction of a catchment's actual evapotranspiration and discharge from measured rainfall depth and potential evaporation, data which are widely available. However, the two main analytically derived forms of the Budyko equation contain a single unknown watershed parameter, whose value varies across catchments; variation in this parameter has been used to explain the hydrological behavior of different catchments. The watershed parameter is generally thought of as a lumped quantity that represents the influence of all catchment biophysical features (e.g. soil type and depth, vegetation type, timing of rainfall, etc). Previous work has shown that the parameter is statistically correlated with catchment properties, but an explicit expression has been elusive. While the watershed parameter can be determined empirically by fitting the Budyko equation to measured data in gauged catchments where actual evapotranspiration can be estimated, this limits the utility of the framework for predicting impacts to catchment hydrology due to changing climate and land use. In this study, we developed an analytical solution for the lumped catchment parameter for both forms of the Budyko equation. We combined these solutions with a statistical soil moisture model to obtain analytical solutions for the Budyko equation parameter as a function of measurable catchment physical features, including rooting depth, soil porosity, and soil wilting point. We tested the predictive power of these solutions using the U.S. catchments in the MOPEX database. We also compared the Budyko equation parameter estimates generated from our analytical solutions (i.e. predicted parameters) with those obtained through the calibration of the Budyko equation to discharge data (i.e. empirical parameters), and found good agreement. These results suggest that it is possible to predict the Budyko equation watershed parameter directly from physical features, even for ungauged catchments.
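
    The two main analytically derived forms referred to are usually identified with the Fu and the Mezentsev-Choudhury-Yang equations; this attribution is an assumption here, since the record does not name them. With E actual evapotranspiration, P rainfall depth, E_p potential evaporation, and ω or n the watershed parameter:

```latex
\frac{E}{P} = 1 + \frac{E_{p}}{P} - \left[ 1 + \left( \frac{E_{p}}{P} \right)^{\omega} \right]^{1/\omega}
\quad \text{(Fu)},
\qquad
\frac{E}{P} = \frac{E_{p}/P}{\left[ 1 + \left( E_{p}/P \right)^{n} \right]^{1/n}}
\quad \text{(Mezentsev-Choudhury-Yang)}.
```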

  18. Automation effects in a multiloop manual control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1986-01-01

    An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation systems stabilized vehicle responses, from attitude to velocity to position, and also provided for display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time-optimal manual control behavior was established for these tasks, and the role which internal models may play in establishing human-machine performance was discussed.
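
    The crossover model invoked here is presumably McRuer's classic form (a standard statement, not quoted from the report): near the crossover frequency ω_c the combined open-loop transfer function of operator Y_p and controlled element Y_c behaves as

```latex
Y_{p}(j\omega)\, Y_{c}(j\omega) \;\approx\; \frac{\omega_{c}}{j\omega}\, e^{-j\omega \tau_{e}},
```

    with τ_e the operator's effective time delay.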

  19. Spectral properties of thermal fluctuations on simple liquid surfaces below shot-noise levels.

    PubMed

    Aoki, Kenichiro; Mitsui, Takahisa

    2012-07-01

    We study the spectral properties of thermal fluctuations on simple liquid surfaces, sometimes called ripplons. Analytical properties of the spectral function are investigated and are shown to be composed of regions with simple analytic behavior with respect to the frequency or the wave number. The derived expressions are compared to spectral measurements performed orders of magnitude below shot-noise levels, which is achieved using a novel noise reduction method. The agreement between the theory of thermal surface fluctuations and the experiment is found to be excellent, elucidating the spectral properties of the surface fluctuations. The measurement method requires only a relatively small sample, both spatially (a few μm) and temporally (~20 s). The method also requires relatively weak light power (~0.5 mW), so that it has a broad range of applicability, including local measurements, investigations of time-dependent phenomena, and noninvasive measurements.

  20. Experimental evaluation of expendable supersonic nozzle concepts

    NASA Technical Reports Server (NTRS)

    Baker, V.; Kwon, O.; Vittal, B.; Berrier, B.; Re, R.

    1990-01-01

    Exhaust nozzles for expendable supersonic turbojet engine missile propulsion systems are required to be simple, short and compact, in addition to having good broad-range thrust-minus-drag performance. A series of convergent-divergent nozzle scale model configurations were designed and wind tunnel tested for a wide range of free stream Mach numbers and nozzle pressure ratios. The models included fixed geometry and simple variable exit area concepts. The experimental and analytical results show that the fixed geometry configurations tested have inferior off-design thrust-minus-drag performance in the transonic Mach range. A simple variable exit area configuration called the Axi-Quad nozzle, combining features of both axisymmetric and two-dimensional convergent-divergent nozzles, performed well over a broad range of operating conditions. Analytical predictions of the flow pattern as well as overall performance of the nozzles, using a fully viscous, compressible CFD code, compared very well with the test data.

  1. Shock compression modeling of metallic single crystals: comparison of finite difference, steady wave, and analytical solutions

    DOE PAGES

    Lloyd, Jeffrey T.; Clayton, John D.; Austin, Ryan A.; ...

    2015-07-10

    Background: The shock response of metallic single crystals can be captured using a micro-mechanical description of the thermoelastic-viscoplastic material response; however, using such a description within the context of traditional numerical methods may introduce physical artifacts. Advantages and disadvantages of complex material descriptions, in particular the viscoplastic response, must be framed within the approximations introduced by numerical methods. Methods: Three methods of modeling the shock response of metallic single crystals are summarized: finite difference simulations, steady wave simulations, and algebraic solutions of the Rankine-Hugoniot jump conditions. For the former two numerical techniques, a dislocation density based framework describes the rate- and temperature-dependent shear strength on each slip system. For the latter analytical technique, a simple (two-parameter) rate- and temperature-independent linear hardening description is necessarily invoked to enable simultaneous solution of the governing equations. For all models, the same nonlinear thermoelastic energy potential incorporating elastic constants of up to order 3 is applied. Results: Solutions are compared for plate impact of highly symmetric orientations (all three methods) and low-symmetry orientations (numerical methods only) of aluminum single crystals shocked to 5 GPa (weak shock regime) and 25 GPa (overdriven regime). Conclusions: For weak shocks, results of the two numerical methods are very similar, regardless of crystallographic orientation. For strong shocks, artificial viscosity affects the finite difference solution, and effects of transverse waves for the lower symmetry orientations, not captured by the steady wave method, become important. The analytical solution, which can only be applied to highly symmetric orientations, provides reasonable accuracy with regard to prediction of most variables in the final shocked state but, by construction, does not provide insight into the shock structure afforded by the numerical methods.
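
    For reference, the Rankine-Hugoniot jump conditions solved algebraically in the third method are the standard conservation statements across a steady shock (textbook form, not quoted from the paper; U_s shock velocity, u_p particle velocity, subscript 0 the unshocked state):

```latex
\rho_{0} U_{s} = \rho \,(U_{s} - u_{p}),
\qquad
P - P_{0} = \rho_{0} U_{s} u_{p},
\qquad
E - E_{0} = \tfrac{1}{2}\,(P + P_{0}) \left( \frac{1}{\rho_{0}} - \frac{1}{\rho} \right).
```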

  2. A generic analytical foot rollover model for predicting translational ankle kinematics in gait simulation studies.

    PubMed

    Ren, Lei; Howard, David; Ren, Luquan; Nester, Chris; Tian, Limei

    2010-01-19

    The objective of this paper is to develop an analytical framework for representing ankle-foot kinematics by modelling the foot as a rollover rocker, which can not only be used as a generic tool for general gait simulation but also allows for case-specific modelling if required. Previously, the rollover models used in gait simulation have often been based on specific functions that have usually been of a simple form. In contrast, the analytical model described here is in a general form in which the effective foot rollover shape can be represented by any polar function ρ = ρ(φ). Furthermore, a normalized generic foot rollover model has been established based on a normative foot rollover shape dataset of 12 normal healthy subjects. To evaluate model accuracy, the predicted ankle motions and the centre of pressure (CoP) were compared with measurement data for both subject-specific and general cases. The results demonstrated that the ankle joint motions in both vertical and horizontal directions (relative RMSE approximately 10%) and the CoP (relative RMSE approximately 15% for most of the subjects) are accurately predicted over most of the stance phase (from 10% to 90% of stance). However, we found that the foot cannot be very accurately represented by a rollover model just after heel strike (HS) and just before toe off (TO), probably due to shear deformation of foot plantar tissues (ankle motion can occur without any foot rotation). The proposed foot rollover model can be used in both inverse and forward dynamics gait simulation studies and may also find applications in rehabilitation engineering. Copyright 2009 Elsevier Ltd. All rights reserved.

  3. Educational Change in Post-Conflict Contexts: Reflections on the South African Experience 20 Years Later

    ERIC Educational Resources Information Center

    Christie, Pam

    2016-01-01

    Reflecting on South African experience, this paper develops an analytical framework using the work of Henri Lefebvre and Nancy Fraser to understand why socially just arrangements may be so difficult to achieve in post-conflict reconstruction. The paper uses Lefebvre's analytic to trace three sets of entangled practices…

  4. Triple Helix Systems: An Analytical Framework for Innovation Policy and Practice in the Knowledge Society

    ERIC Educational Resources Information Center

    Ranga, Marina; Etzkowitz, Henry

    2013-01-01

    This paper introduces the concept of Triple Helix systems as an analytical construct that synthesizes the key features of university--industry--government (Triple Helix) interactions into an "innovation system" format, defined according to systems theory as a set of components, relationships and functions. Among the components of Triple…

  5. Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis

    ERIC Educational Resources Information Center

    Edwards, Jeffrey R.; Lambert, Lisa Schurer

    2007-01-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…

  6. Learning Analytics as a Counterpart to Surveys of Student Experience

    ERIC Educational Resources Information Center

    Borden, Victor M. H.; Coates, Hamish

    2017-01-01

    Analytics derived from the student learning environment provide new insights into the collegiate experience; they can be used as a supplement to or, to some extent, in place of traditional surveys. To serve this purpose, however, greater attention must be paid to conceptual frameworks and to advancing institutional systems, activating new…

  7. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  8. A Move-Analytic Contrastive Study on the Introductions of American and Philippine Master's Theses in Architecture

    ERIC Educational Resources Information Center

    Lintao, Rachelle B.; Erfe, Jonathan P.

    2012-01-01

    This study purports to foster the understanding of profession-based academic writing in two different cultural conventions by examining the rhetorical moves employed by American and Philippine thesis introductions in Architecture using Swales' 2004 Revised CARS move-analytic model as framework. Twenty (20) Master's thesis introductions in…

  9. Applying Learning Analytics for the Early Prediction of Students' Academic Performance in Blended Learning

    ERIC Educational Resources Information Center

    Lu, Owen H. T.; Huang, Anna Y. Q.; Huang, Jeff C. H.; Lin, Albert J. Q.; Ogata, Hiroaki; Yang, Stephen J. H.

    2018-01-01

    Blended learning combines online digital resources with traditional classroom activities and enables students to attain higher learning performance through well-defined interactive strategies involving online and traditional learning activities. Learning analytics is a conceptual framework and is a part of our Precision education used to analyze…

  10. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  11. A Complete Validated Learning Analytics Framework: Designing Issues from Data Preparation Perspective

    ERIC Educational Resources Information Center

    Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing

    2018-01-01

    With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…

  12. Learning Analytics for Communities of Inquiry

    ERIC Educational Resources Information Center

    Kovanovic, Vitomir; Gaševic, Dragan; Hatala, Marek

    2014-01-01

    This paper describes doctoral research that focuses on the development of a learning analytics framework for inquiry-based digital learning. Building on the Community of Inquiry model (CoI)--a foundation commonly used in the research and practice of digital learning and teaching--this research builds on the existing body of knowledge in two…

  13. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  14. Modeling of proton-induced radioactivation background in hard X-ray telescopes: Geant4-based simulation and its demonstration by Hitomi's measurement in a low Earth orbit

    NASA Astrophysics Data System (ADS)

    Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi; Koi, Tatsumi; Madejski, Greg; Mizuno, Tsunefumi; Ohno, Masanori; Saito, Shinya; Sato, Tamotsu; Wright, Dennis H.; Enoto, Teruaki; Fukazawa, Yasushi; Hayashi, Katsuhiro; Kataoka, Jun; Katsuta, Junichiro; Kawaharada, Madoka; Kobayashi, Shogo B.; Kokubun, Motohide; Laurent, Philippe; Lebrun, Francois; Limousin, Olivier; Maier, Daniel; Makishima, Kazuo; Mimura, Taketo; Miyake, Katsuma; Mori, Kunishiro; Murakami, Hiroaki; Nakamori, Takeshi; Nakano, Toshio; Nakazawa, Kazuhiro; Noda, Hirofumi; Ohta, Masayuki; Ozaki, Masanobu; Sato, Goro; Sato, Rie; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Takeda, Shin'ichiro; Tanaka, Takaaki; Tanaka, Yasuyuki; Terada, Yukikatsu; Uchiyama, Hideki; Uchiyama, Yasunobu; Watanabe, Shin; Yamaoka, Kazutaka; Yasuda, Tetsuya; Yatsu, Yoichi; Yuasa, Takayuki; Zoglauer, Andreas

    2018-05-01

    Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. The simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.

  15. Modeling of proton-induced radioactivation background in hard X-ray telescopes: Geant4-based simulation and its demonstration by Hitomi ’s measurement in a low Earth orbit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi

    Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. As a result, the simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.

  16. Modeling of proton-induced radioactivation background in hard X-ray telescopes: Geant4-based simulation and its demonstration by Hitomi ’s measurement in a low Earth orbit

    DOE PAGES

    Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi; ...

    2018-02-19

    Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. As a result, the simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.
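
    A minimal sketch of step (2), the semi-analytical conversion, as it might look in isolation; the data structures and names are illustrative assumptions, and decay chains of daughter isotopes are ignored. Each nucleus produced at time t_prod contributes an expected activity λ·exp(−λ·(t_meas − t_prod)) at measurement time.

```python
import math

def decay_rates_at(t_meas, production_events, half_lives):
    """Expected activity (decays/s) of each isotope at time t_meas.

    production_events: list of (isotope, production_time_s), e.g. from a
        Monte Carlo isotope-production step.
    half_lives: mapping isotope -> half-life in seconds.
    """
    rates = {}
    for isotope, t_prod in production_events:
        if t_meas < t_prod:
            continue                                  # not yet produced
        lam = math.log(2.0) / half_lives[isotope]     # decay constant
        rates[isotope] = rates.get(isotope, 0.0) + lam * math.exp(-lam * (t_meas - t_prod))
    return rates

# Example: a hypothetical isotope "X-1" with a 1 h half-life, produced twice.
events = [("X-1", 0.0), ("X-1", 1800.0)]
print(decay_rates_at(3600.0, events, {"X-1": 3600.0}))
```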

  17. Infinite slope stability under steady unsaturated seepage conditions

    USGS Publications Warehouse

    Lu, Ning; Godt, Jonathan W.

    2008-01-01

    We present a generalized framework for the stability of infinite slopes under steady unsaturated seepage conditions. The analytical framework allows the water table to be located at any depth below the ground surface and variation of soil suction and moisture content above the water table under steady infiltration conditions. The framework also explicitly considers the effect of weathering and porosity increase near the ground surface on changes in the friction angle of the soil. The factor of safety is conceptualized as a function of the depth within the vadose zone and can be reduced to the classical analytical solution for subaerial infinite slopes in the saturated zone. Slope stability analyses with hypothetical sandy and silty soils are conducted to illustrate the effectiveness of the framework. These analyses indicate that for hillslopes of both sandy and silty soils, failure can occur above the water table under steady infiltration conditions, which is consistent with some field observations that cannot be predicted by the classical infinite slope theory. A case study of shallow slope failures of sandy colluvium on steep coastal hillslopes near Seattle, Washington, is presented to examine the predictive utility of the proposed framework.
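
    As a rough illustration of how such a depth-dependent factor of safety can be evaluated, the sketch below uses one common extension of the classical infinite-slope expression with a suction-stress term; the parameter values are hypothetical, and in practice the suction-stress profile would come from the steady-infiltration solution:

```python
import numpy as np

# Hedged sketch: infinite-slope factor of safety with a suction-stress
# term, in the spirit of the framework described above (the exact form
# and all parameter values here are illustrative, not the paper's).
def factor_of_safety(z, beta, phi, c, gamma, sigma_s):
    """z: depth (m); beta: slope angle (rad); phi: friction angle (rad);
    c: cohesion (Pa); gamma: unit weight (N/m^3);
    sigma_s: suction stress at depth z (Pa, negative above water table)."""
    return (np.tan(phi) / np.tan(beta)
            + 2.0 * c / (gamma * z * np.sin(2.0 * beta))
            - sigma_s * (np.tan(beta) + 1.0 / np.tan(beta))
              * np.tan(phi) / (gamma * z))

# Example: cohesionless sandy soil, 45-degree slope, 1 m depth, modest suction.
print(factor_of_safety(z=1.0, beta=np.radians(45), phi=np.radians(35),
                       c=0.0, gamma=18e3, sigma_s=-2e3))
```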

  18. 8D likelihood effective Higgs couplings extraction framework in h → 4ℓ

    DOE PAGES

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...

    2015-01-23

    We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called 'golden channel'. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible $q\bar{q}$ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center of mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.
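
    Schematically, and in our own notation rather than the authors', the detector-level per-event probability is the truth-level differential cross section convolved with the transfer function and normalized, and the likelihood is the product over events:

```latex
% Notation ours: \vec{X}_G = truth-level observables, \vec{X}_R =
% reconstructed observables, T = transfer function, \vec{c} = couplings.
P\big(\vec{X}_R \mid \vec{c}\big)
  = \frac{1}{\sigma(\vec{c})} \int \mathrm{d}\vec{X}_G\,
    T\big(\vec{X}_R \mid \vec{X}_G\big)\,
    \frac{\mathrm{d}\sigma\big(\vec{X}_G \mid \vec{c}\big)}{\mathrm{d}\vec{X}_G},
\qquad
\mathcal{L}(\vec{c}) = \prod_{\text{events } k} P\big(\vec{X}_R^{(k)} \mid \vec{c}\big).
```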

  19. Simple stochastic model for El Niño with westerly wind bursts

    PubMed Central

    Thual, Sulian; Majda, Andrew J.; Chen, Nan; Stechmann, Samuel N.

    2016-01-01

    Atmospheric wind bursts in the tropics play a key role in the dynamics of the El Niño Southern Oscillation (ENSO). A simple modeling framework is proposed that summarizes this relationship and captures major features of the observational record while remaining physically consistent and amenable to detailed analysis. Within this simple framework, wind burst activity evolves according to a stochastic two-state Markov switching–diffusion process that depends on the strength of the western Pacific warm pool, and is coupled to simple ocean–atmosphere processes that are otherwise deterministic, stable, and linear. A simple model with this parameterization and no additional nonlinearities reproduces a realistic ENSO cycle with intermittent El Niño and La Niña events of varying intensity and strength as well as realistic buildup and shutdown of wind burst activity in the western Pacific. The wind burst activity has a direct causal effect on the ENSO variability: in particular, it intermittently triggers regular El Niño or La Niña events, super El Niño events, or no events at all, which enables the model to capture observed ENSO statistics such as the probability density function and power spectrum of eastern Pacific sea surface temperatures. The present framework provides further theoretical and practical insight on the relationship between wind burst activity and the ENSO. PMID:27573821
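
    A minimal sketch of a two-state Markov switching-diffusion of this kind, assuming constant switching rates and an Ornstein-Uhlenbeck diffusion for the wind-burst amplitude (in the paper the rates depend on the warm-pool state; all parameter values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged sketch of a two-state Markov switching-diffusion: the wind-burst
# amplitude follows an OU-type diffusion whose noise level switches
# between a quiet and an active state.
dt, n = 0.01, 200_000
sigma = {0: 0.2, 1: 1.5}          # state-dependent noise amplitudes
rate_up, rate_down = 0.05, 0.10   # illustrative transition rates
state, a = 0, 0.0
traj = np.empty(n)
for k in range(n):
    # Markov switching between quiet (0) and active (1) states
    if rng.random() < (rate_up if state == 0 else rate_down) * dt:
        state = 1 - state
    # OU diffusion for the wind-burst amplitude
    a += -a * dt + sigma[state] * np.sqrt(dt) * rng.standard_normal()
    traj[k] = a
print(traj.std(), (traj > 2).mean())  # crude intermittency check
```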

  20. A trajectory generation framework for modeling spacecraft entry in MDAO

    NASA Astrophysics Data System (ADS)

    D'Souza, Sarah N.; Sarigul-Klijn, Nesrin

    2016-04-01

    In this paper, a novel trajectory generation framework is developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was designed to be adaptable through the use of high-fidelity equations of motion and drag-based analytical bank profiles. Within this framework, a novel technique was implemented that resolves the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and a set of transition event conditions that are flight-feasible and implementable in a Generalized Entry Guidance algorithm.

  1. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD

    NASA Astrophysics Data System (ADS)

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav; Yang, Tong-Zhi; Zhu, Hua Xing

    2018-03-01

    The energy-energy correlation (EEC) between two detectors in e+e- annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.
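
    For reference, one standard definition of the EEC (normalization conventions vary) weights pairs of detected particles by their energy fractions at fixed opening angle χ:

```latex
% E_i, E_j = energies of final-state particles i, j; \theta_{ij} = their
% opening angle; Q = total center-of-mass energy.
\frac{\mathrm{d}\Sigma(\chi)}{\mathrm{d}\cos\chi}
  = \sum_{i,j} \int \mathrm{d}\sigma\;
    \frac{E_i E_j}{Q^{2}}\,
    \delta\!\big(\cos\chi - \cos\theta_{ij}\big).
```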

  2. The rise of environmental analytical chemistry as an interdisciplinary activity.

    PubMed

    Brown, Richard

    2009-07-01

    Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions relating to analytical and/or environmental chemistry, whether new data, methods, case studies, and instrumentation, or new interpretations and developments of existing ones, are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.

  3. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  4. Organizational culture and organizational effectiveness: a meta-analytic investigation of the competing values framework's theoretical suppositions.

    PubMed

    Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo

    2011-07-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.

  5. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  6. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis data will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  7. 3-D discrete analytical ridgelet transform.

    PubMed

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform within the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension, the Local-DART (with smooth windowing), to the denoising of 3-D images and color video. These experimental results show that simple thresholding of the 3-D DART coefficients is efficient.

  8. Design and performance of optimal detectors for guided wave structural health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dib, G.; Udpa, L.

    2016-01-01

    Ultrasonic guided wave measurements in a long-term structural health monitoring system are affected by measurement noise, environmental conditions, and transducer aging and malfunction. This results in measurement variability which affects detection performance, especially in complex structures where baseline data comparison is required. This paper derives the optimal detector structure, within the framework of detection theory, where a guided wave signal at the sensor is represented by a single feature value that can be used for comparison with a threshold. Three different types of detectors are derived depending on the underlying structure's complexity: (i) simple structures where defect reflections can be identified without the need for baseline data; (ii) simple structures that require baseline data due to overlap of defect scatter with scatter from structural features; (iii) complex structures with dense structural features that require baseline data. The detectors are derived by modeling the effects of variabilities and uncertainties as random processes. Analytical solutions for the performance of detectors in terms of the probability of detection and false alarm are derived. A finite element model is used to generate guided wave signals, and the performance results of a Monte Carlo simulation are compared with the theoretical performance. Initial results demonstrate that the problems of signal complexity and environmental variability can in fact be exploited to improve detection performance.
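
    For a single feature compared with a threshold under Gaussian statistics, the receiver operating characteristic is set entirely by the detectability index d'; a minimal sketch of this textbook relation (not specific to this paper's detector structures):

```python
from scipy.stats import norm

# Standard Gaussian detection-theory relation: probability of detection
# at a given false-alarm rate, parameterized only by d'.
def prob_detection(p_fa, d_prime):
    return norm.sf(norm.isf(p_fa) - d_prime)

print(prob_detection(1e-3, 3.0))  # d' = 3 gives ~0.46 at Pfa = 1e-3
```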

  9. A minimalist feedback-regulated model for galaxy formation during the epoch of reionization

    NASA Astrophysics Data System (ADS)

    Furlanetto, Steven R.; Mirocha, Jordan; Mebane, Richard H.; Sun, Guochao

    2017-12-01

    Near-infrared surveys have now determined the luminosity functions of galaxies at 6 ≲ z ≲ 8 to impressive precision and identified a number of candidates at even earlier times. Here, we develop a simple analytic model to describe these populations that allows physically motivated extrapolation to earlier times and fainter luminosities. We assume that galaxies grow through accretion on to dark matter haloes, which we model by matching haloes at fixed number density across redshift, and that stellar feedback limits the star formation rate. We allow for a variety of feedback mechanisms, including regulation through supernova energy and momentum from radiation pressure. We show that reasonable choices for the feedback parameters can fit the available galaxy data, which in turn substantially limits the range of plausible extrapolations of the luminosity function to earlier times and fainter luminosities: for example, the global star formation rate declines rapidly (by a factor of ∼20 from z = 6 to 15 in our fiducial model), but the bright galaxies accessible to observations decline even faster (by a factor ≳ 400 over the same range). Our framework helps us develop intuition for the range of expectations permitted by simple models of high-z galaxies that build on our understanding of 'normal' galaxy evolution. We also provide predictions for galaxy measurements by future facilities, including James Webb Space Telescope and Wide-Field Infrared Survey Telescope.

  10. A computational method for optimizing fuel treatment locations

    Treesearch

    Mark A. Finney

    2006-01-01

    Modeling and experiments have suggested that spatial fuel treatment patterns can influence the movement of large fires. On simple theoretical landscapes consisting of two fuel types (treated and untreated) optimal patterns can be analytically derived that disrupt fire growth efficiently (i.e. with less area treated than random patterns). Although conceptually simple,...

  11. Transport of a decay chain in homogenous porous media: analytical solutions.

    PubMed

    Bauer, P; Attinger, S; Kinzelbach, W

    2001-06-01

    With the aid of integral transforms, analytical solutions for the transport of a decay chain in homogenous porous media are derived. Unidirectional steady-state flow and radial steady-state flow in single and multiple porosity media are considered. At least in Laplace domain, all solutions can be written in closed analytical formulae. Partly, the solutions can also be inverted analytically. If not, analytical calculation of the steady-state concentration distributions, evaluation of temporal moments and numerical inversion are still possible. Formulae for several simple boundary conditions are given and visualized in this paper. The derived novel solutions are widely applicable and are very useful for the validation of numerical transport codes.
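
    For the unidirectional steady-flow case, the governing coupled equations take the familiar form below (our notation; the paper also treats radial flow and multiple-porosity media):

```latex
% c_i = concentration of chain member i, R_i = retardation factor,
% v = seepage velocity, D = dispersion coefficient, \lambda_i = decay
% constant; ingrowth couples member i to member i-1 (\lambda_0 c_0 \equiv 0).
R_i \frac{\partial c_i}{\partial t}
  = D \frac{\partial^{2} c_i}{\partial x^{2}}
  - v \frac{\partial c_i}{\partial x}
  - \lambda_i R_i c_i
  + \lambda_{i-1} R_{i-1} c_{i-1},
\qquad i = 1, 2, \ldots
```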

  12. Semi-quantitative estimation by IR of framework, extraframework and defect Al species of HBEA zeolites.

    PubMed

    Marques, João P; Gener, Isabelle; Ayrault, Philippe; Lopes, José M; Ribeiro, F Ramôa; Guisnet, Michel

    2004-10-21

    A simple method based on the characterization (composition, Brønsted and Lewis acidities) of acid-treated HBEA zeolites was developed for estimating the concentrations of framework, extraframework and defect Al species.

  13. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance plays an increasingly important role in people's social life. Real-time alerting of threatening events and searching for interesting content in large stores of video footage require a human operator to pay full attention to the monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
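
    A minimal sketch of part of such a pipeline using standard OpenCV components (frame differencing plus a HOG person detector; background-motion compensation and mean-shift tracking are omitted, and the video file name is hypothetical):

```python
import cv2

# Hedged sketch, not the paper's implementation: foreground detection by
# frame differencing plus HOG-based person classification, with a
# rule-based "moving person" event.
cap = cv2.VideoCapture("surveillance.avi")   # hypothetical input file
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask from simple frame differencing
    diff = cv2.absdiff(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # Classify people in the current frame with the HOG detector
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    # Predefined rule: a detected person overlapping moving pixels
    for (x, y, w, h) in boxes:
        if mask[y:y + h, x:x + w].mean() > 10:
            print("event: moving person at", (x, y, w, h))
    prev = frame
```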

  14. Finite element based N-Port model for preliminary design of multibody systems

    NASA Astrophysics Data System (ADS)

    Sanfedino, Francesco; Alazard, Daniel; Pommier-Budinger, Valérie; Falcoz, Alexandre; Boquet, Fabrice

    2018-02-01

    This article presents and validates a general framework to build a linear dynamic Finite Element-based model of large flexible structures for integrated Control/Structure design. An extension of the Two-Input Two-Output Port (TITOP) approach is developed here. The authors had already proposed such a framework for simple beam-like structures: each beam was considered as a TITOP sub-system that could be interconnected to another beam through the ports. The present work studies bodies with multiple attaching points by allowing complex interconnections among several sub-structures in a tree-like assembly. The TITOP approach is extended to generate NINOP (N-Input N-Output Port) models. A Matlab toolbox is developed integrating beam and bending plate elements. In particular, a NINOP formulation of bending plates is proposed to solve analytic two-dimensional problems. The computation of NINOP models using the outputs of an MSC/Nastran modal analysis is also investigated in order to directly use the results provided by a commercial finite element software package. The main advantage of this tool is to provide a model of a multibody system in the form of a block diagram with a minimal number of states. This model is easy to operate for preliminary design and control. An illustrative example highlights the potential of the proposed approach: the synthesis of the dynamical model of a spacecraft with two deployable and flexible solar arrays.

  15. Programming chemistry in DNA-addressable bioreactors.

    PubMed

    Fellermann, Harold; Cardelli, Luca

    2014-10-06

    We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis.

  16. A Comprehensive Analytical Solution of the Nonlinear Pendulum

    ERIC Educational Resources Information Center

    Ochs, Karlheinz

    2011-01-01

    In this paper, an analytical solution for the differential equation of the simple but nonlinear pendulum is derived. This solution is valid for any time and is not limited to any special initial instant or initial values. Moreover, this solution holds whether or not the pendulum swings over. The method of approach is based on Jacobi elliptic functions…
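
    For the oscillatory case with release from rest at amplitude θ0, the standard closed form in terms of the Jacobi elliptic function sn is (our notation):

```latex
% Pendulum released from rest at amplitude \theta_0 (oscillatory case):
\sin\frac{\theta(t)}{2} = k\,\mathrm{sn}\big(K(k) - \omega_0 t,\; k\big),
\qquad k = \sin\frac{\theta_0}{2},\quad \omega_0 = \sqrt{g/\ell},
% with period T = 4K(k)/\omega_0, where K is the complete elliptic
% integral of the first kind.
```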

  17. A theoretical framework for convergence and continuous dependence of estimates in inverse problems for distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Ito, K.

    1988-01-01

    Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.

  18. A Simple Sonication Improves Protein Signal in Matrix-Assisted Laser Desorption Ionization Imaging

    NASA Astrophysics Data System (ADS)

    Lin, Li-En; Su, Pin-Rui; Wu, Hsin-Yi; Hsu, Cheng-Chih

    2018-02-01

    Proper matrix application is crucial in obtaining high-quality matrix-assisted laser desorption ionization (MALDI) mass spectrometry imaging (MSI). Solvent-free sublimation was introduced as an approach to homogeneous coating that gives a small crystal size of the organic matrix. However, sublimation has lower extraction efficiency for analytes. Here, we show that a simple sonication step after the hydration in the standard sublimation protocol significantly enhances the sensitivity of MALDI MSI. This modified procedure uses a common laboratory ultrasonicator to immobilize the analytes from tissue sections without noticeable delocalization. Improved imaging quality with additional peaks above 10 kDa in the spectra was thus obtained upon sonication treatment.

  19. Electrochemistry and analytical determination of lysergic acid diethylamide (LSD) via adsorptive stripping voltammetry.

    PubMed

    Merli, Daniele; Zamboni, Daniele; Protti, Stefano; Pesavento, Maria; Profumo, Antonella

    2014-12-01

    Lysergic acid diethylamide (LSD) is hardly detectable and quantifiable in biological samples because of its low active dose. Although several analytical tests are available, routine analysis of this drug is rarely performed. In this article, we report a simple and accurate method for the determination of LSD, based on adsorptive stripping voltammetry in DMF/tetrabutylammonium perchlorate, with a linear range of 1-90 ng L⁻¹ for a deposition time of 50 s. An LOD of 1.4 ng L⁻¹ and an LOQ of 4.3 ng L⁻¹ were found. The method can also be applied to biological samples after a simple extraction with 1-chlorobutane.

  20. Ground temperature measurement by PRT-5 for the MAPS experiment

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1978-01-01

    A simple algorithm and computer program were developed for determining the actual surface temperature from the effective brightness temperature as measured remotely by a radiation thermometer called PRT-5. This procedure allows the computation of atmospheric correction to the effective brightness temperature without performing detailed radiative transfer calculations. Model radiative transfer calculations were performed to compute atmospheric corrections for several values of the surface and atmospheric parameters individually and in combination. Polynomial regressions were performed between the magnitudes or deviations of these parameters and the corresponding computed corrections to establish simple analytical relations between them. Analytical relations were also developed to represent combined correction for simultaneous variation of parameters in terms of their individual corrections.

  1. Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.

    2018-04-01

    A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.

  2. Simple and Sensitive Paper-Based Device Coupling Electrochemical Sample Pretreatment and Colorimetric Detection.

    PubMed

    Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C

    2016-05-17

    We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol L(-1), with a detection limit of 0.9 μmol L(-1). The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.

  3. SU-FF-T-668: A Simple Algorithm for Range Modulation Wheel Design in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, X; Nazaryan, Vahagn; Gueye, Paul

    2009-06-01

    Purpose: To develop a simple algorithm for designing the range modulation wheel to generate a very smooth spread-out Bragg peak (SOBP) for proton therapy. Method and Materials: A simple algorithm has been developed to generate the weight factors of the corresponding pristine Bragg peaks that compose a smooth SOBP in proton therapy. We used a modified analytical Bragg peak function, based on the Geant4 Monte Carlo simulation toolkit, as the pristine Bragg peak input to our algorithm. A simple MATLAB quadratic-programming routine was introduced to optimize the cost function in our algorithm. Results: We found that the existing analytical Bragg peak function cannot be used directly as the pristine Bragg peak depth-dose profile input in the optimization of the weight factors, since this model does not take into account the scattering introduced by the range shifts that modify the proton beam energies. We performed Geant4 simulations for a proton energy of 63.4 MeV with a 1.08 cm SOBP for a variety of pristine Bragg peaks composing this SOBP, and modified the existing analytical Bragg peak functions for their peak heights, ranges R0, and Gaussian energy spreads σE. We found that 19 pristine Bragg peaks are enough to achieve an SOBP flatness of 1.5%, the best flatness in the publications. Conclusion: This work develops a simple algorithm to generate the weight factors used to design a range modulation wheel that produces a smooth SOBP in proton radiation therapy. We have found that a moderate number of pristine Bragg peaks is enough to generate an SOBP with flatness less than 2%. It has the potential to generate a database, stored in the treatment plan, for producing a clinically acceptable SOBP using our simple algorithm.
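
    The weight-factor optimization can be illustrated with a nonnegative least-squares fit of pristine-peak depth-dose curves to a flat target; the sketch below uses toy analytic stand-ins for the Bragg curves, not the paper's Geant4-derived profiles:

```python
import numpy as np
from scipy.optimize import nnls

# Hedged sketch: find nonnegative weights w so that sum_i w_i d_i(z)
# is flat over the SOBP region. The pristine curves here are toy
# stand-ins, not real Bragg peak models.
z = np.linspace(0.0, 3.2, 400)                      # depth (cm)
ranges = np.linspace(1.9, 3.0, 19)                  # 19 pristine peaks

def pristine(z, r, width=0.08):
    # Toy depth-dose curve: entrance plateau plus a Gaussian "peak" at r
    return np.exp(-0.5 * ((z - r) / width) ** 2) + 0.25 * (z < r)

D = np.column_stack([pristine(z, r) for r in ranges])
target = np.where((z > 1.9) & (z < 3.0), 1.0, 0.0)  # flat SOBP region
w, _ = nnls(D, target)

sobp = D @ w
plateau = sobp[(z > 2.0) & (z < 2.9)]
print("flatness: %.2f%%" % (100 * (plateau.max() - plateau.min())
                            / plateau.mean()))
```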

  4. Technical Note for 8D Likelihood Effective Higgs Couplings Extraction Framework in the Golden Channel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe

    2014-10-17

    In this technical note we present technical details on various aspects of the framework introduced in arXiv:1401.2077 aimed at extracting effective Higgs couplings in the $h \to 4\ell$ 'golden channel'. Since it is the primary feature of the framework, we focus in particular on the convolution integral which takes us from 'truth' level to 'detector' level and the numerical and analytic techniques used to obtain it. We also briefly discuss other aspects of the framework.

  5. Optimization of storage tank locations in an urban stormwater drainage system using a two-stage approach.

    PubMed

    Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris

    2017-12-15

    Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the Storm Water Management Model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load, and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios are considered to test its robustness. The results demonstrate that the optimization framework is feasible, and the optimization is fast when based on the preliminary scheme. The optimized scheme is better than the preliminary scheme in reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls.

  6. An Analytical Framework for the Cross-Country Comparison of Higher Education Governance

    ERIC Educational Resources Information Center

    Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria

    2011-01-01

    In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…

  7. A Human Dimensions Framework: Guidelines for Conducting Social Assessments

    Treesearch

    Alan D. Bright; H. Ken Cordell; Anne P. Hoover; Michael A Tarrant

    2003-01-01

    This paper provides a framework and guidelines for identifying and organizing human dimension information for use in forest planning. It synthesizes concepts from a variety of social science disciplines and connects them with measurable indicators for use in analysis and reporting. Suggestions of analytical approaches and sources of data for employment of the...

  8. A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare

    ERIC Educational Resources Information Center

    Yahav, Inbal

    2010-01-01

    In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…

  9. How Do Mathematicians Learn Math?: Resources and Acts for Constructing and Understanding Mathematics

    ERIC Educational Resources Information Center

    Wilkerson-Jerde, Michelle H.; Wilensky, Uri J.

    2011-01-01

    In this paper, we present an analytic framework for investigating expert mathematical learning as the process of building a "network of mathematical resources" by establishing relationships between different components and properties of mathematical ideas. We then use this framework to analyze the reasoning of ten mathematicians and mathematics…

  10. Focus for Area Development Analysis: Urban Orientation of Counties.

    ERIC Educational Resources Information Center

    Bluestone, Herman

    The orientation of counties to metropolitan systems and urban centers is identified by population density and percentage of urban population. This analytical framework differentiates 6 kinds of counties, ranging from most urban-oriented (group 1) to least urban-oriented (group 6). With this framework, it can be seen that the economic well-being of…

  11. Analyzing Educators' Online Interactions: A Framework of Online Learning Support Roles

    ERIC Educational Resources Information Center

    Nacu, Denise C.; Martin, Caitlin K.; Pinkard, Nichole; Gray, Tené

    2016-01-01

    While the potential benefits of participating in online learning communities are documented, so too are inequities in terms of how different populations access and use them. We present the online learning support roles (OLSR) framework, an approach using both automated analytics and qualitative interpretation to identify and explore online…

  12. Mind-Sets Matter: A Meta-Analytic Review of Implicit Theories and Self-Regulation

    ERIC Educational Resources Information Center

    Burnette, Jeni L.; O'Boyle, Ernest H.; VanEpps, Eric M.; Pollack, Jeffrey M.; Finkel, Eli J.

    2013-01-01

    This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included…

  13. Supporting Multidisciplinary Networks through Relationality and a Critical Sense of Belonging: Three "Gardening Tools" and the "Relational Agency Framework"

    ERIC Educational Resources Information Center

    Duhn, Iris; Fleer, Marilyn; Harrison, Linda

    2016-01-01

    This article focuses on the "Relational Agency Framework" (RAF), an analytical tool developed for an Australian review and evaluation study of an early years' policy initiative. We draw on Anne Edwards's concepts of "relational expertise", "building common knowledge" and "relational agency" to explore how…

  14. An Analytic Framework to Support E.Learning Strategy Development

    ERIC Educational Resources Information Center

    Marshall, Stephen J.

    2012-01-01

    Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…

  15. University Reform and Institutional Autonomy: A Framework for Analysing the Living Autonomy

    ERIC Educational Resources Information Center

    Maassen, Peter; Gornitzka, Åse; Fumasoli, Tatiana

    2017-01-01

    In this article we discuss recent university reforms aimed at enhancing university autonomy, highlighting various tensions in the underlying reform ideologies. We examine how the traditional interpretation of university autonomy has been expanded in the reform rationales. An analytical framework for studying how autonomy is interpreted and used…

  16. High School Students' Informal Reasoning on a Socio-Scientific Issue: Qualitative and Quantitative Analyses

    ERIC Educational Resources Information Center

    Wu, Ying-Tien; Tsai, Chin-Chung

    2007-01-01

    Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…

  17. A Framework for Describing Mathematics Discourse in Instruction and Interpreting Differences in Teaching

    ERIC Educational Resources Information Center

    Adler, Jill; Ronda, Erlina

    2015-01-01

    We describe and use an analytical framework to document mathematics discourse in instruction (MDI), and interpret differences in mathematics teaching. MDI is characterised by four interacting components in the teaching of a mathematics lesson: exemplification (occurring through a sequence of examples and related tasks), explanatory talk (talk that…

  18. Tracking the debate around marine protected areas: key issues and the BEG framework.

    PubMed

    Thorpe, Andy; Bavinck, Maarten; Coulthard, Sarah

    2011-04-01

    Marine conservation is often criticized for a mono-disciplinary approach, which delivers fragmented solutions to complex problems with differing interpretations of success. As a means of reflecting on the breadth and range of scientific research on the management of the marine environment, this paper develops an analytical framework to gauge the foci of policy documents and published scientific work on Marine Protected Areas. We evaluate the extent to which MPA research articles delineate objectives around three domains: biological-ecological [B]; economic-social [E]; and governance-management [G]. This permits us to develop an analytic [BEG] framework which we then test on a sample of selected journal article cohorts. While the framework reveals the dominance of biologically focussed research [B], analysis also reveals a growing frequency of the use of governance/management terminology in the literature over the last 15 years, which may be indicative of a shift towards more integrated consideration of governance concerns. However, consideration of the economic/social domain appears to lag behind biological and governance concerns in both frequency and presence in MPA literature.

  19. Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.

    ERIC Educational Resources Information Center

    Steward, Ann Harleman

    Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…

  20. A Two-Stage Approach to Synthesizing Covariance Matrices in Meta-Analytic Structural Equation Modeling

    ERIC Educational Resources Information Center

    Cheung, Mike W. L.; Chan, Wai

    2009-01-01

    Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…

  1. Using Functional Analytic Therapy to Train Therapists in Acceptance and Commitment Therapy, a Conceptual and Practical Framework

    ERIC Educational Resources Information Center

    Schoendorff, Benjamin; Steinwachs, Joanne

    2012-01-01

    How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…

  2. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  3. Tributyltin--critical pollutant in whole water samples--development of traceable measurement methods for monitoring under the European Water Framework Directive (WFD) 2000/60/EC.

    PubMed

    Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert

    2015-07-01

    Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input into the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.

  4. Task-Driven Orbit Design and Implementation on a Robotic C-Arm System for Cone-Beam CT.

    PubMed

    Ouadah, S; Jacobson, M; Stayman, J W; Ehtiati, T; Weiss, C; Siewerdsen, J H

    2017-03-01

    This work applies task-driven optimization to the design of non-circular orbits that maximize imaging performance for a particular imaging task. First implementation of task-driven imaging on a clinical robotic C-arm system is demonstrated, and a framework for orbit calculation is described and evaluated. We implemented a task-driven imaging framework to optimize orbit parameters that maximize detectability index d'. This framework utilizes a specified Fourier domain task function and an analytical model for system spatial resolution and noise. Two experiments were conducted to test the framework. First, a simple task was considered consisting of frequencies lying entirely on the fz-axis (e.g., discrimination of structures oriented parallel to the central axial plane), and a "circle + arc" orbit was incorporated into the framework as a means to improve sampling of these frequencies, and thereby increase task-based detectability. The orbit was implemented on a robotic C-arm (Artis Zeego, Siemens Healthcare). A second task considered visualization of a cochlear implant simulated within a head phantom, with spatial frequency response emphasizing high-frequency content in the (fy, fz) plane of the cochlea. An optimal orbit was computed using the task-driven framework, and the resulting image was compared to that for a circular orbit. For the fz-axis task, the circle + arc orbit was shown to increase d' by a factor of 1.20, with an improvement of 0.71 mm in a 3D edge-spread measurement for edges located far from the central plane and a decrease in streak artifacts compared to a circular orbit. For the cochlear implant task, the resulting orbit favored complementary views of high tilt angles in a 360° orbit, and d' was increased by a factor of 1.83. This work shows that a prospective definition of imaging task can be used to optimize source-detector orbit and improve imaging performance. The method was implemented for execution of non-circular, task-driven orbits on a clinical robotic C-arm system. The framework is sufficiently general to include both acquisition parameters (e.g., orbit, kV, and mA selection) and reconstruction parameters (e.g., a spatially varying regularizer).

  5. Task-driven orbit design and implementation on a robotic C-arm system for cone-beam CT

    NASA Astrophysics Data System (ADS)

    Ouadah, S.; Jacobson, M.; Stayman, J. W.; Ehtiati, T.; Weiss, C.; Siewerdsen, J. H.

    2017-03-01

    Purpose: This work applies task-driven optimization to the design of non-circular orbits that maximize imaging performance for a particular imaging task. First implementation of task-driven imaging on a clinical robotic C-arm system is demonstrated, and a framework for orbit calculation is described and evaluated. Methods: We implemented a task-driven imaging framework to optimize orbit parameters that maximize detectability index d'. This framework utilizes a specified Fourier domain task function and an analytical model for system spatial resolution and noise. Two experiments were conducted to test the framework. First, a simple task was considered consisting of frequencies lying entirely on the fz-axis (e.g., discrimination of structures oriented parallel to the central axial plane), and a "circle + arc" orbit was incorporated into the framework as a means to improve sampling of these frequencies, and thereby increase task-based detectability. The orbit was implemented on a robotic C-arm (Artis Zeego, Siemens Healthcare). A second task considered visualization of a cochlear implant simulated within a head phantom, with spatial frequency response emphasizing high-frequency content in the (fy, fz) plane of the cochlea. An optimal orbit was computed using the task-driven framework, and the resulting image was compared to that for a circular orbit. Results: For the fz-axis task, the circle + arc orbit was shown to increase d' by a factor of 1.20, with an improvement of 0.71 mm in a 3D edge-spread measurement for edges located far from the central plane and a decrease in streak artifacts compared to a circular orbit. For the cochlear implant task, the resulting orbit favored complementary views of high tilt angles in a 360° orbit, and d' was increased by a factor of 1.83. Conclusions: This work shows that a prospective definition of imaging task can be used to optimize source-detector orbit and improve imaging performance. The method was implemented for execution of non-circular, task-driven orbits on a clinical robotic C-arm system. The framework is sufficiently general to include both acquisition parameters (e.g., orbit, kV, and mA selection) and reconstruction parameters (e.g., a spatially varying regularizer).
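
    For context, a common choice for the detectability index in task-driven imaging is the nonprewhitening model observer, which combines the Fourier task function with the system MTF and noise-power spectrum NPS (our notation; the paper's exact observer model may differ):

```latex
% Nonprewhitening-observer detectability (notation ours):
{d'}^{2} =
  \frac{\left[\displaystyle\int \lvert W_{\mathrm{task}}(f)\rvert^{2}\,
        \mathrm{MTF}^{2}(f)\, \mathrm{d}f \right]^{2}}
       {\displaystyle\int \lvert W_{\mathrm{task}}(f)\rvert^{2}\,
        \mathrm{MTF}^{2}(f)\, \mathrm{NPS}(f)\, \mathrm{d}f }.
```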

  6. Vertical and pitching resonance of train cars moving over a series of simple beams

    NASA Astrophysics Data System (ADS)

    Yang, Y. B.; Yau, J. D.

    2015-02-01

    The resonant response, including both vertical and pitching motions, of an undamped sprung mass unit moving over a series of simple beams is studied by a semi-analytical approach. For a sprung mass that is very small compared with the beam, we first simplify the sprung mass as a constant moving force and obtain the response of the beam in closed form. With this, we then solve for the response of the sprung mass passing over a series of simple beams, and validate the solution by an independent finite element analysis. To evaluate the pitching resonance, we consider the cases of a two-axle model and a coach model traveling over rough rails supported by a series of simple beams. The resonance of a train car is characterized by the fact that its response continues to build up, as it travels over more and more beams. For train cars with long axle intervals, the vertical acceleration induced by pitching resonance dominates the peak response of the train traveling over a series of simple beams. The present semi-analytical study allows us to grasp the key parameters involved in the primary/sub-resonant responses. Other phenomena of resonance are also discussed in the exemplar study.

  7. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    PubMed

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams.
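
    The flavor of analysis that PyFolding automates can be sketched with a direct two-state equilibrium unfolding fit; the snippet below uses scipy rather than PyFolding's own model classes, and the data are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch of a two-state equilibrium chemical-denaturation fit
# (scipy used directly; not PyFolding's API).
def two_state(x, dG, m):
    """Fraction folded vs denaturant concentration x for a two-state
    transition (dG in kcal/mol, m-value in kcal/mol/M, RT ~ 0.593)."""
    RT = 0.593
    return 1.0 / (1.0 + np.exp(-(dG - m * x) / RT))

x = np.linspace(0, 8, 40)
y = two_state(x, 5.0, 1.2) + 0.02 * np.random.default_rng(1).standard_normal(40)
(dG, m), _ = curve_fit(two_state, x, y, p0=(4.0, 1.0))
print(f"dG = {dG:.2f} kcal/mol, m = {m:.2f} kcal/mol/M")
```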

  8. A simple expression for quantifying bacterial chemotaxis using capillary assay data: application to the analysis of enhanced chemotactic responses from growth-limited cultures.

    PubMed

    Ford, R M; Lauffenburger, D A

    1992-05-01

    An individual cell-based mathematical model of Rivero et al. provides a framework for determining values of the chemotactic sensitivity coefficient χ0, an intrinsic cell population parameter that characterizes the chemotactic response of bacterial populations. This coefficient can theoretically relate the swimming behavior of individual cells to the resulting migration of a bacterial population. When this model is applied to the commonly used capillary assay, an approximate solution can be obtained for a particular range of chemotactic strengths, yielding a very simple analytical expression for estimating the value of χ0, [formula: see text] from measurements of cell accumulation in the capillary, N, when attractant uptake is negligible. A0 and A∞ are the dimensionless attractant concentrations initially present at the mouth of the capillary and far into the capillary, respectively, which are scaled by Kd, the effective dissociation constant for receptor-attractant binding. D is the attractant diffusivity, and μ is the cell random motility coefficient. NRM is the cell accumulation in the capillary in the absence of an attractant gradient, from which μ can be determined independently as μ = (π/4t)(NRM/(π r² bc))², with r the capillary tube radius and bc the bacterial density initially in the chamber. When attractant uptake is significant, a slightly more involved procedure requiring a simple numerical integration becomes necessary. As an example, we apply this approach to quantitatively characterize, in terms of the chemotactic sensitivity coefficient χ0, data from Terracciano indicating enhanced chemotactic responses of Escherichia coli to galactose when cultured under growth-limiting galactose levels in a chemostat.
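
    The random-motility expression quoted above can be applied directly; a minimal sketch with illustrative (hypothetical) assay numbers:

```python
import numpy as np

# Direct use of the expression quoted in the abstract:
# mu = (pi / 4t) * (N_RM / (pi r^2 b_c))^2. All numbers are illustrative.
t    = 45 * 60     # assay duration (s)
r    = 0.01        # capillary radius (cm)
b_c  = 1e8         # cell density in the chamber (cells/cm^3)
N_RM = 2.0e5       # cells accumulated without an attractant gradient

mu = (np.pi / (4 * t)) * (N_RM / (np.pi * r**2 * b_c)) ** 2
print(f"random motility mu = {mu:.2e} cm^2/s")
```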

  9. Relation of squeezed states between damped harmonic and simple harmonic oscillators

    NASA Technical Reports Server (NTRS)

    Um, Chung-In; Yeon, Kyu-Hwang; George, Thomas F.; Pandey, Lakshmi N.

    1993-01-01

    The minimum uncertainty and other relations are evaluated in the framework of the coherent states of the damped harmonic oscillator. It is shown that the coherent states of the damped harmonic oscillator are the squeezed coherent states of the simple harmonic oscillator. The unitary operator is also constructed, and this connects coherent states with damped harmonic and simple harmonic oscillators.

  10. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

    A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results and the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.
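
    The record does not give the analytical expressions themselves, so as a hedged illustration of the underlying physics only, the sketch below computes the textbook self-absorption factor for gamma rays born uniformly in a slab, f = (1 − e^(−μt))/(μt). This is a generic result, not the paper's specific method or its linear correction factor, and the attenuation coefficient used is an approximate value for Co-60 energies in steel.

```python
import numpy as np

def slab_self_shielding(mu, t):
    """Mean escape fraction for gammas born uniformly in a slab of thickness t
    with linear attenuation coefficient mu (consistent length units):
    f = (1 - exp(-mu*t)) / (mu*t)."""
    x = mu * t
    return np.where(x > 1e-12, (1.0 - np.exp(-x)) / x, 1.0)

# Illustrative only: mu ~ 0.42 /cm for ~1.25 MeV (Co-60 average) gammas in steel
print(slab_self_shielding(0.42, 5.0))  # 5 cm steel slab
```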

  11. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  12. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  13. Sample injection and electrophoretic separation on a simple laminated paper based analytical device.

    PubMed

    Xu, Chunxiu; Zhong, Minghua; Cai, Longfei; Zheng, Qingyu; Zhang, Xiaojun

    2016-02-01

    We describe a strategy to perform multistep operations on a simple laminated paper-based separation device by using electrokinetic flow to manipulate the fluids. A laminated crossed-channel paper-based separation device was fabricated by cutting a filter paper sheet followed by lamination. Multiple functional units, including sample loading, sample injection, and electrophoretic separation, were integrated on a single paper-based analytical device for the first time by applying potentials at different reservoirs for sample, sample waste, buffer, and buffer waste. As a proof-of-concept demonstration, a mixed sample solution containing carmine and sunset yellow was loaded in the sampling channel and then injected into the separation channel, followed by electrophoretic separation, by adjusting the potentials applied at the four terminals of the sampling and separation channels. The effects of buffer pH, buffer concentration, channel width, and separation time on the resolution of electrophoretic separation were studied. This strategy may be used to perform multistep operations such as reagent dilution, sample injection, mixing, reaction, and separation on a single microfluidic paper-based analytical device, which is very attractive for building micro total analysis systems on microfluidic paper-based analytical devices. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics

    PubMed Central

    Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna

    2016-01-01

    Determining thawing times of frozen foods is a challenging problem, as the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed. The proposed solutions range from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper analytical, empirical and graphical models are presented and critically reviewed. The conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always practical, as running the calculations takes time and the specialized software and equipment are not always affordable. For these reasons, the application of analytical-empirical models is more useful for engineering. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions, or the development of new ones, that will enable accurate determination of thawing time within a wide range of practical heat transfer conditions during processing. PMID:27904387
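
    One of the simple analytical forms such reviews typically cover is a Plank-type equation; purely as a sketch, the code below implements it for an infinite slab, under the common assumption that thawing is driven by (T_a − T_f) and the thermal conductivity of the thawed layer. The property values are assumed for illustration and are not taken from the review.

```python
def plank_thawing_time(rho, L, T_a, T_f, d, h, k, P=0.5, R=0.125):
    """Plank-type thawing-time estimate for an infinite slab of full thickness
    d (m). rho: density (kg/m^3), L: latent heat (J/kg), T_a: medium
    temperature and T_f: initial thawing point (degC), h: surface heat-transfer
    coefficient (W/m^2 K), k: conductivity of the thawed layer (W/m K).
    P, R are the slab shape factors (1/2, 1/8)."""
    return rho * L / (T_a - T_f) * (P * d / h + R * d**2 / k)

# Assumed property values for a lean-meat slab (illustrative only)
t = plank_thawing_time(rho=1050, L=250e3, T_a=15.0, T_f=-1.0,
                       d=0.05, h=25.0, k=0.5)
print("thawing time ~ %.1f h" % (t / 3600))
```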

  15. Simple method for the determination of personal care product ingredients in lettuce by ultrasound-assisted extraction combined with solid-phase microextraction followed by GC-MS.

    PubMed

    Cabrera-Peralta, Jerónimo; Peña-Alvarez, Araceli

    2018-05-01

    A simple method for the simultaneous determination of personal care product ingredients (galaxolide, tonalide, oxybenzone, 4-methylbenzylidene camphor, padimate-O, 2-ethylhexyl methoxycinnamate, octocrylene, triclosan, and methyl triclosan) in lettuce by ultrasound-assisted extraction combined with solid-phase microextraction followed by gas chromatography with mass spectrometry was developed. Lettuce was directly extracted by ultrasound-assisted extraction with methanol; this extract was combined with water, extracted by solid-phase microextraction in immersion mode, and analyzed by gas chromatography with mass spectrometry. Good linear relationships (25-250 ng/g, R² > 0.9702) and low detection limits (1.0-25 ng/g) were obtained for the analytes, along with acceptable precision for almost all analytes (RSDs < 20%). The validated method was applied to the determination of personal care product ingredients in commercial lettuce and in lettuces grown in soil and irrigated with the analytes, identifying the target analytes in the leaves and roots of the latter. This procedure is a miniaturized and environmentally friendly proposal that can be a useful tool for quality analysis of lettuce. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
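
    The linearity and detection-limit figures quoted above are the usual outputs of a calibration-curve fit. As a generic sketch (not the authors' validation data), the code below fits a line to hypothetical peak-area-versus-concentration points and derives R² and a 3.3·σ/slope detection limit, a common convention assumed here rather than stated in the record.

```python
import numpy as np

# Hypothetical calibration data: spiked concentrations (ng/g) vs peak areas
conc = np.array([25, 50, 100, 150, 200, 250], dtype=float)
area = np.array([410, 820, 1690, 2480, 3350, 4150], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Residual standard deviation and a 3.3*sigma/slope detection limit
sigma = np.sqrt(ss_res / (conc.size - 2))
lod = 3.3 * sigma / slope
print("R^2 = %.4f, LOD ~ %.1f ng/g" % (r2, lod))
```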

  16. Modern analytical chemistry in the contemporary world

    NASA Astrophysics Data System (ADS)

    Šíma, Jan

    2016-12-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment are emphasized. The importance of analytical method validation in order to obtain accurate and precise results is highlighted. Invalid results are not only useless; they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed and argued to be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. Indeed, the schooling of analytical chemistry should closely connect theory and practice.

  17. A Framework for Designing Cluster Randomized Trials with Binary Outcomes

    ERIC Educational Resources Information Center

    Spybrook, Jessaca; Martinez, Andres

    2011-01-01

    The purpose of this paper is to provide a framework for approaching a power analysis for a CRT (cluster randomized trial) with a binary outcome. The authors suggest a framework in the context of a simple CRT and then extend it to a blocked design, or a multi-site cluster randomized trial (MSCRT). The framework is based on proportions, an…
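
    Although the abstract is truncated here, one standard way to approach such a power analysis is the design-effect inflation DEFF = 1 + (m − 1)·ICC applied to a two-arm comparison of proportions. The sketch below uses that textbook approximation; it is not necessarily the authors' exact framework, and the example inputs are assumptions.

```python
import math
from scipy.stats import norm

def clusters_per_arm(p1, p2, m, icc, alpha=0.05, power=0.8):
    """Clusters per arm for a two-arm CRT with binary outcome: individually
    randomized sample size inflated by DEFF = 1 + (m - 1) * icc."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    n_ind = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
    deff = 1 + (m - 1) * icc
    return math.ceil(n_ind * deff / m)

# Assumed inputs: detect 50% vs 40%, 30 individuals per cluster, ICC = 0.02
print(clusters_per_arm(p1=0.50, p2=0.40, m=30, icc=0.02))  # -> 21
```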

  18. Global dynamic optimization approach to predict activation in metabolic pathways.

    PubMed

    de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R

    2014-01-06

    During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles considering simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques considering two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework results in more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions that were comparable to or better than those reported in the previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives. This new methodology can be applied to metabolic networks with arbitrary topologies, non-linear dynamics and constraints.
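
    As a toy illustration of the optimal-control formulation described here, the sketch below uses control-vector parameterization (piecewise-constant enzyme fractions) with a local optimizer on a two-step pathway S → I → P under a shared enzyme budget. The paper itself uses global, multi-objective methods; this is a local, single-objective stand-in, and all rate constants and names are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

k1, k2, T, N = 1.0, 1.0, 10.0, 5   # assumed rate constants, horizon, intervals

def product_at_T(e1_profile):
    """Integrate S -> I -> P where piecewise-constant enzyme fractions e1
    (and e2 = 1 - e1) split a fixed total enzyme budget; return P(T)."""
    def rhs(t, y):
        s, i, p = y
        e1 = e1_profile[min(int(t / (T / N)), N - 1)]
        v1, v2 = k1 * e1 * s, k2 * (1.0 - e1) * i
        return [-v1, v1 - v2, v2]
    sol = solve_ivp(rhs, (0, T), [1.0, 0.0, 0.0], rtol=1e-8, atol=1e-10)
    return sol.y[2, -1]

res = minimize(lambda u: -product_at_T(u), x0=np.full(N, 0.5),
               bounds=[(0.0, 1.0)] * N, method="L-BFGS-B")
print("optimal e1 profile:", np.round(res.x, 2), " P(T) = %.3f" % -res.fun)
```

    Consistent with the sequential-activation results discussed in this literature, the optimized profile tends to favor the first enzyme early and shift the budget downstream later.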

  19. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.

    2011-01-01

    The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.

  20. Analyzing the influence of institutions on health policy development in Uganda: a case study of the decision to abolish user fees.

    PubMed

    Moat, K A; Abelson, J

    2011-12-01

    During the 2001 election campaign, President Yoweri Museveni announced he was abolishing user fees for health services in Uganda. No analysis has been carried out to explain how he was able to initiate such an important policy decision without encountering any immediate barriers. The aim here is to explain this outcome through in-depth policy analysis driven by the application of key analytical frameworks. An explanatory case study informed by analytical frameworks from the institutionalism literature was undertaken. Multiple data sources were used, including academic literature, key government documents, grey literature, and a variety of print media. According to the analytical frameworks employed, several formal institutional constraints existed that would have reduced the prospects for the abolition of user fees. However, prevalent informal institutions such as "Big Man" presidentialism and clientelism, which were both 'competing' and 'complementary', can be used to explain the policy outcome. The analysis suggests that these factors trumped the impact of more formal institutional structures in the Ugandan context. Consideration should be given to the interactions between formal and informal institutions in the analysis of health policy processes in Uganda, as they provide a more nuanced understanding of how each set of factors influences policy outcomes.

  1. The path dependency theory: analytical framework to study institutional integration. The case of France

    PubMed Central

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-01-01

    Background: The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose: PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods: A qualitative approach was used. Analyses were based on semi-structured interviews with actors at all levels of decision-making, observations of advisory board meetings, and administrative documents. Results: Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion: Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740

  2. High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh

    Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to storing and processing sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to take full advantage of the proposed hybrid data model. To improve analyst productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data and comparing our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
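
    GraQL syntax is not reproduced here; instead, a toy Python sketch of the modeling difference the abstract describes: attributes exploded into RDF-style triples versus stored directly on an attributed edge. The net-flow fields and values are hypothetical, and this is not GEMS internals.

```python
# RDF-style: every attribute of a flow becomes another triple to join over
triples = [
    ("flow42", "rdf:type", "NetFlow"),
    ("flow42", "srcIP", "10.0.0.1"),
    ("flow42", "dstIP", "10.0.0.2"),
    ("flow42", "bytes", "1834"),
]

# Attributed-graph style: one edge carrying its properties directly
class Edge:
    def __init__(self, src, dst, **attrs):
        self.src, self.dst, self.attrs = src, dst, attrs

flows = [Edge("10.0.0.1", "10.0.0.2", bytes=1834, proto="tcp")]

# A query like "total bytes between two hosts" reads edge attributes in
# place instead of joining across attribute triples:
total = sum(e.attrs["bytes"] for e in flows
            if e.src == "10.0.0.1" and e.dst == "10.0.0.2")
print(total)
```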

  3. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    PubMed

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed owing to increased interest in their clinical effects. Method development was based on commonly used stationary (C 18 , C 8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on Zorbax Eclipse XDB C 8 (150 × 4.6 mm, 5 μm; Agilent) and ACN-H 2 O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, as the CPUs would otherwise be idle during portions of the runtime. Our results demonstrate that it is more efficient to use the HFP framework to offload these tasks to the GPUs than to perform them in the main application, and we observe increased resource utilization and overall productivity with this approach.
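
    The actual HFP runtime is a C/OpenACC framework on Titan; purely as a sketch of the pattern described (ranks push variables to a node-local daemon that computes single-variate statistics concurrently), here is a toy Python analogue using multiprocessing. All names and values are illustrative assumptions, not the HFP API.

```python
import multiprocessing as mp
import numpy as np

def daemon(queue):
    """Node-local 'analytics daemon': drains variables pushed by the
    simulation and computes single-variate statistics concurrently."""
    while True:
        item = queue.get()
        if item is None:                      # shutdown sentinel
            break
        name, values = item
        hist, _ = np.histogram(values, bins=10)
        print(f"{name}: mean={values.mean():.3f} hist={hist.tolist()}")

if __name__ == "__main__":
    q = mp.Queue()
    worker = mp.Process(target=daemon, args=(q,))
    worker.start()
    for step in range(3):                     # simulation loop ("main program")
        temperature = np.random.default_rng(step).normal(288, 5, 10_000)
        q.put(("surface_T", temperature))     # push a variable of interest
    q.put(None)
    worker.join()
```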

  5. Simple waves in a two-component Bose-Einstein condensate

    NASA Astrophysics Data System (ADS)

    Ivanov, S. K.; Kamchatnov, A. M.

    2018-04-01

    We study the dynamics of so-called simple waves in a two-component Bose-Einstein condensate. The evolution of the condensate is described by Gross-Pitaevskii equations which can be reduced for these simple wave solutions to a system of ordinary differential equations which coincide with those derived by Ovsyannikov for the two-layer fluid dynamics. We solve the Ovsyannikov system for two typical situations of large and small difference between interspecies and intraspecies nonlinear interaction constants. Our analytic results are confirmed by numerical simulations.

  6. Spin Seebeck effect in a simple ferromagnet near T_c: a Ginzburg-Landau approach

    NASA Astrophysics Data System (ADS)

    Adachi, Hiroto; Yamamoto, Yutaka; Ichioka, Masanori

    2018-04-01

    A time-dependent Ginzburg-Landau theory is used to examine the longitudinal spin Seebeck effect in a simple ferromagnet in the vicinity of the Curie temperature T_c. It is shown analytically that the spin Seebeck effect is proportional to the magnetization near T_c, a result in line with previous numerical findings. It is argued that the present result can be tested experimentally using a simple magnetic system such as EuO/Pt or EuS/Pt.

  7. Exploring the Moral Complexity of School Choice: Philosophical Frameworks and Contributions

    ERIC Educational Resources Information Center

    Wilson, Terri S.

    2015-01-01

    In this essay, I describe some of the methodological dimensions of my ongoing research into how parents choose schools. I particularly focus on how philosophical frameworks and analytical strategies have shaped the empirical portion of my research. My goal, in this essay, is to trace and explore the ways in which philosophy of education--as a…

  8. International Processes of Education Policy Formation: An Analytic Framework and the Case of Plan 2021 in El Salvador

    ERIC Educational Resources Information Center

    Edwards, D. Brent, Jr.

    2013-01-01

    This article uses multiple perspectives to frame international processes of education policy formation and then applies the framework to El Salvador's Plan 2021 between 2003 and 2005. These perspectives are policy attraction, policy negotiation, policy imposition, and policy hybridization. Research reveals that the formation of Plan 2021 was the…

  9. Behavioral assessment of personality disorders.

    PubMed

    Nelson-Gray, R O; Farmer, R F

    1999-04-01

    This article examines the definition of personality disorders (PDs) from a functional analytical framework and discusses the potential utility of such a framework to account for behavioral tendencies associated with PD pathology. Also reviewed are specific behavioral assessment methods that can be employed in the assessment of PDs, and how information derived from these assessments may be linked to specific intervention strategies.

  10. An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities

    ERIC Educational Resources Information Center

    Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin

    2013-01-01

    The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…
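
    The record is truncated, but the VIKOR step it names is a concrete algorithm. As a hedged sketch (the Delphi and ANP stages are omitted), the code below ranks hypothetical alternatives from an assumed decision matrix and weights; none of the data comes from the study.

```python
import numpy as np

def vikor(X, w, benefit, v=0.5):
    """VIKOR ranking: X is (alternatives x criteria), w criterion weights,
    benefit flags criteria where larger is better. Returns Q (lower = better)."""
    f_best = np.where(benefit, X.max(axis=0), X.min(axis=0))
    f_worst = np.where(benefit, X.min(axis=0), X.max(axis=0))
    d = w * (f_best - X) / (f_best - f_worst)   # weighted normalized distances
    S, R = d.sum(axis=1), d.max(axis=1)         # group utility / individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return Q

# Hypothetical scores for three universities on four EI criteria
X = np.array([[7, 5, 8, 6], [6, 8, 5, 7], [9, 4, 6, 5]], dtype=float)
w = np.array([0.4, 0.2, 0.2, 0.2])
print(vikor(X, w, benefit=np.array([True, True, True, True])))
```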

  11. Digging beneath the Surface: Analyzing the Complexity of Instructors' Participation in Asynchronous Discussion

    ERIC Educational Resources Information Center

    Clarke, Lane Whitney; Bartholomew, Audrey

    2014-01-01

    The purpose of this study was to investigate instructor participation in asynchronous discussions through an in-depth content analysis of instructors' postings and comments through the Community of Inquiry (COI) framework (Garrison et al., 2001). We developed an analytical tool based on this framework in order to better understand what instructors…

  12. The Life Course Perspective on Drug Use: A Conceptual Framework for Understanding Drug Use Trajectories

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; Longshore, Douglas; Anglin, M. Douglas

    2007-01-01

    This article discusses the life course perspective on drug use, including conceptual and analytic issues involved in developing the life course framework to explain how drug use trajectories develop during an individual's lifetime and how this knowledge can guide new research and approaches to management of drug dependence. Central concepts…

  13. Critical Argument and Writer Identity: Social Constructivism as a Theoretical Framework for EFL Academic Writing

    ERIC Educational Resources Information Center

    McKinley, Jim

    2015-01-01

    This article makes the argument that we need to situate students' academic writing as socially constructed pieces of writing that embody a writer's cultural identity and critical argument. In support, I present and describe a comprehensive model of an original English as a Foreign Language (EFL) writing analytical framework. This article explains…

  14. Meaning Making through Multiple Modalities in a Biology Classroom: A Multimodal Semiotics Discourse Analysis

    ERIC Educational Resources Information Center

    Jaipal, Kamini

    2010-01-01

    The teaching of science is a complex process, involving the use of multiple modalities. This paper illustrates the potential of a multimodal semiotics discourse analysis framework to illuminate meaning-making possibilities during the teaching of a science concept. A multimodal semiotics analytical framework is developed and used to (1) analyze the…

  15. A high-resolution bioclimate map of the world: a unifying framework for global biodiversity research and monitoring

    USGS Publications Warehouse

    Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert

    2013-01-01

    Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).

  16. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    ERIC Educational Resources Information Center

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  17. Utilising "Low Tech" Analytical Frameworks to Analyse Dyslexic Caribbean Students' Classroom Narratives

    ERIC Educational Resources Information Center

    Blackman, Stacey

    2007-01-01

    The cognitions of Caribbean students with dyslexia are explored as part of an embedded multiple case study approach to teaching and learning at two secondary schools on the island of Barbados. This exploration employed "low tech" approaches to analyse what pupils had said in interviews using a Miles and Huberman (1994) framework.…

  18. Safety risk assessment using analytic hierarchy process (AHP) during planning and budgeting of construction projects.

    PubMed

    Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat

    2013-09-01

    The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
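
    A minimal sketch of the AHP prioritization step named above: priority weights from the principal eigenvector of a pairwise-comparison matrix, plus Saaty's consistency ratio. The comparison values below are invented for illustration and are not the paper's data.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices

def ahp_weights(A):
    """Priority weights via the principal eigenvector of a pairwise
    comparison matrix A, with consistency ratio CR = CI / RI."""
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)          # consistency index
    return w, ci / RI[n]

# Hypothetical comparisons of three safety risks (e.g. falls vs struck-by
# vs electrocution); values are illustrative only.
A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), " CR = %.3f" % cr)  # CR < 0.1 => consistent
```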

  19. Promising Opportunities for Black and Latino Young Men: Findings from the Early Implementation of the Expanded Success Initiative. Technical Appendices

    ERIC Educational Resources Information Center

    Villavicencio, Adriana; Klevan, Sarah; Guidry, Brandon; Wulach, Suzanne

    2014-01-01

    This appendix describes the data collection and analytic processes used to develop the findings in the report "Promising Opportunities for Black and Latino Young Men." A central challenge was creating an analytic framework that could be uniformly applied to all schools, despite the individualized nature of their Expanded Success…

  20. Patterns of Work and Family Involvement among Single and Dual Earner Couples: Two Competing Analytical Approaches.

    ERIC Educational Resources Information Center

    Yogev, Sara; Brett, Jeanne

    This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…
