Edwards, Jeffrey R; Lambert, Lisa Schurer
2007-03-01
Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
Analytical closed-form solutions to the elastic fields of solids with dislocations and surface stress
NASA Astrophysics Data System (ADS)
Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed
2013-07-01
The concept of eigenstrain is adopted to derive a general analytical framework for solving the elastic field of 3D anisotropic solids with general defects while accounting for surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions for the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of surface stress as the size of the nanowire shrinks, an effect that becomes negligible at the macroscopic scale. Compared with the power-series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.
LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox
ERIC Educational Resources Information Center
Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich
2016-01-01
To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…
A trajectory generation framework for modeling spacecraft entry in MDAO
NASA Astrophysics Data System (ADS)
D'Souza, Sarah N.; Sarigul-Klijn, Nesrin
2016-04-01
In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Widanapathirana, Chathuranga
2014-01-01
Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential to modern financial instruments. Despite previous efforts, the exact analytical forms of derivative pricing distributions remain challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond-option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their associations with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
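The abstract does not reproduce the pricing formulas, but the standard closed-form Vasicek zero-coupon bond price (a textbook result, not the paper's path-integral derivation) gives a concrete reference point for the model being discussed:

```python
import math

def vasicek_bond_price(r0, a, b, sigma, T):
    """Closed-form price of a zero-coupon bond paying 1 at maturity T
    under the Vasicek short-rate model dr = a*(b - r) dt + sigma dW."""
    B = (1.0 - math.exp(-a * T)) / a
    A = math.exp((B - T) * (a * a * b - 0.5 * sigma * sigma) / (a * a)
                 - (sigma * sigma * B * B) / (4.0 * a))
    return A * math.exp(-B * r0)

# Illustrative parameters (hypothetical): r0 = 3%, mean reversion a = 0.5,
# long-run level b = 4%, volatility sigma = 1%, maturity 5 years.
p5 = vasicek_bond_price(0.03, 0.5, 0.04, 0.01, 5.0)
```

The distribution of future prices, and hence tail measures such as VaR, can then be explored by sampling the short rate from its Ornstein-Uhlenbeck transition density.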
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which are often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
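The analytic-beyond-the-knots point can be illustrated with a minimal sketch: outside the boundary knots a restricted cubic spline on the log-hazard scale is linear in log time, so the hazard is a power function of time and its integral is available in closed form. The coefficients below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical tail coefficients of the log-hazard spline:
# beyond the last knot, log h(t) = g0 + g1 * log(t).
g0, g1 = -2.0, 0.5

def hazard(t):
    # h(t) = exp(g0) * t**g1, a power-function hazard in the spline tail
    return np.exp(g0) * t ** g1

def cum_hazard_analytic(t):
    # H(t) = exp(g0) * t**(g1 + 1) / (g1 + 1), valid for g1 > -1
    return np.exp(g0) * t ** (g1 + 1.0) / (g1 + 1.0)

# Numerical check via the trapezoidal rule on a fine grid
t = np.linspace(1e-9, 5.0, 200_001)
h = hazard(t)
H_numeric = float(np.sum((h[1:] + h[:-1]) * 0.5 * np.diff(t)))
```

The closed form removes the need for quadrature in the tail, which is the source of the speed-up the abstract describes.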
1990-08-01
evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement... considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed
ERIC Educational Resources Information Center
Edwards, Jeffrey R.; Lambert, Lisa Schurer
2007-01-01
Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…
Meta-Analysis of Coefficient Alpha
ERIC Educational Resources Information Center
Rodriguez, Michael C.; Maeda, Yukiko
2006-01-01
The meta-analysis of coefficient alpha across many studies is becoming more common in psychology through a methodology labeled reliability generalization. Existing reliability generalization studies have not used the sampling distribution of coefficient alpha for precision weighting and other common meta-analytic procedures. A framework is provided for…
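As a concrete, simplified illustration of precision weighting for alpha, one widely used option is Bonett's (2002) transformation; the variance approximation below is that method, not necessarily the framework the authors propose:

```python
import math

def bonett_pool(alphas, ns, k):
    """Precision-weighted pooling of coefficient alpha via Bonett's (2002)
    transformation T = ln(1 - alpha), whose approximate sampling variance
    is 2k / ((k - 1)(n - 2)) for k items and sample size n."""
    ts = [math.log(1.0 - a) for a in alphas]
    ws = [((k - 1) * (n - 2)) / (2.0 * k) for n in ns]  # inverse variances
    t_bar = sum(w * t for w, t in zip(ws, ts)) / sum(ws)
    return 1.0 - math.exp(t_bar)

# Hypothetical studies: three alphas with sample sizes 50, 100, 200, k = 10 items
pooled = bonett_pool([0.70, 0.80, 0.90], [50, 100, 200], k=10)
```

The inverse-variance weights make larger studies dominate the pooled estimate, which is the precision-weighting behaviour the abstract says earlier reliability generalization studies lacked.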
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators, one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails (i) formulating generalized equations of motion as a superset of the flow-oscillator models, and (ii) developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations that allows modelling of multiple degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model; based on different assumptions, one can derive a variety of flow-oscillator models.
On Connectivity of Wireless Sensor Networks with Directional Antennas
Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.
2017-01-01
In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and channel randomness. Since existing directional antenna models trade off accuracy in reflecting realistic antennas against computational complexity, we propose a new analytical directional antenna model, called the iris model, to balance accuracy against complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model of network connectivity is accurate, and that our iris antenna model provides a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
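As a toy version of the accuracy/complexity trade-off, a plain sector ("keyhole"-style) antenna model already admits a closed-form link probability: for two in-range nodes with independently, uniformly oriented sectors of beamwidth θ, a bidirectional link exists with probability (θ/2π)². This is a simplified stand-in for illustration, not the paper's iris model:

```python
import math
import random

def mc_link_probability(beamwidth, trials=200_000, seed=1):
    """Monte Carlo estimate of the probability that two in-range nodes
    with uniformly oriented sector antennas of angular width `beamwidth`
    each cover the other, so a bidirectional link exists.
    Analytically this is (beamwidth / (2*pi))**2 for the sector model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Place node B in direction 0 from A; then A lies in direction pi from B.
        a_dir = rng.uniform(0.0, 2.0 * math.pi)
        b_dir = rng.uniform(0.0, 2.0 * math.pi)
        a_covers = min(a_dir, 2.0 * math.pi - a_dir) <= beamwidth / 2.0
        diff = abs(b_dir - math.pi)  # angular offset of B's boresight from A
        b_covers = min(diff, 2.0 * math.pi - diff) <= beamwidth / 2.0
        hits += a_covers and b_covers
    return hits / trials
```

Richer models (side lobes, continuous gain patterns) change this probability, which is exactly why the choice of antenna model drives the connectivity analysis.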
Estimating Aquifer Properties Using Sinusoidal Pumping Tests
NASA Astrophysics Data System (ADS)
Rasmussen, T. C.; Haborak, K. G.; Young, M. H.
2001-12-01
We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
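The radial solutions referenced above involve Kelvin-function mathematics; as a much simpler analogue, the classical 1-D periodic diffusion solution shows how amplitude decay between two observation wells yields the hydraulic diffusivity. This is a linear-flow sketch, not the radial Theis-type solutions of the paper, and the synthetic numbers are illustrative only:

```python
import math

def diffusivity_from_amplitudes(a1, a2, x1, x2, omega):
    """Estimate hydraulic diffusivity D from head-oscillation amplitudes
    a1, a2 observed at distances x1 < x2 from a sinusoidal source, using
    the 1-D periodic diffusion solution
    s(x, t) = a0 * exp(-k x) * sin(omega t - k x),  k = sqrt(omega / (2 D))."""
    k = math.log(a1 / a2) / (x2 - x1)
    return omega / (2.0 * k * k)

# Synthetic check: true diffusivity 5.0 m^2/s, forcing period 1 hour,
# observation wells at 10 m and 30 m (all values hypothetical).
D_true = 5.0
omega = 2.0 * math.pi / 3600.0
k = math.sqrt(omega / (2.0 * D_true))
a1 = 0.5 * math.exp(-k * 10.0)
a2 = 0.5 * math.exp(-k * 30.0)
D_est = diffusivity_from_amplitudes(a1, a2, 10.0, 30.0, omega)
```

The phase lag k(x2 − x1) offers an independent estimate of D, which is the redundancy that makes sinusoidal tests attractive for parameter estimation.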
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.
A novel analytical description of periodic volume coil geometries in MRI
NASA Astrophysics Data System (ADS)
Koh, D.; Felder, J.; Shah, N. J.
2018-03-01
MRI volume coils can be represented by equivalent lumped element circuits and for a variety of these circuit configurations analytical design equations have been presented. The unification of several volume coil topologies results in a two-dimensional gridded equivalent lumped element circuit which compromises the birdcage resonator, its multiple endring derivative but also novel structures like the capacitive coupled ring resonator. The theory section analyzes a general two-dimensional circuit by noting that its current distribution can be decomposed into a longitudinal and an azimuthal dependency. This can be exploited to compare the current distribution with a transfer function of filter circuits along one direction. The resonances of the transfer function coincide with the resonance of the volume resonator and the simple analytical solution can be used as a design equation. The proposed framework is verified experimentally against a novel capacitive coupled ring structure which was derived from the general circuit formulation and is proven to exhibit a dominant homogeneous mode. In conclusion, a unified analytical framework is presented that allows determining the resonance frequency of any volume resonator that can be represented by a two dimensional meshed equivalent circuit.
NASA Astrophysics Data System (ADS)
Kushch, Volodymyr I.; Sevostianov, Igor; Giraud, Albert
2017-11-01
An accurate semi-analytical solution of the conductivity problem for a composite with anisotropic matrix and arbitrarily oriented anisotropic ellipsoidal inhomogeneities has been obtained. The developed approach combines the superposition principle with the multipole expansion of perturbation fields of inhomogeneities in terms of ellipsoidal harmonics and reduces the boundary value problem to an infinite system of linear algebraic equations for the induced multipole moments of inhomogeneities. A complete full-field solution is obtained for the multi-particle models comprising inhomogeneities of diverse shape, size, orientation and properties which enables an adequate account for the microstructure parameters. The solution is valid for the general-type anisotropy of constituents and arbitrary orientation of the orthotropy axes. The effective conductivity tensor of the particulate composite with anisotropic constituents is evaluated in the framework of the generalized Maxwell homogenization scheme. Application of the developed method to composites with imperfect ellipsoidal interfaces is straightforward. Their incorporation yields probably the most general model of a composite that may be considered in the framework of analytical approach.
Integrated corridor management analysis, modeling and simulation (AMS) methodology.
DOT National Transportation Integrated Search
2008-03-01
This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...
Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results
NASA Astrophysics Data System (ADS)
González, Diego Luis; Jaramillo, Diego Felipe; Téllez, Gabriel; Einstein, T. L.
2013-03-01
While most Monte Carlo simulations assume that only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest-neighbor (NNN) interactions and, more generally, interactions out to the q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e. the sum over n'th neighbor distribution functions, which we investigated recently [2]). For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting, from simulated experimental data, the characteristic scale-setting terms in assumed potential forms.
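For reference, the generalized Wigner distribution mentioned above has the form P(s) = a s^ρ exp(−b s²), with a and b fixed by requiring unit normalization and unit mean terrace width. The constants follow from standard Gamma-function integrals (a textbook result, checked numerically below):

```python
import math
import numpy as np

def wigner_constants(rho):
    """Constants of the generalized Wigner terrace-width distribution
    P(s) = a * s**rho * exp(-b * s**2), fixed by unit norm and unit mean:
    b = (Gamma((rho+2)/2) / Gamma((rho+1)/2))**2,
    a = 2 * b**((rho+1)/2) / Gamma((rho+1)/2)."""
    g1 = math.gamma((rho + 1.0) / 2.0)
    g2 = math.gamma((rho + 2.0) / 2.0)
    b = (g2 / g1) ** 2
    a = 2.0 * b ** ((rho + 1.0) / 2.0) / g1
    return a, b

a, b = wigner_constants(2.0)  # rho = 2 reproduces P(s) = (32/pi^2) s^2 e^{-4 s^2 / pi}
s = np.linspace(0.0, 8.0, 400_001)
p = a * s ** 2.0 * np.exp(-b * s ** 2)
ds = s[1] - s[0]
norm = float(np.sum((p[1:] + p[:-1]) * 0.5) * ds)
m = s * p
mean = float(np.sum((m[1:] + m[:-1]) * 0.5) * ds)
```

Finite-range interactions perturb this ρ-parameterized shape, which is what the study quantifies.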
Some Observations on Cost-Effectiveness Analysis in Education.
ERIC Educational Resources Information Center
Geske, Terry G.
1979-01-01
The general nature of cost-effectiveness analysis is discussed, analytical frameworks for conducting cost-effectiveness studies are described, and some of the problems inherent in measuring educational costs and in assessing program effectiveness are addressed. (Author/IRT)
Riemannian geometry of Hamiltonian chaos: hints for a general theory.
Cerruti-Sola, Monica; Ciraolo, Guido; Franzosi, Roberto; Pettini, Marco
2008-10-01
We aim at assessing the validity limits of some simplifying hypotheses that, within a Riemannian geometric framework, have provided an explanation of the origin of Hamiltonian chaos and have made it possible to develop a method of analytically computing the largest Lyapunov exponent of Hamiltonian systems with many degrees of freedom. Therefore, numerical hypothesis testing has been performed for the Fermi-Pasta-Ulam beta model and for a chain of coupled rotators. These models, for which analytic computations of the largest Lyapunov exponents have been carried out in the mentioned Riemannian geometric framework, appear as paradigmatic examples to unveil the reason why the main hypothesis of quasi-isotropy of the mechanical manifolds sometimes breaks down. The breakdown is expected whenever the topology of the mechanical manifolds is nontrivial. This is an important step forward in view of developing a geometric theory of Hamiltonian chaos of general validity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.
2011-09-01
The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column that contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.
NASA Astrophysics Data System (ADS)
Ikot, Akpan N.; Maghsoodi, Elham; Hassanabadi, Hassan; Obu, Joseph A.
2014-05-01
In this paper, we obtain the approximate analytical bound-state solutions of the Dirac particle with the generalized Yukawa potential within the framework of spin and pseudospin symmetries for an arbitrary κ state with a generalized tensor interaction. The generalized parametric Nikiforov-Uvarov method is used to obtain the energy eigenvalues and the corresponding wave functions in closed form. We also report some numerical results and present figures to show the effect of the tensor interaction.
NASA Astrophysics Data System (ADS)
Safeeq, M.; Grant, G. E.; Lewis, S. L.; Kramer, M. G.; Staab, B.
2014-09-01
Summer streamflows in the Pacific Northwest are largely derived from melting snow and groundwater discharge. As the climate warms, diminishing snowpack and earlier snowmelt will cause reductions in summer streamflow. Most regional-scale assessments of climate change impacts on streamflow use downscaled temperature and precipitation projections from general circulation models (GCMs) coupled with large-scale hydrologic models. Here we develop and apply an analytical hydrogeologic framework for characterizing summer streamflow sensitivity to a change in the timing and magnitude of recharge in a spatially explicit fashion. In particular, we incorporate the role of deep groundwater, which large-scale hydrologic models generally fail to capture, into streamflow sensitivity assessments. We validate our analytical streamflow sensitivities against two empirical measures of sensitivity derived using historical observations of temperature, precipitation, and streamflow from 217 watersheds. In general, empirically and analytically derived streamflow sensitivity values correspond. Although the selected watersheds cover a range of hydrologic regimes (e.g., rain-dominated, mixture of rain and snow, and snow-dominated), sensitivity validation was primarily driven by the snow-dominated watersheds, which are subjected to a wider range of change in recharge timing and magnitude as a result of increased temperature. Overall, two patterns emerge from this analysis: first, areas with high streamflow sensitivity also have higher summer streamflows as compared to low-sensitivity areas. Second, the level of sensitivity and spatial extent of highly sensitive areas diminishes over time as the summer progresses. 
Results of this analysis point to a robust, practical, and scalable approach that can help assess risk at the landscape scale, complement the downscaling approach, be applied to any climate scenario of interest, and provide a framework to assist land and water managers in adapting to an uncertain and potentially challenging future.
Child Development in Developing Countries: Introduction and Methods
ERIC Educational Resources Information Center
Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.
2012-01-01
The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles…
Infinite slope stability under steady unsaturated seepage conditions
Lu, Ning; Godt, Jonathan W.
2008-01-01
We present a generalized framework for the stability of infinite slopes under steady unsaturated seepage conditions. The analytical framework allows the water table to be located at any depth below the ground surface and variation of soil suction and moisture content above the water table under steady infiltration conditions. The framework also explicitly considers the effect of weathering and porosity increase near the ground surface on changes in the friction angle of the soil. The factor of safety is conceptualized as a function of the depth within the vadose zone and can be reduced to the classical analytical solution for subaerial infinite slopes in the saturated zone. Slope stability analyses with hypothetical sandy and silty soils are conducted to illustrate the effectiveness of the framework. These analyses indicate that for hillslopes of both sandy and silty soils, failure can occur above the water table under steady infiltration conditions, which is consistent with some field observations that cannot be predicted by the classical infinite slope theory. A case study of shallow slope failures of sandy colluvium on steep coastal hillslopes near Seattle, Washington, is presented to examine the predictive utility of the proposed framework.
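The extension of the classical infinite-slope factor of safety into the vadose zone can be sketched with a common unified-effective-stress form: the saturated-zone terms plus a suction-stress contribution (negative suction stress above the water table raises stability). The expression below is a standard form in the spirit of Lu and Godt's framework, not a verbatim reproduction of their equations, and the parameter values are hypothetical:

```python
import math

def factor_of_safety(c, phi_deg, beta_deg, gamma, z, sigma_s=0.0):
    """Infinite-slope factor of safety at vertical depth z [m]:
    FS = tan(phi)/tan(beta) + 2c/(gamma z sin 2beta)
         - sigma_s (tan beta + cot beta) tan(phi) / (gamma z),
    with cohesion c [Pa], friction angle phi, slope angle beta,
    unit weight gamma [N/m^3], and suction stress sigma_s [Pa]
    (negative above the water table)."""
    phi = math.radians(phi_deg)
    beta = math.radians(beta_deg)
    fs = math.tan(phi) / math.tan(beta)
    fs += 2.0 * c / (gamma * z * math.sin(2.0 * beta))
    fs -= sigma_s * (math.tan(beta) + 1.0 / math.tan(beta)) * math.tan(phi) / (gamma * z)
    return fs
```

With c = 0 and σ_s = 0 this collapses to the classical dry, cohesionless result FS = tanφ/tanβ, so FS = 1 exactly when the slope angle equals the friction angle.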
8D likelihood effective Higgs couplings extraction framework in h → 4ℓ
Chen, Yi; Di Marco, Emanuele; Lykken, Joe; ...
2015-01-23
We present an overview of a comprehensive analysis framework aimed at performing direct extraction of all possible effective Higgs couplings to neutral electroweak gauge bosons in the decay to electrons and muons, the so-called 'golden channel'. Our framework is based primarily on a maximum likelihood method constructed from analytic expressions of the fully differential cross sections for h → 4ℓ and for the dominant irreducible qq̄ → 4ℓ background, where 4ℓ = 2e2μ, 4e, 4μ. Detector effects are included by an explicit convolution of these analytic expressions with the appropriate transfer function over all center-of-mass variables. Utilizing the full set of observables, we construct an unbinned detector-level likelihood which is continuous in the effective couplings. We consider possible ZZ, Zγ, and γγ couplings simultaneously, allowing for general CP odd/even admixtures. A broad overview is given of how the convolution is performed, and we discuss the principles and theoretical basis of the framework. This framework can be used in a variety of ways to study Higgs couplings in the golden channel using data obtained at the LHC and other future colliders.
NASA Astrophysics Data System (ADS)
Donohue, Randall; Yang, Yuting; McVicar, Tim; Roderick, Michael
2016-04-01
A fundamental question in climate and ecosystem science is "how does climate regulate the land surface carbon budget?" To better answer that question, here we develop an analytical model for estimating mean annual terrestrial gross primary productivity (GPP), which is the largest carbon flux over land, based on a rate-limitation framework. Actual GPP (climatological mean from 1982 to 2010) is calculated as a function of the balance between two GPP potentials defined by the climate (i.e., precipitation and solar radiation) and a third parameter that encodes other environmental variables and modifies the GPP-climate relationship. The developed model was tested at three spatial scales using different GPP sources, i.e., (1) observed GPP from 94 flux-sites, (2) modelled GPP (using the model-tree-ensemble approach) at 48654 (0.5 degree) grid-cells and (3) at 32 large catchments across the globe. Results show that the proposed model could account for the spatial GPP patterns, with a root-mean-square error of 0.70, 0.65 and 0.3 g C m-2 d-1 and R2 of 0.79, 0.92 and 0.97 for the flux-site, grid-cell and catchment scales, respectively. This analytical GPP model shares a similar form with the Budyko hydroclimatological model, which opens the possibility of a general analytical framework to analyze the linked carbon-water-energy cycles.
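The "balance between two GPP potentials" has the same mathematical flavour as the Budyko curve; a Choudhury-type mean illustrates the shape, with the exponent n standing in for the third, environment-encoding parameter. This is an illustrative analogue, not necessarily the exact functional form the authors fit:

```python
def budyko_gpp(g_p, g_r, n=2.0):
    """Budyko-style limit-balance estimate of actual GPP from a
    precipitation-defined potential g_p and a radiation-defined potential
    g_r, with shape parameter n encoding other environmental effects
    (Choudhury-type form, used here as an illustrative analogue)."""
    return (g_p * g_r) / (g_p ** n + g_r ** n) ** (1.0 / n)
```

By construction the estimate is symmetric in the two potentials, never exceeds the more limiting one, and approaches it when the other is abundant.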
The Role of Geriatric Assessment Units in Caring for the Elderly: An Analytic Review.
ERIC Educational Resources Information Center
Rubenstein, Laurence Z.; And Others
1982-01-01
Although their structures and objectives vary considerably, Geriatric Assessment Units (GAUs) are generally designed to assess elderly patients' medical and psychosocial problems, to determine optimal placement, and often to provide therapy and rehabilitation. Offers a framework for examining structural and outcome variables for GAUs. (Author)
A General Critical Discourse Analysis Framework for Educational Research
ERIC Educational Resources Information Center
Mullet, Dianna R.
2018-01-01
Critical discourse analysis (CDA) is a qualitative analytical approach for critically describing, interpreting, and explaining the ways in which discourses construct, maintain, and legitimize social inequalities. CDA rests on the notion that the way we use language is purposeful, regardless of whether discursive choices are conscious or…
Lambert, Amaury; Alexander, Helen K; Stadler, Tanja
2014-07-07
The reconstruction of phylogenetic trees based on viral genetic sequence data sequentially sampled from an epidemic provides estimates of the past transmission dynamics, by fitting epidemiological models to these trees. To our knowledge, none of the epidemiological models currently used in phylogenetics can account for recovery rates and sampling rates dependent on the time elapsed since transmission, i.e. age of infection. Here we introduce an epidemiological model where infectives leave the epidemic, by either recovery or sampling, after some random time which may follow an arbitrary distribution. We derive an expression for the likelihood of the phylogenetic tree of sampled infectives under our general epidemiological model. The analytic concept developed in this paper will facilitate inference of past epidemiological dynamics and provide an analytical framework for performing very efficient simulations of phylogenetic trees under our model. The main idea of our analytic study is that the non-Markovian epidemiological model giving rise to phylogenetic trees growing vertically as time goes by can be represented by a Markovian "coalescent point process" growing horizontally by the sequential addition of pairs of coalescence and sampling times. As examples, we discuss two special cases of our general model, described in terms of influenza and HIV epidemics. Though phrased in epidemiological terms, our framework can also be used for instance to fit macroevolutionary models to phylogenies of extant and extinct species, accounting for general species lifetime distributions. Copyright © 2014 Elsevier Ltd. All rights reserved.
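The "coalescent point process" representation described above — a tree grown horizontally by adding i.i.d. coalescence depths until one exceeds the stem age — can be sketched in a few lines. The exponential depth distribution here is purely illustrative, not the paper's age-of-infection-dependent model:

```python
import random

def simulate_cpp(stem_age, depth_sampler, rng):
    """Sketch of a coalescent point process: sampled tips are added
    sequentially, each new tip attaching at an i.i.d. coalescence depth
    drawn from depth_sampler; the tree stops at the first depth exceeding
    the stem age. Returns the node depths (one per adjacent tip pair)."""
    depths = []
    while True:
        d = depth_sampler(rng)
        if d > stem_age:
            return depths
        depths.append(d)

rng = random.Random(42)
# Exponential(1) node depths against stem age T = 2.0 (illustrative).
depths = simulate_cpp(2.0, lambda r: r.expovariate(1.0), rng)
n_tips = len(depths) + 1
```

Because the depths are i.i.d., the number of tips is geometric, which is what makes simulation (and likelihood evaluation) under this representation so efficient.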
Quo vadis, analytical chemistry?
Valcárcel, Miguel
2016-01-01
This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.
ERIC Educational Resources Information Center
Rienties, Bart; Boroowa, Avinash; Cross, Simon; Kubiak, Chris; Mayles, Kevin; Murphy, Sam
2016-01-01
There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we will work towards developing a foundation of an Analytics4Action Evaluation Framework (A4AEF) that is…
Educational Approaches to Entrepreneurship in Higher Education: A View from the Swedish Horizon
ERIC Educational Resources Information Center
Hoppe, Magnus; Westerberg, Mats; Leffler, Eva
2017-01-01
Purpose: The purpose of this paper is to present and develop models of educational approaches to entrepreneurship that can provide complementary analytical structures to better study, enact and reflect upon the role of entrepreneurship in higher education. Design/methodology/approach: A general framework for entrepreneurship education is developed…
A Monitoring and Assessment Plan for the Youth Employment and Demonstration Projects Act of 1977.
ERIC Educational Resources Information Center
Employment and Training Administration (DOL), Washington, DC.
Intended as a general blueprint for monitoring and assessing activities under the Youth Employment and Demonstration Projects Act of 1977, this document discusses the expected constraints, evaluation and assessment tools, the analytic framework, and monitoring and review schedule. Five problem areas are recognized as potential constraints in…
Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría
2018-03-24
Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and trade globalization of this product. To ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them have sparked considerable debate over the risk they may pose to the harmonization of international olive oil trade standards. Within this context, this review provides a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulatory systems, their thresholds, and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances that overcome drawbacks and limitations of the official methods for evaluating olive oil quality and determining possible adulterations are reviewed. Furthermore, the latest trends in analytical approaches to assessing the geographical and varietal origin traceability of olive oil are also examined.
Schwartz, Rachel S; Mueller, Rachel L
2010-01-11
Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. 
Our reanalysis of empirical data demonstrates the magnitude of the effects of Bayesian branch length misestimation on divergence date estimates. Because branch lengths for empirical datasets can be estimated most reliably in an ML framework when branches are <1 substitution/site and datasets are ≥1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.
Burns, K C; Zotz, G
2010-02-01
Epiphytes are an important component of many forested ecosystems, yet our understanding of epiphyte communities lags far behind that of terrestrial-based plant communities. This discrepancy is exacerbated by the lack of a theoretical context to assess patterns in epiphyte community structure. We attempt to fill this gap by developing an analytical framework to investigate epiphyte assemblages, which we then apply to a data set on epiphyte distributions in a Panamanian rain forest. On a coarse scale, interactions between epiphyte species and host tree species can be viewed as bipartite networks, similar to pollination and seed dispersal networks. On a finer scale, epiphyte communities on individual host trees can be viewed as meta-communities, or suites of local epiphyte communities connected by dispersal. Similar analytical tools are typically employed to investigate species interaction networks and meta-communities, thus providing a unified analytical framework to investigate coarse-scale (network) and fine-scale (meta-community) patterns in epiphyte distributions. Coarse-scale analysis of the Panamanian data set showed that most epiphyte species interacted with fewer host species than expected by chance. Fine-scale analyses showed that epiphyte species richness on individual trees was lower than null model expectations. Therefore, epiphyte distributions were clumped at both scales, perhaps as a result of dispersal limitations. Scale-dependent patterns in epiphyte species composition were observed. Epiphyte-host networks showed evidence of negative co-occurrence patterns, which could arise from adaptations among epiphyte species to avoid competition for host species, while most epiphyte meta-communities were distributed at random. Application of our "meta-network" analytical framework in other locales may help to identify general patterns in the structure of epiphyte assemblages and their variation in space and time.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY: Privacy Office... Homeland Security/U.S. Customs and Border Protection, DHS/CBP--017 Analytical Framework for Intelligence... Analytical Framework for Intelligence (AFI) System of Records'' from one or more provisions of the Privacy...
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA) are proposed. Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists. PMID:25742012
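The map/shuffle/reduce pattern at the core of this framework can be illustrated with a minimal single-process sketch. The actual system runs on Hadoop/HBase across distributed computers; the record layout and grid-cell names below are purely illustrative and not taken from the paper.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (grid_cell, value) pairs from raw observations."""
    for cell, value in records:
        yield cell, value

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the MapReduce runtime would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each grid cell's values into a mean."""
    return {cell: sum(vals) / len(vals) for cell, vals in groups.items()}

# Toy "geoscience" records: (grid cell id, temperature reading).
records = [("A1", 10.0), ("A1", 14.0), ("B2", 20.0), ("B2", 22.0), ("B2", 24.0)]
means = reduce_phase(shuffle(map_phase(records)))  # per-cell means
```

In the distributed setting the shuffle is performed by the framework across machines; the programmer supplies only the map and reduce functions, which is what makes the pattern attractive for parallel geoscience processing.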
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed that arguments be understood as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach is compared with other analytical models to demonstrate the explanatory power and depth of the model-based perspective. Primarily, Toulmin's framework for the structural analysis of arguments is contrasted with the approach presented here. It is demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.
Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris
2017-12-15
Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load, and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); and (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios were considered to test its robustness. The results demonstrate that the optimization framework is feasible, and that optimization is fast when started from the preliminary scheme. The optimized scheme is better than the preliminary scheme at reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
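The AHP ranking step used in the analytical module can be sketched as follows: priorities are the normalized principal eigenvector of a pairwise-comparison matrix, approximated here by power iteration. The three criteria and the judgment values below are hypothetical illustrations, not figures from the paper (which uses flood depth and flood duration as its two indicators).

```python
def ahp_priorities(M, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; normalized, it gives the AHP priority weights."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgments: flood depth judged 3x as important as flood
# duration and 5x as important as cost; duration 2x as important as cost.
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
weights = ahp_priorities(M)   # sums to 1; ordered depth > duration > cost
```

In practice each flooding node would be scored against these criterion weights, producing the ranking that seeds the preliminary storage scheme.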
Nonlinear analysis of structures. [within framework of finite element method
NASA Technical Reports Server (NTRS)
Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.
1974-01-01
The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three-dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.
ERIC Educational Resources Information Center
Cloud, Sherrill
An issues-oriented framework for selecting data to support typical state-level planning analyses is presented. Attention is focused on questions concerning the planning issues and on the general data needed to support a given state's analytical questions and decision requirements. State-level planning issues in postsecondary education…
ERIC Educational Resources Information Center
Moore, Janette; Smith, Gillian W.; Shevlin, Mark; O'Neill, Francis A.
2010-01-01
An alternative models framework was used to test three confirmatory factor analytic models for the Short Leyton Obsessional Inventory-Children's Version (Short LOI-CV) in a general population sample of 517 young adolescent twins (11-16 years). A one-factor model as implicit in current classification systems of Obsessive-Compulsive Disorder (OCD),…
The Challenge of Separating Effects of Simultaneous Education Projects on Student Achievement
ERIC Educational Resources Information Center
Ma, Xin; Ma, Lingling
2009-01-01
When multiple education projects operate in an overlapping or rear-ended manner, it is always a challenge to separate unique project effects on schooling outcomes. Our analysis represents a first attempt to address this challenge. A three-level hierarchical linear model (HLM) was presented as a general analytical framework to separate program…
A new model for fluid velocity slip on a solid surface.
Shu, Jian-Jun; Teo, Ji Bin Melvin; Chan, Weng Kong
2016-10-12
A general adsorption model is developed to describe the interactions between near-wall fluid molecules and solid surfaces. This model serves as a framework for the theoretical modelling of boundary slip phenomena. Based on this adsorption model, a new general model for the slip velocity of fluids on solid surfaces is introduced. The slip boundary condition at a fluid-solid interface has hitherto been considered separately for gases and liquids. In this paper, we show that the slip velocity in both gases and liquids may originate from dynamical adsorption processes at the interface. A unified analytical model that is valid for both gas-solid and liquid-solid slip boundary conditions is proposed based on surface science theory. Corroboration with experimental data from the literature shows that the proposed model improves on existing analytical models for gases at higher shear rates and agrees closely with data for liquid-solid interfaces in general.
Quality Indicators for Learning Analytics
ERIC Educational Resources Information Center
Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus
2014-01-01
This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…
Multiaxis sensing using metal organic frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois
2017-01-17
A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.
ERIC Educational Resources Information Center
Drachsler, H.; Kalz, M.
2016-01-01
The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…
Optimizing cosmological surveys in a crowded market
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.
2005-04-01
Optimizing the major next-generation cosmological surveys (such as SNAP, KAOS, etc.) is a key problem given our ignorance of the physics underlying cosmic acceleration and the plethora of surveys planned. We propose a Bayesian design framework which (1) maximizes the discrimination power of a survey without assuming any underlying dark-energy model, (2) finds the best niche survey geometry given current data and future competing experiments, (3) maximizes the cross section for serendipitous discoveries and (4) can be adapted to answer specific questions (such as “is dark energy dynamical?”). Integrated parameter-space optimization (IPSO) is a design framework that integrates projected parameter errors over an entire dark energy parameter space and then extremizes a figure of merit (such as Shannon entropy gain which we show is stable to off-diagonal covariance matrix perturbations) as a function of survey parameters using analytical, grid or MCMC techniques. We discuss examples where the optimization can be performed analytically. IPSO is thus a general, model-independent and scalable framework that allows us to appropriately use prior information to design the best possible surveys.
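For Gaussian posteriors, the Shannon entropy gain used as a figure of merit reduces to half the log-ratio of prior to posterior covariance determinants, with the posterior inverse covariance being the prior inverse covariance plus the survey's Fisher matrix. A minimal two-parameter sketch; the covariance and Fisher-matrix values are purely illustrative, not taken from the paper.

```python
import math

def det2(A):
    """Determinant of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inv2(A):
    """Inverse of a 2x2 matrix."""
    d = det2(A)
    return [[ A[1][1] / d, -A[0][1] / d],
            [-A[1][0] / d,  A[0][0] / d]]

def add2(A, B):
    """Elementwise sum of two 2x2 matrices."""
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def entropy_gain(C_prior, F_survey):
    """Shannon entropy gain (nats) for a Gaussian posterior over two
    parameters when a survey adds Fisher information F_survey:
    gain = 0.5 * ln(det C_prior / det C_post),
    with C_post^{-1} = C_prior^{-1} + F_survey."""
    C_post = inv2(add2(inv2(C_prior), F_survey))
    return 0.5 * math.log(det2(C_prior) / det2(C_post))

# Illustrative prior covariance on two dark-energy parameters, and Fisher
# matrices for two hypothetical survey designs (B is twice as informative).
C_prior = [[0.04, 0.0], [0.0, 0.25]]
F_a = [[100.0, 0.0], [0.0, 10.0]]
F_b = [[200.0, 0.0], [0.0, 20.0]]
gain_a = entropy_gain(C_prior, F_a)
gain_b = entropy_gain(C_prior, F_b)   # more information => larger gain
```

IPSO would integrate such a figure of merit over the whole dark-energy parameter space and extremize it over survey parameters; this sketch only shows the entropy-gain kernel at a single point.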
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions, were carefully investigated for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
ERIC Educational Resources Information Center
Lalvani, Priya; Broderick, Alicia A.; Fine, Michelle; Jacobowitz, Tina; Michelli, Nicholas
2015-01-01
In this analytic essay, we initiate a dialogue about the place of disability in a multicultural education framework, and the role of inclusive education in a democracy. Problematizing the common omission of the topic of disability oppression from anti-oppression pedagogies and from social justice education generally, we invite teacher educators…
Estimation of the limit of detection using information theory measures.
Fonollosa, Jordi; Vergara, Alexander; Huerta, Ramón; Marco, Santiago
2014-01-31
Definitions of the limit of detection (LOD) based on the probability of false positive and/or false negative errors have been proposed over the past years. Although such definitions are straightforward and valid for any kind of analytical system, proposed methodologies to estimate the LOD are usually simplified to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LOD provide the same amount of information on the source regardless of the prior probability of presenting a blank/analyte sample. Based upon an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from an analytical system depends on the probability of presenting the two different possible states. We propose a new definition of LOD, using information theory tools, that deals with noise of any kind and allows prior knowledge to be introduced easily. Unlike most traditional LOD estimation approaches, the proposed definition is based on the amount of information that the chemical instrumentation system provides about the chemical information source. Our findings indicate that benchmarking analytical systems by their ability to provide information about the presence/absence of the analyte (our proposed approach) is a more general and appropriate framework, one that converges to the usual values when dealing with Gaussian noise. Copyright © 2013 Elsevier B.V. All rights reserved.
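The dependence of extracted information on the prior can be made concrete by treating the analytical system as a binary asymmetric channel and computing the mutual information I(X;Y) between the true state (blank/analyte) and the reported detection. The error rates and priors below are illustrative, not values from the paper.

```python
import math

def h(p):
    """Binary entropy in bits (0*log 0 treated as 0)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(prior, fp, fn):
    """I(X;Y) in bits for a binary asymmetric channel:
    X = blank/analyte with P(analyte) = prior,
    P(detect | blank) = fp, P(miss | analyte) = fn."""
    p_detect = prior * (1 - fn) + (1 - prior) * fp   # P(Y = detect)
    # I(X;Y) = H(Y) - H(Y|X)
    return h(p_detect) - (prior * h(1 - fn) + (1 - prior) * h(fp))

# Identical error rates (5% false positives, 5% false negatives) but
# different priors: the information the measurement conveys differs sharply.
i_balanced = mutual_information(0.50, 0.05, 0.05)  # analyte half the time
i_rare     = mutual_information(0.01, 0.05, 0.05)  # analyte rarely present
```

The same instrument extracts far less information when the analyte is rare, which is exactly the point the abstract makes against comparing systems by error-rate-based LOD alone.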
Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl
2010-01-01
The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. 
Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.
Evaluation of generalized degrees of freedom for sparse estimation by replica method
NASA Astrophysics Data System (ADS)
Sakata, A.
2016-12-01
We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
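The interpretation of GDF as the effective fraction of non-zero components can be checked numerically in the simplest sparse setting: soft-thresholding under an orthogonal design, where Stein's lemma gives GDF equal to the expected number of non-zero fitted components. The Monte Carlo sketch below uses the covariance definition of GDF directly; signal values and the threshold are illustrative, and this is not the replica calculation of the paper.

```python
import random

def soft_threshold(y, lam):
    """Lasso fit under an orthogonal design: componentwise soft-thresholding."""
    return [max(abs(v) - lam, 0.0) * (1 if v > 0 else -1) for v in y]

def gdf_monte_carlo(mu, lam, sigma=1.0, reps=20000, seed=0):
    """Estimate GDF = sum_i cov(muhat_i, y_i) / sigma^2 by simulation."""
    rng = random.Random(seed)
    n = len(mu)
    ys, fits = [], []
    for _ in range(reps):
        y = [m + rng.gauss(0.0, sigma) for m in mu]
        ys.append(y)
        fits.append(soft_threshold(y, lam))
    gdf = 0.0
    for i in range(n):
        ybar = sum(y[i] for y in ys) / reps
        fbar = sum(f[i] for f in fits) / reps
        cov = sum((y[i] - ybar) * (f[i] - fbar)
                  for y, f in zip(ys, fits)) / reps
        gdf += cov / sigma**2
    return gdf

# Three strong signals and two nulls: GDF should land near the expected
# number of components surviving the threshold (roughly 3 here).
mu = [5.0, 5.0, 5.0, 0.0, 0.0]
gdf = gdf_monte_carlo(mu, lam=1.5)
```

The estimate tracks the expected count of non-zero components, which is the finite-size analogue of the universal formula the replica analysis derives in the large-system limit.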
Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.
Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil
2015-11-18
Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. Our objective is to design, implement, evaluate, and deploy an extendable, big-data-compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios, utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrate the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds, and its patients can be classified generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which yields 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. There will always be 34.9 patients in SickKids NICU on average. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decisions support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution.
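The reported census figures are broadly consistent with a simple loss-queue calculation. Assuming Poisson arrivals and no waiting room (assumptions of this sketch, not necessarily of the paper's analytical model), the Erlang-B formula applied to the abstract's numbers (36 beds, 4.5 arrivals/day, 16-day mean stay) gives a mean census near the reported 34.9 patients.

```python
def erlang_b(servers, offered_load):
    """Erlang-B blocking probability via the standard recursion
    B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Figures from the abstract: 36 beds, 4.5 arrivals/day, 16-day mean stay.
offered = 4.5 * 16                    # 72 erlangs of offered load
blocking = erlang_b(36, offered)      # fraction of arrivals turned away
census = offered * (1 - blocking)     # mean occupied beds (carried load)
```

With offered load double the bed count, roughly half of arrivals are blocked and the unit runs near capacity, which is qualitatively in line with the abstract's statement that a large share of patients cannot be admitted.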
NASA Technical Reports Server (NTRS)
Bhadra, Dipasis; Morser, Frederick R.
2006-01-01
In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. Drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments while taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to incorporate multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire investment program.
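The risk-return frontier invoked above is the classical mean-variance construction from corporate finance. A minimal sketch, assuming purely illustrative return and covariance figures for three hypothetical NAS programs (the abstract publishes no parameter values), computes minimum-variance portfolios for a range of target returns:

```python
import numpy as np

# Hypothetical expected annual returns and covariance for three programs
# (illustrative numbers only; not taken from the paper).
mu = np.array([0.05, 0.08, 0.12])
cov = np.array([[0.010, 0.002, 0.001],
                [0.002, 0.030, 0.004],
                [0.001, 0.004, 0.060]])

def min_variance_weights(target_return):
    """Closed-form minimum-variance weights subject to w.1 = 1 and
    w.mu = target_return (shorting allowed, standard Markowitz algebra)."""
    inv = np.linalg.inv(cov)
    ones = np.ones_like(mu)
    a = ones @ inv @ ones
    b = ones @ inv @ mu
    c = mu @ inv @ mu
    det = a * c - b * b
    lam = (c - b * target_return) / det
    gam = (a * target_return - b) / det
    return inv @ (lam * ones + gam * mu)

for r in (0.06, 0.08, 0.10):
    w = min_variance_weights(r)
    risk = np.sqrt(w @ cov @ w)
    print(f"target {r:.0%}: weights {np.round(w, 3)}, risk {risk:.3f}")
```

Sweeping the target return and plotting risk against return traces out the efficient frontier the abstract refers to.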
On the pursuit of a nuclear development capability: The case of the Cuban nuclear program
NASA Astrophysics Data System (ADS)
Benjamin-Alvarado, Jonathan Calvert
1998-09-01
While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology, to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and evaluates the policy against three models of modernization offered to explain the trajectory of policy development. These approaches are the politically motivated modernization model, the economic and technological modernization model, and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model specifies expected behavioral responses to external stimuli that would result in specific policy choices. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. 
Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically-motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.
Connecting the shadows: probing inner disk geometries using shadows in transitional disks
NASA Astrophysics Data System (ADS)
Min, M.; Stolker, T.; Dominik, C.; Benisty, M.
2017-08-01
Aims: Shadows in transitional disks are generally interpreted as signs of a misaligned inner disk. This disk is usually beyond the reach of current-day high-contrast imaging facilities. However, the location and morphology of the shadow features allow us to reconstruct the inner disk geometry. Methods: We derive analytic equations for the locations of the shadow features as a function of the orientation of the inner and outer disk and the height of the outer disk wall. In contrast to previous claims in the literature, we show that the position angle of the line connecting the shadows cannot be directly related to the position angle of the inner disk. Results: We show how the analytic framework derived here can be applied to transitional disks with shadow features. We use estimates of the outer disk height to put constraints on the inner disk orientation. In contrast with the results from Long et al. (2017, ApJ, 838, 62), we derive that for the disk surrounding HD 100453 the analytic estimates and interferometric observations result in a consistent picture of the orientation of the inner disk. Conclusions: The elegant consistency in our analytic framework between observation and theory strongly supports both the interpretation of the shadow features as coming from a misaligned inner disk and the diagnostic value of near-infrared interferometry for inner disk geometry.
Diffusion in the special theory of relativity.
Herrmann, Joachim
2009-11-01
The Markovian diffusion theory is generalized within the framework of the special theory of relativity. Since the velocity space in relativity is a hyperboloid, the mathematical stochastic calculus on Riemannian manifolds can be applied, adapted here to the velocity space. A generalized Langevin equation in the fiber space of position, velocity, and orthonormal velocity frames is defined, from which the generalized relativistic Kramers equation in phase space in external force fields is derived. The obtained diffusion equation is invariant under Lorentz transformations, and its stationary solution is given by the Jüttner distribution. In addition, a nonstationary analytical solution is derived for the example of force-free relativistic diffusion.
NASA Astrophysics Data System (ADS)
Mimasu, Ken; Sanz, Verónica; Williams, Ciaran
2016-08-01
We present predictions for the associated production of a Higgs boson at NLO+PS accuracy, including the effect of anomalous interactions between the Higgs and gauge bosons. We present our results in different frameworks: one in which the interaction vertex between the Higgs boson and Standard Model W and Z bosons is parameterized in terms of general Lorentz structures, and one in which electroweak symmetry breaking is manifestly linear and the resulting operators arise through a dimension-six effective field theory framework. We present analytic calculations of the Standard Model and Beyond the Standard Model contributions, and discuss the phenomenological impact of the higher-order pieces. Our results are implemented in the NLO Monte Carlo program MCFM and interfaced to shower Monte Carlos through the POWHEG BOX framework.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...
ERIC Educational Resources Information Center
Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena
2017-01-01
The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…
Degrees of School Democracy: A Holistic Framework
ERIC Educational Resources Information Center
Woods, Philip A.; Woods, Glenys J.
2012-01-01
This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…
Generalized Hill-stability criteria for hierarchical three-body systems at arbitrary inclinations
NASA Astrophysics Data System (ADS)
Grishin, Evgeni; Perets, Hagai B.; Zenati, Yossef; Michaely, Erez
2017-04-01
A fundamental aspect of the three-body problem is its stability. Most stability studies have focused on the co-planar three-body problem, deriving analytic criteria for the dynamical stability of such pro/retrograde systems. Numerical studies of inclined systems have phenomenologically mapped their stability regions, but have neither complemented this mapping with a theoretical framework nor provided a satisfactory fit for its dependence on mutual inclination. Here we present a novel approach to study the stability of hierarchical three-body systems at arbitrary inclinations, which accounts not only for the instantaneous stability of such systems but also for their secular stability and evolution through Lidov-Kozai cycles and evection. We generalize the Hill-stability criteria to arbitrarily inclined triple systems, explain the existence of quasi-stable regimes, and characterize the inclination dependence of their stability. We complement the analytic treatment with an extensive numerical study to test our analytic results. We find excellent correspondence up to high inclinations (~120°), beyond which the agreement is marginal. At such high inclinations the stability radius is larger, the ratio between the outer and inner periods becomes comparable, and our secular averaging approach is no longer strictly valid. We therefore combine our analytic results with polynomial fits to the numerical results to obtain a generalized stability formula for triple systems at arbitrary inclinations. Besides providing a generalized secular-based physical explanation for the stability of non-co-planar systems, our results have direct implications for any triple system and, in particular, for binary planets and moon/satellite systems; we briefly discuss the latter as a test case for our models.
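For orientation, the classical co-planar Hill criterion that this work generalizes can be sketched in a few lines. The safety margin and the Earth-Moon example below are our own illustrative choices, not the paper's inclination-dependent fitted criterion:

```python
import math

def hill_radius(a_out, e_out, m_inner, m_outer):
    """Classical Hill radius of the inner binary around its host, evaluated
    at the perturber's pericentre (standard co-planar approximation)."""
    return a_out * (1.0 - e_out) * (m_inner / (3.0 * m_outer)) ** (1.0 / 3.0)

def hill_stable(a_in, a_out, e_out, m_inner, m_outer, safety=0.5):
    """Rough criterion: the inner orbit should fit well inside the Hill
    radius; `safety` is an illustrative margin, not the paper's fit."""
    return a_in < safety * hill_radius(a_out, e_out, m_inner, m_outer)

# Earth-Moon around the Sun (masses in solar masses, distances in au)
r_hill = hill_radius(1.0, 0.0167, 3.0e-6, 1.0)
print(f"Earth Hill radius ~ {r_hill:.4f} au")
print("Moon Hill-stable:", hill_stable(0.00257, 1.0, 0.0167, 3.0e-6, 1.0))
```

The paper's contribution is to replace the constant `safety`-style margin with an inclination-dependent criterion valid for arbitrarily inclined triples.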
Spacecraft drag-free technology development: On-board estimation and control synthesis
NASA Technical Reports Server (NTRS)
Key, R. W.; Mettler, E.; Milman, M. H.; Schaechter, D. B.
1982-01-01
Estimation and control methods for a drag-free spacecraft are discussed. The functional and analytical synthesis of on-board estimators and controllers for an integrated attitude and translation control system is presented. The framework for detailed definition and design of the baseline drag-free system is created. The techniques for solving the self-gravity and electrostatic charging problems are generally applicable, as is the control system development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
Analytical learning and term-rewriting systems
NASA Technical Reports Server (NTRS)
Laird, Philip; Gamble, Evan
1990-01-01
Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term-rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term-rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
Benjamini, Dan; Basser, Peter J
2014-12-07
In this work, we present an experimental design and analytical framework to measure the nonparametric joint radius-length (R-L) distribution of an ensemble of parallel, finite cylindrical pores, and more generally, the eccentricity distribution of anisotropic pores. Employing a novel 3D double pulsed-field gradient acquisition scheme, we first obtain both the marginal radius and length distributions of a population of cylindrical pores and then use these to constrain and stabilize the estimate of the joint radius-length distribution. Using the marginal distributions as constraints allows the joint R-L distribution to be reconstructed from an underdetermined system (i.e., more variables than equations), which requires a relatively small and feasible number of MR acquisitions. Three simulated representative joint R-L distribution phantoms corrupted by different noise levels were reconstructed to demonstrate the process, using this new framework. As expected, the broader the peaks in the joint distribution, the less stable and more sensitive to noise the estimation of the marginal distributions. Nevertheless, the reconstruction of the joint distribution is remarkably robust to increases in noise level; we attribute this characteristic to the use of the marginal distributions as constraints. Axons are known to exhibit local compartment eccentricity variations upon injury; the extent of the variations depends on the severity of the injury. Nonparametric estimation of the eccentricity distribution of injured axonal tissue is of particular interest since generally one cannot assume a parametric distribution a priori. Reconstructing the eccentricity distribution may provide vital information about changes resulting from injury or that occurred during development.
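The stabilizing role of the marginal constraints can be illustrated with a toy linear inversion. The random-projection "acquisition" below is a stand-in for the actual 3D double pulsed-field gradient MR encoding; the point is only that appending consistent marginal equations to an underdetermined system can never worsen, and typically improves, a minimum-norm reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)
nR, nL = 6, 6  # radius/length bins (coarse, for illustration)

# Ground-truth joint R-L distribution: a smooth off-centre peak.
R, L = np.meshgrid(np.arange(nR), np.arange(nL), indexing="ij")
truth = np.exp(-((R - 2) ** 2 + (L - 3) ** 2) / 4.0)
truth /= truth.sum()
x_true = truth.ravel()

# Underdetermined "acquisition": 12 random linear projections of 36 unknowns.
A = rng.standard_normal((12, nR * nL))
y = A @ x_true

# Marginal constraints: row sums give the R marginal, column sums the L marginal.
C_R = np.kron(np.eye(nR), np.ones((1, nL)))   # sums over L for each R bin
C_L = np.kron(np.ones((1, nR)), np.eye(nL))   # sums over R for each L bin
C = np.vstack([C_R, C_L])
m = np.concatenate([truth.sum(axis=1), truth.sum(axis=0)])

def min_norm_solve(M, rhs):
    """Minimum-norm least-squares solution (what lstsq returns here)."""
    return np.linalg.lstsq(M, rhs, rcond=None)[0]

x_plain = min_norm_solve(A, y)                                   # no constraints
x_con = min_norm_solve(np.vstack([A, C]), np.concatenate([y, m]))  # with marginals

err_plain = np.linalg.norm(x_plain - x_true)
err_con = np.linalg.norm(x_con - x_true)
print(f"error without marginals: {err_plain:.4f}, with marginals: {err_con:.4f}")
```

Because both systems are consistent and the constrained solution set is a subset of the unconstrained one, the constrained minimum-norm estimate is provably at least as close to the truth.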
Tidally induced residual current over the Malin Sea continental slope
NASA Astrophysics Data System (ADS)
Stashchuk, Nataliya; Vlasenko, Vasiliy; Hosegood, Phil; Nimmo-Smith, W. Alex M.
2017-05-01
Tidally induced residual currents generated over shelf-slope topography are investigated analytically and numerically using the Massachusetts Institute of Technology general circulation model. Observational support for the presence of such a slope current was recorded over the Malin Sea continental slope during the 88th cruise of the RRS "James Cook" in July 2013. A simple analytical formula developed here in the framework of time-averaged shallow water equations has been validated against a fully nonlinear nonhydrostatic numerical solution. A good agreement between analytical and numerical solutions is found for a wide range of input parameters of the tidal flow and bottom topography. In application to the Malin Shelf area, both the numerical model and the analytical solution predicted a northward-moving current confined to the slope, with its core located above the 400 m isobath and with vertically averaged maximum velocities up to 8 cm/s, which is consistent with the in-situ data recorded at three moorings and along cross-slope transects.
Stochastic Kuramoto oscillators with discrete phase states.
Jörg, David J
2017-09-01
We present a generalization of the Kuramoto phase oscillator model in which phases advance in discrete phase increments through Poisson processes, rendering both intrinsic oscillations and coupling inherently stochastic. We study the effects of phase discretization on the synchronization and precision properties of the coupled system both analytically and numerically. Remarkably, many key observables such as the steady-state synchrony and the quality of oscillations show distinct extrema while converging to the classical Kuramoto model in the limit of a continuous phase. The phase-discretized model provides a general framework for coupled oscillations in a Markov chain setting.
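A minimal simulation of this model class can be sketched as follows. Phases live on a ring of discrete states and hop forward through Poisson-like events; the particular coupling-modulated hop rate used here is our assumption for illustration, not necessarily the rate function of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_osc=50, n_states=20, omega=1.0, K=0.0, t_end=20.0, dt=0.01):
    """Tau-leaping simulation of Poisson phase-increment oscillators.
    Each oscillator hops forward by 2*pi/n_states at a rate modulated by
    a Kuramoto-type coupling term; returns the final order parameter."""
    dphi = 2 * np.pi / n_states
    base = omega / dphi                 # mean hop rate -> mean frequency omega
    phase = np.zeros(n_osc)             # start fully synchronized
    for _ in range(int(t_end / dt)):
        # drive[i] = mean_j sin(phi_j - phi_i): speeds up lagging oscillators
        drive = np.sin(phase[None, :] - phase[:, None]).mean(axis=1)
        rate = np.clip(base * (1.0 + K * drive), 0.0, None)
        hops = rng.random(n_osc) < rate * dt
        phase[hops] += dphi
    return np.abs(np.exp(1j * phase).mean())   # Kuramoto order parameter

r_free = simulate(K=0.0)
r_coupled = simulate(K=2.0)
print(f"order parameter: uncoupled {r_free:.2f}, coupled {r_coupled:.2f}")
```

Without coupling, the independent Poisson clocks diffuse the phases apart; with coupling, synchrony is maintained, mirroring the synchronization behaviour studied in the paper.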
Wave Functions for Time-Dependent Dirac Equation under GUP
NASA Astrophysics Data System (ADS)
Zhang, Meng-Yao; Long, Chao-Yun; Long, Zheng-Wen
2018-04-01
In this work, the time-dependent Dirac equation is investigated within the generalized uncertainty principle (GUP) framework. It is possible to construct exact solutions of the Dirac equation when the time-dependent potentials satisfy the proper conditions. In (1+1) dimensions, the analytical wave functions of the Dirac equation under the GUP have been obtained for two kinds of time-dependent potentials. Supported by the National Natural Science Foundation of China under Grant No. 11565009
Test-particle dynamics in general spherically symmetric black hole spacetimes
NASA Astrophysics Data System (ADS)
De Laurentis, Mariafelicia; Younsi, Ziri; Porth, Oliver; Mizuno, Yosuke; Rezzolla, Luciano
2018-05-01
To date, the most precise tests of general relativity have been achieved through pulsar timing, albeit in the weak-field regime. Since pulsars are some of the most precise and stable "clocks" in the Universe, present observational efforts are focused on detecting pulsars in the vicinity of supermassive black holes (most notably in the Galactic Centre), enabling pulsar timing to be used as an extremely precise probe of strong-field gravity. In this paper, a mathematical framework to describe test-particle dynamics in general black-hole spacetimes is presented and subsequently used to study a binary system comprising a pulsar orbiting a black hole. In particular, taking into account the parameterization of a general spherically symmetric black-hole metric, general analytic expressions for both the advance of the periastron and for the orbital period of a massive test particle are derived. Furthermore, these expressions are applied to four representative cases of solutions arising in both general relativity and in alternative theories of gravity. Finally, this framework is applied to the Galactic center S-stars and four distinct pulsar toy models. It is shown that by adopting a fully general-relativistic description of test-particle motion which is independent of any particular theory of gravity, observations of pulsars can help impose better constraints on alternative theories of gravity than is presently possible.
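Any such analytic expression for the periastron advance must reduce, in the general-relativity limit, to the classical leading-order formula, which can be checked against Mercury's well-known perihelion precession:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg

def periastron_advance(a, e, M=M_sun):
    """Leading-order GR periastron advance per orbit, in radians:
    delta_phi = 6*pi*G*M / (c^2 * a * (1 - e^2))."""
    return 6 * math.pi * G * M / (c**2 * a * (1 - e**2))

# Mercury: semi-major axis 5.791e10 m, eccentricity 0.2056, period 87.97 days
dphi = periastron_advance(5.791e10, 0.2056)
orbits_per_century = 36525.0 / 87.97
arcsec = math.degrees(dphi) * 3600 * orbits_per_century
print(f"Mercury perihelion advance ~ {arcsec:.1f} arcsec/century")
```

The result of roughly 43 arcseconds per century is the benchmark value; the parameterized metrics studied in the paper modify this expression away from general relativity.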
Rachid, G; El Fadel, M
2013-08-15
This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application, and structure of existing systems based on country-specific legal, institutional, and procedural frameworks. The analysis is coupled with the multi-attribute decision-making (MADM) method within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA systems through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varying levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine
2014-01-01
This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…
Cammi, R
2009-10-28
We present a general formulation of coupled-cluster (CC) theory for a molecular solute described within the framework of the polarizable continuum model (PCM). The PCM-CC theory is derived in its complete form, called the PTDE scheme, in which the correlated electronic density is used to obtain a self-consistent reaction field, and in an approximate form, called the PTE scheme, in which the PCM-CC equations are solved assuming a fixed Hartree-Fock solvent reaction field. Explicit forms of the PCM-CC-PTDE equations are derived at the single- and double-excitation (CCSD) level of the cluster operator. At the same level, explicit equations for the analytical first derivatives of the PCM basic energy functional are presented, and analytical second derivatives are also discussed. The corresponding PCM-CCSD-PTE equations are given as a special case of the full theory.
Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.
Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin
2013-09-01
It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write the analytics but are unclear on how to make them work in real time on high-velocity data. Our paper focuses on the applications relevant to a healthcare analytics scenario, specifically the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine learning-based categorizer that runs within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
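The streaming-categorization idea can be sketched with a deliberately simple stand-in for the paper's machine-learning categorizer and Storm topology: a generator that flags any beat whose RR interval deviates sharply from a rolling mean. The window size, threshold, and detection rule are illustrative assumptions only:

```python
from collections import deque

def rr_irregularity_stream(rr_intervals_ms, window=8, threshold=0.15):
    """Yield (rr, flag) pairs; flag is True when the latest RR interval
    deviates from the rolling mean by more than `threshold` (fractional).
    A toy stand-in for the ML-based categorizer described in the paper."""
    recent = deque(maxlen=window)
    for rr in rr_intervals_ms:
        flag = False
        if len(recent) == recent.maxlen:
            mean = sum(recent) / len(recent)
            flag = abs(rr - mean) / mean > threshold
        recent.append(rr)
        yield rr, flag

# Regular rhythm at ~800 ms with one premature beat at 500 ms
stream = [800, 810, 790, 805, 795, 800, 808, 792, 500, 800]
flags = [flag for _, flag in rr_irregularity_stream(stream)]
print(flags)
```

In a Storm-style deployment, each yielded tuple would correspond to a bolt emitting a classification downstream; the generator keeps the per-beat state (the rolling window) local, as a streaming operator must.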
Visual Analytics of integrated Data Systems for Space Weather Purposes
NASA Astrophysics Data System (ADS)
Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo
Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key for studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time-series generalization introducing the concept of a generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this novel representation approach, each generalized numerical lattice carries post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size, and a post-analytical measure (e.g., autocorrelation, Hurst exponent)[1]. From this representation generalization, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time analysis expert system. In this particular application, we have selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is performed in the framework of a radio-burst automatic monitoring system. Our results may characterize the variability pattern evolution by computing the DFA scaling exponent, scanning the time series with a short window before the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented. 
The prototype for visual analytics is implemented in a Compute Unified Device Architecture (CUDA) by using the K20 Nvidia graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al. doi: 10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al. doi:http://dx.doi.org/10.1016/j.jastp.2010.09.030, 2011.
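The DFA scaling exponent at the heart of the monitoring scheme is straightforward to sketch. The order-1 variant below (cumulative profile, piecewise linear detrending, log-log fit) recovers the expected exponent of about 0.5 for white noise; the solar-burst data and the GPU monitoring system are of course not reproduced here:

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis (order 1): returns the scaling
    exponent alpha from a log-log fit of fluctuation vs. window size."""
    profile = np.cumsum(signal - np.mean(signal))
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        f2 = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)              # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(42)
white = rng.standard_normal(4096)
alpha = dfa(white, [16, 32, 64, 128, 256])
print(f"white-noise DFA exponent ~ {alpha:.2f}")
```

Scanning a short sliding window over a burst time series and tracking this exponent is, in essence, the variability-pattern computation the abstract describes.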
Gkoumas, Spyridon; Villanueva-Perez, Pablo; Wang, Zhentian; Romano, Lucia; Abis, Matteo; Stampanoni, Marco
2016-01-01
In X-ray grating interferometry, dark-field contrast arises due to partial extinction of the detected interference fringes. This is also called visibility reduction and is attributed to small-angle scattering from unresolved structures in the imaged object. In recent years, analytical quantitative frameworks of dark-field contrast have been developed for highly diluted monodisperse microsphere suspensions with volume fractions of at most 6%. These frameworks assume that scattering particles are separated by large enough distances to make any interparticle scattering interference negligible. In this paper, we start from the small-angle scattering intensity equation and, by linking Fourier and real space, introduce the structure factor, thus extending the analytical and experimental quantitative interpretation of dark-field contrast to a range of suspensions with volume fractions reaching 40%. The structure factor accounts for interparticle scattering interference. Without introducing any additional fitting parameters, we successfully predict the experimental values measured at the TOMCAT beamline, Swiss Light Source. Finally, we apply this theoretical framework to an experiment probing a range of system correlation lengths by acquiring dark-field images at different energies. The proposed method has the potential to be applied in single-shot mode using a polychromatic X-ray tube setup and a single-photon-counting energy-resolving detector. PMID:27734931
Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework
ERIC Educational Resources Information Center
Ranjan, Jayanthi; Bhatnagar, Vishal
2011-01-01
Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…
RT-18: Value of Flexibility. Phase 1
2010-09-25
During this period, we explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory... This report presents our advances in... a framework that is mathematically consistent, domain independent, and applicable under varying information levels.
De Neve, Jan-Walter; Boudreaux, Chantelle; Gill, Roopan; Geldsetzer, Pascal; Vaikath, Maria; Bärnighausen, Till; Bossert, Thomas J
2017-07-03
Many countries have created community-based health worker (CHW) programs for HIV. In most of these countries, several national and non-governmental initiatives have been implemented, raising questions of how well these different approaches address the health problems and use health resources in a compatible way. While these questions have led to a general policy initiative to promote harmonization across programs, there is a need for countries to develop a more coherent and organized approach to CHW programs and to generate evidence about the most efficient and effective strategies to ensure their optimal, sustained performance. We conducted a narrative review of the existing published and gray literature on the harmonization of CHW programs. We searched for and noted evidence on definitions, models, and/or frameworks of harmonization; theoretical arguments or hypotheses about the effects of CHW program fragmentation; and empirical evidence. Based on this evidence, we defined harmonization, introduced three priority areas for harmonization, and identified a conceptual framework for analyzing harmonization of CHW programs that can be used to support their expanding role in HIV service delivery. We identified and described the major issues and relationships surrounding the harmonization of CHW programs, including key characteristics, facilitators, and barriers for each of the priority areas of harmonization, and used our analytic framework to map overarching findings. In a separate article, we apply this approach to CHW programs supporting HIV services across four countries in Southern Africa. There is a large number and immense diversity of CHW programs for HIV, including the integration of HIV components into countries' existing national programs along with the development of multiple, stand-alone CHW programs. 
We defined (i) coordination among stakeholders, (ii) integration into the broader health system, and (iii) assurance of a CHW program's sustainability as priority areas of harmonization. While harmonization is likely a complex political process, with, in many cases, incremental steps toward improvement, a wide range of facilitators is available to decision-makers. These can be categorized using an analytic framework assessing the (i) health issue, (ii) intervention itself, (iii) stakeholders, (iv) health system, and (v) broad context. There is a need to address the fragmentation of CHW programs to advance and sustain CHW roles and responsibilities for HIV. This study provides a narrative review and analytic framework for understanding the process by which harmonization of CHW programs might be achieved and for testing the assumption that harmonization is needed to improve CHW performance.
Multiplicative Multitask Feature Learning
Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu
2016-01-01
We investigate a general framework of multiplicative multitask feature learning which decomposes each task’s model parameters into a multiplication of two components. One of the components is used across all tasks and the other component is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm, suitable for solving the entire family of formulations, is developed with a rigorous convergence analysis. Simulation studies have identified the statistical properties of data that favor the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparison with the state of the art, which provides instructive insights into the feature learning problem with multiple tasks. PMID:28428735
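The central decomposition can be sketched in a few lines of NumPy. Everything below (the synthetic data, the lasso/ridge penalties, and the gradient step sizes) is an illustrative toy under assumed regularizers, not the paper's exact algorithm: a shared component `c` multiplies elementwise into each task's weights, and `c` and the task-specific components are updated blockwise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: T tasks sharing a sparse support of relevant features.
T, n, d = 3, 60, 10
c_true = np.array([1.0] * 4 + [0.0] * 6)       # shared across-task component
X = [rng.normal(size=(n, d)) for _ in range(T)]
V = [rng.normal(size=d) for _ in range(T)]     # task-specific components
y = [X[t] @ (c_true * V[t]) + 0.01 * rng.normal(size=n) for t in range(T)]

def loss(c, Vs, lam1=0.1, lam2=0.1):
    """Squared error plus l1 penalty on c and l2 penalty on each v_t."""
    fit = sum(np.sum((X[t] @ (c * Vs[t]) - y[t]) ** 2) for t in range(T))
    return fit + lam1 * np.sum(np.abs(c)) + lam2 * sum(np.sum(v ** 2) for v in Vs)

# Blockwise coordinate descent: alternate (sub)gradient steps on each v_t, then c.
c = np.ones(d)
V_hat = [np.zeros(d) for _ in range(T)]
lr = 1e-3
history = [loss(c, V_hat)]
for it in range(300):
    for t in range(T):
        r = X[t] @ (c * V_hat[t]) - y[t]
        V_hat[t] -= lr * (c * (X[t].T @ r) + 0.2 * V_hat[t])   # grad of fit + 2*lam2*v
    g = sum(V_hat[t] * (X[t].T @ (X[t] @ (c * V_hat[t]) - y[t])) for t in range(T))
    c -= lr * (g + 0.1 * np.sign(c))                           # subgradient of lam1*|c|
    history.append(loss(c, V_hat))

print(history[0], history[-1])
```

The multiplicative structure means a near-zero entry of `c` suppresses that feature in every task at once, which is the shrinkage coupling the paper analyzes.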
Sandplay therapy with couples within the framework of analytical psychology.
Albert, Susan Carol
2015-02-01
Sandplay therapy with couples is discussed within an analytical framework. Guidelines are proposed as a means of developing this relatively new area within sandplay therapy, and as a platform to open a wider discussion to bring together sandplay therapy and couple therapy. Examples of sand trays created during couple therapy are also presented to illustrate the transformations during the therapeutic process. © 2015, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.
2016-09-01
Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and lack ready-to-use analytical expressions analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions, called VICx, PDMx, and TOPMODELx, are also extended with a spatial description of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
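For reference, the classical SCS-CN runoff curve that these event-based expressions generalize can be written directly (US customary units; the conventional initial-abstraction ratio of 0.2 is assumed):

```python
def scs_cn_runoff(P, CN, ia_ratio=0.2):
    """Event runoff depth Q (inches) from storm rainfall P (inches) for a
    curve number CN, via the SCS-CN relation Q = (P - Ia)^2 / (P - Ia + S),
    with potential retention S = 1000/CN - 10 and initial abstraction Ia."""
    S = 1000.0 / CN - 10.0
    Ia = ia_ratio * S
    if P <= Ia:
        return 0.0          # all rainfall absorbed before runoff begins
    return (P - Ia) ** 2 / (P - Ia + S)

# For P = 3 in and CN = 80: S = 2.5, Ia = 0.5, Q = 2.5^2 / 5 = 1.25 in
print(scs_cn_runoff(3.0, 80))
```

Higher curve numbers (less pervious watersheds) yield more runoff for the same storm, which is the antecedent-wetness dependence the abstract refers to.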
A channel-based framework for steering, non-locality and beyond
NASA Astrophysics Data System (ADS)
Hoban, Matty J.; Belén Sainz, Ana
2018-05-01
Non-locality and steering are both non-classical phenomena witnessed in nature as a result of quantum entanglement. It is now well-established that one can study non-locality independently of the formalism of quantum mechanics, in the so-called device-independent framework. With regards to steering, although one cannot study it completely independently of the quantum formalism, ‘post-quantum steering’ has been described, which is steering that cannot be reproduced by measurements on entangled states but does not lead to superluminal signalling. In this work we present a framework based on the study of quantum channels in which one can study steering (and non-locality) in quantum theory and beyond. In this framework, we show that kinds of steering, whether quantum or post-quantum, are directly related to particular families of quantum channels that have been previously introduced by Beckman et al (2001 Phys. Rev. A 64 052309). Utilizing this connection we also demonstrate new analytical examples of post-quantum steering, give a quantum channel interpretation of almost quantum non-locality and steering, easily recover and generalize the celebrated Gisin–Hughston–Jozsa–Wootters theorem, and initiate the study of post-quantum Buscemi non-locality and non-classical teleportation. In this way, we see post-quantum non-locality and steering as just two aspects of a more general phenomenon.
A Computational Framework for Bioimaging Simulation.
Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
NASA Astrophysics Data System (ADS)
Zheng, R.-F.; Wu, T.-H.; Li, X.-Y.; Chen, W.-Q.
2018-06-01
The problem of a penny-shaped crack embedded in an infinite space of transversely isotropic multi-ferroic composite medium is investigated. The crack is assumed to be subjected to uniformly distributed mechanical, electric and magnetic loads applied symmetrically on the upper and lower crack surfaces. The semi-permeable (limited-permeable) electro-magnetic boundary condition is adopted. By virtue of the generalized method of potential theory and the general solutions, the boundary integro-differential equations governing the mode I crack problem, which are of nonlinear nature, are established and solved analytically. Exact and complete coupling magneto-electro-elastic field is obtained in terms of elementary functions. Important parameters in fracture mechanics on the crack plane, e.g., the generalized crack surface displacements, the distributions of generalized stresses at the crack tip, the generalized stress intensity factors and the energy release rate, are explicitly presented. To validate the present solutions, a numerical code based on the finite element method is established for 3D crack problems in the framework of magneto-electro-elasticity. To conveniently evaluate the effect of the medium inside the crack, several empirical formulae are developed based on the numerical results.
Analytic Frameworks for Assessing Dialogic Argumentation in Online Learning Environments
ERIC Educational Resources Information Center
Clark, Douglas B; Sampson, Victor; Weinberger, Armin; Erkens, Gijsbert
2007-01-01
Over the last decade, researchers have developed sophisticated online learning environments to support students engaging in dialogic argumentation. This review examines five categories of analytic frameworks for measuring participant interactions within these environments focusing on (1) formal argumentation structure, (2) conceptual quality, (3)…
NASA Astrophysics Data System (ADS)
Sulyok, G.
2017-07-01
Starting from the general definition of a one-loop tensor N-point function, we use its Feynman parametrization to calculate the ultraviolet (UV-)divergent part of an arbitrary tensor coefficient in the framework of dimensional regularization. In contrast to existing recursion schemes, we are able to present a general analytic result in closed form that enables direct determination of the UV-divergent part of any one-loop tensor N-point coefficient independent from UV-divergent parts of other one-loop tensor N-point coefficients. Simplified formulas and explicit expressions are presented for A-, B-, C-, D-, E-, and F-functions.
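For orientation, the lowest-order cases any such general formula must reproduce are standard results (conventions as in, e.g., Denner's review of one-loop techniques; these are offered here as a consistency check, not as the paper's result). Writing $\Delta = \frac{2}{4-d} - \gamma_E + \ln 4\pi$, the familiar UV-divergent parts read:

```latex
\begin{aligned}
A_0(m)\big|_{\mathrm{UV}} &= m^2\,\Delta, &
B_0\big|_{\mathrm{UV}} &= \Delta, &
B_1\big|_{\mathrm{UV}} &= -\tfrac{1}{2}\,\Delta,\\
B_{00}\big|_{\mathrm{UV}} &= \tfrac{1}{12}\,\bigl(3m_0^2+3m_1^2-p^2\bigr)\,\Delta, &
B_{11}\big|_{\mathrm{UV}} &= \tfrac{1}{3}\,\Delta, &
C_{00}\big|_{\mathrm{UV}} &= \tfrac{1}{4}\,\Delta,
\end{aligned}
```

while the scalar functions $C_0$, $D_0$, and all higher scalar $N$-point functions are UV-finite; divergences of higher tensor coefficients arise only at sufficiently high tensor rank, which is the pattern the closed-form result captures.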
An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.
ERIC Educational Resources Information Center
Lee, Chung-Shing
2001-01-01
Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)
Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics
ERIC Educational Resources Information Center
Duncan, Ravit Golan; Reiser, Brian J.
2007-01-01
In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…
The Illness Narratives of Health Managers: Developing an Analytical Framework
ERIC Educational Resources Information Center
Exworthy, Mark
2011-01-01
This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…
A Data Protection Framework for Learning Analytics
ERIC Educational Resources Information Center
Cormack, Andrew
2016-01-01
Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…
Zhou, Gaochao; Tao, Xudong; Shen, Ze; Zhu, Guanghao; Jin, Biaobing; Kang, Lin; Xu, Weiwei; Chen, Jian; Wu, Peiheng
2016-01-01
We propose a general framework for the design of a perfect linear polarization converter that works in the transmission mode. Using an intuitive picture based on the method of bi-directional polarization mode decomposition, it is shown that when the device under consideration simultaneously possesses two complementary symmetry planes, with one being equivalent to a perfect electric conducting surface and the other being equivalent to a perfect magnetic conducting surface, linear polarization conversion can occur with an efficiency of 100% in the absence of absorptive losses. The proposed framework is validated by two design examples that operate near 10 GHz, where the numerical, experimental and analytic results are in good agreement. PMID:27958313
A Discounting Framework for Choice With Delayed and Probabilistic Rewards
Green, Leonard; Myerson, Joel
2005-01-01
When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
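The hyperbola-like form at the heart of this framework is easy to state. A minimal sketch, using the one-parameter special case (amplitude exponent of 1) of the authors' hyperbola-like function and the usual odds-against transform for probabilistic rewards:

```python
def hyperbolic_value(amount, delay, k):
    """Subjective value of a reward of size `amount` received after `delay`,
    discounted by the hyperbola V = A / (1 + k*D); k is the discount rate."""
    return amount / (1.0 + k * delay)

def probabilistic_value(amount, p, h):
    """Value of a reward obtained with probability p, discounted against the
    odds theta = (1 - p) / p: V = A / (1 + h*theta)."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# A delay that doubles the denominator halves the subjective value:
print(hyperbolic_value(100, 10, 0.1))   # -> 50.0
```

The formal parallel between the two functions, with delay replaced by odds against, is what allows the single-framework comparison of delay and probability discounting described above, even though the empirical findings argue against a single underlying process.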
Stability analysis of magnetized neutron stars - a semi-analytic approach
NASA Astrophysics Data System (ADS)
Herbrik, Marlene; Kokkotas, Kostas D.
2017-04-01
We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme to polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields, testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies whose spectrum of model systems we extend by lifting former simplifications.
NASA Astrophysics Data System (ADS)
Clementi, N. C.; Revelli, J. A.; Sibona, G. J.
2015-07-01
We propose a general nonlinear analytical framework to study the effect of an external stimulus in the internal state of a population of moving particles. This novel scheme allows us to study a broad range of excitation transport phenomena. In particular, considering social systems, it gives insight into the influence of spatial dynamics on the competition between propaganda (mass media) and convincement. By extending the framework presented by Terranova et al. [Europhys. Lett. 105, 30007 (2014), 10.1209/0295-5075/105/30007], we now allow changes in individual's opinions due to a reflection induced by mass media. The equations of the model can be solved numerically, and, for some special cases, it is possible to derive analytical solutions for the steady states. We implement computational simulations for different social and dynamical systems to check the accuracy of our scheme and to study a broader variety of scenarios. In particular, we compare the numerical outcome with the analytical results for two possible real cases, finding a good agreement. From the results, we observe that mass media dominates the opinion state in slow dynamics communities; whereas, for higher agent active speeds, the rate of interactions increases and the opinion state is determined by a competition between propaganda and persuasion. This difference suggests that kinetics cannot be neglected in the study of transport of any excitation over a particle system.
Two-condition within-participant statistical mediation analysis: A path-analytic framework.
Montoya, Amanda K; Hayes, Andrew F
2017-03-01
Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in both of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths: the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
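The path-analytic logic lends itself to a compact numerical illustration. In the sketch below the data, effect sizes, and the bare difference-score regression are invented for illustration (the macros the authors provide implement the full method): the a path is the mean condition effect on the mediator, the b path is the slope of the Y difference on the centered M difference (with the centered M sum as covariate), and the indirect effect a*b gets a percentile bootstrap confidence interval.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Simulated within-participant data: the condition shifts M, and M drives Y.
M1 = rng.normal(0, 1, n)
M2 = M1 + 0.5 + rng.normal(0, 1, n)          # condition effect on mediator = 0.5
Y1 = 0.8 * M1 + rng.normal(0, 1, n)          # mediator effect on outcome = 0.8
Y2 = 0.8 * M2 + rng.normal(0, 1, n)

def indirect(idx):
    """Indirect effect a*b on the (possibly resampled) participants idx."""
    mdiff = M2[idx] - M1[idx]
    ydiff = Y2[idx] - Y1[idx]
    msum_c = (M2[idx] + M1[idx]) - np.mean(M2[idx] + M1[idx])
    a = mdiff.mean()                          # a path: condition -> mediator
    Xmat = np.column_stack([np.ones(len(idx)), mdiff - mdiff.mean(), msum_c])
    b = np.linalg.lstsq(Xmat, ydiff, rcond=None)[0][1]   # b path: mediator -> outcome
    return a * b

est = indirect(np.arange(n))
boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(est, (lo, hi))
```

With the true paths set to 0.5 and 0.8, the point estimate lands near 0.4 and the bootstrap interval excludes zero, the single inference that replaces Judd et al.'s series of component tests.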
Tsunami and acoustic-gravity waves in water of constant depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendin, Gali; Stiassnie, Michael
2013-08-15
A study of wave radiation by a rather general bottom displacement, in a compressible ocean of otherwise constant depth, is carried out within the framework of a three-dimensional linear theory. Simple analytic expressions for the flow field, at large distance from the disturbance, are derived. Realistic numerical examples indicate that the Acoustic-Gravity waves, which significantly precede the Tsunami, are expected to leave a measurable signature on bottom-pressure records that should be considered for early detection of Tsunami.
Wagner, J A; Ball, J R
2015-07-01
The Institute of Medicine (IOM) released a groundbreaking 2010 report, Evaluation of Biomarkers and Surrogate Endpoints in Chronic Disease. Key recommendations included a harmonized scientific process and a general framework for biomarker evaluation with three interrelated steps: (1) Analytical validation -- is the biomarker measurement accurate? (2) Qualification -- is the biomarker associated with the clinical endpoint of concern? (3) Utilization -- what is the specific context of the proposed use? © 2015 American Society for Clinical Pharmacology and Therapeutics.
On the dispersion relations for an inhomogeneous waveguide with attenuation
NASA Astrophysics Data System (ADS)
Vatul'yan, A. O.; Yurlov, V. O.
2016-09-01
Some general laws concerning the structure of dispersion relations for solid inhomogeneous waveguides with attenuation are studied. An approach based on the analysis of a first-order matrix differential equation is presented in the framework of the concept of complex moduli. Some laws concerning the structure of components of the dispersion set for a viscoelastic inhomogeneous cylindrical waveguide are studied analytically and numerically, and the asymptotics of components of the dispersion set are constructed for arbitrary inhomogeneity laws in the low-frequency region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Jesse D.M.
In the United States, overall electrical generation capacity is expected to increase by 10-25 gigawatts (GW) per year to meet increases in demand. Wind energy is a key component of state and federal renewable energy standards, and central to the Department of Energy’s 20% by 2030 wind production goals. Increased wind energy development may present increased resource conflict with avian wildlife, and environmental permitting has been identified as a potential obstacle to expansion in the sector. ICF developed an analytical framework to help applicants and agencies examine potential impacts in support of facility siting and permitting. A key objective of our work was to develop a framework that is scalable from the local to the national level, and one that is generalizable across the different scales at which biological communities operate – from local influences to meta-populations. The intent was to allow natural resource managers to estimate the cumulative impacts of turbine strikes and habitat changes on long-term population performance in the context of a species' demography, genetic potential, and life history. We developed three types of models based on our literature review and participation in the scientific review processes. First, the conceptual model was developed as a general description of the analytical framework. Second, we developed the analytical framework based on the relationships between concepts, and the functions presented in the scientific literature. Third, we constructed an application of the model by parameterizing the framework using data from and relevant to the Altamont Pass Wind Resource Area (APWRA), and an existing golden eagle population model. We developed managed source code, database create statements, and written documentation to allow for the reproduction of each phase of the analysis.
ICF identified a potential template adaptive management system in the form of the US Fish & Wildlife Service (USFWS) Adaptive Harvest Management (AHM) program, and developed recommendations for the structure and function of a similar wind-facility related program. We provided a straw-man implementation of the analytical framework based on assumptions for APWRA-wide golden eagle fatalities, and presented a statistical examination of the model performance. APWRA-wide fatality rates appear substantial at all scales examined from the local APWRA population to the Bird Conservation Region. Documented fatality rates significantly influenced population performance in terms of non-territorial non-breeding birds. Breeder, Juvenile, Subadult, and Adult abundance were mostly unaffected by Baseline APWRA-wide fatality rates. However, increased variability in fatality rates would likely have impacts on long-term population performance, and would result in a substantially larger loss of resources. We developed four recommendations for future study. First, we recommend establishment of concept experts through the existing system of non-profits, regulatory agencies, academia, and industry in the wind energy sector. Second, we recommend the development of a central or distributed shared data repository, and establishment of guidelines for data sharing and transparency. Third, we recommend development of a forum and process for model selection at the local and national level. Last, we recommend experimental implementation of the prescribed system at broader scales, and refinement of the expectations for modeling and adaptive management.
A mathematical description of the inclusive fitness theory.
Wakano, Joe Yuichiro; Ohtsuki, Hisashi; Kobayashi, Yutaka
2013-03-01
Recent developments in the inclusive fitness theory have revealed that the direction of evolution can be analytically predicted in a wider class of models than previously thought, such as those models dealing with network structure. This paper aims to provide a mathematical description of the inclusive fitness theory. Specifically, we provide a general framework based on a Markov chain that can implement basic models of inclusive fitness. Our framework is based on the probability distribution of "offspring-to-parent map", from which the key concepts of the theory, such as fitness function, relatedness and inclusive fitness, are derived in a straightforward manner. We prove theorems showing that inclusive fitness always correctly predicts which of two competing genes appears more frequently in the long run in the Markov chain. As an application of the theorems, we prove a general formula of the optimal dispersal rate in the Wright's island model with recurrent mutations. We also show the existence of the critical mutation rate, which does not depend on the number of islands and below which a positive dispersal rate evolves. Our framework can also be applied to lattice or network structured populations. Copyright © 2012 Elsevier Inc. All rights reserved.
Assessing Proposals for New Global Health Treaties: An Analytic Framework.
Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio
2015-08-01
We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
ERIC Educational Resources Information Center
Hartnell, Chad A.; Ou, Amy Yi; Kinicki, Angelo
2011-01-01
We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial…
ERIC Educational Resources Information Center
Šulíková, Jana
2016-01-01
Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Tunable Snell's law for spin waves in heterochiral magnetic films
NASA Astrophysics Data System (ADS)
Mulkers, Jeroen; Van Waeyenberge, Bartel; Milošević, Milorad V.
2018-03-01
Thin ferromagnetic films with an interfacially induced DMI exhibit nontrivial asymmetric dispersion relations that lead to unique and useful magnonic properties. Here we derive an analytical expression for the magnon propagation angle within the micromagnetic framework and show how the dispersion relation can be approximated with a comprehensible geometrical interpretation in the k space of spin-wave propagation. We further explore the refraction of spin waves at DMI interfaces in heterochiral magnetic films, deriving a generalized Snell's law, tunable by an in-plane magnetic field, that yields analytical expressions for critical incident angles. The asymmetric Brewster angles found at interfaces of regions with different DMI strengths, adjustable by magnetic field, support the conclusion that heterochiral ferromagnetic structures are an ideal platform for versatile spin-wave guides.
NASA Astrophysics Data System (ADS)
Trejos, Víctor M.; Santos, Andrés; Gámez, Francisco
2018-05-01
The interest in the description of the properties of fluids of restricted dimensionality is growing for theoretical and practical reasons. In this work, we have first developed an analytical expression for the Helmholtz free energy of the two-dimensional square-well fluid in the Barker-Henderson framework. This equation of state is based on an approximate analytical radial distribution function for d-dimensional hard-sphere fluids (1 ≤ d ≤ 3) and is validated against existing and new simulation results. The so-obtained equation of state is implemented in a discrete perturbation theory able to account for general potential shapes. The prototypical Lennard-Jones and Yukawa fluids are tested in their two-dimensional versions against available and new simulation data, with semiquantitative agreement.
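As a reference point for the construction described above, the first-order Barker-Henderson correction in two dimensions takes the generic form below (a schematic sketch of the standard perturbation term, not the paper's final equation of state; symbols are the conventional ones):

```latex
% First-order Barker-Henderson perturbation term for a 2D fluid
% (schematic): hard-disk reference plus an attractive tail u_pert.
\begin{equation}
  \frac{A_1}{N k_B T} \;=\; \pi \rho \beta \int_{\sigma}^{\infty}
  g_{\mathrm{HS}}(r)\, u_{\mathrm{pert}}(r)\, r \, \mathrm{d}r ,
\end{equation}
```

where g_HS(r) is the hard-disk radial distribution function; for the square-well fluid the integral truncates at the well range.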
The Earth Data Analytic Services (EDAS) Framework
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2017-12-01
Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, support from NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics provide insights that help bridge owners address problems faster.
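The span-level reliability indices mentioned above are, in the simplest first-order setting, computed from the means and standard deviations of resistance and load effect. A minimal sketch follows (a textbook formula for independent normal variables; the paper's actual computation may differ, and the function names are ours):

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index beta for independent normal
    resistance R and load effect S: beta = (mu_R - mu_S) / sqrt(var_R + var_S)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """Probability of failure Pf = Phi(-beta), with Phi the standard normal CDF."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```

For example, a resistance mean of 10 against a load-effect mean of 6, both with unit standard deviation, gives beta ≈ 2.83 and a failure probability on the order of 10^-3.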
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
Modal expansions in periodic photonic systems with material loss and dispersion
NASA Astrophysics Data System (ADS)
Wolff, Christian; Busch, Kurt; Mortensen, N. Asger
2018-03-01
We study band-structure properties of periodic optical systems composed of lossy and intrinsically dispersive materials. To this end, we develop an analytical framework based on adjoint modes of a lossy periodic electromagnetic system and show how the problem of linearly dependent eigenmodes in the presence of material dispersion can be overcome. We then formulate expressions for the band-structure derivative ∂ω/∂k (complex group velocity) and the local and total density of transverse optical states. Our exact expressions hold for 3D periodic arrays of materials with arbitrary dispersion properties and in general need to be evaluated numerically. They can be generalized to systems with two, one, or no directions of periodicity provided the fields are localized along nonperiodic directions. Possible applications are photonic crystals, metamaterials, metasurfaces composed of highly dispersive materials such as metals or lossless photonic crystals, and metamaterials or metasurfaces strongly coupled to resonant perturbations such as quantum dots or excitons in 2D materials. For illustration purposes, we analytically evaluate our expressions for some simple systems consisting of lossless dielectrics with one sharp Lorentzian material resonance added. By combining several Lorentz poles, this provides an avenue to perturbatively treat quite general material loss bands in photonic crystals.
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Clement, T. P.
2008-02-01
Multi-species reactive transport equations coupled through sorption and sequential first-order reactions are commonly used to model sites contaminated with radioactive wastes, chlorinated solvents and nitrogenous species. Although researchers have been attempting to solve various forms of these reactive transport equations for over 50 years, a general closed-form analytical solution to this problem is not available in the published literature. In Part I of this two-part article, we derive a closed-form analytical solution to this problem for spatially-varying initial conditions. The proposed solution procedure employs a combination of Laplace and linear transform methods to uncouple and solve the system of partial differential equations. Two distinct solutions are derived for Dirichlet and Cauchy boundary conditions each with Bateman-type source terms. We organize and present the final solutions in a common format that represents the solutions to both boundary conditions. In addition, we provide the mathematical concepts for deriving the solution within a generic framework that can be used for solving similar transport problems.
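In the well-mixed (zero-transport) limit, the sequential first-order reaction network with Bateman-type source terms reduces to the classical Bateman equations, which convey the structure of the coupled system. A small sketch follows (illustration only, not the authors' closed-form transport solution; rate constants must be distinct):

```python
import numpy as np

def bateman(c0, k, t):
    """Concentrations of a sequential first-order chain A1 -> A2 -> ... -> An
    at time t, starting from concentration c0 of the first species only.
    k: list of distinct first-order rate constants (one per species)."""
    n = len(k)
    c = np.zeros(n)
    for i in range(n):
        # Bateman sum for species i (0-indexed)
        s = 0.0
        for j in range(i + 1):
            denom = np.prod([k[m] - k[j] for m in range(i + 1) if m != j])
            s += np.exp(-k[j] * t) / denom
        c[i] = c0 * np.prod(k[:i]) * s
    return c
```

For a two-species chain this reproduces the familiar result c2(t) = c0 k1/(k2 - k1) (e^(-k1 t) - e^(-k2 t)).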
NASA Astrophysics Data System (ADS)
Klimchitskaya, G. L.; Mostepanenko, V. M.; Petrov, V. M.
2017-12-01
The complete theory of electrical conductivity of graphene at arbitrary temperature is developed, taking into account the mass-gap parameter and chemical potential. Both the in-plane and out-of-plane conductivities of graphene are expressed via the components of the polarization tensor in (2+1)-dimensional space-time analytically continued to the real frequency axis. Simple analytic expressions for both the real and imaginary parts of the conductivity of graphene are obtained at zero and nonzero temperature. They demonstrate an interesting interplay depending on the values of mass gap and chemical potential. In the local limit, several results obtained earlier using various approximate and phenomenological approaches are reproduced, refined, and generalized. The numerical computations of both the real and imaginary parts of the conductivity of graphene are performed to illustrate the obtained results. The analytic expressions for the conductivity of graphene obtained in this paper can serve as a guide in the comparison between different theoretical approaches and between experiment and theory.
Shabaev, Andrew; Lambrakos, Samuel G; Bernstein, Noam; Jacobs, Verne L; Finkenstadt, Daniel
2011-04-01
We have developed a general framework for numerical simulation of various types of scenarios that can occur for the detection of improvised explosive devices (IEDs) through the use of excitation using incident electromagnetic waves. A central component model of this framework is an S-matrix representation of a multilayered composite material system. Each layer of the system is characterized by an average thickness and an effective electric permittivity function. The outputs of this component are the reflectivity and the transmissivity as functions of frequency and angle of the incident electromagnetic wave. The input of the component is a parameterized analytic-function representation of the electric permittivity as a function of frequency, which is provided by another component model of the framework. The permittivity function is constructed by fitting response spectra calculated using density functional theory (DFT) and parameter adjustment according to any additional information that may be available, e.g., experimentally measured spectra or theory-based assumptions concerning spectral features. A prototype simulation is described that considers response characteristics for THz excitation of the high explosive β-HMX. This prototype simulation includes a description of a procedure for calculating response spectra using DFT as input to the S-matrix model. For this purpose, the DFT software NRLMOL was adopted. © 2011 Society for Applied Spectroscopy
Expanding Students' Analytical Frameworks through the Study of Graphic Novels
ERIC Educational Resources Information Center
Connors, Sean P.
2015-01-01
When teachers work with students to construct a metalanguage that they can draw on to describe and analyze graphic novels, and then invite students to apply that metalanguage in the service of composing multimodal texts of their own, teachers broaden students' analytical frameworks. In the process of doing so, teachers empower students. In this…
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…
Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.
Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie
2017-12-01
Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.
NASA Astrophysics Data System (ADS)
Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.
2018-02-01
We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n-designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.
Calculations of Total Classical Cross Sections for a Central Field
NASA Astrophysics Data System (ADS)
Tsyganov, D. L.
2018-07-01
In order to find the total collision cross section, a direct effective-potential method (EPM) in the framework of classical mechanics is proposed. The EPM makes it possible to handle both the direct scattering problem (calculation of the total collision cross section) and the inverse scattering problem (reconstruction of the scattering potential) quickly and effectively. A general analytical expression is proposed for the generalized Lennard-Jones potentials: (6-3), (9-3), (12-3), (6-4), (8-4), (12-4), (8-6), (12-6), (18-6). Total cross sections for pairs such as electron-N2, N-N, and O-O2 were obtained to a good approximation.
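For orientation, one common parameterization of the generalized (n-m) Lennard-Jones family listed above is the standard textbook form below (given for reference; the paper may use a different normalization):

```latex
% Generalized (n-m) Lennard-Jones potential with well depth epsilon
% and minimum at r = r_0; (12-6) recovers the familiar form.
\begin{equation}
  V(r) \;=\; \frac{\varepsilon}{\,n-m\,}
  \left[\, m \left(\frac{r_0}{r}\right)^{n}
        - n \left(\frac{r_0}{r}\right)^{m} \right],
  \qquad n > m .
\end{equation}
```

One checks directly that V'(r_0) = 0 and V(r_0) = -epsilon, so each (n-m) pair shares the same well depth and minimum position.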
Principal polynomial analysis.
Laparra, Valero; Jiménez, Sandra; Tuia, Devis; Camps-Valls, Gustau; Malo, Jesus
2014-11-01
This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves, instead of straight lines. Contrary to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA shows a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained in closed form. Invertibility is an important advantage over other learning methods, because it permits understanding of the identified features in the input domain where the data has physical meaning. Moreover, it makes it possible to evaluate the performance of dimensionality reduction in sensible (input-domain) units. Volume preservation also allows an easy computation of information theoretic quantities, such as the reduction in multi-information after the transform. Third, the analytical nature of PPA leads to a clear geometrical interpretation of the manifold: it allows the computation of Frenet-Serret frames (local features) and of generalized curvatures at any point of the space. And fourth, the analytical Jacobian allows the computation of the metric induced by the data, thus generalizing the Mahalanobis distance. These properties are demonstrated theoretically and illustrated experimentally. The performance of PPA is evaluated in dimensionality and redundancy reduction, in both synthetic and real datasets from the UCI repository.
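The sequence of univariate regressions described above can be pictured with a toy one-component version: project onto the leading PCA direction, then fit a polynomial predicting the orthogonal residual from that projection. (A simplified sketch under our own naming, not the authors' full volume-preserving implementation.)

```python
import numpy as np

def ppa_first_component(X, degree=2):
    """Toy sketch of one Principal Polynomial Analysis step:
    project onto the leading PCA direction, then fit a univariate
    polynomial predicting each orthogonal-residual coordinate from
    the projection. Returns the scores, the fitted polynomial
    coefficients, and the remaining residuals."""
    Xc = X - X.mean(axis=0)
    # leading right-singular vector = first principal direction
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0]
    t = Xc @ v                      # scores along the leading direction
    R = Xc - np.outer(t, v)         # orthogonal residuals
    # one simple univariate polynomial regression per coordinate
    coefs = [np.polyfit(t, R[:, j], degree) for j in range(X.shape[1])]
    resid = R - np.stack([np.polyval(c, t) for c in coefs], axis=1)
    return t, coefs, resid
```

On data lying on a parabola, the degree-2 fit absorbs essentially all of the residual that a straight PCA line leaves behind, which is the motivating case for curved principal directions.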
A Framework for Understanding Physics Students' Computational Modeling Practices
NASA Astrophysics Data System (ADS)
Lunk, Brandon Robert
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature of programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches---many of which were identified in this study---describes what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices. Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.
Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A
2015-04-01
Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to stimulate population health is increasing. A possible successful strategy is population management (PM). PM strives to address the health needs of the at-risk population and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The Care Continuum Alliance (CCA) population health guide, recently renamed the Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept on the basis of six subsequent interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods are refined, and we operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Criticality and Connectivity in Macromolecular Charge Complexation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qin, Jian; de Pablo, Juan J.
We examine the role of molecular connectivity and architecture on the complexation of ionic macromolecules (polyelectrolytes) of finite size. A unified framework is developed and applied to evaluate the electrostatic correlation free energy for point-like, rod-like, and coil-like molecules. That framework is generalized to molecules of variable fractal dimensions, including dendrimers. Analytical expressions for the free energy, correlation length, and osmotic pressure are derived, thereby enabling consideration of the effects of charge connectivity, fractal dimension, and backbone stiffness on the complexation behavior of a wide range of polyelectrolytes. Results are presented for regions in the immediate vicinity of the critical region and far from it. A transparent and explicit expression for the coexistence curve is derived in order to facilitate analysis of experimentally observed phase diagrams.
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
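The detrending operation whose frequency response is analyzed above is the core of DFA itself; for orientation, a minimal DFA sketch follows (the standard algorithm, simplified to non-overlapping windows; names are ours):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Minimal detrended fluctuation analysis (DFA) sketch:
    integrate the series, split into non-overlapping windows of
    length s, subtract a polynomial trend of the given order in each
    window, and return the RMS fluctuation F(s) for each scale s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        sq = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            sq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.asarray(F)
```

For uncorrelated white noise the log-log slope of F(s) versus s is close to 0.5, the benchmark scaling exponent against which long-range correlations are judged.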
A Computational Framework for Bioimaging Simulation
Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
ERIC Educational Resources Information Center
Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang
2017-01-01
This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…
Computing Generalized Matrix Inverse on Spiking Neural Substrate.
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines.
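The fixed point that the recurrent network converges to can be pictured with a dense-arithmetic sketch: gradient dynamics on ||AX - I||_F^2 started from zero stay in the row space of A and so converge to the Moore-Penrose pseudoinverse. (An illustrative analogue under our own naming, not the spiking TrueNorth implementation or its quantized weights.)

```python
import numpy as np

def pseudoinverse_gd(A, iters=5000):
    """Moore-Penrose pseudoinverse via simple gradient dynamics,
    a continuous analogue of the recurrent-network fixed point.
    Minimizes ||A X - I||_F^2; starting from X = 0 keeps every
    iterate in the row space of A, so the minimum-norm solution
    (the pseudoinverse) is reached."""
    m, n = A.shape
    eta = 1.0 / np.linalg.norm(A, 2) ** 2   # step below stability limit
    X = np.zeros((n, m))
    I = np.eye(m)
    for _ in range(iters):
        X += eta * A.T @ (I - A @ X)
    return X
```

Convergence is geometric, mode by mode, with per-iteration factor 1 - eta * sigma^2 for each singular value sigma of A, which is why ill-conditioned matrices need more iterations.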
The Framework of Intervention Engine Based on Learning Analytics
ERIC Educational Resources Information Center
Sahin, Muhittin; Yurdugül, Halil
2017-01-01
Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…
Davydovskaya, Polina; Ranft, Annekatrin; Lotsch, Bettina V; Pohle, Roland
2014-07-15
Metal-organic frameworks (MOFs) constitute a new generation of porous crystalline materials, which have recently come into focus as analyte-specific active elements in thin-film sensor devices. Cu-BTC--also known as HKUST-1--is one of the most theoretically and experimentally investigated members of the MOF family. Its capability to selectively adsorb different gas molecules renders this material a promising candidate for applications in chemical gas and vapor sensing. Here, we explore details of the host-guest interactions between HKUST-1 and various analytes under different environmental conditions and study the vapor adsorption mechanism by mass-sensitive and work-function-based readouts. These complementary transduction mechanisms were successfully applied for the detection of low ppm (2 to 50 ppm) concentrations of different alcohols (methanol, ethanol, 1-propanol, and 2-propanol) adsorbed into Cu-BTC thin films. Evaluation of the results allows for the comparison of the amounts of adsorbed vapors and the contribution of each vapor to the changes of the electronic properties of Cu-BTC. The influence of the length of the alcohol chain (C1-C3) and geometry (1-propanol, 2-propanol) as well as their polarity on the sensing performance was investigated, revealing that in dry air, short chain alcohols are more likely adsorbed than long chain alcohols, whereas in humid air, this preference is changed, and the sensitivity toward alcohols is generally decreased. The adsorption mechanism is revealed to differ for dry and humid atmospheres, changing from a site-specific binding of alcohols to the open metal sites under dry conditions to weak physisorption of the analytes dissolved in surface-adsorbed water reservoirs in humid air, with the signal strength being governed by their relative concentration.
Analytical steady-state solutions for water-limited cropping systems using saline irrigation water
NASA Astrophysics Data System (ADS)
Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.; Suarez, D. L.
2014-12-01
Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal quality waters. In this work, we present a steady-state irrigated systems modeling framework that accounts for reduced plant water uptake due to root zone salinity. Two explicit, closed-form analytical solutions for the root zone solute concentration profile are obtained, corresponding to two alternative functional forms of the uptake reduction function. The solutions express a general relationship between irrigation water salinity, irrigation rate, crop salt tolerance, crop transpiration, and (using standard approximations) crop yield. Example applications are illustrated, including the calculation of irrigation requirements for obtaining targeted submaximal yields, and the generation of crop-water production functions for varying irrigation waters, irrigation rates, and crops. Model predictions are shown to be mostly consistent with existing models and available experimental data. Yet the new solutions possess advantages over available alternatives, including: (i) the solutions were derived from a complete physical-mathematical description of the system, rather than based on an ad hoc formulation; (ii) the analytical solutions are explicit and can be evaluated without iterative techniques; (iii) the solutions permit consideration of two common functional forms of salinity induced reductions in crop water uptake, rather than being tied to one particular representation; and (iv) the utilized modeling framework is compatible with leading transient-state numerical models.
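The crop salt-tolerance reduction functions such solutions are built around are often of the classical Maas-Hoffman piecewise-linear form; a minimal sketch with hypothetical parameter values (not the paper's calibrated ones):

```python
def maas_hoffman_yield(ec_e, threshold, slope):
    """Relative crop yield under root-zone salinity (Maas-Hoffman form):
    Yr = 1 - slope * (ECe - threshold), clipped to [0, 1].

    A classical salt-tolerance response used here only to illustrate the
    kind of uptake-reduction function the abstract mentions; threshold and
    slope values below are hypothetical, not from the paper.
    """
    return max(0.0, min(1.0, 1.0 - slope * (ec_e - threshold)))

# Yield is unreduced up to the salinity threshold, then declines linearly.
y_at_threshold = maas_hoffman_yield(3.0, threshold=3.0, slope=0.1)  # 1.0
y_saline = maas_hoffman_yield(8.0, threshold=3.0, slope=0.1)        # 0.5
```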
Glycoconjugate Vaccines: The Regulatory Framework.
Jones, Christopher
2015-01-01
Most vaccines, including the currently available glycoconjugate vaccines, are administered to healthy infants, to prevent future disease. The safety of a prospective vaccine is a key prerequisite for approval. Undesired side effects would not only have the potential to damage the individual infant but also lead to a loss of confidence in the respective vaccine-or vaccines in general-on a population level. Thus, regulatory requirements, particularly with regard to safety, are extremely rigorous. This chapter highlights regulatory aspects on carbohydrate-based vaccines with an emphasis on analytical approaches to ensure the consistent quality of successive manufacturing lots.
The kinetic origin of delayed yielding in metallic glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Y. F.; Liu, X. D.; Wang, S.
2016-06-20
Recent experiments showed that irreversible structural change or plasticity could occur in metallic glasses (MGs) even within the apparent elastic limit after a sufficiently long waiting time. To explain this phenomenon, a stochastic shear transformation model is developed based on a unified rate theory to predict delayed yielding in MGs, which is then validated through extensive atomistic simulations carried out on different MGs. On a fundamental level, this work establishes an analytic framework that links time, stress, and temperature together into a general yielding criterion for MGs.
A generalized theory of preferential linking
NASA Astrophysics Data System (ADS)
Hu, Haibo; Guo, Jinli; Liu, Xuan; Wang, Xiaofan
2014-12-01
There are diverse mechanisms driving the evolution of social networks. A key open question in understanding their evolution is: How do various preferential linking mechanisms produce networks with different features? In this paper we first empirically study preferential linking phenomena in an evolving online social network, finding and validating a linear preference. We propose an analyzable model which captures the real growth process of the network and reveals the underlying mechanism dominating its evolution. Furthermore, based on preferential linking we propose a generalized model reproducing the evolution of online social networks, and present unified analytical results describing network characteristics for 27 preference scenarios. We study the mathematical structure of degree distributions and find that, within the framework of preferential linking, analytical degree distributions can only be combinations of finitely many kinds of functions, related to rational, logarithmic, and inverse tangent functions, and that extremely complex network structure can emerge even for very simple sublinear preferential linking. This work not only provides a verifiable origin for the emergence of various network characteristics in social networks but also bridges individuals' micro-level behaviors and the global organization of social networks.
Analytical connection between thresholds and immunization strategies of SIS model in random networks
NASA Astrophysics Data System (ADS)
Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian
2018-05-01
Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. In addition, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
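In its textbook form, the heterogeneous mean-field SIS threshold the method builds on is lambda_c = <k>/<k^2>; a sketch of how immunizing hubs raises that threshold, on an illustrative heavy-tailed degree sequence (not the paper's networks):

```python
import numpy as np

def hmf_threshold(degrees):
    """Heterogeneous mean-field SIS epidemic threshold lambda_c = <k>/<k^2>.

    Standard mean-field result, used here only to illustrate why removing
    (immunizing) the largest-degree nodes makes spreading harder; this is
    not the paper's own derivation.
    """
    k = np.asarray(degrees, dtype=float)
    return k.mean() / (k ** 2).mean()

rng = np.random.default_rng(0)
# Illustrative heavy-tailed (Pareto-like) integer degree sequence, k >= 1
degrees = rng.pareto(2.5, size=10_000).astype(int) + 1

lam_all = hmf_threshold(degrees)
# Targeted immunization: drop the top 5% largest-degree nodes
cut = np.quantile(degrees, 0.95)
lam_targeted = hmf_threshold(degrees[degrees <= cut])
# lam_targeted > lam_all: removing hubs shrinks <k^2> far more than <k>
```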
Sheldon, Michael R
2016-01-01
Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.
A general framework of noise suppression in material decomposition for dual-energy CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrongolo, Michael; Dong, Xue; Zhu, Lei, E-mail: leizhu@gatech.edu
Purpose: As a general problem of dual-energy CT (DECT), noise amplification in material decomposition severely reduces the signal-to-noise ratio on the decomposed images compared to that on the original CT images. In this work, the authors propose a general framework of noise suppression in material decomposition for DECT. The method is based on an iterative algorithm recently developed in their group for image-domain decomposition of DECT, with an extension to include nonlinear decomposition models. The generalized framework of iterative DECT decomposition enables beam-hardening correction with simultaneous noise suppression, which improves the clinical benefits of DECT. Methods: The authors propose to suppress noise on the decomposed images of DECT using convex optimization, which is formulated in the form of least-squares estimation with smoothness regularization. Based on the design principles of a best linear unbiased estimator, the authors include the inverse of the estimated variance–covariance matrix of the decomposed images as the penalty weight in the least-squares term. Analytical formulas are derived to compute the variance–covariance matrix for decomposed images with general-form numerical or analytical decomposition. As a demonstration, the authors implement the proposed algorithm on phantom data using an empirical polynomial function of decomposition measured on a calibration scan. The polynomial coefficients are determined from the projection data acquired on a wedge phantom, and the signal decomposition is performed in the projection domain. Results: On the Catphan®600 phantom, the proposed noise suppression method reduces the average noise standard deviation of basis material images by one to two orders of magnitude, with a superior performance on spatial resolution as shown in comparisons of line-pair images and modulation transfer function measurements.
On the synthesized monoenergetic CT images, the noise standard deviation is reduced by a factor of 2–3. By using nonlinear decomposition on projections, the authors’ method effectively suppresses the streaking artifacts of beam hardening and obtains more uniform images than their previous approach based on a linear model. Similar performance of noise suppression is observed in the results of an anthropomorphic head phantom and a pediatric chest phantom generated by the proposed method. With beam-hardening correction enabled by their approach, the image spatial nonuniformity on the head phantom is reduced from around 10% on the original CT images to 4.9% on the synthesized monoenergetic CT image. On the pediatric chest phantom, their method suppresses image noise standard deviation by a factor of around 7.5, and compared with linear decomposition, it reduces the estimation error of electron densities from 33.3% to 8.6%. Conclusions: The authors propose a general framework of noise suppression in material decomposition for DECT. Phantom studies have shown the proposed method improves the image uniformity and the accuracy of electron density measurements by effective beam-hardening correction and reduces noise level without noticeable resolution loss.
A consistent conceptual framework for applying climate metrics in technology life cycle assessment
NASA Astrophysics Data System (ADS)
Mallapragada, Dharik; Mignone, Bryan K.
2017-07-01
Comparing the potential climate impacts of different technologies is challenging for several reasons, including the fact that any given technology may be associated with emissions of multiple greenhouse gases when evaluated on a life cycle basis. In general, analysts must decide how to aggregate the climatic effects of different technologies, taking into account differences in the properties of the gases (differences in atmospheric lifetimes and instantaneous radiative efficiencies) as well as different technology characteristics (differences in emission factors and technology lifetimes). Available metrics proposed in the literature have incorporated these features in different ways and have arrived at different conclusions. In this paper, we develop a general framework for classifying metrics based on whether they measure: (a) cumulative or end-point impacts, (b) impacts over a fixed time horizon or up to a fixed end year, and (c) impacts from a single emissions pulse or from a stream of pulses over multiple years. We then use the comparison between compressed natural gas and gasoline-fueled vehicles to illustrate how the choice of metric can affect conclusions about technologies. Finally, we consider tradeoffs involved in selecting a metric, show how the choice of metric depends on the framework that is assumed for climate change mitigation, and suggest which subset of metrics is likely to be most analytically self-consistent.
ERIC Educational Resources Information Center
Golden, Mark
This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…
Value of Flexibility - Phase 1
2010-09-25
weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to... research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically
A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions
Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.
2009-01-01
Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youn, H; Jeon, H; Nam, J
Purpose: To investigate the feasibility of an analytic framework to estimate patients’ absorbed dose distribution owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer’s law. In sequence, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can serve as an effective monitor of a patient’s exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that a patient’s over-exposure during IGRT might be prevented by our framework.
NASA Astrophysics Data System (ADS)
Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P. M.
2013-12-01
Institutional inertia strongly limits our ability to adapt water reservoir operations to better manage growing water demands as well as their associated uncertainties in a changing climate. Although it has long been recognized that these systems are generally framed in heterogeneous socio-economic contexts involving a myriad of conflicting, non-commensurable operating objectives, our broader understanding of the multiobjective consequences of current operating rules as well as their vulnerability to hydroclimatic uncertainties is severely limited. This study proposes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification and many-objective optimization under uncertainty to characterize current operations and discover key tradeoffs between alternative policies for balancing evolving demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to the system's competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. Initially our proposed framework uses available streamflow observations to implicitly identify the Conowingo Dam's current but unknown operating policy. This baseline policy is identified by fitting radial basis functions to existing system dynamics. Our assumption in the baseline policy is that the dam operator is represented as a rational agent seeking to maximize primary operational objectives (i.e., guaranteeing the public water supply and maximizing the hydropower revenue). The quality of the identified baseline policy is evaluated by its ability to replicate historical release dynamics. 
Once identified, the historical baseline policy then provides a means of representing the decision preferences guiding current operations. Our results show that the estimated policy closely captures the dynamics of current releases and flows for the Lower Susquehanna. After identifying the historical baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover improved operating policies. Our Lower Susquehanna results confirm that the system's current history-based operations are negatively biased to overestimate the reliability of the reservoir's multi-sector services. Moreover, our proposed framework has successfully identified alternative reservoir policies that are more robust to hydroclimatic uncertainties while being capable of better addressing the tradeoffs across the Conowingo Dam's multi-sector services.
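The "radial basis functions" fit to existing release dynamics in this study refer to a policy parameterization of roughly the following generic form. All numbers below are hypothetical illustrations, not the Conowingo values:

```python
import numpy as np

def rbf_policy(state, centers, widths, weights):
    """Release decision as a weighted sum of Gaussian radial basis
    functions of the system state.

    Generic form of the parameterization the abstract describes; centers,
    widths, and weights would normally be fit to historical releases.
    """
    state = np.asarray(state, dtype=float)
    centers = np.asarray(centers, dtype=float)
    widths = np.asarray(widths, dtype=float)
    # One Gaussian bump per center, evaluated at the current state
    phi = np.exp(-np.sum((state - centers) ** 2, axis=1) / widths ** 2)
    return float(np.dot(weights, phi))

# Hypothetical 2-D state: (storage fraction, seasonal phase)
centers = [[0.2, 0.5], [0.8, 0.5]]
widths = [0.3, 0.3]
weights = [100.0, 400.0]        # made-up release volumes
release_low = rbf_policy([0.2, 0.5], centers, widths, weights)
release_high = rbf_policy([0.8, 0.5], centers, widths, weights)
```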
Analytical method of waste allocation in waste management systems: Concept, method and case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergeron, Francis C., E-mail: francis.b.c@videotron.ca
Waste is no longer merely a rejected item to dispose of but increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process” defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and the development of the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools.
• An illustrative case study is provided based on household waste management in Geneva.
Optimal digital dynamical decoupling for general decoherence via Walsh modulation
NASA Astrophysics Data System (ADS)
Qi, Haoyu; Dowling, Jonathan P.; Viola, Lorenza
2017-11-01
We provide a general framework for constructing digital dynamical decoupling sequences based on Walsh modulation—applicable to arbitrary qubit decoherence scenarios. By establishing equivalence between decoupling design based on Walsh functions and on concatenated projections, we identify a family of optimal Walsh sequences, which can be exponentially more efficient, in terms of the required total pulse number, for fixed cancellation order, than known digital sequences based on concatenated design. Optimal sequences for a given cancellation order are highly non-unique—their performance depending sensitively on the control path. We provide an analytic upper bound to the achievable decoupling error and show how sequences within the optimal Walsh family can substantially outperform concatenated decoupling in principle, while respecting realistic timing constraints.
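The Walsh functions themselves are straightforward to generate; below is a sketch of the (Paley/natural-ordered) +-1 profiles via the Sylvester recursion. This only shows the digital modulation profiles sequences are built from, not the paper's optimal-sequence construction:

```python
import numpy as np

def hadamard_walsh(m):
    """Return the 2^m x 2^m Hadamard matrix whose rows are Walsh
    functions (Paley/natural order) taking values +-1 on 2^m time bins.

    Illustrative only: digital +-1 switching profiles of the kind used
    for dynamical decoupling design, via the Sylvester recursion.
    """
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])  # H_{2n} = [[H, H], [H, -H]]
    return H

H = hadamard_walsh(3)   # 8 Walsh functions, each a row of +-1 values
# Rows are mutually orthogonal: H @ H.T == 8 * I
```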
Neutrino-electron scattering: general constraints on Z' and dark photon models
NASA Astrophysics Data System (ADS)
Lindner, Manfred; Queiroz, Farinaldo S.; Rodejohann, Werner; Xu, Xun-Jie
2018-05-01
We study the framework of U(1)_X models with kinetic mixing and/or mass mixing terms. We give general and exact analytic formulas for fermion gauge interactions and the cross sections of neutrino-electron scattering in such models. Then we derive limits on a variety of U(1)_X models that induce new physics contributions to neutrino-electron scattering, taking into account interference between the new physics and Standard Model contributions. Data from TEXONO, CHARM-II and GEMMA are analyzed and shown to be complementary to each other, providing the most restrictive bounds on the masses of the new vector bosons. In particular, we demonstrate the validity of our results for dark photon-like as well as light Z' models.
Benjamin, Arlin James; Bushman, Brad J
2018-02-01
In some societies, weapons are plentiful and highly visible. This review examines recent trends in research on the weapons effect, which is the finding that the mere presence of weapons can prime people to behave aggressively. The General Aggression Model provides a theoretical framework to explain why the weapons effect occurs. This model postulates that exposure to weapons increases aggressive thoughts and hostile appraisals, thus explaining why weapons facilitate aggressive behavior. Data from meta-analytic reviews are consistent with the General Aggression Model. These findings have important practical as well as theoretical implications. They suggest that the link between weapons and aggression is very strong in semantic memory, and that merely seeing a weapon can make people more aggressive. Copyright © 2017 Elsevier Ltd. All rights reserved.
Optimal parameter estimation with a fixed rate of abstention
NASA Astrophysics Data System (ADS)
Gendra, B.; Ronco-Bonvehi, E.; Calsamiglia, J.; Muñoz-Tapia, R.; Bagan, E.
2013-07-01
The problems of optimally estimating a phase, a direction, and the orientation of a Cartesian frame (or trihedron) with general pure states are addressed. Special emphasis is put on estimation schemes that allow for inconclusive answers or abstention. It is shown that such schemes enable drastic improvements, up to the extent of attaining the Heisenberg limit in some cases, and the required amount of abstention is quantified. A general mathematical framework to deal with the asymptotic limit of many qubits or large angular momentum is introduced and used to obtain analytical results for all the relevant cases under consideration. Parameter estimation with abstention is also formulated as a semidefinite programming problem, for which very efficient numerical optimization techniques exist.
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting
2016-01-01
This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…
An Analytical Model for the Performance Analysis of Concurrent Transmission in IEEE 802.15.4
Gezer, Cengiz; Zanella, Alberto; Verdone, Roberto
2014-01-01
Interference is a serious cause of performance degradation for IEEE 802.15.4 devices. The effect of concurrent transmissions in IEEE 802.15.4 has generally been investigated by means of simulation or experimental activities. In this paper, a mathematical framework for the derivation of chip, symbol, and packet error probabilities of a typical IEEE 802.15.4 receiver in the presence of interference is proposed. Our model considers both non-coherent and coherent demodulation schemes under the assumption that thermal noise is absent. Simulation results are also added to assess the validity of the mathematical framework when the effect of thermal noise cannot be neglected. Numerical results show that the proposed analysis is in agreement with the measurement results in the literature under realistic working conditions. PMID:24658624
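For intuition about how symbol errors aggregate into the packet error probability such frameworks ultimately target, here is the generic independent-error identity PER = 1 - (1 - Ps)^N. This is a standard link-budget relation, not the chip-level interference model derived in the paper:

```python
def packet_error_rate(p_sym, n_symbols):
    """Packet error rate assuming independent symbol errors:
    PER = 1 - (1 - Ps)^N.

    Generic identity for illustration; the paper derives Ps itself from a
    chip-level interference model, which is not reproduced here.
    """
    return 1.0 - (1.0 - p_sym) ** n_symbols

# A maximum-length 127-byte 802.15.4 payload carries 254 4-bit symbols
per = packet_error_rate(1e-3, 254)   # roughly 0.22 for Ps = 1e-3
```

The steep dependence on N is why even small per-symbol error rates matter for long frames.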
Ethics and Justice in Learning Analytics
ERIC Educational Resources Information Center
Johnson, Jeffrey Alan
2017-01-01
The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…
Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives
ERIC Educational Resources Information Center
Serafini, Frank
2010-01-01
This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…
Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework
Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.
2016-01-01
Iterative reconstruction algorithms are routinely used in clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, with less sensitivity to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (DIRECT: Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968
NASA Astrophysics Data System (ADS)
Lefèvre, Victor; Lopez-Pamies, Oscar
2017-02-01
This paper presents an analytical framework to construct approximate homogenization solutions for the macroscopic elastic dielectric response - under finite deformations and finite electric fields - of dielectric elastomer composites with two-phase isotropic particulate microstructures. The central idea consists in employing the homogenization solution derived in Part I of this work for ideal elastic dielectric composites within the context of a nonlinear comparison medium method - this is derived as an extension of the comparison medium method of Lopez-Pamies et al. (2013) in nonlinear elastostatics to the coupled realm of nonlinear electroelastostatics - to generate in turn a corresponding solution for composite materials with non-ideal elastic dielectric constituents. Complementary to this analytical framework, a hybrid finite-element formulation to construct homogenization solutions numerically (in three dimensions) is also presented. The proposed analytical framework is utilized to work out a general approximate homogenization solution for non-Gaussian dielectric elastomers filled with nonlinear elastic dielectric particles that may exhibit polarization saturation. The solution applies to arbitrary (non-percolative) isotropic distributions of filler particles. By construction, it is exact in the limit of small deformations and moderate electric fields. For finite deformations and finite electric fields, its accuracy is demonstrated by means of direct comparisons with finite-element solutions. Aimed at gaining physical insight into the extreme enhancement in electrostriction properties displayed by emerging dielectric elastomer composites, various cases wherein the filler particles are of poly- and mono-disperse sizes and exhibit different types of elastic dielectric behavior are discussed in detail. Contrary to an initial conjecture in the literature, it is found (inter alia) that the isotropic addition of a small volume fraction of stiff (semi-)conducting/high-permittivity particles to dielectric elastomers does not lead to the extreme electrostriction enhancements observed in experiments. It is posited that such extreme enhancements are the manifestation of interphasial phenomena.
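In the small-deformation, dilute limit, a classical point of comparison for such particle-filled dielectrics is the Maxwell Garnett estimate. The following sketch (not the authors' finite-deformation solution; the permittivity values are illustrative) shows why a small volume fraction of high-permittivity spherical particles raises the effective permittivity only modestly, consistent with the finding above:

```python
def maxwell_garnett(eps_m, eps_p, c):
    """Classical Maxwell Garnett estimate of the effective permittivity of
    a matrix (eps_m) containing a volume fraction c of spherical particles
    (eps_p). A dilute-limit point of comparison, not the paper's solution."""
    num = eps_p + 2 * eps_m + 2 * c * (eps_p - eps_m)
    den = eps_p + 2 * eps_m - c * (eps_p - eps_m)
    return eps_m * num / den

# Even particles 300x more polarizable than the matrix raise the
# effective permittivity only mildly at a 5% volume fraction.
eps_eff = maxwell_garnett(3.0, 1000.0, 0.05)
```

Here `eps_eff` stays below 4 despite the particle permittivity of 1000, illustrating why dilute stiff/high-permittivity fillers alone cannot explain extreme electrostriction enhancements.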
The Strategic Management of Accountability in Nonprofit Organizations: An Analytical Framework.
ERIC Educational Resources Information Center
Kearns, Kevin P.
1994-01-01
Offers a framework stressing the strategic and tactical choices facing nonprofit organizations and discusses policy and management implications. Claims the framework is a useful tool for conducting accountability audits and a conceptual foundation for discussions of public policy. (Author/JOW)
RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.
Varghese, Blesson; Patel, Ishan; Barker, Adam
2015-01-01
Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by a job can be managed. An open-source light-weight framework for executing R scripts using Bioconductor packages, referred to as `RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
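The idea of analytical redundancy can be sketched with the simplest possible parity space (a toy example, not the thesis's generalized formulation): three sensors measuring one scalar, with residuals taken from the left null space of the measurement matrix, so they are insensitive to the measured quantity and respond only to noise and faults:

```python
import random

# Three sensors measure the same scalar x: y = H*x + noise, H = [1,1,1]^T.
# Rows of V span the parity space (the left null space of H), so the
# residual r = V*y is independent of x and responds only to noise/faults.
V = [[1, -1, 0],
     [1, 0, -1]]

def residual(y):
    return [sum(v * yi for v, yi in zip(row, y)) for row in V]

random.seed(0)
x = 5.0
y_ok = [x + random.gauss(0.0, 0.01) for _ in range(3)]
y_fault = list(y_ok)
y_fault[1] += 1.0          # inject a bias fault on sensor 2

r_ok = residual(y_ok)      # near zero: consistent sensors
r_fault = residual(y_fault)  # first component deflects by about -1
```

A decision rule (here it would be a simple threshold; in the thesis, a Bayes sequential rule) then operates on the residual, not on the raw measurements.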
An Analytical Time–Domain Expression for the Net Ripple Produced by Parallel Interleaved Converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B.; Krein, Philip T.
We apply modular arithmetic and Fourier series to analyze the superposition of N interleaved triangular waveforms with identical amplitudes and duty ratios. Here, interleaving refers to the condition in which a collection of periodic waveforms with identical periods are each uniformly phase-shifted across one period. The main result is a time-domain expression which provides an exact representation of the summed and interleaved triangular waveforms, where the peak amplitude and parameters of the time-periodic component are all specified in closed form. The analysis is general and can be used to study various applications in multi-converter systems. This model is unique not only in that it reveals a simple and intuitive expression for the net ripple, but also in that its derivation via modular arithmetic and Fourier series is distinct from prior approaches. The analytical framework is experimentally validated with a system of three parallel converters under time-varying operating conditions.
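The ripple cancellation being quantified can be reproduced numerically. A minimal sketch (the duty ratio and sample count are illustrative) sums N unit triangular waveforms phase-shifted by 1/N of a period and measures the net peak-to-peak ripple:

```python
def tri(t, duty):
    """Unit-amplitude triangular waveform with period 1: rises for a
    fraction `duty` of the period, falls for the remainder."""
    x = t % 1.0
    return -1.0 + 2.0 * x / duty if x < duty else 1.0 - 2.0 * (x - duty) / (1.0 - duty)

def net_ripple(n, duty, samples=20000):
    """Peak-to-peak value of the sum of n triangles interleaved by 1/n."""
    vals = [sum(tri(i / samples + k / n, duty) for k in range(n))
            for i in range(samples)]
    return max(vals) - min(vals)

pp1 = net_ripple(1, 0.3)   # single converter: full ripple
pp4 = net_ripple(4, 0.3)   # four interleaved converters: strongly reduced
```

For a single waveform the peak-to-peak ripple is 2; with four interleaved waveforms at duty ratio 0.3 it drops well below 1, the brute-force counterpart of the closed-form expression derived in the paper.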
Curriculum Innovation for Marketing Analytics
ERIC Educational Resources Information Center
Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.
2018-01-01
College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…
Conceptualizing community resilience to natural hazards - the emBRACE framework
NASA Astrophysics Data System (ADS)
Kruse, Sylvia; Abeling, Thomas; Deeming, Hugh; Fordham, Maureen; Forrester, John; Jülich, Sebastian; Nuray Karanci, A.; Kuhlicke, Christian; Pelling, Mark; Pedoth, Lydia; Schneiderbauer, Stefan
2017-12-01
The level of community is considered to be vital for building disaster resilience. Yet, community resilience as a scientific concept often remains vaguely defined and lacks the guiding characteristics necessary for analysing and enhancing resilience on the ground. The emBRACE framework of community resilience presented in this paper provides a heuristic analytical tool for understanding, explaining and measuring community resilience to natural hazards. It was developed in an iterative process building on existing scholarly debates, on empirical case study work in five countries and on participatory consultation with community stakeholders where the framework was applied and ground-tested in different contexts and for different hazard types. The framework conceptualizes resilience across three core domains: (i) resources and capacities, (ii) actions and (iii) learning. These three domains are conceptualized as intrinsically conjoined within a whole. Community resilience is influenced by these integral elements as well as by extra-community forces comprising disaster risk governance and thus laws, policies and responsibilities on the one hand and on the other, the general societal context, natural and human-made disturbances and system change over time. The framework is a graphically rendered heuristic, which through application can assist in guiding the assessment of community resilience in a systematic way and identifying key drivers and barriers of resilience that affect any particular hazard-exposed community.
ERIC Educational Resources Information Center
Bodily, Robert; Nyland, Rob; Wiley, David
2017-01-01
The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…
The dynamics of adapting, unregulated populations and a modified fundamental theorem.
O'Dwyer, James P
2013-01-06
A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.
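The classical relation that the paper modifies is easy to verify numerically in the deterministic, regulated limit: one Euler step of replicator dynamics changes the mean fitness by the within-population variance in fitness times dt. This is a sketch of the baseline 'fundamental theorem', not of the paper's stochastic, unregulated model:

```python
# Deterministic replicator dynamics: dp_i/dt = p_i * (f_i - fbar).
# An Euler step changes the mean fitness by var(f) * dt exactly,
# which is the classical Fisher relation the paper generalizes.
def step(p, f, dt):
    fbar = sum(pi * fi for pi, fi in zip(p, f))
    p_new = [pi * (1.0 + (fi - fbar) * dt) for pi, fi in zip(p, f)]
    s = sum(p_new)
    return [pi / s for pi in p_new]

p = [0.5, 0.3, 0.2]        # genotype frequencies (illustrative)
f = [1.0, 1.5, 0.7]        # genotype fitnesses (illustrative)
dt = 1e-4
fbar0 = sum(pi * fi for pi, fi in zip(p, f))
var0 = sum(pi * (fi - fbar0) ** 2 for pi, fi in zip(p, f))
fbar1 = sum(pi * fi for pi, fi in zip(step(p, f, dt), f))
# fbar1 - fbar0 equals var0 * dt up to floating-point error.
```

The paper's variant adds explicit dependence on the fitness landscape (the distribution of fitness effects of new mutations), which this mutation-free baseline omits.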
Theory of precipitation effects on dead cylindrical fuels
Michael A. Fosberg
1972-01-01
Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...
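The interplay of numerical and analytical solutions of the Fickian diffusion equation can be sketched in the simplest setting (planar rather than cylindrical geometry, with illustrative parameters): an explicit finite-difference solution of moisture uptake from a saturated surface, compared against the analytical semi-infinite-medium profile erfc(x / (2*sqrt(D*t))):

```python
import math

# Explicit (FTCS) solution of u_t = D * u_xx with a saturated surface
# u(0, t) = 1 and initially dry interior, compared to the analytical
# semi-infinite solution u = erfc(x / (2*sqrt(D*t))).
D, dx, dt = 1.0, 0.05, 0.001       # r = D*dt/dx^2 = 0.4 (stable)
n = 21                             # nodes on [0, 1]
u = [0.0] * n
u[0] = 1.0
r = D * dt / dx**2
steps = 50                         # advance to t = 0.05
for _ in range(steps):
    u = [u[0]] + [u[i] + r * (u[i+1] - 2*u[i] + u[i-1])
                  for i in range(1, n - 1)] + [u[-1]]

t = steps * dt
exact = [math.erfc(i * dx / (2.0 * math.sqrt(D * t))) for i in range(n)]
err = max(abs(a - b) for a, b in zip(u, exact))
```

The numerical and analytical profiles agree to a few percent at this resolution, mirroring the paper's use of the analytical solution as a physical framework that the numerical solutions then refine.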
Computing Generalized Matrix Inverse on Spiking Neural Substrate
Shukla, Rohit; Khoram, Soroosh; Jorgensen, Erik; Li, Jing; Lipasti, Mikko; Wright, Stephen
2018-01-01
Emerging neural hardware substrates, such as IBM's TrueNorth Neurosynaptic System, can provide an appealing platform for deploying numerical algorithms. For example, a recurrent Hopfield neural network can be used to find the Moore-Penrose generalized inverse of a matrix, thus enabling a broad class of linear optimizations to be solved efficiently, at low energy cost. However, deploying numerical algorithms on hardware platforms that severely limit the range and precision of representation for numeric quantities can be quite challenging. This paper discusses these challenges and proposes a rigorous mathematical framework for reasoning about range and precision on such substrates. The paper derives techniques for normalizing inputs and properly quantizing synaptic weights originating from arbitrary systems of linear equations, so that solvers for those systems can be implemented in a provably correct manner on hardware-constrained neural substrates. The analytical model is empirically validated on the IBM TrueNorth platform, and results show that the guarantees provided by the framework for range and precision hold under experimental conditions. Experiments with optical flow demonstrate the energy benefits of deploying a reduced-precision and energy-efficient generalized matrix inverse engine on the IBM TrueNorth platform, reflecting 10× to 100× improvement over FPGA and ARM core baselines. PMID:29593483
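A conventional, hardware-free stand-in for the paper's Hopfield recurrence is the Newton-Schulz iteration, which likewise converges to the Moore-Penrose generalized inverse through repeated matrix products (the matrix and iteration count below are illustrative; this is not the TrueNorth implementation):

```python
import numpy as np

def pseudo_inverse(A, iters=60):
    """Newton-Schulz iteration X <- X (2I - A X). With the scaled start
    X0 = A.T / (||A||_1 * ||A||_inf) it converges to the Moore-Penrose
    inverse; a conventional stand-in for the Hopfield recurrence."""
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # tall, full column rank
X = pseudo_inverse(A)   # 2x3, satisfies X @ A = I
```

On a constrained substrate, the range/precision analysis in the paper would govern how `A` and the iterates are scaled and quantized; in floating point the iteration simply reproduces `np.linalg.pinv`.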
Random matrices and condensation into multiple states
NASA Astrophysics Data System (ADS)
Sadeghi, Sina; Engel, Andreas
2018-03-01
In the present work, we employ methods from statistical mechanics of disordered systems to investigate static properties of condensation into multiple states in a general framework. We aim at showing how typical properties of random interaction matrices play a vital role in manifesting the statistics of condensate states. In particular, an analytical expression for the fraction of condensate states in the thermodynamic limit is provided that confirms the result of the mean number of coexisting species in a random tournament game. We also study the interplay between the condensation problem and zero-sum games with correlated random payoff matrices.
NASA Astrophysics Data System (ADS)
Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi
2012-10-01
In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
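The transformation itself is compact: classical multidimensional scaling of the shortest-path distance matrix. For a ring lattice the distances are known analytically, and the embedding is (up to the basis choice in the degenerate leading eigenspace) a regular polygon, so each coordinate traces a periodic time series, as the Letter describes:

```python
import numpy as np

# Classical MDS of a ring lattice: shortest-path distances of a cycle
# graph embed as a regular polygon, so each coordinate of the embedding,
# read in node order, is a periodic time series.
N = 12
D = np.array([[min(abs(i - j), N - abs(i - j)) for j in range(N)]
              for i in range(N)], dtype=float)
J = np.eye(N) - np.ones((N, N)) / N
B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gram matrix
vals, vecs = np.linalg.eigh(B)
lam1, lam2 = vals[-1], vals[-2]              # leading pair is degenerate
coords = vecs[:, -2:] * np.sqrt(vals[-2:])   # 2-D embedding
radius = np.hypot(coords[:, 0], coords[:, 1])  # constant: a circle
```

For small-world or random networks the same pipeline applies with graph shortest-path distances (e.g. from breadth-first search) in place of the analytic `D`.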
Quantum work statistics of charged Dirac particles in time-dependent fields
Deffner, Sebastian; Saxena, Avadh
2015-09-28
The quantum Jarzynski equality is an important theorem of modern quantum thermodynamics. We show that the Jarzynski equality readily generalizes to relativistic quantum mechanics described by the Dirac equation. After establishing the conceptual framework we solve a pedagogical, yet experimentally relevant, system analytically. As a main result we obtain the exact quantum work distributions for charged particles traveling through a time-dependent vector potential evolving under Schrödinger as well as under Dirac dynamics, and for which the Jarzynski equality is verified. Thus, special emphasis is put on the conceptual and technical subtleties arising from relativistic quantum mechanics.
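The classical special case underlying the equality is simple to check by Monte Carlo: a Gaussian work distribution with variance 2*mu/beta satisfies <exp(-beta*W)> = exp(-beta*dF) with dF = 0. This is a sketch of the classical identity only, not of the Dirac-dynamics work distributions derived in the paper:

```python
import math, random

# Monte Carlo check of the Jarzynski equality for Gaussian work:
# W ~ N(mu, sigma^2) with sigma^2 = 2*mu/beta gives <exp(-beta*W)> = 1
# (free-energy difference dF = 0), despite the mean work mu being positive.
random.seed(1)
beta, mu = 1.0, 0.5
sigma = math.sqrt(2.0 * mu / beta)
n = 200000
avg = sum(math.exp(-beta * random.gauss(mu, sigma)) for _ in range(n)) / n
```

The exponential average is dominated by rare negative-work trajectories, which is why large sample sizes are needed even in this simplest Gaussian case.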
General formulation of long-range degree correlations in complex networks
NASA Astrophysics Data System (ADS)
Fujiki, Yuka; Takaguchi, Taro; Yakubo, Kousuke
2018-06-01
We provide a general framework for analyzing degree correlations between nodes separated by more than one step (i.e., beyond nearest neighbors) in complex networks. One joint and four conditional probability distributions are introduced to fully describe long-range degree correlations with respect to degrees k and k' of two nodes and shortest path length l between them. We present general relations among these probability distributions and clarify the relevance to nearest-neighbor degree correlations. Unlike nearest-neighbor correlations, some of these probability distributions are meaningful only in finite-size networks. Furthermore, as a baseline to determine the existence of intrinsic long-range degree correlations in a network other than inevitable correlations caused by the finite-size effect, the functional forms of these probability distributions for random networks are analytically evaluated within a mean-field approximation. The utility of our argument is demonstrated by applying it to real-world networks.
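One of the five distributions can be estimated directly on any finite network by breadth-first search. A toy sketch of the joint distribution P(k, k' | l), the probability that two nodes at shortest-path distance l have degrees (k, k'); the example graph is illustrative:

```python
from collections import deque, defaultdict

def degree_pair_distribution(adj, l):
    """Empirical joint distribution of degrees (k, k') over ordered node
    pairs at shortest-path distance l, via BFS from every node."""
    deg = {v: len(ns) for v, ns in adj.items()}
    counts = defaultdict(int)
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
        for v, d in dist.items():
            if d == l:
                counts[(deg[s], deg[v])] += 1
    total = sum(counts.values())
    return {kk: c / total for kk, c in counts.items()}

# A small test graph: a star on nodes 0-3 with a tail node 4.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
P2 = degree_pair_distribution(adj, 2)   # symmetric in (k, k')
```

In a real analysis, such empirical estimates would be compared with the paper's mean-field baselines for random networks to detect intrinsic long-range correlations beyond finite-size effects.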
NASA Astrophysics Data System (ADS)
Shen, Ji; Sung, Shannon; Zhang, Dongmei
2015-11-01
Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes: integration, translation, transfer, and transformation, and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.
Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data
NASA Astrophysics Data System (ADS)
Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.
2017-10-01
We present a new open-source framework for the storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. The framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit these rates to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, the framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding input blocks with appropriate cross-sections for VSim (a particle-in-cell simulation code) and USim (an unstructured multi-fluid code).
Electrocardiographic interpretation skills of cardiology residents: are they competent?
Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C
2014-12-01
Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
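The building block of the model, the κ-exponential, is a one-liner, and the deformed survival function exp_κ(-βx^α) interpolates between exponential behaviour at low incomes and a Pareto power-law tail at high incomes (the parameter values below are illustrative, not fitted to any data):

```python
import math

def exp_kappa(x, kappa):
    """kappa-exponential: (sqrt(1 + k^2 x^2) + k*x)^(1/k).
    Reduces to the ordinary exponential as kappa -> 0."""
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha, beta, kappa):
    """Complementary CDF of the kappa-generalized income distribution:
    P(X > x) = exp_kappa(-beta * x**alpha)."""
    return exp_kappa(-beta * x**alpha, kappa)
```

For large x the survival function decays as a power law rather than exponentially, which is exactly the Pareto regime the distribution is designed to capture alongside the low-middle income bulk.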
NASA Technical Reports Server (NTRS)
Schallhorn, Paul; Majumdar, Alok
2012-01-01
This paper describes a finite volume based numerical algorithm that allows multi-dimensional computation of fluid flow within a system-level network flow analysis. There are several thermo-fluid engineering problems where higher-fidelity solutions are needed that are not within the capacity of system-level codes. The proposed algorithm allows NASA's Generalized Fluid System Simulation Program (GFSSP) to perform multi-dimensional flow calculation within the framework of GFSSP's typical system-level flow network consisting of fluid nodes and branches. The paper presents several classical two-dimensional fluid dynamics problems that have been solved by GFSSP's multi-dimensional flow solver. The numerical solutions are compared with the analytical and benchmark solutions of Poiseuille flow, Couette flow and flow in a driven cavity.
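The Poiseuille case is a convenient verification for any such solver: a second-order discretization of mu * u'' = dp/dx between two stationary walls reproduces the analytical parabola to machine precision, because the truncation error of the central difference vanishes for quadratics. A standalone sketch (not GFSSP itself; fluid properties are illustrative):

```python
# Plane Poiseuille flow: solve mu * u'' = dpdx on 0 <= y <= h with
# u(0) = u(h) = 0, and compare with the analytical parabola
# u(y) = -dpdx / (2*mu) * y * (h - y).
mu, dpdx, h, n = 1.0e-3, -1.0, 1.0e-2, 41
dy = h / (n - 1)
# Tridiagonal system (u[i-1] - 2*u[i] + u[i+1]) / dy^2 = dpdx / mu
a = [1.0] * (n - 2)                 # sub/super-diagonal (constant)
b = [-2.0] * (n - 2)                # main diagonal
rhs = [dpdx / mu * dy * dy] * (n - 2)
# Thomas algorithm: forward elimination, then back substitution.
for i in range(1, n - 2):
    m = a[i - 1] / b[i - 1]
    b[i] -= m * a[i - 1]
    rhs[i] -= m * rhs[i - 1]
u_inner = [0.0] * (n - 2)
u_inner[-1] = rhs[-1] / b[-1]
for i in range(n - 4, -1, -1):
    u_inner[i] = (rhs[i] - a[i] * u_inner[i + 1]) / b[i]
u = [0.0] + u_inner + [0.0]
exact = [-dpdx / (2 * mu) * (i * dy) * (h - i * dy) for i in range(n)]
err = max(abs(p - q) for p, q in zip(u, exact))
```

The centreline velocity comes out as -dpdx * h^2 / (8 * mu) = 0.0125 m/s for these parameters, matching the analytical solution at every node.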
Phase Domain Walls in Weakly Nonlinear Deep Water Surface Gravity Waves.
Tsitoura, F; Gietz, U; Chabchoub, A; Hoffmann, N
2018-06-01
We report a theoretical derivation, an experimental observation and a numerical validation of nonlinear phase domain walls in weakly nonlinear deep water surface gravity waves. The domain walls presented are connecting homogeneous zones of weakly nonlinear plane Stokes waves of identical amplitude and wave vector but differences in phase. By exploiting symmetry transformations within the framework of the nonlinear Schrödinger equation we demonstrate the existence of exact analytical solutions representing such domain walls in the weakly nonlinear limit. The walls are in general oblique to the direction of the wave vector and stationary in moving reference frames. Experimental and numerical studies confirm and visualize the findings. Our present results demonstrate that nonlinear domain walls do exist in the weakly nonlinear regime of general systems exhibiting dispersive waves.
Shigayeva, Altynay; Coker, Richard J
2015-04-01
There is renewed concern over the sustainability of disease control programmes, and re-emergence of policy recommendations to integrate programmes with general health systems. However, the conceptualization of this issue has received remarkably little critical attention. Additionally, the study of programmatic sustainability presents methodological challenges. In this article, we propose a conceptual framework to support analyses of sustainability of communicable disease programmes. Through this work, we also aim to clarify a link between notions of integration and sustainability. As a part of development of the conceptual framework, we conducted a systematic literature review of peer-reviewed literature on concepts, definitions, analytical approaches and empirical studies on sustainability in health systems. Identified conceptual proposals for analysis of sustainability in health systems lack an explicit conceptualization of what a health system is. Drawing upon theoretical concepts originating in sustainability sciences and our review here, we conceptualize a communicable disease programme as a component of a health system which is viewed as a complex adaptive system. We propose five programmatic characteristics that may explain a potential for sustainability: leadership, capacity, interactions (notions of integration), flexibility/adaptability and performance. Though integration of elements of a programme with other system components is important, its role in sustainability is context specific and difficult to predict. The proposed framework might serve as a basis for further empirical evaluations in understanding the complex interplay between programmes and broader health systems in the development of sustainable responses to communicable diseases. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.
Emergence of grouping in multi-resource minority game dynamics
NASA Astrophysics Data System (ADS)
Huang, Zi-Gang; Zhang, Ji-Qiang; Dong, Jia-Qi; Huang, Liang; Lai, Ying-Cheng
2012-10-01
Complex systems arising in a modern society typically have many resources and strategies available for their dynamical evolutions. To explore quantitatively the behaviors of such systems, we propose a class of models to investigate Minority Game (MG) dynamics with multiple strategies. In particular, agents tend to choose the least used strategies based on available local information. A striking finding is the emergence of grouping states defined in terms of distinct strategies. We develop an analytic theory based on the mean-field framework to understand the "bifurcations" of the grouping states. The grouping phenomenon has also been identified in the Shanghai Stock-Market system, and we discuss its prevalence in other real-world systems. Our work demonstrates that complex systems obeying the MG rules can spontaneously self-organize themselves into certain divided states, and our model represents a basic and general mathematical framework to address this kind of phenomena in social, economical and political systems.
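The "choose the least used" rule can be sketched in a stripped-down sequential form (a toy illustration only, not the paper's simultaneous-choice MG model with local information): greedy least-used choice balances agents across resources to within one agent per resource, the simplest form of the self-organization described above:

```python
# Toy sequential version of the 'least used resource wins' rule:
# each agent in turn joins the currently least-used resource, so the
# final occupation counts differ by at most one.
def allocate(n_agents, n_resources):
    counts = [0] * n_resources
    choices = []
    for _ in range(n_agents):
        r = counts.index(min(counts))   # least used resource wins
        counts[r] += 1
        choices.append(r)
    return counts, choices

counts, choices = allocate(101, 7)
```

In the actual MG dynamics agents decide simultaneously from limited information, which is what makes grouping states and their bifurcations non-trivial; this greedy baseline shows only the balanced end state.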
Bayne, Jay S
2008-06-01
In support of a generalization of systems theory, this paper introduces a new approach in modeling complex distributed systems. It offers an analytic framework for describing the behavior of interactive cyberphysical systems (CPSs), which are networked stationary or mobile information systems responsible for the real-time governance of physical processes whose behaviors unfold in cyberspace. The framework is predicated on a cyberspace-time reference model comprising three spatial dimensions plus time. The spatial domains include geospatial, infospatial, and sociospatial references, the latter describing relationships among sovereign enterprises (rational agents) that choose voluntarily to organize and interoperate for individual and mutual benefit through geospatial (physical) and infospatial (logical) transactions. Of particular relevance to CPSs are notions of timeliness and value, particularly as they relate to the real-time governance of physical processes and engagements with other cooperating CPS. Our overarching interest, as with celestial mechanics, is in the formation and evolution of clusters of cyberspatial objects and the federated systems they form.
A new approach to the concept of "relevance" in information retrieval (IR).
Kagolovsky, Y; Möhr, J R
2001-01-01
The concept of "relevance" is the fundamental concept of information science in general and of information retrieval in particular. Although "relevance" is extensively used in the evaluation of information retrieval, there are considerable problems associated with reaching agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature, the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that a solution to these problems can be based on a conceptual IR framework built using a systems-analytic approach to IR. Using this framework, different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluation of "relevance" based on methods of semantics capturing and comparison is proposed.
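In practice, relevance-based evaluation reduces to comparing the retrieved set against a set of judged-relevant documents. A minimal sketch of the standard set-based measures built on a binary notion of relevance (document identifiers are illustrative):

```python
# Set-based IR evaluation with binary relevance judgements:
# precision = relevant retrieved / retrieved,
# recall    = relevant retrieved / relevant,
# F1        = their harmonic mean.
def evaluate(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f1 = evaluate(["d1", "d2", "d3", "d4"], ["d2", "d4", "d5"])
```

The authors' point is precisely that this binary, system-side notion is only one of several "relevance" relationships in the IR process; the sketch shows the baseline the richer framework generalizes.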
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
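The notional product at the heart of the general formula is easy to sketch. In the code below the parameter values and the first-order interdependency factor are illustrative only, not the article's calibrated methodology:

```python
# Asset-level risk as the notional product consequence x vulnerability
# x threat, with a simple first-order portfolio roll-up for
# interdependency effects (the factor is illustrative).
def asset_risk(consequence, vulnerability, threat):
    return consequence * vulnerability * threat

def portfolio_risk(assets, interdependency=0.0):
    """Sum of asset risks, inflated by a first-order interdependency factor."""
    base = sum(asset_risk(*a) for a in assets)
    return base * (1.0 + interdependency)

assets = [
    (100.0, 0.3, 0.01),   # (consequence, vulnerability, threat)
    (500.0, 0.1, 0.02),
]
total = portfolio_risk(assets, interdependency=0.25)
```

As the article notes, each parameter can be assessed at a high level or through detailed systems analysis; the structure of the calculation is the same either way.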
On nonlinear thermo-electro-elasticity.
Mehnert, Markus; Hossain, Mokarram; Steinmann, Paul
2016-06-01
Electro-active polymers (EAPs) for large actuations are nowadays well-known and promising candidates for producing sensors, actuators and generators. In general, polymeric materials are sensitive to differential temperature histories. During experimental characterizations of EAPs under electro-mechanically coupled loads, it is difficult to maintain a constant temperature, not only because of an external differential temperature history but also because of changes in internal temperature caused by the application of high electric loads. In this contribution, a thermo-electro-mechanically coupled constitutive framework is proposed based on the total energy approach. Starting from the relevant laws of thermodynamics, thermodynamically consistent constitutive equations are formulated. To demonstrate the performance of the proposed thermo-electro-mechanically coupled framework, a frequently used non-homogeneous boundary-value problem, i.e. the extension and inflation of a cylindrical tube, is solved analytically. The results illustrate the influence of various thermo-electro-mechanical couplings.
The NIH analytical methods and reference materials program for dietary supplements.
Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M
2007-09-01
The quality of botanical products is a major uncertainty facing consumers, clinicians, regulators, and researchers. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges, and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates validation of analytical methods, analytical standards, and reference materials.
NASA Astrophysics Data System (ADS)
Garofalo, David
2017-07-01
The idea that black hole spin is instrumental in the generation of powerful jets in active galactic nuclei and X-ray binaries is arguably the most contentious claim in black hole astrophysics. Because jets are thought to originate in the context of electromagnetism, and the modeling of Maxwell fields in curved spacetime around black holes is challenging, various approximations are made in numerical simulations that fall under the rubric of `ideal magnetohydrodynamics'. But the simplifications of this framework may struggle to capture relevant details of real astrophysical environments near black holes. In this work, we highlight tension between analytic and numerical results, specifically between the analytically derived conserved Noether currents for rotating black hole spacetimes and the results of general relativistic magnetohydrodynamic (GRMHD) simulations. While we cannot definitively attribute the issue to any specific approximation used in the numerical schemes, there seem to be natural candidates, which we explore. GRMHD notwithstanding, if electromagnetic fields around rotating black holes are brought to the hole by accretion, we show from first principles that prograde accreting disks likely experience weaker large-scale black hole-threading fields, implying weaker jets than in retrograde configurations.
Bounds for the price of discrete arithmetic Asian options
NASA Astrophysics Data System (ADS)
Vanmaele, M.; Deelstra, G.; Liinev, J.; Dhaene, J.; Goovaerts, M. J.
2006-01-01
In this paper the pricing of European-style discrete arithmetic Asian options with fixed and floating strike is studied by deriving analytical lower and upper bounds. In our approach we use a general technique for deriving upper (and lower) bounds for stop-loss premiums of sums of dependent random variables, as explained in Kaas et al. (Ins. Math. Econom. 27 (2000) 151-168), and additionally, the ideas of Rogers and Shi (J. Appl. Probab. 32 (1995) 1077-1088) and of Nielsen and Sandmann (J. Financial Quant. Anal. 38(2) (2003) 449-473). Through these bounds we create a unifying framework for European-style discrete arithmetic Asian options that generalizes several approaches in the literature and improves existing results. We obtain analytical and easily computable bounds. The aim of the paper is to offer advice on the appropriate choice of bounds given the parameters, to investigate the effect of different conditioning variables, and to compare their efficiency numerically. Several sets of numerical results are included. We also discuss hedging using these bounds. Moreover, our methods are applicable to a wide range of (pricing) problems involving a sum of dependent random variables.
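The bounding idea can be illustrated with two elementary brackets on the discrete arithmetic Asian call under Black-Scholes: the geometric-average price is a lower bound (the geometric mean never exceeds the arithmetic mean), and convexity of the payoff gives the upper bound ((1/n)Σ S_i − K)+ ≤ (1/n)Σ(S_i − K)+. This is a minimal sketch, not the sharper comonotonic/conditioning bounds of Kaas et al. and Rogers-Shi developed in the paper; all parameter values are hypothetical.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S0, K, r, sigma, t):
    """Black-Scholes price of a European call with maturity t."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return S0 * norm_cdf(d1) - K * math.exp(-r * t) * norm_cdf(d2)

def asian_call_bounds(S0, K, r, sigma, times):
    """Elementary lower/upper bounds for a discrete arithmetic-average Asian call.

    Lower: price of the geometric-average Asian call (AM >= GM).
    Upper: convexity, ((1/n) sum S_i - K)+ <= (1/n) sum (S_i - K)+.
    """
    n = len(times)
    T = times[-1]
    # ln(geometric average) is normal with mean m and variance s2
    m = math.log(S0) + (r - 0.5 * sigma**2) * sum(times) / n
    s2 = (sigma**2 / n**2) * sum(min(ti, tj) for ti in times for tj in times)
    s = math.sqrt(s2)
    d1 = (m - math.log(K) + s2) / s
    d2 = d1 - s
    lower = math.exp(-r * T) * (math.exp(m + 0.5 * s2) * norm_cdf(d1) - K * norm_cdf(d2))
    upper = math.exp(-r * T) * sum(
        math.exp(r * ti) * bs_call(S0, K, r, sigma, ti) for ti in times
    ) / n
    return lower, upper

lo, up = asian_call_bounds(S0=100.0, K=100.0, r=0.05, sigma=0.2,
                           times=[0.25, 0.5, 0.75, 1.0])
```

The paper's conditioning-based bounds are much tighter than this convexity upper bound; the sketch only shows the bracket structure.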
A Geographically Explicit Genetic Model of Worldwide Human-Settlement History
Liu, Hua; Prugnolle, Franck; Manica, Andrea; Balloux, François
2006-01-01
Currently available genetic and archaeological evidence is generally interpreted as supportive of a recent single origin of modern humans in East Africa. However, this is where the near consensus on human settlement history ends, and considerable uncertainty clouds any more detailed aspect of human colonization history. Here, we present a dynamic genetic model of human settlement history coupled with explicit geographical distances from East Africa, the likely origin of modern humans. We search for the best-supported parameter space by fitting our analytical prediction to genetic data that are based on 52 human populations analyzed at 783 autosomal microsatellite markers. This framework allows us to jointly estimate the key parameters of the expansion of modern humans. Our best estimates suggest an initial expansion of modern humans ∼56,000 years ago from a small founding population of ∼1,000 effective individuals. Our model further points to high growth rates in newly colonized habitats. The general fit of the model with the data is excellent. This suggests that coupling analytical genetic models with explicit demography and geography provides a powerful tool for making inferences on human-settlement history. PMID:16826514
Judge, T A; Bono, J E
2001-02-01
This article presents meta-analytic results on the relationship of 4 traits--self-esteem, generalized self-efficacy, locus of control, and emotional stability (low neuroticism)--with job satisfaction and job performance. With respect to job satisfaction, the estimated true score correlations were .26 for self-esteem, .45 for generalized self-efficacy, .32 for internal locus of control, and .24 for emotional stability. With respect to job performance, the correlations were .26 for self-esteem, .23 for generalized self-efficacy, .22 for internal locus of control, and .19 for emotional stability. In total, the results based on 274 correlations suggest that these traits are among the best dispositional predictors of job satisfaction and job performance. T. A. Judge, E. A. Locke, and C. C. Durham's (1997) theory of core self-evaluations is used as a framework for discussing similarities between the 4 traits and their relationships to satisfaction and performance.
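The true-score correlations above come from psychometric meta-analysis. As a rough illustration of the first step of such an analysis, a "bare bones" Hunter-Schmidt aggregation pools study correlations weighted by sample size and subtracts the expected sampling-error variance. The study values below are invented toy numbers, not data from the article, and corrections for measurement unreliability are omitted.

```python
def bare_bones_meta(rs, ns):
    """Hunter-Schmidt 'bare bones' meta-analysis of correlations.

    rs: observed study correlations; ns: study sample sizes.
    Returns the sample-size-weighted mean correlation and the residual
    between-study variance after removing expected sampling-error variance.
    """
    total_n = sum(ns)
    r_bar = sum(r * n for r, n in zip(rs, ns)) / total_n
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / total_n
    n_bar = total_n / len(ns)
    # expected variance of r across studies due to sampling error alone
    var_err = (1.0 - r_bar**2) ** 2 / (n_bar - 1.0)
    return r_bar, max(0.0, var_obs - var_err)

# three hypothetical studies: observed r and sample size N
r_bar, var_res = bare_bones_meta([0.30, 0.20, 0.28], [100, 150, 250])
```

Here the observed spread is smaller than the sampling-error expectation, so the residual variance is zero, the pattern that suggests a single population correlation.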
NASA Astrophysics Data System (ADS)
Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.
2016-10-01
Staphylococcus aureus is an important pathogen, with antimicrobial-resistant strains such as methicillin-resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We combine several existing analytical tools of image analysis in a new way to detect cellular and subcellular morphological features relevant to cell division from images of live pathogens sampled at millisecond time scales, with single-molecule detection precision. We demonstrate this approach using a fluorescent reporter, GFP fused to the protein EzrA, which localises to the mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials that target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.
A conformal truncation framework for infinite-volume dynamics
Katz, Emanuel; Khandker, Zuhair U.; Walters, Matthew T.
2016-07-28
Here, we present a new framework for studying conformal field theories deformed by one or more relevant operators. The original CFT is described in infinite volume using a basis of states with definite momentum, P, and conformal Casimir, C. The relevant deformation is then considered using lightcone quantization, with the resulting Hamiltonian expressed in terms of this CFT basis. Truncating to states with C ≤ C_max, one can numerically find the resulting spectrum, as well as other dynamical quantities, such as spectral densities of operators. This method requires the introduction of an appropriate regulator, which can be chosen to preserve the conformal structure of the basis. We check this framework in three dimensions for various perturbative deformations of a free scalar CFT, and for the case of a free O(N) CFT deformed by a mass term and a non-perturbative quartic interaction at large-N. In all cases, the truncation scheme correctly reproduces known analytic results. As a result, we also discuss a general procedure for generating a basis of Casimir eigenstates for a free CFT in any number of dimensions.
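The truncate-and-diagonalize strategy can be shown in miniature with ordinary Hamiltonian truncation in quantum mechanics: express H = H0 + λV in a basis of H0 eigenstates, keep the lowest n_max states, and diagonalize. The sketch below does this for an anharmonic oscillator (H = p²/2 + x²/2 + λx⁴), not a lightcone-quantized CFT, and checks the ground-state energy against first-order perturbation theory, E0 ≈ 1/2 + 3λ/4; the basis size and coupling are illustrative choices.

```python
import math

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def jacobi_eigenvalues(A, sweeps=50):
    """All eigenvalues of a real symmetric matrix via cyclic Jacobi rotations."""
    n = len(A)
    A = [row[:] for row in A]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-14:
                    continue
                # rotation angle that zeroes A[p][q]
                theta = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # rotate rows p and q
                    Apk, Aqk = A[p][k], A[q][k]
                    A[p][k] = c * Apk - s * Aqk
                    A[q][k] = s * Apk + c * Aqk
                for k in range(n):  # rotate columns p and q
                    Akp, Akq = A[k][p], A[k][q]
                    A[k][p] = c * Akp - s * Akq
                    A[k][q] = s * Akp + c * Akq
    return sorted(A[i][i] for i in range(n))

def truncated_spectrum(n_max=16, lam=0.01):
    """H = p^2/2 + x^2/2 + lam*x^4 truncated to the first n_max oscillator states."""
    x = [[0.0] * n_max for _ in range(n_max)]
    for n in range(n_max - 1):
        x[n][n + 1] = x[n + 1][n] = math.sqrt((n + 1) / 2.0)  # x = (a + a†)/√2
    x4 = matmul(matmul(x, x), matmul(x, x))
    H = [[lam * x4[i][j] + (i + 0.5 if i == j else 0.0) for j in range(n_max)]
         for i in range(n_max)]
    return jacobi_eigenvalues(H)

eigs = truncated_spectrum()
```

As in the paper's checks against known analytic results, the truncated spectrum converges rapidly for weak coupling because the perturbation only connects nearby basis states.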
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands require redesigning VA tools to consider performance and reliability in the analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
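The atomic operators named above, selection and aggregation, can be sketched on a toy attributed graph. The representation and operator signatures below are illustrative assumptions, not the paper's formal algebra: selection takes the induced subgraph of nodes matching a predicate, and aggregation collapses nodes sharing a key into weighted supernodes.

```python
def select(nodes, edges, predicate):
    """Selection: induced subgraph on nodes whose attributes satisfy predicate."""
    kept = {v: attrs for v, attrs in nodes.items() if predicate(attrs)}
    return kept, [(u, v) for (u, v) in edges if u in kept and v in kept]

def aggregate(nodes, edges, key):
    """Aggregation: collapse nodes sharing key(attrs) into supernodes whose
    edge weights count the original cross-group edges."""
    group = {v: key(attrs) for v, attrs in nodes.items()}
    super_nodes = {}
    for v in nodes:  # supernode 'size' = number of collapsed nodes
        super_nodes[group[v]] = super_nodes.get(group[v], 0) + 1
    super_edges = {}
    for u, v in edges:
        gu, gv = group[u], group[v]
        if gu != gv:
            e = tuple(sorted((gu, gv)))
            super_edges[e] = super_edges.get(e, 0) + 1
    return super_nodes, super_edges

# toy attributed graph (hypothetical data)
nodes = {"a": {"dept": "x"}, "b": {"dept": "x"},
         "c": {"dept": "y"}, "d": {"dept": "y"}}
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "c")]
sub_nodes, sub_edges = select(nodes, edges, lambda at: at["dept"] == "x")
sup_nodes, sup_edges = aggregate(nodes, edges, lambda at: at["dept"])
```

Composing such operators (select, then aggregate, then select again) is what gives an algebra its value for documenting and replaying an analysis.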
Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam
2015-04-01
We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade⩾3) and ED (Grade⩾1): maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) modeling and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Biological variables added to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for RB and 21.2% for ED models. As a proof-of-concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
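As a sketch of the dosimetric backbone of such models, the standard LKB formulation computes a generalized equivalent uniform dose (gEUD) from a dose-volume histogram and maps it to NTCP through a probit function. The genetic extension shown, scaling TD50 for a risk-SNP carrier, is one hypothetical dose-modifying-factor form, not necessarily the parameterization used in the study; all parameter values are illustrative.

```python
import math

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose from a (dose, fractional-volume) DVH."""
    a = 1.0 / n
    return sum(v * d**a for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, td50, m, n):
    """LKB model: NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def lkb_ntcp_with_snp(doses, volumes, td50, m, n, snp_carrier, delta=0.1):
    """Hypothetical genetic extension: a risk-SNP carrier has TD50 reduced by delta."""
    scale = 1.0 - delta if snp_carrier else 1.0
    return lkb_ntcp(doses, volumes, td50 * scale, m, n)

# uniform irradiation of the whole organ at exactly TD50 gives NTCP = 0.5
p_mid = lkb_ntcp([80.0], [1.0], td50=80.0, m=0.15, n=0.1)
p_high = lkb_ntcp([90.0], [1.0], td50=80.0, m=0.15, n=0.1)
p_snp = lkb_ntcp_with_snp([80.0], [1.0], 80.0, 0.15, 0.1, snp_carrier=True)
```

Fitting TD50, m, n (and any genetic modifiers) by maximum likelihood against observed toxicity outcomes is then what the abstract's "generalized LKB" step performs.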
Ren, Lei; Howard, David; Ren, Luquan; Nester, Chris; Tian, Limei
2010-01-19
The objective of this paper is to develop an analytical framework for representing the ankle-foot kinematics by modelling the foot as a rollover rocker, which can not only be used as a generic tool for general gait simulation but also allows for case-specific modelling if required. Previously, the rollover models used in gait simulation have often been based on specific functions of a simple form. In contrast, the analytical model described here is general in form: the effective foot rollover shape can be represented by any polar function rho=rho(phi). Furthermore, a normalized generic foot rollover model has been established based on a normative foot rollover shape dataset of 12 normal healthy subjects. To evaluate model accuracy, the predicted ankle motions and the centre of pressure (CoP) were compared with measurement data for both subject-specific and general cases. The results demonstrated that the ankle joint motions in both vertical and horizontal directions (relative RMSE approximately 10%) and the CoP (relative RMSE approximately 15% for most of the subjects) are accurately predicted over most of the stance phase (from 10% to 90% of stance). However, we found that the foot cannot be very accurately represented by a rollover model just after heel strike (HS) and just before toe off (TO), probably due to shear deformation of the foot plantar tissues (ankle motion can occur without any foot rotation). The proposed foot rollover model can be used in both inverse and forward dynamics gait simulation studies and may also find applications in rehabilitation engineering. Copyright 2009 Elsevier Ltd. All rights reserved.
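The geometric core of a rollover-rocker model is that, when the effective shape rho(phi) rolls on the ground without slipping, the centre of pressure advances by the shape's arc length. The snippet below evaluates that arc length numerically for an arbitrary rho(phi) and checks it against the circular case (CoP travel = R times the rotation angle). The 0.30 m radius and angle range are made-up values, and the full paper model additionally maps shank rotation to ankle kinematics.

```python
import math

def cop_travel(rho, phi_start, phi_end, steps=2000):
    """Centre-of-pressure travel while a rollover shape rho(phi) rolls
    without slipping: the CoP advances by the arc length swept along the shape."""
    s, dphi = 0.0, (phi_end - phi_start) / steps
    for i in range(steps):
        phi = phi_start + (i + 0.5) * dphi     # midpoint rule
        drho = (rho(phi + 1e-6) - rho(phi - 1e-6)) / 2e-6  # numerical derivative
        s += math.sqrt(rho(phi) ** 2 + drho**2) * dphi
    return s

# circular rocker of radius 0.30 m (hypothetical): travel = R * rotation angle
travel_circle = cop_travel(lambda phi: 0.30, -0.2, 0.3)
# a mildly non-circular shape is handled by the same general polar form
travel_general = cop_travel(lambda phi: 0.30 + 0.02 * math.cos(phi), -0.2, 0.3)
```

Supporting any polar function in this way is exactly the generality the abstract claims over fixed simple-function rollover shapes.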
Determining when to change course in management actions.
Ng, Chooi Fei; McCarthy, Michael A; Martin, Tara G; Possingham, Hugh P
2014-12-01
Time is of the essence in conservation biology. To secure the persistence of a species, we need to understand how to balance time spent among different management actions. A new and simple method to test the efficacy of a range of conservation actions is required. Thus, we devised a general theoretical framework to help determine whether to test a new action and when to cease a trial and revert to an existing action if the new action did not perform well. The framework involves constructing a general population model under the different management actions and specifying a management objective. By maximizing the management objective, we could generate an analytical solution that identifies the optimal timing of when to change management action. We applied the analytical solution to the case of the Christmas Island pipistrelle bat (Pipistrelle murrayi), a species for which captive breeding might have prevented its extinction. For this case, we used our model to determine whether to start a captive breeding program and when to stop a captive breeding program and revert to managing the species in the wild, given that the management goal is to maximize the chance of reaching a target wild population size. For the pipistrelle bat, captive breeding was to start immediately and it was desirable to place the species in captivity for the entire management period. The optimal time to revert to managing the species in the wild was driven by several key parameters, including the management goal, management time frame, and the growth rates of the population under different management actions. Knowing when to change management actions can help conservation managers act in a timely fashion to avoid species extinction. © 2014 Society for Conservation Biology.
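A stripped-down version of this optimal-timing problem: let the population grow exponentially at rate r_trial while the new action is trialled and at r_existing afterwards, and maximize the final population over the switch time. The optimum is then bang-bang (trial for the whole horizon, or not at all), echoing the pipistrelle result that captivity was preferred for the entire management period. The model and rates below are toy assumptions, far simpler than the paper's framework.

```python
import math

def final_population(n0, r_trial, r_existing, t_switch, horizon):
    """Final population if the new action runs until t_switch and the
    existing action runs for the rest of the management horizon."""
    return n0 * math.exp(r_trial * t_switch + r_existing * (horizon - t_switch))

def best_switch_time(n0, r_trial, r_existing, horizon, steps=200):
    """Grid search over candidate switch times for the best final population."""
    times = [horizon * i / steps for i in range(steps + 1)]
    return max(times,
               key=lambda t: final_population(n0, r_trial, r_existing, t, horizon))

# if the trial outperforms the status quo, never revert; otherwise never start
t_keep = best_switch_time(100.0, r_trial=0.05, r_existing=-0.02, horizon=10.0)
t_drop = best_switch_time(100.0, r_trial=-0.10, r_existing=0.02, horizon=10.0)
```

Richer objectives (e.g., probability of reaching a target wild population, as in the paper) break the bang-bang structure and make interior switch times optimal.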
A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets
ERIC Educational Resources Information Center
Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.
2013-01-01
This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Fock space, symbolic algebra, and analytical solutions for small stochastic systems.
Santos, Fernando A N; Gadêlha, Hermes; Gaffney, Eamonn A
2015-12-01
Randomness is ubiquitous in nature. From single-molecule biochemical reactions to macroscale biological systems, stochasticity permeates individual interactions and often regulates emergent properties of the system. While such systems are regularly studied from a modeling viewpoint using stochastic simulation algorithms, numerous potential analytical tools can be inherited from statistical and quantum physics, replacing randomness due to quantum fluctuations with low-copy-number stochasticity. Nevertheless, classical studies have remained limited to the abstract level, demonstrating a more general applicability and equivalence between systems in physics and biology rather than exploiting the physics tools to study biological systems. Here the Fock space representation, used in quantum mechanics, is combined with the symbolic algebra of creation and annihilation operators to consider explicit solutions for the chemical master equations describing small, well-mixed, biochemical, or biological systems. This is illustrated with an exact solution for a Michaelis-Menten single enzyme interacting with limited substrate, including a consideration of very short time scales, which emphasizes when stiffness is present even for small copy numbers. Furthermore, we present a general matrix representation for Michaelis-Menten kinetics with an arbitrary number of enzymes and substrates that, following diagonalization, leads to the solution of this ubiquitous, nonlinear enzyme kinetics problem. For this, flexible symbolic Maple code is provided, demonstrating the prospective advantages of this framework compared to stochastic simulation algorithms. This further highlights the possibilities for analytically based studies of stochastic systems in biology and chemistry using tools from theoretical quantum physics.
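For intuition about what the operator methods solve, the sketch below integrates the chemical master equation directly for the simplest case, first-order degradation A → ∅ at per-molecule rate k, and checks the mean against the exact solution (the copy number stays binomially distributed with survival probability e^(−kt)). This is plain numerical integration, not the Fock-space/ladder-operator algebra of the paper, and far simpler than its Michaelis-Menten treatment; rates and copy numbers are illustrative.

```python
import math

def cme_degradation(n0, k, t, steps=2000):
    """Integrate the chemical master equation for A -> 0 (rate k per molecule)
    on the truncated state space {0, ..., n0} with RK4, starting from P = delta_{n0}.

    dP_n/dt = k*(n+1)*P_{n+1} - k*n*P_n
    """
    def rhs(P):
        d = [0.0] * (n0 + 1)
        for n in range(n0 + 1):
            d[n] -= k * n * P[n]              # probability leaving state n
            if n + 1 <= n0:
                d[n] += k * (n + 1) * P[n + 1]  # inflow from state n+1
        return d

    P = [0.0] * n0 + [1.0]   # all probability starts at n0 copies
    h = t / steps
    for _ in range(steps):   # classical RK4 step for dP/dt = rhs(P)
        k1 = rhs(P)
        k2 = rhs([p + 0.5 * h * a for p, a in zip(P, k1)])
        k3 = rhs([p + 0.5 * h * a for p, a in zip(P, k2)])
        k4 = rhs([p + h * a for p, a in zip(P, k3)])
        P = [p + h * (a + 2 * b + 2 * c + d) / 6.0
             for p, a, b, c, d in zip(P, k1, k2, k3, k4)]
    return P

P = cme_degradation(n0=10, k=1.0, t=0.5)
mean = sum(n * p for n, p in enumerate(P))  # exact mean is n0 * exp(-k*t)
```

The operator approach delivers the full distribution symbolically; here the binomial result serves only as a correctness check for the numerical generator.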
Environmental Stewardship: A Conceptual Review and Analytical Framework.
Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H
2018-04-01
There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.
Actual Romanian research in post-Newtonian dynamics
NASA Astrophysics Data System (ADS)
Mioc, V.; Stavinschi, M.
2007-05-01
We survey the recent Romanian results in the study of the two-body problem in post-Newtonian fields. Such a field is characterized, in general, by a potential of the form U(q)=|q|^{-1} plus an additional term (typically, though not necessarily, small). We distinguish some classes of post-Newtonian models: relativistic (Schwarzschild, Fock, Einstein PN, Reissner-Nordström, Schwarzschild - de Sitter, etc.) and nonrelativistic (Manev, Mücket-Treder, Seeliger, gravito-elastic, etc.). Generalized models (the zonal-satellite problem, quasihomogeneous fields), as well as special cases (anisotropic Manev-type and Schwarzschild-type models, Popovici or Popovici-Manev photogravitational problem), were also tackled. The methods used in such studies are various: analytical (using mainly the theory of perturbations, but also other theories: functions of complex variable, variational calculus, etc.), geometric (qualitative approach of the theory of dynamical systems), and numerical (especially using the Poincaré-section technique). The areas of interest and the general results obtained focus on: exact or approximate analytical solutions; characteristics of local flows (especially at limit situations: collision and escape); quasiperiodic and periodic orbits; equilibria; symmetries; chaoticity; geometric description of the global flow (and physical interpretation of the phase-space structure). We emphasize some special features, which cannot be met within the Newtonian framework: black-hole effect, oscillatory collisions, radial librations, bounded orbits for nonnegative energy, existence of unstable circular motion (or unstable rest), symmetric periodic orbits within anisotropic models, etc.
Information Tailoring Enhancements for Large Scale Social Data
2016-03-15
Work performed within this reporting period: implemented temporal analysis algorithms for advanced analytics in Scraawl; implemented the backend web service design for the temporal analysis and created a prototype GUI web service for the Scraawl analytics dashboard; upgraded the Scraawl computational framework to increase
NASA Astrophysics Data System (ADS)
Giuliani, Matteo; Herman, Jonathan D.; Castelletti, Andrea; Reed, Patrick M.
2014-05-01
Current water reservoir operating policies are facing growing water demands as well as increasing uncertainties associated with a changing climate. However, policy inertia and myopia strongly limit the possibility of adapting current water reservoir operations to ongoing change. Historical agreements and regulatory constraints limit the rate at which reservoir operations are innovated, creating policy inertia, where water institutions are unlikely to change their current practices in the absence of dramatic failures. Yet, no guarantee exists that historical management policies will not fail in coming years. As for policy myopia, although it has long been recognized that water reservoir systems are generally framed in heterogeneous socio-economic contexts involving a myriad of conflicting, non-commensurable operating objectives, the broader understanding of the multi-objective consequences of current operating rules as well as their vulnerability to hydroclimatic uncertainties is severely limited. This study proposes a decision analytic framework to overcome both policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key tradeoffs between alternative policies for balancing evolving demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to the system's competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. The proposed framework initially uses available streamflow observations to implicitly identify the current but unknown operating policy of Conowingo Dam.
The quality of the identified baseline policy was validated by its ability to replicate historical release dynamics. Starting from this baseline policy, we then combine evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the tradeoffs within the Lower Susquehanna. Results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the reliability of the reservoir's competing demands. The proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties, while also better addressing the tradeoffs across the Conowingo Dam's multi-sector services.
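The first step of the framework, implicitly identifying the current operating policy from observed data, can be caricatured as regression of releases on system state. The sketch below fits a linear release rule by ordinary least squares (normal equations solved with Gaussian elimination) on synthetic observations generated from a known rule; the actual study infers a far richer policy and validates it against historical release dynamics, and all numbers here are invented.

```python
def fit_linear_policy(storage, inflow, release):
    """Least-squares fit of a release rule r = a*storage + b*inflow + c,
    solved via the normal equations with Gaussian elimination."""
    rows = [[s, q, 1.0] for s, q in zip(storage, inflow)]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, release)) for i in range(3)]
    M = [xtx[i] + [xty[i]] for i in range(3)]
    for col in range(3):  # forward elimination with partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    coef = [0.0] * 3
    for i in (2, 1, 0):  # back substitution
        coef[i] = (M[i][3] - sum(M[i][j] * coef[j] for j in range(i + 1, 3))) / M[i][i]
    return coef

# synthetic observations from a known rule: release = 0.4*storage + 0.2*inflow + 5
storage = [100.0, 120.0, 90.0, 150.0, 80.0, 130.0]
inflow = [10.0, 30.0, 20.0, 5.0, 25.0, 15.0]
release = [0.4 * s + 0.2 * q + 5.0 for s, q in zip(storage, inflow)]
a, b, c = fit_linear_policy(storage, inflow, release)
```

Once a baseline policy is identified this way, its parameters become the starting point the many-objective search perturbs to expose trade-offs.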
NASA Astrophysics Data System (ADS)
Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P.
2014-04-01
This study contributes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key trade-offs between alternative policies for balancing competing demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. We have identified a baseline operating policy for the Conowingo Dam that closely reproduces the dynamics of current releases and flows for the Lower Susquehanna and thus can be used to represent the preference structure guiding current operations. Starting from this baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the trade-offs within the Lower Susquehanna. Our results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the system's reliability in meeting the reservoir's competing demands. Our proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties while also better addressing the trade-offs across the Conowingo Dam's multisector services.
The Climate Data Analytic Services (CDAS) Framework.
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2016-12-01
Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using either direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
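The core trick behind processing huge gridded datasets in limited memory is a chunked map-reduce: each worker reduces one chunk to a small summary and summaries are merged, so the full array is never held at once. The plain-Python sketch below illustrates that pattern only; it is not the CDAS API, and the chunk values are made up.

```python
from functools import reduce

# Illustrative chunked map-reduce for a global mean: each chunk is
# reduced to a (sum, count) pair, and pairs are merged pairwise.
chunks = [[280.1, 281.3, 279.8], [282.0, 280.5], [278.9, 281.1, 280.0]]

def map_chunk(chunk):
    """Per-worker reduction: one chunk -> (sum, count)."""
    return (sum(chunk), len(chunk))

def merge(a, b):
    """Associative merge of two partial summaries."""
    return (a[0] + b[0], a[1] + b[1])

total, count = reduce(merge, map(map_chunk, chunks))
print(round(total / count, 2))  # global mean of the chunked field
```

In a Spark deployment the `map`/`reduce` calls would run across executors, but the algebra of the summaries is the same.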
Network Community Detection based on the Physarum-inspired Computational Framework.
Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili
2016-12-13
Community detection is a crucial problem in the structural analysis of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving such a problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with a higher accuracy and a lower computational cost still remains an open problem. Inspired by the computational capability and positive-feedback mechanism exhibited during the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be distinguished from the intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these two capabilities are used to improve the efficiency of the original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to evaluate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.
NASA Astrophysics Data System (ADS)
Song, Y.; Gui, Z.; Wu, H.; Wei, Y.
2017-09-01
Analysing spatiotemporal distribution patterns and their dynamics for different industries can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support the visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and the shifting route of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.
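The two summary statistics named above are simple to compute per year and industry category. The sketch below shows the gravity center and one common standard deviational ellipse variant (axis-scaling conventions differ between implementations, so this is indicative, not the paper's exact formula); the sample points are made up.

```python
import math

def gravity_center(points):
    """Mean center of a set of (x, y) points."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def std_deviational_ellipse(points):
    """Center, orientation, and axis lengths of one common SDE variant:
    the orientation is the principal axis of the point covariance."""
    cx, cy = gravity_center(points)
    dx = [x - cx for x, _ in points]
    dy = [y - cy for _, y in points]
    n = len(points)
    a = sum(xi * xi for xi in dx) - sum(yi * yi for yi in dy)
    c = 2.0 * sum(xi * yi for xi, yi in zip(dx, dy))
    theta = 0.5 * math.atan2(c, a)  # major-axis orientation
    sx = math.sqrt(sum((xi * math.cos(theta) + yi * math.sin(theta)) ** 2
                       for xi, yi in zip(dx, dy)) / n)
    sy = math.sqrt(sum((-xi * math.sin(theta) + yi * math.cos(theta)) ** 2
                       for xi, yi in zip(dx, dy)) / n)
    return (cx, cy), theta, sx, sy

pts = [(0, 0), (4, 1), (2, 3), (6, 4), (8, 2)]
center, theta, sx, sy = std_deviational_ellipse(pts)
print(center)
```

Computing this per (year, category) group and joining the centers year over year yields the shifting route of gravity centers; the per-group independence is what makes the Spark parallelization straightforward.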
An Analytical Framework for the Steady State Impact of Carbonate Compensation on Atmospheric CO2
NASA Astrophysics Data System (ADS)
Omta, Anne Willem; Ferrari, Raffaele; McGee, David
2018-04-01
The deep-ocean carbonate ion concentration impacts the fraction of the marine calcium carbonate production that is buried in sediments. This gives rise to the carbonate compensation feedback, which is thought to restore the deep-ocean carbonate ion concentration on multimillennial timescales. We formulate an analytical framework to investigate the impact of carbonate compensation under various changes in the carbon cycle relevant for anthropogenic change and glacial cycles. Using this framework, we show that carbonate compensation amplifies by 15-20% changes in atmospheric CO2 resulting from a redistribution of carbon between the atmosphere and ocean (e.g., due to changes in temperature, salinity, or nutrient utilization). A counterintuitive result emerges when the impact of organic matter burial in the ocean is examined. The organic matter burial first leads to a slight decrease in atmospheric CO2 and an increase in the deep-ocean carbonate ion concentration. Subsequently, enhanced calcium carbonate burial leads to outgassing of carbon from the ocean to the atmosphere, which is quantified by our framework. Results from simulations with a multibox model including the minor acids and bases important for the ocean-atmosphere exchange of carbon are consistent with our analytical predictions. We discuss the potential role of carbonate compensation in glacial-interglacial cycles as an example of how our theoretical framework may be applied.
Closed-form solutions and scaling laws for Kerr frequency combs
Renninger, William H.; Rakich, Peter T.
2016-01-01
A single closed-form analytical solution of the driven nonlinear Schrödinger equation is developed, reproducing a large class of the behaviors in Kerr-comb systems, including bright solitons, dark solitons, and a large class of periodic wavetrains. From this analytical framework, a Kerr-comb area theorem and a pump-detuning relation are developed, providing new insights into soliton- and wavetrain-based combs along with concrete design guidelines for both. This new area theorem reveals significant deviation from the conventional soliton area theorem, which is crucial to understanding cavity solitons in certain limits. Moreover, these closed-form solutions represent the first step towards an analytical framework for wavetrain formation, and reveal new parameter regimes for enhanced Kerr-comb performance. PMID:27108810
Developing an Analytical Framework for Argumentation on Energy Consumption Issues
ERIC Educational Resources Information Center
Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.
2015-01-01
In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…
ERIC Educational Resources Information Center
Lam, Gigi
2014-01-01
A socio-psychological analytical framework is adopted to illuminate the relation between socioeconomic status and academic achievement. The framework emphasizes incorporating micro-level familial factors into the macro-level factor of the tracking system. Initially, children of poor families often lack a major prerequisite: diminution of cognitive…
European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective
ERIC Educational Resources Information Center
Bouder, Annie
2008-01-01
Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework is examined from three perspectives: historical, analytical, and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
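The standard way to apply K-M to left-censored nondetects is the "flipping" trick: subtract every value from a constant at least as large as the maximum, so nondetects become right-censored, run ordinary K-M, then flip back. The Python sketch below (not the authors' S-language code) illustrates this with made-up concentrations.

```python
def km_left_censored(values, censored):
    """Kaplan-Meier ECDF for left-censored data via flipping.
    `censored[i]` is True when values[i] is a nondetect ("< value").
    Returns (value, P(X < value)) pairs at detected values."""
    M = max(values)
    flipped = [(M - v, c) for v, c in zip(values, censored)]
    flipped.sort()  # at ties, events (False) sort before censorings
    at_risk, surv, ecdf = len(flipped), 1.0, []
    for t, is_cens in flipped:
        if not is_cens:  # detected value: a K-M "event"
            surv *= (at_risk - 1) / at_risk
            ecdf.append((M - t, surv))  # back on the original scale
        at_risk -= 1
    return ecdf[::-1]  # ascending in original concentration

# concentrations; True marks a nondetect reported as "< value"
vals = [0.5, 1.0, 1.0, 2.0, 3.0]
cens = [True, False, True, False, False]
for v, p in km_left_censored(vals, cens):
    print(v, round(p, 3))
```

Note that, exactly as the abstract warns, the resulting step function is defined only over the observed data range; nothing is extrapolated below the lowest detection limit.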
McNealy, Kim R.; Al-Khattab, Halima; Carter-Harris, Lisa; Oruche, Ukamaka Marian; Naanyu, Violet; Draucker, Claire Burke
2017-01-01
Background: AIDS-related illness is the leading cause of mortality for adolescents in sub-Saharan Africa. Together, Kenya, Tanzania, and Uganda account for 21% of HIV-infected adolescents in sub-Saharan Africa. The United Nations framework for addressing the epidemic among adolescents calls for comprehensive sexual and reproductive health education. These HIV prevention efforts could be informed by a synthesis of existing research about the formal and informal sexual education of adolescents in countries experiencing generalized epidemics. The purpose of this study was to describe the process of sexual learning among East African adolescents living in the context of generalized HIV epidemics. Methods: Qualitative metasynthesis, a systematic procedure for integrating the results of multiple qualitative studies addressing a similar phenomenon, was used. Thirty-two research reports met study inclusion criteria. The reports were assessed in a four-step analytic process: appraisal, classification of findings, synthesis of findings, and construction of a framework depicting the process of sexual learning in this population. Results: The framework includes three phases of sexual learning: 1) being primed for sex, 2) making sense of sex, and 3) having sexual experiences. Adolescents were primed for sex through gender norms, cultural practices, and economic structures as well as through conversations and formal instruction. They made sense of sex by acquiring information about sexual intercourse, reproduction and pregnancy, sexually transmitted infections, and relationships and by developing a variety of beliefs and attitudes about these topics. Some adolescents described having sexual experiences that met wants or needs, but many experienced sex that was coerced or violent. Whether sex was wanted, coerced, or violent, adolescents experienced worry about sexually transmitted infections or premarital pregnancy.
Conclusions: The three phases of sexual learning interact to shape adolescents' sexual lives and their risk for HIV infection. This framework will contribute to the development of sexual education programs that address HIV risk within the broader context of sexual learning. PMID:28278210
Knopf, Amelia S; McNealy, Kim R; Al-Khattab, Halima; Carter-Harris, Lisa; Oruche, Ukamaka Marian; Naanyu, Violet; Draucker, Claire Burke
2017-01-01
AIDS-related illness is the leading cause of mortality for adolescents in sub-Saharan Africa. Together, Kenya, Tanzania, and Uganda account for 21% of HIV-infected adolescents in sub-Saharan Africa. The United Nations framework for addressing the epidemic among adolescents calls for comprehensive sexual and reproductive health education. These HIV prevention efforts could be informed by a synthesis of existing research about the formal and informal sexual education of adolescents in countries experiencing generalized epidemics. The purpose of this study was to describe the process of sexual learning among East African adolescents living in the context of generalized HIV epidemics. Qualitative metasynthesis, a systematic procedure for integrating the results of multiple qualitative studies addressing a similar phenomenon, was used. Thirty-two research reports met study inclusion criteria. The reports were assessed in a four-step analytic process: appraisal, classification of findings, synthesis of findings, and construction of a framework depicting the process of sexual learning in this population. The framework includes three phases of sexual learning: 1) being primed for sex, 2) making sense of sex, and 3) having sexual experiences. Adolescents were primed for sex through gender norms, cultural practices, and economic structures as well as through conversations and formal instruction. They made sense of sex by acquiring information about sexual intercourse, reproduction and pregnancy, sexually transmitted infections, and relationships and by developing a variety of beliefs and attitudes about these topics. Some adolescents described having sexual experiences that met wants or needs, but many experienced sex that was coerced or violent. Whether sex was wanted, coerced, or violent, adolescents experienced worry about sexually transmitted infections or premarital pregnancy. 
The three phases of sexual learning interact to shape adolescents' sexual lives and their risk for HIV infection. This framework will contribute to the development of sexual education programs that address HIV risk within the broader context of sexual learning.
Precise control of molecular dynamics with a femtosecond frequency comb.
Pe'er, Avi; Shapiro, Evgeny A; Stowe, Matthew C; Shapiro, Moshe; Ye, Jun
2007-03-16
We present a general and highly efficient scheme for performing narrow-band Raman transitions between molecular vibrational levels using a coherent train of weak pump-dump pairs of shaped ultrashort pulses. The use of weak pulses permits an analytic description within the framework of coherent control in the perturbative regime, while coherent accumulation of many pulse pairs enables near unity transfer efficiency with a high spectral selectivity, thus forming a powerful combination of pump-dump control schemes and the precision of the frequency comb. Simulations verify the feasibility and robustness of this concept, with the aim to form deeply bound, ultracold molecules.
Stopping power of an electron gas with anisotropic temperature
NASA Astrophysics Data System (ADS)
Khelemelia, O. V.; Kholodov, R. I.
2016-04-01
A general theory of motion of a heavy charged particle in an electron gas with an anisotropic velocity distribution is developed within the quantum-field method. The analytical expressions for the dielectric susceptibility and the stopping power of the electron gas differ in no way from the well-known classical formulas in the approximations of large and small velocities. The stopping power of the electron gas with anisotropic temperature in the framework of the quantum-field method is numerically calculated for an arbitrary angle between the directions of motion of the projectile particle and the electron beam. The results of the numerical calculations are compared with the dielectric model approach.
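For reference, in the isotropic limit the dielectric (linear-response) description that this quantum-field treatment is compared against gives the textbook stopping-power expression (this is the standard formula, not the paper's anisotropic result):

```latex
S(v) \;=\; \frac{2 Z^2 e^2}{\pi v^2}
\int_0^{\infty} \frac{\mathrm{d}k}{k}
\int_0^{k v} \mathrm{d}\omega \,\omega\,
\operatorname{Im}\!\left[\frac{-1}{\varepsilon(k,\omega)}\right]
```

where $Ze$ and $v$ are the projectile charge and velocity and $\varepsilon(k,\omega)$ is the longitudinal dielectric function; the anisotropic-temperature case effectively replaces $\varepsilon$ with a susceptibility that depends on the angle between the projectile velocity and the beam axis.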
Scattering of a high-order Bessel beam by a spheroidal particle
NASA Astrophysics Data System (ADS)
Han, Lu
2018-05-01
Within the framework of generalized Lorenz-Mie theory (GLMT), scattering from a homogeneous spheroidal particle illuminated by a high-order Bessel beam is formulated analytically. The high-order Bessel beam is expanded in terms of spheroidal vector wave functions, where the spheroidal beam shape coefficients (BSCs) are computed conveniently using an intrinsic method. Numerical results concerning the scattered field in the far zone are displayed for various parameters of the incident Bessel beam and of the scatterer. These results are expected to provide useful insights into the scattering of a Bessel beam by nonspherical particles and particle manipulation applications using Bessel beams.
On-Orbit Range Set Applications
NASA Astrophysics Data System (ADS)
Holzinger, M.; Scheeres, D.
2011-09-01
History and methodology of Δv range set computation is briefly reviewed, followed by a short summary of the Δv optimal spacecraft servicing problem literature. Service vehicle placement is approached from a Δv range set viewpoint, providing a framework under which the analysis becomes quite geometric and intuitive. The optimal servicing tour design problem is shown to be a specific instantiation of the metric Traveling Salesman Problem (TSP), which in general is an NP-hard problem. The Δv-TSP is argued to be quite similar to the Euclidean-TSP, for which approximate optimal solutions may be found in polynomial time. Applications of range sets are demonstrated using analytical and simulation results.
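Because the servicing tour reduces to a metric TSP, cheap construction heuristics already give usable tours. The sketch below is a plain nearest-neighbor heuristic over a symmetric cost matrix standing in for pairwise Δv costs; the matrix values are hypothetical and this is not the paper's solver.

```python
def nearest_neighbor_tour(cost, start=0):
    """Greedy tour construction over a symmetric cost matrix: always
    visit the cheapest unvisited target next. A crude stand-in for
    the approximate metric-TSP methods the tour design calls for."""
    n = len(cost)
    unvisited = set(range(n)) - {start}
    tour, cur = [start], start
    while unvisited:
        nxt = min(unvisited, key=lambda j: cost[cur][j])
        tour.append(nxt)
        unvisited.remove(nxt)
        cur = nxt
    total = sum(cost[tour[i]][tour[i + 1]] for i in range(n - 1))
    return tour, total

# hypothetical pairwise delta-v costs (km/s) between 4 client orbits
dv = [[0.0, 1.2, 2.5, 1.9],
      [1.2, 0.0, 1.1, 2.2],
      [2.5, 1.1, 0.0, 0.7],
      [1.9, 2.2, 0.7, 0.0]]
tour, total = nearest_neighbor_tour(dv)
print(tour, round(total, 2))
```

For a metric cost (which Δv need not strictly be), guarantees improve with 2-opt or Christofides-style refinements; the greedy tour is just the cheapest starting point.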
NASA Astrophysics Data System (ADS)
Zanotto, Simone; Tredicucci, Alessandro
2016-04-01
In this article we discuss a model describing key features concerning the lineshapes and the coherent absorption conditions in Fano-resonant dissipative coupled oscillators. The model treats on the same footing the weak and strong coupling regimes, and includes the critical coupling concept, which is of great relevance in numerous applications; in addition, the role of asymmetry is thoroughly analyzed. Due to the wide generality of the model, which can be adapted to various frameworks like nanophotonics, plasmonics, and optomechanics, we envisage that the analytical formulas presented here will be crucial to effectively design devices and to interpret experimental results.
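The lineshapes in such models come from the steady state of two coupled damped modes, one driven ("bright") and one not ("dark"). The sketch below solves that generic 2x2 coupled-mode system; all parameter values are illustrative and the model is a toy stand-in, not the article's exact formulation.

```python
import numpy as np

def amplitudes(omega, w1=1.0, w2=1.05, g1=0.05, g2=0.005, g=0.03, f=1.0):
    """Steady-state amplitudes of a driven bright mode (frequency w1,
    damping g1) coupled with strength g to a dark mode (w2, g2),
    driven at frequency omega with force f. Toy parameters."""
    A = np.array([[w1 - omega - 1j * g1, g],
                  [g, w2 - omega - 1j * g2]])
    b = np.array([f, 0.0])
    return np.linalg.solve(A, b)

# bright-mode response across the Fano feature
omegas = np.linspace(0.8, 1.3, 5)
resp = [abs(amplitudes(w)[0]) for w in omegas]
print([round(r, 2) for r in resp])
```

The asymmetric dip near the dark-mode frequency (here between the third and fourth sample points) is the Fano signature; sweeping the damping ratios moves the system between the weak-coupling, strong-coupling, and critical-coupling regimes the article analyzes.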
Madden, M; Batey, P W J
1983-05-01
Some problems associated with demographic-economic forecasting include finding models appropriate for a declining economy with unemployment, using a multiregional approach in an interregional model, finding a way to show differential consumption while endogenizing unemployment, and avoiding unemployment inconsistencies. The solution to these problems involves the construction of an activity-commodity framework, locating it within a group of forecasting models, and indicating possible routes towards dynamization of the framework. The authors demonstrate the range of impact multipliers that can be derived from the framework and show how these multipliers relate to Leontief input-output multipliers. It is shown that desired population distribution may be obtained by selecting instruments from the economic sphere to produce, through the constraints vector of an activity-commodity framework, targets selected from demographic activities. The next step in this process, empirical exploitation, was carried out by the authors in the United Kingdom, linking an input-output model with a wide selection of demographic and demographic-economic variables. The generally tenuous control which government has over any variables in systems of this type, especially in market economies, makes application in the policy field of the optimization approach a partly conjectural exercise, although the analytic capacity of the approach can provide clear indications of policy directions.
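The Leontief input-output multipliers the abstract compares against are the column sums of the Leontief inverse: with technical-coefficient matrix A and final demand d, total output satisfies x = Ax + d, so x = (I - A)^-1 d. The two-sector A below is illustrative only.

```python
import numpy as np

# Technical coefficients: entry (i, j) is the input from sector i
# needed per unit of output of sector j (illustrative values).
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

# Leontief inverse: total (direct + indirect) output requirements
L = np.linalg.inv(np.eye(2) - A)

# Output multipliers: total output generated economy-wide by one
# unit of final demand delivered to each sector.
multipliers = L.sum(axis=0)
print(np.round(multipliers, 3))
```

The activity-commodity framework generalizes this by adding demographic activities to the accounting, so its impact multipliers carry population as well as output effects.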
Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X.; Sartipy, Peter; Synnergren, Jane
2017-01-01
The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. 
The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics. PMID:28654683
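The clustering step that separates expression profiles (early-transient vs. late-rising genes, etc.) can be sketched with plain k-means on per-gene z-scored time courses. The code below is a toy stand-in for the study's pipeline: the "expression profiles" are synthetic, and the k-means implementation is minimal.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Plain k-means: assign points to nearest center, recompute
    centers, repeat. Minimal sketch, no convergence test."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# synthetic 11-day "expression profiles": early-transient vs late-rising
t = np.linspace(0, 10, 11)
early = np.exp(-((t - 2) ** 2) / 2)          # peaks around day 2
late = 1 / (1 + np.exp(-(t - 7)))            # rises around day 7
X = np.vstack([early + 0.05 * rng.normal(size=11) for _ in range(5)]
              + [late + 0.05 * rng.normal(size=11) for _ in range(5)])
# z-score each profile so clustering compares shape, not magnitude
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
labels = kmeans(X, k=2)
print(labels)
```

Real analyses would follow the cluster assignments with the enrichment and pathway steps the study describes; the z-scoring is what lets genes with very different absolute expression fall into the same shape cluster.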
Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan
2013-04-01
Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane
2017-01-01
The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. 
The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics.
An integral equation-based numerical solver for Taylor states in toroidal geometries
NASA Astrophysics Data System (ADS)
O'Neil, Michael; Cerfon, Antoine J.
2018-04-01
We present an algorithm for the numerical calculation of Taylor states in toroidal and toroidal-shell geometries using an analytical framework developed for the solution to the time-harmonic Maxwell equations. Taylor states are a special case of what are known as Beltrami fields, or linear force-free fields. The scheme of this work relies on the generalized Debye source representation of Maxwell fields and an integral representation of Beltrami fields which immediately yields a well-conditioned second-kind integral equation. This integral equation has a unique solution whenever the Beltrami parameter λ is not a member of a discrete, countable set of resonances which physically correspond to spontaneous symmetry breaking. Several numerical examples relevant to magnetohydrodynamic equilibria calculations are provided. Lastly, our approach easily generalizes to arbitrary geometries, both bounded and unbounded, and of varying genus.
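The Beltrami (linear force-free) condition that defines the Taylor states named in the abstract is:

```latex
\nabla \times \mathbf{B} \;=\; \lambda\, \mathbf{B},
\qquad
\nabla \cdot \mathbf{B} \;=\; 0
```

with the Beltrami parameter $\lambda$ spatially constant; the resonant values of $\lambda$ mentioned in the abstract are those at which this boundary-value problem loses uniqueness.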
Assessing habitat connectivity for ground-dwelling animals in an urban environment.
Braaker, S; Moretti, M; Boesch, R; Ghazoul, J; Obrist, M K; Bontadina, F
To ensure viable species populations in fragmented landscapes, individuals must be able to move between suitable habitat patches. Despite the increased interest in biodiversity assessment in urban environments, the ecological relevance of habitat connectivity in highly fragmented landscapes remains largely unknown. The first step to understanding the role of habitat connectivity in urban ecology is the challenging task of assessing connectivity in the complex patchwork of contrasting habitats that is found in cities. We developed a data-based framework, minimizing the use of subjective assumptions, to assess habitat connectivity that consists of the following sequential steps: (1) identification of habitat preference based on empirical habitat-use data; (2) derivation of habitat resistance surfaces evaluating various transformation functions; (3) modeling of different connectivity maps with electrical circuit theory (Circuitscape), a method considering all possible pathways across the landscape simultaneously; and (4) identification of the best connectivity map with information-theoretic model selection. We applied this analytical framework to assess habitat connectivity for the European hedgehog Erinaceus europaeus, a model species for ground-dwelling animals, in the city of Zurich, Switzerland, using GPS track points from 40 individuals. The best model revealed spatially explicit connectivity “pinch points,” as well as multiple habitat connections. Cross-validation indicated the general validity of the selected connectivity model. The results show that both habitat connectivity and habitat quality affect the movement of urban hedgehogs (relative importance of the two variables was 19.2% and 80.8%, respectively), and are thus both relevant for predicting urban animal movements. Our study demonstrates that even in the complex habitat patchwork of cities, habitat connectivity plays a major role for ground-dwelling animal movement. 
Data-based habitat connectivity maps can thus serve as an important tool for city planners to identify habitat corridors and plan appropriate management and conservation measures for urban animals. The analytical framework we describe to model such connectivity maps is generally applicable to different types of habitat-use data and can be adapted to the movement scale of the focal species. It also allows evaluation of the impact of future landscape changes or management scenarios on habitat connectivity in urban landscapes.
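Step 2 of the framework, deriving resistance surfaces from habitat preference, typically applies a monotone transformation from preference scores to resistance. The sketch below uses one commonly cited negative-exponential form; it is illustrative only, and the study evaluated several such transformation functions rather than this exact one.

```python
import math

def preference_to_resistance(pref, c=2.0):
    """Negative-exponential transformation of a habitat-preference
    score in [0, 1] to a movement resistance in [1, 100]; 'c'
    controls how nonlinearly preference maps to resistance.
    Illustrative form, not the study's selected function."""
    return 100 - 99 * (1 - math.exp(-c * pref)) / (1 - math.exp(-c))

# resistance for low, medium, and high preference cells
for p in (0.0, 0.5, 1.0):
    print(round(preference_to_resistance(p), 1))
```

Applied cell-by-cell to a preference raster, this yields the resistance surface that circuit-theoretic tools such as Circuitscape take as input; model selection across different `c` values (and function families) then identifies the best-supported connectivity map.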
Urban Partnership Agreement and Congestion Reduction Demonstration : National Evaluation Framework
DOT National Transportation Integrated Search
2008-11-21
This report provides an analytical framework for evaluating six deployments under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and Congestion Reduction Demonstration (CRD) Programs. The six UPA/CRD sites...
Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches
ERIC Educational Resources Information Center
Wagner, Ellen; Longanecker, David
2016-01-01
The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…
Human exposure and internal dose assessments of acrylamide in food.
Dybing, E; Farmer, P B; Andersen, M; Fennell, T R; Lalljie, S P D; Müller, D J G; Olin, S; Petersen, B J; Schlatter, J; Scholz, G; Scimeca, J A; Slimani, N; Törnqvist, M; Tuijtelaars, S; Verger, P
2005-03-01
This review provides a framework contributing to the risk assessment of acrylamide in food. It is based on the outcome of the ILSI Europe FOSIE process, a risk assessment framework for chemicals in foods, and adds to the overall framework by focusing especially on exposure assessment and internal dose assessment of acrylamide in food. Since the finding that acrylamide is formed in food during heat processing and preparation, much effort has been (and still is being) put into understanding its mechanism of formation, into developing analytical methods and determining levels in food, and into evaluating its toxicity and potential human health consequences. Although several exposure estimations have been proposed, a systematic review of key information relevant to exposure assessment is currently lacking. The European and North American branches of the International Life Sciences Institute (ILSI) discussed critical aspects of exposure assessment and parameters influencing its outcome, and summarised data relevant to acrylamide exposure assessment to aid the risk characterisation process. This paper reviews the data on acrylamide levels in food, including its formation and analytical methods; the determination of human consumption patterns; dietary intake of the general population; estimation of maximum intake levels; and identification of groups with potentially high intakes. Possible options and consequences of mitigation efforts to reduce exposure are discussed. Furthermore, the association of intake levels with biomarkers of exposure and internal dose, considering aspects of bioavailability, is reviewed, and a physiologically based toxicokinetic (PBTK) model is described that provides a good description of the kinetics of acrylamide in the rat. Each section concludes with a summary of remaining gaps and uncertainties.
Assessment of Critical-Analytic Thinking
ERIC Educational Resources Information Center
Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.
2014-01-01
National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…
Global dynamic optimization approach to predict activation in metabolic pathways.
de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R
2014-01-06
During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles considering simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques considering two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework results in more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions that were comparable to or better than those reported in the previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives.
This new methodology can be applied to metabolic networks with arbitrary topologies, non-linear dynamics and constraints.
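A minimal worked example of the kind of problem described above: a toy two-step pathway S → X → P with a shared enzyme budget, where the single-objective goal is product at final time and the decision variable is when to switch the budget from the first enzyme to the second. The kinetics, the one-switch (bang-bang) restriction, and the grid search are drastic simplifications of the paper's global dynamic optimization, chosen only to make the sequential-activation behavior visible:

```python
import numpy as np

def product_at_T(t_switch, T=10.0, k1=1.0, k2=1.0, dt=0.01):
    """Euler integration of dX/dt = k1*e1 - k2*e2*X, dP/dt = k2*e2*X,
    with the enzyme budget fully on e1 before t_switch, on e2 after."""
    x = p = 0.0
    for step in range(int(T / dt)):
        e1, e2 = (1.0, 0.0) if step * dt < t_switch else (0.0, 1.0)
        dx = k1 * e1 - k2 * e2 * x
        dp = k2 * e2 * x
        x += dx * dt
        p += dp * dt
    return p

switches = np.linspace(0.0, 10.0, 101)
yields = [product_at_T(ts) for ts in switches]
best = float(switches[int(np.argmax(yields))])
print(best)  # the optimum switch is interior: build X first, then convert
```

Switching too early leaves no time to accumulate the intermediate, switching too late leaves no time to convert it, so the optimal profile activates the enzymes sequentially with an interior switch, echoing the enzyme-activation profiles in the literature the paper builds on.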
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observing System Data and Information System) data to the cloud is to position the data next to enormous computing capacity, allowing end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and integration with non-NASA analytics and other complementary functionality at other agencies and in other nations.
Wu, Zheyang; Zhao, Hongyu
2012-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.
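The two power definitions and the Bonferroni control described above can be checked numerically. The sketch below is a Monte Carlo analogue of the marginal-search case under a simplifying independence assumption (the paper's analytical results additionally model linkage disequilibrium and correlations among score statistics); the marker counts and effect size are hypothetical.

```python
import numpy as np
from statistics import NormalDist

def marginal_search_power(n_markers, n_assoc, effect_z, alpha=0.05,
                          n_sim=5000, seed=1):
    """Monte Carlo estimate of two power definitions for marginal
    (single-marker) search under Bonferroni type I error control.
    Score statistics are approximated as independent normals:
    N(effect_z, 1) at associated markers, N(0, 1) elsewhere."""
    rng = np.random.default_rng(seed)
    z_crit = NormalDist().inv_cdf(1.0 - alpha / (2.0 * n_markers))
    means = np.zeros(n_markers)
    means[:n_assoc] = effect_z
    z = rng.normal(means, 1.0, size=(n_sim, n_markers))
    hits = np.abs(z) > z_crit                    # markers declared significant
    power_all = hits[:, :n_assoc].all(axis=1).mean()  # find ALL associated
    power_any = hits[:, :n_assoc].any(axis=1).mean()  # find AT LEAST ONE
    return power_all, power_any

p_all, p_any = marginal_search_power(n_markers=1000, n_assoc=3, effect_z=5.0)
print(p_all, p_any)  # "at least one" power always dominates "all" power
```

The gap between the two definitions is exactly the kind of distinction the analytical framework makes precise without simulation.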
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whipple, C
Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.
On effective temperature in network models of collective behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porfiri, Maurizio, E-mail: mporfiri@nyu.edu; Ariel, Gil, E-mail: arielg@math.biu.ac.il
Collective behavior of self-propelled units is studied analytically within the Vectorial Network Model (VNM), a mean-field approximation of the well-known Vicsek model. We propose a dynamical systems framework to study the stochastic dynamics of the VNM in the presence of general additive noise. We establish that a single parameter, which is a linear function of the circular mean of the noise, controls the macroscopic phase of the system: ordered or disordered. By establishing a fluctuation-dissipation relation, we posit that this parameter can be regarded as an effective temperature of collective behavior. The exact critical temperature is obtained analytically for systems with small connectivity, equivalent to low-density ensembles of self-propelled units. Numerical simulations are conducted to demonstrate the applicability of this new notion of effective temperature to the Vicsek model. The identification of an effective temperature of collective behavior is an important step toward understanding order-disorder phase transitions, informing consistent coarse-graining techniques and explaining the physics underlying the emergence of collective phenomena.
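The order-disorder control described above can be illustrated with a minimal mean-field simulation. This is a sketch of the general idea, not the VNM derivation itself: every unit aligns to the current mean heading and then receives uniform angular noise of width η, whose circular mean sin(η/2)/(η/2) shrinks as η grows.

```python
import numpy as np

def polarization(theta):
    """Magnitude of the mean heading vector: 1 = ordered, 0 = disordered."""
    return np.abs(np.exp(1j * theta).mean())

def simulate(noise_width, n=2000, steps=300, seed=2):
    """Mean-field Vicsek-style update: every unit aligns with the current
    mean heading, then uniform angular noise of the given width is added."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(-np.pi, np.pi, n)
    for _ in range(steps):
        mean_heading = np.angle(np.exp(1j * theta).mean())
        theta = mean_heading + rng.uniform(-noise_width / 2,
                                           noise_width / 2, n)
    return polarization(theta)

print(simulate(0.5), simulate(2 * np.pi))  # ordered vs. disordered
```

Narrow noise yields polarization near 1 (ordered phase); full-circle noise yields polarization near 0 (disordered phase), in line with the single control parameter identified in the paper.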
Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih
2015-01-01
The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.
Coevolutionary dynamics in large, but finite populations
NASA Astrophysics Data System (ADS)
Traulsen, Arne; Claussen, Jens Christian; Hauert, Christoph
2006-07-01
Coevolving and competing species or game-theoretic strategies exhibit rich and complex dynamics for which a general theoretical framework based on finite populations is still lacking. Recently, an explicit mean-field description in the form of a Fokker-Planck equation was derived for frequency-dependent selection with two strategies in finite populations based on microscopic processes [A. Traulsen, J. C. Claussen, and C. Hauert, Phys. Rev. Lett. 95, 238701 (2005)]. Here we generalize this approach in a twofold way: First, we extend the framework to an arbitrary number of strategies and second, we allow for mutations in the evolutionary process. The deterministic limit of infinite population size of the frequency-dependent Moran process yields the adjusted replicator-mutator equation, which describes the combined effect of selection and mutation. For finite populations, we provide an extension taking random drift into account. In the limit of neutral selection, i.e., whenever the process is determined by random drift and mutations, the stationary strategy distribution is derived. This distribution forms the background for the coevolutionary process. In particular, a critical mutation rate uc is obtained separating two scenarios: above uc the population predominantly consists of a mixture of strategies whereas below uc the population tends to be in homogeneous states. For one of the fundamental problems in evolutionary biology, the evolution of cooperation under Darwinian selection, we demonstrate that the analytical framework provides excellent approximations to individual based simulations even for rather small population sizes. This approach complements simulation results and provides a deeper, systematic understanding of coevolutionary dynamics.
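A frequency-dependent Moran process with mutation, the microscopic process the analysis above starts from, is straightforward to simulate for an arbitrary number of strategies. The payoff matrix and rates below are hypothetical, and the sketch omits refinements such as an intensity-of-selection parameter and the exclusion of self-interaction.

```python
import numpy as np

def moran_step(counts, payoff, mu, rng):
    """One step of a frequency-dependent Moran process with mutation:
    an individual reproduces proportionally to fitness (here, expected
    payoff against the current population), its offspring mutates to a
    random strategy with probability mu, and a random individual dies."""
    n = counts.sum()
    freqs = counts / n
    fitness = payoff @ freqs            # expected payoff of each strategy
    birth_p = freqs * fitness
    birth_p /= birth_p.sum()
    born = rng.choice(len(counts), p=birth_p)
    if rng.random() < mu:
        born = rng.integers(len(counts))
    dead = rng.choice(len(counts), p=freqs)
    counts = counts.copy()
    counts[born] += 1
    counts[dead] -= 1
    return counts

rng = np.random.default_rng(3)
# Hypothetical cyclic 3-strategy game (entry [i, j]: payoff of i against j).
payoff = np.array([[1.0, 2.0, 0.5],
                   [0.5, 1.0, 2.0],
                   [2.0, 0.5, 1.0]])
counts = np.array([40, 30, 30])
for _ in range(5000):
    counts = moran_step(counts, payoff, mu=0.01, rng=rng)
print(counts)  # population size is conserved at every step
```

Averaging many such trajectories is what the Fokker-Planck and replicator-mutator descriptions in the paper approximate analytically.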
Analyzing Electronic Question/Answer Services: Framework and Evaluations of Selected Services.
ERIC Educational Resources Information Center
White, Marilyn Domas, Ed.
This report develops an analytical framework based on systems analysis for evaluating electronic question/answer or AskA services operated by a wide range of types of organizations, including libraries. Version 1.0 of this framework was applied in June 1999 to a selective sample of 11 electronic question/answer services, which cover a range of…
Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates
ERIC Educational Resources Information Center
Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu
2007-01-01
In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…
Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting
2006-12-01
Drawing on this background information and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed.
ERIC Educational Resources Information Center
Bennison, Anne; Goos, Merrilyn
2013-01-01
This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…
Green Framework and Its Role in Sustainable City Development (by Example of Yekaterinburg)
NASA Astrophysics Data System (ADS)
Maltseva, A.
2017-11-01
The article focuses on the destruction of the city green framework in Yekaterinburg. A strategy for its recovery by means of a bioactive core, represented by a botanic garden, is proposed. An analytical framework for tracking changes in the proportion of green territories within the total city area is described.
DOT National Transportation Integrated Search
2012-05-01
This report provides an analytical framework for evaluating the two field deployments under the United States Department of Transportation (U.S. DOT) Integrated Corridor Management (ICM) Initiative Demonstration Phase. The San Diego Interstate 15 cor...
Distinctive aspects of the evolution of galactic magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yar-Mukhamedov, D., E-mail: danial.su@gmail.com
2016-11-15
We perform an in-depth analysis of the evolution of galactic magnetic fields within a semi-analytic galaxy formation and evolution framework, determine various distinctive aspects of the evolution process, and obtain analytic solutions for a wide range of possible evolution scenarios.
Strategic, Analytic and Operational Domains of Information Management.
ERIC Educational Resources Information Center
Diener, Richard AV
1992-01-01
Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…
Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D
2013-10-01
Experts working on behalf of international development organisations need better tools to assist land managers in developing countries in maintaining their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine the likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.
Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)
NASA Astrophysics Data System (ADS)
Dubinskii, Yu A.; Osipenko, A. S.
2000-02-01
Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.
NASA Astrophysics Data System (ADS)
Ladiges, Daniel R.; Sader, John E.
2018-05-01
Nanomechanical resonators and sensors, operated in ambient conditions, often generate low-Mach-number oscillating rarefied gas flows. Cercignani [C. Cercignani, J. Stat. Phys. 1, 297 (1969), 10.1007/BF01007482] proposed a variational principle for the linearized Boltzmann equation, which can be used to derive approximate analytical solutions of steady (time-independent) flows. Here we extend and generalize this principle to unsteady oscillatory rarefied flows and thus accommodate resonating nanomechanical devices. This includes a mathematical approach that facilitates its general use and allows for systematic improvements in accuracy. This formulation is demonstrated for two canonical flow problems: oscillatory Couette flow and Stokes' second problem. Approximate analytical formulas giving the bulk velocity and shear stress, valid for arbitrary oscillation frequency, are obtained for Couette flow. For Stokes' second problem, a simple system of ordinary differential equations is derived which may be solved to obtain the desired flow fields. Using this framework, a simple and accurate formula is provided for the shear stress at the oscillating boundary, again for arbitrary frequency, which may prove useful in application. These solutions are easily implemented on any symbolic or numerical package, such as Mathematica or matlab, facilitating the characterization of flows produced by nanomechanical devices and providing insight into the underlying flow physics.
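For orientation, the classical continuum (no-slip) solution of Stokes' second problem, against which rarefied-flow results such as these are typically benchmarked, is a damped transverse wave. This is the textbook Navier-Stokes result, not the variational solution of the linearized Boltzmann equation derived in the paper:

```python
import numpy as np

def stokes_second_problem(y, t, U=1.0, omega=1.0, nu=1.0):
    """Continuum velocity above a plate oscillating as U*cos(omega*t):
    an exponentially damped transverse wave with penetration depth
    sqrt(2*nu/omega)."""
    k = np.sqrt(omega / (2.0 * nu))
    return U * np.exp(-k * y) * np.cos(omega * t - k * y)

y = np.linspace(0.0, 5.0, 6)
print(stokes_second_problem(y, t=0.0))  # decays away from the wall
```

Rarefaction modifies both the amplitude and the phase of this profile near the boundary, which is what the extended variational principle quantifies at arbitrary oscillation frequency.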
Modelling the protocol stack in NCS with deterministic and stochastic petri net
NASA Astrophysics Data System (ADS)
Hui, Chen; Chunjie, Zhou; Weifeng, Zhu
2011-06-01
The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and improved system performance. Nowadays, field testing is an unrealistic way to determine the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capability. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework for the protocol stack precludes global optimisation of protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time-constraint, task-interrelation, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design helps overcome this lack of global optimisation through information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.
Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
SHRP 2 initiated the L38 project to pilot test products from five of the programs completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and decision-making framework. The L38 project...
Joseph v. Brady: Synthesis Reunites What Analysis Has Divided
ERIC Educational Resources Information Center
Thompson, Travis
2012-01-01
Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…
2005-04-01
İpekkan, Z.; Özkil, A. (2005). Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC). RTO-MP-SAS-055.
A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.
Morag, Ido; Luria, Gil
2013-01-01
Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework to provide practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.
A Framework for Real-Time Collection, Analysis, and Classification of Ubiquitous Infrasound Data
NASA Astrophysics Data System (ADS)
Christe, A.; Garces, M. A.; Magana-Zook, S. A.; Schnurr, J. M.
2015-12-01
Traditional infrasound arrays are generally expensive to install and maintain. There are ~10^3 infrasound channels on Earth today. The amount of data currently provided by legacy architectures can be processed on a modest server. However, the growing availability of low-cost, ubiquitous, and dense infrasonic sensor networks presents a substantial increase in the volume, velocity, and variety of data flow. Initial data from a prototype ubiquitous global infrasound network is already pushing the boundaries of traditional research server and communication systems, in particular when serving data products over heterogeneous, international network topologies. We present a scalable, cloud-based approach for capturing and analyzing large amounts of dense infrasonic data (>10^6 channels). We utilize Akka actors with WebSockets to maintain data connections with infrasound sensors. Apache Spark provides streaming, batch, machine learning, and graph processing libraries which will permit signature classification, cross-correlation, and other analytics in near real time. This new framework and approach provide significant advantages in scalability and cost.
Kumar, Sanjeev; Karmeshu
2018-04-01
A theoretical investigation is presented that characterizes the emerging sub-threshold membrane potential and inter-spike interval (ISI) distributions of an ensemble of IF neurons that group together and fire together. The squared-noise intensity σ² of the ensemble of neurons is treated as a random variable to account for the electrophysiological variations across a population of nearly identical neurons. Employing a superstatistical framework, both the ISI distribution and the sub-threshold membrane potential distribution of the neuronal ensemble are obtained in terms of the generalized K-distribution. The resulting distributions exhibit asymptotic behavior akin to the stretched exponential family. Extensive simulations of the underlying SDE with random σ² are carried out, and the results are found to be in excellent agreement with the analytical results. The analysis is extended to cover the case of independent random fluctuations in drift in addition to random squared-noise intensity. The novelty of the proposed analytical investigation for the ensemble of IF neurons is that it yields closed-form expressions of the probability distributions in terms of the generalized K-distribution. The findings of the proposed model are validated against a record of the spiking activity of thousands of neurons. The squared-noise intensity σ² of identified neurons from the data is found to follow a gamma distribution, and the proposed generalized K-distribution is in excellent agreement with the empirically obtained ISI distribution of the neuronal ensemble. Copyright © 2018 Elsevier B.V. All rights reserved.
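As a concrete, purely illustrative sketch of the superstatistical setup above (not the authors' code, and with invented parameter values), the snippet below draws a squared-noise intensity σ² from a gamma distribution once per neuron, integrates the leaky integrate-and-fire SDE dV = (−V/τ + μ)dt + σ dW, and collects first-passage times to threshold as the ensemble ISI sample:

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_isis(n=1000, mu=1.2, tau=1.0, v_th=1.0, dt=1e-3,
                  gamma_shape=3.0, gamma_scale=0.05, t_max=20.0):
    """One first-passage time (ISI) per neuron; sigma^2 ~ Gamma per neuron."""
    sigma = np.sqrt(rng.gamma(gamma_shape, gamma_scale, size=n))
    v = np.zeros(n)                 # membrane potentials
    t = np.zeros(n)                 # elapsed time since reset
    alive = np.ones(n, dtype=bool)  # neurons that have not yet spiked
    isi = np.full(n, np.nan)
    for _ in range(int(t_max / dt)):
        dw = np.sqrt(dt) * rng.normal(size=n)
        v[alive] += (-v[alive] / tau + mu) * dt + sigma[alive] * dw[alive]
        t[alive] += dt
        crossed = alive & (v >= v_th)
        isi[crossed] = t[crossed]   # record the first-passage time
        alive &= ~crossed
        if not alive.any():
            break
    return isi[~np.isnan(isi)]

isi = ensemble_isis()
print(isi.size, round(float(isi.mean()), 2))
```

The resulting ISI sample is a mixture of first-passage statistics over a gamma-distributed σ², which is the mixture structure that yields the generalized K-distribution in the paper.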
NASA Astrophysics Data System (ADS)
Colaiori, Francesca; Castellano, Claudio; Cuskley, Christine F.; Loreto, Vittorio; Pugliese, Martina; Tria, Francesca
2015-01-01
Empirical evidence shows that the rate of irregular usage of English verbs exhibits discontinuity as a function of their frequency: the most frequent verbs tend to be totally irregular. We aim to qualitatively understand the origin of this feature by studying simple agent-based models of language dynamics, where each agent adopts an inflectional state for a verb and may change it upon interaction with other agents. At the same time, agents are replaced at some rate by new agents adopting the regular form. In models with only two inflectional states (regular and irregular), we observe that either all verbs regularize irrespective of their frequency, or a continuous transition occurs between a low-frequency state, where the lemma becomes fully regular, and a high-frequency one, where both forms coexist. Introducing a third (mixed) state, wherein agents may use either form, we find that a third, qualitatively different behavior may emerge, namely, a discontinuous transition in frequency. We introduce and solve analytically a very general class of three-state models that allows us to fully understand these behaviors in a unified framework. Realistic sets of interaction rules, including the well-known naming game (NG) model, result in a discontinuous transition, in agreement with recent empirical findings. We also point out that the distinction between speaker and hearer in the interaction has no effect on the collective behavior. The results for the general three-state model, although discussed in terms of language dynamics, are widely applicable.
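The class of models described can be illustrated with a toy two-form naming game plus speaker turnover; the rules and parameters below are invented for the sketch and are not the authors' exact model. Each agent's inventory is a subset of {regular, irregular}, with agents holding both forms playing the role of the mixed state. Many interactions per replacement event (a high-frequency verb) let the irregular consensus defend itself, while few interactions (a low-frequency verb) let turnover regularize it:

```python
import random

def final_irregular_fraction(freq, n=100, steps=2000, seed=1):
    """Naming-game-style dynamics for one verb.
    Per time step: `freq` speaker-hearer interactions, then one random
    agent is replaced by a new agent knowing only the regular form."""
    rng = random.Random(seed)
    agents = [{'I'} for _ in range(n)]            # start at irregular consensus
    for _ in range(steps):
        for _ in range(freq):
            s, h = rng.sample(range(n), 2)
            form = rng.choice(sorted(agents[s]))  # speaker utters one form
            if form in agents[h]:                 # success: both collapse to it
                agents[s] = {form}
                agents[h] = {form}
            else:                                 # failure: hearer learns it
                agents[h].add(form)
        agents[rng.randrange(n)] = {'R'}          # turnover injects regulars
    return sum('I' in a for a in agents) / n

high = final_irregular_fraction(freq=100)   # high-frequency verb
low = final_irregular_fraction(freq=1)      # low-frequency verb
print(round(high, 2), round(low, 2))
```

Under these assumed rules the low-frequency verb regularizes while the high-frequency verb retains irregular usage, the qualitative frequency dependence the paper analyzes.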
Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle
2009-01-01
While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic approach based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection of, or manual comparison among, available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina
2016-01-01
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection of, or manual comparison among, available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405
Big data analytics in healthcare: promise and potential.
Raghupathi, Wullianallur; Raghupathi, Viju
2014-01-01
To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former (EUBrazilCC), the Ophidia framework is being extended with scalable VM-based solutions for managing large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
The use of intuitive and analytic reasoning styles by patients with persecutory delusions.
Freeman, Daniel; Lister, Rachel; Evans, Nicole
2014-12-01
A previous study has shown an association of paranoid thinking with a reliance on rapid intuitive ('experiential') reasoning and less use of slower effortful analytic ('rational') reasoning. The objectives of the new study were to replicate the test of paranoia and reasoning styles in a large general population sample and to assess the use of these reasoning styles in patients with persecutory delusions. Thirty patients with persecutory delusions in the context of a non-affective psychotic disorder and 1000 non-clinical individuals completed self-report assessments of paranoia and reasoning styles. The patients with delusions reported lower levels of both experiential and analytic reasoning than the non-clinical individuals (effect sizes small to moderate). Both self-rated ability and engagement with the reasoning styles were lower in the clinical group. Within the non-clinical group, greater levels of paranoia were associated with lower levels of analytic reasoning, but there was no association with experiential reasoning. The study is cross-sectional and cannot determine whether the reasoning styles contribute to the occurrence of paranoia. Nor can it determine whether the patient group's lower reasoning scores are specifically associated with the delusions. Clinical paranoia is associated with less reported use of analytic and experiential reasoning. This may reflect patients with current delusions being unconfident in their reasoning abilities, or less aware of decision-making processes and hence less able to re-evaluate fearful cognitions. The dual process theory of reasoning may provide a helpful framework in which to discuss decision-making styles with patients. Copyright © 2014 Elsevier Ltd. All rights reserved.
1 CFR 6.2 - Analytical subject indexes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 1 General Provisions 1 2010-01-01 2010-01-01 false Analytical subject indexes. 6.2 Section 6.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER INDEXES AND ANCILLARIES § 6.2 Analytical subject indexes. Analytical subject indexes covering the contents of the Federal...
1 CFR 6.2 - Analytical subject indexes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 1 General Provisions 1 2011-01-01 2011-01-01 false Analytical subject indexes. 6.2 Section 6.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER INDEXES AND ANCILLARIES § 6.2 Analytical subject indexes. Analytical subject indexes covering the contents of the Federal...
1 CFR 6.2 - Analytical subject indexes.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1 General Provisions 1 2014-01-01 2012-01-01 true Analytical subject indexes. 6.2 Section 6.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER INDEXES AND ANCILLARIES § 6.2 Analytical subject indexes. Analytical subject indexes covering the contents of the Federal...
1 CFR 6.2 - Analytical subject indexes.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 1 General Provisions 1 2012-01-01 2012-01-01 false Analytical subject indexes. 6.2 Section 6.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER INDEXES AND ANCILLARIES § 6.2 Analytical subject indexes. Analytical subject indexes covering the contents of the Federal...
1 CFR 6.2 - Analytical subject indexes.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 1 General Provisions 1 2013-01-01 2012-01-01 true Analytical subject indexes. 6.2 Section 6.2 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER INDEXES AND ANCILLARIES § 6.2 Analytical subject indexes. Analytical subject indexes covering the contents of the Federal...
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together, efficiently using resources to process hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
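As an illustration of the kind of anomaly-detection analytic described (not the Matsu project's actual code), the sketch below implements a global RX detector: each pixel's spectrum is scored by its Mahalanobis distance from the scene-wide mean spectrum, so pixels with rare spectral signatures stand out. The synthetic data cube and its parameters are invented for the example:

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Global RX detector on a (rows x cols x bands) hyperspectral cube:
    Mahalanobis distance of each pixel spectrum from the scene mean."""
    rows, cols, bands = cube.shape
    x = cube.reshape(-1, bands).astype(float)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(bands))  # regularized inverse
    d = x - mu
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)     # per-pixel distance
    return scores.reshape(rows, cols)

# Synthetic scene: homogeneous background plus one anomalous pixel
rng = np.random.default_rng(0)
cube = rng.normal(100.0, 2.0, size=(32, 32, 8))
cube[5, 7] += 30.0                   # inject a rare spectral signature

scores = rx_anomaly_scores(cube)
r, c = np.unravel_index(scores.argmax(), scores.shape)
print(int(r), int(c))                # the injected anomaly is recovered
```

The same scoring idea extends to thermal anomalies; a threshold on the scores (e.g., a high chi-squared quantile) turns the map into a detection mask.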
Preferential assembly of heteromeric kainate and AMPA receptor amino terminal domains
Lomash, Suvendu; Chittori, Sagar; Glasser, Carla
2017-01-01
Ion conductivity and the gating characteristics of tetrameric glutamate receptor ion channels are determined by their subunit composition. Competitive homo- and hetero-dimerization of their amino-terminal domains (ATDs) is a key step controlling assembly. Here we systematically measured the thermodynamic stabilities of homodimers and heterodimers of kainate and AMPA receptors using fluorescence-detected sedimentation velocity analytical ultracentrifugation. Measured affinities span many orders of magnitude, and complexes show large differences in kinetic stabilities. The association of kainate receptor ATD dimers is generally weaker than that of AMPA receptor ATD dimers, but both show a general pattern of increased heterodimer stability compared with the homodimers of their constituents, matching well the physiologically observed receptor combinations. The free energy maps of AMPA and kainate receptor ATD dimers provide a framework for the interpretation of observed receptor subtype combinations and possible assembly pathways. PMID:29058671
Preferential assembly of heteromeric kainate and AMPA receptor amino terminal domains.
Zhao, Huaying; Lomash, Suvendu; Chittori, Sagar; Glasser, Carla; Mayer, Mark L; Schuck, Peter
2017-10-23
Ion conductivity and the gating characteristics of tetrameric glutamate receptor ion channels are determined by their subunit composition. Competitive homo- and hetero-dimerization of their amino-terminal domains (ATDs) is a key step controlling assembly. Here we systematically measured the thermodynamic stabilities of homodimers and heterodimers of kainate and AMPA receptors using fluorescence-detected sedimentation velocity analytical ultracentrifugation. Measured affinities span many orders of magnitude, and complexes show large differences in kinetic stabilities. The association of kainate receptor ATD dimers is generally weaker than that of AMPA receptor ATD dimers, but both show a general pattern of increased heterodimer stability compared with the homodimers of their constituents, matching well the physiologically observed receptor combinations. The free energy maps of AMPA and kainate receptor ATD dimers provide a framework for the interpretation of observed receptor subtype combinations and possible assembly pathways.
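The link between measured dimer affinities and the free energy maps mentioned above is ΔG° = RT·ln Kd for a dissociation constant Kd (equivalently −RT·ln Ka). A quick sketch with hypothetical Kd values (invented for illustration; not the paper's measurements) shows how heterodimer preference can be quantified as a coupling free energy relative to the parent homodimers:

```python
import math

R_KJ = 8.314e-3   # gas constant, kJ/(mol*K)
T = 298.15        # temperature, K

def delta_g(kd_molar):
    """Standard association free energy from a dissociation constant."""
    return R_KJ * T * math.log(kd_molar)   # more negative = more stable dimer

# Hypothetical ATD dissociation constants (illustration only)
kd = {'A:A': 1e-6, 'B:B': 5e-6, 'A:B': 2e-8}
dg = {pair: delta_g(v) for pair, v in kd.items()}

# Coupling free energy: heterodimer vs. mean of the two homodimers
coupling = dg['A:B'] - 0.5 * (dg['A:A'] + dg['B:B'])
print(round(coupling, 1))   # negative -> heterodimer preferred
```

With these invented numbers the coupling term is about −11.7 kJ/mol, i.e., the heterodimer is favored over a half-and-half mixture of homodimers, which is the pattern the abstract reports for physiological subunit combinations.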
Baldovin-Stella stochastic volatility process and Wiener process mixtures
NASA Astrophysics Data System (ADS)
Peirano, P. P.; Challet, D.
2012-08-01
Starting from inhomogeneous time scaling and linear decorrelation between successive price returns, Baldovin and Stella recently proposed a powerful and consistent way to build a model describing the time evolution of a financial index. We first make the model fully explicit by using Student distributions instead of power-law-truncated Lévy distributions. We then show that the analytic tractability of the model extends to the larger class of symmetric generalized hyperbolic distributions and provide a full computation of their multivariate characteristic functions; more generally, we show that the stochastic processes arising in this framework are representable as mixtures of Wiener processes. The basic Baldovin and Stella model, while mimicking well volatility relaxation phenomena such as the Omori law, fails to reproduce other stylized facts such as the leverage effect or some time-reversal asymmetries. We discuss how to modify the dynamics of this process in order to reproduce real data more accurately.
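The Wiener-mixture representation mentioned above has a simple special case worth sketching: a Gaussian whose variance is drawn from an inverse-gamma distribution is exactly a Student-t variable, which is how heavy-tailed return distributions arise from mixtures of Wiener processes. The snippet below (an illustration, not the authors' calibration) checks the variance of the mixture against the Student-t formula ν/(ν−2):

```python
import numpy as np

rng = np.random.default_rng(0)

def student_t_via_mixture(nu, size):
    """x | s2 ~ N(0, s2) with s2 ~ Inv-Gamma(nu/2, nu/2) gives x ~ Student-t(nu)."""
    s2 = 1.0 / rng.gamma(nu / 2.0, 2.0 / nu, size=size)  # inverse-gamma draws
    return rng.normal(0.0, np.sqrt(s2))

x = student_t_via_mixture(nu=5.0, size=200_000)
print(round(float(x.var()), 2))   # theoretical variance: nu/(nu-2) = 5/3
```

Conditioning a Brownian increment on a random variance in this way is the one-dimensional prototype of the process mixtures used in the paper's framework.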
Causes and consequences of collective turnover: a meta-analytic review.
Heavey, Angela L; Holwerda, Jacob A; Hausknecht, John P
2013-05-01
Given growing interest in collective turnover (i.e., employee turnover at unit and organizational levels), the authors propose an organizing framework for its antecedents and consequences and test it using meta-analysis. Based on analysis of 694 effect sizes drawn from 82 studies, results generally support expected relationships across the 6 categories of collective turnover antecedents, with somewhat stronger and more consistent results for 2 categories: human resource management inducements/investments and job embeddedness signals. Turnover was negatively related to numerous performance outcomes, more strongly so for proximal rather than distal outcomes. Several theoretically grounded moderators help to explain average effect-size heterogeneity for both antecedents and consequences of turnover. Relationships generally did not vary according to turnover type (e.g., total or voluntary), although the relative absence of collective-level involuntary turnover studies is noted and remains an important avenue for future research. PsycINFO Database Record (c) 2013 APA, all rights reserved.
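As a minimal illustration of the effect-size aggregation underlying such a meta-analysis (a fixed-effect sketch with invented numbers; the paper's actual analysis uses more elaborate machinery, including moderator tests), study-level effects are pooled with inverse-variance weights:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical turnover-performance effect sizes from three studies
effects = [-0.30, -0.20, -0.40]
variances = [0.01, 0.02, 0.04]
pooled, se = fixed_effect_pool(effects, variances)
print(round(pooled, 3), round(se, 3))   # precise studies dominate the pool
```

More precise studies (smaller sampling variance) get larger weights, which is why the pooled estimate here sits closest to the first study's effect.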
NASA Astrophysics Data System (ADS)
Naqwi, Amir A.; Durst, Franz
1993-07-01
Dual-beam laser measuring techniques are now being used, not only for velocimetry, but also for simultaneous measurements of particle size and velocity in particulate two-phase flows. However, certain details of these optical techniques, such as the effect of Gaussian beam profiles on the accuracy of the measurements, need to be further explored. To implement innovative improvements, a general analytic framework is needed in which performances of various dual-beam instruments could be quantitatively studied and compared. For this purpose, the analysis of light scattering in a generalized dual-wave system is presented in this paper. The present simulation model provides a basis for studying effects of nonplanar beam structures of incident waves, taking into account arbitrary modes of polarization. A polarizer is included in the receiving optics as well. The peculiar aspects of numerical integration of scattered light over circular, rectangular, and truncated circular apertures are also considered.
Facilitating Multiple Intelligences through Multimodal Learning Analytics
ERIC Educational Resources Information Center
Perveen, Ayesha
2018-01-01
This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need to facilitate students as…
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU) computing), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. 
This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
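The Map and Reduce tasks described above can be sketched in a few lines of plain Python (a conceptual word-count sketch of the programming model, not Hadoop API code): the map phase emits key-value pairs, the framework shuffles them by key, and the reduce phase aggregates each group.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map task: emit (word, 1) for every word in one input record."""
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    """Group intermediate pairs by key (done by the framework between phases)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce task: aggregate all values emitted for one key."""
    return key, sum(values)

records = ["big data analytics", "big data tools", "data"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
print(counts)  # {'big': 2, 'data': 3, 'analytics': 1, 'tools': 1}
```

In Hadoop, the map and reduce functions run in parallel on many nodes and the shuffle is performed by the framework over HDFS; the fault tolerance described in the abstract comes from re-executing failed tasks against replicated data blocks.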
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU) computing), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. 
This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
Chang, Zhiwei; Halle, Bertil
2016-02-28
In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.
NASA Astrophysics Data System (ADS)
Chang, Zhiwei; Halle, Bertil
2016-02-01
In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.
Dhummakupt, Elizabeth S; Carmany, Daniel O; Mach, Phillip M; Tovar, Trenton M; Ploskonka, Ann M; Demond, Paul S; DeCoste, Jared B; Glaros, Trevor
2018-03-07
Paper spray mass spectrometry has been shown to successfully analyze chemical warfare agent (CWA) simulants. However, because real G-series CWAs (e.g., sarin, soman) are more volatile than their simulants, analysis from an untreated paper substrate proved difficult. To extend the analytical lifetime of these G-agents, metal-organic frameworks (MOFs) were integrated onto the paper spray substrates to increase adsorption and desorption. In this study, several MOFs and nanoparticles were tested to extend the analytical lifetimes of sarin, soman, and cyclosarin on paper spray substrates. It was found that the addition of either UiO-66 or HKUST-1 to the paper substrate extended the window of detectability for the G-agents from less than 5 min to at least 50 min.
NASA Astrophysics Data System (ADS)
Tang, F. R.; Zhang, Rong; Li, Huichao; Li, C. N.; Liu, Wei; Bai, Long
2018-05-01
The trade-off criterion is used to systematically investigate the performance features of two chemical engine models (the low-dissipation model and the endoreversible model). The optimal efficiencies, the dissipation ratios, and the corresponding ratios of the dissipation rates for the two models are determined analytically. Furthermore, the performance properties of the two kinds of chemical engines are precisely compared and analyzed, revealing some interesting physics. Our investigations show that a certain universal equivalence between the two models holds within the framework of linear irreversible thermodynamics, and that their differences are rooted in their different physical contexts. Our results can contribute to a precise understanding of the general features of chemical engines.
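The low-dissipation picture referred to above has a well-known heat-engine analog with closed-form optimal contact times. The sketch below is an illustration of that analog only, not the paper's chemical-engine formulation, and all parameter values are hypothetical; it computes the efficiency at maximum power, which must lie between the universal bounds η_C/2 and η_C/(2 − η_C).

```python
import math

def efficiency_at_max_power(T_h, T_c, Sig_h, Sig_c, dS=1.0):
    # Low-dissipation cycle: Q_h = T_h*(dS - Sig_h/t_h), Q_c = T_c*(dS + Sig_c/t_c).
    # Setting both partial derivatives of P = (Q_h - Q_c)/(t_h + t_c) to zero
    # yields the optimal contact times in closed form.
    a = T_h * Sig_h                      # hot-side dissipation weight
    b = T_c * Sig_c                      # cold-side dissipation weight
    A = dS * (T_h - T_c)                 # reversible work per cycle
    t_h = 2.0 * math.sqrt(a) * (math.sqrt(a) + math.sqrt(b)) / A
    t_c = t_h * math.sqrt(b / a)
    Q_h = T_h * (dS - Sig_h / t_h)
    Q_c = T_c * (dS + Sig_c / t_c)
    return (Q_h - Q_c) / Q_h             # efficiency at maximum power

eta_C = 1.0 - 300.0 / 500.0              # Carnot efficiency for T_h=500 K, T_c=300 K
eta_star = efficiency_at_max_power(500.0, 300.0, 0.1, 0.1)   # ~0.225
```

For symmetric dissipation the result (~0.225 here) sits strictly inside the bounds 0.20 and 0.25; pushing the dissipation entirely to one side recovers the bounds themselves.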
Quantum dressing orbits on compact groups
NASA Astrophysics Data System (ADS)
Jurčo, Branislav; Šťovíček, Pavel
1993-02-01
The quantum double is shown to imply the dressing transformation on quantum compact groups and the quantum Iwasawa decomposition in the general case. Quantum dressing orbits are described explicitly as *-algebras. The dual coalgebras, consisting of differential operators, are related to the quantum Weyl elements. Moreover, the differential geometry on a quantum leaf allows a remarkably simple construction of irreducible *-representations of the algebras of quantum functions. Representation spaces then consist of analytic functions on classical phase spaces. These representations are also interpreted in the framework of quantization, in the spirit of Berezin, applied to symplectic leaves of classical compact groups. Convenient “coherent states” are introduced and a correspondence between classical and quantum observables is given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yazawa, Kazuaki; Shakouri, Ali
The energy conversion efficiency of today’s thermoelectric generators is significantly lower than that of conventional mechanical engines. Almost all of the existing research is focused on materials to improve the conversion efficiency. Here we propose a general framework to study the cost-efficiency trade-off for thermoelectric power generation. A key factor is the optimization of thermoelectric modules together with their heat source and heat sinks. Full electrical and thermal co-optimization yields a simple analytical expression for the optimum design. Based on this model, power output per unit mass can be maximized. We show that the fractional area coverage of thermoelectric elements in a module could play a significant role in reducing the cost of power generation systems.
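The role of fractional area coverage can be illustrated with a deliberately crude lumped sketch (not the authors' model; every symbol and value below is an illustrative assumption): the thermoelectric layer and the external heat exchangers divide the available temperature drop, so electrical power density peaks where the layer's thermal resistance matches the exchangers' resistance, and the fill factor is the lever that sets that match.

```python
def power_density(F, L=1e-3, k=1.5, rho=1e-5, S=2e-4, dT=100.0, theta_ext=2e-3):
    # Crude per-unit-area model (hypothetical values): the TE layer
    # (thermal resistance L/(k*F)) and the source/sink exchangers
    # (theta_ext) divide the temperature drop dT; matched-load
    # electrical power is (S*dT_te)^2 / (4*R_int).
    theta_te = L / (k * F)                       # TE layer thermal resistance
    dT_te = dT * theta_te / (theta_te + theta_ext)
    R_int = rho * L / F                          # internal electrical resistance
    return (S * dT_te) ** 2 / (4.0 * R_int)

# Scan fill factors: the optimum sits near thermal impedance matching,
# i.e. theta_te(F) ~ theta_ext, giving F* ~ L / (k * theta_ext).
fill_factors = [0.01 * i for i in range(1, 101)]
F_best = max(fill_factors, key=power_density)
F_matched = 1e-3 / (1.5 * 2e-3)                  # = 1/3
```

Because the peak is flat, a much smaller (cheaper) coverage gives nearly the same power, which is one intuition for the cost argument.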
NASA Astrophysics Data System (ADS)
Leopold-Wildburger, Ulrike; Pickl, Stefan
2008-10-01
In our research we use experiments to study human behavior in a simulation environment based on a simple Lotka-Volterra predator-prey ecology. The aim is to study the influence of participants' harvesting strategies and of certain personality traits derived from [1] on the outcome in terms of sustainability and economic performance. This approach is embedded in a research program that aims to develop and understand interactive resource planning processes. We present the general framework as well as the new decision support system EXPOSIM. The key element is the combination of experimental design, analytical understanding of time-discrete systems (especially Lotka-Volterra systems), and economic performance. In the first part, the general role of laboratory experiments is discussed. The second part summarizes the concept of sustainable development, drawing on [18]. Because we use Lotka-Volterra systems as the basis for our simulations, a theoretical framework for them is described next; for these systems, optimal behavior can be determined. In the empirical setting, subjects are put into the position of a decision-maker and can manipulate the environment in such a way that harvesting can be observed. We suggest an experimental setting that might lead to new insights in an anticipatory sense.
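A minimal sketch of the kind of dynamics such an experiment could expose to subjects (all parameters are hypothetical; this is not EXPOSIM's actual model): a time-discrete Lotka-Volterra system with a constant prey-harvest fraction h, showing the trade-off between yield and sustainability.

```python
def simulate(h, steps=2000, dt=0.1, x0=10.0, y0=5.0,
             a=0.1, b=0.02, c=0.1, d=0.01):
    # Euler-discretised Lotka-Volterra with harvesting of the prey x:
    #   dx/dt = a*x - b*x*y - h*x,   dy/dt = d*x*y - c*y
    x, y, harvested = x0, y0, 0.0
    for _ in range(steps):
        dx = a * x - b * x * y - h * x
        dy = d * x * y - c * y
        x = max(x + dt * dx, 0.0)
        y = max(y + dt * dy, 0.0)
        harvested += dt * h * x      # cumulative catch
    return x, y, harvested
```

A moderate harvest fraction (h well below the prey growth rate a) sustains both populations and accumulates a large total catch; an aggressive fraction (h > a) collapses the prey stock and, over a long horizon, yields less in total.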
The Dolinar Receiver in an Information Theoretic Framework
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Birnbaum, Kevin M.; Moision, Bruce E.; Dolinar, Samuel J.
2011-01-01
Optical communication at the quantum limit requires that measurements on the optical field be maximally informative, but devising physical measurements that accomplish this objective has proven challenging. The Dolinar receiver exemplifies a rare instance of success in distinguishing between two coherent states: an adaptive local oscillator is mixed with the signal prior to photodetection, which yields an error probability that meets the Helstrom lower bound with equality. Here we apply the same local-oscillator-based architecture with an information-theoretic optimization criterion. We begin with an analysis of this receiver in a general framework for an arbitrary coherent-state modulation alphabet, and then we concentrate on two relevant examples. First, we study a binary antipodal alphabet and show that the Dolinar receiver's feedback function not only minimizes the probability of error, but also maximizes the mutual information. Next, we study ternary modulation consisting of antipodal coherent states and the vacuum state. We derive an analytic expression for a near-optimal local oscillator feedback function, and, via simulation, we determine its photon information efficiency (PIE). We provide the PIE versus dimensional information efficiency (DIE) trade-off curve and show that this modulation and receiver combination performs universally better than (generalized) on-off keying with photon counting, although the advantage vanishes asymptotically as the number of bits per photon diverges.
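For the binary antipodal alphabet, the Helstrom bound the Dolinar receiver attains has a simple closed form. The sketch below uses standard textbook formulas (it is not code from the paper) to compare that bound against an ideal-homodyne benchmark, using |⟨−α|α⟩|² = exp(−4|α|²).

```python
import math

def helstrom_error(alpha, p0=0.5):
    # Minimum error probability for discriminating the coherent states
    # |alpha> and |-alpha> with prior p0 on the first state:
    #   P_e = (1/2) * (1 - sqrt(1 - 4*p0*(1-p0)*|<-alpha|alpha>|^2))
    overlap_sq = math.exp(-4.0 * alpha * alpha)
    return 0.5 * (1.0 - math.sqrt(1.0 - 4.0 * p0 * (1.0 - p0) * overlap_sq))

def homodyne_error(alpha):
    # Gaussian-noise benchmark: ideal homodyne detection of the same
    # antipodal alphabet gives P_e = (1/2) * erfc(sqrt(2) * alpha).
    return 0.5 * math.erfc(math.sqrt(2.0) * alpha)
```

At α = 0.5 the Helstrom error (~0.102) already beats the homodyne benchmark (~0.159), and the gap widens with photon number, which is what makes the adaptive receiver worthwhile.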
Cultural Cleavage and Criminal Justice.
ERIC Educational Resources Information Center
Scheingold, Stuart A.
1978-01-01
Reviews major theories of criminal justice, proposes an alternative analytic framework which focuses on cultural factors, applies this framework to several cases, and discusses implications of a cultural perspective for rule of law values. Journal available from Office of Publication, Department of Political Science, University of Florida,…
Institutional Racist Melancholia: A Structural Understanding of Grief and Power in Schooling
ERIC Educational Resources Information Center
Vaught, Sabina E.
2012-01-01
In this article, Sabina Vaught undertakes the theoretical and analytical project of conceptually integrating "Whiteness as property", a key structural framework of Critical Race Theory (CRT), and "melancholia", a framework originally emerging from psychoanalysis. Specifically, Vaught engages "Whiteness as property" as…
Video-Based Analyses of Motivation and Interaction in Science Classrooms
NASA Astrophysics Data System (ADS)
Moeller Andersen, Hanne; Nielsen, Birgitte Lund
2013-04-01
An analytical framework for examining students' motivation was developed and used for analyses of video excerpts from science classrooms. The framework was developed in an iterative process involving theories on motivation and video excerpts from a 'motivational event' where students worked in groups. Subsequently, the framework was used for an analysis of students' motivation in the whole class situation. A cross-case analysis was carried out illustrating characteristics of students' motivation dependent on the context. This research showed that students' motivation to learn science is stimulated by a range of different factors, with autonomy, relatedness and belonging apparently being the main sources of motivation. The teacher's combined use of questions, uptake and high level evaluation was very important for students' learning processes and motivation, especially students' self-efficacy. By coding and analysing video excerpts from science classrooms, we were able to demonstrate that the analytical framework helped us gain new insights into the effect of teachers' communication and other elements on students' motivation.
ClimateSpark: An in-memory distributed computing framework for big climate data analytics
NASA Astrophysics Data System (ADS)
Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei
2018-06-01
The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists to efficiently manage and analyze big data. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index built over the chunks avoids unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes
2010-08-01
interchangeable and useful in a common contextual framework. Currently, both simulations use a common scenario, the same fictitious country, and...culture, legal framework, and institutions. • Incorporate Principles of Good Governance and Respect for Human Rights: Stress accountability and...Preparing for the assessments requires defining the missions to be analyzed; subdividing the mission definitions to provide a framework for analytic work
Using Learning Analytics for Preserving Academic Integrity
ERIC Educational Resources Information Center
Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena
2017-01-01
This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…
An Active Learning Exercise for Introducing Agent-Based Modeling
ERIC Educational Resources Information Center
Pinder, Jonathan P.
2013-01-01
Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…
Translating Learning into Numbers: A Generic Framework for Learning Analytics
ERIC Educational Resources Information Center
Greller, Wolfgang; Drachsler, Hendrik
2012-01-01
With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning…
ERIC Educational Resources Information Center
Dawson, Shane; Siemens, George
2014-01-01
The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework within Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented.
Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.
Investigating the two-moment characterisation of subcellular biochemical networks.
Ullah, Mukhtar; Wolkenhauer, Olaf
2009-10-07
While ordinary differential equations (ODEs) form the conceptual framework for modelling many cellular processes, specific situations demand stochastic models to capture the influence of noise. The most common formulation of stochastic models for biochemical networks is the chemical master equation (CME). While stochastic simulations are a practical way to realise the CME, analytical approximations offer more insight into the influence of noise. Towards that end, the two-moment approximation (2MA) is a promising addition to the established analytical approaches including the chemical Langevin equation (CLE) and the related linear noise approximation (LNA). The 2MA approach directly tracks the mean and (co)variance which are coupled in general. This coupling is not obvious in CME and CLE and ignored by LNA and conventional ODE models. We extend previous derivations of 2MA by allowing (a) non-elementary reactions and (b) relative concentrations. Often, several elementary reactions are approximated by a single step. Furthermore, practical situations often require the use of relative concentrations. We investigate the applicability of the 2MA approach to the well-established fission yeast cell cycle model. Our analytical model reproduces the clustering of cycle times observed in experiments. This is explained through multiple resettings of M-phase promoting factor (MPF), caused by the coupling between mean and (co)variance, near the G2/M transition.
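The mean-(co)variance coupling that distinguishes the 2MA from conventional ODE models can be made concrete with a one-species toy system (production plus pair annihilation). This is a hedged sketch with hypothetical rate constants and one common choice of second-order closure, far cruder than the paper's general derivation: the variance feeds back into the mean equation and shifts the stationary mean below the deterministic prediction.

```python
def two_moment_sketch(k1=10.0, k2=0.1, t_end=100.0, dt=0.01):
    # Toy system: production (0 -> X, rate k1) and pair annihilation
    # (2X -> 0, propensity k2*x*(x-1)/2, stoichiometry -2), giving
    # drift A(x) = k1 - k2*x*(x-1) and diffusion B(x) = k1 + 2*k2*x*(x-1).
    # The 2MA closes the moment hierarchy at second order, so the mean
    # equation picks up a -k2*var term that the plain ODE model ignores.
    mu, var = 0.0, 0.0          # 2MA state: mean and variance
    mu_ode = 0.0                # conventional ODE (variance ignored)
    for _ in range(int(t_end / dt)):
        A = k1 - k2 * mu * (mu - 1.0)
        dmu = A - k2 * var                                  # + (1/2)*A''(mu)*var
        dvar = (2.0 * k2 * (1.0 - 2.0 * mu) * var           # 2*A'(mu)*var
                + k1 + 2.0 * k2 * mu * (mu - 1.0)           # B(mu)
                + 2.0 * k2 * var)                           # (1/2)*B''(mu)*var
        mu, var = mu + dt * dmu, var + dt * dvar
        mu_ode += dt * (k1 - k2 * mu_ode * (mu_ode - 1.0))
    return mu, var, mu_ode
```

With these assumed rates, the conventional ODE settles near 10.5 while the 2MA mean settles near 10.1 with a positive stationary variance: a small but systematic shift produced entirely by the coupling.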
Analytical optimization of demand management strategies across all urban water use sectors
NASA Astrophysics Data System (ADS)
Friedman, Kenneth; Heaney, James P.; Morales, Miguel; Palenchar, John
2014-07-01
An effective urban water demand management program can greatly influence both peak and average demand and therefore long-term water supply and infrastructure planning. Although a theoretical framework for evaluating residential indoor demand management has been well established, little has been done to evaluate other water use sectors such as residential irrigation in a compatible manner for integrating these results into an overall solution. This paper presents a systematic procedure to evaluate the optimal blend of single family residential irrigation demand management strategies to achieve a specified goal based on performance functions derived from parcel level tax assessor's data linked to customer level monthly water billing data. This framework is then generalized to apply to any urban water sector, as exponential functions can be fit to all resulting cumulative water savings functions. Two alternative formulations are presented: maximize net benefits, or minimize total costs subject to satisfying a target water savings. Explicit analytical solutions are presented for both formulations based on appropriate exponential best fits of performance functions. A direct result of this solution is the dual variable which represents the marginal cost of water saved at a specified target water savings goal. A case study of 16,303 single family irrigators in Gainesville Regional Utilities utilizing high quality tax assessor and monthly billing data along with parcel level GIS data provide an illustrative example of these techniques. Spatial clustering of targeted homes can be easily performed in GIS to identify priority demand management areas.
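The structure of the analytical solution can be sketched for the exponential fits the abstract describes. The `allocate` helper below is illustrative only (hypothetical options and target, not the authors' code): minimizing total spending subject to a savings target equalizes marginal savings per dollar across the active options, and the dual variable λ found by bisection is the marginal cost of water saved at the target.

```python
import math

def allocate(options, target):
    # options: list of (a_i, b_i) where option i saves
    # s_i(x) = a_i * (1 - exp(-b_i * x)) for spending x.
    # KKT condition for min-cost at a savings target:
    #   a_i * b_i * exp(-b_i * x_i) = 1/lam  =>  x_i = max(0, ln(lam*a_i*b_i)/b_i)
    def spend_and_save(lam):
        xs = [max(0.0, math.log(lam * a * b) / b) for a, b in options]
        s = sum(a * (1.0 - math.exp(-b * x)) for (a, b), x in zip(options, xs))
        return xs, s

    lo, hi = 1e-9, 1e9
    for _ in range(200):                 # geometric bisection on the dual variable
        lam = math.sqrt(lo * hi)
        xs, s = spend_and_save(lam)
        if s < target:
            lo = lam
        else:
            hi = lam
    return xs, lam                        # lam = marginal cost of saved water
```

For two hypothetical options (a=100, b=0.01) and (a=50, b=0.05) with a target of 80, both options are active and the dual variable works out to 120/70 ≈ 1.71 per unit saved.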
A Framework for Integrating Environmental Justice in Regulatory Analysis
Nweke, Onyemaechi C.
2011-01-01
With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235
Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie
2016-01-01
Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build and deploy a virtual desktop infrastructure (VDI) to enable innovation, collaboration and operate within academic funding structures. It outlines 6 core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP is not simply successful through technical excellence and performance. It’s adoption is dependent on a collaborative environment where researchers and users plan and evaluate the requirements of all aspects. PMID:27683665
Detecting biological responses to flow management: Missed opportunities; future directions
Souchon, Y.; Sabaton, C.; Deibel, R.; Reiser, D.; Kershner, J.; Gard, M.; Katopodis, C.; Leonard, P.; Poff, N.L.; Miller, W.J.; Lamb, B.L.
2008-01-01
The conclusions of numerous stream restoration assessments around the world are extremely clear and convergent: there has been insufficient appropriate monitoring to improve general knowledge and expertise. In the specialized field of instream flow alteration, we consider that there are several opportunities comparable to full-size experiments. Hundreds of water management decisions related to instream flow releases have been made by government agencies, native peoples, and non-governmental organizations around the world. These decisions are based on different methods and assumptions, and many flow regimes have been adopted through formal or informal rules and regulations. Although there have been significant advances in analytical capabilities, there has been very little validation monitoring of actual outcomes or research related to the response of aquatic-dependent species to new flow regimes. To be able to detect these kinds of responses and to better guide decisions, a general design template is proposed. The main steps of this template are described and discussed in terms of objectives, hypotheses, variables, time scale, data management, and information, in the spirit of adaptive management. The adoption of such a framework is not always easy, owing to the actors' differing interests in the results, the duration of monitoring, the nature of funding, and the differing timetables of facility managers and technicians. Nevertheless, implementation of such a framework could help researchers and practitioners coordinate and federate their efforts to improve general knowledge of the links between habitat dynamics and biological aquatic responses. Copyright © 2008 John Wiley & Sons, Ltd.
Deriving Appropriate Educational Program Costs in Illinois.
ERIC Educational Resources Information Center
Parrish, Thomas B.; Chambers, Jay G.
This document describes the comprehensive analytical framework for school finance used by the Illinois State Board of Education to assist policymakers in their decisions about equitable distribution of state aid and appropriate levels of resources to meet the varying educational requirements of differing student populations. This framework, the…
Analyzing Agricultural Technology Systems: A Research Report.
ERIC Educational Resources Information Center
Swanson, Burton E.
The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…
Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios
NASA Astrophysics Data System (ADS)
Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.
2018-03-01
An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy relevant outcomes of interest, specifically for our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
2010-01-01
Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate to which extent they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in Pubmed. Thirdly, we checked literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target at, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761
Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation
NASA Astrophysics Data System (ADS)
Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.
2016-03-01
Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks available in five countries (Canada, China, Russia, Spain, and the United States), using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build as complete as possible digital river networks and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.
Structure of the Balmer jump. The isolated hydrogen atom
NASA Astrophysics Data System (ADS)
Calvo, F.; Belluzzi, L.; Steiner, O.
2018-06-01
Context. The spectrum of the hydrogen atom was explained by Bohr more than one century ago. We revisit here some of the aspects of the underlying quantum structure, with a modern formalism, focusing on the limit of the Balmer series. Aims: We investigate the behaviour of the absorption coefficient of the isolated hydrogen atom in the neighbourhood of the Balmer limit. Methods: We analytically computed the total cross-section arising from bound-bound and bound-free transitions in the isolated hydrogen atom at the Balmer limit, and established a simplified semi-analytical model for the surroundings of that limit. We worked within the framework of the formalism of Landi Degl'Innocenti & Landolfi (2004, Astrophys. Space Sci. Lib., 307), which permits an almost straightforward generalization of our results to other atoms and molecules, and which is perfectly suitable for including polarization phenomena in the problem. Results: We analytically show that there is no discontinuity at the Balmer limit, even though the concept of a "Balmer jump" is still meaningful. Furthermore, we give a possible definition of the location of the Balmer jump, and we check that this location depends on the broadening mechanisms. At the Balmer limit, we compute the cross-section in a fully analytical way. Conclusions: The Balmer jump is produced by a rapid drop of the total Balmer cross-section, yet this variation is smooth and continuous when both bound-bound and bound-free processes are taken into account, and its shape and location depend on the broadening mechanisms.
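As a quick, hedged illustration of the series limit discussed above (this sketch is not taken from the paper itself), the Balmer-series wavelengths given by the Rydberg formula converge to a finite limit near 364.5 nm (vacuum wavelengths, infinite nuclear mass):

```python
# Illustration (not from the paper): Balmer-series wavelengths from the
# Rydberg formula, converging to the Balmer limit at 4/R.
R = 1.0973731568e7  # Rydberg constant for infinite nuclear mass, m^-1

def balmer_wavelength_nm(n):
    """Wavelength of the n -> 2 transition (n >= 3), in nanometres."""
    return 1e9 / (R * (0.25 - 1.0 / n**2))

balmer_limit_nm = 1e9 * 4.0 / R  # n -> infinity

for n in (3, 4, 10, 100):
    print(f"n={n:4d}: {balmer_wavelength_nm(n):.2f} nm")
print(f"limit : {balmer_limit_nm:.2f} nm")
```

The lines crowd together toward the limit, which is why a smooth merging of bound-bound and bound-free cross-sections, rather than a true discontinuity, is plausible there.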
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities of detecting and tracking important objects, events, and their relationships in connection to an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework toward developing an alternative data fusion architecture. This idea is inspired by the recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or the knowledge of the analyst's mental state in an investigation. The success of this effort will result in next generation data fusion systems that can be better trusted while maintaining high throughput.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-06-30
The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors at all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. Path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency on the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.
ERIC Educational Resources Information Center
Eick, Caroline Marie; Ryan, Patrick A.
2014-01-01
This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…
Competency Analytics Tool: Analyzing Curriculum Using Course Competencies
ERIC Educational Resources Information Center
Gottipati, Swapna; Shankararaman, Venky
2018-01-01
The applications of learning outcomes and competency frameworks have brought better clarity to engineering programs in many universities. Several frameworks have been proposed to integrate outcomes and competencies into course design, delivery and assessment. However, in many cases, competencies are course-specific and their overall impact on the…
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
ERIC Educational Resources Information Center
Kou, Xiaojing
2011-01-01
Various formats of online discussion have proven valuable for enhancing learning and collaboration in distance and blended learning contexts. However, despite their capacity to reveal essential processes in collaborative inquiry, current mainstream analytical frameworks, such as the cognitive presence framework (Garrison, Anderson, & Archer,…
A Cognitive Framework for the Analysis of Online Chemistry Courses
ERIC Educational Resources Information Center
Evans, Karen L.; Leinhardt, Gaea
2008-01-01
Many students now are receiving instruction in online environments created by universities, museums, corporations, and even students. What features of a given online course contribute to its effectiveness? This paper addresses that query by proposing and applying an analytic framework to five online introductory chemistry courses. Introductory…
Managing Offshore Branch Campuses: An Analytical Framework for Institutional Strategies
ERIC Educational Resources Information Center
Shams, Farshid; Huisman, Jeroen
2012-01-01
The aim of this article is to develop a framework that encapsulates the key managerial complexities of running offshore branch campuses. In the transnational higher education (TNHE) literature, several managerial ramifications and impediments have been addressed by scholars and practitioners. However, the strands of the literature are highly…
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have a role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.
ERIC Educational Resources Information Center
Christie, Pam
2016-01-01
Reflecting on South African experience, this paper develops an analytical framework using the work of Henri Lefebvre and Nancy Fraser to understand why socially just arrangements may be so difficult to achieve in post-conflict reconstruction. The paper uses Lefebvre's analytic to trace three sets of entangled practices…
ERIC Educational Resources Information Center
Ranga, Marina; Etzkowitz, Henry
2013-01-01
This paper introduces the concept of Triple Helix systems as an analytical construct that synthesizes the key features of university--industry--government (Triple Helix) interactions into an "innovation system" format, defined according to systems theory as a set of components, relationships and functions. Among the components of Triple…
Learning Analytics as a Counterpart to Surveys of Student Experience
ERIC Educational Resources Information Center
Borden, Victor M. H.; Coates, Hamish
2017-01-01
Analytics derived from the student learning environment provide new insights into the collegiate experience; they can be used as a supplement to or, to some extent, in place of traditional surveys. To serve this purpose, however, greater attention must be paid to conceptual frameworks and to advancing institutional systems, activating new…
Challenges of Using Learning Analytics Techniques to Support Mobile Learning
ERIC Educational Resources Information Center
Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide
2015-01-01
Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
ERIC Educational Resources Information Center
Lintao, Rachelle B.; Erfe, Jonathan P.
2012-01-01
This study purports to foster the understanding of profession-based academic writing in two different cultural conventions by examining the rhetorical moves employed by American and Philippine thesis introductions in Architecture using Swales' 2004 Revised CARS move-analytic model as framework. Twenty (20) Master's thesis introductions in…
ERIC Educational Resources Information Center
Lu, Owen H. T.; Huang, Anna Y. Q.; Huang, Jeff C. H.; Lin, Albert J. Q.; Ogata, Hiroaki; Yang, Stephen J. H.
2018-01-01
Blended learning combines online digital resources with traditional classroom activities and enables students to attain higher learning performance through well-defined interactive strategies involving online and traditional learning activities. Learning analytics is a conceptual framework and is a part of our Precision education used to analyze…
A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories
ERIC Educational Resources Information Center
Duvvuri, Sri Devi; Gruca, Thomas S.
2010-01-01
Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…
ERIC Educational Resources Information Center
Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing
2018-01-01
With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…
Learning Analytics for Communities of Inquiry
ERIC Educational Resources Information Center
Kovanovic, Vitomir; Gaševic, Dragan; Hatala, Marek
2014-01-01
This paper describes doctoral research that focuses on the development of a learning analytics framework for inquiry-based digital learning. Building on the Community of Inquiry model (CoI)--a foundation commonly used in the research and practice of digital learning and teaching--this research builds on the existing body of knowledge in two…
ERIC Educational Resources Information Center
Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.
2012-01-01
Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…
Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC): User Guide. Version 3
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Bednarcyk, B. A.; Wilt, T. E.; Trowbridge, D.
1999-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is described for the recently developed, computationally efficient and comprehensive micromechanics analysis code, MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC/GMC is a versatile form of research software that "drives" the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC/GMC enhances the basic capabilities of GMC by providing a modular framework wherein 1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, 2) different integration algorithms may be selected, 3) a variety of material constitutive models (both deformation and life) may be utilized and/or implemented, 4) a variety of fiber architectures (unidirectional, laminate and woven) may be easily accessed through their corresponding representative volume elements contained within the supplied library of RVEs or input directly by the user, and 5) graphical post processing of the macro and/or micro field quantities is made available.
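The simplest instance of predicting an effective composite property from constituent properties is the rule-of-mixtures bounds. The sketch below is a heavily simplified, hypothetical illustration only; GMC itself is a far more general micromechanics model and is not reproduced here:

```python
# Minimal illustration (not the GMC algorithm): Voigt and Reuss bounds on
# the effective modulus of a two-phase composite, from constituent moduli
# and the fiber volume fraction. Example values are hypothetical.
def voigt_modulus(E_f, E_m, v_f):
    """Rule-of-mixtures (iso-strain) upper bound."""
    return v_f * E_f + (1.0 - v_f) * E_m

def reuss_modulus(E_f, E_m, v_f):
    """Inverse rule-of-mixtures (iso-stress) lower bound."""
    return 1.0 / (v_f / E_f + (1.0 - v_f) / E_m)

E_fiber, E_matrix, vf = 230.0, 3.5, 0.6  # GPa, GPa, fiber volume fraction
print(voigt_modulus(E_fiber, E_matrix, vf))  # ~139.4 GPa (axial direction)
print(reuss_modulus(E_fiber, E_matrix, vf))  # ~8.55 GPa (transverse direction)
```

Methods such as GMC refine this idea by resolving the stress and strain fields over subcells of a periodic repeating unit, rather than assuming uniform stress or strain.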
The rise of environmental analytical chemistry as an interdisciplinary activity.
Brown, Richard
2009-07-01
Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions relating to analytical and/or environmental chemistry, whether presenting new data, methods, case studies, or instrumentation, or new interpretations and developments of existing ones, are welcome in the Analytical and Environmental Chemistry domains and will be considered equally.
Vortex Core Size in the Rotor Near-Wake
NASA Technical Reports Server (NTRS)
Young, Larry A.
2003-01-01
Using a kinetic energy conservation approach, a number of simple analytic expressions are derived for estimating the core size of tip vortices in the near-wake of rotors in hover and axial-flow flight. The influence of thrust, induced power losses, advance ratio, and vortex structure on rotor vortex core size is assessed. Experimental data from the literature are compared to the analytical results derived in this paper. In general, three conclusions can be drawn from the work in this paper. First, the greater the rotor thrust, the larger the vortex core size in the rotor near-wake. Second, the more efficient a rotor is with respect to induced power losses, the smaller the resulting vortex core size. Third, and lastly, vortex core size initially decreases for low axial-flow advance ratios, but for large advance ratios core size asymptotically increases to a nominal upper limit. Insights gained from this work should enable improved modeling of rotary-wing aerodynamics, as well as provide a framework for improved experimental investigations of rotor and advanced propeller wakes.
Quantifying drivers of wild pig movement across multiple spatial and temporal scales.
Kay, Shannon L; Fischer, Justin W; Monaghan, Andrew J; Beasley, James C; Boughton, Raoul; Campbell, Tyler A; Cooper, Susan M; Ditchkoff, Stephen S; Hartley, Steve B; Kilgo, John C; Wisely, Samantha M; Wyckoff, A Christy; VerCauteren, Kurt C; Pepin, Kim M
2017-01-01
The movement behavior of an animal is determined by extrinsic and intrinsic factors that operate at multiple spatio-temporal scales, yet much of our knowledge of animal movement comes from studies that examine only one or two scales concurrently. Understanding the drivers of animal movement across multiple scales is crucial for understanding the fundamentals of movement ecology, predicting changes in distribution, describing disease dynamics, and identifying efficient methods of wildlife conservation and management. We obtained over 400,000 GPS locations of wild pigs from 13 different studies spanning six states in the southern U.S.A., and quantified movement rates and home range size within a single analytical framework. We used a generalized additive mixed model framework to quantify the effects of five broad predictor categories on movement: individual-level attributes, geographic factors, landscape attributes, meteorological conditions, and temporal variables. We examined effects of predictors across three temporal scales: daily, monthly, and using all data during the study period. We considered both local environmental factors such as daily weather data and distance to various resources on the landscape, as well as factors acting at a broader spatial scale such as ecoregion and season. We found that meteorological variables (temperature and pressure), landscape features (distance to water sources), a broad-scale geographic factor (ecoregion), and individual-level characteristics (sex-age class) drove wild pig movement across all scales, but both the magnitude and shape of covariate relationships to movement differed across temporal scales. The analytical framework we present can be used to assess movement patterns arising from multiple data sources for a range of species while accounting for spatio-temporal correlations.
Our analyses show the magnitude by which reaction norms can change based on the temporal scale of response data, illustrating the importance of appropriately defining temporal scales of both the movement response and covariates depending on the intended implications of research (e.g., predicting effects of movement due to climate change versus planning local-scale management). We argue that consideration of multiple spatial scales within the same framework (rather than comparing across separate studies post hoc) gives a more accurate quantification of cross-scale spatial effects by appropriately accounting for error correlation.
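A minimal, hypothetical sketch of one ingredient of such an analysis is computing movement rates between successive GPS fixes; the function names and sample coordinates below are illustrative and not taken from the study:

```python
# Hedged sketch (illustrative, not the study's code): movement rate between
# successive GPS fixes via the haversine great-circle distance.
import math

def haversine_m(lat1, lon1, lat2, lon2, r_earth=6_371_000.0):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

def movement_rates(fixes):
    """fixes: list of (t_seconds, lat, lon); returns metres/hour between fixes."""
    rates = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        d = haversine_m(la0, lo0, la1, lo1)
        rates.append(d / ((t1 - t0) / 3600.0))
    return rates
```

In a full analysis, rates like these become the response variable of a mixed model with individual-level random effects, so that repeated fixes from the same animal are not treated as independent.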
Elliptic-cylindrical analytical flux-rope model for ICMEs
NASA Astrophysics Data System (ADS)
Nieves-Chinchilla, T.; Linton, M.; Hidalgo, M. A. U.; Vourlidas, A.
2016-12-01
We present an analytical flux-rope model for realistic magnetic structures embedded in Interplanetary Coronal Mass Ejections. The framework of this model was established by Nieves-Chinchilla et al. (2016) with the circular-cylindrical analytical flux-rope model, under the concept developed by Hidalgo et al. (2002). Elliptic-cylindrical geometry establishes the first grade of complexity in a series of models. The model attempts to describe the magnetic flux-rope topology with a distorted cross-section, a possible consequence of interaction with the solar wind. In this model, the flux rope is completely described in non-Euclidean geometry. The Maxwell equations are solved using tensor calculus consistent with the chosen geometry, with invariance along the axial component and the sole assumption of no radial current density. The model is generalized in terms of the radial dependence of the poloidal and axial current density components. The misalignment between current density and magnetic field is studied in detail for the individual cases of different pairs of indices for the axial and poloidal current density components. This theoretical analysis provides a map of the force distribution inside the flux rope. The reconstruction technique has been adapted to the model and compared with a set of in situ ICME events with different in situ signatures. The successful results are limited to some cases with clear in situ signatures of distortion. However, the model adds a piece to the puzzle of the physical-analytical representation of these magnetic structures. Other effects such as axial curvature, expansion, and/or interaction could be incorporated in the future to fully understand the magnetic structure. Finally, the mathematical formulation of this model opens the door to the next model: a toroidal flux-rope analytical model.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo
2011-07-01
We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.
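One common way to pool effect sizes across independent samples, shown here as a hedged sketch, is DerSimonian-Laird random-effects estimation; the abstract above does not specify which estimator the authors actually used, so this is illustrative only:

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of effect sizes.
# The paper's exact meta-analytic procedure is not specified here; this is
# one standard estimator, shown for illustration.
def dersimonian_laird(effects, variances):
    """Return the random-effects pooled estimate for per-study effects."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic and the DL tau-squared estimate.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight with between-study variance added to each study's variance.
    w_re = [1.0 / (v + tau2) for v in variances]
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

With equal study variances the pooled estimate reduces to the simple mean; heterogeneous variances shift the weights toward the more precise studies, tempered by the between-study variance.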
Conceptual framework for outcomes research studies of hepatitis C: an analytical review
Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M
2016-01-01
Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473
Earth Science Data Fusion with Event Building Approach
NASA Technical Reports Server (NTRS)
Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.
2015-01-01
Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve efficiency of Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
Evaluating efficiency-equality tradeoffs for mobile source control strategies in an urban area
Levy, Jonathan I.; Greco, Susan L.; Melly, Steven J.; Mukhi, Neha
2013-01-01
In environmental risk management, there are often interests in maximizing public health benefits (efficiency) and addressing inequality in the distribution of health outcomes. However, both dimensions are not generally considered within a single analytical framework. In this study, we estimate both total population health benefits and changes in quantitative indicators of health inequality for a number of alternative spatial distributions of diesel particulate filter retrofits across half of an urban bus fleet in Boston, Massachusetts. We focus on the impact of emissions controls on primary fine particulate matter (PM2.5) emissions, modeling the effect on PM2.5 concentrations and premature mortality. Given spatial heterogeneity in baseline mortality rates, we apply the Atkinson index and other inequality indicators to quantify changes in the distribution of mortality risk. Across the different spatial distributions of control strategies, the public health benefits varied by more than a factor of two, related to factors such as mileage driven per day, population density near roadways, and baseline mortality rates in exposed populations. Changes in health inequality indicators varied across control strategies, with the subset of optimal strategies considering both efficiency and equality generally robust across different parametric assumptions and inequality indicators. Our analysis demonstrates the viability of formal analytical approaches to jointly address both efficiency and equality in risk assessment, providing a tool for decision-makers who wish to consider both issues. PMID:18793281
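A short, self-contained sketch of the Atkinson index mentioned above follows; the parameterization with an inequality-aversion parameter eps is the standard textbook form, and its application here is illustrative rather than the authors' exact computation:

```python
# Hedged sketch: the Atkinson inequality index over a distribution of
# positive values (e.g., mortality risks), with inequality-aversion eps.
# eps != 1 uses the power mean; eps == 1 uses the geometric mean.
import math

def atkinson(values, eps):
    n = len(values)
    mean = sum(values) / n
    if eps == 1.0:
        log_gm = sum(math.log(v) for v in values) / n
        return 1.0 - math.exp(log_gm) / mean
    ede = (sum(v ** (1.0 - eps) for v in values) / n) ** (1.0 / (1.0 - eps))
    return 1.0 - ede / mean

print(atkinson([1.0, 1.0, 1.0, 1.0], 0.5))  # 0.0 for a perfectly equal distribution
```

The index is 0 for a perfectly equal distribution and approaches 1 as inequality grows, which makes it convenient for ranking control strategies by their distributional impact.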
Gradient design for liquid chromatography using multi-scale optimization.
López-Ureña, S; Torres-Lapasió, J R; Donat, R; García-Alvarez-Coque, M C
2018-01-26
In reversed phase-liquid chromatography, the usual solution to the "general elution problem" is the application of gradient elution with programmed changes of organic solvent (or other properties). A correct quantification of chromatographic peaks in liquid chromatography requires well resolved signals in a proper analysis time. When the complexity of the sample is high, the gradient program should be accommodated to the local resolution needs of each analyte. This makes the optimization of such situations rather troublesome, since enhancing the resolution for a given analyte may imply a collateral worsening of the resolution of other analytes. The aim of this work is to design multi-linear gradients that maximize the resolution, while fulfilling some restrictions: all peaks should be eluted before a given maximal time, the gradient should be flat or increasing, and sudden changes close to eluting peaks are penalized. Consequently, an equilibrated baseline resolution for all compounds is sought. This goal is achieved by splitting the optimization problem in a multi-scale framework. In each scale κ, an optimization problem is solved with N_κ ≈ 2^κ variables that are used to build the gradients. The N_κ variables define cubic splines written in terms of a B-spline basis. This allows expressing gradients as polygonals of M points approximating the splines. The cubic splines are built using subdivision schemes, a technique of fast generation of smooth curves, compatible with the multi-scale framework. Owing to the nature of the problem and the presence of multiple local maxima, the algorithm used in the optimization problem of each scale κ should be "global", such as the pattern-search algorithm. The multi-scale optimization approach is successfully applied to find the best multi-linear gradient for resolving a mixture of amino acid derivatives. Copyright © 2017 Elsevier B.V. All rights reserved.
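The "global" pattern-search algorithm mentioned above can be illustrated with a generic compass-search sketch; this is a textbook variant under stated assumptions, not the authors' implementation:

```python
# Hedged sketch: basic compass/pattern search, a derivative-free optimizer
# of the kind the abstract mentions (generic, not the paper's version).
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Greedy coordinate-stencil search; halves the step when stuck."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5  # shrink the stencil when no move helps
            if step < tol:
                break
    return x, fx

# Minimise a simple quadratic as a smoke test; minimum at (2, -1).
xmin, fmin = pattern_search(lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

In a multi-scale scheme, a search like this would be run at each scale κ over the current set of B-spline coefficients, with the previous scale's solution as the starting point.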
Tinghög, Gustav; Carlsson, Per
2012-12-01
To operationalise and apply a conceptual framework for exploring when health services contain characteristics that facilitate individuals' ability to take individual responsibility for health care through out-of-pocket payment. In addition, we investigate if the levels of out-of-pocket payment for assistive devices (ADs) in Sweden are in line with the proposed framework. Focus groups were used to operationalise the core concepts of sufficient knowledge, individual autonomy, positive externalities, sufficient demand, affordability, and lifestyle enhancement into a measurable and replicable rationing tool. A selection of 28 ADs were graded separately as having high, medium, or low suitability for private financing according to the measurement scale provided through the operationalised framework. To investigate the actual level of private financing, a questionnaire about the level of out-of-pocket payment for the specific ADs was administered to county councils in Sweden. Concepts were operationalised into three levels indicating possible suitability for private financing. Responses to the questionnaire indicate that financing of ADs in Sweden varies across county councils as regards co-payment, full payment, discretionary payment for certain healthcare consumer groups, and full reimbursement. According to the framework, ADs commonly funded privately were generally considered to be more suitable for private financing. Sufficient knowledge, individual autonomy, and sufficient demand did not appear to influence why certain ADs were financed out-of-pocket. The level of positive externalities, affordability, and lifestyle enhancement appeared to be somewhat higher for ADs that were financed out-of-pocket, but the differences were small. Affordability seemed to be the most influential concept.
Temporal efficiency evaluation and small-worldness characterization in temporal networks
Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu
2016-01-01
Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of the temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model, and based on this plus the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks. PMID:27682314
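The basic construction behind temporal efficiency can be made concrete. The sketch below is a simplified illustration, not the authors' redefined metrics: the temporal distance between two nodes is taken as the earliest snapshot index at which one can be reached from the other along time-respecting paths, and global temporal efficiency averages the reciprocal distances over all ordered pairs.

```python
# Minimal temporal-efficiency sketch: earliest-arrival "distances" over a
# sequence of undirected snapshots, then the mean of their reciprocals.

def temporal_distances(snapshots, n):
    """Earliest-arrival distance (in snapshot steps) between all node pairs."""
    dist = [[float("inf")] * n for _ in range(n)]
    for src in range(n):
        reached = {src}
        dist[src][src] = 0
        for t, edges in enumerate(snapshots, start=1):
            # Nodes newly reachable via a contact active at time t.
            new = {v for u, v in edges if u in reached} | \
                  {u for u, v in edges if v in reached}
            for v in new - reached:
                dist[src][v] = t
            reached |= new
    return dist

def global_temporal_efficiency(snapshots, n):
    """Mean reciprocal temporal distance; unreachable pairs contribute zero."""
    dist = temporal_distances(snapshots, n)
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    return sum(1.0 / dist[i][j] for i, j in pairs) / len(pairs)

# Three snapshots of a 4-node network: a chain that only opens up over time,
# so node 3 can be reached from node 0 but not vice versa.
snaps = [[(0, 1)], [(1, 2)], [(2, 3)]]
eff = global_temporal_efficiency(snaps, 4)
```

The asymmetry of `dist` in this example (3 reaches 2 but never 0) is exactly the time dependency that makes temporal metrics differ from their static counterparts.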
Gerber, Stefan; Brookshire, E N Jack
2014-03-01
Nutrient limitation in terrestrial ecosystems is often accompanied by a nearly closed vegetation-soil nutrient cycle. The ability to retain nutrients in an ecosystem requires the capacity of the plant-soil system to draw down nutrient levels in soils effectively, such that export concentrations in soil solutions remain low. Here we address the physical constraints on plant nutrient uptake that may be imposed by the diffusive movement of nutrients in soils, by uptake at the root/mycorrhizal surface, and by interactions with soil water flow. We derive an analytical framework of soil nutrient transport and uptake and predict levels of plant-available nutrient concentration and residence time. Our results, which we evaluate for nitrogen, show that the physical environment permits plants to lower soil solute concentrations substantially. Our analysis confirms that plant uptake capacities in soils are considerable, such that water movement in soils is generally too small to significantly erode dissolved plant-available nitrogen. Inorganic nitrogen concentrations in headwater streams are congruent with the predictions of our theoretical framework. Our framework offers a physically based parameterization of nutrient uptake in ecosystem models and has the potential to serve as an important tool toward scaling biogeochemical cycles from individual roots to landscapes.
A Stochastic Water Balance Framework for Lowland Watersheds
NASA Astrophysics Data System (ADS)
Thompson, Sally; MacVean, Lissa; Sivapalan, Murugesu
2017-11-01
The water balance dynamics in lowland watersheds are influenced not only by local hydroclimatic controls on energy and water availability, but also by imports of water from the upstream watershed. These imports result in a stochastic extent of inundation in lowland watersheds that is determined by the local flood regime, watershed topography, and the rate of loss processes such as drainage and evaporation. Thus, lowland watershed water balances depend on two stochastic processes—rainfall and local inundation dynamics. Lowlands are high productivity environments that are disproportionately associated with urbanization, high productivity agriculture, biodiversity, and flood risk. Consequently, they are being rapidly altered by human development—generally with clear economic and social motivation—but also with significant trade-offs in ecosystem services provision, directly related to changes in the components and variability of the lowland water balance. We present a stochastic framework to assess the lowland water balance and its sensitivity to two common human interventions—replacement of native vegetation with alternative land uses, and construction of local flood protection levees. By providing analytical solutions for the mean and PDF of the water balance components, the proposed framework provides a mechanism to connect human interventions to hydrologic outcomes, and, in conjunction with ecosystem service production estimates, to evaluate trade-offs associated with lowland watershed development.
Temporal efficiency evaluation and small-worldness characterization in temporal networks
NASA Astrophysics Data System (ADS)
Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu
2016-09-01
Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of the temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model, and based on this plus the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks.
On species persistence-time distributions.
Suweis, S; Bertuzzo, E; Mari, L; Rodriguez-Iturbe, I; Maritan, A; Rinaldo, A
2012-06-21
We present new theoretical and empirical results on the probability distributions of species persistence times in natural ecosystems. Persistence times, defined as the time spans between species' colonization and local extinction in a given geographic region, are empirically estimated from local observations of species' presence/absence. A connected sampling problem is presented, generalized and solved analytically. Species persistence is shown to provide a direct connection with key spatial macroecological patterns like species-area and endemics-area relationships. Our empirical analysis pertains to two different ecosystems and taxa: a herbaceous plant community and an estuarine fish database. Despite the substantial differences in ecological interactions and spatial scales, we confirm earlier evidence on the general properties of the scaling of persistence times, including the predicted effects of the structure of the spatial interaction network. The framework tested here allows direct investigation of the nature and extent of spatial effects in the context of ecosystem dynamics. The notable coherence between spatial and temporal macroecological patterns, theoretically derived and empirically verified, is suggested to underlie general features of the dynamic evolution of ecosystems. Copyright © 2012 Elsevier Ltd. All rights reserved.
Comparison and Contrast of Two General Functional Regression Modeling Frameworks
Morris, Jeffrey S.
2017-01-01
In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502
Comparison and Contrast of Two General Functional Regression Modeling Frameworks.
Morris, Jeffrey S
2017-02-01
In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yi; Di Marco, Emanuele; Lykken, Joe
2014-10-17
In this technical note we present details on various aspects of the framework introduced in arXiv:1401.2077 aimed at extracting effective Higgs couplings in the $h \to 4\ell$ `golden channel'. Since it is the primary feature of the framework, we focus in particular on the convolution integral which takes us from `truth' level to `detector' level, and on the numerical and analytic techniques used to obtain it. We also briefly discuss other aspects of the framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bovy, Jo, E-mail: bovy@ias.edu
I describe the design, implementation, and usage of galpy, a python package for galactic-dynamics calculations. At its core, galpy consists of a general framework for representing galactic potentials both in python and in C (for accelerated computations); galpy functions, objects, and methods can generally take arbitrary combinations of these as arguments. Numerical orbit integration is supported with a variety of Runge-Kutta-type and symplectic integrators. For planar orbits, integration of the phase-space volume is also possible. galpy supports the calculation of action-angle coordinates and orbital frequencies for a given phase-space point for general spherical potentials, using state-of-the-art numerical approximations for axisymmetric potentials, and making use of a recent general approximation for any static potential. A number of different distribution functions (DFs) are also included in the current release; currently, these consist of two-dimensional axisymmetric and non-axisymmetric disk DFs, a three-dimensional disk DF, and a DF framework for tidal streams. I provide several examples to illustrate the use of the code. I present a simple model for the Milky Way's gravitational potential consistent with the latest observations. I also numerically calculate the Oort functions for different tracer populations of stars and compare them to a new analytical approximation. Additionally, I characterize the response of a kinematically warm disk to an elliptical m = 2 perturbation in detail. Overall, galpy consists of about 54,000 lines, including 23,000 lines of code in the module, 11,000 lines of test code, and about 20,000 lines of documentation. The test suite covers 99.6% of the code. galpy is available at http://github.com/jobovy/galpy with extensive documentation available at http://galpy.readthedocs.org/en/latest.
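The symplectic orbit integration that galpy provides can be illustrated in miniature without the package itself. The stdlib-only sketch below is not galpy code; the potential, parameter values, and names are illustrative. It integrates a planar orbit in a cored logarithmic potential with a leapfrog (kick-drift-kick) scheme, using energy conservation as the correctness check.

```python
# Self-contained leapfrog illustration of symplectic orbit integration in a
# logarithmic halo potential Phi(r) = (v0^2 / 2) ln(r^2 + rc^2).
import math

V0, RC = 1.0, 0.1  # circular speed and core radius (natural units; illustrative)

def accel(x, y):
    r2 = x * x + y * y + RC * RC
    return -V0 * V0 * x / r2, -V0 * V0 * y / r2

def energy(x, y, vx, vy):
    return 0.5 * (vx * vx + vy * vy) + \
        0.5 * V0 * V0 * math.log(x * x + y * y + RC * RC)

def leapfrog(x, y, vx, vy, dt, nsteps):
    """Kick-drift-kick leapfrog: symplectic, so energy error stays bounded."""
    ax, ay = accel(x, y)
    for _ in range(nsteps):
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay          # half kick
        x += dt * vx
        y += dt * vy                 # drift
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay          # half kick
    return x, y, vx, vy

# A mildly eccentric orbit, integrated for many dynamical times.
x0, y0, vx0, vy0 = 1.0, 0.0, 0.0, 0.8
E0 = energy(x0, y0, vx0, vy0)
xf, yf, vxf, vyf = leapfrog(x0, y0, vx0, vy0, dt=0.01, nsteps=20000)
drift = abs(energy(xf, yf, vxf, vyf) - E0)
```

In galpy itself the same experiment is a few calls on an `Orbit` object with a chosen potential; the point here is only the bounded energy drift that distinguishes symplectic integrators from generic Runge-Kutta schemes.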
Vortex lattices and defect-mediated viscosity reduction in active liquids
NASA Astrophysics Data System (ADS)
Slomka, Jonasz; Dunkel, Jorn
2016-11-01
Generic pattern-formation and viscosity-reduction mechanisms in active fluids are investigated using a generalized Navier-Stokes model that captures the experimentally observed bulk vortex dynamics in microbial suspensions. We present exact analytical solutions including stress-free vortex lattices and introduce a computational framework that allows the efficient treatment of previously intractable higher-order shear boundary conditions. Large-scale parameter scans identify the conditions for spontaneous flow symmetry breaking, defect-mediated low-viscosity phases and negative-viscosity states amenable to energy harvesting in confined suspensions. The theory uses only generic assumptions about the symmetries and long-wavelength structure of active stress tensors, suggesting that inviscid phases may be achievable in a broad class of non-equilibrium fluids by tuning confinement geometry and pattern scale selection.
Geometry-dependent viscosity reduction in sheared active fluids
NASA Astrophysics Data System (ADS)
Słomka, Jonasz; Dunkel, Jörn
2017-04-01
We investigate flow pattern formation and viscosity reduction mechanisms in active fluids by studying a generalized Navier-Stokes model that captures the experimentally observed bulk vortex dynamics in microbial suspensions. We present exact analytical solutions including stress-free vortex lattices and introduce a computational framework that allows the efficient treatment of higher-order shear boundary conditions. Large-scale parameter scans identify the conditions for spontaneous flow symmetry breaking, geometry-dependent viscosity reduction, and negative-viscosity states amenable to energy harvesting in confined suspensions. The theory uses only generic assumptions about the symmetries and long-wavelength structure of active stress tensors, suggesting that inviscid phases may be achievable in a broad class of nonequilibrium fluids by tuning confinement geometry and pattern scale selection.
Child Development in Developing Countries: Introduction and Methods
Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.
2011-01-01
The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This Introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles in this Special Section. The articles that follow describe the situations of children with successive foci on nutrition, parenting, discipline and violence, and the home environment addressing two common questions: How do developing and underresearched countries in the world vary with respect to these central indicators of children's development? and How do key indicators of national development relate to child development in each of these substantive areas? The Special Section concludes with policy implications from the international findings. PMID:22277004
Controlling the light shift of the CPT resonance by modulation technique
NASA Astrophysics Data System (ADS)
Tsygankov, E. A.; Petropavlovsky, S. V.; Vaskovskaya, M. I.; Zibrov, S. A.; Velichansky, V. L.; Yakovlev, V. P.
2017-12-01
Motivated by recent developments in atomic frequency standards employing the effect of coherent population trapping (CPT), we propose a theoretical framework for the frequency modulation spectroscopy of the CPT resonances. Under realistic assumptions we provide simple yet non-trivial analytical formulae for the major spectroscopic signals such as the CPT resonance line and the in-phase/quadrature responses. We discuss the influence of the light shift and, in particular, derive a simple expression for the displacement of the resonance as a function of modulation index. The performance of the model is checked against numerical simulations; the agreement ranges from good to excellent. The obtained results can be used in more general models accounting for light absorption in optically thick media.
Cost-efficiency trade-off and the design of thermoelectric power generators.
Yazawa, Kazuaki; Shakouri, Ali
2011-09-01
The energy conversion efficiency of today's thermoelectric generators is significantly lower than that of conventional mechanical engines. Almost all of the existing research is focused on materials to improve the conversion efficiency. Here we propose a general framework to study the cost-efficiency trade-off for thermoelectric power generation. A key factor is the optimization of thermoelectric modules together with their heat sources and heat sinks. Full electrical and thermal co-optimization yields a simple analytical expression for the optimum design. Based on this model, power output per unit mass can be maximized. We show that the fractional area coverage of thermoelectric elements in a module could play a significant role in reducing the cost of power generation systems.
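The co-optimization the abstract refers to has a well known thermal-impedance matching flavor, which a short numerical sketch can show. Everything below is a hedged simplification with invented numbers, not the authors' model: Peltier heat feedback is neglected (a small-ZT limit), and the matched-load power of a single leg, P(d) ~ (S dT(d))^2 / (4 R(d)), is swept over leg thickness d.

```python
# Hedged sketch: with external (source/sink) thermal resistances PSI_EXT in
# series with the leg and Peltier feedback neglected, matched-load power is
# maximized when the leg's thermal resistance equals PSI_EXT. Illustrative values.
S, RHO, K = 1e-4, 1e-5, 1.5       # Seebeck (V/K), resistivity (ohm m), th. cond. (W/m K)
A = 1e-6                          # leg cross-section (m^2)
TH, TC = 500.0, 300.0             # source / sink temperatures (K)
PSI_EXT = 100.0                   # combined external thermal resistance (K/W)

def power(d):
    """Matched-load output power of one leg of thickness d (m), small-ZT limit."""
    psi_te = d / (K * A)                        # leg thermal resistance
    dT = (TH - TC) * psi_te / (PSI_EXT + psi_te)  # temperature drop across the leg
    R = RHO * d / A                             # internal electrical resistance
    return (S * dT) ** 2 / (4.0 * R)

# Sweep leg thickness from 1 um to 1 mm and locate the optimum numerically.
ds = [1e-6 * i for i in range(1, 1001)]
d_opt = max(ds, key=power)
psi_opt = d_opt / (K * A)         # should land near PSI_EXT by the matching argument
```

Differentiating P(d) in this simplified model gives the optimum exactly at psi_te = PSI_EXT, which is why thinner (cheaper) legs paired with better heat sinks can beat thick legs on cost per watt.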
Geometric quantification of features in large flow fields.
Kendall, Wesley; Huang, Jian; Peterka, Tom
2012-01-01
Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.
Crossover physics in the nonequilibrium dynamics of quenched quantum impurity systems.
Vasseur, Romain; Trinh, Kien; Haas, Stephan; Saleur, Hubert
2013-06-14
A general framework is proposed to tackle local quantum quenches in integrable impurity systems analytically, combining a mapping onto a boundary problem with the form factor approach to boundary-condition-changing operators introduced by Lesage and Saleur [Phys. Rev. Lett. 80, 4370 (1998)]. We discuss how to compute exactly the following two central quantities of interest: the Loschmidt echo and the distribution of the work done during the quantum quench. Our results display an interesting crossover physics characterized by the energy scale T(b) of the impurity corresponding to the Kondo temperature. We discuss in detail the noninteracting case as a paradigm and benchmark for more complicated integrable impurity models and check our results using numerical methods.
Recent applications of nanomaterials in capillary electrophoresis.
González-Curbelo, Miguel Ángel; Varela-Martínez, Diana Angélica; Socas-Rodríguez, Bárbara; Hernández-Borges, Javier
2017-10-01
Nanomaterials have found an important place in Analytical Chemistry and, in particular, in Separation Science. Among them, metal-organic frameworks, magnetic and non-magnetic nanoparticles, carbon nanotubes and graphene, as well as their combinations, are the most important nanomaterials that have been used up to now. Concerning capillary electromigration techniques, these nanomaterials have also been used both as pseudostationary phases in electrokinetic chromatography (EKC) and as stationary phases in microchip capillary electrophoresis (CE) and capillary electrochromatography (CEC), as a result of their interesting and particular properties. This review article aims to provide a general and critical review of the most recent applications of nanomaterials in this field (period 2010-2017). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An Analytical Framework for the Cross-Country Comparison of Higher Education Governance
ERIC Educational Resources Information Center
Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria
2011-01-01
In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…
A Human Dimensions Framework: Guidelines for Conducting Social Assessments
Alan D. Bright; H. Ken Cordell; Anne P. Hoover; Michael A Tarrant
2003-01-01
This paper provides a framework and guidelines for identifying and organizing human dimension information for use in forest planning. It synthesizes concepts from a variety of social science disciplines and connects them with measurable indicators for use in analysis and reporting. Suggestions of analytical approaches and sources of data for employment of the...
A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare
ERIC Educational Resources Information Center
Yahav, Inbal
2010-01-01
In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…
How Do Mathematicians Learn Math?: Resources and Acts for Constructing and Understanding Mathematics
ERIC Educational Resources Information Center
Wilkerson-Jerde, Michelle H.; Wilensky, Uri J.
2011-01-01
In this paper, we present an analytic framework for investigating expert mathematical learning as the process of building a "network of mathematical resources" by establishing relationships between different components and properties of mathematical ideas. We then use this framework to analyze the reasoning of ten mathematicians and mathematics…
Focus for Area Development Analysis: Urban Orientation of Counties.
ERIC Educational Resources Information Center
Bluestone, Herman
The orientation of counties to metropolitan systems and urban centers is identified by population density and percentage of urban population. This analytical framework differentiates 6 kinds of counties, ranging from most urban-oriented (group 1) to least urban-oriented (group 6). With this framework, it can be seen that the economic well-being of…
Analyzing Educators' Online Interactions: A Framework of Online Learning Support Roles
ERIC Educational Resources Information Center
Nacu, Denise C.; Martin, Caitlin K.; Pinkard, Nichole; Gray, Tené
2016-01-01
While the potential benefits of participating in online learning communities are documented, so too are inequities in terms of how different populations access and use them. We present the online learning support roles (OLSR) framework, an approach using both automated analytics and qualitative interpretation to identify and explore online…
Mind-Sets Matter: A Meta-Analytic Review of Implicit Theories and Self-Regulation
ERIC Educational Resources Information Center
Burnette, Jeni L.; O'Boyle, Ernest H.; VanEpps, Eric M.; Pollack, Jeffrey M.; Finkel, Eli J.
2013-01-01
This review builds on self-control theory (Carver & Scheier, 1998) to develop a theoretical framework for investigating associations of implicit theories with self-regulation. This framework conceptualizes self-regulation in terms of 3 crucial processes: goal setting, goal operating, and goal monitoring. In this meta-analysis, we included…
ERIC Educational Resources Information Center
Duhn, Iris; Fleer, Marilyn; Harrison, Linda
2016-01-01
This article focuses on the "Relational Agency Framework" (RAF), an analytical tool developed for an Australian review and evaluation study of an early years' policy initiative. We draw on Anne Edwards's concepts of "relational expertise", "building common knowledge" and "relational agency" to explore how…
An Analytic Framework to Support E.Learning Strategy Development
ERIC Educational Resources Information Center
Marshall, Stephen J.
2012-01-01
Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…
University Reform and Institutional Autonomy: A Framework for Analysing the Living Autonomy
ERIC Educational Resources Information Center
Maassen, Peter; Gornitzka, Åse; Fumasoli, Tatiana
2017-01-01
In this article we discuss recent university reforms aimed at enhancing university autonomy, highlighting various tensions in the underlying reform ideologies. We examine how the traditional interpretation of university autonomy has been expanded in the reform rationales. An analytical framework for studying how autonomy is interpreted and used…
ERIC Educational Resources Information Center
Wu, Ying-Tien; Tsai, Chin-Chung
2007-01-01
Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…
ERIC Educational Resources Information Center
Adler, Jill; Ronda, Erlina
2015-01-01
We describe and use an analytical framework to document mathematics discourse in instruction (MDI), and interpret differences in mathematics teaching. MDI is characterised by four interacting components in the teaching of a mathematics lesson: exemplification (occurring through a sequence of examples and related tasks), explanatory talk (talk that…
Tracking the debate around marine protected areas: key issues and the BEG framework.
Thorpe, Andy; Bavinck, Maarten; Coulthard, Sarah
2011-04-01
Marine conservation is often criticized for a mono-disciplinary approach, which delivers fragmented solutions to complex problems with differing interpretations of success. As a means of reflecting on the breadth and range of scientific research on the management of the marine environment, this paper develops an analytical framework to gauge the foci of policy documents and published scientific work on Marine Protected Areas. We evaluate the extent to which MPA research articles delineate objectives around three domains: biological-ecological [B]; economic-social [E]; and governance-management [G]. This permits us to develop an analytic [BEG] framework which we then test on a sample of selected journal article cohorts. While the framework reveals the dominance of biologically focussed research [B], analysis also reveals a growing frequency of the use of governance/management terminology in the literature over the last 15 years, which may be indicative of a shift towards more integrated consideration of governance concerns. However, consideration of the economic/social domain appears to lag behind biological and governance concerns in both frequency and presence in MPA literature.
Hydrostatic equilibrium of stars without electroneutrality constraint
NASA Astrophysics Data System (ADS)
Krivoruchenko, M. I.; Nadyozhin, D. K.; Yudin, A. V.
2018-04-01
The general solution of hydrostatic equilibrium equations for a two-component fluid of ions and electrons without a local electroneutrality constraint is found in the framework of Newtonian gravity theory. In agreement with the Poincaré theorem on analyticity and in the context of Dyson's argument, the general solution is demonstrated to possess a fixed (essential) singularity in the gravitational constant G at G =0 . The regular component of the general solution can be determined by perturbation theory in G starting from a locally neutral solution. The nonperturbative component obtained using the method of Wentzel, Kramers and Brillouin is exponentially small in the inner layers of the star and grows rapidly in the outward direction. Near the surface of the star, both components are comparable in magnitude, and their nonlinear interplay determines the properties of an electro- or ionosphere. The stellar charge varies within the limits of -0.1 to 150 C per solar mass. The properties of electro- and ionospheres are exponentially sensitive to variations of the fluid densities in the central regions of the star. The general solutions of two exactly solvable stellar models without a local electroneutrality constraint are also presented.
Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.
ERIC Educational Resources Information Center
Steward, Ann Harleman
Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2009-01-01
Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
ERIC Educational Resources Information Center
Schoendorff, Benjamin; Steinwachs, Joanne
2012-01-01
How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
DIVE: A Graph-based Visual Analytics Framework for Big Data
Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie
2014-01-01
The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.
Applying network theory to animal movements to identify properties of landscape space use.
Bastille-Rousseau, Guillaume; Douglas-Hamilton, Iain; Blake, Stephen; Northrup, Joseph M; Wittemyer, George
2018-04-01
Network (graph) theory is a popular analytical framework to characterize the structure and dynamics among discrete objects and is particularly effective at identifying critical hubs and patterns of connectivity. The identification of such attributes is a fundamental objective of animal movement research, yet network theory has rarely been applied directly to animal relocation data. We develop an approach that allows the analysis of movement data using network theory by defining occupied pixels as nodes and connections among these pixels as edges. We first quantify node-level (local) metrics and graph-level (system) metrics on simulated movement trajectories to assess the ability of these metrics to pull out known properties in movement paths. We then apply our framework to empirical data from African elephants (Loxodonta africana), giant Galapagos tortoises (Chelonoidis spp.), and mule deer (Odocoileus hemionus). Our results indicate that certain node-level metrics, namely degree, weight, and betweenness, perform well in capturing local patterns of space use, such as the definition of core areas and paths used for inter-patch movement. These metrics were generally applicable across data sets, indicating their robustness to assumptions structuring analysis or strategies of movement. Other metrics capture local patterns effectively, but were sensitive to specified graph properties, indicating case-specific applications. Our analysis indicates that graph-level metrics are unlikely to outperform other approaches for the categorization of general movement strategies (central place foraging, migration, nomadism). By identifying critical nodes, our approach provides a robust quantitative framework to identify local properties of space use that can be used to evaluate the effect of the loss of specific nodes on range-wide connectivity.
Our network approach is intuitive, and can be implemented across imperfectly sampled or large-scale data sets efficiently, providing a framework for conservationists to analyze movement data. Functions created for the analyses are available within the R package moveNT. © 2018 by the Ecological Society of America.
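The pixels-as-nodes construction this abstract describes can be sketched in a few lines; the toy trajectory, the visit-count edge weights, and the variable names below are illustrative assumptions, not the moveNT implementation:

```python
from collections import defaultdict

# Hypothetical relocation data: a movement path discretized onto grid pixels.
trajectory = [(0, 0), (0, 1), (1, 1), (1, 2), (0, 1), (0, 0), (1, 1)]

# Nodes are occupied pixels; an edge links pixels visited consecutively,
# weighted by how often that transition was observed.
weights = defaultdict(int)
adjacency = defaultdict(set)
for a, b in zip(trajectory, trajectory[1:]):
    weights[frozenset((a, b))] += 1
    adjacency[a].add(b)
    adjacency[b].add(a)

# Node-level (local) metrics: degree counts distinct neighbouring pixels,
# while strength (weighted degree) reflects how heavily a pixel is used.
degree = {node: len(nbrs) for node, nbrs in adjacency.items()}
strength = {node: sum(w for edge, w in weights.items() if node in edge)
            for node in adjacency}
```

On real data, high-degree pixels would flag junctions connecting many inter-patch paths, while high-strength pixels would highlight heavily revisited core areas.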
Thurston, George D; Kipen, Howard; Annesi-Maesano, Isabella; Balmes, John; Brook, Robert D; Cromar, Kevin; De Matteis, Sara; Forastiere, Francesco; Forsberg, Bertil; Frampton, Mark W; Grigg, Jonathan; Heederik, Dick; Kelly, Frank J; Kuenzli, Nino; Laumbach, Robert; Peters, Annette; Rajagopalan, Sanjay T; Rich, David; Ritz, Beate; Samet, Jonathan M; Sandstrom, Thomas; Sigsgaard, Torben; Sunyer, Jordi; Brunekreef, Bert
2017-01-01
The American Thoracic Society has previously published statements on what constitutes an adverse effect on health of air pollution in 1985 and 2000. We set out to update and broaden these past statements that focused primarily on effects on the respiratory system. Since then, many studies have documented effects of air pollution on other organ systems, such as on the cardiovascular and central nervous systems. In addition, many new biomarkers of effects have been developed and applied in air pollution studies. This current report seeks to integrate the latest science into a general framework for interpreting the adversity of the human health effects of air pollution. Rather than trying to provide a catalogue of what is and what is not an adverse effect of air pollution, we propose a set of considerations that can be applied in forming judgments of the adversity of not only currently documented, but also emerging and future effects of air pollution on human health. These considerations are illustrated by the inclusion of examples for different types of health effects of air pollution. Copyright ©ERS 2017.
Systems resilience : a new analytical framework for nuclear nonproliferation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pregenzer, Arian Leigh
2011-12-01
This paper introduces the concept of systems resilience as a new framework for thinking about the future of nonproliferation. Resilience refers to the ability of a system to maintain its vital functions in the face of continuous and unpredictable change. The nonproliferation regime can be viewed as a complex system, and key themes from the literature on systems resilience can be applied to the nonproliferation system. Most existing nonproliferation strategies are aimed at stability rather than resilience, and the current nonproliferation system may be over-constrained by the cumulative evolution of strategies, increasing its vulnerability to collapse. The resilience of the nonproliferation system can be enhanced by diversifying nonproliferation strategies to include general international capabilities to respond to proliferation and focusing more attention on reducing the motivation to acquire nuclear weapons in the first place. Ideas for future research include understanding unintended consequences and feedbacks among nonproliferation strategies, developing methodologies for measuring the resilience of the nonproliferation system, and accounting for interactions of the nonproliferation system with other systems on larger and smaller scales.
Panoptes: web-based exploration of large scale genome variation data.
Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic
2017-10-15
The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
A New Time-varying Concept of Risk in a Changing Climate.
Sarhadi, Ali; Ausín, María Concepción; Wiper, Michael P
2016-10-20
In a changing climate arising from anthropogenic global warming, the nature of extreme climatic events is changing over time. Existing analytical stationary-based risk methods, however, assume multi-dimensional extreme climate phenomena will not significantly vary over time. To strengthen the reliability of infrastructure designs and the management of water systems in the changing environment, multidimensional stationary risk studies should be replaced with a new adaptive perspective. The results of a comparison indicate that current multi-dimensional stationary risk frameworks are no longer applicable to projecting the changing behaviour of multi-dimensional extreme climate processes. Using static stationary-based multivariate risk methods may lead to undesirable consequences in designing water system infrastructures. The static stationary concept should be replaced with a flexible multi-dimensional time-varying risk framework. The present study introduces a new multi-dimensional time-varying risk concept to be incorporated in updating infrastructure design strategies under changing environments arising from human-induced climate change. The proposed generalized time-varying risk concept can be applied for all stochastic multi-dimensional systems that are under the influence of changing environments.
Coupled Thermo-Hydro-Mechanical Numerical Framework for Simulating Unconventional Formations
NASA Astrophysics Data System (ADS)
Garipov, T. T.; White, J. A.; Lapene, A.; Tchelepi, H.
2016-12-01
Unconventional deposits are found in all world oil provinces. Modeling these systems is challenging, however, due to complex thermo-hydro-mechanical processes that govern their behavior. As a motivating example, we consider in situ thermal processing of oil shale deposits. When oil shale is heated to sufficient temperatures, kerogen can be converted to oil and gas products over a relatively short timespan. This phase change dramatically impacts both the mechanical and hydrologic properties of the rock, leading to strongly coupled THMC interactions. Here, we present a numerical framework for simulating tightly-coupled chemistry, geomechanics, and multiphase flow within a reservoir simulator (the AD-GPRS General Purpose Research Simulator). We model changes in constitutive behavior of the rock using a thermoplasticity model that accounts for microstructural evolution. The multi-component, multiphase flow and transport processes of both mass and heat are modeled at the macroscopic (e.g., Darcy) scale. The phase compositions and properties are described by a cubic equation of state; Arrhenius-type chemical reactions are used to represent kerogen conversion. The system of partial differential equations is discretized using a combination of finite-volumes and finite-elements, respectively, for the flow and mechanics problems. Fully implicit and sequentially implicit methods are used to solve the resulting nonlinear problem. The proposed framework is verified against available analytical and numerical benchmark cases. We demonstrate the efficiency, performance, and capabilities of the proposed simulation framework by analyzing near well deformation in an oil shale formation.
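The Arrhenius-type kerogen conversion mentioned in the abstract can be illustrated with a minimal sketch. The pre-exponential factor and activation energy below are placeholder values, not calibrated oil-shale kinetics, and the first-order decay is an assumed simplification of the simulator's chemistry:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_rate(T, A=1.0e13, Ea=2.2e5):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T)).

    A (1/s) and Ea (J/mol) are illustrative placeholders."""
    return A * math.exp(-Ea / (R * T))

def kerogen_fraction(T, t, **kw):
    """Remaining kerogen fraction after time t (s) at constant temperature
    T (K), assuming first-order decay dX/dt = -k*X, i.e. X = exp(-k*t)."""
    return math.exp(-arrhenius_rate(T, **kw) * t)
```

The exponential temperature dependence is the point: raising the temperature by 100 K shifts the rate constant by orders of magnitude, which is why conversion completes over a relatively short timespan once sufficient temperatures are reached.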
Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David
2015-01-01
Introduction Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology. Conclusion When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes, test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases, (HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
Trachtenberg, Shlomo; Schuck, Peter; Phillips, Terry M.; Andrews, S. Brian; Leapman, Richard D.
2014-01-01
Spiroplasma melliferum is a wall-less bacterium with dynamic helical geometry. This organism is geometrically well defined and internally well ordered, and has an exceedingly small genome. Individual cells are chemotactic, polar, and swim actively. Their dynamic helicity can be traced at the molecular level to a highly ordered linear motor (composed essentially of the proteins fib and MreB) that is positioned on a defined helical line along the internal face of the cell’s membrane. Using an array of complementary, informationally overlapping approaches, we have taken advantage of this uniquely simple, near-minimal life-form and its helical geometry to analyze the copy numbers of Spiroplasma’s essential parts, as well as to elucidate how these components are spatially organized to subserve the whole living cell. Scanning transmission electron microscopy (STEM) was used to measure the mass-per-length and mass-per-area of whole cells, membrane fractions, intact cytoskeletons and cytoskeletal components. These local data were fit into whole-cell geometric parameters determined by a variety of light microscopy modalities. Hydrodynamic data obtained by analytical ultracentrifugation allowed computation of the hydration state of whole living cells, for which the relative amounts of protein, lipid, carbohydrate, DNA, and RNA were also estimated analytically. Finally, ribosome and RNA content, genome size and gene expression were also estimated (using stereology, spectroscopy and 2D-gel analysis, respectively). Taken together, the results provide a general framework for a minimal inventory and arrangement of the major cellular components needed to support life. PMID:24586297
Ensuring Food Integrity by Metrology and FAIR Data Principles
Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F. X.; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C.; Presser, Karl; Zoani, Claudia
2018-01-01
Food integrity is a general term for sound, nutritive, healthy, tasty, safe, authentic, traceable, as well as ethically, safely, environment-friendly, and sustainably produced foods. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization and a harmonized system for their application in analytical laboratories are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. Therefore, the pan-European project METROFOOD-RI within the framework of the European Strategy Forum on Research Infrastructures (ESFRI) was developed to establish a strategy to allow reliable and comparable analytical measurements in foods along the whole process line starting from primary producers until consumers and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European Countries and concluded its “Early Phase” as research infrastructure by organizing its future structure and presenting a proof of concept by preparing, distributing and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour, and oyster tissue) and establishing a system how to compile, process, and store the generated data and how to exchange, compare them and make them accessible in data bases. PMID:29872651
Papadopoulos, A; Sioen, I; Cubadda, F; Ozer, H; Basegmez, H I Oktay; Turrini, A; Lopez Esteban, M T; Fernandez San Juan, P M; Sokolić-Mihalak, D; Jurkovic, M; De Henauw, S; Aureli, F; Vin, K; Sirot, V
2015-02-01
The objective of this article is to develop a general method based on the analytic hierarchy process (AHP) methodology to rank the substances to be studied in a Total Diet Study (TDS). This method was tested for different substances and groups of substances (N = 113), for which the TDS approach has been considered relevant. This work was performed by a group of 7 experts from different European countries representing their institutes, which are involved in the TDS EXPOSURE project. The AHP methodology is based on a score system taking into account experts' judgments, quantified by assigning comparative scores to the different identified issues. Hence, the 10 substances of highest interest in the framework of a TDS are trace elements (methylmercury, cadmium, inorganic arsenic, lead, aluminum, inorganic mercury), dioxins, furans and polychlorinated biphenyls (PCBs), and some additives (sulfites and nitrites). The priority list depends on both the national situation (geographical variations, consumer concern, etc.) and the availability of data. Thus, the list depends on the objectives of the TDS and on reachable analytical performances. Moreover, such a list is highly variable with time and new data (e.g. social context, vulnerable population groups, emerging substances, new toxicological data or health-based guidance values). Copyright © 2014 Elsevier Ltd. All rights reserved.
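The AHP score system described above rests on a reciprocal pairwise-comparison matrix whose principal eigenvector gives the priority weights. A minimal sketch follows, with invented judgments for three of the listed substances; the matrix entries are not the expert panel's actual scores:

```python
# Hypothetical pairwise judgments on the usual 1-9 AHP scale: entry M[i][j]
# says how much more important substance i is than substance j, with the
# reciprocal on the other side of the diagonal.
substances = ["cadmium", "lead", "sulfites"]
M = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_priorities(M, iters=100):
    """Priority weights via power iteration: the principal eigenvector of a
    positive reciprocal matrix, normalized to sum to 1."""
    n = len(M)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [x / s for x in w]
    return v

priorities = ahp_priorities(M)
ranking = sorted(zip(substances, priorities), key=lambda p: -p[1])
```

In the full methodology, the judgments come from the expert group, a consistency ratio of the matrix is checked, and the hierarchy typically has several criteria levels rather than a single matrix.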
Micromechanics Analysis Code (MAC) User Guide: Version 1.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1994-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code, MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells, GMC, micromechanics model is described. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed; (2) different integration algorithms may be selected; (3) a variety of constituent constitutive models may be utilized and/or implemented; and (4) a variety of fiber architectures may be easily accessed through their corresponding representative volume elements.
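To give a flavor of what "generating the average (macro) response from constituent properties" means, here is a deliberately crude sketch using the elementary Voigt and Reuss averages; GMC itself is far more sophisticated than these bounds, and the material properties below are hypothetical:

```python
# Effective-stiffness bounds for a two-phase composite from constituent
# moduli and volume fractions. Values are made up for illustration.
E_fiber, E_matrix = 400.0, 70.0   # GPa: hypothetical stiff fiber, Al-like matrix
vf = 0.35                         # fiber volume fraction

# Voigt (iso-strain, upper bound) and Reuss (iso-stress, lower bound)
# estimates of the effective Young's modulus.
E_voigt = vf * E_fiber + (1 - vf) * E_matrix
E_reuss = 1.0 / (vf / E_fiber + (1 - vf) / E_matrix)
```

Any physically sensible micromechanics prediction for this composite, including a method-of-cells result, should fall between these two bounds; the value added by models like GMC is resolving the local fields and inelastic constituent behavior, not just the elastic average.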
NASA Astrophysics Data System (ADS)
Menzel, Andreas M.
2015-11-01
Diffusion of colloidal particles in a complex environment such as polymer networks or biological cells is a topic of high complexity with significant biological and medical relevance. In such situations, the interaction between the surroundings and the particle motion has to be taken into account. We analyze a simplified diffusion model that includes some aspects of a complex environment in the framework of a nonlinear friction process: at low particle speeds, friction grows linearly with the particle velocity as for regular viscous friction; it grows more than linearly at higher particle speeds; finally, at a maximum of the possible particle speed, the friction diverges. In addition to bare diffusion, we study the influence of a constant drift force acting on the diffusing particle. While the corresponding stationary velocity distributions can be derived analytically, the displacement statistics generally must be determined numerically. However, as a benefit of our model, analytical progress can be made in one case of a special maximum particle speed. The effect of a drift force in this case is analytically determined by perturbation theory. It will be interesting in the future to compare our results to real experimental systems. One realization could be magnetic colloidal particles diffusing through a shear-thickening environment such as starch suspensions, possibly exposed to an external magnetic field gradient.
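The nonlinear friction process described above can be sketched as a Langevin simulation. The specific friction law gamma(v) = gamma0 / (1 - (v/v_max)^2), which is viscous at low speed and diverges at a maximum speed, is an illustrative choice consistent with the abstract's description, not the paper's exact model, and all parameter values are placeholders:

```python
import math
import random

def simulate(steps=20000, dt=1e-3, gamma0=1.0, v_max=1.0, D=0.1, F=0.0, seed=1):
    """Euler-Maruyama integration of dv = (-gamma(v)*v + F) dt + sqrt(2D) dW,
    with friction diverging as |v| approaches v_max. Returns (x, v)."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    for _ in range(steps):
        friction = -gamma0 * v / (1.0 - (v / v_max) ** 2)
        noise = math.sqrt(2.0 * D / dt) * rng.gauss(0.0, 1.0)
        v += (friction + F + noise) * dt
        # Keep v strictly inside the allowed band so the friction stays finite.
        v = max(min(v, 0.999 * v_max), -0.999 * v_max)
        x += v * dt
    return x, v
```

Running the sketch with and without a constant drift force F shows the behavior the abstract discusses: the drift biases the displacement, while the diverging friction caps the attainable particle speed below v_max.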
Micromechanics Analysis Code (MAC). User Guide: Version 2.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1996-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model, is described. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, (2) different integration algorithms may be selected, (3) a variety of constituent constitutive models may be utilized and/or implemented, and (4) a variety of fiber and laminate architectures may be easily accessed through their corresponding representative volume elements.
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.
2011-01-01
The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.
Moat, K A; Abelson, J
2011-12-01
During the 2001 election campaign, President Yoweri Museveni announced he was abolishing user fees for health services in Uganda. No analysis has been carried out to explain how he was able to initiate such an important policy decision without encountering any immediate barriers. The objective was to explain this outcome through in-depth policy analysis driven by the application of key analytical frameworks. An explanatory case study informed by analytical frameworks from the institutionalism literature was undertaken. Multiple data sources were used, including academic literature, key government documents, grey literature, and a variety of print media. According to the analytical frameworks employed, several formal institutional constraints existed that would have reduced the prospects for the abolition of user fees. However, prevalent informal institutions such as "Big Man" presidentialism and clientelism that were both 'competing' and 'complementary' can be used to explain the policy outcome. The analysis suggests that these factors trumped the impact of more formal institutional structures in the Ugandan context. Consideration should be given to the interactions between formal and informal institutions in the analysis of health policy processes in Uganda, as they provide a more nuanced understanding of how each set of factors influences policy outcomes.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-01-01
Background The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Results Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740
High-Performance Data Analytics Beyond the Relational and Graph Data Models with GEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Minutoli, Marco; Bhatt, Shreyansh
Graphs represent an increasingly popular data model for data analytics, since they can naturally represent relationships and interactions between entities. Relational databases and their pure table-based data model are not well suited to store and process sparse data. Consequently, graph databases have gained interest in the last few years, and the Resource Description Framework (RDF) became the standard data model for graph data. Nevertheless, while RDF is well suited to analyzing the relationships between entities, it is not efficient in representing their attributes and properties. In this work we propose the adoption of a new hybrid data model, based on attributed graphs, that aims at overcoming the limitations of the pure relational and graph data models. We present how we have re-designed the GEMS data-analytics framework to fully take advantage of the proposed hybrid data model. To improve analysts' productivity, in addition to a C++ API for application development, we adopt GraQL as the input query language. We validate our approach by implementing a set of queries on net-flow data, and we compare our framework's performance against Neo4j. Experimental results show significant performance improvement over Neo4j, up to several orders of magnitude when increasing the size of the input data.
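The contrast with pure-RDF storage can be illustrated with a minimal attributed-graph container, where nodes and edges carry arbitrary key/value properties instead of being flattened into extra triples. The class and all identifiers below are hypothetical sketches, not the GEMS or GraQL API:

```python
# Minimal sketch of an attributed-graph data model (hypothetical, not GEMS):
# entity attributes live directly on nodes and edges as property dicts.

class AttributedGraph:
    def __init__(self):
        self.nodes = {}   # node_id -> property dict
        self.edges = {}   # (src, label, dst) -> property dict

    def add_node(self, node_id, **props):
        self.nodes[node_id] = props

    def add_edge(self, src, label, dst, **props):
        self.edges[(src, label, dst)] = props

    def neighbors(self, src, label=None):
        """Destinations of edges leaving src, optionally filtered by label."""
        return [d for (s, l, d) in self.edges
                if s == src and (label is None or l == label)]

# Toy net-flow-style data (made-up values):
g = AttributedGraph()
g.add_node("h1", kind="host", ip="10.0.0.1")
g.add_node("h2", kind="host", ip="10.0.0.2")
g.add_edge("h1", "flow", "h2", bytes=1432, proto="tcp")
```

In an RDF encoding the same flow record would expand into several triples (one per attribute); keeping properties attached to the edge is what makes the hybrid model compact for attribute-heavy data.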
Razavi, Sonia M; Gonzalez, Marcial; Cuitiño, Alberto M
2015-04-30
We propose a general framework for determining optimal relationships for tensile strength of doubly convex tablets under diametrical compression. This approach is based on the observation that tensile strength is directly proportional to the breaking force and inversely proportional to a non-linear function of geometric parameters and materials properties. This generalization reduces to the analytical expression commonly used for flat faced tablets, i.e., Hertz solution, and to the empirical relationship currently used in the pharmaceutical industry for convex-faced tablets, i.e., Pitt's equation. Under proper parametrization, optimal tensile strength relationship can be determined from experimental results by minimizing a figure of merit of choice. This optimization is performed under the first-order approximation that a flat faced tablet and a doubly curved tablet have the same tensile strength if they have the same relative density and are made of the same powder, under equivalent manufacturing conditions. Furthermore, we provide a set of recommendations and best practices for assessing the performance of optimal tensile strength relationships in general. Based on these guidelines, we identify two new models, namely the general and mechanistic models, which are effective and predictive alternatives to the tensile strength relationship currently used in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan
2016-04-01
Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and the movement of output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, where the CPUs would otherwise be idle during portions of the runtime.
Our results demonstrate that it is more efficient to use the HFP framework to offload these tasks to GPUs than to perform them in the main application. We observe increased resource utilization and overall productivity by using the HFP framework for the end-to-end workflow.
Exploring the Moral Complexity of School Choice: Philosophical Frameworks and Contributions
ERIC Educational Resources Information Center
Wilson, Terri S.
2015-01-01
In this essay, I describe some of the methodological dimensions of my ongoing research into how parents choose schools. I particularly focus on how philosophical frameworks and analytical strategies have shaped the empirical portion of my research. My goal, in this essay, is to trace and explore the ways in which philosophy of education--as a…
ERIC Educational Resources Information Center
Edwards, D. Brent, Jr.
2013-01-01
This article uses multiple perspectives to frame international processes of education policy formation and then applies the framework to El Salvador's Plan 2021 between 2003 and 2005. These perspectives are policy attraction, policy negotiation, policy imposition, and policy hybridization. Research reveals that the formation of Plan 2021 was the…
Behavioral assessment of personality disorders.
Nelson-Gray, R O; Farmer, R F
1999-04-01
This article examines the definition of personality disorders (PDs) from a functional analytical framework and discusses the potential utility of such a framework to account for behavioral tendencies associated with PD pathology. Also reviewed are specific behavioral assessment methods that can be employed in the assessment of PDs, and how information derived from these assessments may be linked to specific intervention strategies.
An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities
ERIC Educational Resources Information Center
Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin
2013-01-01
The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…
ERIC Educational Resources Information Center
Clarke, Lane Whitney; Bartholomew, Audrey
2014-01-01
The purpose of this study was to investigate instructor participation in asynchronous discussions through an in-depth content analysis of instructors' postings and comments through the Community of Inquiry (COI) framework (Garrison et. al, 2001). We developed an analytical tool based on this framework in order to better understand what instructors…
ERIC Educational Resources Information Center
Hser, Yih-Ing; Longshore, Douglas; Anglin, M. Douglas
2007-01-01
This article discusses the life course perspective on drug use, including conceptual and analytic issues involved in developing the life course framework to explain how drug use trajectories develop during an individual's lifetime and how this knowledge can guide new research and approaches to management of drug dependence. Central concepts…
ERIC Educational Resources Information Center
McKinley, Jim
2015-01-01
This article makes the argument that we need to situate student's academic writing as socially constructed pieces of writing that embody a writer's cultural identity and critical argument. In support, I present and describe a comprehensive model of an original English as a Foreign Language (EFL) writing analytical framework. This article explains…
ERIC Educational Resources Information Center
Jaipal, Kamini
2010-01-01
The teaching of science is a complex process, involving the use of multiple modalities. This paper illustrates the potential of a multimodal semiotics discourse analysis framework to illuminate meaning-making possibilities during the teaching of a science concept. A multimodal semiotics analytical framework is developed and used to (1) analyze the…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis
ERIC Educational Resources Information Center
Schiazza, Daniela Marie
2013-01-01
The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…
ERIC Educational Resources Information Center
Blackman, Stacey
2007-01-01
The cognitions of Caribbean students with dyslexia are explored as part of an embedded multiple case study approach to teaching and learning at two secondary schools on the island of Barbados. This exploration employed "low tech" approaches to analyse what pupils had said in interviews using a Miles and Huberman (1994) framework.…
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
Bias Assessment of General Chemistry Analytes using Commutable Samples.
Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter
2014-11-01
Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples, to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.
Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat
2013-09-01
The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
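The AHP prioritization step described above can be illustrated numerically. The sketch below derives priority weights from a pairwise comparison matrix using the geometric-mean approximation of the principal eigenvector; the risk categories and Saaty-scale judgement values are invented for illustration, not taken from the paper:

```python
import math

# Hypothetical pairwise judgements (Saaty 1-9 scale) comparing three
# made-up construction safety-risk categories:
# falls vs struck-by vs electrocution.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

def ahp_weights(matrix):
    """Priority vector via the row geometric-mean approximation."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(A)   # weights sum to 1; larger weight = higher priority
```

A full AHP application would also check the consistency ratio of the judgement matrix before using the weights to allocate the safety budget.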
Electromagnetic gyrokinetic simulation in GTS
NASA Astrophysics Data System (ADS)
Ma, Chenhao; Wang, Weixing; Startsev, Edward; Lee, W. W.; Ethier, Stephane
2017-10-01
We report recent development of electromagnetic simulations for general toroidal geometry based on the particle-in-cell gyrokinetic code GTS. Because of the cancellation problem, EM gyrokinetic simulation has numerical difficulties in the MHD limit, where k⊥ρi → 0 and/or β > me/mi. Recently, several approaches have been developed to circumvent this problem: (1) a p∥ formulation with the analytical skin term iteratively approximated by simulation particles (Yang Chen); (2) a modified p∥ formulation with ∫ dt E∥ used in place of A∥ (Mishchenko); (3) a conservative scheme where the electron density perturbation for the Poisson equation is calculated from an electron continuity equation (Bao); (4) a double-split-weight scheme with two weights, one for the Poisson equation and one for the time derivative of Ampere's law, each with different splits designed to remove large terms from the Vlasov equation (Startsev). These algorithms are being implemented into the GTS framework for general toroidal geometry. The performance of these different algorithms will be compared for various EM modes.
Epidemic spreading on activity-driven networks with attractiveness.
Pozzana, Iacopo; Sun, Kaiyuan; Perra, Nicola
2017-10-01
We study SIS epidemic spreading processes unfolding on a recent generalization of the activity-driven modeling framework. In this model of time-varying networks, each node is described by two variables: activity and attractiveness. The first describes the propensity to form connections, while the second defines the propensity to attract them. We derive analytically the epidemic threshold considering the time scale driving the evolution of contacts and the contagion as comparable. The solutions are general and hold for any joint distribution of activity and attractiveness. The theoretical picture is confirmed via large-scale numerical simulations performed considering heterogeneous distributions and different correlations between the two variables. We find that heterogeneous distributions of attractiveness alter the contagion process. In particular, in the case of uncorrelated and positive correlations between the two variables, heterogeneous attractiveness facilitates the spreading. On the contrary, negative correlations between activity and attractiveness hamper the spreading. The results presented contribute to the understanding of the dynamical properties of time-varying networks and their effects on contagion phenomena unfolding on their fabric.
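The model described above lends itself to a compact Monte Carlo sketch: each node carries an activity (propensity to form links) and an attractiveness (propensity to receive them), active nodes fire a fixed number of links per step, and SIS infection and recovery run on the resulting instantaneous contacts. The parameter values, distributions, and simplified update rule below are assumptions for illustration, not the authors' code or their analytical threshold:

```python
import random

def simulate_sis(n=1000, m=3, beta=0.6, mu=0.1, steps=100, seed=1):
    """Toy SIS dynamics on an activity-driven network with attractiveness.

    Returns the fraction of infected nodes after `steps` time steps.
    """
    rng = random.Random(seed)
    activity = [rng.uniform(0.01, 0.2) for _ in range(n)]   # link-creation propensity
    attract = [rng.uniform(0.01, 0.2) for _ in range(n)]    # link-attraction propensity
    infected = set(rng.sample(range(n), n // 100))          # 1% initial seeds
    for _ in range(steps):
        new_inf, recov = set(), set()
        for i in range(n):
            if rng.random() < activity[i]:                  # node i activates
                # targets drawn with probability proportional to attractiveness
                for j in rng.choices(range(n), weights=attract, k=m):
                    if (i in infected) != (j in infected) and rng.random() < beta:
                        new_inf.add(j if i in infected else i)
        for i in infected:                                  # recoveries
            if rng.random() < mu:
                recov.add(i)
        infected = (infected | new_inf) - recov
    return len(infected) / n
```

Sweeping `beta/mu` while correlating (or anti-correlating) `activity` and `attract` is a quick way to reproduce qualitatively the paper's observation that positive correlations facilitate spreading and negative ones hamper it.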
Fundamental procedures of geographic information analysis
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1981-01-01
Analytical procedures common to most computer-oriented geographic information systems are composed of fundamental map processing operations. A conceptual framework for such procedures is developed and basic operations common to a broad range of applications are described. Among the major classes of primitive operations identified are those associated with: reclassifying map categories as a function of the initial classification, the shape, the position, or the size of the spatial configuration associated with each category; overlaying maps on a point-by-point, a category-wide, or a map-wide basis; measuring distance; establishing visual or optimal path connectivity; and characterizing cartographic neighborhoods based on the thematic or spatial attributes of the data values within each neighborhood. By organizing such operations in a coherent manner, the basis for a generalized cartographic modeling structure can be developed which accommodates a variety of needs in a common, flexible and intuitive manner. The use of each is limited only by the general thematic and spatial nature of the data to which it is applied.
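Two of the primitive operations named above, reclassifying map categories as a function of the initial classification and point-by-point overlay, can be sketched in a few lines. The grids, category codes, and function names are illustrative only, not part of the original framework:

```python
# Toy co-registered 3x3 raster layers (made-up values and codes):
elevation = [[120, 140, 160],
             [130, 155, 170],
             [125, 150, 180]]

landuse = [[1, 1, 2],
           [1, 2, 2],
           [3, 3, 2]]   # 1=forest, 2=rock, 3=water (hypothetical codes)

def reclassify(grid, rule):
    """Reclassify each cell as a function of its initial value."""
    return [[rule(v) for v in row] for row in grid]

def overlay(a, b, combine):
    """Point-by-point overlay of two co-registered layers."""
    return [[combine(x, y) for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

highland = reclassify(elevation, lambda v: 1 if v >= 150 else 0)
rocky_highland = overlay(highland, landuse,
                         lambda h, u: 1 if h and u == 2 else 0)
```

Chaining such primitives, each producing a new map that feeds the next operation, is the essence of the generalized cartographic modeling structure the abstract describes.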
ERIC Educational Resources Information Center
Villavicencio, Adriana; Klevan, Sarah; Guidry, Brandon; Wulach, Suzanne
2014-01-01
This appendix describes the data collection and analytic processes used to develop the findings in the report "Promising Opportunities for Black and Latino Young Men." A central challenge was creating an analytic framework that could be uniformly applied to all schools, despite the individualized nature of their Expanded Success…
ERIC Educational Resources Information Center
Yogev, Sara; Brett, Jeanne
This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…
Sensemaking during the Use of Learning Analytics in the Context of a Large College System
ERIC Educational Resources Information Center
Morse, Robert Kenneth
2017-01-01
This research took place as a cognitive exploration of sensemaking of learning analytics at Ivy Tech Community College of Indiana. For the courses with the largest online enrollment, quality standards in the course design are maintained by creating sections from a course design framework. This means all sections have the same starting content and…
ERIC Educational Resources Information Center
Fisher, James E.; Sealey, Ronald W.
The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…
ERIC Educational Resources Information Center
Ingrey, Jennifer C.
2012-01-01
This paper derives from a larger study, looking at how students in one secondary school in Ontario problematised and understood gender expression. This study applies a Foucaultian analytic framework of disciplinary space to the problem of the bathroom in public schools. It focuses specifically on the surveillance and regulation of gendered bodies…
Learning Analytics in Small-Scale Teacher-Led Innovations: Ethical and Data Privacy Issues
ERIC Educational Resources Information Center
Rodríguez-Triana, María Jesús; Martínez-Monés, Alejandra; Villagrá-Sobrino, Sara
2016-01-01
As a further step towards maturity, the field of learning analytics (LA) is working on the definition of frameworks that structure the legal and ethical issues that scholars and practitioners must take into account when planning and applying LA solutions to their learning contexts. However, current efforts in this direction tend to be focused on…
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
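As a hedged illustration of the scheduling idea (not the paper's actual model or its Hadoop integration), the sketch below evolves job orderings with a toy genetic algorithm: chromosomes are permutations of jobs, and fitness is a simple total-weighted-completion-time estimate standing in for the estimation module. The job list and GA parameters are made up:

```python
import random

# Hypothetical analytics jobs as (runtime, weight) pairs:
jobs = [(5, 1), (2, 4), (8, 2), (3, 3), (6, 5), (1, 2)]

def cost(order):
    """Total weighted completion time of a job sequence (lower is better)."""
    t, total = 0, 0
    for j in order:
        t += jobs[j][0]
        total += jobs[j][1] * t
    return total

def evolve(pop_size=30, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(jobs)
    # Seed the population with the submission order plus random permutations.
    pop = [list(range(n))] + [rng.sample(range(n), n) for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=cost)
        pop = pop[:pop_size // 2]                  # elitist truncation selection
        while len(pop) < pop_size:
            a, b = rng.sample(pop[:5], 2)
            cut = rng.randrange(1, n)              # order crossover
            child = a[:cut] + [j for j in b if j not in a[:cut]]
            if rng.random() < 0.2:                 # swap mutation
                i, k = rng.sample(range(n), 2)
                child[i], child[k] = child[k], child[i]
            pop.append(child)
    return min(pop, key=cost)

best = evolve()   # a job order no worse than naive in-sequence execution
```

In a real deployment the `cost` function would be replaced by cluster performance predictions, and the chromosome could also encode resource assignments rather than order alone.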
2014-12-01
An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher's or Analyst's Perspective, by Robert A. Sottilare and Anne M. Sinatra.
Evaluation Framework for NASA's Educational Outreach Programs
NASA Technical Reports Server (NTRS)
Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie
1999-01-01
The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to perform a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.
Eghdam, Aboozar; Scholl, Jeremiah; Bartfai, Aniko; Koch, Sabine
2012-11-19
Mild acquired cognitive impairment (MACI) is a new term used to describe a subgroup of patients with mild cognitive impairment (MCI) who are expected to reach a stable cognitive level over time. This patient group is generally young and has acquired MCI from a head injury or mild stroke. Although the past decade has seen a large amount of research on how to use information and communication technology (ICT) to support self-management of patients with chronic diseases, MACI has not received much attention. Therefore, there is a lack of information about what tools have been created and evaluated that are suitable for self-management by MACI patients, and a lack of clear direction on how best to proceed with ICT tools to support self-management of MACI patients. This paper aims to provide direction for further research and development of tools that can support health care professionals in assisting MACI patients with self-management. An overview of studies reporting on the design and/or evaluation of ICT tools for assisting MACI patients in self-management is presented. We also analyze the evidence of benefit provided by these tools, and how their functionality matches MACI patients' needs, to determine areas of interest for further research and development. A review of the existing literature about available assistive ICT tools for MACI patients was conducted using 8 different medical, scientific, engineering, and physiotherapy library databases. The functionality of the tools was analyzed using an analytical framework based on the International Classification of Functioning, Disability and Health (ICF) and a subset of common and important problems for patients with MACI created by MACI experts in Sweden. A total of 55 search phrases applied in the 8 databases returned 5969 articles. After review, 7 articles met the inclusion criteria. Most articles reported case reports and exploratory research.
Of the 7 articles, 4 (57%) studies had fewer than 10 participants, 5 (71%) of the technologies were memory aids, and 6 were mobile technologies. All 7 studies fit the profile for patients with MACI as described by our analytical framework. However, several areas in the framework important for meeting patient needs were not covered by the functionality of any of the ICT tools. This study shows a lack of ICT tools developed and evaluated for supporting self-management of MACI patients. Our analytical framework was a valuable tool for providing an overview of how the functionality of these tools matched patient needs. There are a number of important areas for MACI patients that are not covered by the functionality of existing tools, such as support for interpersonal interactions and relationships. Further research on ICT tools to support self-management for patients with MACI is needed.
NASA Astrophysics Data System (ADS)
Jawitz, J. W.
2011-12-01
What are the relative contributions of climatic variability, land management, and local geomorphology in determining the temporal dynamics of streamflow and the export of solutes from watersheds to receiving water bodies? A simple analytical framework is introduced for characterizing the temporal inequality of stream discharge and solute export from catchments using Lorenz diagrams and the associated Gini coefficient. These descriptors are used to illustrate a broad range of observed flow variability with a synthesis of multi-decadal flow data from 22 rivers in Florida. The analytical framework is extended to comprehensively link variability in flows and loads to climatically-driven inputs in terms of these inequality-based metrics. Further, based on a synthesis of data from the basins of the Baltic Sea, the Mississippi River, the Kissimmee River and other tributaries to Lake Okeechobee, FL, it is shown that inter-annual variations in exported loads for geogenic constituents, and for total N and total P, are dominantly controlled by discharge. Emergence of this consistent pattern across diverse managed catchments is attributed to the anthropogenic legacy of accumulated nutrient sources generating memory, similar to ubiquitously present sources for geogenic constituents. Multi-decadal phosphorus load data from 4 of the primary tributaries to Lake Okeechobee and sodium and nitrate load data from 9 of the Hubbard Brook, NH long-term study site catchments are used to examine the relation between inequality of climatic inputs, river flows and catchment loads. The intra-annual loads to Lake Okeechobee are shown to be highly unequal, such that 90% of annual load is delivered in as little as 15% of the time. Analytic expressions are developed for measures of inequality in terms of parameters of the lognormal distribution under general conditions that include intermittency. 
In cases where climatic variability is high compared to that of concentrations (chemostatic conditions), such as for P in the Lake Okeechobee basin and Na in Hubbard Brook, the temporal inequalities of rainfall and flow are strong surrogates for load inequality. However, in cases where variability of concentrations is high compared to that of flows (chemodynamic conditions), such as for nitrate in the Hubbard Brook catchments, load inequality is greater than rainfall or flow inequality. The measured degree of correspondence between climatic, flow, and load inequality for these data sets is shown to be well described using the general inequality framework introduced here. Important implications are that (1) variations in hydro-climatic or anthropogenic forcing can be used to robustly predict inter-annual variations in flows and loads, (2) water quality problems in receiving inland and coastal waters may persist until the accumulated storages of nutrients have been substantially depleted, and (3) remedial measures designed to intercept or capture exported flows and loads must be designed with consideration of the intra-annual inequality.
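The Lorenz-curve and Gini-coefficient machinery underlying this framework is easy to sketch numerically. The flow series below are synthetic, and the implementation is a minimal trapezoid-rule approximation of the area under the Lorenz curve, not the authors' code:

```python
def gini(values):
    """Gini coefficient of non-negative values (0 = perfectly uniform)."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    cum, lorenz_area = 0.0, 0.0
    for x in xs:
        prev = cum / total
        cum += x
        # trapezoid slice under the Lorenz curve for this rank
        lorenz_area += (prev + cum / total) / 2 / n
    return 1 - 2 * lorenz_area

# Synthetic daily discharge series (arbitrary units):
steady = [10.0] * 365                   # constant baseflow all year
flashy = [0.0] * 330 + [100.0] * 35     # all discharge in ~10% of days

g1, g2 = gini(steady), gini(flashy)     # g1 ~ 0, g2 close to 1
```

A flashy series with a Gini near 0.9 corresponds to the kind of temporal inequality reported for the Lake Okeechobee tributaries, where 90% of the annual load can be delivered in as little as 15% of the time.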
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED offers a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is thereby demonstrated, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.
A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.
Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie
2017-11-01
The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.
Glenn, Catherine R; Kleiman, Evan M; Cha, Christine B; Deming, Charlene A; Franklin, Joseph C; Nock, Matthew K
2018-01-01
The field is in need of novel and transdiagnostic risk factors for suicide. The National Institute of Mental Health's Research Domain Criteria (RDoC) provides a framework that may help advance research on suicidal behavior. We conducted a meta-analytic review of existing prospective risk and protective factors for suicidal thoughts and behaviors (ideation, attempts, and deaths) that fall within one of the five RDoC domains or relate to a prominent suicide theory. Predictors were selected from a database of 4,082 prospective risk and protective factors for suicide outcomes. A total of 460 predictors met inclusion criteria for this meta-analytic review and most examined risk (vs. protective) factors for suicidal thoughts and behaviors. The overall effect of risk factors was statistically significant, but relatively small, in predicting suicide ideation (weighted mean odds ratio: wOR = 1.72; 95% CI: 1.59-1.87), suicide attempt (wOR = 1.66 [1.57-1.76]), and suicide death (wOR = 1.41 [1.24-1.60]). Across all suicide outcomes, most risk factors related to the Negative Valence Systems domain, although effect sizes were of similar magnitude across RDoC domains. This study demonstrated that the RDoC framework provides a novel and promising approach to suicide research; however, relatively few studies of suicidal behavior fit within this framework. Future studies must go beyond the "usual suspects" of suicide risk factors (e.g., mental disorders, sociodemographics) to understand the processes that combine to lead to this deadly outcome. © 2017 Wiley Periodicals, Inc.
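The abstract reports weighted mean odds ratios (wOR) with 95% confidence intervals but does not specify the pooling method here. The following Python sketch shows the common fixed-effect inverse-variance approach to pooling odds ratios on the log scale, with entirely made-up inputs; it illustrates the general technique, not the review's actual computation.

```python
import math

def pooled_odds_ratio(odds_ratios, variances):
    """Fixed-effect inverse-variance pooling of odds ratios on the log scale.
    `variances` are variances of the log odds ratios; returns (wOR, 95% CI)."""
    weights = [1.0 / v for v in variances]
    log_pool = sum(w * math.log(r) for w, r in zip(weights, odds_ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled log-OR
    ci = (math.exp(log_pool - 1.96 * se), math.exp(log_pool + 1.96 * se))
    return math.exp(log_pool), ci
```

Note that pooling on the log scale keeps the estimate symmetric in risk and protective factors (an OR of 2 and an OR of 0.5 carry equal weight in opposite directions).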
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
NASA Astrophysics Data System (ADS)
Bhatnagar, Shashank; Alemu, Lmenew
2018-02-01
In this work we calculate the mass spectra of charmonium for the 1P,…,4P states of 0++ and 1++, for the 1S,…,5S states of 0-+, and for the 1S,…,4D states of 1--, along with the two-photon decay widths of the ground and first excited states of 0++ quarkonia for the process 0++ → γγ, in the framework of a QCD-motivated Bethe-Salpeter equation (BSE). In this 4×4 BSE framework, the coupled Salpeter equations are first shown to decouple for the confining part of the interaction (under the heavy-quark approximation) and are analytically solved; the one-gluon-exchange interaction is then perturbatively incorporated, leading to mass spectral equations for the various quarkonia. The analytic forms of the wave functions obtained are used to calculate the two-photon decay widths of χc0. Our results are in reasonable agreement with data (where available) and with other models.
Co-governing decentralised water systems: an analytical framework.
Yu, C; Brown, R; Morison, P
2012-01-01
Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last two decades, in particular, various small-scale systems have emerged and developed, so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption in which small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies at the street and allotment/house scales) do not; the viability of adopting and/or continuing to use decentralised systems is therefore challenged. This paper brings together insights from the literature on public-sector governance, co-production, and social practice models to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems, so that these systems continue to exist, and become widely adopted, within the established urban water regime.
Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.
El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher
2018-01-01
Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
NASA Astrophysics Data System (ADS)
Yang, Jianwen
2012-04-01
A general analytical solution is derived by using the Laplace transformation to describe transient reactive silica transport in a conceptualized 2-D system involving a set of parallel fractures embedded in an impermeable host rock matrix, taking into account hydrodynamic dispersion and advection of silica transport along the fractures, molecular diffusion from each fracture into the intervening rock matrix, and the dissolution of quartz. A special analytical solution is also developed by ignoring the longitudinal hydrodynamic dispersion term while keeping the other conditions the same. The general and special solutions are in the form of a double infinite integral and a single infinite integral, respectively, and can be evaluated using the Gauss-Legendre quadrature technique. A simple criterion is developed to determine under what conditions the general analytical solution can be approximated by the special analytical solution. It is proved analytically that the general solution always lags behind the special solution, unless a dimensionless parameter is less than a critical value. Several illustrative calculations are undertaken to demonstrate the effects of fracture spacing, fracture aperture, and fluid flow rate on silica transport. The analytical solutions developed here can serve as a benchmark to validate numerical models that simulate reactive mass transport in fractured porous media.
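The infinite-integral solutions above are said to be evaluable by Gauss-Legendre quadrature. As a minimal illustration of that technique (with a generic integrand, not the paper's silica-transport kernel), the sketch below applies an n-point Gauss-Legendre rule on a finite interval and to a truncated semi-infinite integral:

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=32):
    """Approximate the integral of f over [a, b] with an n-point
    Gauss-Legendre rule, mapping the standard nodes from [-1, 1] to [a, b]."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)   # affine map of nodes
    return 0.5 * (b - a) * np.sum(weights * f(x))

# A semi-infinite integral of a decaying integrand can be handled by
# truncating at a point where the tail is negligible:
semi_infinite = gauss_legendre_integral(lambda x: np.exp(-x), 0.0, 40.0, n=64)
```

For smooth integrands the rule converges extremely fast; the truncation point and node count would need to be chosen against the decay rate of the actual kernel.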
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.; Yurkin, Maxim A.; Bi, Lei; Cairns, Brian; Liu, Li; Panetta, R. Lee; Travis, Larry D.; Yang, Ping; Zakharova, Nadezhda T.
2016-01-01
A discrete random medium is an object in the form of a finite volume of a vacuum or a homogeneous material medium filled with quasi-randomly and quasi-uniformly distributed discrete macroscopic impurities called small particles. Such objects are ubiquitous in natural and artificial environments. They are often characterized by analyzing theoretically the results of laboratory, in situ, or remote-sensing measurements of the scattering of light and other electromagnetic radiation. Electromagnetic scattering and absorption by particles can also affect the energy budget of a discrete random medium and hence various ambient physical and chemical processes. In either case electromagnetic scattering must be modeled in terms of appropriate optical observables, i.e., quadratic or bilinear forms in the field that quantify the reading of a relevant optical instrument or the electromagnetic energy budget. It is generally believed that time-harmonic Maxwell's equations can accurately describe elastic electromagnetic scattering by macroscopic particulate media that change in time much more slowly than the incident electromagnetic field. However, direct solutions of these equations for discrete random media had been impracticable until quite recently. This has led to a widespread use of various phenomenological approaches in situations when their very applicability can be questioned. Recently, however, a new branch of physical optics has emerged wherein electromagnetic scattering by discrete and discretely heterogeneous random media is modeled directly by using analytical or numerically exact computer solutions of the Maxwell equations. Therefore, the main objective of this Report is to formulate the general theoretical framework of electromagnetic scattering by discrete random media rooted in the Maxwell-Lorentz electromagnetics and discuss its immediate analytical and numerical consequences. 
Starting from the microscopic Maxwell-Lorentz equations, we trace the development of the first-principles formalism enabling accurate calculations of monochromatic and quasi-monochromatic scattering by static and randomly varying multiparticle groups. We illustrate how this general framework can be coupled with state-of-the-art computer solvers of the Maxwell equations and applied to direct modeling of electromagnetic scattering by representative random multi-particle groups with arbitrary packing densities. This first-principles modeling yields general physical insights unavailable with phenomenological approaches. We discuss how the first-order-scattering approximation, the radiative transfer theory, and the theory of weak localization of electromagnetic waves can be derived as immediate corollaries of the Maxwell equations for very specific and well-defined kinds of particulate medium. These recent developments confirm the mesoscopic origin of the radiative transfer, weak localization, and effective-medium regimes and help evaluate the numerical accuracy of widely used approximate modeling methodologies.
Mishchenko, Michael I; Dlugach, Janna M; Yurkin, Maxim A; Bi, Lei; Cairns, Brian; Liu, Li; Panetta, R Lee; Travis, Larry D; Yang, Ping; Zakharova, Nadezhda T
2016-05-16
A discrete random medium is an object in the form of a finite volume of a vacuum or a homogeneous material medium filled with quasi-randomly and quasi-uniformly distributed discrete macroscopic impurities called small particles. Such objects are ubiquitous in natural and artificial environments. They are often characterized by analyzing theoretically the results of laboratory, in situ , or remote-sensing measurements of the scattering of light and other electromagnetic radiation. Electromagnetic scattering and absorption by particles can also affect the energy budget of a discrete random medium and hence various ambient physical and chemical processes. In either case electromagnetic scattering must be modeled in terms of appropriate optical observables, i.e., quadratic or bilinear forms in the field that quantify the reading of a relevant optical instrument or the electromagnetic energy budget. It is generally believed that time-harmonic Maxwell's equations can accurately describe elastic electromagnetic scattering by macroscopic particulate media that change in time much more slowly than the incident electromagnetic field. However, direct solutions of these equations for discrete random media had been impracticable until quite recently. This has led to a widespread use of various phenomenological approaches in situations when their very applicability can be questioned. Recently, however, a new branch of physical optics has emerged wherein electromagnetic scattering by discrete and discretely heterogeneous random media is modeled directly by using analytical or numerically exact computer solutions of the Maxwell equations. Therefore, the main objective of this Report is to formulate the general theoretical framework of electromagnetic scattering by discrete random media rooted in the Maxwell-Lorentz electromagnetics and discuss its immediate analytical and numerical consequences. 
Starting from the microscopic Maxwell-Lorentz equations, we trace the development of the first-principles formalism enabling accurate calculations of monochromatic and quasi-monochromatic scattering by static and randomly varying multiparticle groups. We illustrate how this general framework can be coupled with state-of-the-art computer solvers of the Maxwell equations and applied to direct modeling of electromagnetic scattering by representative random multi-particle groups with arbitrary packing densities. This first-principles modeling yields general physical insights unavailable with phenomenological approaches. We discuss how the first-order-scattering approximation, the radiative transfer theory, and the theory of weak localization of electromagnetic waves can be derived as immediate corollaries of the Maxwell equations for very specific and well-defined kinds of particulate medium. These recent developments confirm the mesoscopic origin of the radiative transfer, weak localization, and effective-medium regimes and help evaluate the numerical accuracy of widely used approximate modeling methodologies.
NASA Astrophysics Data System (ADS)
Kulatunga, Ushiri Kumarihamy
This dissertation work entails three related studies on the investigation of Peer-Led Guided Inquiry student discourse in a General Chemistry I course through argumentation. The first study, Argumentation and participation patterns in general chemistry peer-led sessions, is focused on examining arguments and participation patterns in small student groups without peer leader intervention. The findings of this study revealed that students were mostly engaged in co-constructed arguments, that a discrepancy in the participation of the group members existed, and students were able to correct most of the incorrect claims on their own via argumentation. The second study, Exploration of peer leader verbal behaviors as they intervene with small groups in college general chemistry, examines the interactive discourse of the peer leaders and the students during peer leader intervention. The relationship between the verbal behaviors of the peer leaders and the student argumentation is explored in this study. The findings of this study demonstrated that peer leaders used an array of verbal behaviors to guide students to construct chemistry concepts, and that a relationship existed between student argument components and peer leader verbal behaviors. The third study, Use of Toulmin's Argumentation Scheme for student discourse to gain insight about guided inquiry activities in college chemistry, is focused on investigating the relationship between student arguments without peer leader intervention and the structure of published guided inquiry ChemActivities. The relationship between argumentation and the structure of the activities is explored with respect to prompts, questions, and the segmented Learning Cycle structure of the ChemActivities. Findings of this study revealed that prompts were effective in eliciting arguments, that convergent questions produced more arguments than directed questions, and that the structure of the Learning Cycle successfully scaffolded arguments. 
A semester of video data from two different small student groups facilitated by two different peer leaders was used for these three related studies. An analytic framework based on Toulmin's argumentation scheme was used for the argumentation analysis in each study. This dissertation work, focused on the three central elements of the peer-led classroom (students, the peer leader, and the ChemActivities), illuminates effective discourse important for group learning. Overall, it contributes to science education by providing both an analytic framework useful for investigating group processes and crucial strategies for conducting effective cooperative learning and promoting student argumentation. The findings have valuable implications for the professional development of teachers, specifically for group interventions in the implementation of cooperative learning reforms.
Mishchenko, Michael I.; Dlugach, Janna M.; Yurkin, Maxim A.; Bi, Lei; Cairns, Brian; Liu, Li; Panetta, R. Lee; Travis, Larry D.; Yang, Ping; Zakharova, Nadezhda T.
2018-01-01
A discrete random medium is an object in the form of a finite volume of a vacuum or a homogeneous material medium filled with quasi-randomly and quasi-uniformly distributed discrete macroscopic impurities called small particles. Such objects are ubiquitous in natural and artificial environments. They are often characterized by analyzing theoretically the results of laboratory, in situ, or remote-sensing measurements of the scattering of light and other electromagnetic radiation. Electromagnetic scattering and absorption by particles can also affect the energy budget of a discrete random medium and hence various ambient physical and chemical processes. In either case electromagnetic scattering must be modeled in terms of appropriate optical observables, i.e., quadratic or bilinear forms in the field that quantify the reading of a relevant optical instrument or the electromagnetic energy budget. It is generally believed that time-harmonic Maxwell’s equations can accurately describe elastic electromagnetic scattering by macroscopic particulate media that change in time much more slowly than the incident electromagnetic field. However, direct solutions of these equations for discrete random media had been impracticable until quite recently. This has led to a widespread use of various phenomenological approaches in situations when their very applicability can be questioned. Recently, however, a new branch of physical optics has emerged wherein electromagnetic scattering by discrete and discretely heterogeneous random media is modeled directly by using analytical or numerically exact computer solutions of the Maxwell equations. Therefore, the main objective of this Report is to formulate the general theoretical framework of electromagnetic scattering by discrete random media rooted in the Maxwell–Lorentz electromagnetics and discuss its immediate analytical and numerical consequences. 
Starting from the microscopic Maxwell–Lorentz equations, we trace the development of the first-principles formalism enabling accurate calculations of monochromatic and quasi-monochromatic scattering by static and randomly varying multiparticle groups. We illustrate how this general framework can be coupled with state-of-the-art computer solvers of the Maxwell equations and applied to direct modeling of electromagnetic scattering by representative random multi-particle groups with arbitrary packing densities. This first-principles modeling yields general physical insights unavailable with phenomenological approaches. We discuss how the first-order-scattering approximation, the radiative transfer theory, and the theory of weak localization of electromagnetic waves can be derived as immediate corollaries of the Maxwell equations for very specific and well-defined kinds of particulate medium. These recent developments confirm the mesoscopic origin of the radiative transfer, weak localization, and effective-medium regimes and help evaluate the numerical accuracy of widely used approximate modeling methodologies. PMID:29657355
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
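For readers unfamiliar with decision-analytic modeling, its core operation is "folding back" a decision tree: each chance node is an expectation over its outcomes, and competing strategies are compared on expected cost or utility. The Python sketch below is a deliberately toy "test vs. no test" comparison; every probability and cost is hypothetical and none is drawn from the study.

```python
def expected_value(branches):
    """Expected value of a chance node given [(probability, value), ...]."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9  # probabilities sum to 1
    return sum(p * v for p, v in branches)

# Strategy 1: offer a genetic test (hypothetical cost 500), then targeted care.
test_cost = 500.0
ev_test = test_cost + expected_value([
    (0.1, 2_000.0),    # carriers: early, targeted intervention
    (0.9, 1_000.0),    # non-carriers: routine care
])

# Strategy 2: no test; late detection in carriers is assumed to be costlier.
ev_no_test = expected_value([
    (0.1, 12_000.0),   # carriers detected late
    (0.9, 1_000.0),    # non-carriers: routine care
])
```

Real decision-analytic evaluations of genetic services layer onto this skeleton longer time horizons, quality-adjusted outcomes, and sensitivity analyses over the uncertain inputs, which is precisely where the evidence gaps the stakeholders describe become visible.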
ERIC Educational Resources Information Center
Waite, Sue; Bølling, Mads; Bentsen, Peter
2016-01-01
Using a conceptual model focused on purposes, aims, content, pedagogy, outcomes, and barriers, we review and interpret literature on two forms of outdoor learning: Forest Schools in England and "udeskole" in Denmark. We examine pedagogical principles within a comparative analytical framework and consider how adopted pedagogies reflect…
ERIC Educational Resources Information Center
Kim, Rae Young
2009-01-01
This study is an initial analytic attempt to iteratively develop a conceptual framework informed by both theoretical and practical perspectives that may be used to analyze non-textual elements in mathematics textbooks. Despite the importance of visual representations in teaching and learning, little effort has been made to specify in any…
ERIC Educational Resources Information Center
Markley, O. W.
The primary objective of this study is to develop a systems-oriented analytical framework with which to better understand how formal policies serve as regulatory influences on knowledge production and utilization (KPU) in education. When completed, the framework being developed should be able to organize information about the KPU system and its…
ERIC Educational Resources Information Center
Danielsson, Anna T.; Berge, Maria; Lidar, Malena
2018-01-01
The purpose of this paper is to develop and illustrate an analytical framework for exploring how relations between knowledge and power are constituted in science and technology classrooms. In addition, the empirical purpose of this paper is to explore how disciplinary knowledge and knowledge-making are constituted in teacher-student interactions.…
David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert
2011-01-01
In this paper we review progress towards the implementation of a risk-management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...
What is Informal Learning and What are its Antecedents? An Integrative and Meta-Analytic Review
2014-07-01
…formal training. Unfortunately, theory and research surrounding informal learning remain fragmented. Given that there has been little systematic… future-oriented. Applying this framework, the construct domain of informal learning in organizations is articulated. Second, an interactionist theory… theoretical framework and outline an agenda for future theory development, research, and application of informal learning principles in organizations.
ERIC Educational Resources Information Center
Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis; Haworth, Kevin; Jacobson, Emily; Liu, Sarah
2012-01-01
Depression and cigarette smoking are recurrent, interacting problems that co-occur at high rates and--especially when depression is chronic--are difficult to treat and associated with costly health consequences. In this paper we present an integrative therapeutic framework for concurrent treatment of these problems based on evidence-based…
2010-10-22
Enhanced Systemic Understanding of the Information Environment in Complex Crisis Management: Analytical Concept, Version 1.0. …multinational crisis management and the security sector about the significance and characteristics of the information environment. The framework is…
ERIC Educational Resources Information Center
Lamb, Theodore A.; Chin, Keric B. O.
This paper proposes a conceptual framework based on different levels of analysis using the metaphor of the layers of an onion to help organize and structure thinking on research issues concerning training. It discusses the core of the "analytic onion," the biological level, and seven levels of analysis that surround that core: the individual, the…
ERIC Educational Resources Information Center
Monroy, Carlos; Rangel, Virginia Snodgrass; Whitaker, Reid
2014-01-01
In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We include examples of data visualization based on teacher usage…
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
Objective: To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and Methods: In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results: A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions: The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID:25324556
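The cost-allocation rule the abstract describes (e.g., unit labor costs spread over patients by hours spent in the unit) is proportional allocation of a shared cost pool. A minimal sketch, with hypothetical figures and a function name invented for illustration:

```python
def allocate_cost(total_cost, usage_by_patient):
    """Allocate a shared cost pool (e.g., a unit's labor cost) to patients
    in proportion to their usage (e.g., hours spent in the unit)."""
    total_usage = sum(usage_by_patient.values())
    return {pid: total_cost * u / total_usage
            for pid, u in usage_by_patient.items()}

# Hypothetical: a 1000-unit labor pool split over two patients' hours.
shares = allocate_cost(1000.0, {"patient_a": 3.0, "patient_b": 1.0})
```

The same rule generalizes across cost drivers (hours, doses, scanner minutes) by swapping the usage measure, which is what makes the framework modular.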
Synchronistic mind-matter correlations in therapeutic practice: a commentary on Connolly (2015).
Atmanspacher, Harald; Fach, Wolfgang
2016-02-01
This commentary adds some ideas and refinements to the inspiring discussion in a recent paper by Connolly (2015) that makes use of a dual-aspect framework developed by us earlier. One key point is that exceptional experiences (of which synchronicities are a special case) cannot in general be identified with experiences of non-categorial or acategorial mental states. In fact, most exceptional experiences reported in the literature are experiences of categorial states. Conversely, there are non-categorial and acategorial states whose experience is not exceptional. Moreover, the psychodynamics of a synchronistic experience contain a subtle mesh of interacting processes pertaining to categorial, non-categorial and acategorial domains. We outline how this mesh may be addressed in particular cases of synchronicity described by Connolly. © 2016, The Society of Analytical Psychology.
Light scattering of a Bessel beam by a nucleated biological cell: An eccentric sphere model
NASA Astrophysics Data System (ADS)
Wang, Jia Jie; Han, Yi Ping; Chang, Jiao Yong; Chen, Zhu Yang
2018-02-01
Within the framework of generalized Lorenz-Mie theory (GLMT), an eccentrically stratified dielectric sphere model illuminated by an arbitrarily incident Bessel beam is applied to investigate the scattering characteristics of a single nucleated biological cell. The Bessel beam propagating in an arbitrary direction is expanded in terms of vector spherical wave functions (VSWFs), where the beam shape coefficients (BSCs) are calculated rigorously in a closed analytical form. The effects of the half-cone angle of the Bessel beam, the location of the particle in the beam, the size ratio of nucleus to cell, and the location of the nucleus inside the cell on the scattering properties of a nucleated cell are analyzed. The results provide useful references for optical diagnostics and imaging of particles having nucleated structures.
exprso: an R-package for the rapid implementation of machine learning algorithms.
Quinn, Thomas; Tylee, Daniel; Glatt, Stephen
2016-01-01
Machine learning plays a major role in many scientific investigations. However, non-expert programmers may struggle to implement the elaborate pipelines necessary to build highly accurate and generalizable models. We introduce exprso, a new R package that is an intuitive machine learning suite designed specifically for non-expert programmers. Built initially for the classification of high-dimensional data, exprso uses an object-oriented framework to encapsulate a number of common analytical methods into a series of interchangeable modules. These include modules for feature selection, classification, high-throughput parameter grid-searching, elaborate cross-validation schemes (e.g., Monte Carlo and nested cross-validation), ensemble classification, and prediction. In addition, exprso also supports multi-class classification (through the 1-vs-all generalization of binary classifiers) and the prediction of continuous outcomes.
Parandekar, Priya V; Hratchian, Hrant P; Raghavachari, Krishnan
2008-10-14
Hybrid QM:QM (quantum mechanics:quantum mechanics) and QM:MM (quantum mechanics:molecular mechanics) methods are widely used to calculate the electronic structure of large systems where a full quantum mechanical treatment at a desired high level of theory is computationally prohibitive. The ONIOM (our own N-layer integrated molecular orbital molecular mechanics) approximation is one of the more popular hybrid methods, where the total molecular system is divided into multiple layers, each treated at a different level of theory. In a previous publication, we developed a novel QM:QM electronic embedding scheme within the ONIOM framework, where the model system is embedded in the external Mulliken point charges of the surrounding low-level region to account for the polarization of the model system wave function. Therein, we derived and implemented a rigorous expression for the embedding energy as well as analytic gradients that depend on the derivatives of the external Mulliken point charges. In this work, we demonstrate the applicability of our QM:QM method with point charge embedding and assess its accuracy. We study two challenging systems--zinc metalloenzymes and silicon oxide cages--and demonstrate that electronic embedding shows significant improvement over mechanical embedding. We also develop a modified technique for the energy and analytic gradients using a generalized asymmetric Mulliken embedding method involving an unequal splitting of the Mulliken overlap populations to offer improvement in situations where the Mulliken charges may be deficient.
NASA Astrophysics Data System (ADS)
Penoyre, Zephyr; Haiman, Zoltán
2018-01-01
In symmetric gravitating systems experiencing rapid mass-loss, particle orbits change almost instantaneously, which can lead to the development of a sharply contoured density profile, including singular caustics for collisionless systems. This framework can be used to model a variety of dynamical systems, such as accretion discs following a massive black hole merger and dwarf galaxies following violent early star formation feedback. Particle interactions in the high-density peaks seem a promising source of observable signatures of these mass-loss events (i.e. a possible EM counterpart for black hole mergers or strong gamma-ray emission from dark matter annihilation around young galaxies), because the interaction rate depends on the square of the density. We study post-mass-loss density profiles, both analytic and numerical, in idealized cases and present arguments and methods to extend to any general system. An analytic derivation is presented for particles on Keplerian orbits responding to a drop in the central mass. We argue that this case, with initially circular orbits, gives the most sharply contoured profile possible. We find that despite the presence of a set of singular caustics, the total particle interaction rate is reduced compared to the unperturbed system; this is a result of the overall expansion of the system dominating over the steep caustics. Finally, we argue that this result holds more generally, and the loss of central mass decreases the particle interaction rate in any physical system.
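The idealized case of particles on initially circular Keplerian orbits responding to an instantaneous drop in the central mass can be worked out directly from the vis-viva equation: the particle's speed is unchanged at the instant of mass loss, so its orbital energy, and hence semi-major axis, jumps. A sketch of that calculation (our notation, not the paper's):

```python
# Assumption: test particle on an initially circular Keplerian orbit of
# radius r; the central mass instantly drops from M to f*M.
import math

def new_semimajor_axis(r, f):
    """Semi-major axis after the central mass drops from M to f*M.

    From vis-viva with the speed v = sqrt(GM/r) unchanged at the instant of
    mass loss: E = v**2/2 - G*f*M/r, and a' = -G*f*M/(2*E) = f*r/(2*f - 1).
    The orbit stays bound only for f > 1/2.
    """
    if f <= 0.5:
        return math.inf  # orbit becomes parabolic or hyperbolic (unbound)
    return f * r / (2 * f - 1)

print(new_semimajor_axis(1.0, 0.75))  # → 1.5: losing a quarter of M expands the orbit
```

The overall expansion this implies is exactly the effect the abstract identifies as dominating over the steep caustics in the total interaction rate.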
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
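Validating analytic derivatives against finite-difference approximations, as done for this tool, follows a standard pattern; a toy illustration (not the CEA/OpenMDAO code) using a central difference:

```python
# Toy check of an analytic derivative against a central finite difference
# (illustrative only; the actual tool differentiates the CEA equations).
def f(x):
    return x**3 + 2.0 * x

def df_analytic(x):
    return 3.0 * x**2 + 2.0          # hand-derived derivative of f

def df_fd(x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2.0 * h)   # O(h^2) central difference

x = 1.7
assert abs(df_analytic(x) - df_fd(x)) < 1e-6   # analytic matches FD
```

The speed advantage reported in the abstract comes from the analytic path avoiding the repeated function evaluations that finite differencing needs per design variable.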
High resolution x-ray CMT: Reconstruction methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, J.K.
This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
Semi-analytic valuation of stock loans with finite maturity
NASA Astrophysics Data System (ADS)
Lu, Xiaoping; Putri, Endah R. M.
2015-10-01
In this paper we study stock loans of finite maturity with different dividend distributions semi-analytically, using the analytical approximation method in Zhu (2006). Stock loan partial differential equations (PDEs) are established under the Black-Scholes framework. The Laplace transform method is used to solve the PDEs. The optimal exit price and stock loan value are obtained in Laplace space. Values in the original time space are recovered by numerical Laplace inversion. To demonstrate the efficiency and accuracy of our semi-analytic method, several examples are presented and the results are compared with those calculated using existing methods. We also present a calculation of the fair service fee charged by the lender for different loan parameters.
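Recovering time-domain values from Laplace space requires a numerical inversion scheme. The Gaver-Stehfest algorithm below is one standard choice, shown purely for illustration (the abstract does not specify which inversion scheme the authors used):

```python
# Gaver-Stehfest numerical Laplace inversion (one common scheme; illustrative,
# not necessarily the paper's choice). Works well for smooth f(t).
from math import factorial, log, exp

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s); N must be even."""
    ln2 = log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        Vk = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            Vk += (j ** (N // 2) * factorial(2 * j)
                   / (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                      * factorial(k - j) * factorial(2 * j - k)))
        Vk *= (-1) ** (k + N // 2)
        total += Vk * F(k * ln2 / t)
    return total * ln2 / t

# Sanity check against a known transform pair: L{e^(-t)} = 1/(s+1)
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
assert abs(approx - exp(-1.0)) < 1e-4
```

In the paper's setting, F would be the Laplace-space optimal exit price or loan value rather than a closed-form test function.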
Turgeon, Maxime; Oualkacha, Karim; Ciampi, Antonio; Miftah, Hanane; Dehghan, Golsa; Zanke, Brent W; Benedet, Andréa L; Rosa-Neto, Pedro; Greenwood, Celia Mt; Labbe, Aurélie
2018-05-01
The genomics era has led to an increase in the dimensionality of data collected in the investigation of biological questions. In this context, dimension-reduction techniques can be used to summarise high-dimensional signals into low-dimensional ones, to further test for association with one or more covariates of interest. This paper revisits one such approach, previously known as principal component of heritability and renamed here as principal component of explained variance (PCEV). As its name suggests, the PCEV seeks a linear combination of outcomes in an optimal manner, by maximising the proportion of variance explained by one or several covariates of interest. By construction, this method optimises power; however, due to its computational complexity, it has unfortunately received little attention in the past. Here, we propose a general analytical PCEV framework that builds on the assets of the original method, i.e. conceptually simple and free of tuning parameters. Moreover, our framework extends the range of applications of the original procedure by providing a computationally simple strategy for high-dimensional outcomes, along with exact and asymptotic testing procedures that drastically reduce its computational cost. We investigate the merits of the PCEV using an extensive set of simulations. Furthermore, the use of the PCEV approach is illustrated using three examples taken from the fields of epigenetics and brain imaging.
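The PCEV criterion, a linear combination of outcomes maximising the proportion of variance explained by a covariate, can be phrased as a generalised eigenproblem on the model and residual covariance components. A sketch on simulated data (variable names and the simulation are ours, not the paper's notation):

```python
# Illustrative PCEV-style computation (our construction, not the paper's code):
# maximise w' Vm w / w' Vr w, where Vm/Vr are model/residual covariance parts.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5
x = rng.normal(size=n)                                   # covariate of interest
Y = np.outer(x, [1.0, 0.5, 0.0, 0.0, 0.0]) + rng.normal(size=(n, p))

# Regress outcomes on the covariate; split covariance into explained + residual
X = np.column_stack([np.ones(n), x])
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
fitted = X @ B
resid = Y - fitted
Vm = (fitted - fitted.mean(0)).T @ (fitted - fitted.mean(0))
Vr = resid.T @ resid

# PCEV weights: leading eigenvector of Vr^{-1} Vm
vals, vecs = np.linalg.eig(np.linalg.solve(Vr, Vm))
w = np.real(vecs[:, np.argmax(np.real(vals))])
component = Y @ w
print(abs(np.corrcoef(component, x)[0, 1]))  # strongly correlated with x
```

The component concentrates on the first two outcomes, which carry the signal; the paper's contribution is making this computation and its testing tractable when p is high-dimensional.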
Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin
2016-01-01
Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). 
The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516
Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin
2016-01-01
The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility.
Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.
Perturbatively deformed defects in Pöschl-Teller-driven scenarios for quantum mechanics
NASA Astrophysics Data System (ADS)
Bernardini, Alex E.; da Rocha, Roldão
2016-07-01
Pöschl-Teller-driven solutions for quantum mechanical fluctuations are triggered off by single scalar field theories obtained through a systematic perturbative procedure for generating deformed defects. The analytical properties concerning the quantum fluctuations in one-dimension, zero-mode states, first- and second-excited states, and energy density profiles are all obtained from deformed topological and non-topological structures supported by real scalar fields. Results are firstly derived from an integrated λϕ4 theory, with corresponding generalizations applied to starting λχ4 and sine-Gordon theories. By focusing our calculations on structures supported by the λϕ4 theory, the outcome of our study suggests an exact quantitative correspondence to Pöschl-Teller-driven systems. Embedded into the perturbative quantum mechanics framework, such a correspondence turns into a helpful tool for computing excited states and continuous mode solutions, as well as their associated energy spectrum, for quantum fluctuations of perturbatively deformed structures. Perturbative deformations create distinct physical scenarios in the context of exactly solvable quantum systems and may also work as an analytical support for describing novel braneworld universes embedded into a 5-dimensional gravity bulk.
On the distribution of local dissipation scales in turbulent flows
NASA Astrophysics Data System (ADS)
May, Ian; Morshed, Khandakar; Venayagamoorthy, Karan; Dasi, Lakshmi
2014-11-01
Universality of dissipation scales in turbulence relies on self-similar scaling and large-scale independence. We show that the probability density function of dissipation scales, Q(η), is analytically defined by the two-point correlation function and the Reynolds number (Re). We also present a new analytical form for the two-point correlation function for the dissipation scales through a generalized definition of a directional Taylor microscale. Comparison between Q(η) predicted within this framework and published DNS data shows excellent agreement. It is shown that for finite Re no single similarity law exists, even for the case of homogeneous isotropic turbulence. Instead a family of scalings is presented, defined by Re and a dimensionless local inhomogeneity parameter based on the spatial gradient of the rms velocity. For moderate-Re inhomogeneous flows, we note a strong directional dependence of Q(η) dictated by the principal Reynolds stresses. It is shown that the mode of the distribution Q(η) significantly shifts to sub-Kolmogorov scales along the inhomogeneous directions, as in wall-bounded turbulence. This work extends the classical Kolmogorov theory to finite-Re homogeneous isotropic turbulence as well as the case of inhomogeneous anisotropic turbulence.
Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.
Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio
2010-03-26
Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
Diagnosing Alzheimer's disease: a systematic review of economic evaluations.
Handels, Ron L H; Wolfs, Claire A G; Aalten, Pauline; Joore, Manuela A; Verhey, Frans R J; Severens, Johan L
2014-03-01
The objective of this study is to systematically review the literature on economic evaluations of interventions for the early diagnosis of Alzheimer's disease (AD) and related disorders and to describe their general and methodological characteristics. We focused on the diagnostic aspects of the decision models to assess the applicability of existing decision models for the evaluation of the recently revised diagnostic research criteria for AD. PubMed and the National Institute for Health Research Economic Evaluation database were searched for English-language publications related to economic evaluations on diagnostic technologies. Trial-based economic evaluations were assessed using the Consensus on Health Economic Criteria list. Modeling studies were assessed using the framework for quality assessment of decision-analytic models. The search retrieved 2109 items, from which eight decision-analytic modeling studies and one trial-based economic evaluation met all eligibility criteria. Diversity among the study objective and characteristics was considerable and, despite considerable methodological quality, several flaws were indicated. Recommendations were focused on diagnostic aspects and the applicability of existing models for the evaluation of recently revised diagnostic research criteria for AD. Copyright © 2014 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
A Decision Analytic Approach to Exposure-Based Chemical Prioritization
Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.
2013-01-01
The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
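The integration of exposure criteria within a decision-analytic framework can be illustrated with a toy weighted-sum ranking; the criteria, weights, and scores below are invented for illustration only and do not reproduce the ExpoCast model:

```python
# Hypothetical weighted-sum prioritisation sketch (criteria, weights and
# scores are invented; the paper integrates data with expert judgment).
weights = {"production_volume": 0.4, "use_pattern": 0.35, "persistence": 0.25}

def exposure_score(chem):
    """Aggregate normalised criterion scores into one exposure-potential score."""
    return sum(weights[c] * chem[c] for c in weights)

chemicals = {
    "chem_A": {"production_volume": 0.9, "use_pattern": 0.8, "persistence": 0.3},
    "chem_B": {"production_volume": 0.2, "use_pattern": 0.4, "persistence": 0.9},
}
ranked = sorted(chemicals, key=lambda c: exposure_score(chemicals[c]), reverse=True)
print(ranked)  # chem_A ranks first: higher volume and broader use dominate
```

In the actual case study, 51 chemicals were scored this way over life-cycle criteria, with the weights elicited rather than assumed.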
Consistency properties of chaotic systems driven by time-delayed feedback
NASA Astrophysics Data System (ADS)
Jüngling, T.; Soriano, M. C.; Oliver, N.; Porte, X.; Fischer, I.
2018-04-01
Consistency refers to the property of an externally driven dynamical system to respond in similar ways to similar inputs. In a delay system, the delayed feedback can be considered as an external drive to the undelayed subsystem. We analyze the degree of consistency in a generic chaotic system with delayed feedback by means of the auxiliary system approach. In this scheme an identical copy of the nonlinear node is driven by exactly the same signal as the original, allowing us to verify complete consistency via complete synchronization. In the past, the phenomenon of synchronization in delay-coupled chaotic systems has been widely studied using correlation functions. Here, we analytically derive relationships between characteristic signatures of the correlation functions in such systems and unequivocally relate them to the degree of consistency. The analytical framework is illustrated and supported by numerical calculations of the logistic map with delayed feedback for different replica configurations. We further apply the formalism to time series from an experiment based on a semiconductor laser with a double fiber-optical feedback loop. The experiment constitutes a high-quality replica scheme for studying consistency of the delay-driven laser and confirms the general theoretical results.
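The auxiliary-system approach can be demonstrated on the abstract's own example, a logistic map with delayed feedback: a replica map receives exactly the same delayed drive as the original, and complete synchronization of the replica certifies complete consistency. A toy construction (our specific equations and parameters, not necessarily the paper's exact configuration):

```python
# Auxiliary-system test on a delayed logistic map (illustrative parameters).
def logistic(x, r=3.9):
    return r * x * (1.0 - x)

tau, eps, steps = 5, 0.9, 2000
x = [0.3] * (tau + 1)          # original system
y = [0.7] * (tau + 1)          # replica starts from a different state
for n in range(steps):
    drive = logistic(x[-1 - tau])                  # shared delayed drive
    x.append((1 - eps) * logistic(x[-1]) + eps * drive)
    y.append((1 - eps) * logistic(y[-1]) + eps * drive)

# Replica has synchronized with the original: the drive is consistent
print(abs(x[-1] - y[-1]))  # difference contracts toward zero
```

With a strong drive (eps = 0.9) the conditional dynamics are contracting, since (1 - eps) times the maximum slope of the map is 0.39 < 1, so the replica collapses onto the original regardless of its initial state.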
Contribution of human and climate change impacts to changes in streamflow of Canada.
Tan, Xuezhi; Gan, Thian Yew
2015-12-04
Climate change exerts great influence on streamflow by changing precipitation, temperature, snowpack and potential evapotranspiration (PET), while human activities in a watershed can alter runoff production directly, and indirectly through their effect on climatic variables. However, separating the contributions of anthropogenic and natural drivers to observed changes in streamflow is non-trivial. Here we estimated the direct influence of human activities and the effect of climate change on changes in the mean annual streamflow (MAS) of 96 Canadian watersheds, based on the elasticity of streamflow with respect to precipitation, PET and human impacts such as land use and cover change. Elasticities of streamflow for each watershed are analytically derived using the Budyko framework. We found that climate change generally caused an increase in MAS, while human impacts generally caused a decrease, and such impacts tend to become more severe with time, although there are exceptions. Higher proportions of human contribution, compared with the climate change contribution, resulted in the generally decreased streamflow of Canada observed in recent decades. Furthermore, without the contribution of retreating glaciers to streamflow, human impacts would have resulted in a more severe decrease in Canadian streamflow.
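Once the elasticities are in hand, the attribution itself is simple arithmetic: the climate-driven change is the elasticity-weighted sum of the fractional changes in precipitation and PET, and the human contribution is the residual against the observed change. A sketch with hypothetical numbers (the paper derives its elasticities analytically from the Budyko framework):

```python
# Back-of-envelope elasticity attribution (all numbers hypothetical).
def attribute_change(dQ_obs, Q, eps_P, dP_frac, eps_PET, dPET_frac):
    """Split an observed streamflow change into climate and human parts."""
    dQ_climate = (eps_P * dP_frac + eps_PET * dPET_frac) * Q
    dQ_human = dQ_obs - dQ_climate               # residual, attributed to humans
    return dQ_climate, dQ_human

# e.g. +5% precipitation, +2% PET, observed drop of 10 mm on Q = 200 mm/yr
clim, human = attribute_change(-10.0, 200.0, eps_P=2.0, dP_frac=0.05,
                               eps_PET=-1.0, dPET_frac=0.02)
print(clim, human)  # climate pushed flow up (~ +16 mm); humans pulled it down (~ -26 mm)
```

This is the sense in which human impacts can dominate even where climate change alone would have increased streamflow, as found for many of the 96 watersheds.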
Spike-Threshold Variability Originated from Separatrix-Crossing in Neuronal Dynamics
Wang, Longfei; Wang, Hengtong; Yu, Lianchun; Chen, Yong
2016-01-01
The threshold voltage for action potential generation is a key regulator of neuronal signal processing, yet the mechanism of its dynamic variation is still not well described. In this paper, we propose that threshold phenomena can be classified as parameter thresholds and state thresholds. Voltage thresholds which belong to the state threshold are determined by the ‘general separatrix’ in state space. We demonstrate that the separatrix generally exists in the state space of neuron models. The general form of separatrix was assumed as the function of both states and stimuli and the previously assumed threshold evolving equation versus time is naturally deduced from the separatrix. In terms of neuronal dynamics, the threshold voltage variation, which is affected by different stimuli, is determined by crossing the separatrix at different points in state space. We suggest that the separatrix-crossing mechanism in state space is the intrinsic dynamic mechanism for threshold voltages and post-stimulus threshold phenomena. These proposals are also systematically verified in example models, three of which have analytic separatrices and one is the classic Hodgkin-Huxley model. The separatrix-crossing framework provides an overview of the neuronal threshold and will facilitate understanding of the nature of threshold variability. PMID:27546614
Spike-Threshold Variability Originated from Separatrix-Crossing in Neuronal Dynamics.
Wang, Longfei; Wang, Hengtong; Yu, Lianchun; Chen, Yong
2016-08-22
The threshold voltage for action potential generation is a key regulator of neuronal signal processing, yet the mechanism of its dynamic variation is still not well described. In this paper, we propose that threshold phenomena can be classified as parameter thresholds and state thresholds. Voltage thresholds which belong to the state threshold are determined by the 'general separatrix' in state space. We demonstrate that the separatrix generally exists in the state space of neuron models. The general form of separatrix was assumed as the function of both states and stimuli and the previously assumed threshold evolving equation versus time is naturally deduced from the separatrix. In terms of neuronal dynamics, the threshold voltage variation, which is affected by different stimuli, is determined by crossing the separatrix at different points in state space. We suggest that the separatrix-crossing mechanism in state space is the intrinsic dynamic mechanism for threshold voltages and post-stimulus threshold phenomena. These proposals are also systematically verified in example models, three of which have analytic separatrices and one is the classic Hodgkin-Huxley model. The separatrix-crossing framework provides an overview of the neuronal threshold and will facilitate understanding of the nature of threshold variability.
Intellectual Biography in Higher Education: The Public Career of Earl J. McGrath as a Case Study.
ERIC Educational Resources Information Center
Reid, John Y.
The method of writing an intellectual biography and the public career of Earl J. McGrath in the post-World War I cultural milieu are analyzed. One analytical framework is adapted from cultural anthropology and is used to describe the relationship of educational systems to other social systems and to culture as a whole. The second analytic frame,…
2006-07-27
ABSTRACT: The goal of this project was to develop analytical and computational tools to make vision a viable sensor for … We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry.
Yang, Cheng-Xiong; Liu, Chang; Cao, Yi-Meng; Yan, Xiu-Ping
2015-08-07
A simple and facile room-temperature solution-phase synthesis was developed to fabricate a spherical covalent organic framework with large surface area, good solvent stability and high thermostability for high-resolution chromatographic separation of diverse important industrial analytes including alkanes, cyclohexane and benzene, α-pinene and β-pinene, and alcohols with high column efficiency and good precision.
ERIC Educational Resources Information Center
Bull, Susan; Kay, Judy
2016-01-01
The SMILI☺ (Student Models that Invite the Learner In) Open Learner Model Framework was created to provide a coherent picture of the many and diverse forms of Open Learner Models (OLMs). The aim was for SMILI☺ to provide researchers with a systematic way to describe, compare and critique OLMs. We expected it to highlight those areas where there…
The "A" Factor: Coming to Terms with the Question of Legacy in South African Education
ERIC Educational Resources Information Center
Soudien, Crain
2007-01-01
This paper attempts to offer an alternative framework for assessing education delivery in South Africa. Its purpose is to develop an analytic approach for understanding education delivery in South Africa in the last 11 years and to use this framework to pose a set of strategic questions about how policy might be framed to deal with delivery. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekechukwu, A
Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
Life cycle thinking in impact assessment—Current practice and LCA gains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bidstrup, Morten, E-mail: Bidstrup@plan.aau.dk
It has been advocated that life cycle thinking (LCT) should be applied in impact assessment (IA) to a greater extent, since some development proposals pose a risk of significant impacts throughout the interconnected activities of product systems. Multiple authors have proposed the use of life cycle assessment (LCA) for such analytical advancement, but little research on this tool application has so far been grounded in IA practice. The aim of this article is to elaborate further on the gains assigned to application of LCA. The research builds on a review of 85 Danish IA reports, which were analysed for analytical appropriateness and application of LCT. Through a focus on the non-technical summary, the conclusion, and the use of specific search words, passages containing LCT were searched for in each IA report. These passages were then analysed with a generic framework. The results reveal that LCT is appropriate for most of the IAs, but that LCA is rarely applied to provide such a perspective. Without LCA, the IAs show mixed performance in regard to LCT. Most IAs do consider the product provision of development proposals, but they rarely relate impacts to this function explicitly. Many IAs do consider downstream impacts, but assessments of upstream, distant impacts are generally absent. It is concluded that multiple analytical gains can be attributed to greater application of LCA in IA practice, though some level of LCT already exists. - Highlights: • Life cycle thinking is appropriate across the types and topics of impact assessment. • Yet, life cycle assessment is rarely used for adding such perspective. • Impact assessment practice does apply some degree of life cycle thinking. • However, application of life cycle assessment could bring analytical gains.
Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Rallabhandi, Sriram K.
2010-01-01
A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.
Moral panic, moral regulation, and the civilizing process.
Hier, Sean
2016-09-01
This article compares two analytical frameworks ostensibly formulated to widen the focus of moral panic studies. The comparative analysis suggests that attempts to conceptualize moral panics in terms of decivilizing processes have neither substantively supplemented the explanatory gains made by conceptualizing moral panic as a form of moral regulation nor provided a viable alternative framework that better explains the dynamics of contemporary moral panics. The article concludes that Elias's meta-theory of the civilizing process potentially provides explanatory resources to investigate a possible historical-structural shift towards the so-called age of (a)moral panic; the analytical demands of such a project, however, require a sufficiently different line of inquiry than the one encouraged by both the regulatory and decivilizing perspectives on moral panic. © London School of Economics and Political Science 2016.
NASA Astrophysics Data System (ADS)
Margitus, Michael R.; Tagliaferri, William A., Jr.; Sudit, Moises; LaMonica, Peter M.
2012-06-01
Understanding the structure and dynamics of networks is of vital importance to winning the global war on terror. To fully comprehend the network environment, analysts must be able to investigate interconnected relationships of many diverse network types simultaneously as they evolve both spatially and temporally. To remove from the analyst the burden of making mental correlations of observations and conclusions from multiple domains, we introduce the Dynamic Graph Analytic Framework (DYGRAF). DYGRAF provides the infrastructure that facilitates a layered multi-modal network analysis (LMMNA) approach, enabling analysts to assemble previously disconnected, yet related, networks in a common battle space picture. In doing so, DYGRAF provides the analyst with timely situation awareness, understanding and anticipation of threats, and support for effective decision-making in diverse environments.
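The layered multi-modal idea can be illustrated with a small sketch: several single-domain edge lists are stacked into one combined picture, and entities active in more than one layer surface as natural cross-domain pivots. The layer names, entities, and edges below are invented for illustration and are not part of DYGRAF itself.

```python
# Each layer is one network modality, stored as a set of undirected edges.
layers = {
    "communications": {("alice", "bob"), ("bob", "carol")},
    "financial":      {("carol", "dave"), ("alice", "dave")},
    "travel":         {("dave", "erin"), ("alice", "bob")},
}

# Combined picture: union of all layers, each edge tagged with its source layers.
combined = {}
for layer, edges in layers.items():
    for u, v in edges:
        combined.setdefault(frozenset((u, v)), set()).add(layer)

# Track which layers each node participates in.
membership = {}
for layer, edges in layers.items():
    for u, v in edges:
        membership.setdefault(u, set()).add(layer)
        membership.setdefault(v, set()).add(layer)

# Nodes active in multiple modalities are candidate analytic pivots.
multi_modal_nodes = {n for n, ls in membership.items() if len(ls) > 1}

print(sorted(multi_modal_nodes))
```

A production system would of course add temporal and spatial attributes to each edge; the point here is only that fusing per-domain graphs into one keyed structure makes cross-domain correlations explicit rather than mental.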
Critical Medical Anthropology in Midwifery Research: A Framework for Ethnographic Analysis.
Newnham, Elizabeth C; Pincombe, Jan I; McKellar, Lois V
2016-01-01
In this article, we discuss the use of critical medical anthropology (CMA) as a theoretical framework for research in the maternity care setting. With reference to the doctoral research of the first author, we argue for the relevance of using CMA for research into the maternity care setting, particularly as it relates to midwifery. We then give an overview of an existing analytic model within CMA that we adapted for looking specifically at childbirth practices and which was then used in both analyzing the data and structuring the thesis. There is often no clear guide to the analysis or writing up of data in ethnographic research; we therefore offer this Critical analytic model of childbirth practices for other researchers conducting ethnographic research into childbirth or maternity care.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L^2-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory in two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L^2(R^n), the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related to quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle).
Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Towards an analytical framework for tailoring supercontinuum generation.
Castelló-Lurbe, David; Vermeulen, Nathalie; Silvestre, Enrique
2016-11-14
A fully analytical toolbox for supercontinuum generation relying on scenarios without pulse splitting is presented. Furthermore, starting from the new insights provided by this formalism about the physical nature of direct and cascaded dispersive wave emission, a unified description of this radiation in both normal and anomalous dispersion regimes is derived. Previously unidentified physics of broadband spectra reported in earlier works is successfully explained on this basis. Finally, a foundry-compatible few-millimeters-long silicon waveguide allowing octave-spanning supercontinuum generation pumped at telecom wavelengths in the normal dispersion regime is designed, hence showcasing the potential of this new analytical approach.
Service line analytics in the new era.
Spence, Jay; Seargeant, Dan
2015-08-01
To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions, with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.