RT-18: Value of Flexibility. Phase 1
2010-09-25
an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory...framework that is mathematically consistent, domain independent, and applicable under varying information levels. This report presents our advances in...
Value of Flexibility - Phase 1
2010-09-25
weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to...research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of the biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels is intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this criterion holds, phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through the corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240
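The robustness inequality at the heart of this abstract is simple enough to state directly in code. The sketch below only illustrates the criterion's arithmetic; the function name and the numeric robustness values are invented for illustration, not drawn from the paper.

```python
# Illustrative check of the phenotype robustness criterion: phenotype
# robustness holds when the summed intrinsic, genetic, and environmental
# robustness demands do not exceed the available network robustness.
# All values below are hypothetical placeholders.

def phenotype_robust(intrinsic, genetic, environmental, network):
    """Return True if the summed robustness demands are covered."""
    return intrinsic + genetic + environmental <= network

# A network with ample robustness margin maintains its phenotype...
print(phenotype_robust(0.2, 0.3, 0.1, 0.9))  # True
# ...while an overloaded one does not.
print(phenotype_robust(0.4, 0.4, 0.3, 0.9))  # False
```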
A general modeling framework for describing spatially structured population dynamics
Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan
2017-01-01
Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework flexible enough to capture a wide variety of spatiotemporal processes, including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles.
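The abstract's three common elements (space, time, and movement) translate naturally into a small discrete-time network model. The sketch below is a minimal illustration in that spirit, not the general computer code the authors provide; the node names, growth rates, and movement fractions are invented.

```python
# Minimal sketch of a network-based spatial population model: nodes carry
# abundances, directed weighted edges carry per-step movement fractions,
# and each discrete time step applies local growth followed by
# redistribution along the edges.

def step(abundance, growth, moves):
    """One discrete time step: growth at each node, then movement.
    moves maps (src, dst) -> fraction of src's post-growth abundance moved."""
    grown = {n: abundance[n] * growth[n] for n in abundance}
    nxt = dict(grown)
    for (src, dst), frac in moves.items():
        flow = grown[src] * frac
        nxt[src] -= flow
        nxt[dst] += flow
    return nxt

# Two-patch example: a breeding site and a wintering site.
abundance = {"breeding": 100.0, "wintering": 0.0}
growth = {"breeding": 1.2, "wintering": 1.0}
autumn = {("breeding", "wintering"): 1.0}   # full migration south
spring = {("wintering", "breeding"): 1.0}   # full migration north

abundance = step(abundance, growth, autumn)   # grow, then migrate south
abundance = step(abundance, growth, spring)   # overwinter, return north
print(abundance)  # {'breeding': 120.0, 'wintering': 0.0}
```

Metapopulations, partial migration, or nomadism follow by changing the edge set and fractions rather than the update rule, which is the unification the paper emphasizes.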
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, along with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of the biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view.
Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example. PMID:23515190
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is that if intrinsic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results for robust ecological networks are similar to those for robust gene regulatory networks and evolutionary networks, even though they operate on different spatial and time scales. PMID:23515112
A Unified Model of Geostrophic Adjustment and Frontogenesis
NASA Astrophysics Data System (ADS)
Taylor, John; Shakespeare, Callum
2013-11-01
Fronts, or regions with strong horizontal density gradients, are ubiquitous and dynamically important features of the ocean and atmosphere. In the ocean, fronts are associated with enhanced air-sea fluxes, turbulence, and biological productivity, while atmospheric fronts are associated with some of the most extreme weather events. Here, we describe a new mathematical framework for describing the formation of fronts, or frontogenesis. This framework unifies two classical problems in geophysical fluid dynamics, geostrophic adjustment and strain-driven frontogenesis, and provides a number of important extensions beyond previous efforts. The model solutions closely match numerical simulations during the early stages of frontogenesis, and provide a means to describe the development of turbulence at mature fronts.
Unified framework for information integration based on information geometry
Oizumi, Masafumi; Amari, Shun-ichi
2016-01-01
Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
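The simplest instance of the "disconnection" construction described above is mutual information, obtained as the divergence between a joint distribution and the product of its marginals (the distribution in which the two variables are statistically disconnected). A minimal sketch, with an invented 2x2 joint distribution:

```python
# Mutual information as a KL divergence D(p_joint || p_x * p_y), the
# two-variable special case of the "disconnected distribution" idea.
import math

def mutual_information(joint):
    """KL divergence between a 2D joint table and its product of marginals (nats)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (px[i] * py[j]))
    return mi

independent = [[0.25, 0.25], [0.25, 0.25]]   # no causal/statistical coupling
correlated = [[0.5, 0.0], [0.0, 0.5]]        # perfectly coupled variables
print(mutual_information(independent))  # 0.0
print(mutual_information(correlated))   # ln 2, about 0.693
```

Transfer entropy, stochastic interaction, and integrated information arise, per the abstract, by disconnecting different sets of influences in the same divergence construction.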
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational
String Theory: Big Problem for Small Size
ERIC Educational Resources Information Center
Sahoo, S.
2009-01-01
String theory is the most promising candidate theory for a unified description of all the fundamental forces that exist in nature. It provides a mathematical framework that combines quantum theory with Einstein's general theory of relativity. The typical size of a string is of the order of 10^-33 cm, called the Planck length. But due…
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. 
Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
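As a toy illustration of steepest entropy ascent, the sketch below evolves a three-level probability distribution along the entropy gradient projected onto the surface where normalization and mean energy are conserved, using the identity metric for simplicity (the choice of metric is exactly what the paper argues encodes the physics). The energies, initial state, and step size are invented; the non-negativity of entropy production shows up as a monotone entropy increase.

```python
# Toy steepest-entropy-ascent relaxation: move p along the entropy
# gradient with the components violating the conservation constraints
# (normalization and mean energy) projected out.
import numpy as np

E = np.array([0.0, 1.0, 2.0])          # level energies (illustrative)
p = np.array([0.70, 0.10, 0.20])       # initial nonequilibrium state

def project_out(g, constraints):
    """Remove the components of g along the constraint gradients."""
    A = np.vstack(constraints)
    coeff = np.linalg.lstsq(A.T, g, rcond=None)[0]
    return g - A.T @ coeff

def entropy(p):
    return -np.sum(p * np.log(p))

U0, S0 = E @ p, entropy(p)
for _ in range(2000):
    g = -(np.log(p) + 1.0)             # gradient of the entropy
    p = p + 0.001 * project_out(g, [np.ones(3), E])

print(abs(p.sum() - 1.0) < 1e-9, abs(E @ p - U0) < 1e-9)  # both conserved
print(entropy(p) > S0)                 # entropy has increased
```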
A Unified Mathematical Definition of Classical Information Retrieval.
ERIC Educational Resources Information Center
Dominich, Sandor
2000-01-01
Presents a unified mathematical definition for the classical models of information retrieval and identifies a mathematical structure behind relevance feedback. Highlights include vector information retrieval; probabilistic information retrieval; and similarity information retrieval. (Contains 118 references.) (Author/LRW)
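Of the three classical models named in this abstract, the vector model is the easiest to sketch: documents and queries become term vectors, ranked by cosine similarity. This textbook miniature is not Dominich's unified definition, merely one of the models it subsumes; the documents and query are invented.

```python
# Classical vector-space retrieval in miniature: rank documents by the
# cosine of the angle between their term vectors and the query vector.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Term order: ("unified", "mathematical", "retrieval")
docs = {
    "d1": (1, 1, 0),   # mentions "unified mathematical"
    "d2": (0, 0, 2),   # mentions "retrieval" twice
}
query = (1, 0, 1)      # "unified retrieval"
ranked = sorted(docs, key=lambda d: cosine(docs[d], query), reverse=True)
print(ranked)  # ['d2', 'd1']
```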
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
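A sketch of the Sharma-Mittal family under one common parameterization (order q and degree r, in natural-log units), showing Shannon, Rényi, and Tsallis entropies as limiting cases. The parameter naming and the limit-handling thresholds are this sketch's conventions, not necessarily those used in the paper.

```python
# Sharma-Mittal entropy H_{q,r}(p). Special cases in this convention:
# Shannon at q -> 1, r -> 1; Renyi at r -> 1; Tsallis at r = q.
import math

def sharma_mittal(p, q, r, eps=1e-9):
    """Sharma-Mittal entropy of order q and degree r (nats)."""
    shannon = -sum(pi * math.log(pi) for pi in p if pi > 0)
    s = sum(pi ** q for pi in p if pi > 0)
    if abs(q - 1.0) < eps and abs(r - 1.0) < eps:
        return shannon                                    # Shannon limit
    if abs(r - 1.0) < eps:
        return math.log(s) / (1.0 - q)                    # Renyi limit
    if abs(q - 1.0) < eps:
        return (math.exp((1.0 - r) * shannon) - 1.0) / (1.0 - r)
    return (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

uniform = [0.25] * 4
print(sharma_mittal(uniform, 1, 1))  # Shannon: ln 4, about 1.386
print(sharma_mittal(uniform, 2, 1))  # Renyi of order 2: also ln 4
print(sharma_mittal(uniform, 2, 2))  # Tsallis of order 2: 0.75
```

For the uniform distribution all Rényi orders coincide at ln n, while Tsallis values differ, which already hints at why different fields settling on different members of the family can reach different conclusions.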
Unified reduction principle for the evolution of mutation, migration, and recombination
Altenberg, Lee; Liberman, Uri; Feldman, Marcus W.
2017-01-01
Modifier-gene models for the evolution of genetic information transmission between generations of organisms exhibit the reduction principle: Selection favors reduction in the rate of variation production in populations near equilibrium under a balance of constant viability selection and variation production. Whereas this outcome has been proven for a variety of genetic models, it has not been proven in general for multiallelic genetic models of mutation, migration, and recombination modification with arbitrary linkage between the modifier and major genes under viability selection. We show that the reduction principle holds for all of these cases by developing a unifying mathematical framework that characterizes all of these evolutionary models. PMID:28265103
The formal Darwinism project: a mid-term report.
Grafen, A
2007-07-01
For 8 years I have been pursuing in print an ambitious and at times highly technical programme of work, the 'Formal Darwinism Project', whose essence is to underpin and formalize the fitness optimization ideas used by behavioural ecologists, using a new kind of argument linking the mathematics of motion and the mathematics of optimization. The value of the project is to give stronger support to current practices, while at the same time sharpening theoretical ideas and suggesting principled resolutions of some untidy areas, for example, how to define fitness. The aim is also to unify existing free-standing theoretical structures, such as inclusive fitness theory, Evolutionary Stable Strategy (ESS) theory and bet-hedging theory. The 40-year-old misunderstanding over the meaning of fitness optimization between mathematicians and biologists is explained. Most of the elements required for a general theory have now been implemented, but not together in the same framework, and 'general time' remains to be developed and integrated with the other elements to produce a final unified theory of neo-Darwinian natural selection.
The Vector Space as a Unifying Concept in School Mathematics.
ERIC Educational Resources Information Center
Riggle, Timothy Andrew
The purpose of this study was to show how the concept of vector space can serve as a unifying thread for mathematics programs--elementary school to pre-calculus college level mathematics. Indicated are a number of opportunities to demonstrate how emphasis upon the vector space structure can enhance the organization of the mathematics curriculum.…
Townsend, James T; Eidels, Ami
2011-08-01
Increasing the number of available sources of information may impair or facilitate performance, depending on the capacity of the processing system. Tests performed on response time distributions are proving to be useful tools in determining the workload capacity (as well as other properties) of cognitive systems. In this article, we develop a framework and relevant mathematical formulae that represent different capacity assays (Miller's race model bound, Grice's bound, and Townsend's capacity coefficient) in the same space. The new space allows a direct comparison between the distinct bounds and the capacity coefficient values and helps explicate the relationships among the different measures. An analogous common space is proposed for the AND paradigm, relating the capacity index to the Colonius-Vorberg bounds. We illustrate the effectiveness of the unified spaces by presenting data from two simulated models (standard parallel, coactive) and a prototypical visual detection experiment. A conversion table for the unified spaces is provided.
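The three assays named above can be written down for a single time point t in the OR (first-terminating) paradigm. In the sketch below, F denotes a response-time CDF and S = 1 - F its survivor function; the CDF values are invented, and an unlimited-capacity independent parallel race serves as the benchmark that yields C(t) = 1.

```python
# Miller's race model bound, Grice's bound, and Townsend's capacity
# coefficient, evaluated at one time point t in the OR paradigm.
import math

def capacity_or(F_a, F_b, F_ab):
    """Townsend's C(t) = log S_ab / (log S_a + log S_b)."""
    return math.log(1 - F_ab) / (math.log(1 - F_a) + math.log(1 - F_b))

def miller_bound_ok(F_a, F_b, F_ab):
    """Miller's race model inequality: F_ab(t) <= F_a(t) + F_b(t)."""
    return F_ab <= F_a + F_b

def grice_bound_ok(F_a, F_b, F_ab):
    """Grice's lower bound: F_ab(t) >= max(F_a(t), F_b(t))."""
    return F_ab >= max(F_a, F_b)

# An unlimited-capacity independent parallel race predicts
# S_ab = S_a * S_b, which yields C(t) = 1 exactly:
F_a, F_b = 0.3, 0.4
F_ab_ucip = 1 - (1 - F_a) * (1 - F_b)
print(round(capacity_or(F_a, F_b, F_ab_ucip), 6))  # 1.0
print(miller_bound_ok(F_a, F_b, F_ab_ucip), grice_bound_ok(F_a, F_b, F_ab_ucip))
```

Placing C(t) and both bounds on the same survivor-function scale is, in spirit, what the article's unified space does across the whole time course rather than at a single t.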
Polynomial algebra of discrete models in systems biology.
Veliz-Cuba, Alan; Jarrah, Abdul Salam; Laubenbacher, Reinhard
2010-07-01
An increasing number of discrete mathematical models are being published in Systems Biology, ranging from Boolean network models to logical models and Petri nets. They are used to model a variety of biochemical networks, such as metabolic networks, gene regulatory networks and signal transduction networks. There is increasing evidence that such models can capture key dynamic features of biological networks and can be used successfully for hypothesis generation. This article provides a unified framework that can aid the mathematical analysis of Boolean network models, logical models and Petri nets. They can be represented as polynomial dynamical systems, which allows the use of a variety of mathematical tools from computer algebra for their analysis. Algorithms are presented for the translation into polynomial dynamical systems. Examples are given of how polynomial algebra can be used for the model analysis. Contact: alanavc@vt.edu. Supplementary data are available at Bioinformatics online.
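The Boolean-to-polynomial translation mentioned above rests on three identities over the field F2 = {0, 1}: AND becomes multiplication, NOT becomes 1 + x, and OR becomes x + y + xy. A minimal sketch with an invented 2-node network:

```python
# A Boolean network rewritten as a polynomial dynamical system over F2:
# every logical update rule becomes a polynomial, evaluated mod 2.

def AND(x, y): return (x * y) % 2          # x AND y  ->  x*y
def OR(x, y):  return (x + y + x * y) % 2  # x OR y   ->  x + y + x*y
def NOT(x):    return (1 + x) % 2          # NOT x    ->  1 + x

def step(state):
    """Synchronous update of an invented 2-node network:
    x1' = NOT x2,  x2' = x1 OR x2."""
    x1, x2 = state
    return (NOT(x2), OR(x1, x2))

# Exhaustive state-space exploration (computer algebra tools do the
# analogous analysis symbolically); (0, 1) is a fixed point here.
for s in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(s, "->", step(s))
```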
Oakland and San Francisco Create Course Pathways through Common Core Mathematics. White Paper
ERIC Educational Resources Information Center
Daro, Phil
2014-01-01
The Common Core State Standards for Mathematics (CCSS-M) set rigorous standards for each of grades 6, 7 and 8. Strategic Education Research Partnership (SERP) has been working with two school districts, Oakland Unified School District and San Francisco Unified School District, to evaluate extant policies and practices and formulate new policies…
Standard representation and unified stability analysis for dynamic artificial neural network models.
Kim, Kwang-Ki K; Patrón, Ernesto Ríos; Braatz, Richard D
2018-02-01
An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, which include rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take additional information into account such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example shows reduced conservatism obtained by the conditions.
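The paper's stability certificates are linear matrix inequalities, which require a semidefinite-programming solver; as a far cruder stand-in, the sketch below checks a sufficient small-gain condition for a simple recurrent model x+ = W1 x + W2 tanh(x), using the fact that tanh is 1-Lipschitz. The weights and the model form are invented for illustration, and this test is much more conservative than the LMI conditions the paper develops.

```python
# Small-gain stability sketch for x+ = W1 @ x + W2 @ tanh(x): since tanh
# is 1-Lipschitz, ||W1||_2 + ||W2||_2 < 1 makes the update a contraction,
# so the origin is globally attractive (a sufficient, not necessary, test).
import numpy as np

def small_gain_stable(W1, W2):
    """Sufficient (conservative) contraction condition in spectral norm."""
    return np.linalg.norm(W1, 2) + np.linalg.norm(W2, 2) < 1.0

W1 = np.array([[0.2, 0.1, 0.0],
               [0.0, 0.3, 0.1],
               [0.1, 0.0, 0.2]])
W2 = 0.1 * np.eye(3)
print(small_gain_stable(W1, W2))        # True: the update contracts
print(small_gain_stable(3 * W1, W2))    # False: the test fails here

# Iterating the contractive model drives the state to the origin:
x = np.ones(3)
for _ in range(50):
    x = W1 @ x + W2 @ np.tanh(x)
print(np.linalg.norm(x) < 1e-6)         # True
```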
War-gaming application for future space systems acquisition
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.
2016-05-01
Recently the U.S. Department of Defense (DOD) released the Defense Innovation Initiative (DII) [1] to focus DOD on five key aspects: Aspect #1, recruit talented and innovative people; Aspect #2, reinvigorate war-gaming; Aspect #3, initiate long-range research and development programs; Aspect #4, make DOD practices more innovative; and Aspect #5, advance technology and new operational concepts. Per the DII, this paper concentrates on Aspect #2 and Aspect #4 by reinvigorating the war-gaming effort, with a focus on an innovative approach for developing the optimum Program and Technical Baselines (PTBs) and their corresponding optimum acquisition strategies for acquiring future space systems. The paper describes a unified approach for applying the war-gaming concept to future DOD acquisition of space systems. The proposed approach includes a Unified Game-based Acquisition Framework (UGAF) and an Advanced Game-Based Mathematical Framework (AGMF) using Bayesian war-gaming engines to optimize PTB solutions and select the corresponding optimum acquisition strategies for acquiring a space system. The framework defines the action space for all players with a complete description of the elements associated with the games, including the Department of Defense Acquisition Authority (DAA), stakeholders, warfighters, and potential contractors; War-Gaming Engines (WGEs) played by the DAA; WGEs played by a contractor (KTR); and the players' Payoff and Cost Functions (PCFs). The AGMF presented here addresses both complete and incomplete information cases. The proposed framework provides a recipe for the DAA and the USAF Space and Missile Systems Center (SMC) to acquire future space systems optimally.
Intelligent control of a planning system for astronaut training.
Ortiz, J; Chen, G
1999-07-01
This work intends to design, analyze and solve, from the systems control perspective, a complex, dynamic, and multiconstrained planning system for generating training plans for crew members of the NASA-led International Space Station. Various intelligent planning systems have been developed within the framework of artificial intelligence. These planning systems generally lack a rigorous mathematical formalism to allow a reliable and flexible methodology for their design, modeling, and performance analysis in a dynamical, time-critical, and multiconstrained environment. Formulating the planning problem in the domain of discrete-event systems under a unified framework such that it can be modeled, designed, and analyzed as a control system will provide a self-contained theory for such planning systems. This will also provide a means to certify various planning systems for operations in the dynamical and complex environments in space. The work presented here completes the design, development, and analysis of an intricate, large-scale, and representative mathematical formulation for intelligent control of a real planning system for Space Station crew training. This planning system has been tested and used at NASA-Johnson Space Center.
Backpropagation and ordered derivatives in the time scales calculus.
Seiffertt, John; Wunsch, Donald C
2010-08-01
Backpropagation is the most widely used neural network learning technique. It is based on the mathematical notion of an ordered derivative. In this paper, we present a formulation of ordered derivatives and the backpropagation training algorithm using the important emerging area of mathematics known as the time scales calculus. This calculus, with its potential for application to a wide variety of inter-disciplinary problems, is becoming a key area of mathematics. It is capable of unifying continuous and discrete analysis within one coherent theoretical framework. Using this calculus, we present here a generalization of backpropagation which is appropriate for cases beyond the specifically continuous or discrete. We develop a new multivariate chain rule of this calculus, define ordered derivatives on time scales, prove a key theorem about them, and derive the backpropagation weight update equations for a feedforward multilayer neural network architecture. By drawing together the time scales calculus and the area of neural network learning, we present the first connection of two major fields of research.
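In the discrete case the ordered derivative is just the familiar backward chain-rule recursion; a minimal sketch for an assumed one-hidden-layer network, checked against a finite difference:

```python
# Backward pass as ordered derivatives (the chain rule applied in reverse
# topological order); the network, data, and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))   # input -> hidden weights
W2 = rng.standard_normal((1, 3))   # hidden -> output weights
x = np.array([0.5, -0.2])
y = np.array([1.0])

def forward(W1, W2):
    h = np.tanh(W1 @ x)
    out = W2 @ h
    return 0.5 * np.sum((out - y) ** 2), h, out

L0, h, out = forward(W1, W2)
d_out = out - y                        # ordered derivative w.r.t. output
dW2 = np.outer(d_out, h)
d_h = W2.T @ d_out                     # propagate back through W2
dW1 = np.outer(d_h * (1 - h ** 2), x)  # ...and through tanh

# Finite-difference check of one weight derivative
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
num = (forward(W1p, W2)[0] - L0) / eps
```

The paper's contribution is to carry exactly this recursion over to arbitrary time scales, so that the continuous and discrete chain rules become special cases of one formula.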
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
2002-01-01
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. Key features: (1) integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework; (2) provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making; (3) emphasizes the role of mathematical modeling in the conduct of science and management; (4) utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples.
A unifying framework for marginalized random intercept models of correlated binary outcomes
Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.
2013-01-01
We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples are given to illustrate concepts. PMID:25342871
Olbert, Charles M; Gala, Gary J; Tupler, Larry A
2014-05-01
Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
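The combinatorial core of the framework is easy to reproduce for a single polythetic rule, e.g. a DSM-style "at least 5 of 9 symptoms" criterion (treating the nine criteria as atomic, which ignores the compound sub-criteria that presumably make the reported zero-overlap pairs possible in practice):

```python
# Count qualifying symptom combinations and the minimum possible overlap
# between two diagnosed individuals under an m-of-n polythetic rule.
from math import comb

n, m = 9, 5                        # e.g. major depression: any 5+ of 9
combos = sum(comb(n, k) for k in range(m, n + 1))
min_overlap = max(0, 2 * m - n)    # two m-subsets of n must share this many
print(combos, min_overlap)         # 256 qualifying combinations; overlap >= 1
```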
Theory of the Origin, Evolution, and Nature of Life
Andrulis, Erik D.
2011-01-01
Life is an inordinately complex unsolved puzzle. Despite significant theoretical progress, experimental anomalies, paradoxes, and enigmas have revealed paradigmatic limitations. Thus, the advancement of scientific understanding requires new models that resolve fundamental problems. Here, I present a theoretical framework that economically fits evidence accumulated from examinations of life. This theory is based upon a straightforward and non-mathematical core model and proposes unique yet empirically consistent explanations for major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations. The theoretical framework unifies the macrocosmic and microcosmic realms, validates predicted laws of nature, and solves the puzzle of the origin and evolution of cellular life in the universe. PMID:25382118
Chung, Moo K; Qiu, Anqi; Seo, Seongho; Vorperian, Houri K
2015-05-01
We present a novel kernel regression framework for smoothing scalar surface data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel constructed from the eigenfunctions, we formulate a new bivariate kernel regression framework as a weighted eigenfunction expansion with the heat kernel as the weights. The new kernel method is mathematically equivalent to isotropic heat diffusion, kernel smoothing and recently popular diffusion wavelets. The numerical implementation is validated on a unit sphere using spherical harmonics. As an illustration, the method is applied to characterize the localized growth pattern of mandible surfaces obtained in CT images between ages 0 and 20 by regressing the length of displacement vectors with respect to a surface template. Copyright © 2015 Elsevier B.V. All rights reserved.
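A toy numerical sketch of the weighted-eigenfunction idea, with a ring graph standing in for the anatomical surface (so the Laplace-Beltrami eigenfunctions become eigenvectors of a graph Laplacian; all parameters are illustrative):

```python
# Heat-kernel smoothing as an eigenfunction expansion with weights
# exp(-t * lambda_k), on a cycle graph instead of a surface mesh.
import numpy as np

n, t = 64, 2.0
I = np.eye(n)
L = 2 * I - np.roll(I, 1, axis=0) - np.roll(I, -1, axis=0)  # cycle Laplacian
lam, U = np.linalg.eigh(L)          # eigenvalues, orthonormal eigenvectors

rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * np.arange(n) / n)
noisy = signal + 0.3 * rng.standard_normal(n)

coeffs = U.T @ noisy                       # expand data in the eigenbasis
smooth = U @ (np.exp(-t * lam) * coeffs)   # heat-kernel-weighted resum
```

High-frequency modes (large lambda) are damped exponentially, which is why this expansion is equivalent to running isotropic heat diffusion for time t.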
Tropical geometry of statistical models.
Pachter, Lior; Sturmfels, Bernd
2004-11-16
This article presents a unified mathematical framework for inference in graphical models, building on the observation that graphical models are algebraic varieties. From this geometric viewpoint, observations generated from a model are coordinates of a point in the variety, and the sum-product algorithm is an efficient tool for evaluating specific coordinates. Here, we address the question of how the solutions to various inference problems depend on the model parameters. The proposed answer is expressed in terms of tropical algebraic geometry. The Newton polytope of a statistical model plays a key role. Our results are applied to the hidden Markov model and the general Markov model on a binary tree.
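The tropical viewpoint can be made concrete with a small HMM: evaluating the same sum-product recursion in the ordinary semiring (+, x) gives the marginal likelihood, while evaluating it in the tropical (min, +) semiring on negative log-parameters gives the Viterbi best-path score. The two-state model below is an assumed toy, not one from the paper:

```python
# The same recursion evaluated in two semirings: ordinary (forward
# algorithm) vs. tropical (Viterbi).
import math

T = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities
E = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities
pi = [0.5, 0.5]
obs = [0, 1, 1]

# Forward algorithm in the ordinary semiring
alpha = [pi[s] * E[s][obs[0]] for s in range(2)]
for o in obs[1:]:
    alpha = [sum(alpha[r] * T[r][s] for r in range(2)) * E[s][o]
             for s in range(2)]
total = sum(alpha)             # P(observations), summed over all state paths

# The same recursion, tropicalized on -log parameters
v = [-math.log(pi[s] * E[s][obs[0]]) for s in range(2)]
for o in obs[1:]:
    v = [min(v[r] - math.log(T[r][s]) for r in range(2)) - math.log(E[s][o])
         for s in range(2)]
best = min(v)                  # -log P of the single most likely path
```

Since the best path's probability is one term of the sum over all paths, exp(-best) <= total always holds.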
NASA Technical Reports Server (NTRS)
Zolotukhin, V. G.; Kolosov, B. I.; Usikov, D. A.; Borisenko, V. I.; Mosin, S. T.; Gorokhov, V. N.
1980-01-01
A description is presented of a batch of programs for the YeS-1040 computer, combined into an automated system for processing photographic (and video) images of the Earth's surface taken from spacecraft. Individual programs are presented with a detailed discussion of the algorithmic and programmatic facilities needed by the user. The basic principles for assembling the system, and the control programs, are included. The exchange format, within whose framework any programs recommended for the processing system will be cataloged in the future, is also described.
Discrete Mathematics across the Curriculum, K-12. 1991 Yearbook.
ERIC Educational Resources Information Center
Kenney, Margaret J., Ed.; Hirsch, Christian R., Ed.
This yearbook provides the mathematics education community with specific perceptions about discrete mathematics concerning its importance, its composition at various grade levels, and ideas about how to teach it. Many practical suggestions with respect to the implementation of a discrete mathematics school program are included. A unifying thread…
SECONDARY SCHOOL MATHEMATICS CURRICULUM IMPROVEMENT STUDY. FINAL REPORT.
ERIC Educational Resources Information Center
FEHR, HOWARD F.
This Secondary School Mathematics Curriculum Improvement Study (SSMCIS) group, composed of both American and European educators, was guided by two main objectives: (1) to construct and evaluate a unified secondary school mathematics program for grades 7-12 that would take the capable student well into current college mathematics, and (2) determine…
Toward Model Building for Visual Aesthetic Perception
Lughofer, Edwin; Zeng, Xianyi
2017-01-01
Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to simulate or quantify using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, quartet, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and brain activity involved in aesthetic appreciation. PMID:29270194
Common foundations of optimal control across the sciences: evidence of a free lunch
NASA Astrophysics Data System (ADS)
Russell, Benjamin; Rabitz, Herschel
2017-03-01
A common goal in the sciences is optimization of an objective function by selecting control variables such that a desired outcome is achieved. This scenario can be expressed in terms of a control landscape of an objective considered as a function of the control variables. At the most basic level, it is known that the vast majority of quantum control landscapes possess no traps, whose presence would hinder reaching the objective. This paper reviews and extends the quantum control landscape assessment, presenting evidence that the same highly favourable landscape features exist in many other domains of science. The implications of this broader evidence are discussed. Specifically, control landscape examples from quantum mechanics, chemistry and evolutionary biology are presented. Despite the obvious differences, commonalities between these areas are highlighted within a unified mathematical framework. This mathematical framework is driven by the wide-ranging experimental evidence on the ease of finding optimal controls (in terms of the required algorithmic search effort beyond the laboratory set-up overhead). The full scope and implications of this observed common control behaviour pose an open question for assessment in further work. This article is part of the themed issue 'Horizons of cybernetical physics'.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L2-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory in two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L2(R^n), the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Phase noise suppression for coherent optical block transmission systems: a unified framework.
Yang, Chuanchuan; Yang, Feng; Wang, Ziyu
2011-08-29
A unified framework for phase noise suppression is proposed in this paper, which can be applied to any coherent optical block transmission system, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM), coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE), etc. Based on adaptive modeling of phase noise, unified observation equations for different coherent optical block transmission systems are constructed, which lead to unified phase noise estimation and suppression. Numerical results demonstrate that the proposal is powerful in mitigating laser phase noise.
Secondary School Mathematics Curriculum Improvement Study Information Bulletin 7.
ERIC Educational Resources Information Center
Secondary School Mathematics Curriculum Improvement Study, New York, NY.
The background, objectives, and design of Secondary School Mathematics Curriculum Improvement Study (SSMCIS) are summarized. Details are given of the content of the text series, "Unified Modern Mathematics," in the areas of algebra, geometry, linear algebra, probability and statistics, analysis (calculus), logic, and computer…
Unraveling dynamics of human physical activity patterns in chronic pain conditions
NASA Astrophysics Data System (ADS)
Paraschiv-Ionescu, Anisoara; Buchser, Eric; Aminian, Kamiar
2013-06-01
Chronic pain is a complex disabling experience that negatively affects cognitive, affective and physical functions as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, understanding how pain affects individuals' daily-life behavior remains a challenging task. Here we develop a methodological framework for objectively documenting disruptive, pain-related interference with real-life physical activity. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns, and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain/disease-related behaviors and to establish a unified mathematical framework for quantifying the complex dynamics of various human activities.
NASA Astrophysics Data System (ADS)
Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.
2017-05-01
The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy, which combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach using Monte Carlo simulations, a numerical technique for accounting for uncertainty in decision making, which simulates the PTB development and acquisition processes, and details the implementation procedure and the interactions between the games.
Chung, Moo K.; Qiu, Anqi; Seo, Seongho; Vorperian, Houri K.
2014-01-01
We present a novel kernel regression framework for smoothing scalar surface data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel constructed from the eigenfunctions, we formulate a new bivariate kernel regression framework as a weighted eigenfunction expansion with the heat kernel as the weights. The new kernel regression is mathematically equivalent to isotropic heat diffusion, kernel smoothing and recently popular diffusion wavelets. Unlike many previous partial differential equation based approaches involving diffusion, our approach represents the solution of diffusion analytically, reducing numerical inaccuracy and slow convergence. The numerical implementation is validated on a unit sphere using spherical harmonics. As an illustration, we have applied the method in characterizing the localized growth pattern of mandible surfaces obtained in CT images from subjects between ages 0 and 20 years by regressing the length of displacement vectors with respect to the template surface. PMID:25791435
A Unified Framework for Analyzing and Designing for Stationary Arterial Networks
DOT National Transportation Integrated Search
2017-05-17
This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...
Computation of elementary modes: a unifying framework and the new binary approach
Gagneur, Julien; Klamt, Steffen
2004-01-01
Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. Recent years have seen a proliferation of algorithms dedicated to it, calling for a unifying point of view and continued improvement of the current methods. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, a software that is free for academics. The binary approach decreases the memory demand by up to 96% without loss of speed, giving the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
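The stated equivalence is easy to exercise on a toy network: the elementary modes of a stoichiometric matrix S are the extreme rays of the cone {v >= 0 : Sv = 0}, which for small systems can be found by brute force over candidate reaction supports (nothing like the paper's binary approach, which is what actually scales):

```python
# Elementary modes of an assumed toy network found as the extreme rays of
# {v >= 0 : S v = 0}, by brute force over candidate reaction supports.
# Toy network: R1: -> A,  R2: A -> B,  R3: B -> ,  R4: A ->
import itertools
import numpy as np

S = np.array([[1., -1., 0., -1.],   # metabolite A balance
              [0.,  1., -1., 0.]])  # metabolite B balance
n = S.shape[1]
modes = []
for k in range(1, n + 1):
    for supp in itertools.combinations(range(n), k):
        sub = S[:, supp]
        if k - np.linalg.matrix_rank(sub) != 1:
            continue                     # extreme rays have 1-D nullspaces
        v = np.linalg.svd(sub)[2][-1]    # basis vector of that nullspace
        if np.all(v <= 1e-9):
            v = -v                       # fix the arbitrary SVD sign
        if np.any(v < -1e-9) or np.any(np.abs(v) < 1e-9):
            continue                     # must be strictly positive on supp
        if any(set(int(i) for i in np.flatnonzero(m)) <= set(supp)
               for m in modes):
            continue                     # support must be minimal
        mode = np.zeros(n)
        mode[list(supp)] = v / v.max()
        modes.append(mode)
```

For this network the two elementary modes are R1, R2, R3 (through B) and R1, R4 (bypass), i.e. supports {0, 1, 2} and {0, 3}.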
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael
2018-05-01
Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
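A stripped-down sketch of the "parent Gaussian" strategy described above: simulate a Gaussian AR(1) process, then back-transform its marginal through the Gaussian CDF and a target inverse CDF (exponential here; rho and the marginal are illustrative, not the paper's parameterization):

```python
# A Gaussian AR(1) "parent" process pushed through its CDF and an inverse
# exponential CDF yields a correlated process with an exponential marginal.
import math
import random

random.seed(1)
rho, n = 0.8, 10000
z, series = 0.0, []
for _ in range(n):
    z = rho * z + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
    u = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # Gaussian CDF -> Uniform(0,1)
    series.append(-math.log(1 - u))             # inverse exponential CDF

mu = sum(series) / n
cov = sum((series[i] - mu) * (series[i + 1] - mu) for i in range(n - 1))
var = sum((s - mu) ** 2 for s in series)
lag1 = cov / var   # lag-1 autocorrelation inherited from the parent process
```

The back transformation preserves the marginal exactly, while the correlation of the output is a (slightly damped) function of the parent-Gaussian correlation; the paper's correlation transformation functions make that mapping explicit.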
Gapless edges of 2d topological orders and enriched monoidal categories
NASA Astrophysics Data System (ADS)
Kong, Liang; Zheng, Hao
2018-02-01
In this work, we give a mathematical description of a chiral gapless edge of a 2d topological order (without symmetry). We show that the observables on the 1+1D world sheet of such an edge consist of a family of topological edge excitations, boundary CFT's and walls between boundary CFT's. These observables can be described by a chiral algebra and an enriched monoidal category. This mathematical description automatically includes that of gapped edges as special cases. Therefore, it gives a unified framework to study both gapped and gapless edges. Moreover, the boundary-bulk duality also holds for gapless edges. More precisely, the unitary modular tensor category that describes the 2d bulk phase is exactly the Drinfeld center of the enriched monoidal category that describes the gapless/gapped edge. We propose a classification of all gapped and chiral gapless edges of a given bulk phase. In the end, we explain how modular-invariant bulk rational conformal field theories naturally emerge on certain gapless walls between two trivial phases.
Control of Distributed Parameter Systems
1990-08-01
variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a... Distribution unlimited. ABSTRACT: A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of
NASA Astrophysics Data System (ADS)
Anku, Sitsofe E.
1997-09-01
Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.
Universal Darwinism As a Process of Bayesian Inference.
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
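The claimed equivalence is a one-screen computation: a discrete replicator update with "fitness" taken to be the likelihood is term-for-term Bayes' rule (the numbers below are arbitrary):

```python
# Replicator dynamics with fitness = likelihood reproduces Bayes' rule.
prior = [0.5, 0.3, 0.2]       # population frequencies / prior P(h)
fitness = [0.9, 0.4, 0.1]     # likelihoods P(data | h)

evidence = sum(p * f for p, f in zip(prior, fitness))           # P(data)
posterior = [p * f / evidence for p, f in zip(prior, fitness)]  # Bayes

# Replicator update: frequencies grow in proportion to relative fitness
mean_fitness = sum(p * f for p, f in zip(prior, fitness))
replicator = [p * f / mean_fitness for p, f in zip(prior, fitness)]

assert all(abs(a - b) < 1e-12 for a, b in zip(posterior, replicator))
```

The evidence P(data) plays the role of the population's mean fitness, which is exactly the identification the abstract's frameworks exploit.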
Universal Darwinism As a Process of Bayesian Inference
Campbell, John O.
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an “experiment” in the external world environment, and the results of that “experiment” or the “surprise” entailed by predicted and actual outcomes of the “experiment.” Minimization of free energy implies that the implicit measure of “surprise” experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature. PMID:27375438
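The equivalence the abstract describes can be made concrete in a few lines: one round of natural selection on type frequencies has exactly the form of a Bayesian update, with relative fitness playing the role of the likelihood. This is a minimal sketch; the frequencies and fitness values are illustrative, not taken from the paper.

```python
import numpy as np

# Replicator update == Bayes' rule: posterior ∝ prior × likelihood,
# with relative fitness standing in for the likelihood.
def selection_as_bayes(freqs, fitness):
    """One generation of selection expressed as a Bayesian update."""
    posterior = freqs * fitness
    return posterior / posterior.sum()

prior = np.array([0.5, 0.3, 0.2])      # type frequencies (the "prior")
fitness = np.array([1.0, 2.0, 0.5])    # relative fitness (the "likelihood")
posterior = selection_as_bayes(prior, fitness)
```

The fitter type gains frequency exactly as a hypothesis with higher likelihood gains posterior probability.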
A model for calculating expected performance of the Apollo unified S-band (USB) communication system
NASA Technical Reports Server (NTRS)
Schroeder, N. W.
1971-01-01
A model for calculating the expected performance of the Apollo unified S-band (USB) communication system is presented. The general organization of the Apollo USB is described. The mathematical model is reviewed and the computer program for implementation of the calculations is included.
Unified Technical Concepts. Math for Technicians.
ERIC Educational Resources Information Center
Center for Occupational Research and Development, Inc., Waco, TX.
Unified Technical Concepts (UTC) is a modular system for teaching applied physics in two-year postsecondary technician programs. This UTC classroom textbook, consisting of 10 chapters, deals with mathematical concepts as they apply to the study of physics. Addressed in the individual chapters of the text are the following topics: angles and…
ERIC Educational Resources Information Center
Sriraman, Bharath, Ed.; Bergsten, Christer, Ed.; Goodchild, Simon, Ed.; Palsdottir, Gudbjorg, Ed.; Sondergaard, Bettina Dahl, Ed.; Haapasalo, Lenni, Ed.
2010-01-01
The First Sourcebook on Nordic Research in Mathematics Education: Norway, Sweden, Iceland, Denmark and contributions from Finland provides the first comprehensive and unified treatment of historical and contemporary research trends in mathematics education in the Nordic world. The book is organized in sections co-ordinated by active researchers in…
Using a Functional Model to Develop a Mathematical Formula
ERIC Educational Resources Information Center
Otto, Charlotte A.; Everett, Susan A.; Luera, Gail R.
2008-01-01
The unifying theme of models was incorporated into a required Science Capstone course for pre-service elementary teachers based on national standards in science and mathematics. A model of a teeter-totter was selected for use as an example of a functional model for gathering data as well as a visual model of a mathematical equation for developing…
Functions in the Secondary School Mathematics Curriculum
ERIC Educational Resources Information Center
Denbel, Dejene Girma
2015-01-01
Functions are used in every branch of mathematics, as algebraic operations on numbers, transformations on points in the plane or in space, intersection and union of pairs of sets, and so forth. Function is a unifying concept in all mathematics. Relationships among phenomena in everyday life, such as the relationship between the speed of a car and…
Applications of airborne ultrasound in human-computer interaction.
Dahl, Tobias; Ealo, Joao L; Bang, Hans J; Holm, Sverre; Khuri-Yakub, Pierre
2014-09-01
Airborne ultrasound is a rapidly developing subfield within human-computer interaction (HCI). Touchless ultrasonic interfaces and pen tracking systems are part of recent trends in HCI and are gaining industry momentum. This paper aims to provide the background and overview necessary to understand the capabilities of ultrasound and its potential future in human-computer interaction. The latest developments on the ultrasound transducer side are presented, focusing on capacitive micro-machined ultrasonic transducers, or CMUTs. Their introduction is an important step toward providing real, low-cost multi-sensor array and beam-forming options. We also provide a unified mathematical framework for understanding and analyzing algorithms used for ultrasound detection and tracking for some of the most relevant applications. Copyright © 2014. Published by Elsevier B.V.
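The core array-processing step behind the detection and tracking algorithms such a survey unifies is delay-and-sum beamforming: delay each sensor's signal by its geometric travel time from a candidate source position, then sum, so that the true position adds coherently. The array geometry, sample rate, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

c = 343.0          # speed of sound in air, m/s
fs = 192_000       # sample rate, Hz (illustrative)
mics = np.array([[0.0, 0.0], [0.05, 0.0], [0.10, 0.0]])  # 3-element line array

def steer_delays(source_xy):
    """Per-microphone arrival delays (in samples) for a candidate source."""
    dists = np.linalg.norm(mics - source_xy, axis=1)
    return np.round((dists - dists.min()) / c * fs).astype(int)

def delay_and_sum(signals, source_xy):
    """Align the channels toward source_xy and average them."""
    d = steer_delays(source_xy)
    n = signals.shape[1] - d.max()
    return sum(signals[i, di:di + n] for i, di in enumerate(d)) / len(mics)
```

Steering toward the true source yields a coherent peak; steering elsewhere leaves the channels misaligned and the output small.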
State estimation applications in aircraft flight-data analysis: A user's manual for SMACK
NASA Technical Reports Server (NTRS)
Bach, Ralph E., Jr.
1991-01-01
The evolution in the use of state estimation is traced for the analysis of aircraft flight data. A unifying mathematical framework for state estimation is reviewed, and several examples are presented that illustrate a general approach for checking instrument accuracy and data consistency, and for estimating variables that are difficult to measure. Recent applications associated with research aircraft flight tests and airline turbulence upsets are described. A computer program for aircraft state estimation is discussed in some detail. This document is intended to serve as a user's manual for the program called SMACK (SMoothing for AirCraft Kinematics). The diversity of the applications described emphasizes the potential advantages in using SMACK for flight-data analysis.
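The state-estimation idea the abstract reviews can be sketched with a forward Kalman filter that fuses a kinematic model with noisy measurements; SMACK itself is a smoother over full aircraft kinematics, so the constant-velocity model and noise levels below are illustrative assumptions only.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # position-velocity state transition
H = np.array([[1.0, 0.0]])              # only position is measured
Q = np.diag([1e-4, 1e-4])               # process noise covariance (illustrative)
R = np.array([[0.25]])                  # measurement noise covariance

def kalman_filter(measurements):
    x, P = np.zeros(2), np.eye(2)
    estimates = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                    # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
        x = x + K @ (z - H @ x)                          # measurement update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

rng = np.random.default_rng(1)
t = np.arange(200) * dt
measurements = t + 0.5 * rng.standard_normal(200)  # true velocity = 1
est = kalman_filter(measurements)
```

The filter recovers the unmeasured velocity from noisy positions, which is the sense in which state estimation "estimates variables that are difficult to measure."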
Truth-Valued-Flow Inference (TVFI) and its applications in approximate reasoning
NASA Technical Reports Server (NTRS)
Wang, Pei-Zhuang; Zhang, Hongmin; Xu, Wei
1993-01-01
The framework of the theory of Truth-Valued-Flow Inference (TVFI) is introduced. Even though dozens of papers have been presented on fuzzy reasoning, a rather unified fuzzy reasoning theory is still needed, one with the following two features: (1) it is simple enough to be executed feasibly and easily; and (2) it is sufficiently well-structured and consistent that it can be built into a strict mathematical theory consistent with the theory proposed by L.A. Zadeh. TVFI is one of the fuzzy reasoning theories that satisfies these two features. It presents inference in the form of networks and naturally views inference as a process of truth values flowing among propositions.
Deformation Theory and Physics Model Building
NASA Astrophysics Data System (ADS)
Sternheimer, Daniel
2006-08-01
The mathematical theory of deformations has proved to be a powerful tool in modeling physical reality. We start with a short historical and philosophical review of the context and concentrate this rapid presentation on a few interrelated directions where deformation theory is essential in bringing a new framework - which has then to be developed using adapted tools, some of which come from the deformation aspect. Minkowskian space-time can be deformed into Anti de Sitter, where massless particles become composite (also dynamically): this opens new perspectives in particle physics, at least at the electroweak level, including prediction of new mesons. Nonlinear group representations and covariant field equations, coming from interactions, can be viewed as some deformation of their linear (free) part: recognizing this fact can provide a good framework for treating problems in this area, in particular global solutions. Last but not least, (algebras associated with) classical mechanics (and field theory) on a Poisson phase space can be deformed to (algebras associated with) quantum mechanics (and quantum field theory). That is now a frontier domain in mathematics and theoretical physics called deformation quantization, with multiple ramifications, avatars and connections in both mathematics and physics. These include representation theory, quantum groups (when considering Hopf algebras instead of associative or Lie algebras), noncommutative geometry and manifolds, algebraic geometry, number theory, and of course what is regrouped under the name of M-theory. We shall here look at these from the unifying point of view of deformation theory and refer to a limited number of papers as a starting point for further study.
Pattern formation in mass conserving reaction-diffusion systems
NASA Astrophysics Data System (ADS)
Brauns, Fridtjof; Halatek, Jacob; Frey, Erwin
We present a rigorous theoretical framework able to generalize and unify pattern formation for quantitative mass conserving reaction-diffusion models. Mass redistribution controls chemical equilibria locally. Separation of diffusive mass redistribution on the level of conserved species provides a general mathematical procedure to decompose complex reaction-diffusion systems into effectively independent functional units, and to reveal the general underlying bifurcation scenarios. We apply this framework to Min protein pattern formation and identify the mechanistic roles of both involved protein species. MinD generates polarity through phase separation, whereas MinE takes the role of a control variable regulating the existence of MinD phases. Hence, polarization and not oscillations is the generic core dynamics of Min proteins in vivo. This establishes an intrinsic mechanistic link between the Min system and a broad class of intracellular pattern forming systems based on bistability and phase separation (wave-pinning). Oscillations are facilitated by MinE redistribution and can be understood mechanistically as relaxation oscillations of the polarization direction.
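The class of models covered can be illustrated by a two-component mass-conserving reaction-diffusion system in 1D: a membrane-bound species u and a fast-diffusing cytosolic species v exchange mass through a reaction term f, so the total mass of u + v is conserved exactly. The kinetics below are an illustrative wave-pinning-style attachment/detachment term, not the actual Min reaction network.

```python
import numpy as np

L, n, dt = 10.0, 100, 0.001
dx = L / n
Du, Dv = 0.01, 1.0            # slow membrane-bound vs fast cytosolic diffusion

def laplacian(a):
    """Periodic 1D Laplacian; its sum is zero, so diffusion conserves mass."""
    return (np.roll(a, 1) - 2 * a + np.roll(a, -1)) / dx**2

def step(u, v, k_on=1.0, k_off=0.5):
    f = k_on * v * u**2 / (1 + u**2) - k_off * u   # attachment - detachment
    u2 = u + dt * (Du * laplacian(u) + f)
    v2 = v + dt * (Dv * laplacian(v) - f)          # equal and opposite: u+v conserved
    return u2, v2

rng = np.random.default_rng(0)
u = 1.0 + 0.01 * rng.standard_normal(n)   # perturb a homogeneous steady state
v = np.full(n, 1.0)
mass0 = (u + v).sum() * dx
for _ in range(1000):
    u, v = step(u, v)
```

Because the reaction term enters u and v with opposite signs and the discrete Laplacian sums to zero, total mass is conserved to floating-point precision, which is the structural property the framework exploits.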
ERIC Educational Resources Information Center
McGee, Daniel; Moore-Russo, Deborah
2015-01-01
A test project at the University of Puerto Rico in Mayagüez used GeoGebra applets to promote the concept of multirepresentational fluency among high school mathematics preservice teachers. For this study, this fluency was defined as simultaneous awareness of all representations associated with a mathematical concept, as measured by the ability to…
Theoretical foundations of spatially-variant mathematical morphology part ii: gray-level images.
Bouaynaya, Nidhal; Schonfeld, Dan
2008-05-01
In this paper, we develop a spatially-variant (SV) mathematical morphology theory for gray-level signals and images in the Euclidean space. The proposed theory preserves the geometrical concept of the structuring function, which provides the foundation of classical morphology and is essential in signal and image processing applications. We define the basic SV gray-level morphological operators (i.e., SV gray-level erosion, dilation, opening, and closing) and investigate their properties. We demonstrate the ubiquity of SV gray-level morphological systems by deriving a kernel representation for a large class of systems, called V-systems, in terms of the basic SV gray-level morphological operators. A V-system is defined to be a gray-level operator, which is invariant under gray-level (vertical) translations. Particular attention is focused on the class of SV flat gray-level operators. The kernel representation for increasing V-systems is a generalization of Maragos' kernel representation for increasing and translation-invariant function-processing systems. A representation of V-systems in terms of their kernel elements is established for increasing and upper-semi-continuous V-systems. This representation unifies a large class of spatially-variant linear and non-linear systems under the same mathematical framework. Finally, simulation results show the potential power of the general theory of gray-level spatially-variant mathematical morphology in several image analysis and computer vision applications.
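The basic operator family the abstract defines can be sketched in 1D: a spatially-variant flat erosion is simply a local minimum taken over a window whose size depends on position. The signal and the position-dependent radius rule below are illustrative.

```python
import numpy as np

# Spatially-variant (SV) flat gray-level erosion: the structuring element's
# radius varies with position (here it grows toward the right end).
def sv_erosion(signal, radius_at):
    out = np.empty_like(signal)
    for i in range(len(signal)):
        r = radius_at(i)                       # position-dependent window radius
        lo, hi = max(0, i - r), min(len(signal), i + r + 1)
        out[i] = signal[lo:hi].min()           # flat erosion = local minimum
    return out

x = np.array([3, 5, 2, 8, 7, 4, 9, 1], dtype=float)
eroded = sv_erosion(x, radius_at=lambda i: 1 + i // 4)
```

As in the translation-invariant case, the SV erosion is anti-extensive (never exceeds the input); only the window now varies across the domain.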
Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum
ERIC Educational Resources Information Center
Rubenstein, Lisa DaVia; Ridgley, Lisa M.
2017-01-01
A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…
A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.
Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao
2017-06-16
This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability in addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
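The degree "trichotomy" can be illustrated with a toy growth simulation (not the paper's Markov chain model): attaching each new node to an existing node chosen uniformly yields an exponential-tailed degree distribution, while choosing proportionally to degree (preferential attachment) yields a power-law tail with much larger hubs.

```python
import random

random.seed(42)

def grow(n, preferential):
    """Grow a tree one node at a time; return the degree of every node."""
    targets = [0, 1]          # multiset: each node appears once per degree unit
    degree = [1, 1]
    for new in range(2, n):
        # Preferential: sample from the degree-weighted multiset.
        # Uniform: any existing node with equal probability.
        old = random.choice(targets) if preferential else random.randrange(new)
        degree[old] += 1
        degree.append(1)
        targets += [old, new]
    return degree

deg_powerlaw = grow(20_000, preferential=True)
deg_exponential = grow(20_000, preferential=False)
```

The two regimes have identical total degree (both are trees) but very different maxima, which is the tail behavior the unified framework captures in closed form.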
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Objective: Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods: We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results: Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions: We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850
Focus Group Research on the Implications of Adopting the Unified English Braille Code
ERIC Educational Resources Information Center
Wetzel, Robin; Knowlton, Marie
2006-01-01
Five focus groups explored concerns about adopting the Unified English Braille Code. The consensus was that while the proposed changes to the literary braille code would be minor, those to the mathematics braille code would be much more extensive. The participants emphasized that "any code that reduces the number of individuals who can access…
Studies of Braille Reading Rates and Implications for the Unified English Braille Code
ERIC Educational Resources Information Center
Wetzel, Robin; Knowlton, Marie
2006-01-01
Reading rate data was collected from both print and braille readers in the areas of mathematics and literary braille. Literary braille data was collected for contracted and uncontracted braille text with dropped whole-word contractions and part-word contractions as they would appear in the Unified English Braille Code. No significant differences…
Transferring Standard English Braille Skills to the Unified English Braille Code: A Pilot Study
ERIC Educational Resources Information Center
Steinman, Bernard A.; Kimbrough, B. T.; Johnson, Franklin; LeJeune, B. J.
2004-01-01
The enormously complex and sometimes controversial project to unify the traditional literary Braille code used in English-speaking countries with the technical and mathematical codes authorized by the Braille Authority of North America (BANA) and the Braille Authority of the United Kingdom (BAUK) proposes to change English Grade Two Braille on a…
A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine
Ramakrishnan, Sridhar; Wesensten, Nancy J.; Kamimori, Gary H.; Moon, James E.; Balkin, Thomas J.; Reifman, Jaques
2016-01-01
Study Objectives: Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. Methods: We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine) to the performance estimated in the absence of caffeine. We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). Results: The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (6% < error < 27%), yielding greater accuracy for mild and moderate sleep loss conditions than for more severe cases. Overall, accounting for the effects of caffeine resulted in improved predictions (after caffeine consumption) by up to 70%. Conclusions: The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. Citation: Ramakrishnan S, Wesensten NJ, Kamimori GH, Moon JE, Balkin TJ, Reifman J. A unified model of performance for predicting the effects of sleep and caffeine. SLEEP 2016;39(10):1827–1841. PMID:27397562
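The paper's central modeling move, multiplying a dose-dependent caffeine factor onto the caffeine-free performance prediction, can be sketched as follows. The pharmacokinetic form and all constants (ka, ke, m0) are illustrative assumptions, not the published UMP parameter values.

```python
import math

def caffeine_concentration(dose_mg, t_h, ka=2.0, ke=0.2):
    """One-compartment PK: concentration-like curve after an oral dose.
    ka = absorption rate, ke = elimination rate (illustrative, per hour)."""
    return dose_mg * (ka / (ka - ke)) * (math.exp(-ke * t_h) - math.exp(-ka * t_h))

def caffeine_factor(dose_mg, t_h, m0=0.002):
    """Multiplicative performance factor in (0, 1]; 1.0 means no caffeine effect."""
    return 1.0 / (1.0 + m0 * caffeine_concentration(dose_mg, t_h))

def predicted_lapses(baseline_lapses, dose_mg, t_h):
    """Caffeine scales the caffeine-free prediction (fewer lapses = better)."""
    return baseline_lapses * caffeine_factor(dose_mg, t_h)
```

The multiplicative structure is what lets the same factor modulate any sleep-loss schedule's baseline prediction, which is how the integrated model covers all 14 study conditions.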
Zhou, Xiang
2017-12-01
Linear mixed models (LMMs) are among the most commonly used tools for genetic association studies. However, the standard method for estimating variance components in LMMs, the restricted maximum likelihood estimation method (REML), suffers from several important drawbacks: REML requires individual-level genotypes and phenotypes from all samples in the study, is computationally slow, and produces downward-biased estimates in case-control studies. To remedy these drawbacks, we present an alternative framework for variance component estimation, which we refer to as MQS. MQS is based on the method of moments (MoM) and the minimal norm quadratic unbiased estimation (MINQUE) criterion, and brings two seemingly unrelated methods, the renowned Haseman-Elston (HE) regression and the recent LD score regression (LDSC), into the same unified statistical framework. With this new framework, we provide an alternative but mathematically equivalent form of HE that allows for the use of summary statistics. We provide an exact estimation form of LDSC to yield unbiased and statistically more efficient estimates. A key feature of our method is its ability to pair marginal z-scores computed using all samples with SNP correlation information computed using a small random subset of individuals (or individuals from a proper reference panel), while capable of producing estimates that can be almost as accurate as if both quantities were computed using the full data. As a result, our method produces unbiased and statistically efficient estimates, and makes use of summary statistics, while it is computationally efficient for large data sets. Using simulations and applications to 37 phenotypes from 8 real data sets, we illustrate the benefits of our method for estimating and partitioning SNP heritability in population studies as well as for heritability estimation in family studies. Our method is implemented in the GEMMA software package, freely available at www.xzlab.org/software.html.
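One of the two estimators the abstract unifies, Haseman-Elston regression, can be sketched on simulated data: regress cross-products of centered phenotypes on the off-diagonal entries of the genetic relatedness matrix (GRM) to estimate SNP heritability. The sample sizes, true heritability, and seed below are arbitrary toy choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, h2_true = 500, 1000, 0.5
geno = rng.standard_normal((n, p))                  # standardized genotypes
beta = rng.standard_normal(p) * np.sqrt(h2_true / p)
y = geno @ beta + rng.standard_normal(n) * np.sqrt(1 - h2_true)
y = (y - y.mean()) / y.std()                        # center and scale phenotype

grm = geno @ geno.T / p                             # genetic relatedness matrix
iu = np.triu_indices(n, k=1)                        # distinct pairs i < j only
# HE regression: E[y_i * y_j] = h2 * GRM_ij for i != j, so h2 is the
# least-squares slope of phenotype cross-products on GRM entries.
h2_est = (grm[iu] @ (y[iu[0]] * y[iu[1]])) / (grm[iu] @ grm[iu])
```

The same moment condition, rewritten with summary statistics, is what connects HE regression to LD score regression inside the MQS framework.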
ERIC Educational Resources Information Center
Vermont Univ., Burlington.
This book, written by classroom teachers, introduces the application of secondary school mathematics to space exploration, and is intended to unify science and mathematics. In early chapters geometric concepts are used with general concepts of space and rough approximations of space measurements. Later, these concepts are refined to include the…
The Mathematics of Starry Nights
ERIC Educational Resources Information Center
Barman, Farshad
2008-01-01
The mathematics for finding and plotting the locations of stars and constellations are available in many books on astronomy, but the steps involve mystifying and fragmented equations, calculations, and terminology. This paper will introduce an entirely new unified and cohesive technique that is easy to understand by mathematicians, and simple…
A Unified Mathematical Framework for Coding Time, Space, and Sequences in the Hippocampal Region
MacDonald, Christopher J.; Tiganj, Zoran; Shankar, Karthik H.; Du, Qian; Hasselmo, Michael E.; Eichenbaum, Howard
2014-01-01
The medial temporal lobe (MTL) is believed to support episodic memory, vivid recollection of a specific event situated in a particular place at a particular time. There is ample neurophysiological evidence that the MTL computes location in allocentric space and more recent evidence that the MTL also codes for time. Space and time represent a similar computational challenge; both are variables that cannot be simply calculated from the immediately available sensory information. We introduce a simple mathematical framework that computes functions of both spatial location and time as special cases of a more general computation. In this framework, experience unfolding in time is encoded via a set of leaky integrators. These leaky integrators encode the Laplace transform of their input. The information contained in the transform can be recovered using an approximation to the inverse Laplace transform. In the temporal domain, the resulting representation reconstructs the temporal history. By integrating movements, the equations give rise to a representation of the path taken to arrive at the present location. By modulating the transform with information about allocentric velocity, the equations code for position of a landmark. Simulated cells show a close correspondence to neurons observed in various regions for all three cases. In the temporal domain, novel secondary analyses of hippocampal time cells verified several qualitative predictions of the model. An integrated representation of spatiotemporal context can be computed by taking conjunctions of these elemental inputs, leading to a correspondence with conjunctive neural representations observed in dorsal CA1. PMID:24672015
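The framework's core mechanism can be sketched directly: a bank of leaky integrators with rate constants s obeys dF/dt = -s*F + f(t), so after the input ends each unit holds a sample of the Laplace transform of the input history. The discretization and constants below are illustrative.

```python
import numpy as np

dt = 0.01
s = np.linspace(1, 20, 200)        # decay rates: one leaky integrator per value
F = np.zeros_like(s)

def step(F, f_t):
    """One Euler step of every leaky integrator: dF/dt = -s*F + f(t)."""
    return F + dt * (-s * F + f_t)

# Present a brief pulse at the start, then let time pass.
for t in np.arange(0, 3, dt):
    f_t = 1.0 if 0.0 <= t < 0.1 else 0.0
    F = step(F, f_t)

# Each unit now holds roughly (pulse area) * exp(-s * elapsed_time): the
# profile of F over s encodes how long ago the pulse occurred, and an
# approximate inverse Laplace transform (not shown) recovers the timeline.
```

Modulating the input by velocity instead of time turns the same bank into a code for traversed path or landmark position, which is the "special cases of one computation" point of the abstract.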
Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model
ERIC Educational Resources Information Center
Helie, Sebastien; Sun, Ron
2010-01-01
This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…
Energy Transfer and a Recurring Mathematical Function
ERIC Educational Resources Information Center
Atkin, Keith
2013-01-01
This paper extends the interesting work of a previous contributor concerning the analogies between physical phenomena such as mechanical collisions and the transfer of power in an electric circuit. Emphasis is placed on a mathematical function linking these different areas of physics. This unifying principle is seen as an exciting opportunity to…
Identification of vortices in complex flows
NASA Astrophysics Data System (ADS)
Chakraborty, P.; Balachandar, S.; Adrian, R. J.
2007-12-01
Dating back to Leonardo da Vinci's famous sketches of vortices in turbulent flows, fluid dynamicists for over five centuries have continued to visualize and interpret complex flows in terms of the motion of vortices. Nevertheless, much debate surrounds the question of how to unambiguously define vortices in complex flows. This debate has resulted in the availability of many vortex identification criteria: mathematical statements of what constitutes a vortex. Here we review the popularly used local or point-wise vortex identification criteria. Based on local flow kinematics, we describe a unified framework to interpret the similarities and differences in the usage of these criteria. We discuss the limitations on the applicability of these criteria when there is a significant component of vortex interactions. Finally, we provide guidelines for applying these criteria to geophysical flows.
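One of the most widely used point-wise criteria of the kind reviewed here is the Q-criterion: a point lies inside a vortex when the rotation-rate tensor dominates the strain-rate tensor. It is computed from the local velocity gradient alone, which is what makes the criterion "local." The example tensors below are illustrative.

```python
import numpy as np

def q_criterion(grad_u):
    """Q = 0.5 * (||Omega||^2 - ||S||^2) from the 3x3 velocity gradient du_i/dx_j;
    Q > 0 flags rotation-dominated (vortical) points."""
    S = 0.5 * (grad_u + grad_u.T)         # strain-rate tensor (symmetric part)
    Omega = 0.5 * (grad_u - grad_u.T)     # rotation-rate tensor (antisymmetric part)
    return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

# Solid-body rotation about z: pure rotation, so Q > 0 (a vortex)...
rotation = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
# ...while a pure shear layer has Q = 0: rotation and strain balance exactly.
shear = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
```

The shear case is exactly the ambiguity the review discusses: a shear layer rotates fluid elements yet is not a vortex, and different criteria disagree near such regions.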
Embedding Quantum Mechanics Into a Broader Noncontextual Theory: A Conciliatory Result
NASA Astrophysics Data System (ADS)
Garola, Claudio; Sozzo, Sandro
2010-12-01
The extended semantic realism (ESR) model embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here an improved version of this model and show that it predicts that, whenever idealized measurements are performed, a modified Bell-Clauser-Horne-Shimony-Holt (BCHSH) inequality holds if one takes into account all individual systems that are prepared, standard quantum predictions hold if one considers only the individual systems that are detected, and a standard BCHSH inequality holds at a microscopic (purely theoretical) level. These results admit an intuitive explanation in terms of an unconventional kind of unfair sampling and constitute a first example of the unified perspective that can be attained by adopting the ESR model.
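For reference, the standard BCHSH inequality that the abstract contrasts with its modified, detection-conditioned variant (the exact ESR-modified form is not reproduced here) reads:

```latex
\left| E(a,b) - E(a,b') + E(a',b) + E(a',b') \right| \;\le\; 2
```

where $E(\cdot,\cdot)$ are correlation functions for measurement settings $a, a'$ and $b, b'$; standard quantum mechanics allows values of the left-hand side up to $2\sqrt{2}$ (the Tsirelson bound).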
ERIC Educational Resources Information Center
Holbrook, M. Cay; MacCuspie, P. Ann
2010-01-01
Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…
ERIC Educational Resources Information Center
Center for Mental Health in Schools at UCLA, 2005
2005-01-01
This report was developed to highlight the current state of affairs and illustrate the value of a unifying framework and integrated infrastructure for the many initiatives, projects, programs, and services schools pursue in addressing barriers to learning and promoting healthy development. Specifically, it highlights how initiatives can be…
Toward a unifying framework for evolutionary processes.
Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora
2015-10-21
The theory of population genetics and the field of evolutionary computation have been evolving separately for nearly 30 years. Many results have been independently obtained in both fields, and many others are unique to their respective fields. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
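The decomposition idea can be sketched as a pipeline of interchangeable operators: a genetic algorithm and a population-genetics model then differ only in which selection and variation components are plugged in. The operators, parameters, and the OneMax test problem below are illustrative choices, not the paper's formalism.

```python
import random

random.seed(1)

def tournament_selection(pop, fitness, k=2):
    """Selection component: best of k randomly sampled individuals."""
    return max(random.sample(pop, k), key=fitness)

def point_mutation(genome, rate=0.1):
    """Variation component: flip each bit independently with given rate."""
    return [(1 - g) if random.random() < rate else g for g in genome]

def evolve(pop, fitness, generations=50):
    """The evolutionary process as a composition of the two components."""
    for _ in range(generations):
        pop = [point_mutation(tournament_selection(pop, fitness))
               for _ in range(len(pop))]
    return pop

# OneMax (fitness = number of ones): a standard benchmark in both fields.
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
final = evolve(pop, fitness=sum)
```

Swapping `tournament_selection` for fitness-proportional sampling, or `point_mutation` for recombination, moves the same pipeline between the evolutionary regimes the paper classifies.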
A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China
NASA Astrophysics Data System (ADS)
Xie, Lei; Zhang, Hong; Li, Hongzhong; Wang, Chao
2015-04-01
The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP) and compact polarimetric (CP) modes. The TerraSAR-X data acquired over the Leizhou Peninsula, South China are used in our experiments. The study site involves four main crops (rice, banana, sugarcane, and eucalyptus). By exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for the FP and coherent HH/VV modes and about 91% for the CP mode, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart Maximum Likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.
Traffic Flow - USMES Teacher Resource Book. Fourth Edition. Trial Edition.
ERIC Educational Resources Information Center
Keskulla, Jean
This Unified Sciences and Mathematics for Elementary Schools (USMES) unit challenges students to improve traffic flow at a problem location. The challenge is general enough to apply to many problem-solving situations in mathematics, science, social science, and language arts at any elementary school level (grades 1-8). The Teacher Resource Book…
Pedestrian Crossings - USMES Teacher Resource Book. Fifth Edition. Trial Edition.
ERIC Educational Resources Information Center
Keskulla, Jean
This Unified Sciences and Mathematics for Elementary Schools (USMES) unit challenges students to improve the safety and convenience of a pedestrian crossing near a school. The challenge is general enough to apply to many problem-solving situations in mathematics, science, social science, and language arts at any elementary school level (grades…
Applications of Dirac's Delta Function in Statistics
ERIC Educational Resources Information Center
Khuri, Andre
2004-01-01
The Dirac delta function has been used successfully in mathematical physics for many years. The purpose of this article is to bring attention to several useful applications of this function in mathematical statistics. Some of these applications include a unified representation of the distribution of a function (or functions) of one or several…
Protecting Property - USMES Teacher Resource Book. First Edition. Trial Edition.
ERIC Educational Resources Information Center
Bussey, Margery Koo
This Unified Sciences and Mathematics for Elementary Schools (USMES) unit challenges students to find good ways to protect property (property in desks or lockers; animals; bicycles; tools). The challenge is general enough to apply to many problem-solving situations in mathematics, science, social science, and language arts at any elementary school…
ERIC Educational Resources Information Center
Fehr, Howard F.
1970-01-01
Describes an experimental study attempting to construct a unified school mathematics curriculum for grades seven through twelve. The study was initiated in 1965 and was planned as a six-year effort. The total program includes, in the following order, syllabus writing, conferences, writing of experimental textbook, education of classroom teachers, pilot class…
Manufacturing - USMES Teacher Resource Book. Second Edition. Trial Edition.
ERIC Educational Resources Information Center
Agro, Sally
This Unified Sciences and Mathematics for Elementary Schools (USMES) unit challenges students to find the best way to produce an item in quantities needed. The challenge is general enough to apply to many problem-solving situations in mathematics, science, social science, and language arts at any elementary school level (grades 1-8). The Teacher…
Mathematics: PROJECT DESIGN. Educational Needs, Fresno, 1968, Number 12.
ERIC Educational Resources Information Center
Smart, James R.
This report examines and summarizes the needs in mathematics of the Fresno City school system. The study is one in a series of needs assessment reports for PROJECT DESIGN, an ESEA Title III project administered by the Fresno City Unified School District. Theoretical concepts, rather than computational drill, would be emphasized in the proposed…
The Unified Behavior Framework for the Simulation of Autonomous Agents
2015-03-01
1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. A recently developed … The development of autonomy has … room for research by utilizing methods like simulation and modeling that consume less time and fewer monetary resources. A recently developed reactive…
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background: When chemical health hazards have been identified, probabilistic dose–response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives: We developed a unified framework for probabilistic dose–response assessment. Methods: We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results: Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions: Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
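The probabilistic logic of the abstract above (summarizing a target human dose by an uncertainty interval rather than a single reference value) can be sketched with Monte Carlo draws. The distribution and every parameter below are purely synthetic stand-ins, not values from the paper.

```python
import math
import random
import statistics

random.seed(2)

# Synthetic Monte Carlo draws of a "target human dose" HD_MI (written
# "HDMI" in the abstract), combining dose-response fit uncertainty with
# human variability into one distribution (units hypothetical).
hd_samples = sorted(random.lognormvariate(math.log(1.0), 0.9)
                    for _ in range(10_000))

lower = hd_samples[int(0.05 * len(hd_samples))]  # 5th percentile
upper = hd_samples[int(0.95 * len(hd_samples))]  # 95th percentile
fold_range = upper / lower   # CI width as a fold factor, as reported above
median_hd = statistics.median(hd_samples)
```

A risk manager would then pick an exposure limit from the lower tail of this distribution at the desired percent confidence.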
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
A guide to phylogenetic metrics for conservation, community ecology and macroecology.
Tucker, Caroline M; Cadotte, Marc W; Carvalho, Silvia B; Davies, T Jonathan; Ferrier, Simon; Fritz, Susanne A; Grenyer, Rich; Helmus, Matthew R; Jin, Lanna S; Mooers, Arne O; Pavoine, Sandrine; Purschke, Oliver; Redding, David W; Rosauer, Dan F; Winter, Marten; Mazel, Florent
2017-05-01
The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub-disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub-disciplines hampers potential meta-analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo-diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo-diversity metrics based on their mathematical form within these three dimensions and identify 'anchor' representatives: for α-diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. © 2016 The Authors. 
Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
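The three "anchor" metrics named in the abstract above map directly onto small computations. The sketch below uses invented toy data (a pairwise distance table and a list of branch lengths) just to show what each dimension measures.

```python
import statistics

# Toy data: pairwise phylogenetic distances among four taxa and the
# branch lengths of the tree spanning them (all numbers invented).
pairwise = {
    ("a", "b"): 2.0, ("a", "c"): 6.0, ("a", "d"): 6.0,
    ("b", "c"): 6.0, ("b", "d"): 6.0, ("c", "d"): 4.0,
}
branch_lengths = [1.0, 1.0, 2.0, 2.0, 1.0, 3.0]

pd_faith = sum(branch_lengths)                # PD: "how much" (richness)
mpd = statistics.mean(pairwise.values())      # MPD: "how different" (divergence)
vpd = statistics.variance(list(pairwise.values()))  # VPD: "how regular"
```

Each metric answers one of the three questions (how much; how different; how regular), which is why the framework treats them as orthogonal dimensions rather than substitutes.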
Representing Thoughts, Words, and Things in the UMLS
Campbell, Keith E.; Oliver, Diane E.; Spackman, Kent A.; Shortliffe, Edward H.
1998-01-01
The authors describe a framework, based on the Ogden-Richards semiotic triangle, for understanding the relationship between the Unified Medical Language System (UMLS) and the source terminologies from which the UMLS derives its content. They pay particular attention to UMLS's Concept Unique Identifier (CUI) and the sense of “meaning” it represents as contrasted with the sense of “meaning” represented by the source terminologies. The CUI takes on emergent meaning through linkage to terms in different terminology systems. In some cases, a CUI's emergent meaning can differ significantly from the original sources' intended meanings of terms linked by that CUI. Identification of these different senses of meaning within the UMLS is consistent with historical themes of semantic interpretation of language. Examination of the UMLS within such a historical framework makes it possible to better understand the strengths and limitations of the UMLS approach for integrating disparate terminologic systems and to provide a model, or theoretic foundation, for evaluating the UMLS as a Possible World—that is, as a mathematical formalism that represents propositions about some perspective or interpretation of the physical world. PMID:9760390
Designing for Human Proportions - USMES Teacher Resource Book. Fourth Edition. Trial Edition.
ERIC Educational Resources Information Center
Bussey, Margery Koo
Designing or making changes in things students use or wear is the challenge of this Unified Sciences and Mathematics for Elementary Schools (USMES) unit. The challenge is general enough to apply to many problem-solving situations in mathematics, science, social science, and language arts at any elementary school level (grades 1-8). The Teacher…
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.
This document is part 2 of the workbook for kindergarten pupils in the Comprehensive Instructional Management System, a unified mathematics curriculum for kindergarten through grade 7. Each objective is developed by a variety of strategies, with mastery of objectives diagnosed through a testing component. The activities in the student workbook are…
A Categorization Model for Educational Values of the History of Mathematics. An Empirical Study
NASA Astrophysics Data System (ADS)
Wang, Xiao-qin; Qi, Chun-yan; Wang, Ke
2017-11-01
There is no clear consensus on a categorization framework for the educational values of the history of mathematics. By analyzing 20 Chinese teaching cases on integrating the history of mathematics into mathematics teaching based on the relevant literature, this study examined a new categorization framework of the educational values of the history of mathematics by combining the objectives of the high school mathematics curriculum in China. This framework includes six dimensions: the harmony of knowledge, the beauty of ideas or methods, the pleasure of inquiries, the improvement of capabilities, the charm of cultures, and the availability of moral education. The results show that this framework better explains all the educational values of the history of mathematics exhibited across the teaching cases. Therefore, the framework can guide teachers to better integrate the history of mathematics into teaching.
Loop transfer recovery for general nonminimum phase discrete time systems. I - Analysis
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Sannuti, Peddapullaiah; Shamash, Yacov
1992-01-01
A complete analysis of loop transfer recovery (LTR) for general nonstrictly proper, not necessarily minimum-phase discrete-time systems is presented. Three different observer-based controllers are used: a 'prediction estimator' based controller and full- and reduced-order 'current estimator' based controllers. The analysis corresponding to all three controllers is unified into a single mathematical framework. The LTR analysis given here focuses on three fundamental issues: (1) the recoverability of a target loop when it is arbitrarily given; (2) the recoverability of a target loop while taking into account its specific characteristics; and (3) the establishment of necessary and sufficient conditions on the given system so that it has at least one recoverable target loop transfer function or sensitivity function. Various differences that arise in the LTR analysis of continuous and discrete systems are pointed out.
Gartner, Daniel; Padman, Rema
2017-01-01
In this paper, we describe the development of a unified framework and a digital workbench for the strategic, tactical and operational hospital management plan driven by information technology and analytics. The workbench can be used not only by multiple stakeholders in the healthcare delivery setting, but also for pedagogical purposes on topics such as healthcare analytics, services management, and information systems. This tool combines the three classical hierarchical decision-making levels in one integrated environment. At each level, several decision problems can be chosen. Extensions of mathematical models from the literature are presented and incorporated into the digital platform. In a case study using real-world data, we demonstrate how we used the workbench to inform strategic capacity planning decisions in a multi-hospital, multi-stakeholder setting in the United Kingdom.
Automated speech understanding: the next generation
NASA Astrophysics Data System (ADS)
Picone, J.; Ebel, W. J.; Deshmukh, N.
1995-04-01
Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, DSP now relies on sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and providing completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.
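The "unified statistical framework" referred to above is, at its core, hidden-Markov-model decoding. As an illustrative sketch only (the toy states, observations, and probabilities below are invented and unrelated to any deployed recognizer), here is a minimal Viterbi decoder, the dynamic program at the heart of such systems:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    # Dynamic-programming search for the most probable hidden-state path.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Trace back from the best final state.
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Classic two-state toy HMM (hypothetical numbers).
states = ("Healthy", "Fever")
start = {"Healthy": 0.6, "Fever": 0.4}
trans = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
         "Fever": {"Healthy": 0.4, "Fever": 0.6}}
emit = {"Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
        "Fever": {"normal": 0.1, "cold": 0.3, "dizzy": 0.6}}
path = viterbi(["normal", "cold", "dizzy"], states, start, trans, emit)
```

In a recognizer the hidden states would be phones or words and the observations acoustic feature vectors, but the decoding logic is the same.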
A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine.
Ramakrishnan, Sridhar; Wesensten, Nancy J; Kamimori, Gary H; Moon, James E; Balkin, Thomas J; Reifman, Jaques
2016-10-01
Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine) to the performance estimated in the absence of caffeine. We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (6% < error < 27%), yielding greater accuracy for mild and moderate sleep loss conditions than for more severe cases. Overall, accounting for the effects of caffeine resulted in improved predictions (after caffeine consumption) by up to 70%. The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. © 2016 Associated Professional Sleep Societies, LLC.
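The abstract's key modeling move, multiplying a dose-dependent caffeine factor into the caffeine-free performance prediction, can be sketched as follows. The functional form and every parameter value here are invented placeholders, not the published UMP equations.

```python
import math

def caffeine_factor(dose_mg, t_hr, amp=0.003, half_life_hr=5.0):
    # Illustrative form only: the benefit grows with dose and wanes as
    # caffeine is eliminated (simple exponential decay).
    k = math.log(2.0) / half_life_hr
    return 1.0 / (1.0 + amp * dose_mg * math.exp(-k * t_hr))

def predicted_deficit(baseline_deficit, dose_mg, t_hr):
    # The multiplicative hypothesis: caffeine scales the caffeine-free
    # performance-deficit prediction (factor <= 1 reduces the deficit).
    return baseline_deficit * caffeine_factor(dose_mg, t_hr)
```

With dose 0 the factor is exactly 1, so the model reduces to the caffeine-free UMP prediction, which is what makes the extension modular.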
Alimohammadi, Mona; Pichardo-Almarza, Cesar; Agu, Obiekezie; Díaz-Zuccarini, Vanessa
2017-01-01
Atherogenesis, the formation of plaques in the wall of blood vessels, starts as a result of lipid accumulation (low-density lipoprotein cholesterol) in the vessel wall. Such accumulation is related to the site of endothelial mechanotransduction, the endothelial response to mechanical stimuli and haemodynamics, which determines biochemical processes regulating the vessel wall permeability. This interaction between biomechanical and biochemical phenomena is complex, spanning different biological scales and is patient-specific, requiring tools able to capture such mathematical and biological complexity in a unified framework. Mathematical models offer an elegant and efficient way of doing this, by taking into account multifactorial and multiscale processes and mechanisms, in order to capture the fundamentals of plaque formation in individual patients. In this study, a mathematical model to understand plaque and calcification locations is presented: this model provides a strong interpretability and physical meaning through a multiscale, complex index or metric (the penetration site of low-density lipoprotein cholesterol, expressed as volumetric flux). Computed tomography scans of the aortic bifurcation and iliac arteries are analysed and compared with the results of the multifactorial model. The results indicate that the model shows potential to predict the majority of the plaque locations, also not predicting regions where plaques are absent. The promising results from this case study provide a proof of concept that can be applied to a larger patient population. PMID:28427316
Bavassi, M Luz; Tagliazucchi, Enzo; Laje, Rodrigo
2013-02-01
Time processing in the few hundred milliseconds range is involved in the human skill of sensorimotor synchronization, like playing music in an ensemble or finger tapping to an external beat. In finger tapping, a mechanistic explanation in biologically plausible terms of how the brain achieves synchronization is still missing despite considerable research. In this work we show that nonlinear effects are important for the recovery of synchronization following a perturbation (a step change in stimulus period), even for perturbation magnitudes smaller than 10% of the period, which is well below the amount of perturbation needed to evoke other nonlinear effects like saturation. We build a nonlinear mathematical model for the error correction mechanism and test its predictions, and further propose a framework that allows us to unify the description of the three common types of perturbations. While previous authors have used two different model mechanisms for fitting different perturbation types, or have fitted different parameter value sets for different perturbation magnitudes, we propose the first unified description of the behavior following all perturbation types and magnitudes as the dynamical response of a compound model with fixed terms and a single set of parameter values. Copyright © 2012 Elsevier B.V. All rights reserved.
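For context on the abstract above: the standard baseline that such nonlinear models extend is first-order linear error correction, in which a step change in stimulus period injects an asynchrony that decays geometrically. The sketch below is that linear baseline only (the paper's point is that extra nonlinear terms are needed); the parameter values are hypothetical.

```python
def simulate_step_change(alpha, step_ms, n_taps):
    # Baseline linear error correction: after a step change of step_ms
    # in the period, the asynchrony decays as e <- (1 - alpha) * e.
    errors = [float(step_ms)]
    for _ in range(n_taps - 1):
        errors.append((1.0 - alpha) * errors[-1])
    return errors

trajectory = simulate_step_change(alpha=0.3, step_ms=50.0, n_taps=12)
```

In this linear model the recovery shape is identical (up to scale) for every perturbation magnitude; the deviations from that prediction are what motivate the nonlinear terms.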
A Unified Mathematical Approach to Image Analysis.
1987-08-31
Describes four instances of the paradigm in detail. Directions for ongoing and future research are also indicated. Keywords: Image processing; Algorithms; Segmentation; Boundary detection; Tomography; Global image analysis.
A Framework for Examining Teachers' Noticing of Mathematical Cognitive Technologies
ERIC Educational Resources Information Center
Smith, Ryan; Shin, Dongjo; Kim, Somin
2017-01-01
In this paper, we propose the mathematical cognitive technology noticing framework for examining how mathematics teachers evaluate, select, and modify mathematical cognitive technology to use in their classrooms. Our framework is based on studies of professional and curricular noticing and data collected in a study that explored how secondary…
Statistical Teleodynamics: Toward a Theory of Emergence.
Venkatasubramanian, Venkat
2017-10-24
The central scientific challenge of the 21st century is developing a mathematical theory of emergence that can explain and predict phenomena such as consciousness and self-awareness. The most successful research program of the 20th century, reductionism, which goes from the whole to parts, seems unable to address this challenge. This is because addressing this challenge inherently requires the opposite approach, going from parts to the whole. In addition, reductionism, by the very nature of its inquiry, typically does not concern itself with teleology or purposeful behavior. Modeling emergence, in contrast, requires addressing teleology. Together, these two requirements present a formidable challenge in developing a successful mathematical theory of emergence. In this article, I describe a new theory of emergence, called statistical teleodynamics, that addresses certain aspects of the general problem. Statistical teleodynamics is a mathematical framework that unifies three seemingly disparate domains (purpose-free entities in statistical mechanics, human-engineered teleological systems in systems engineering, and nature-evolved teleological systems in biology and sociology) within the same conceptual formalism. This theory rests on several key conceptual insights, the most important being the recognition that entropy mathematically models the concept of fairness in economics and philosophy and, equivalently, the concept of robustness in systems engineering. These insights help prove that the fairest inequality of income is a log-normal distribution, which will emerge naturally at equilibrium in an ideal free-market society. Similarly, the theory predicts the emergence of the three classes of network organization (exponential, scale-free, and Poisson) seen widely in a variety of domains.
Statistical teleodynamics is the natural generalization of statistical thermodynamics, the most successful parts-to-whole systems theory to date, but this generalization is only a modest step toward a more comprehensive mathematical theory of emergence.
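The log-normal income claim can be illustrated (though not proved) by a Gibrat-style simulation of multiplicative growth. Note this is a generic textbook mechanism that also yields a log-normal, not the paper's entropy-maximization derivation, and all parameters below are invented.

```python
import math
import random

random.seed(1)

def multiplicative_growth(n_agents=2000, n_steps=200, sigma=0.05):
    # Repeated multiplicative shocks: products of many positive random
    # factors push incomes toward a log-normal distribution.
    incomes = [1.0] * n_agents
    for _ in range(n_steps):
        incomes = [x * math.exp(random.gauss(0.0, sigma)) for x in incomes]
    return incomes

log_incomes = [math.log(x) for x in multiplicative_growth()]
# If the incomes are log-normal, their logarithms are roughly normal.
```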
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
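The link between loss functions and estimators described above can be shown in a few lines: under squared-error loss, the action minimizing posterior expected loss is the posterior mean. The posterior draws below are synthetic, chosen only to illustrate the mechanics.

```python
import random
import statistics

random.seed(0)

# Hypothetical posterior draws for an unknown parameter.
posterior = [random.gauss(3.0, 1.0) for _ in range(10_000)]

def expected_loss(action, samples, loss):
    # SDT: evaluate an action by its posterior expected loss.
    return sum(loss(action, s) for s in samples) / len(samples)

def squared(a, s):
    return (a - s) ** 2

post_mean = statistics.mean(posterior)
risk_at_mean = expected_loss(post_mean, posterior, squared)
risk_off_mean = expected_loss(post_mean + 0.5, posterior, squared)
```

Swapping `squared` for absolute-error loss would instead favor the posterior median, which is exactly how the choice of loss function encodes the decision problem.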
Introductory science and mathematics education for 21st-Century biologists.
Bialek, William; Botstein, David
2004-02-06
Galileo wrote that "the book of nature is written in the language of mathematics"; his quantitative approach to understanding the natural world arguably marks the beginning of modern science. Nearly 400 years later, the fragmented teaching of science in our universities still leaves biology outside the quantitative and mathematical culture that has come to define the physical sciences and engineering. This strikes us as particularly inopportune at a time when opportunities for quantitative thinking about biological systems are exploding. We propose that a way out of this dilemma is a unified introductory science curriculum that fully incorporates mathematics and quantitative thinking.
Measuring the Mathematical Quality of Instruction
ERIC Educational Resources Information Center
Journal of Mathematics Teacher Education, 2011
2011-01-01
In this article, we describe a framework and instrument for measuring the mathematical quality of mathematics instruction. In describing this framework, we argue for the separation of the "mathematical quality of instruction" (MQI), such as the absence of mathematical errors and the presence of sound mathematical reasoning, from pedagogical…
An Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.
Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C
2016-01-01
Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model and also produces a multiscale representation which allows one to progressively simplify, or regularize, all skeleton types. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, next to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface or curve skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.
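The multiscale step described above (thresholding a density field to simplify the skeleton) reduces to a one-line operation once the field is computed. The density values below are invented; computing the real transported-mass field is the hard part the paper addresses.

```python
def simplify_skeleton(density, tau):
    # Threshold the (precomputed) transported-mass density field:
    # larger tau keeps only the more important skeleton points.
    return {p for p, rho in density.items() if rho >= tau}

# Hypothetical density values on a few skeleton points.
density = {(0, 0): 9.0, (1, 0): 4.5, (2, 0): 1.2, (3, 0): 0.3}
coarse = simplify_skeleton(density, tau=4.0)
fine = simplify_skeleton(density, tau=1.0)
```

Raising the threshold can only remove points, so the skeletons at increasing tau are nested, which is what makes the representation a well-behaved multiscale family.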
Massachusetts Adult Basic Education Curriculum Framework for Mathematics and Numeracy
ERIC Educational Resources Information Center
Massachusetts Department of Education, 2005
2005-01-01
Over the past several years, a number of initiatives have set the stage for writing the Massachusetts ABE (Adult Basic Education) Curriculum Frameworks for Mathematics and Numeracy. This current version of the "Massachusetts ABE Mathematics Curriculum Frameworks" is a second revision of that first framework, but it is heavily influenced by…
ERIC Educational Resources Information Center
Artzt, Alice F.; Armour-Thomas, Eleanor
The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…
3D Fluid-Structure Interaction Simulation of Aortic Valves Using a Unified Continuum ALE FEM Model.
Spühler, Jeannette H; Jansson, Johan; Jansson, Niclas; Hoffman, Johan
2018-01-01
Due to advances in medical imaging, computational fluid dynamics algorithms and high-performance computing, computer simulation is developing into an important tool for understanding the relationship between cardiovascular diseases and intraventricular blood flow. The field of cardiac flow simulation is challenging and highly interdisciplinary. We apply a computational framework for automated solutions of partial differential equations using Finite Element Methods, in which any mathematical description can be translated directly to code. This allows us to develop a cardiac model where specific properties of the heart, such as fluid-structure interaction of the aortic valve, can be added in a modular way without extensive effort. In previous work, we simulated the blood flow in the left ventricle of the heart. In this paper, we extend this model by placing prototypes of both a native and a mechanical aortic valve in the outflow region of the left ventricle. Numerical simulation of the blood flow in the vicinity of the valve offers the possibility to improve the treatment of aortic valve diseases such as aortic stenosis (narrowing of the valve opening) or regurgitation (leaking), and to optimize the design of prosthetic heart valves in a controlled and specific way. The fluid-structure interaction and contact problem are formulated in a unified continuum model using the conservation laws for mass and momentum and a phase function. The discretization is based on an Arbitrary Lagrangian-Eulerian space-time finite element method with streamline diffusion stabilization, and it is implemented in the open-source software Unicorn, which shows near-optimal scaling up to thousands of cores. Computational results are presented to demonstrate the capability of our framework. PMID:29713288
2012-01-01
Background: Chaos Game Representation (CGR) is an iterated function that bijectively maps discrete sequences into a continuous domain. As a result, discrete sequences can be the object of statistical and topological analyses otherwise reserved to numerical systems. Characteristically, CGR coordinates of substrings sharing an L-long suffix will be located within 2^-L distance of each other. In the two decades since its original proposal, CGR has been generalized beyond its original focus on genomic sequences and has been successfully applied to a wide range of problems in bioinformatics. This report explores the possibility that it can be further extended to approach algorithms that rely on discrete, graph-based representations. Results: The exploratory analysis described here consisted of selecting foundational string problems and refactoring them using CGR-based algorithms. We found that CGR can take the role of suffix trees and emulate sophisticated string algorithms, efficiently solving exact and approximate string matching problems such as finding all palindromes and tandem repeats, and matching with mismatches. The common feature of these problems is that they use longest common extension (LCE) queries as subtasks of their procedures, which we show to have a constant-time solution with CGR. Additionally, we show that CGR can be used as a rolling hash function within the Rabin-Karp algorithm. Conclusions: The analysis of biological sequences relies on algorithmic foundations facing mounting challenges, both logistic (performance) and analytical (lack of a unifying mathematical framework). CGR is found to provide the latter and to promise the former: graph-based data structures for sequence analysis operations are entailed by the numerical data structures produced by CGR maps, providing a unifying analytical framework for a diversity of pattern matching problems. PMID:22551152
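As a hedged sketch of the iterated map described in this abstract (the corner assignment below is an assumed convention; it varies across the CGR literature), the quoted suffix property follows because each shared symbol halves whatever distance the differing prefixes produced:

```python
# Corner assignment for the classic genomic CGR unit square (an assumed
# convention; papers differ on which base maps to which corner).
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr(seq, start=(0.5, 0.5)):
    """CGR trajectory: each point is the midpoint between the previous
    point and the corner of the current symbol (an iterated function)."""
    x, y = start
    points = []
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points

# Sequences sharing an L-long suffix land within 2^-L of each other
# (per coordinate): each shared symbol halves the remaining distance.
p = cgr("AAAAGATTACA")[-1]   # suffix "GATTACA", L = 7
q = cgr("CCCCGATTACA")[-1]
assert max(abs(p[0] - q[0]), abs(p[1] - q[1])) <= 2 ** -7
```

The contraction by 1/2 per symbol is what lets CGR coordinates stand in for suffix-tree lookups in the LCE-based algorithms the abstract mentions.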
Extreme values and fat tails of multifractal fluctuations
NASA Astrophysics Data System (ADS)
Muzy, J. F.; Bacry, E.; Kozhemyak, A.
2006-06-01
In this paper we discuss the problem of the estimation of extreme event occurrence probability for data drawn from some multifractal process. We also study the heavy (power-law) tail behavior of probability density function associated with such data. We show that because of strong correlations, the standard extreme value approach is not valid and classical tail exponent estimators should be interpreted cautiously. Extreme statistics associated with multifractal random processes turn out to be characterized by non-self-averaging properties. Our considerations rely upon some analogy between random multiplicative cascades and the physics of disordered systems and also on recent mathematical results about the so-called multifractal formalism. Applied to financial time series, our findings allow us to propose a unified framework that accounts for the observed multiscaling properties of return fluctuations, the volatility clustering phenomenon and the observed “inverse cubic law” of the return pdf tails.
Fulop, Sean A; Fitz, Kelly
2006-01-01
A modification of the spectrogram (log magnitude of the short-time Fourier transform) to more accurately show the instantaneous frequencies of signal components was first proposed in 1976 [Kodera et al., Phys. Earth Planet. Inter. 12, 142-150 (1976)], and has been considered or reinvented a few times since but never widely adopted. This paper presents a unified theoretical picture of this time-frequency analysis method, the time-corrected instantaneous frequency spectrogram, together with detailed implementable algorithms comparing three published techniques for its computation. The new representation is evaluated against the conventional spectrogram for its superior ability to track signal components. The lack of a uniform framework for either mathematics or implementation details which has characterized the disparate literature on the schemes has been remedied here. Fruitful application of the method is shown in the realms of speech phonation analysis, whale song pitch tracking, and additive sound modeling.
Physiological utility theory and the neuroeconomics of choice
Glimcher, Paul W.; Dorris, Michael C.; Bayer, Hannah M.
2006-01-01
Over the past half century economists have responded to the challenges that Allais [Econometrica (1953) 53], Ellsberg [Quart. J. Econ. (1961) 643] and others raised to neoclassicism either by bounding the reach of economic theory or by turning to descriptive approaches. While both of these strategies have been enormously fruitful, neither has provided a clear programmatic approach that aspires to a complete understanding of human decision making as did neoclassicism. There is, however, growing evidence that economists and neurobiologists are now beginning to reveal the physical mechanisms by which the human neuroarchitecture accomplishes decision making. Although in their infancy, these studies suggest both a single unified framework for understanding human decision making and a methodology for constraining the scope and structure of economic theory. Indeed, there is already evidence that these studies place mathematical constraints on existing economic models. This article reviews some of those constraints and suggests the outline of a neuroeconomic theory of decision. PMID:16845435
ERIC Educational Resources Information Center
Kollosche, David
2016-01-01
Socio-political studies in mathematics education often touch complex fields of interaction between education, mathematics and the political. In this paper I present a Foucault-based framework for socio-political studies in mathematics education which may guide research in that area. In order to show the potential of such a framework, I discuss the…
Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin
2016-01-01
Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking them to the nurse and then to impacts on personal resilience and workplace outcomes; its use has the potential to increase staff retention and the quality of patient care. PMID:27242567
NASA Astrophysics Data System (ADS)
Everingham, Yvette L.; Gyuris, Emma; Connolly, Sean R.
2017-11-01
Contemporary science educators must equip their students with the knowledge and practical know-how to connect multiple disciplines like mathematics, computing and the natural sciences to gain a richer and deeper understanding of a scientific problem. However, many biology and earth science students are prejudiced against mathematics due to negative emotions like high mathematical anxiety and low mathematical confidence. Here, we present a theoretical framework that investigates linkages between student engagement, mathematical anxiety, mathematical confidence, student achievement and subject mastery. We implement this framework in a large, first-year interdisciplinary science subject and monitor its impact over several years from 2010 to 2015. The implementation of the framework coincided with an easing of anxiety and enhanced confidence, as well as higher student satisfaction, retention and achievement. The framework offers interdisciplinary science educators greater flexibility and confidence in their approach to designing and delivering subjects that rely on mathematical concepts and practices.
A Unified Framework for Association Analysis with Multiple Related Phenotypes
Stephens, Matthew
2013-01-01
We consider the problem of assessing associations between multiple related outcome variables, and a single explanatory variable of interest. This problem arises in many settings, including genetic association studies, where the explanatory variable is genotype at a genetic variant. We outline a framework for conducting this type of analysis, based on Bayesian model comparison and model averaging for multivariate regressions. This framework unifies several common approaches to this problem, and includes both standard univariate and standard multivariate association tests as special cases. The framework also unifies the problems of testing for associations and explaining associations – that is, identifying which outcome variables are associated with genotype. This provides an alternative to the usual, but conceptually unsatisfying, approach of resorting to univariate tests when explaining and interpreting significant multivariate findings. The method is computationally tractable genome-wide for modest numbers of phenotypes (e.g. 5–10), and can be applied to summary data, without access to raw genotype and phenotype data. We illustrate the methods both on simulated examples and on a genome-wide association study of blood lipid traits, where we identify 18 potential novel genetic associations that were not identified by univariate analyses of the same data. PMID:23861737
Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J
2017-08-04
There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into a unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.
Chiu, Weihsueh A; Slob, Wout
2015-12-01
When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
TIMSS Advanced 2008 Assessment Frameworks
ERIC Educational Resources Information Center
Garden, Robert A.; Lie, Svein; Robitaille, David F.; Angell, Carl; Martin, Michael O.; Mullis, Ina V.S.; Foy, Pierre; Arora, Alka
2006-01-01
Developing the Trends in International Mathematics and Science Study (TIMSS) Advanced 2008 Assessment Frameworks was a collaborative venture involving mathematics and physics experts from around the world. The document contains two frameworks for implementing TIMSS Advanced 2008--one for advanced mathematics and one for physics. It also contains…
A Framework for Examining How Mathematics Teachers Evaluate Technology
ERIC Educational Resources Information Center
Smith, Ryan C.; Shin, Dongjo; Kim, Somin
2016-01-01
Our mathematics cognitive technology noticing framework is based on professional noticing and curricular noticing frameworks and data collected in a study that explored how secondary mathematics teachers evaluate technology. Our participants displayed three categories of noticing: attention to features of technology, interpretation of the…
NASA Astrophysics Data System (ADS)
Bogdanov, Alexander; Khramushin, Vasily
2016-02-01
The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.
Communicational Perspectives on Learning and Teaching Mathematics: Prologue
ERIC Educational Resources Information Center
Tabach, Michal; Nachlieli, Talli
2016-01-01
This special issue comprises five studies which vary in their focus and mathematical content, yet they all share an underlying communicational theoretical framework--commognition. Within this framework, learning mathematics is defined as a change in one's mathematical discourse, that is, in the form of communication known as mathematical. Teaching…
Oakes, J M; Feldman, H A
2001-02-01
Nonequivalent controlled pretest-posttest designs are central to evaluation science, yet no practical and unified approach for estimating power in the two most widely used analytic approaches to these designs exists. This article fills the gap by presenting and comparing useful, unified power formulas for ANCOVA and change-score analyses, indicating the implications of each on sample-size requirements. The authors close with practical recommendations for evaluators. Mathematical details and a simple spreadsheet approach are included in appendices.
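The difference in sample-size requirements the abstract alludes to can be illustrated with a hedged sketch using the standard large-sample error variances for a two-group pretest-posttest comparison (change scores have residual SD σ√(2(1−ρ)), ANCOVA σ√(1−ρ²)); these are textbook formulas, not necessarily the authors' exact expressions, and all numbers below are illustrative assumptions:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_group(delta, sd, n_per_group, z_crit=1.959964):
    """Normal-approximation power for a two-sample mean comparison
    (alpha = 0.05, two-sided; z_crit is the hardcoded normal quantile)."""
    se = sd * math.sqrt(2.0 / n_per_group)
    return phi(abs(delta) / se - z_crit)

# Illustrative inputs: treatment effect delta, outcome SD sigma,
# pretest-posttest correlation rho, and per-group sample size n.
delta, sigma, rho, n = 5.0, 10.0, 0.6, 50

sd_change = sigma * math.sqrt(2 * (1 - rho))   # change-score analysis
sd_ancova = sigma * math.sqrt(1 - rho ** 2)    # ANCOVA residual SD

# ANCOVA has the smaller residual variance, hence more power here.
assert power_two_group(delta, sd_ancova, n) > power_two_group(delta, sd_change, n)
```

With these inputs ANCOVA needs fewer subjects than the change-score analysis for the same target power, which is the practical implication the article compares.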
A unified framework for approximation in inverse problems for distributed parameter systems
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
A theoretical framework is presented that can be used to treat approximation techniques for very general classes of parameter estimation problems involving distributed systems that are either first or second order in time. Using the approach developed, one can obtain both convergence and stability (continuous dependence of parameter estimates with respect to the observations) under very weak regularity and compactness assumptions on the set of admissible parameters. This unified theory can be used for many problems found in the recent literature and in many cases offers significant improvements to existing results.
In Search of a Unified Model of Language Contact
ERIC Educational Resources Information Center
Winford, Donald
2013-01-01
Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…
The Alberta K-9 Mathematics Program of Studies with Achievement Indicators
ERIC Educational Resources Information Center
Alberta Education, 2007
2007-01-01
The "Alberta K-9 Mathematics Program of Studies with Achievement Indicators" has been derived from "The Common Curriculum Framework for K-9 Mathematics: Western and Northern Canadian Protocol," May 2006 (the Common Curriculum Framework). The program of studies incorporates the conceptual framework for Kindergarten to Grade 9…
Metacognition, Positioning and Emotions in Mathematical Activities
ERIC Educational Resources Information Center
Daher, Wajeeh; Anabousy, Ahlam; Jabarin, Roqaya
2018-01-01
Researchers of mathematics education have been paying attention to the affective aspect of learning mathematics for more than a decade. Different theoretical frameworks have been suggested to analyze this aspect; in the present research we utilize the discursive framework of Evans, Morgan and Tsatsaroni. This framework makes it possible to link…
ERIC Educational Resources Information Center
Hole, Arne; Grønmo, Liv Sissel; Onstad, Torgeir
2018-01-01
Background: This paper discusses a framework for analyzing the dependence on mathematical theory in test items, that is, a framework for discussing to what extent knowledge of mathematical theory is helpful for the student in solving the item. The framework can be applied to any test in which some knowledge of mathematical theory may be useful,…
Topological framework for local structure analysis in condensed matter
Lazar, Emanuel A.; Han, Jian; Srolovitz, David J.
2015-01-01
Physical systems are frequently modeled as sets of points in space, each representing the position of an atom, molecule, or mesoscale particle. As many properties of such systems depend on the underlying ordering of their constituent particles, understanding that structure is a primary objective of condensed matter research. Although perfect crystals are fully described by a set of translation and basis vectors, real-world materials are never perfect, as thermal vibrations and defects introduce significant deviation from ideal order. Meanwhile, liquids and glasses present yet more complexity. A complete understanding of structure thus remains a central, open problem. Here we propose a unified mathematical framework, based on the topology of the Voronoi cell of a particle, for classifying local structure in ordered and disordered systems that is powerful and practical. We explain the underlying reason why this topological description of local structure is better suited for structural analysis than continuous descriptions. We demonstrate the connection of this approach to the behavior of physical systems and explore how crystalline structure is compromised at elevated temperatures. We also illustrate potential applications to identifying defects in plastically deformed polycrystals at high temperatures, automating analysis of complex structures, and characterizing general disordered systems. PMID:26460045
Chen, Zhe; Purdon, Patrick L.; Brown, Emery N.; Barbieri, Riccardo
2012-01-01
In recent years, time-varying inhomogeneous point process models have been introduced for assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model’s statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second-order non-linearities, with subsequent bispectral quantification in the frequency domain, further allows for definition of instantaneous metrics of non-linearity. We here present a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing at our mathematical approach as a promising monitoring tool for an accurate, non-invasive assessment in clinical practice. We also discuss the limitations and other alternative modeling strategies of our point process approach. PMID:22375120
A Unified Matrix Polynomial Approach to Modal Identification
NASA Astrophysics Data System (ADS)
Allemang, R. J.; Brown, D. L.
1998-04-01
One important current focus of modal identification is a reformulation of modal parameter estimation algorithms into a single, consistent mathematical formulation with a corresponding set of definitions and unifying concepts. Particularly, a matrix polynomial approach is used to unify the presentation with respect to current algorithms such as the least-squares complex exponential (LSCE), the polyreference time domain (PTD), Ibrahim time domain (ITD), eigensystem realization algorithm (ERA), rational fraction polynomial (RFP), polyreference frequency domain (PFD) and the complex mode indication function (CMIF) methods. Using this unified matrix polynomial approach (UMPA) allows a discussion of the similarities and differences of the commonly used methods. The use of least squares (LS), total least squares (TLS), double least squares (DLS) and singular value decomposition (SVD) methods is discussed in order to take advantage of redundant measurement data. Eigenvalue and SVD transformation methods are utilized to reduce the effective size of the resulting eigenvalue-eigenvector problem as well.
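The matrix-polynomial idea behind the LSCE-type methods named in this abstract can be sketched, in a deliberately minimal single-mode form: noiseless free-decay samples of one damped mode satisfy a second-order linear recurrence whose characteristic roots are the discrete-time poles. All parameter values below are assumptions for illustration, not from the article:

```python
import numpy as np

# Synthetic free-decay response of a single damped mode (assumed values).
dt = 0.01                              # sample interval [s]
sigma, omega_d = 2.0, 40.0             # decay rate [1/s], damped freq [rad/s]
k = np.arange(200)
h = np.exp(-sigma * k * dt) * np.cos(omega_d * k * dt)

# One mode satisfies h[k] + a1*h[k-1] + a2*h[k-2] = 0; estimate (a1, a2)
# by least squares over all samples: the simplest matrix-polynomial fit.
A = np.column_stack([h[1:-1], h[:-2]])
a1, a2 = np.linalg.lstsq(A, -h[2:], rcond=None)[0]

# Characteristic roots of z^2 + a1*z + a2 are the discrete-time poles;
# the complex log maps them back to continuous-time decay and frequency.
lam = np.log(np.roots([1.0, a1, a2]).astype(complex)) / dt
assert np.allclose(np.sort(np.abs(lam.imag)), [omega_d, omega_d], atol=1e-6)
assert np.allclose(-lam.real, [sigma, sigma], atol=1e-6)
```

The full UMPA formulation generalizes this scalar recurrence to matrix coefficients over multiple references and higher polynomial orders, which is how it subsumes PTD, ERA and the other listed algorithms.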
A Reconceptualized Framework for "Opportunity to Learn" in School Mathematics
ERIC Educational Resources Information Center
Walkowiak, Temple A.; Pinter, Holly H.; Berry, Robert Q.
2017-01-01
We present a reconceptualized framework for opportunity to learn (OTL) in school mathematics that builds on previous conceptualizations of OTL and includes features related to both quantity (i.e., time) and quality. Our framework draws on existing literature and on our own observational research of mathematics teaching practices. Through the…
A Framework for Authenticity in the Mathematics and Statistics Classroom
ERIC Educational Resources Information Center
Garrett, Lauretta; Huang, Li; Charleton, Maria Calhoun
2016-01-01
Authenticity is a term commonly used in reference to pedagogical and curricular qualities of mathematics teaching and learning, but its use lacks a coherent framework. The work of researchers in engineering education provides such a framework. Authentic qualities of mathematics teaching and learning are fit within a model described by Strobel,…
Stam, Henderikus J.
2015-01-01
The search for a so-called unified or integrated theory has long served as a goal for some psychologists, even if the search is often implicit. But if the established sciences do not have an explicitly unified set of theories, then why should psychology? After examining this question again I argue that psychology is in fact reasonably unified around its methods and its commitment to functional explanations, an indeterminate functionalism. The question of the place of the neurosciences in this framework is complex. On the one hand, the neuroscientific project will not likely renew and synthesize the disparate arms of psychology. On the other hand, their reformulation of what it means to be human will exert an influence in multiple ways. One way to capture that influence is to conceptualize the brain in terms of a technology that we interact with in a manner that we do not yet fully understand. In this way we maintain both a distance from neuro-reductionism and refrain from committing to an unfettered subjectivity. PMID:26500571
ERIC Educational Resources Information Center
Polaki, Mokaeane Victor
2005-01-01
It is a well-known fact that the idea of function plays a unifying role in the development of mathematical concepts. Yet research has shown that many students do not understand it adequately even though they have experienced a great deal of success in performing a plethora of operations on function, and on using functions to solve various types of…
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
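Loss functions are central to SDT, as the abstract notes. A minimal sketch (not from the paper; the posterior and losses are illustrative) of Bayesian point estimation as a decision problem: the Bayes estimate is the action minimizing posterior expected loss, which yields the posterior mean under squared-error loss and the posterior median under absolute-error loss.

```python
import numpy as np

# Grid approximation of an illustrative posterior: Beta(4, 2).
theta = np.linspace(0.001, 0.999, 501)
post = theta**3 * (1.0 - theta)   # unnormalized Beta(4, 2) density
post /= post.sum()

def bayes_estimate(loss):
    """Return the point estimate (action) minimizing posterior expected loss."""
    expected_loss = [(post * loss(theta, a)).sum() for a in theta]
    return theta[int(np.argmin(expected_loss))]

def squared_error(t, a):   # -> Bayes estimate is the posterior mean
    return (t - a) ** 2

def absolute_error(t, a):  # -> Bayes estimate is the posterior median
    return np.abs(t - a)

mean_est = bayes_estimate(squared_error)     # ~ 2/3 for Beta(4, 2)
median_est = bayes_estimate(absolute_error)  # ~ 0.686 for Beta(4, 2)
```

The same machinery extends to applied decisions such as the fire-rotation example: replace `theta` by the candidate management actions and `loss` by the management cost of each action under each system state.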
NASA Astrophysics Data System (ADS)
Abdi, Daniel S.; Giraldo, Francis X.
2016-09-01
A unified approach for the numerical solution of the 3D hyperbolic Euler equations using high order methods, namely continuous Galerkin (CG) and discontinuous Galerkin (DG) methods, is presented. First, we examine how classical CG that uses a global storage scheme can be constructed within the DG framework using constraint imposition techniques commonly used in the finite element literature. Then, we implement and test a simplified version in the Non-hydrostatic Unified Model of the Atmosphere (NUMA) for the case of explicit time integration and a diagonal mass matrix. Constructing CG within the DG framework allows CG to benefit from the desirable properties of DG, such as easier hp-refinement and better stability. Moreover, this representation allows for regional mixing of CG and DG depending on the flow regime in an area. The different flavors of CG and DG in the unified implementation are then tested for accuracy and performance using a suite of benchmark problems representative of cloud-resolving scale, meso-scale and global-scale atmospheric dynamics. The value of our unified approach is that we are able to show how to carry both CG and DG methods within the same code and also offer a simple recipe for modifying an existing CG code to DG and vice versa.
U.S. History Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…
Applying Laban's Movement Framework in Elementary Physical Education
ERIC Educational Resources Information Center
Langton, Terence W.
2007-01-01
This article recommends raising the bar in elementary physical education by using Laban's movement framework to develop curriculum content in the areas of games, gymnastics, and dance (with physical fitness concepts blended in) in order to help students achieve the NASPE content standards. The movement framework can permeate and unify an…
Depth of Teachers' Knowledge: Frameworks for Teachers' Knowledge of Mathematics
ERIC Educational Resources Information Center
Holmes, Vicki-Lynn
2012-01-01
This article describes seven teacher knowledge frameworks and relates these frameworks to the teaching and assessment of elementary teacher's mathematics knowledge. The frameworks classify teachers' knowledge and provide a vocabulary and common language through which knowledge can be discussed and assessed. These frameworks are categorized into…
TIMSS 2007 Assessment Frameworks
ERIC Educational Resources Information Center
Mullis, Ina V. S.; Martin, Michael O.; Ruddock, Graham J.; O'Sullivan, Christine Y.; Arora, Alka; Erberber, Ebru
2005-01-01
Developing the Trends in International Mathematics and Science Study (TIMSS) 2007 Assessment Frameworks represents an extensive collaborative effort involving individuals and expert groups from more than 60 countries around the world. The document contains three frameworks for implementing TIMSS 2007--the Mathematics Framework, the Science…
NASA Astrophysics Data System (ADS)
Tuminaro, Jonathan
Many introductory, algebra-based physics students perform poorly on mathematical problem solving tasks in physics. There are at least two possible, distinct reasons for this poor performance: (1) students simply lack the mathematical skills needed to solve problems in physics, or (2) students do not know how to apply the mathematical skills they have to particular problem situations in physics. While many students do lack the requisite mathematical skills, a major finding from this work is that the majority of students possess the requisite mathematical skills, yet fail to use or interpret them in the context of physics. In this thesis I propose a theoretical framework to analyze and describe students' mathematical thinking in physics. In particular, I attempt to answer two questions. What are the cognitive tools involved in formal mathematical thinking in physics? And, why do students make the kinds of mistakes they do when using mathematics in physics? According to the proposed theoretical framework there are three major theoretical constructs: mathematical resources, which are the knowledge elements that are activated in mathematical thinking and problem solving; epistemic games, which are patterns of activities that use particular kinds of knowledge to create new knowledge or solve a problem; and frames, which are structures of expectations that determine how individuals interpret situations or events. The empirical basis for this study comes from videotaped sessions of college students solving homework problems. The students are enrolled in an algebra-based introductory physics course. The videotapes were transcribed and analyzed using the aforementioned theoretical framework. 
Two important results from this work are: (1) the construction of a theoretical framework that offers researchers a vocabulary (ontological classification of cognitive structures) and grammar (relationship between the cognitive structures) for understanding the nature and origin of mathematical use in the context of physics, and (2) a detailed understanding, in terms of the proposed theoretical framework, of the errors that students make when using mathematics in the context of physics.
Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille
2004-07-01
One of the main issues in Systems Biology is to deal with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) to describe and specify biological systems and processes. This yields unambiguous representations of biological systems that would be suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems' behaviours.
ERIC Educational Resources Information Center
Grimm, C. A.
This document contains two units that examine integral transforms and series expansions. In the first module, the user is expected to learn how to use the unified method presented to obtain Laplace transforms, Fourier transforms, complex Fourier series, real Fourier series, and half-range sine series for given piecewise continuous functions. In…
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
NASA Astrophysics Data System (ADS)
Fasni, N.; Turmudi, T.; Kusnandi, K.
2017-09-01
The background of this research is the importance of students' problem-solving abilities. The purpose of this study is to find out whether there are differences in the ability to solve mathematical problems between students who have learned mathematics using Ang’s Framework for Mathematical Modelling Instruction (AFFMMI) and students who have learned using a scientific approach (SA). The method used in this research is a quasi-experimental method with pretest-posttest control group design. Data on mathematical problem-solving ability were analysed using an Independent Samples Test. The results showed that there was a difference in the ability to solve mathematical problems between students who received learning with Ang’s Framework for Mathematical Modelling Instruction and students who received learning with a scientific approach. AFFMMI focuses on mathematical modeling. This modeling allows students to solve problems. The use of AFFMMI improves problem-solving ability.
ERIC Educational Resources Information Center
Carter, Merilyn; Cooper, Tom; Anderson, Robyn
2016-01-01
This paper describes the pedagogical framework used by YuMi Deadly Maths, a school change process used to improve mathematics teaching and thus enhance employment and life chances for socially disadvantaged students. The framework, called the RAMR cycle, is capable of being used by mathematics teachers for planning and delivering lessons and units…
How Do Mathematicians Learn Math?: Resources and Acts for Constructing and Understanding Mathematics
ERIC Educational Resources Information Center
Wilkerson-Jerde, Michelle H.; Wilensky, Uri J.
2011-01-01
In this paper, we present an analytic framework for investigating expert mathematical learning as the process of building a "network of mathematical resources" by establishing relationships between different components and properties of mathematical ideas. We then use this framework to analyze the reasoning of ten mathematicians and mathematics…
ERIC Educational Resources Information Center
Adler, Jill; Ronda, Erlina
2015-01-01
We describe and use an analytical framework to document mathematics discourse in instruction (MDI), and interpret differences in mathematics teaching. MDI is characterised by four interacting components in the teaching of a mathematics lesson: exemplification (occurring through a sequence of examples and related tasks), explanatory talk (talk that…
Growth in Mathematical Understanding While Learning How To Teach: A Theoretical Perspective.
ERIC Educational Resources Information Center
Cavey, Laurie O.
This theoretical paper outlines a conceptual framework for examining growth in prospective teachers' mathematical understanding as they engage in thinking about and planning for the mathematical learning of others. The framework is based on the Pirie-Kieren (1994) Dynamical Theory for the Growth of Mathematical Understanding and extends into the…
Sheldon Glashow, the Electroweak Theory, and the Grand Unified Theory
Glashow shared the 1979 Nobel Prize for physics with Steven Weinberg and Abdus Salam for unifying the weak and electromagnetic interactions of particle physics. The theory "provides a framework for understanding how the early universe evolved and how our universe came into being," says Lawrence R. Sulak, chairman of the Boston University physics…
"UNICERT," or: Towards the Development of a Unified Language Certificate for German Universities.
ERIC Educational Resources Information Center
Voss, Bernd
The standardization of second language proficiency levels for university students in Germany is discussed. Problems with the current system, in which each university has developed its own program of study and proficiency certification, are examined and a framework for development of a unified language certificate for all universities is outlined.…
A unified framework for heat and mass transport at the atomic scale
NASA Astrophysics Data System (ADS)
Ponga, Mauricio; Sun, Dingyi
2018-04-01
We present a unified framework to simulate heat and mass transport in systems of particles. The proposed framework is based on kinematic mean field theory and uses a phenomenological master equation to compute effective transport rates between particles without the need to evaluate operators. We exploit this advantage and apply the model to simulate transport phenomena at the nanoscale. We demonstrate that, when calibrated to experimentally-measured transport coefficients, the model can accurately predict transient and steady state temperature and concentration profiles even in scenarios where the length of the device is comparable to the mean free path of the carriers. Through several example applications, we demonstrate the validity of our model for all classes of materials, including ones that, until now, would have been outside the domain of computational feasibility.
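A minimal illustration, not the authors' code, of the kind of phenomenological master equation the abstract describes: particles exchange heat with their neighbors at a fixed effective rate (the rate, chain length, and boundary temperatures below are invented), relaxing a 1D chain toward a linear steady-state profile between two fixed-temperature end particles.

```python
import numpy as np

n, k, dt = 21, 0.2, 1.0          # particles, exchange rate, time step (illustrative)
T = np.full(n, 300.0)            # initial temperatures (K)
T[0], T[-1] = 400.0, 300.0       # fixed-temperature boundary particles (reservoirs)

for _ in range(20000):
    dT = np.zeros(n)
    # Net exchange of each interior particle with its two neighbors.
    dT[1:-1] = k * (T[:-2] - 2.0 * T[1:-1] + T[2:])
    T += dt * dT

# At steady state the profile is linear between the two reservoirs,
# so the midpoint particle sits at the average of the end temperatures.
```

The explicit update is stable here because `k * dt <= 0.5`; a calibrated model would instead fit the exchange rate to measured transport coefficients, as the abstract describes.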
A unified theoretical framework for mapping models for the multi-state Hamiltonian.
Liu, Jian
2016-11-28
We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive models such as the well-known Meyer-Miller model.
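For concreteness, a sketch of the classical Meyer-Miller mapping Hamiltonian mentioned above, in its standard textbook form (this is not code from the paper): each electronic state n is assigned a harmonic-oscillator pair (x_n, p_n), and the F-state Hamiltonian matrix H maps to E = Σ_nm H_nm [(x_n x_m + p_n p_m)/2 − γ δ_nm].

```python
import numpy as np

def meyer_miller_energy(H, x, p, gamma=0.5):
    """Classical Meyer-Miller mapping energy for an F-state Hamiltonian.

    H     : (F, F) electronic Hamiltonian matrix
    x, p  : mapping oscillator positions and momenta, one pair per state
    gamma : zero-point-energy parameter (0.5 in the original model)
    """
    F = H.shape[0]
    E = 0.0
    for n in range(F):
        for m in range(F):
            zpe = gamma if n == m else 0.0
            E += H[n, m] * (0.5 * (x[n] * x[m] + p[n] * p[m]) - zpe)
    return E
```

With the full population in state 0 of a diagonal two-state Hamiltonian and γ = 0, the energy reduces to the state-0 eigenvalue times the oscillator action, as expected.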
ERIC Educational Resources Information Center
Partnership for 21st Century Skills, 2009
2009-01-01
To help practitioners integrate skills into the teaching of core academic subjects, the Partnership for 21st Century Skills has developed a unified, collective vision for learning known as the Framework for 21st Century Learning. This Framework describes the skills, knowledge and expertise students must master to succeed in work and life; it is a…
Toward a Unified Validation Framework in Mixed Methods Research
ERIC Educational Resources Information Center
Dellinger, Amy B.; Leech, Nancy L.
2007-01-01
The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Yinan; Shi Handuo; Xiong Zhaoxi
We present a unified universal quantum cloning machine, which combines several different existing universal cloning machines together, including the asymmetric case. In this unified framework, the identical pure states are projected equally into each copy initially constituted by input and one half of the maximally entangled states. We show explicitly that the output states of those universal cloning machines are the same. One important feature of this unified cloning machine is that the cloning process is always the symmetric projection, which reduces dramatically the difficulties for implementation. Also, it is found that this unified cloning machine can be directly modified to the general asymmetric case. Besides the global fidelity and the single-copy fidelity, we also present all possible arbitrary-copy fidelities.
[Arabian food pyramid: unified framework for nutritional health messages].
Shokr, Adel M
2008-01-01
There are several ways to present nutritional health messages, particularly pyramidic indices, but they have many deficiencies such as lack of agreement on a unified or clear methodology for food grouping and ignoring nutritional group inter-relation and integration. This causes confusion for health educators and target individuals. This paper presents an Arabian food pyramid that aims to unify the bases of nutritional health messages, bringing together the function, contents, source and nutritional group servings and indicating the inter-relation and integration of nutritional groups. This provides comprehensive, integrated, simple and flexible health messages.
ERIC Educational Resources Information Center
Contreras, Jose
2007-01-01
In this article, I model how a problem-posing framework can be used to enhance our abilities to systematically generate mathematical problems by modifying the attributes of a given problem. The problem-posing model calls for the application of the following fundamental mathematical processes: proving, reversing, specializing, generalizing, and…
ERIC Educational Resources Information Center
Komatsu, Kotaro
2016-01-01
The process of proofs and refutations described by Lakatos is essential in school mathematics to provide students with an opportunity to experience how mathematical knowledge develops dynamically within the discipline of mathematics. In this paper, a framework for describing student processes of proofs and refutations is constructed using a set of…
A Categorization Model for Educational Values of the History of Mathematics: An Empirical Study
ERIC Educational Resources Information Center
Wang, Xiao-qin; Qi, Chun-yan; Wang, Ke
2017-01-01
There is not a clear consensus on the categorization framework of the educational values of the history of mathematics. By analyzing 20 Chinese teaching cases on integrating the history of mathematics into mathematics teaching based on the relevant literature, this study examined a new categorization framework of the educational values of the…
ELPSA as a Lesson Design Framework
ERIC Educational Resources Information Center
Lowrie, Tom; Patahuddin, Sitti Maesuri
2015-01-01
This paper offers a framework for a mathematics lesson design that is consistent with the way we learn about, and discover, most things in life. In addition, the framework provides a structure for identifying how mathematical concepts and understanding are acquired and developed. This framework is called ELPSA and represents five learning…
Chao, Anne; Chiu, Chun-Huo; Colwell, Robert K; Magnago, Luiz Fernando S; Chazdon, Robin L; Gotelli, Nicholas J
2017-11-01
Estimating the species, phylogenetic, and functional diversity of a community is challenging because rare species are often undetected, even with intensive sampling. The Good-Turing frequency formula, originally developed for cryptography, estimates in an ecological context the true frequencies of rare species in a single assemblage based on an incomplete sample of individuals. Until now, this formula has never been used to estimate undetected species, phylogenetic, and functional diversity. Here, we first generalize the Good-Turing formula to incomplete sampling of two assemblages. The original formula and its two-assemblage generalization provide a novel and unified approach to notation, terminology, and estimation of undetected biological diversity. For species richness, the Good-Turing framework offers an intuitive way to derive the non-parametric estimators of the undetected species richness in a single assemblage, and of the undetected species shared between two assemblages. For phylogenetic diversity, the unified approach leads to an estimator of the undetected Faith's phylogenetic diversity (PD, the total length of undetected branches of a phylogenetic tree connecting all species), as well as a new estimator of undetected PD shared between two phylogenetic trees. For functional diversity based on species traits, the unified approach yields a new estimator of undetected Walker et al.'s functional attribute diversity (FAD, the total species-pairwise functional distance) in a single assemblage, as well as a new estimator of undetected FAD shared between two assemblages. Although some of the resulting estimators have been previously published (but derived with traditional mathematical inequalities), all taxonomic, phylogenetic, and functional diversity estimators are now derived under the same framework. 
All the derived estimators are theoretically lower bounds of the corresponding undetected diversities; our approach reveals the sufficient conditions under which the estimators are nearly unbiased, thus offering new insights. Simulation results are reported to numerically verify the performance of the derived estimators. We illustrate all estimators and assess their sampling uncertainty with an empirical dataset for Brazilian rain forest trees. These estimators should be widely applicable to many current problems in ecology, such as the effects of climate change on spatial and temporal beta diversity and the contribution of trait diversity to ecosystem multi-functionality. © 2017 by the Ecological Society of America.
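The single-assemblage species-richness case described above reduces to the classic Chao1 lower bound, driven by the Good-Turing frequencies of singletons (f1) and doubletons (f2). A brief sketch with invented abundance data (the formula is standard; the sample is illustrative only):

```python
from collections import Counter

def chao1_undetected(counts):
    """Lower bound on undetected species richness: f1^2 / (2 * f2),
    where f1 and f2 count the singleton and doubleton species."""
    freq = Counter(counts)
    f1, f2 = freq[1], freq[2]
    if f2 == 0:
        return f1 * (f1 - 1) / 2.0   # bias-corrected form when no doubletons occur
    return f1 * f1 / (2.0 * f2)

# Toy sample: abundances of the 8 species actually detected.
abundances = [10, 6, 4, 2, 2, 1, 1, 1]
print(chao1_undetected(abundances))  # f1=3, f2=2 -> 2.25
```

As the abstract emphasizes, this is a lower bound on the undetected richness, nearly unbiased only under certain sampling conditions; the paper's contribution is extending this Good-Turing logic to shared species, phylogenetic, and functional diversity.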
Collusion-resistant multimedia fingerprinting: a unified framework
NASA Astrophysics Data System (ADS)
Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray
2004-06-01
Digital fingerprints are unique labels inserted in different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint of each colluder; thus collusion poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
Science Education Attuned to Social Issues: Challenge for the '80s.
ERIC Educational Resources Information Center
Yager, Robert E.; And Others
1981-01-01
Provides rationale for interdisciplinary science curricula which emphasize decision-making skills. Includes examples of interdisciplinary curricula using an issue-centered approach: Unified Science and Mathematics for Elementary School (USMES), Health Activities Program (HAP), Human Sciences Program (HSP), Individualized Science Instructional…
ERIC Educational Resources Information Center
DeLucca, Adolph
1982-01-01
As a state and national model for a basic skills curriculum for Kindergarten through grade 12 students, Coordination Learning Integration--Middlesex Basics (Project CLIMB) is described. The unified system was developed by teachers with administrative support to accomodate all students' reading and mathematics needs. Project CLIMB's development and…
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Characterizing the reliability of a bioMEMS-based cantilever sensor
NASA Astrophysics Data System (ADS)
Bhalerao, Kaustubh D.
2004-12-01
The cantilever-based BioMEMS sensor represents one instance from many competing ideas of biosensor technology based on Micro Electro Mechanical Systems. The advancement of BioMEMS from laboratory-scale experiments to applications in the field will require standardization of their components and manufacturing procedures as well as frameworks to evaluate their performance. Reliability, the likelihood with which a system performs its intended task, is a compact mathematical description of its performance. The mathematical and statistical foundation of systems-reliability has been applied to the cantilever-based BioMEMS sensor. The sensor is designed to detect one aspect of human ovarian cancer, namely the over-expression of the folate receptor surface protein (FR-alpha). Even as the application chosen is clinically motivated, the objective of this study was to demonstrate the underlying systems-based methodology used to design, develop and evaluate the sensor. The framework development can be readily extended to other BioMEMS-based devices for disease detection and will have an impact in the rapidly growing $30 bn industry. The Unified Modeling Language (UML) is a systems-based framework for design and development of object-oriented information systems which has potential application for use in systems designed to interact with biological environments. The UML has been used to abstract and describe the application of the biosensor, to identify key components of the biosensor, and the technology needed to link them together in a coherent manner. The use of the framework is also demonstrated in computation of system reliability from first principles as a function of the structure and materials of the biosensor. The outcomes of applying the systems-based framework to the study are the following: (1) Characterizing the cantilever-based MEMS device for disease (cell) detection. 
(2) Development of a novel chemical interface between the analyte and the sensor that provides a degree of selectivity towards the disease. (3) Demonstrating the performance and measuring the reliability of the biosensor prototype, and (4) Identification of opportunities for technological development in order to further refine the proposed biosensor. Application of the methodology to design, develop, and evaluate the reliability of BioMEMS devices will be beneficial in streamlining the growth of the BioMEMS industry, while providing a decision-support tool for comparing and adopting suitable technologies from available competing options.
Rosenfeld, Daniel L; Burrow, Anthony L
2017-05-01
By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.
Franz, A; Triesch, J
2010-12-01
The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research, a unified model of the development of these abilities is still missing. Here we attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops and the role of occluder width, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories. Copyright © 2010 Elsevier Inc. All rights reserved.
Dispersive hydrodynamics: Preface
NASA Astrophysics Data System (ADS)
Biondini, G.; El, G. A.; Hoefer, M. A.; Miller, P. D.
2016-10-01
This Special Issue on Dispersive Hydrodynamics is dedicated to the memory and work of G.B. Whitham who was one of the pioneers in this field of physical applied mathematics. Some of the papers appearing here are related to work reported on at the workshop "Dispersive Hydrodynamics: The Mathematics of Dispersive Shock Waves and Applications" held in May 2015 at the Banff International Research Station. This Preface provides a broad overview of the field and summaries of the various contributions to the Special Issue, placing them in a unified context.
Mathematical correlation of modal-parameter-identification methods via system-realization theory
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan
1987-01-01
A unified approach is introduced using system-realization theory to derive and correlate modal-parameter-identification methods for flexible structures. Several different time-domain methods are analyzed and treated. A basic mathematical foundation is presented which provides insight into the field of modal-parameter identification for comparison and evaluation. The relation among various existing methods is established and discussed. This report serves as a starting point to stimulate additional research toward the unification of the many possible approaches for modal-parameter identification.
A Unified Theoretical Framework for Cognitive Sequencing.
Savalia, Tejas; Shukla, Anuj; Bapi, Raju S
2016-01-01
The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops (basal ganglia-frontal cortex and hippocampus-frontal cortex) mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks.
ERIC Educational Resources Information Center
Lamb, Janeen; Kawakami, Takashi; Saeki, Akihiko; Matsuzaki, Akio
2014-01-01
The aim of this study was to investigate the use of the "dual mathematical modelling cycle framework" as one way to meet the espoused goals of the Australian Curriculum Mathematics. This study involved 23 Year 6 students from one Australian primary school who engaged in an "Oil Tank Task" that required them to develop two…
Mathematics Education as Sociopolitical: Prospective Teachers' Views of the What, Who, and How
ERIC Educational Resources Information Center
Felton-Koestler, Mathew D.
2017-01-01
In this article, I introduce a framework--the What, Who, and How of mathematics--that emerged from studying my teaching of prospective teachers and their views of the social and political dimensions of mathematics teaching and learning. The What, Who, How framework asks us to consider What messages we send about mathematics and the world, Whose…
The Pursuit of a "Better" Explanation as an Organizing Framework for Science Teaching and Learning
ERIC Educational Resources Information Center
Papadouris, Nicos; Vokos, Stamatis; Constantinou, Constantinos P.
2018-01-01
This article seeks to make the case for the pursuit of a "better" explanation being a productive organizing framework for science teaching and learning. Underlying this position is the idea that this framework allows promoting, in a unified manner, facility with the scientific practice of constructing explanations, appreciation of its…
NASA Astrophysics Data System (ADS)
Mezentsev, Yu A.; Baranova, N. V.
2018-05-01
A universal economic-mathematical model designed for determining optimal strategies for managing the production and logistics subsystems (and subsystem components) of enterprises is considered. The declared universality allows taking into account, at the system level, production components, including limitations on the ways of converting raw materials and components into sold goods, as well as resource and logical restrictions on input and output material flows. The presented model and the generated control problems are developed within a unified approach that allows one to implement logical conditions of any complexity and to define the corresponding formal optimization tasks. The conceptual meaning of the criteria and constraints used is explained. The generated mixed-programming tasks are shown to belong to the class NP. An approximate polynomial algorithm for solving the posed mixed-programming optimization tasks of realistic dimension and high computational complexity is proposed. Results of testing the algorithm on tasks over a wide range of dimensions are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, T., E-mail: xietao@ustc.edu.cn; Key Laboratory of Geospace Environment, CAS, Hefei, Anhui 230026; Qin, H.
A unified ballooning theory, constructed on the basis of two special theories [Zhang et al., Phys. Fluids B 4, 2729 (1992); Y. Z. Zhang and T. Xie, Nucl. Fusion Plasma Phys. 33, 193 (2013)], shows that a weak up-down asymmetric mode structure is normally formed in an up-down symmetric equilibrium; the weak up-down asymmetry in mode structure is the manifestation of non-trivial higher order effects beyond the standard ballooning equation. It is shown that the asymmetric mode may have an even higher growth rate than symmetric modes. The salient features of the theory are illustrated by investigating a fluid model for the ion temperature gradient (ITG) mode. The two dimensional (2D) analytical form of the ITG mode, solved in ballooning representation, is then converted into radial-poloidal space to provide the natural boundary condition for solving the 2D mathematical local eigenmode problem. We find that the analytical expression of the mode structure is in good agreement with the finite difference solution. This sets a reliable framework for quasi-linear computation.
NASA Astrophysics Data System (ADS)
Sivasundaram, Seenith
2016-07-01
The review paper [1] is devoted to the survey of different structures that have been developed for the modeling and analysis of various types of fibrosis. Biomathematics, bioinformatics, biomechanics and biophysics modeling have been treated by means of a brief description of the different models developed. The review is impressive and clearly written, addressed to a reader interested not only in the theoretical modeling but also in the biological description. The models have been described without recourse to technical statements or mathematical equations, thus allowing the non-specialist reader to understand which framework is more suitable at a given observation scale. The review [1] concludes with the possibility of developing a multiscale approach that also considers the definition of a therapeutic strategy for pathological fibrosis. In particular, the control and optimization of therapeutic action is an important issue, and this article aims at commenting on this topic.
Knot invariants and M-theory: Proofs and derivations
NASA Astrophysics Data System (ADS)
Errasti Díez, Verónica
2018-01-01
We construct two distinct yet related M-theory models that provide suitable frameworks for the study of knot invariants. We then focus on the four-dimensional gauge theory that follows from appropriately compactifying one of these M-theory models. We show that this theory has indeed all required properties to host knots. Our analysis provides a unifying picture of the various recent works that attempt an understanding of knot invariants using techniques of four-dimensional physics. This is a companion paper to K. Dasgupta, V. Errasti Díez, P. Ramadevi, and R. Tatar, Phys. Rev. D 95, 026010 (2017), 10.1103/PhysRevD.95.026010, covering all but Sec. III C. It presents a detailed mathematical derivation of the main results there, as well as additional material. Among the new insights, those related to supersymmetry and the topological twist are highlighted. This paper offers an alternative, complementary formulation of the contents in the first paper, but is self-contained and can be read independently.
Interrelation Between Safety Factors and Reliability
NASA Technical Reports Server (NTRS)
Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)
2001-01-01
An evaluation was performed to establish relationships between safety factors and reliability relationships. Results obtained show that the use of the safety factor is not contradictory to the employment of the probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas the safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, then the allocation of safety factors should he based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
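The Part 1 relationship (random actual stress, deterministic yield stress) can be made concrete with a short sketch. Assuming a normally distributed actual stress with coefficient of variation `cov`, the central safety factor follows directly from the required reliability level; the function name and the normality assumption are illustrative, not taken from the report.

```python
from statistics import NormalDist

def safety_factor_for_reliability(R, cov):
    """Central safety factor n = s_y / mu needed so that P(stress < s_y) = R,
    for stress ~ Normal(mu, cov * mu) and a deterministic yield stress s_y.
    Illustrative sketch of the Part 1 case (normality is an assumption)."""
    z = NormalDist().inv_cdf(R)   # standard-normal quantile for reliability R
    return 1.0 + z * cov          # s_y = mu + z*sigma  =>  n = 1 + z*(sigma/mu)

# Example: 10% coefficient of variation, required reliability 0.999
n = safety_factor_for_reliability(0.999, 0.10)
```

This illustrates the report's point: the safety factor is not ad hoc here but is directly expressed by the required reliability level and the scatter of the stress.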
Generalized Minimum-Time Follow-up Approaches Applied to Electro-Optical Sensor Tasking
NASA Astrophysics Data System (ADS)
Murphy, T. S.; Holzinger, M. J.
This work proposes a methodology for tasking sensors to search an area of state space for a particular object, group of objects, or class of objects. It creates a general unified mathematical framework for analyzing reacquisition, search, scheduling, and custody operations. In particular, this work looks at searching for unknown space object(s) with prior knowledge in the form of a set, which can be defined via an uncorrelated track, a region of state space, or a variety of other methods. The follow-up tasking can occur from a variable location and time, which often requires searching a large region of the sky. This work analyzes the area of a search region over time to inform a time-optimal search method. Simulation work analyzes search regions relative to a particular sensor and tests a tasking algorithm that searches through the region. The tasking algorithm is also validated on a reacquisition problem with a telescope system at Georgia Tech.
Balkanization and Unification of Probabilistic Inferences
ERIC Educational Resources Information Center
Yu, Chong-Ho
2005-01-01
Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…
1979 National Unified Entrance Examination for Institutions of Higher Education.
ERIC Educational Resources Information Center
Chinese Education, 1979
1979-01-01
The article presents translations of Chinese college entrance examinations in the fields of politics, Chinese language and literature, mathematics, humanities, physics, chemistry, history, geography, and English. Translations are also presented of the 1979 review syllabus for 1979 for the same subject areas. (DB)
Propagation phasor approach for holographic image reconstruction
Luo, Wei; Zhang, Yibo; Göröcs, Zoltán; Feizi, Alborz; Ozcan, Aydogan
2016-01-01
To achieve high-resolution and wide field-of-view, digital holographic imaging techniques need to tackle two major challenges: phase recovery and spatial undersampling. Previously, these challenges were separately addressed using phase retrieval and pixel super-resolution algorithms, which utilize the diversity of different imaging parameters. Although existing holographic imaging methods can achieve large space-bandwidth-products by performing pixel super-resolution and phase retrieval sequentially, they require large amounts of data, which might be a limitation in high-speed or cost-effective imaging applications. Here we report a propagation phasor approach, which for the first time combines phase retrieval and pixel super-resolution into a unified mathematical framework and enables the synthesis of new holographic image reconstruction methods with significantly improved data efficiency. In this approach, twin image and spatial aliasing signals, along with other digital artifacts, are interpreted as noise terms that are modulated by phasors that analytically depend on the lateral displacement between hologram and sensor planes, sample-to-sensor distance, wavelength, and the illumination angle. Compared to previous holographic reconstruction techniques, this new framework results in five- to seven-fold reduced number of raw measurements, while still achieving a competitive resolution and space-bandwidth-product. We also demonstrated the success of this approach by imaging biological specimens including Papanicolaou and blood smears. PMID:26964671
Optimality in mono- and multisensory map formation.
Bürck, Moritz; Friedel, Paul; Sichert, Andreas B; Vossen, Christine; van Hemmen, J Leo
2010-07-01
In the struggle for survival in a complex and dynamic environment, nature has developed a multitude of sophisticated sensory systems. In order to exploit the information provided by these sensory systems, higher vertebrates reconstruct the spatio-temporal environment from each of the sensory systems they have at their disposal. That is, for each modality the animal computes a neuronal representation of the outside world, a monosensory neuronal map. Here we present a universal framework that allows one to calculate the specific layout of the involved neuronal network by means of a general mathematical principle, viz., stochastic optimality. In order to illustrate the use of this theoretical framework, we provide a step-by-step tutorial of how to apply our model. In so doing, we present a spatial and a temporal example of optimal stimulus reconstruction which underline the advantages of our approach. That is, given a known physical signal transmission and rudimentary knowledge of the detection process, our approach allows one to estimate the possible performance and to predict neuronal properties of biological sensory systems. Finally, information from different sensory modalities has to be integrated so as to gain a unified perception of reality for further processing, e.g., for distinct motor commands. We briefly discuss concepts of multimodal interaction and how a multimodal space can evolve by alignment of monosensory maps.
Impact of environmental colored noise in single-species population dynamics
NASA Astrophysics Data System (ADS)
Spanio, Tommaso; Hidalgo, Jorge; Muñoz, Miguel A.
2017-10-01
Variability in external conditions has important consequences for the dynamics and the organization of biological systems. In many cases, the characteristic timescale of environmental changes, as well as their correlations, plays a fundamental role in the way living systems adapt and respond to them. A proper mathematical approach to understanding population dynamics thus requires approaches more refined than, e.g., simple white-noise approximations. To shed further light on this problem, in this paper we propose a unifying framework based on the different analytical and numerical tools available to deal with "colored" environmental noise. In particular, we employ a "unified colored noise approximation" to map the original problem into an effective one with white noise, and then we apply a standard path-integral approach to gain analytical understanding. For the sake of specificity, we present our approach using as a guideline a variation of the contact process (which can also be seen as a birth-death process of the Malthus-Verhulst class) where the propagation or birth rate varies stochastically in time. Our approach allows us to tackle in a systematic manner some of the relevant questions concerning population dynamics under environmental variability, such as determining the stationary population density, establishing the conditions under which a population may become extinct, and estimating extinction times. We focus on the emerging phase diagram and its possible phase transitions, underlining how these are affected by the presence of environmental noise time-correlations.
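A minimal numerical sketch of the class of model described above: Malthus-Verhulst dynamics whose birth rate fluctuates as Ornstein-Uhlenbeck (colored) noise with correlation time tau, integrated with an Euler-Maruyama scheme. All parameter values and the function name are illustrative assumptions, not the paper's.

```python
import math
import random

def simulate(T=10000, dt=0.01, b0=2.0, d=1.0, tau=1.0, sigma=0.5, x0=0.5, seed=1):
    """Integrate dx/dt = b(t)*x - d*x - x**2 with b(t) = b0 + eta(t),
    where eta is Ornstein-Uhlenbeck (colored) noise with correlation
    time tau. Returns the final population density (clipped at 0)."""
    rng = random.Random(seed)
    x, eta = x0, 0.0
    for _ in range(T):
        # OU update: relaxation plus white-noise kick scaled by sqrt(dt)
        eta += (-eta / tau) * dt + (sigma / tau) * math.sqrt(dt) * rng.gauss(0, 1)
        # Euler step of the Malthus-Verhulst dynamics
        x += ((b0 + eta) * x - d * x - x * x) * dt
        x = max(x, 0.0)  # absorbing boundary: density cannot go negative
    return x
```

With sigma = 0 the dynamics are deterministic and relax to the fixed point x* = b0 - d; switching the noise on lets one probe the stationary density and extinction behaviour discussed in the abstract.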
ERIC Educational Resources Information Center
Palmer, Jackie; Powell, Mary Jo
The Laboratory Network Program and the National Network of Eisenhower Mathematics and Science Regional Consortia, operating as the Curriculum Frameworks Task Force, jointly convened a group of educators involved in implementing state-level mathematics or science curriculum frameworks (CF). The Hilton Head (South Carolina) conference had a dual…
Space-Time Processing for Tactical Mobile Ad Hoc Networks
2008-08-01
The project advances a vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet receptions (MPRs) are exploited as modalities in wireless networks. We have introduced the first unified modelling framework for the computation of fundamental limits of capacity-delay tradeoffs, extending the multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT.
Unified formalism for higher order non-autonomous dynamical systems
NASA Astrophysics Data System (ADS)
Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso
2012-03-01
This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.
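As a concrete instance of what the unified formalism recovers, the second-order (k = 2) Euler-Lagrange equation for a Lagrangian L(t, q, q', q'') reads (standard Ostrogradsky-type result, not specific to this paper):

```latex
\frac{\partial L}{\partial q}
- \frac{d}{dt}\,\frac{\partial L}{\partial \dot q}
+ \frac{d^{2}}{dt^{2}}\,\frac{\partial L}{\partial \ddot q} = 0
```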
Apollo experience report: S-band system signal design and analysis
NASA Technical Reports Server (NTRS)
Rosenberg, H. R. (Editor)
1972-01-01
A description is given of the Apollo communications-system engineering-analysis effort that ensured the adequacy, performance, and interface compatibility of the unified S-band system elements for a successful lunar-landing mission. The evolution and conceptual design of the unified S-band system are briefly reviewed from a historical viewpoint. A comprehensive discussion of the unified S-band elements includes the salient design features of the system and serves as a basis for a better understanding of the design decisions and analyses. The significant design decisions concerning the Apollo communications-system signal design are discussed providing an insight into the role of systems analysis in arriving at the current configuration of the Apollo communications system. Analyses are presented concerning performance estimation (mathematical-model development through real-time mission support) and system deficiencies, modifications, and improvements.
Collaborated Architecture Framework for Composition of UML 2.0 in the Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
The Zachman Framework (ZF) is the enterprise-architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to integrate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling approach is used to develop an Enterprise Resource Planning (ERP) system. Because ERP covers a large number of applications with complex relations among them, an Agile Model Driven Design (AMDD) approach is applied to transform the MDA into application-module components efficiently and accurately. Finally, use of the CAF achieved good fulfilment of the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP), as well as high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik.
Family Systems Theory: A Unifying Framework for Codependence.
ERIC Educational Resources Information Center
Prest, Layne A.; Protinsky, Howard
1993-01-01
Considers addictions and construct of codependence. Offers critical review and synthesis of codependency literature, along with an intergenerational family systems framework for conceptualizing the relationship of the dysfunctional family to the construct of codependence. Presents theoretical basis for systemic clinical work and research in this…
[Research on tumor information grid framework].
Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing
2013-10-01
In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, so that the heterogeneous resources become consistent and interoperable in both semantics and syntax. This article describes the tumor grid framework; the service types are packaged in Web Service Description Language (WSDL) and XML Schema Definition (XSD), and the client uses the serialized documents to operate on the distributed resources. The service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All grid resources are registered in the index and released in the form of Web Services based on the Web Services Resource Framework (WSRF). Using this system we can build a multi-center, large-sample, networked tumor disease resource-sharing framework to improve the level of development in medical research institutions and the patients' quality of life.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
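A single-neuron sketch of the decaying self-coupling idea behind Chen and Aihara's chaotic simulated annealing, one of the three model types covered by the framework. The update rule is simplified and all constants are illustrative assumptions; a real CNN solver couples many such neurons through the problem's energy function.

```python
import math

def csa_trace(steps=300, k=0.9, alpha=0.015, z0=0.08, beta=0.001,
              I=0.65, eps=0.004):
    """Trace of a single chaotic-simulated-annealing neuron: internal
    state y with a self-coupling z(t) that decays by factor (1 - beta)
    each step, moving the dynamics from chaotic search toward convergence.
    Simplified sketch; constants are illustrative, not from the paper."""
    y, z, xs = 0.5, z0, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))   # steep sigmoid output in (0, 1)
        y = k * y + alpha * I - z * (x - 0.5)  # damped state update with self-coupling
        z *= (1.0 - beta)                      # annealing: self-coupling decays
        xs.append(x)
    return xs
```

Early in the trace the self-coupling term flips the output chaotically between the two saturation levels; as z decays the neuron settles, which is the mechanism the paper's feasibility and efficiency comparisons probe.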
Mathematical Tasks as a Framework for Reflection: From Research To Practice.
ERIC Educational Resources Information Center
Stein, Mary Kay; Smith, Margaret Schwan
1998-01-01
Describes the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) national reform project aimed at studying and fostering the development and implementation of enhanced mathematics instructional programs. It is a framework for reflection based on mathematical tasks used during classroom instruction and the ways in…
A Framework for Understanding Whiteness in Mathematics Education
ERIC Educational Resources Information Center
Battey, Dan; Leyva, Luis A.
2016-01-01
In this article, the authors provide a framework for understanding whiteness in mathematics education. While whiteness is receiving more attention in the broader education literature, only a handful of scholars address whiteness in mathematics education in any form. This lack of attention to whiteness leaves it invisible and neutral in documenting…
Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar
2017-02-01
A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
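As a classical single-class ingredient of such settling models, hindered settling is often described by a Vesilind-type exponential velocity, which the unified framework above generalizes to several particle classes. The parameter values below are illustrative only.

```python
import math

def hindered_settling_velocity(X, v0=10.0, rV=0.4):
    """Vesilind-type hindered settling velocity (m/h) at solids
    concentration X (kg/m^3): v(X) = v0 * exp(-rV * X).
    v0 and rV are illustrative parameters, not fitted values."""
    return v0 * math.exp(-rV * X)

def settling_flux(X, **kw):
    """Batch settling flux f(X) = v(X) * X used in flux-theory-based
    secondary settling tank models."""
    return hindered_settling_velocity(X, **kw) * X
```

The flux f(X) rises at dilute concentrations and falls again at high ones; a multi-class PDE model of the kind proposed in the paper evaluates such a constitutive law per particle class and adds a compression term at high concentrations.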
A Framework of Mathematics Inductive Reasoning
ERIC Educational Resources Information Center
Christou, Constantinos; Papageorgiou, Eleni
2007-01-01
Based on a synthesis of the literature in inductive reasoning, a framework for prescribing and assessing mathematics inductive reasoning of primary school students was formulated and validated. The major constructs incorporated in this framework were students' cognitive abilities of finding similarities and/or dissimilarities among attributes and…
Towards a Unified Theory of Engineering Education
ERIC Educational Resources Information Center
Salcedo Orozco, Oscar H.
2017-01-01
STEM education is an interdisciplinary approach to learning where rigorous academic concepts are coupled with real-world lessons and activities as students apply science, technology, engineering, and mathematics in contexts that make connections between school, community, work, and the global enterprise enabling STEM literacy (Tsupros, Kohler and…
Biodiversity patterns along ecological gradients: unifying β-diversity indices.
Szava-Kovats, Robert C; Pärtel, Meelis
2014-01-01
Ecologists have developed an abundance of conceptions and mathematical expressions to define β-diversity, the link between local (α) and regional-scale (γ) richness, in order to characterize patterns of biodiversity along ecological (i.e., spatial and environmental) gradients. These patterns are often realized by regression of β-diversity indices against one or more ecological gradients. This practice, however, is subject to two shortcomings that can undermine the validity of the biodiversity patterns. First, many β-diversity indices are constrained to range between fixed lower and upper limits. As such, regression analysis of β-diversity indices against ecological gradients can result in regression curves that extend beyond these mathematical constraints, thus creating an interpretational dilemma. Second, despite being a function of the same measured α- and γ-diversity, the resultant biodiversity pattern depends on the choice of β-diversity index. We propose a simple logistic transformation that rids β-diversity indices of their mathematical constraints, thus eliminating the possibility of an uninterpretable regression curve. Moreover, this transformation results in identical biodiversity patterns for three commonly used classical β-diversity indices. As a result, this transformation eliminates the difficulties of both shortcomings, while allowing the researcher to use whichever β-diversity index is deemed most appropriate. We believe this method can help unify the study of biodiversity patterns along ecological gradients.
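A minimal sketch of the proposed idea, assuming the classical proportional index β = 1 − α/γ (one of several indices the transformation applies to): the logit transform removes the [0, 1] bounds, so a fitted regression curve against an ecological gradient cannot exceed the index's mathematical limits.

```python
import math

def logit(p):
    """Logistic (logit) transform: maps (0, 1) onto the whole real line."""
    return math.log(p / (1.0 - p))

def beta_prop(alpha, gamma):
    """Proportional beta-diversity, beta = 1 - alpha/gamma, bounded in [0, 1)."""
    return 1.0 - alpha / gamma

def beta_unconstrained(alpha, gamma):
    """Logit-transformed index, free of the [0, 1] bounds and therefore
    safe to regress against an ecological gradient."""
    return logit(beta_prop(alpha, gamma))
```

For example, α = 5 species locally out of γ = 10 regionally gives β = 0.5, whose logit is 0; higher turnover (smaller α/γ) maps to positive values without any upper ceiling.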
Biodiversity Patterns along Ecological Gradients: Unifying β-Diversity Indices
Szava-Kovats, Robert C.; Pärtel, Meelis
2014-01-01
Ecologists have developed an abundance of conceptions and mathematical expressions to define β-diversity, the link between local (α) and regional-scale (γ) richness, in order to characterize patterns of biodiversity along ecological (i.e., spatial and environmental) gradients. These patterns are often realized by regression of β-diversity indices against one or more ecological gradients. This practice, however, is subject to two shortcomings that can undermine the validity of the biodiversity patterns. First, many β-diversity indices are constrained to range between fixed lower and upper limits. As such, regression analysis of β-diversity indices against ecological gradients can result in regression curves that extend beyond these mathematical constraints, thus creating an interpretational dilemma. Second, despite being a function of the same measured α- and γ-diversity, the resultant biodiversity pattern depends on the choice of β-diversity index. We propose a simple logistic transformation that rids beta-diversity indices of their mathematical constraints, thus eliminating the possibility of an uninterpretable regression curve. Moreover, this transformation results in identical biodiversity patterns for three commonly used classical beta-diversity indices. As a result, this transformation eliminates the difficulties of both shortcomings, while allowing the researcher to use whichever beta-diversity index deemed most appropriate. We believe this method can help unify the study of biodiversity patterns along ecological gradients. PMID:25330181
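The core idea can be sketched in a few lines. As an illustration only (the paper's exact index and transformation may differ), take a proportional index beta' = alpha/gamma bounded in (0, 1) and map it to the unbounded real line with a logit transform, so that fitted regression curves can never imply values outside the index's limits once back-transformed:

```python
import math

def logit(p):
    """Map a proportion in (0, 1) to the unbounded real line."""
    return math.log(p / (1.0 - p))

def unbounded_beta(alpha, gamma):
    """Illustrative logistic transformation of a bounded beta-diversity
    index (here beta' = alpha/gamma, assumed strictly inside (0, 1));
    the index and transform in the paper itself may differ in detail."""
    return logit(alpha / gamma)

def back_transform(x):
    """Inverse logit: any fitted value maps back inside (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

b_t = unbounded_beta(5, 20)   # logit(0.25), about -1.10
b_back = back_transform(b_t)  # recovers 0.25, always inside (0, 1)
```

Regression is then performed on the transformed scale; back-transforming the fitted curve guarantees interpretable values within the index's mathematical constraints.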
NASA Astrophysics Data System (ADS)
Wang, Lin; Cao, Xin; Ren, Qingyun; Chen, Xueli; He, Xiaowei
2018-05-01
Cerenkov luminescence imaging (CLI) is an imaging method that uses an optical imaging scheme to probe a radioactive tracer. Application of CLI with clinically approved radioactive tracers has opened an opportunity for translating optical imaging from preclinical to clinical applications. Such translation was further improved by developing an endoscopic CLI system. However, two-dimensional endoscopic imaging cannot identify accurate depth and obtain quantitative information. Here, we present an imaging scheme to retrieve the depth and quantitative information from endoscopic Cerenkov luminescence tomography, which can also be applied for endoscopic radio-luminescence tomography. In the scheme, we first constructed a physical model for image collection, and then a mathematical model for characterizing the luminescent light propagation from tracer to the endoscopic detector. The mathematical model is a hybrid light transport model combined with the 3rd order simplified spherical harmonics approximation, diffusion, and radiosity equations to warrant accuracy and speed. The mathematical model integrates finite element discretization, regularization, and primal-dual interior-point optimization to retrieve the depth and the quantitative information of the tracer. A heterogeneous-geometry-based numerical simulation was used to explore the feasibility of the unified scheme, which demonstrated that it can provide a satisfactory balance between imaging accuracy and computational burden.
A quasi-likelihood approach to non-negative matrix factorization
Devarajan, Karthik; Cheung, Vincent C.K.
2017-01-01
A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
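As a concrete sketch of one member of this family, the classical Lee-Seung multiplicative updates for Euclidean NMF (the Gaussian-noise special case within the quasi-likelihood view) can be written in pure Python. This illustrates the factorization itself, not the paper's signal-dependent-noise algorithms:

```python
def matmul(A, B):
    """Dense matrix product for lists-of-lists."""
    Bt = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def nmf_euclidean(V, W, H, iters=500, eps=1e-9):
    """Lee-Seung multiplicative updates minimizing ||V - W H||^2.
    V (m x n), W (m x r), H (r x n) are lists of lists; W and H must be
    initialized with positive entries. eps guards against division by zero."""
    for _ in range(iters):
        Wt = [list(r) for r in zip(*W)]
        num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
        H = [[h * n / (d + eps) for h, n, d in zip(hr, nr, dr)]
             for hr, nr, dr in zip(H, num, den)]
        Ht = [list(r) for r in zip(*H)]
        num, den = matmul(V, Ht), matmul(matmul(W, H), Ht)
        W = [[w * n / (d + eps) for w, n, d in zip(wr, nr, dr)]
             for wr, nr, dr in zip(W, num, den)]
    return W, H

# Tiny example: factor an exactly rank-1 matrix with rank r = 1.
V = [[1.0, 2.0], [2.0, 4.0]]
W, H = nmf_euclidean(V, [[0.5], [0.5]], [[0.5, 0.5]])
R = matmul(W, H)  # reconstruction; close to V since V is rank 1
```

The updates preserve non-negativity by construction, since each factor is only ever multiplied by a non-negative ratio.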
Groundwater modelling in decision support: reflections on a unified conceptual framework
NASA Astrophysics Data System (ADS)
Doherty, John; Simmons, Craig T.
2013-11-01
Groundwater models are commonly used as a basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support, with a direct focus on matters regarding model simplicity and complexity.
Pattern-oriented modeling of agent-based complex systems: Lessons from ecology
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-01-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
Integrating diverse databases into a unified analysis framework: a Galaxy approach
Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton
2011-01-01
Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past a relatively small number of central repositories served genomic data, an increasing number of distinct, specialized data repositories and resources have since been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983
Formal Darwinism, the individual-as-maximizing-agent analogy and bet-hedging
Grafen, A.
1999-01-01
The central argument of The origin of species was that mechanical processes (inheritance of features and the differential reproduction they cause) can give rise to the appearance of design. The 'mechanical processes' are now mathematically represented by the dynamic systems of population genetics, and the appearance of design by optimization and game theory in which the individual plays the part of the maximizing agent. Establishing a precise individual-as-maximizing-agent (IMA) analogy for a population-genetics system justifies optimization approaches, and so provides a modern formal representation of the core of Darwinism. It is a hitherto unnoticed implication of recent population-genetics models that, contrary to a decades-long consensus, an IMA analogy can be found in models with stochastic environments (subject to a convexity assumption), in which individuals maximize expected reproductive value. The key is that the total reproductive value of a species must be considered as constant, so therefore reproductive value should always be calculated in relative terms. This result removes a major obstacle from the theoretical challenge to find a unifying framework which establishes the IMA analogy for all of Darwinian biology, including as special cases inclusive fitness, evolutionarily stable strategies, evolutionary life-history theory, age-structured models and sex ratio theory. This would provide a formal, mathematical justification of fruitful and widespread but 'intentional' terms in evolutionary biology, such as 'selfish', 'altruism' and 'conflict'.
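The bet-hedging logic at stake can be illustrated with a toy calculation (the numbers are hypothetical, not from the paper): in a stochastic environment, long-run growth tracks the geometric mean of fitness across environmental states, so a strategy with the lower arithmetic-mean fitness can nevertheless win:

```python
import math

def arithmetic_mean(ws):
    return sum(ws) / len(ws)

def geometric_mean(ws):
    """Long-run growth rate per generation under equally likely states."""
    return math.exp(sum(math.log(w) for w in ws) / len(ws))

# Hypothetical per-generation fitnesses in (good year, bad year),
# each year type equally likely:
risky  = [1.7, 0.5]    # specialist: thrives in good years, crashes in bad
hedged = [1.05, 1.0]   # bet-hedger: modest but stable

# The risky strategy has the higher arithmetic mean ...
higher_am = arithmetic_mean(risky) > arithmetic_mean(hedged)
# ... but the bet-hedger has the higher geometric mean, hence the higher
# long-run growth rate in a fluctuating environment.
higher_gm = geometric_mean(hedged) > geometric_mean(risky)
```

This is the standard geometric-mean caricature of bet-hedging; the paper's contribution concerns when such stochastic settings still admit an individual-as-maximizing-agent analogy in terms of expected relative reproductive value.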
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
ERIC Educational Resources Information Center
Raveh, Ira; Koichu, Boris; Peled, Irit; Zaslavsky, Orit
2016-01-01
In this article we present an integrative framework of knowledge for teaching the standard algorithms of the four basic arithmetic operations. The framework is based on a mathematical analysis of the algorithms, a connectionist perspective on teaching mathematics and an analogy with previous frameworks of knowledge for teaching arithmetic…
ERIC Educational Resources Information Center
O'Keeffe, Shawn Edward
2013-01-01
The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…
2017-05-25
Operations, and Unified Land Operations) and the US Army's leader development model identifies how the education, training, and experience of field-grade…officers have failed in their incorporation of the framework because they lack the education, training, and experience for the use of the framework…education, training, and experience of field-grade officers at the division level have influenced their use of the operational framework. The cause for
A Framework for Analyzing the Collaborative Construction of Arguments and Its Interplay with Agency
ERIC Educational Resources Information Center
Mueller, Mary; Yankelewitz, Dina; Maher, Carolyn
2012-01-01
In this report, we offer a framework for analyzing the ways in which collaboration influences learners' building of mathematical arguments and thus promotes mathematical understanding. Building on a previous model used to analyze discursive practices of students engaged in mathematical problem solving, we introduce three types of collaboration and…
A Framework for Mathematical Thinking: The Case of Linear Algebra
ERIC Educational Resources Information Center
Stewart, Sepideh; Thomas, Michael O. J.
2009-01-01
Linear algebra is one of the unavoidable advanced courses that many mathematics students encounter at university level. The research reported here was part of the first author's recent PhD study, where she created and applied a theoretical framework combining the strengths of two major mathematics education theories in order to investigate the…
The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools
ERIC Educational Resources Information Center
Trinter, Christine
2016-01-01
The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…
Teaching Multidigit Multiplication: Combining Multiple Frameworks to Analyse a Class Episode
ERIC Educational Resources Information Center
Clivaz, Stéphane
2017-01-01
This paper provides an analysis of a teaching episode of the multidigit algorithm for multiplication, with a focus on the influence of the teacher's mathematical knowledge on their teaching. The theoretical framework uses Mathematical Knowledge for Teaching, mathematical pertinence of the teacher and structuration of the milieu in a descending and…
Parametric models to relate spike train and LFP dynamics with neural information processing.
Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan
2012-01-01
Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. 
We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
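A minimal caricature of the two model components named above, ongoing background activity plus a time-varying stimulus-driven input, might look as follows (the functional form and parameter names are assumptions for illustration, not the authors' parameterization):

```python
import math

def unified_rate(t, background, onset, gain, tau):
    """Illustrative firing-rate model: a constant ongoing background rate
    plus a stimulus-driven component that turns on at `onset` and rises
    toward `gain` with time constant `tau`. All names are assumed, not
    taken from the published unified spike-field models."""
    if t < onset:
        return background
    return background + gain * (1.0 - math.exp(-(t - onset) / tau))

# Before stimulus onset the rate is pure background; long after onset it
# saturates near background + gain.
r_pre  = unified_rate(0.5, background=5.0, onset=1.0, gain=20.0, tau=0.2)
r_post = unified_rate(10.0, background=5.0, onset=1.0, gain=20.0, tau=0.2)
```

Decoding the single-trial onset time then amounts to estimating `onset` from observed spiking, which is the kind of trial-by-trial latency parameter the abstract describes.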
Actuality of transcendental æsthetics for modern physics
NASA Astrophysics Data System (ADS)
Petitot, Jean
1. The more mathematics and physics unify themselves in modern physico-mathematical theories, the more an objective epistemology becomes necessary. Only such a transcendental epistemology is able to thematize correctly the status of the mathematical determination of physical reality. 2. There exists a transcendental history of the synthetic a priori and of the construction of physical categories. 3. The transcendental approach allows one to supersede Wittgenstein's and Carnap's antiplatonist thesis, according to which pure mathematics is physically applicable only if it lacks any descriptive, cognitive or objective content and reduces to mere prescriptive and normative devices. In fact, pure mathematics is prescriptive-normative in physics because: (i) the categories of physical objectivity are prescriptive-normative, and (ii) their categorial content is mathematically “constructed” through a Transcendental Aesthetics. Only a transcendental approach makes compatible, on the one hand, a grammatical conventionalism of Wittgensteinian or Carnapian type and, on the other hand, a platonist realism of Gödelian type. Mathematics is not a grammar of the world but a mathematical hermeneutics of the intuitive forms and of the categorial grammar of the world.
Separation of Variables and Superintegrability; The symmetry of solvable systems
NASA Astrophysics Data System (ADS)
Kalnins, Ernest G.; Kress, Jonathan M.; Miller, Willard, Jr.
2018-06-01
Separation of variables methods for solving partial differential equations are of immense theoretical and practical importance in mathematical physics: they are the most powerful tool known for obtaining explicit solutions of the partial differential equations of mathematical physics. The purpose of this book is to give an up-to-date presentation of the theory of separation of variables and its relation to superintegrability. By collating results scattered in the literature and presenting them in a unified, updated and more accessible manner, the authors have prepared an invaluable resource for mathematicians and mathematical physicists in particular, as well as for science, engineering, geological and biological researchers interested in explicit solutions.
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
Toward a unified approach to dose-response modeling in ecotoxicology.
Ritz, Christian
2010-01-01
This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
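One commonly used member of such a framework is the four-parameter log-logistic model. The sketch below follows a standard parameterization (as in the drc literature), which may differ in detail from the guidance documents the study discusses:

```python
import math

def log_logistic4(x, b, c, d, e):
    """Four-parameter log-logistic dose-response curve:
        f(x) = c + (d - c) / (1 + exp(b * (log(x) - log(e))))
    d: upper limit (response at very low dose, for b > 0),
    c: lower limit, e: the dose giving a response midway between
    c and d (the EC50), b: slope around the EC50. Requires x > 0."""
    return c + (d - c) / (1.0 + math.exp(b * (math.log(x) - math.log(e))))

# At the dose x = e, the response is exactly midway between c and d:
mid = log_logistic4(10.0, b=2.0, c=0.0, d=1.0, e=10.0)
```

The two-parameter log-logistic and Weibull models mentioned in the abstract arise by fixing c = 0 and d = 1 or by swapping the logistic kernel for a Weibull one, which is the sense in which a single framework encompasses the usual suites of models.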
Theoretical Framework of Researcher Knowledge Development in Mathematics Education
ERIC Educational Resources Information Center
Kontorovich, Igor'
2016-01-01
The goal of this paper is to present a framework of researcher knowledge development in conducting a study in mathematics education. The key components of the framework are: knowledge germane to conducting a particular study, processes of knowledge accumulation, and catalyzing filters that influence a researcher's decision making. The components…
ERIC Educational Resources Information Center
Marston, Jennie
2014-01-01
This article by Jennie Marston provides a framework to assist you in selecting appropriate picture books to present mathematical content. Jennie demonstrates the framework by applying three specific examples of picture books to the framework along with examples of activities.
Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Ocampo, Cesar; Senent, Juan S.; Williams, Jacob
2010-01-01
The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, are discussed.
FORUM: The Algorithmic Way of Life is Best and Responses.
ERIC Educational Resources Information Center
Maurer, Stephen B.; And Others
1985-01-01
The forum is focused on thinking about and with algorithms as a way of unifying all one's mathematical endeavors. The lead article by Maurer presents examples and discussion of this point. Responses, often disagreeing with his views, are by Douglas, Korte, Hilton, Renz, Smorynski, Hammersley, and Halmos. (MNS)
The Functionator 3000: Transforming Numbers and Children
ERIC Educational Resources Information Center
Fisher, Elaine Cerrato; Roy, George; Reeves, Charles
2013-01-01
Mrs. Fisher's class was learning about arithmetic functions by pretending to operate real-world "function machines" (Reeves 2006). Functions are a unifying mathematics topic, and a great deal of emphasis is placed on understanding them in prekindergarten through grade 12 (Kilpatrick and Izsák 2008). In its Algebra Content Standard, the…
It Works: Project R-3, San Jose, California.
ERIC Educational Resources Information Center
American Institutes for Research in the Behavioral Sciences, Palo Alto, CA.
A project was designed by the San Jose Unified School District and the education division of the Lockheed Missiles and Space Company to treat learning problems experienced by eighth and ninth grade students with underdeveloped reading and mathematics skills. The students were largely Mexican American and were from predominately disadvantaged…
Science, Math, and Technology. K-6 Science Curriculum.
ERIC Educational Resources Information Center
Blueford, J. R.; And Others
Science, Math and Technology is one of the units of a K-6 unified science curriculum program. The unit consists of four organizing sub-themes: (1) science (with activities on observation, comparisons, and the scientific method); (2) technology (examining simple machines, electricity, magnetism, waves and forces); (3) mathematics (addressing skill…
A Unified Framework for Monetary Theory and Policy Analysis.
ERIC Educational Resources Information Center
Lagos, Ricardo; Wright, Randall
2005-01-01
Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…
Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research
ERIC Educational Resources Information Center
Fan, Xitao; Sun, Shaojing
2014-01-01
In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…
Mathematical correlation of modal parameter identification methods via system realization theory
NASA Technical Reports Server (NTRS)
Juang, J. N.
1986-01-01
A unified approach is introduced using system realization theory to derive and correlate modal parameter identification methods for flexible structures. Several different time-domain and frequency-domain methods are analyzed and treated. A basic mathematical foundation is presented which provides insight into the field of modal parameter identification for comparison and evaluation. The relation among various existing methods is established and discussed. This report serves as a starting point to stimulate additional research towards the unification of the many possible approaches for modal parameter identification.
Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A
2018-01-01
The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, both the average reference (AR) and the reference electrode standardization technique (REST) are two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Generated artificial EEGs, with a known ground truth, show that relative error in estimating the EEG potentials at infinity is lowest for rREST. It also reveals that realistic volume conductor models improve the performances of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the real 89 resting state EEGs, rREST consistently yields the lowest GCV.
This study provides a novel perspective to the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance.
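The simplest of the contenders, the average reference (AR), is easy to state concretely. The sketch below shows plain AR only; REST and the regularized rAR/rREST variants described above additionally require a volume-conductor (lead-field) model:

```python
def average_reference(eeg):
    """Re-reference a multichannel EEG to the average reference (AR):
    subtract the instantaneous mean across channels at each sample.
    `eeg` is a list of channels, each a list of samples. This is a
    minimal sketch of plain AR, not of REST or of the regularized
    estimators proposed in the paper."""
    n_ch = len(eeg)
    means = [sum(col) / n_ch for col in zip(*eeg)]
    return [[v - m for v, m in zip(ch, means)] for ch in eeg]

# Toy 3-channel, 2-sample recording:
data = [[1.0, 2.0], [3.0, 2.0], [5.0, 5.0]]
ar = average_reference(data)  # channel mean is now zero at every sample
```

The defining property, and the reason AR corresponds to a non-informative prior in the unified framework, is that the re-referenced channels sum to zero at every time point.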
ERIC Educational Resources Information Center
Peck, Frederick; Sriraman, Bharath
2017-01-01
Mathematics education emerged as a field in the height of modernism in science and mathematics. For decades, modernist psychology provided the dominant framework for inquiry in the field. Recently, this framework has started to sustain questions, leading to an ongoing conversation in the literature about the identity of the field. We join this…
ERIC Educational Resources Information Center
Huda, Nizlel; Subanji; Nusantar, Toto; Susiswo; Sutawidjaja, Akbar; Rahardjo, Swasono
2016-01-01
This study aimed to determine students' metacognitive failure in Mathematics Education Program of FKIP in Jambi University investigated based on assimilation and accommodation Mathematical framework. There were 35 students, five students did not answer the question, three students completed the questions correctly and 27 students tried to solve…
Framework Design of Unified Cross-Authentication Based on the Fourth Platform Integrated Payment
NASA Astrophysics Data System (ADS)
Yong, Xu; Yujin, He
The essay advances a unified authentication scheme based on the fourth-platform integrated payment. The research aims at improving the compatibility of authentication in electronic business and providing a reference for the establishment of a credit system, by seeking a way to carry out standard unified authentication on an integrated payment platform. The essay introduces the concept of the fourth integrated payment platform and finally puts forward the whole structure and its different components. The main issue of the essay is the design of the credit system of the fourth integrated payment platform and the PKI/CA structure design.
An Exploratory Framework for Handling the Complexity of Mathematical Problem Posing in Small Groups
ERIC Educational Resources Information Center
Kontorovich, Igor; Koichu, Boris; Leikin, Roza; Berman, Avi
2012-01-01
The paper introduces an exploratory framework for handling the complexity of students' mathematical problem posing in small groups. The framework integrates four facets known from past research: task organization, students' knowledge base, problem-posing heuristics and schemes, and group dynamics and interactions. In addition, it contains a new…
Problem Solving Frameworks for Mathematics and Software Development
ERIC Educational Resources Information Center
McMaster, Kirby; Sambasivam, Samuel; Blake, Ashley
2012-01-01
In this research, we examine how problem solving frameworks differ between Mathematics and Software Development. Our methodology is based on the assumption that the words used frequently in a book indicate the mental framework of the author. We compared word frequencies in a sample of 139 books that discuss problem solving. The books were grouped…
Differentiable representations of finite dimensional Lie groups in rigged Hilbert spaces
NASA Astrophysics Data System (ADS)
Wickramasekara, Sujeewa
The inceptive motivation for introducing rigged Hilbert spaces (RHS) in quantum physics in the mid 1960's was to provide the already well established Dirac formalism with a proper mathematical context. It has since become clear, however, that this mathematical framework is lissome enough to accommodate a class of solutions to the dynamical equations of quantum physics that includes some which are not possible in the normative Hilbert space theory. Among the additional solutions, in particular, are those which describe aspects of scattering and decay phenomena that have eluded the orthodox quantum physics. In this light, the RHS formulation seems to provide a mathematical rubric under which various phenomenological observations and calculational techniques, commonly known in the study of resonance scattering and decay as ``effective theories'' (e.g., the Wigner- Weisskopf method), receive a unified theoretical foundation. These observations lead to the inference that a theory founded upon the RHS mathematics may prove to be of better utility and value in understanding quantum physical phenomena. This dissertation primarily aims to contribute to the general formalism of the RHS theory of quantum mechanics by undertaking a study of differentiable representations of finite dimensional Lie groups. In particular, it is shown that a finite dimensional operator Lie algebra G in a rigged Hilbert space can be always integrated, provided one parameter integrability holds true for the elements of any basis for G . This result differs from and extends the well known integration theorem of E. Nelson and the subsequent works of others on unitary representations in that it does not require any assumptions on the existence of analytic vectors. Also presented here is a construction of a particular rigged Hilbert space of Hardy class functions that appears useful in formulating a relativistic version of the RHS theory of resonances and decay. 
As a contexture for the construction, a synopsis of the new relativistic theory is presented.
A unified framework for building high performance DVEs
NASA Astrophysics Data System (ADS)
Lei, Kaibin; Ma, Zhixia; Xiong, Hua
2011-10-01
A unified framework for integrating PC cluster based parallel rendering with distributed virtual environments (DVEs) is presented in this paper. While various scene graphs have been proposed in DVEs, it is difficult to enable collaboration between different scene graphs. This paper proposes a technique that gives non-distributed scene graphs the capability of object and event distribution. With the increase of graphics data, DVEs require more powerful rendering ability, but general scene graphs are inefficient for parallel rendering. The paper therefore also proposes a technique to connect a DVE and a PC cluster based parallel rendering environment. A distributed multi-player video game is developed to show the interaction of different scene graphs and the parallel rendering performance on a large tiled display wall.
Unified Behavior Framework for Discrete Event Simulation Systems
2015-03-26
I would like to thank Dr. Hodson for his guidance and direction throughout the AFIT program. I also would like to thank my thesis committee members... SPA: Sense-Plan-Act; SSL: System Service Layer; TCA: Task Control Architecture; TRP: Teleo-Reactive Program; UAV: Unmanned Aerial Vehicle; UBF: Unified Behavior... a teleo-reactive architecture [11]. Teleo-Reactive Programs (TRPs) are composed of a list of rules, where each has a condition and an action. When the
Evolutionary game theory meets social science: is there a unifying rule for human cooperation?
Rosas, Alejandro
2010-05-21
Evolutionary game theory has shown that human cooperation thrives in different types of social interactions with a PD (prisoner's dilemma) structure. Models treat the cooperative strategies within the different frameworks as discrete entities and sometimes even as contenders. Whereas strong reciprocity was acclaimed as superior to classic reciprocity for its ability to defeat defectors in public goods games, recent experiments and simulations show that costly punishment fails to promote cooperation in the IR (indirect reciprocity) and DR (direct reciprocity) games, where classic reciprocity succeeds. My aim is to show that cooperative strategies across frameworks are capable of a unified treatment, for they are governed by a common underlying rule or norm. An analysis of the reputation and action rules that govern some representative cooperative strategies, both in models and in economic experiments, confirms that the different frameworks share a conditional action rule and several reputation rules. The common conditional rule contains an option between costly punishment and withholding benefits that provides alternative enforcement methods against defectors. Depending on the framework, individuals can switch to the appropriate strategy and method of enforcement. The stability of human cooperation looks more promising if one mechanism controls successful strategies across frameworks. Published by Elsevier Ltd.
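The common conditional rule described in this abstract can be rendered as a toy decision function (my own illustration of the claim, not Rosas's formal model; the strategy labels are invented):

```python
# Toy rendering of the shared conditional rule: cooperate with cooperators;
# against defectors, enforce either by costly punishment (public-goods
# settings) or by withholding benefits (reciprocity settings).
# The function and its labels are illustrative assumptions.
def conditional_rule(partner_cooperated, punishment_available):
    if partner_cooperated:
        return "cooperate"
    return "punish" if punishment_available else "withhold"

print(conditional_rule(False, True))   # public-goods-style enforcement
print(conditional_rule(False, False))  # reciprocity-style enforcement
```

The single rule with two enforcement branches is what lets the same strategy family span the different game frameworks.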
Quantum Chemistry in Great Britain: Developing a Mathematical Framework for Quantum Chemistry
NASA Astrophysics Data System (ADS)
Simões, Ana; Gavroglu, Kostas
By 1935 quantum chemistry was already delineated as a distinct sub-discipline due to the contributions of Fritz London, Walter Heitler, Friedrich Hund, Erich Hückel, Robert Mulliken, Linus Pauling, John van Vleck and John Slater. These people are credited with showing that the application of quantum mechanics to the solution of chemical problems was, indeed, possible, especially so after the introduction of a number of new concepts and the adoption of certain approximation methods. And though a number of chemists had started talking of the formation of theoretical or, even, mathematical chemistry, a fully developed mathematical framework of quantum chemistry was still wanting. The work of three persons in particular, John E. Lennard-Jones, Douglas R. Hartree, and Charles Alfred Coulson, has been absolutely crucial in the development of such a framework. In this paper we shall discuss the work of these three researchers, who started their careers in the Cambridge tradition of mathematical physics and who at some point of their careers all became professors of applied mathematics. We shall argue that their work consisted of decisive contributions to the development of such a mathematical framework for quantum chemistry.
Survey of meshless and generalized finite element methods: A unified approach
NASA Astrophysics Data System (ADS)
Babuška, Ivo; Banerjee, Uday; Osborn, John E.
In the past few years meshless methods for numerically solving partial differential equations have come into the focus of interest, especially in the engineering community. This class of methods was essentially stimulated by difficulties related to mesh generation. Mesh generation is delicate in many situations, for instance, when the domain has complicated geometry; when the mesh changes with time, as in crack propagation, and remeshing is required at each time step; when a Lagrangian formulation is employed, especially with nonlinear PDEs. In addition, the need for flexibility in the selection of approximating functions (e.g., the flexibility to use non-polynomial approximating functions) has played a significant role in the development of meshless methods. There are many recent papers, and two books, on meshless methods; most of them are of an engineering character, without any mathematical analysis. In this paper we address meshless methods and the closely related generalized finite element methods for solving linear elliptic equations, using variational principles. We give a unified mathematical theory with proofs, briefly address implementational aspects, present illustrative numerical examples, and provide a list of references to the current literature. The aim of the paper is to provide a survey of a part of this new field, with emphasis on mathematics. We present proofs of essential theorems because we feel these proofs are essential for the understanding of the mathematical aspects of meshless methods, which has approximation theory as a major ingredient. As always, any new field is stimulated by and related to older ideas. This will be visible in our paper.
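A minimal illustration of a meshless discretization (not an example from the paper; a Kansa-type Gaussian RBF collocation sketch with hand-picked width and point count) solves u''(x) = -2 on [0,1] with homogeneous Dirichlet data, whose exact solution is u(x) = x(1-x):

```python
import numpy as np

# Kansa-type RBF collocation sketch (illustrative parameters): solve
# u''(x) = -2 on [0,1] with u(0) = u(1) = 0, exact u = x(1-x),
# using Gaussian radial basis functions and no mesh.
EPS = 0.15                                # RBF width (chosen by hand)

def phi(x, c):                            # Gaussian RBF centered at c
    return np.exp(-((x - c) / EPS) ** 2)

def phi_xx(x, c):                         # its second derivative in x
    r = (x - c) / EPS
    return (4 * r ** 2 - 2) / EPS ** 2 * np.exp(-r ** 2)

n = 11
x = np.linspace(0.0, 1.0, n)              # collocation points = RBF centers
A = np.empty((n, n))
b = np.empty(n)
for i, xi in enumerate(x):
    if i in (0, n - 1):                   # Dirichlet boundary rows
        A[i], b[i] = phi(xi, x), 0.0
    else:                                 # interior PDE rows
        A[i], b[i] = phi_xx(xi, x), -2.0
coef = np.linalg.solve(A, b)

def u(xq):                                # meshless approximate solution
    return phi(xq, x) @ coef

print(u(0.5))                             # exact value is 0.25
```

Note that no element connectivity is ever built: the same code works for scattered points, which is exactly the flexibility that motivates these methods.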
General System Theory: Toward a Conceptual Framework for Science and Technology Education for All.
ERIC Educational Resources Information Center
Chen, David; Stroup, Walter
1993-01-01
Suggests using general system theory as a unifying theoretical framework for science and technology education for all. Five reasons are articulated: the multidisciplinary nature of systems theory, the ability to engage complexity, the capacity to describe system dynamics, the ability to represent the relationship between microlevel and…
Making Learning Personally Meaningful: A New Framework for Relevance Research
ERIC Educational Resources Information Center
Priniski, Stacy J.; Hecht, Cameron A.; Harackiewicz, Judith M.
2018-01-01
Personal relevance goes by many names in the motivation literature, stemming from a number of theoretical frameworks. Currently these lines of research are being conducted in parallel with little synthesis across them, perhaps because there is no unifying definition of the relevance construct within which this research can be situated. In this…
Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence
ERIC Educational Resources Information Center
Phan, Huy Phuong
2008-01-01
The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…
ERIC Educational Resources Information Center
MacLean, Justine; Mulholland, Rosemary; Gray, Shirley; Horrell, Andrew
2015-01-01
Background: Curriculum for Excellence, a new national policy initiative in Scottish Schools, provides a unified curricular framework for children aged 3-18. Within this framework, Physical Education (PE) now forms part of a collective alongside physical activity and sport, subsumed by the newly created curriculum area of "Health and…
ERIC Educational Resources Information Center
Monaghan, John
2013-01-01
This paper offers a framework, an extension of Valsiner's "zone theory", for the analysis of joint student-teacher development over a series of technology-based mathematics lessons. The framework is suitable for developing research studies over a moderately long period of time and considers interrelated student-teacher development as…
Mathematical Frameworks for Diagnostics, Prognostics and Condition Based Maintenance Problems
2008-08-15
REPORT: Mathematical Frameworks for Diagnostics, Prognostics and Condition Based Maintenance Problems (W911NF-05-1-0426)... Approved for Public Release; Distribution Unlimited... parallel and distributed computing environment were researched. In support of the Condition Based Maintenance (CBM) philosophy, a theoretical framework
Cognitively Guided Instruction: An Implementation Case Study of a High Performing School District
ERIC Educational Resources Information Center
Dowdy, William D. B.
2011-01-01
No Child Left Behind legislation developed goals for every student to be proficient in each academic subject by 2014. California's students are far from meeting this goal, especially in mathematics. One Southern Californian school district, renamed Green Valley Unified School District for anonymity, began using Cognitively Guided Instruction…
ERIC Educational Resources Information Center
Fiedler, Daniela; Tröbst, Steffen; Harms, Ute
2017-01-01
Students of all ages face severe conceptual difficulties regarding key aspects of evolution-- the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…
Superalgebra and fermion-boson symmetry
Miyazawa, Hironari
2010-01-01
Fermions and bosons are quite different kinds of particles, but it is possible to unify them in a supermultiplet, by introducing a new mathematical scheme called superalgebra. In this article we discuss the development of the concept of symmetry, starting from the rotational symmetry and finally arriving at this fermion-boson (FB) symmetry. PMID:20228617
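The grading structure behind this unification can be stated compactly (a standard textbook summary, not specific to this article): with bosonic generators B and fermionic generators F, a superalgebra is a Z2-graded algebra obeying

```latex
[B_i, B_j] \subset B, \qquad [B_i, F_j] \subset F, \qquad \{F_i, F_j\} \subset B,
```

so commutators close on the bosonic sector while fermionic generators close under anticommutation, which is what allows fermions and bosons to sit together in one supermultiplet.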
Unifying the Algebra for All Movement
ERIC Educational Resources Information Center
Eddy, Colleen M.; Quebec Fuentes, Sarah; Ward, Elizabeth K.; Parker, Yolanda A.; Cooper, Sandi; Jasper, William A.; Mallam, Winifred A.; Sorto, M. Alejandra; Wilkerson, Trena L.
2015-01-01
There exists an increased focus on school mathematics, especially first-year algebra, due to recent efforts for all students to be college and career ready. In addition, there are calls, policies, and legislation advocating for all students to study algebra epitomized by four rationales of the "Algebra for All" movement. In light of this…
Performance modeling of automated manufacturing systems
NASA Astrophysics Data System (ADS)
Viswanadham, N.; Narahari, Y.
A unified and systematic treatment of modeling methodologies and analysis techniques for the performance evaluation of automated manufacturing systems is presented. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed, and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
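For the queueing paradigm, the classic closed-form M/M/1 results give the flavor of such performance models (standard formulas; the rates below are illustrative and not taken from the book):

```python
# Closed-form performance measures for an M/M/1 queue (Poisson arrivals at
# rate lam, exponential service at rate mu) -- a standard result.
def mm1(lam, mu):
    rho = lam / mu                 # server utilization; need rho < 1 for stability
    if rho >= 1:
        raise ValueError("unstable queue: lambda must be less than mu")
    L = rho / (1 - rho)            # mean number in system
    Lq = rho ** 2 / (1 - rho)      # mean number waiting in queue
    W = 1 / (mu - lam)             # mean time in system
    return rho, L, Lq, W

print(mm1(2.0, 5.0))
```

The formulas are internally consistent: Little's law gives L = lam*W, and the number in service equals the utilization, L = Lq + rho.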
Biology. USMES Beginning "How To" Set.
ERIC Educational Resources Information Center
Agro, Sally; And Others
In this set of two booklets for primary grades, students learn how to make a home for their animals (amphibians, insects, fish, crayfish) and a home for their rodents (hamsters, guinea pigs, gerbils, mice). The major emphasis in all Unified Sciences and Mathematics for Elementary Schools (USMES) units is on open-ended, long-range investigations of…
Design Lab. USMES "How To" Series.
ERIC Educational Resources Information Center
Donahoe, Charles; And Others
The major emphasis in all Unified Sciences and Mathematics for Elementary Schools (USMES) units is on open-ended, long-range investigations of real problems. Since children often design and build things in USMES, 26 "Design Lab" cards provide information on the safe use and simple maintenance of tools. Each card has a large photograph of…
Towards new-generation soil erosion modeling: Building a unified omnivorous model
USDA-ARS?s Scientific Manuscript database
Soil erosion is a global threat to agricultural production, and results in off-site sediment and nutrient losses that negatively impact water and air quality. Models are mathematical equations used to estimate the amount of soil lost from a land area due to the erosive forces of water or wind. Early...
NASA Astrophysics Data System (ADS)
Sousa, Tânia; Domingos, Tiago
2006-11-01
We develop a unified conceptual and mathematical structure for equilibrium econophysics, i.e., the use of concepts and tools of equilibrium thermodynamics in neoclassical microeconomics and vice versa. Within this conceptual structure the results obtained in microeconomic theory are: (1) the definition of irreversibility in economic behavior; (2) the clarification that the Engel curve and the offer curve are not descriptions of real processes dictated by the maximization of utility at constant endowment; (3) the derivation of a relation between elasticities proving that economic elasticities are not all independent; (4) the proof that Giffen goods do not exist in a stable equilibrium; (5) the derivation that ‘economic integrability’ is equivalent to the generalized Le Chatelier principle and (6) the definition of a first order phase transition, i.e., a transition between separate points in the utility function. In thermodynamics the results obtained are: (1) a relation between the non-dimensional isothermal and adiabatic compressibilities and the increase or decrease in the thermodynamic potentials; (2) the distinction between mathematical integrability and optimization behavior and (3) the generalization of the Clapeyron equation.
Theory of Remote Image Formation
NASA Astrophysics Data System (ADS)
Blahut, Richard E.
2004-11-01
In many applications, images, such as ultrasonic or X-ray signals, are recorded and then analyzed with digital or optical processors in order to extract information. Such processing requires the development of algorithms of great precision and sophistication. This book presents a unified treatment of the mathematical methods that underpin the various algorithms used in remote image formation. The author begins with a review of transform and filter theory. He then discusses two- and three-dimensional Fourier transform theory, the ambiguity function, image construction and reconstruction, tomography, baseband surveillance systems, and passive systems (where the signal source might be an earthquake or a galaxy). Information-theoretic methods in image formation are also covered, as are phase errors and phase noise. Throughout the book, practical applications illustrate theoretical concepts, and there are many homework problems. The book is aimed at graduate students of electrical engineering and computer science, and practitioners in industry. Key features: a unified treatment of the mathematical methods that underpin the algorithms used in remote image formation; theoretical concepts illustrated with reference to practical applications; insights into the design parameters of real systems.
Brainerd, C J; Reyna, V F; Howe, M L
2009-10-01
One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.
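The measurement idea here, recovering latent-process probabilities from observed responses, rests on standard hidden-Markov machinery. A minimal forward-algorithm sketch (a generic two-state HMM with made-up parameters, not the authors' trichotomous recall model):

```python
import numpy as np

# Generic two-state hidden Markov model evaluated with the forward
# algorithm. All numbers are invented for illustration.
T = np.array([[0.9, 0.1],          # state-transition probabilities
              [0.2, 0.8]])
E = np.array([[0.8, 0.2],          # emission probabilities P(obs | state)
              [0.3, 0.7]])
pi = np.array([0.5, 0.5])          # initial state distribution

def forward(obs):
    """Likelihood of an observation sequence (obs values index E's columns)."""
    alpha = pi * E[:, obs[0]]      # joint prob. of state and first observation
    for o in obs[1:]:
        alpha = (alpha @ T) * E[:, o]   # propagate, then weight by emission
    return float(alpha.sum())

print(forward([0, 0, 1]))
```

Fitting such a model to recall-task response patterns is what lets the latent processes be measured with low-burden tasks.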
Creating opportunities to learn in mathematics education: a sociocultural perspective
NASA Astrophysics Data System (ADS)
Goos, Merrilyn
2014-09-01
The notion of 'opportunities to learn in mathematics education' is open to interpretation from multiple theoretical perspectives, where the focus may be on cognitive, social or affective dimensions of learning, curriculum and assessment design, issues of equity and access, or the broad policy and political contexts of learning and teaching. In this paper, I conceptualise opportunities to learn from a sociocultural perspective. Beginning with my own research on the learning of students and teachers of mathematics, I sketch out two theoretical frameworks for understanding this learning. One framework extends Valsiner's zone theory of child development, and the other draws on Wenger's ideas about communities of practice. My aim is then to suggest how these two frameworks might help us understand the learning of others who have an interest in mathematics education, such as mathematics teacher educator-researchers and mathematicians. In doing so, I attempt to move towards a synthesis of ideas to inform mathematics education research and development.
The Material Supply Adjustment Process in RAMF-SM, Step 2
2016-06-01
contain. The Risk Assessment and Mitigation Framework for Strategic Materials (RAMF-SM) is a suite of mathematical models and databases used to support the...and computes material shortfalls. Several mathematical models and dozens of databases, encompassing thousands of data items, support the
Tomalia, Donald A; Khanna, Shiv N
2016-02-24
Development of a central paradigm is undoubtedly the single most influential force responsible for advancing Dalton's 19th century atomic/molecular chemistry concepts to the current maturity enjoyed by traditional chemistry. A similar central dogma for guiding and unifying nanoscience has been missing. This review traces the origins, evolution, and current status of such a critical nanoperiodic concept/framework for defining and unifying nanoscience. Based on parallel efforts and a mutual consensus now shared by both chemists and physicists, a nanoperiodic/systematic framework concept has emerged. This concept is based on the well-documented existence of discrete, nanoscale collections of traditional inorganic/organic atoms referred to as hard and soft superatoms (i.e., nanoelement categories). These nanometric entities are widely recognized to exhibit nanoscale atom mimicry features reminiscent of traditional picoscale atoms. All unique superatom/nanoelement physicochemical features are derived from quantized structural control defined by six critical nanoscale design parameters (CNDPs), namely, size, shape, surface chemistry, flexibility/rigidity, architecture, and elemental composition. These CNDPs determine all intrinsic superatom properties, their combining behavior to form stoichiometric nanocompounds/assemblies as well as to exhibit nanoperiodic properties leading to new nanoperiodic rules and predictive Mendeleev-like nanoperiodic tables, and they portend possible extension of these principles to larger quantized building blocks including meta-atoms.
A new view of Baryon symmetric cosmology based on grand unified theories
NASA Technical Reports Server (NTRS)
Stecker, F. W.
1981-01-01
Within the framework of grand unified theories, it is shown how spontaneous CP violation leads to a domain structure in the universe with the domains evolving into separate regions of matter and antimatter excesses. Subsequent to exponential horizon growth, this can result in a universe of matter galaxies and antimatter galaxies. Various astrophysical data appear to favor this form of big bang cosmology. Future direct tests for cosmologically significant antimatter are discussed.
Modelling Trial-by-Trial Changes in the Mismatch Negativity
Lieder, Falk; Daunizeau, Jean; Garrido, Marta I.; Friston, Karl J.; Stephan, Klaas E.
2013-01-01
The mismatch negativity (MMN) is a differential brain response to violations of learned regularities. It has been used to demonstrate that the brain learns the statistical structure of its environment and predicts future sensory inputs. However, the algorithmic nature of these computations and the underlying neurobiological implementation remain controversial. This article introduces a mathematical framework with which competing ideas about the computational quantities indexed by MMN responses can be formalized and tested against single-trial EEG data. This framework was applied to five major theories of the MMN, comparing their ability to explain trial-by-trial changes in MMN amplitude. Three of these theories (predictive coding, model adjustment, and novelty detection) were formalized by linking the MMN to different manifestations of the same computational mechanism: approximate Bayesian inference according to the free-energy principle. We thereby propose a unifying view on three distinct theories of the MMN. The relative plausibility of each theory was assessed against empirical single-trial MMN amplitudes acquired from eight healthy volunteers in a roving oddball experiment. Models based on the free-energy principle provided more plausible explanations of trial-by-trial changes in MMN amplitude than models representing the two more traditional theories (change detection and adaptation). Our results suggest that the MMN reflects approximate Bayesian learning of sensory regularities, and that the MMN-generating process adjusts a probabilistic model of the environment according to prediction errors. PMID:23436989
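The shared computational core of these formalizations, updating a probabilistic model in proportion to prediction error, can be caricatured with a delta rule (a deliberately minimal sketch: the learning rate, binary standard/deviant coding, and error-to-amplitude mapping are my illustrative assumptions, not the paper's full Bayesian schemes):

```python
# Minimal prediction-error ("delta rule") caricature of predictive-coding
# accounts of the MMN. Everything here is a toy assumption for illustration.
def mmn_proxy(stimuli, lr=0.2):
    p = 0.5                        # current belief P(next tone is a standard)
    errors = []
    for s in stimuli:              # s = 1 for standard, 0 for deviant
        errors.append(abs(s - p))  # prediction error ~ single-trial MMN proxy
        p += lr * (s - p)          # update belief toward the observation
    return errors

errs = mmn_proxy([1] * 10 + [0])   # oddball-like run with a deviant at the end
print(errs)
```

Even this toy reproduces the qualitative signatures the paper tests against data: the error shrinks trial by trial as the standard is learned, then jumps on the deviant.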
NASA Astrophysics Data System (ADS)
Ivancevic, Vladimir
2016-07-01
The topic of the review article [1] is the derivation of a multiscale paradigm for the modeling of fibrosis. First, the biological process of physiological and pathological fibrosis, including therapeutic actions, is reviewed. Fibrosis can be a consequence of tissue damage, infections and autoimmune diseases, foreign material, or tumors. Answers to some questions regarding the pathogenesis, progression and possible regression of fibrosis are still lacking. At each scale of observation, different theoretical tools coming from computational, mathematical and physical biology have been proposed. However, a complete framework that takes into account the different mechanisms occurring at different scales is still missing. Therefore, with the main aim of defining a multiscale approach for the modeling of fibrosis, the authors of [1] have presented different top-down and bottom-up approaches that have been developed in the literature. Specifically, their description refers to models for fibrosis diseases based on ordinary and partial differential equations, agents [2], thermostatted kinetic theory [3-5], coarse-grained structures [6-8] and constitutive laws for fibrous collagen networks [9]. A critical analysis is provided for all frameworks discussed in the paper. Open problems and future research directions referring to both the biological and the modeling insight of fibrosis are presented. The paper concludes with the ambitious aim of a multiscale model.
Economics and econophysics in the era of Big Data
NASA Astrophysics Data System (ADS)
Cheong, Siew Ann
2016-12-01
There is an undeniable disconnect between theory-heavy economics and the real world, and some cross-pollination of ideas with econophysics, which is more balanced between data and models, might help economics along the way to become a truly scientific enterprise. With the coming of the era of Big Data, this transformation of economics into a data-driven science is becoming more urgent. In this article, I use the story of Kepler's discovery of his three laws of planetary motion to enlarge the framework of the scientific approach, from one that focuses on experimental sciences, to one that accommodates observational sciences, and further to one that embraces data mining and machine learning. I distinguish between the ontological values of Kepler's Laws vis-à-vis Newton's Laws, and argue that the latter are more fundamental because they are able to explain the former. I then argue that the fundamental laws of economics lie not in mathematical equations, but in models of adaptive economic agents. With this shift in mindset, it becomes possible to think about how interactions between agents can lead to the emergence of multiple stable states and critical transitions, and complex adaptive policies and regulations that might actually work in the real world. Finally, I discuss how Big Data, exploratory agent-based modeling, and predictive agent-based modeling can come together in a unified framework to make economics a true science.
Bridging the Vector Calculus Gap
NASA Astrophysics Data System (ADS)
Dray, Tevian; Manogue, Corinne
2003-05-01
As with Britain and America, mathematicians and physicists are separated from each other by a common language. In a nutshell, mathematics is about functions, but physics is about things. For the last several years, we have led an NSF-supported effort to "bridge the vector calculus gap" between mathematics and physics. The unifying theme we have discovered is to emphasize geometric reasoning, not (just) algebraic computation. In this talk, we will illustrate the language differences between mathematicians and physicists, and how we are trying to reconcile them in the classroom. For further information about the project go to: http://www.physics.orst.edu/bridge
Stochastic differential equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sobczyk, K.
1990-01-01
This book provides a unified treatment of both regular (or random) and Ito stochastic differential equations. It focuses on solution methods, including some developed only recently. Applications are discussed, in particular an insight is given into both the mathematical structure and the most efficient solution methods (analytical as well as numerical). Starting from basic notions and results of the theory of stochastic processes and stochastic calculus (including Ito's stochastic integral), many principal mathematical problems and results related to stochastic differential equations are expounded here for the first time. Applications treated include those relating to road vehicles, earthquake excitations and offshore structures.
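Numerical solution of SDEs of the kind the book treats can be illustrated with the simplest scheme, Euler-Maruyama, applied to an Ornstein-Uhlenbeck process (the scheme is standard; the parameters and test process are my illustrative choices, not examples from the book):

```python
import numpy as np

# Euler-Maruyama for dX = drift(X) dt + diffusion(X) dW, applied to an
# Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW.
rng = np.random.default_rng(0)

def euler_maruyama(drift, diffusion, x0, dt, n_steps, n_paths):
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
        x = x + drift(x) * dt + diffusion(x) * dw
    return x

theta, sigma = 1.0, 1.0
x = euler_maruyama(lambda v: -theta * v,
                   lambda v: sigma * np.ones_like(v),
                   x0=0.0, dt=0.01, n_steps=500, n_paths=5000)
print(x.var())   # should approach the stationary variance sigma^2/(2*theta) = 0.5
```

The ensemble variance converging to sigma^2/(2*theta) is a quick analytical check of the kind the book's analytical-plus-numerical treatment emphasizes.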
Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.
2016-01-01
Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947
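The core summarization step of a partial-area taxonomy, grouping concepts that share the same set of relationship types into an "area," can be sketched in a few lines (concept names and relationship types here are invented, SNOMED-flavored examples; the real OAF operates on full ontology releases):

```python
# Toy sketch of the partial-area taxonomy's grouping step: concepts with
# identical sets of relationship types form one "area." All names below
# are invented for illustration.
concepts = {
    "Appendicitis":    {"finding-site", "associated-morphology"},
    "Gastritis":       {"finding-site", "associated-morphology"},
    "Fracture":        {"finding-site"},
    "Viral hepatitis": {"finding-site", "causative-agent"},
}

def areas(concepts):
    groups = {}
    for name, rels in concepts.items():
        groups.setdefault(frozenset(rels), []).append(name)
    return groups

for rels, names in sorted(areas(concepts).items(), key=lambda kv: sorted(kv[0])):
    print(sorted(rels), "->", sorted(names))
```

The resulting compact summary is what lets a curator spot structurally anomalous concepts without reading the whole ontology.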
A generalized theory of preferential linking
NASA Astrophysics Data System (ADS)
Hu, Haibo; Guo, Jinli; Liu, Xuan; Wang, Xiaofan
2014-12-01
There are diverse mechanisms driving the evolution of social networks. A key open question in understanding their evolution is: how do various preferential linking mechanisms produce networks with different features? In this paper we first empirically study preferential linking phenomena in an evolving online social network, and find and validate a linear preference. We propose an analyzable model which captures the real growth process of the network and reveals the underlying mechanism dominating its evolution. Furthermore, based on preferential linking we propose a generalized model reproducing the evolution of online social networks, and present unified analytical results describing network characteristics for 27 preference scenarios. We study the mathematical structure of degree distributions and find that within the framework of preferential linking analytical degree distributions can only be combinations of finitely many kinds of functions, which are related to rational, logarithmic and inverse tangent functions, and that extremely complex network structure can emerge even for very simple sublinear preferential linking. This work not only provides a verifiable origin for the emergence of various network characteristics in social networks, but also bridges individuals' micro-level behaviors and the global organization of social networks.
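The linear preference the authors validate is the mechanism behind Barabási-Albert growth, which is easy to simulate (a generic sketch, not the paper's generalized model; the edge count m and the two-node seed graph are arbitrary choices):

```python
import random

random.seed(1)

# Barabási-Albert-style growth with linear preferential attachment: each new
# node links to m distinct existing nodes with probability proportional to
# their current degree.
def grow(n_nodes, m=2):
    degree = {0: 1, 1: 1}          # seed graph: a single edge 0-1
    targets = [0, 1]               # node i appears degree[i] times here, so a
    for new in range(2, n_nodes):  # uniform pick is degree-proportional
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))
        degree[new] = m
        for t in chosen:
            degree[t] += 1
            targets += [t, new]
    return degree

deg = grow(200)
print(max(deg.values()))           # hubs emerge, typically among early nodes
```

The repeated-targets list trick implements degree-proportional sampling without ever computing the degree distribution explicitly.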
Asymptotic theory of neutral stability of the Couette flow of a vibrationally excited gas
NASA Astrophysics Data System (ADS)
Grigor'ev, Yu. N.; Ershov, I. V.
2017-01-01
An asymptotic theory of the neutral stability curve for a supersonic plane Couette flow of a vibrationally excited gas is developed. The initial mathematical model consists of equations of two-temperature viscous gas dynamics, which are used to derive a spectral problem for a linear system of eighth-order ordinary differential equations within the framework of the classical linear stability theory. Unified transformations of the system for all shear flows are performed in accordance with the classical Lin scheme. The problem is reduced to an algebraic secular equation with separation into the "inviscid" and "viscous" parts, which is solved numerically. It is shown that the thus-calculated neutral stability curves agree well with the previously obtained results of the direct numerical solution of the original spectral problem. In particular, the critical Reynolds number increases with excitation enhancement, and the neutral stability curve is shifted toward the domain of higher wave numbers. This is also confirmed by means of solving an asymptotic equation for the critical Reynolds number at the Mach number M ≤ 4.
NASA Astrophysics Data System (ADS)
Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia
2014-02-01
How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors observed. A linear theory, by contrast, which ignores the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
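The statistical-mechanics emulation of social action described above can be sketched with a Curie-Weiss-style mean-field equation. The functional form, parameter names, and values below are illustrative assumptions, not the paper's exact model:

```python
import math

def integration_quantifier(h, J, iters=500):
    """Fixed-point solution of m = tanh(J*m + h): a mean-field sketch of an
    integration quantifier m driven by an external field h (a stand-in for
    immigrant density) with imitative interaction strength J.
    J = 0 recovers a non-interacting theory, nearly linear for small h;
    J > 0 introduces the kind of non-linearity the abstract attributes to
    social action (interaction amplifies m at low h and saturates it)."""
    m = 0.5
    for _ in range(iters):
        m = math.tanh(J * m + h)
    return m

# interaction raises the quantifier relative to the J = 0 baseline
with_interaction = integration_quantifier(0.2, 0.8)
baseline = integration_quantifier(0.2, 0.0)
```

Comparing the two values shows how a purely linear (non-interacting) theory would underestimate the quantifier at low density, in the spirit of the 30% discrepancy the abstract reports.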
Measuring the shapes of macromolecules – and why it matters
Li, Jie; Mach, Paul; Koehl, Patrice
2013-01-01
The molecular basis of life rests on the activity of biological macromolecules, mostly nucleic acids and proteins. A perhaps surprising finding that crystallized over the last handful of decades is that geometric reasoning plays a major role in our attempt to understand these activities. In this paper, we address this connection between geometry and biology, focusing on methods for measuring and characterizing the shapes of macromolecules. We briefly review existing numerical and analytical approaches that solve these problems. We cover in more detail our own work in this field, focusing on the alpha shape theory, as it provides a unifying mathematical framework that enables the analytical calculation of the surface area and volume of a macromolecule represented as a union of balls, the detection of pockets and cavities in the molecule, and the quantification of contacts between the atomic balls. We have shown that each of these quantities can be related to physical properties of the molecule under study and ultimately provides insight into its activity. We conclude with a brief description of new challenges for the alpha shape theory in modern structural biology. PMID:24688748
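The inclusion-exclusion idea underlying the analytical volume of a union of balls can be illustrated in its simplest case, two overlapping balls, using the standard closed-form volume of the lens-shaped intersection. This is only the two-ball base case; the alpha shape machinery the paper develops extends inclusion-exclusion to arbitrarily many balls:

```python
import math

def union_volume_two_balls(r1, r2, d):
    """Volume of the union of two balls with radii r1, r2 whose centers are
    a distance d apart: V1 + V2 minus the volume of the lens-shaped
    intersection (standard spherical-cap/lens formula)."""
    v1 = 4.0 / 3.0 * math.pi * r1 ** 3
    v2 = 4.0 / 3.0 * math.pi * r2 ** 3
    if d >= r1 + r2:           # disjoint balls: nothing to subtract
        return v1 + v2
    if d <= abs(r1 - r2):      # one ball contained in the other
        return max(v1, v2)
    lens = (math.pi * (r1 + r2 - d) ** 2
            * (d ** 2 + 2.0 * d * (r1 + r2) - 3.0 * (r1 - r2) ** 2)
            / (12.0 * d))
    return v1 + v2 - lens
```

As a sanity check, two unit balls at distance 2 are tangent (union volume 8π/3), while at vanishing distance the union collapses to a single ball (4π/3).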
Celedonio Aguirre-Bravo; Carlos Rodriguez Franco
1999-01-01
The general objective of this Symposium was to build on the best science and technology available to assure that the data and information produced in future inventory and monitoring programs are comparable, quality assured, available, and adequate for their intended purposes, thereby providing a reliable framework for characterization, assessment, and management of...
ERIC Educational Resources Information Center
Molina, Otilia Alejandro; Ratté, Sylvie
2017-01-01
This research introduces a method to construct a unified representation of teachers' and students' perspectives based on the actionable knowledge discovery (AKD) and delivery framework. The representation is constructed using two models: one obtained from student evaluations and the other obtained from teachers' reflections about their teaching…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
ERIC Educational Resources Information Center
National Center for Education Statistics, 2011
2011-01-01
Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…
Teaching Introductory Business Statistics Using the DCOVA Framework
ERIC Educational Resources Information Center
Levine, David M.; Stephan, David F.
2011-01-01
Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…
Evaluating Health Information Systems Using Ontologies
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
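The UVON idea of organizing, unifying, and aggregating quality attributes into a tree-style ontology can be sketched as follows. This is a heavily simplified illustration under stated assumptions: attributes are given as general-to-specific paths with an already-unified vocabulary, whereas the real method resolves terminology via semantic networks, which is omitted here. All names are hypothetical.

```python
from collections import defaultdict

def build_attribute_tree(systems):
    """Aggregate quality attributes from several systems into one tree.
    Each attribute is a path from general to specific, e.g.
    ("usability", "learnability"); each tree node records how many
    systems mention it or any of its descendants."""
    members = defaultdict(set)
    for name, attrs in systems.items():
        for path in attrs:
            for depth in range(1, len(path) + 1):
                members[path[:depth]].add(name)
    return {node: len(s) for node, s in members.items()}

def pick_aspects(tree, max_aspects):
    """Select the most widely shared aspects, respecting a cap on the
    number of aspects (e.g. to keep questionnaires short); ties prefer
    the more specific node."""
    ranked = sorted(tree, key=lambda n: (-tree[n], -len(n)))
    return ranked[:max_aspects]
```

A cross-case aspect like ("usability",) would then rank above attributes declared by only a single application, mirroring how UVON supports evaluation across a heterogeneous set of systems.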
TIMSS Advanced 2015 Assessment Frameworks
ERIC Educational Resources Information Center
Mullis, Ina V. S., Ed.; Martin, Michael O., Ed.
2014-01-01
The "TIMSS Advanced 2015 Assessment Frameworks" provides the foundation for the two international assessments to take place as part of the International Association for the Evaluation of Educational Achievement's TIMSS (Trends in International Mathematics and Science Study) Advanced 2015--Advanced Mathematics and Physics. Chapter 1 (Liv…
The information geometry of chaos
NASA Astrophysics Data System (ADS)
Cafaro, Carlo
2008-10-01
In this Thesis, we propose a new theoretical information-geometric framework (IGAC, Information Geometrodynamical Approach to Chaos) suitable to characterize chaotic dynamical behavior of arbitrary complex systems. First, the problem being investigated is defined; its motivation and relevance are discussed. The basic tools of information physics and the relevant mathematical tools employed in this work are introduced. The basic aspects of Entropic Dynamics (ED) are reviewed. ED is an information-constrained dynamics developed by Ariel Caticha to investigate the possibility that laws of physics---either classical or quantum---may emerge as macroscopic manifestations of underlying microscopic statistical structures. ED is of primary importance in our IGAC. The notion of chaos in classical and quantum physics is introduced. Special focus is devoted to the conventional Riemannian geometrodynamical approach to chaos (Jacobi geometrodynamics) and to the Zurek-Paz quantum chaos criterion of linear entropy growth. After presenting this background material, we show that the ED formalism is not purely an abstract mathematical framework, but is indeed a general theoretical scheme from which conventional Newtonian dynamics is obtained as a special limiting case. The major elements of our IGAC and the novel notion of information geometrodynamical entropy (IGE) are introduced by studying two "toy models". To illustrate the potential power of our IGAC, one application is presented. An information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth is suggested. Finally, concluding remarks emphasizing strengths and weak points of our approach are presented and possible further research directions are addressed. At this stage of its development, IGAC remains an ambitious unifying information-geometric theoretical construct for the study of chaotic dynamics with several unsolved problems. 
However, based on our recent findings, we believe it already provides an interesting, innovative and potentially powerful way to study and understand the very important and challenging problems of classical and quantum chaos.
ERIC Educational Resources Information Center
Lazzaro, Christopher; Jones, Lee; Webb, David C.; Grover, Ryan; Di Giacomo, F. Tony; Marino, Katherine Adele
2016-01-01
This report will determine to what degree the AP Physics 1 and 2 and AP Calculus AB and BC frameworks are aligned with the Trends in International Mathematics and Science Study (TIMSS) Advanced Physics and Mathematics frameworks. This will enable an exploration of any differences in content coverage and levels of complexity, and will set the stage…
ERIC Educational Resources Information Center
Zandieh, Michelle; Rasmussen, Chris
2010-01-01
The purpose of this paper is to further the notion of defining as a mathematical activity by elaborating a framework that structures the role of defining in student progress from informal to more formal ways of reasoning. The framework is the result of a retrospective account of a significant learning experience that occurred in an undergraduate…
Descriptive Geometry in Educational Process of Technical University in Russia Today
ERIC Educational Resources Information Center
Voronina, Marianna V.; Tretyakova, Zlata O.; Moroz, Olga N.; Folomkin, Andrey I.
2016-01-01
The relevance of the investigated problem is caused by the need for monitoring the impact of the Unified State Examination (USE) on the level of mathematical culture and the level of geometric literacy of applicants and students of modern engineering universities of Russia. The need to determine the position of Descriptive Geometry in the…
The Metaplectic Sampling of Quantum Engineering
NASA Astrophysics Data System (ADS)
Schempp, Walter J.
2010-12-01
Due to photonic visualization, quantum physics is not restricted to the microworld. Starting off with synthetic aperture radar, the paper provides a unified approach to coherent atom optics, clinical magnetic resonance tomography and the bacterial protein dynamics of structural microbiology. Its mathematical base is harmonic analysis on the three-dimensional Heisenberg Lie group with associated nilpotent Heisenberg algebra Lie(N).
A Model of E-Learning Uptake and Continued Use in Higher Education Institutions
ERIC Educational Resources Information Center
Pinpathomrat, Nakarin; Gilbert, Lester; Wills, Gary B.
2013-01-01
This research investigates the factors that affect students' take-up and continued use of E-learning. A mathematical model was constructed by applying three grounded theories: the Unified Theory of Acceptance and Use of Technology, Keller's ARCS model, and Expectancy Disconfirmation Theory. The learning preference factor was included in the model.…
Arithmetic and Algebra in the Schools: Recommendations for a Return to Reality.
ERIC Educational Resources Information Center
Ailles, Douglas S.; And Others
The aim of this report is to suggest aspects of mathematics education that should be incorporated into curricula rather than to outline specific courses of study. General recommendations are made regarding curriculum, instructional methods, and textbooks. The suggestion that graphs and relations be used as a unifying theme is followed by…
Beyond Containment and Deterrence: A Security Framework for Europe in the 21st Century
1990-04-02
decades of the 21st Century in Europe, and examines... Poland, and parts of France and Russia, but it did not truly unify Germany. Bismarck unified only parts of Germany which he could constrain under... Europe, Central Europe, the Balkans, and the Soviet Union. Central Europe includes West Germany, East Germany, Austria, Czechoslovakia, Poland, and
Towards a Unified Description of the Electroweak Nuclear Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benhar, Omar; Lovato, Alessandro
2015-06-01
We briefly review the growing efforts to set up a unified framework for the description of neutrino interactions with atomic nuclei and nuclear matter, applicable in the broad kinematical region corresponding to neutrino energies ranging between a few MeV and a few GeV. The emerging picture suggests that the formalism of nuclear many-body theory (NMBT) can be exploited to obtain the neutrino-nucleus cross-sections needed both for the interpretation of oscillation signals and for simulations of neutrino transport in compact stars.
A theoretical formulation of wave-vortex interactions
NASA Technical Reports Server (NTRS)
Wu, J. Z.; Wu, J. M.
1989-01-01
A unified theoretical formulation for wave-vortex interaction, designated the '(omega, Pi) framework,' is presented. Based on the orthogonal decomposition of fluid dynamic interactions, the formulation can be used to study a variety of problems, including the interaction of a longitudinal (acoustic) wave and/or transverse (vortical) wave with a main vortex flow. Moreover, the formulation permits a unified treatment of wave-vortex interaction at various approximate levels, where the normal 'piston' process and tangential 'rubbing' process can be approximated differently.
Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A.
2018-01-01
The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, both the average reference (AR) and the reference electrode standardization technique (REST) are two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posterior estimation. We find that AR and REST are very particular cases of this unified framework: AR results from biophysically non-informative prior; while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise to signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects which participated in the Cuban Human Brain Mapping Project. Generated artificial EEGs—with a known ground truth, show that relative error in estimating the EEG potentials at infinity is lowest for rREST. It also reveals that realistic volume conductor models improve the performances of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives the results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the “oracle” choice based on the ground truth. When evaluated with the real 89 resting state EEGs, rREST consistently yields the lowest GCV. 
This study provides a novel perspective to the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance. PMID:29780302
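The classical average reference (AR) and its regularized variant described in this abstract can be sketched in a few lines. The shrinkage form below is an illustrative stand-in for the paper's rAR estimator, not its exact formulation; only the lam = 0 case (plain AR) is standard:

```python
def rereference_average(v, lam=0.0):
    """Re-reference multichannel EEG to an (optionally regularized) average.
    v is a list of channels, each a list of samples, recorded against an
    arbitrary common reference. With lam = 0 this is the classical average
    reference: subtract the across-channel mean at every sample. With
    lam > 0 (a noise-to-signal variance ratio, as in the abstract) the
    subtracted mean is shrunk toward zero -- an illustrative sketch of
    regularization, not the paper's exact rAR estimator."""
    n_ch = len(v)
    n_s = len(v[0])
    out = []
    for ch in v:
        row = []
        for t in range(n_s):
            mean_t = sum(v[c][t] for c in range(n_ch)) / n_ch
            row.append(ch[t] - mean_t / (1.0 + lam))
        out.append(row)
    return out
```

With lam = 0 the re-referenced channels sum to zero at every sample, the defining property of AR; as lam grows, the data are left closer to the original reference, which is the denoising trade-off the regularization parameter controls.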
In quest of a systematic framework for unifying and defining nanoscience
2009-01-01
This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a “central paradigm” (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a Nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles which are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core–shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) exhibited well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties). 
We propose this perspective as a modest first step toward more clearly defining synthetic nanochemistry as well as providing a systematic framework for unifying nanoscience. With further progress, one should anticipate the evolution of future nanoperiodic table(s) suitable for predicting important risk/benefit boundaries in the field of nanoscience. Electronic supplementary material The online version of this article (doi:10.1007/s11051-009-9632-z) contains supplementary material, which is available to authorized users. PMID:21170133
Mathematics education for social justice
NASA Astrophysics Data System (ADS)
Suhendra
2016-02-01
Mathematics is often perceived as a difficult subject, with many students failing to understand why they learn it. This situation has been further aggravated by the teaching and learning processes used, which are mechanistic and do not consider students' needs. Learning mathematics tends to be treated as merely a compulsory subject whose classes all students have to attend. A social justice framework facilitates individuals or groups as a whole and provides equitable approaches to achieving equitable outcomes by recognising disadvantage. Applying social justice principles in an educational context concerns how teachers treat their students: it dictates that all students have the right to equal treatment regardless of their background, and it integrates social justice issues with the content of the subject so that students internalise the principles of social justice alongside the concepts of the subject. The study examined the usefulness of implementing the social justice framework as a means of improving the quality of mathematics teaching in Indonesia, involving four teacher-participants and their mathematics classes. The study used action research as its methodology, in which the teachers implemented and evaluated their use of the social justice framework in their teaching. The data were collected using multiple research methods, while analysis and interpretation of the data were carried out throughout the study. The findings of the study indicated that there were a number of challenges related to the implementation of the social justice framework. The findings also indicated that the teachers were provided with a comprehensive guide that they could draw on to make decisions about how to improve their lessons. The interactions among students and between the teachers and the students improved, and students became more involved in the teaching and learning process.
Using the social justice framework helped the teachers make mathematics more relevant to students. This increased relevance led to greater student engagement in the teaching and learning process and made mathematics more accessible to all students. Additionally, the findings have the potential to contribute to efforts to reform mathematics teaching in Indonesia. The results could inform policy makers and professional development providers about how a social justice framework might contribute to educational reform in Indonesia.
Knowledge of Curriculum Embedded Mathematics: Exploring a Critical Domain of Teaching
ERIC Educational Resources Information Center
Remillard, Janine; Kim, Ok-Kyeong
2017-01-01
This paper proposes a framework for identifying the mathematical knowledge teachers activate when using curriculum resources. We use the term "knowledge of curriculum embedded mathematics" (KCEM) to refer to the mathematics knowledge activated by teachers when reading and interpreting mathematical tasks, instructional designs, and…
Mathematics Framework for California Public Schools, Kindergarten Through Grade Twelve.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
This report, prepared by a statewide Mathematics Advisory Committee, revises the framework in the Second Strands Report of 1972, expanding it to encompass kindergarten through grade 12. Strands for kindergarten through grade 8 are: arithmetic, numbers, and operations; geometry; measurement, problem solving/ applications; probability and…
Assessing Mathematics: 1. APU Framework and Modes of Assessment.
ERIC Educational Resources Information Center
Foxman, Derek; Mitchell, Peter
1983-01-01
The "what" and "how" of the Assessment of Performance Unit surveys of the mathematics performance of 11- and 15-year-olds in England, Wales, and Northern Ireland are explained. The framework and forms of assessment are detailed, and the experience of the testers noted. (MNS)
Adapting Technological Pedagogical Content Knowledge Framework to Teach Mathematics
ERIC Educational Resources Information Center
Getenet, Seyum Tekeher
2017-01-01
The technological pedagogical content knowledge framework is increasingly in use by educational technology researcher as a generic description of the knowledge requirements for teachers using technology in all subjects. This study describes the development of a mathematics specific variety of the technological pedagogical content knowledge…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential to achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.
A unifying framework for quantifying the nature of animal interactions.
Potts, Jonathan R; Mokross, Karl; Lewis, Mark A
2014-07-06
Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
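The modelling idea at the core of the step selection framework, weighting candidate steps by a movement kernel times a resource-dependent selection term and normalising, can be sketched in a few lines. The exponential kernel shape, the coefficient `beta`, and the candidate steps below are illustrative assumptions, not values from the study:

```python
import math

def step_probabilities(candidate_steps, beta=1.0, mean_step=2.0):
    """Toy step selection function: P(step) proportional to
    (movement kernel) x (resource weighting).  Illustrative only;
    beta and the exponential step-length kernel are assumptions."""
    weights = []
    for length, resource in candidate_steps:
        movement = math.exp(-length / mean_step)   # step-length kernel
        selection = math.exp(beta * resource)      # resource selection weight
        weights.append(movement * selection)
    total = sum(weights)
    return [w / total for w in weights]

# Three candidate steps: (step length, local resource quality)
steps = [(1.0, 0.9), (3.0, 0.2), (0.5, 0.5)]
probs = step_probabilities(steps)
print(probs)  # probabilities sum to 1; short, resource-rich steps score higher
```

Interaction terms (e.g. avoidance of conspecifics' territories) would enter as additional multiplicative weights in the same product.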
A unified framework for image retrieval using keyword and visual features.
Jing, Feng; Li, Mingling; Zhang, Hong-Jiang; Zhang, Bo
2005-07-01
In this paper, a unified image retrieval framework based on both keyword annotations and visual features is proposed. In this framework, a set of statistical models are built based on visual features of a small set of manually labeled images to represent semantic concepts and used to propagate keywords to other unlabeled images. These models are updated periodically when more images implicitly labeled by users become available through relevance feedback. In this sense, the keyword models serve the function of accumulation and memorization of knowledge learned from user-provided relevance feedback. Furthermore, two sets of effective and efficient similarity measures and relevance feedback schemes are proposed for query by keyword scenario and query by image example scenario, respectively. Keyword models are combined with visual features in these schemes. In particular, a new, entropy-based active learning strategy is introduced to improve the efficiency of relevance feedback for query by keyword. Furthermore, a new algorithm is proposed to estimate the keyword features of the search concept for query by image example. It is shown to be more appropriate than two existing relevance feedback algorithms. Experimental results demonstrate the effectiveness of the proposed framework.
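The entropy-based active learning strategy mentioned above reduces, in its simplest form, to querying the unlabeled image whose predicted keyword distribution is most uncertain. A minimal sketch with made-up probabilities (the paper's actual keyword models and visual features are not reproduced here):

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution (natural log)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def most_informative(predictions):
    """Index of the unlabeled item whose predicted keyword distribution
    has the highest entropy, i.e. the most informative one to label."""
    return max(range(len(predictions)), key=lambda i: entropy(predictions[i]))

preds = [
    [0.90, 0.05, 0.05],  # model is confident
    [0.34, 0.33, 0.33],  # model is maximally uncertain
    [0.60, 0.30, 0.10],
]
print(most_informative(preds))  # -> 1
```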
ERIC Educational Resources Information Center
Adams, Thomasenia Lott
2001-01-01
Focuses on the National Council of Teachers of Mathematics 2000 process-oriented standards of problem solving, reasoning and proof, communication, connections, and representation as providing a framework for using the multiple intelligences that children bring to mathematics learning. Presents ideas for mathematics lessons and activities to…
The CE/SE Method: a CFD Framework for the Challenges of the New Millennium
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung; Yu, Sheng-Tao
2001-01-01
The space-time conservation element and solution element (CE/SE) method, which was originated and is continuously being developed at NASA Glenn Research Center, is a high-resolution, genuinely multidimensional and unstructured-mesh compatible numerical method for solving conservation laws. Since its inception in 1991, the CE/SE method has been used to obtain highly accurate numerical solutions for 1D, 2D and 3D flow problems involving shocks, contact discontinuities, acoustic waves, vortices, shock/acoustic waves/vortices interactions, shock/boundary layers interactions and chemical reactions. Without the aid of preconditioning or other special techniques, it has been applied to both steady and unsteady flows with speeds ranging from Mach number = 0.00288 to 10. In addition, the method has unique features that allow for (i) the use of very simple non-reflecting boundary conditions, and (ii) a unified wall boundary treatment for viscous and inviscid flows. The CE/SE method was developed with the conviction that, with a solid foundation in physics, a robust, coherent and accurate numerical framework can be built without involving overly complex mathematics. As a result, the method was constructed using a set of design principles that facilitate simplicity, robustness and accuracy. The most important among them are: (i) enforcing both local and global flux conservation in space and time, with flux evaluation at an interface being an integral part of the solution procedure and requiring no interpolation or extrapolation; (ii) unifying space and time and treating them as a single entity; and (iii) requiring that a numerical scheme be built from a nondissipative core scheme such that the numerical dissipation can be effectively controlled and, as a result, will not overwhelm the physical dissipation. Part I of the workshop will be devoted to a discussion of these principles along with a description of how the 1D, 2D and 3D CE/SE schemes are constructed.
In Part II, various applications of the CE/SE method, particularly those involving chemical reactions and acoustics, will be presented. The workshop will be concluded with a sketch of the future research directions.
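Design principle (i), flux conservation enforced by sharing each interface flux between the two neighbouring cells, can be illustrated with a generic first-order finite-volume update. This is not the CE/SE scheme itself, only a minimal demonstration of the conservation idea it is built on:

```python
def upwind_step(u, c, dt, dx):
    """One flux-conservative update for 1D advection u_t + c u_x = 0
    (periodic boundary, c > 0).  Plain first-order upwind, not CE/SE:
    each interface flux is shared by its two neighbouring cells, so the
    cell-sum is conserved by construction (telescoping fluxes)."""
    n = len(u)
    flux = [c * u[i - 1] for i in range(n)]  # flux at left face of cell i
    return [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

u = [0.0, 1.0, 2.0, 1.0, 0.0]
u_new = upwind_step(u, c=1.0, dt=0.1, dx=1.0)
print(sum(u_new))  # equals sum(u): global conservation
```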
NASA Astrophysics Data System (ADS)
Daher, Wajeeh M.
2014-04-01
Mathematical learning and teaching are increasingly seen as a multimodal experience involved in cultural and social semiotic registers and means, and as such social-cultural semiotic analysis is expected to shed light on learning and teaching processes occurring in the mathematics classroom. In this research, three social-cultural semiotic frameworks were utilised to analyse elementary school students' learning of a geometric relation: the semiotic bundle, the space of action, production and communication and the theoretical framework of attention, awareness and objectification. Educational mathematical situations are described, in addition to semiotic sets, registers and means emerging in the different mathematical situations and that are relevant to the three social-cultural semiotic frameworks which the current research utilizes. Further, the students, as a consequence of (1) their multimodal experience, (2) their connecting between the different mathematical situations and semiotic registers, and (3) the teacher's questions and tasks, could objectify the geometric relation between the lengths of the triangle's edges.
Four Factors to Consider in Helping Low Achievers in Mathematics
ERIC Educational Resources Information Center
Leong, Yew Hoong; Yap, Sook Fwe; Tay, Eng Guan
2013-01-01
In this paper, we propose and describe in some detail a framework for helping low achievers in mathematics that attends to the following areas: Mathematical content resources, Problem Solving disposition, Feelings towards the learning of mathematics, and Study habits.
40 CFR 300.105 - General organization concepts.
Code of Federal Regulations, 2010 CFR
2010-07-01
... capabilities. (b) Three fundamental kinds of activities are performed pursuant to the NCP: (1) Preparedness....205(c). (d) The basic framework for the response management structure is a system (e.g., a unified...
A unified and efficient framework for court-net sports video analysis using 3D camera modeling
NASA Astrophysics Data System (ADS)
Han, Jungong; de With, Peter H. N.
2007-01-01
The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as, smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable for more than one sports type to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes, which are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players). The algorithm can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real-speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process, and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms; namely, a dominant and recessive attributes algorithm borrowed from genetics, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
ERIC Educational Resources Information Center
Edwards, Ann R.; Beattie, Rachel L.
2016-01-01
This paper focuses on two research-based frameworks that inform the design of instruction and promote student success in accelerated, developmental mathematics pathways. These are Learning Opportunities--productive struggle on challenging and relevant tasks, deliberate practice, and explicit connections, and Productive Persistence--promoting…
Making Shifts toward Proficiency
ERIC Educational Resources Information Center
McGatha, Maggie B.; Bay-Williams, Jennifer M.
2013-01-01
The Leading for Mathematical Proficiency (LMP) Framework (Bay-Williams et al.) has three components: (1) The Standards for Mathematical Practice; (2) Shifts in classroom practice; and (3) Teaching skills. This article briefly describes each component of the LMP framework and then focuses more in depth on the second component, the shifts in…
The Conceptual Framework for the Development of a Mathematics Performance Assessment Instrument.
ERIC Educational Resources Information Center
Lane, Suzanne
1993-01-01
A conceptual framework is presented for the development of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument (QCAI) that focuses on the ability of middle-school students to problem solve, reason, and communicate mathematically. The instrument will provide programatic rather than…
Ratio Analysis: Where Investments Meet Mathematics.
ERIC Educational Resources Information Center
Barton, Susan D.; Woodbury, Denise
2002-01-01
Discusses ratio analysis by which investments may be evaluated. Requires the use of fundamental mathematics, problem solving, and a comparison of the mathematical results within the framework of industry. (Author/NB)
Using Technology to Unify Geometric Theorems about the Power of a Point
ERIC Educational Resources Information Center
Contreras, Jose N.
2011-01-01
In this article, I describe a classroom investigation in which a group of prospective secondary mathematics teachers discovered theorems related to the power of a point using "The Geometer's Sketchpad" (GSP). The power of a point is defined as follows: Let "P" be a fixed point coplanar with a circle. If line "PA" is a secant line that intersects…
ERIC Educational Resources Information Center
National Center for Education Statistics, 2013
2013-01-01
The National Assessment of Educational Progress (NAEP), in partnership with the National Assessment Governing Board and the Council of the Great City Schools (CGCS), created the Trial Urban District Assessment (TUDA) in 2002 to support the improvement of student achievement in the nation's large urban districts. NAEP TUDA results in mathematics…
ERIC Educational Resources Information Center
National Center for Education Statistics, 2011
2011-01-01
This one-page report presents overall results, achievement level percentages and average score results, scores at selected percentiles, average scores for district and large cities, results for student groups (school race, gender, and eligibility for National School Lunch Program) in 2011, and score gaps for student groups. In 2011, the average…
Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages
2011-01-01
… important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth … describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages. … As a starting point, we used the previously developed iNJ661v model to represent the metabolic … (figure caption fragment: "Fig. 1 Mathematical framework: a set of coupled models used to …")
Middle School Mathematics Teachers Panel Perspectives of Instructional Practices
ERIC Educational Resources Information Center
Ziegler, Cindy
2017-01-01
In a local middle school, students were not meeting standards on the state mathematics tests. The purpose of this qualitative study was to explore mathematics teachers' perspectives on effective mathematics instruction vis-a-vis the principles of the National Council of Teachers of Mathematics (NCTM). Within this framework, the 6 principles in the…
MOOSE: A PARALLEL COMPUTATIONAL FRAMEWORK FOR COUPLED SYSTEMS OF NONLINEAR EQUATIONS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Hansen; C. Newman; D. Gaston
Systems of coupled, nonlinear partial differential equations often arise in simulation of nuclear processes. MOOSE: Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at solving these systems is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is instead founded on mathematics based on Jacobian-free Newton Krylov (JFNK). Utilizing the mathematical structure present in JFNK, physics are modularized into “Kernels” allowing for rapid production of new simulation tools. In addition, systems are solved fully coupled and fully implicit employing physics based preconditioning allowing for a large amount of flexibility even with large variance in time scales. Background on the mathematics, an inspection of the structure of MOOSE and several representative solutions from applications built on the framework are presented.
MOOSE: A parallel computational framework for coupled systems of nonlinear equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Derek Gaston; Chris Newman; Glen Hansen
Systems of coupled, nonlinear partial differential equations (PDEs) often arise in simulation of nuclear processes. MOOSE: Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at the solution of such systems, is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is instead founded on the mathematical principle of Jacobian-free Newton-Krylov (JFNK) solution methods. Utilizing the mathematical structure present in JFNK, physics expressions are modularized into "Kernels," allowing for rapid production of new simulation tools. In addition, systems are solved implicitly and fully coupled, employing physics based preconditioning, which provides great flexibility even with large variance in time scales. A summary of the mathematics, an overview of the structure of MOOSE, and several representative solutions from applications built on the framework are presented.
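The JFNK strategy both abstracts describe, Newton's method whose inner Krylov solve needs only Jacobian-vector products approximated by finite differences, can be sketched with SciPy's reference implementation. The two-equation system below is an arbitrary toy example for illustration, not a MOOSE "Kernel":

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(x):
    """Two coupled nonlinear equations; root at x = [1, 1]."""
    return np.array([
        x[0] ** 2 + x[1] - 2.0,
        x[0] + x[1] ** 2 - 2.0,
    ])

# newton_krylov never forms the Jacobian explicitly: Jacobian-vector
# products are approximated by finite differences inside the inner
# Krylov solver, which is the defining feature of JFNK.
sol = newton_krylov(residual, np.array([0.5, 0.5]), f_tol=1e-10)
print(sol)  # close to [1, 1]
```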
Motivation and engagement in mathematics: a qualitative framework for teacher-student interactions
NASA Astrophysics Data System (ADS)
Durksen, Tracy L.; Way, Jennifer; Bobis, Janette; Anderson, Judy; Skilling, Karen; Martin, Andrew J.
2017-02-01
We started with a classic research question (How do teachers motivate and engage middle year students in mathematics?) that is solidly underpinned and guided by an integration of two theoretical and multidimensional models. In particular, the current study illustrates how theory is important for guiding qualitative analytical approaches to motivation and engagement in mathematics. With little research on how teachers of mathematics are able to maintain high levels of student motivation and engagement, we focused on developing a qualitative framework that highlights the influence of teacher-student interactions. Participants were six teachers (upper primary and secondary) that taught students with higher-than-average levels of motivation and engagement in mathematics. Data sources included one video-recorded lesson and associated transcripts from pre- and post-lesson interviews with each teacher. Overall, effective classroom organisation stood out as a priority when promoting motivation and engagement in mathematics. Results on classroom organisation revealed four key indicators within teacher-student interactions deemed important for motivation and engagement in mathematics—confidence, climate, contact, and connection. Since much of the effect of teachers on student learning relies on interactions, and given the universal trend of declining mathematical performance during the middle years of schooling, future research and intervention studies might be assisted by our qualitative framework.
Probabilistic delay differential equation modeling of event-related potentials.
Ostwald, Dirk; Starke, Ludger
2016-08-01
"Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach. Copyright © 2016 Elsevier Inc. All rights reserved.
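The latent dynamics in such models are delay differential equations. As a generic illustration of how a delayed state enters the integration (this is not the actual ERP-DCM neural mass model; the right-hand side and parameters are placeholders), a forward-Euler DDE integrator with a history buffer looks like this:

```python
def integrate_dde(f, history, tau, dt, steps):
    """Forward-Euler integration of the scalar delay differential
    equation dx/dt = f(x(t), x(t - tau)).  The history list must cover
    the interval [-tau, 0] on the same grid (len >= tau/dt + 1)."""
    lag = int(round(tau / dt))
    x = list(history)
    for _ in range(steps):
        x.append(x[-1] + dt * f(x[-1], x[-1 - lag]))
    return x

# Delayed negative feedback dx/dt = -x(t - tau): for tau < pi/2 the
# solution is a decaying oscillation around zero.
traj = integrate_dde(lambda x, x_lag: -x_lag, history=[1.0] * 11,
                     tau=1.0, dt=0.1, steps=200)
print(round(traj[-1], 3))  # small in magnitude after the decay
```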
Terrestrial carbon storage dynamics: Chasing a moving target
NASA Astrophysics Data System (ADS)
Luo, Y.; Shi, Z.; Jiang, L.; Xia, J.; Wang, Y.; Kc, M.; Liang, J.; Lu, X.; Niu, S.; Ahlström, A.; Hararuk, O.; Hastings, A.; Hoffman, F. M.; Medlyn, B. E.; Rasmussen, M.; Smith, M. J.; Todd-Brown, K. E.; Wang, Y.
2015-12-01
Terrestrial ecosystems have been estimated to absorb roughly 30% of anthropogenic CO2 emissions. Past studies have identified myriad drivers of terrestrial carbon storage changes, such as fire, climate change, and land use changes. Those drivers influence the carbon storage change via diverse mechanisms, which have not been unified into a general theory so as to identify what controls the direction and rate of terrestrial carbon storage dynamics. Here we propose a theoretical framework to quantitatively determine the response of terrestrial carbon storage to different exogenous drivers. With a combination of conceptual reasoning, mathematical analysis, and numeric experiments, we demonstrated that the maximal capacity of an ecosystem to store carbon is time-dependent and equals carbon input (i.e., net primary production, NPP) multiplied by residence time. The capacity is a moving target toward which carbon storage approaches (i.e., the direction of carbon storage change) but usually does not attain. The difference between the capacity and the carbon storage at a given time t is the unrealized carbon storage potential. The rate of the storage change is proportional to the magnitude of the unrealized potential. We also demonstrated that a parameter space of NPP, residence time, and carbon storage potential can well characterize carbon storage dynamics quantified at six sites ranging from tropical forests to tundra and simulated by two versions (carbon-only and coupled carbon-nitrogen) of the Australian Community Atmosphere-Biosphere Land Ecosystem (CABLE) Model under three climate change scenarios (CO2 rising only, climate warming only, and RCP8.5). Overall this study reveals the unified mechanism underlying terrestrial carbon storage dynamics to guide transient traceability analysis of global land models and synthesis of empirical studies.
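The central quantitative claim, that carbon storage chases a capacity equal to NPP times residence time at a rate proportional to the unrealized potential, can be seen in a one-pool sketch. The parameter values below are illustrative, not taken from CABLE:

```python
def simulate_storage(npp, residence_time, x0=0.0, dt=0.1, years=500):
    """One-pool sketch of the framework: dX/dt = NPP - X/tau.
    The capacity is NPP * tau, and the rate of change equals
    (capacity - X) / tau, i.e. proportional to the unrealized
    potential.  Parameters are illustrative, not from CABLE."""
    capacity = npp * residence_time
    x = x0
    for _ in range(int(years / dt)):
        x += dt * (npp - x / residence_time)  # = dt * (capacity - x) / tau
    return x, capacity

# NPP in kgC m-2 yr-1, residence time in years (made-up values)
x, cap = simulate_storage(npp=0.5, residence_time=40.0)
print(round(x, 3), cap)  # storage approaches the capacity
```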
LIFE CYCLE ENGINEERING GUIDELINES
This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and usable as possible, a unifying LCE framework is presented. Subsequent topics ...
Contemplating Symbolic Literacy of First Year Mathematics Students
ERIC Educational Resources Information Center
Bardini, Caroline; Pierce, Robyn; Vincent, Jill
2015-01-01
Analysis of mathematical notations must consider both syntactical aspects of symbols and the underpinning mathematical concept(s) conveyed. We argue that the construct of "syntax template" provides a theoretical framework to analyse undergraduate mathematics students' written solutions, where we have identified several types of…
NASA Astrophysics Data System (ADS)
Jacobson, Erik; Simpson, Amber
2018-04-01
Replication studies play a critical role in scientific accumulation of knowledge, yet replication studies in mathematics education are rare. In this study, the authors replicated Thanheiser's (Educational Studies in Mathematics 75:241-251, 2010) study of prospective elementary teachers' conceptions of multidigit number and examined the main claim that most elementary pre-service teachers think about digits incorrectly at least some of the time. Results indicated no statistically significant difference in the distribution of conceptions between the original and replication samples and, moreover, no statistically significant differences in the distribution of sub-conceptions among prospective teachers with the most common conception. These results suggest confidence is warranted both in the generality of the main claim and in the utility of the conceptions framework for describing prospective elementary teachers' conceptions of multidigit number. The report further contributes a framework for replication of mathematics education research adapted from the field of psychology.
Predicting disease progression from short biomarker series using expert advice algorithm
NASA Astrophysics Data System (ADS)
Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki
2015-05-01
Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.
Predicting disease progression from short biomarker series using expert advice algorithm.
Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki
2015-05-20
Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.
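The "prediction with expert advice" framework the authors extend is, in its textbook form, the exponentially weighted average forecaster: each expert (here, plausibly a model fitted to a previous patient's biomarker history) is down-weighted according to its loss. A minimal sketch with invented numbers, not the paper's extension to unstable dynamics:

```python
import math

def aggregate(expert_preds, outcomes, eta=1.0):
    """Exponentially weighted average forecaster.  At each step the
    combined prediction is the weight-normalized average of the expert
    predictions; each expert's weight is then multiplied by
    exp(-eta * squared loss).  eta and the data are illustrative."""
    n = len(expert_preds)
    weights = [1.0] * n
    combined = []
    for t, y in enumerate(outcomes):
        total = sum(weights)
        combined.append(
            sum(w * expert_preds[i][t] for i, w in enumerate(weights)) / total)
        for i in range(n):  # down-weight experts with large squared loss
            weights[i] *= math.exp(-eta * (expert_preds[i][t] - y) ** 2)
    return combined

experts = [[1.0, 1.0, 1.0], [0.0, 0.1, 0.0]]  # expert 2 tracks the truth
truth = [0.0, 0.0, 0.1]
print(aggregate(experts, truth))  # drifts toward the better expert
```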
Kernel-imbedded Gaussian processes for disease classification using microarray gene expression data
Zhao, Xin; Cheung, Leo Wang-Kit
2007-01-01
Background Designing appropriate machine learning methods for identifying genes that have a significant discriminating power for disease outcomes has become more and more important for our understanding of diseases at genomic level. Although many machine learning methods have been developed and applied to the area of microarray gene expression data analysis, the majority of them are based on linear models, which however are not necessarily appropriate for the underlying connection between the target disease and its associated explanatory genes. Linear model based methods usually also bring in false positive significant features more easily. Furthermore, linear model based algorithms often involve calculating the inverse of a matrix that is possibly singular when the number of potentially important genes is relatively large. This leads to problems of numerical instability. To overcome these limitations, a few non-linear methods have recently been introduced to the area. Many of the existing non-linear methods have a couple of critical problems, the model selection problem and the model parameter tuning problem, that remain unsolved or even untouched. In general, a unified framework that allows model parameters of both linear and non-linear models to be easily tuned is always preferred in real-world applications. Kernel-induced learning methods form a class of approaches that show promising potentials to achieve this goal. Results A hierarchical statistical model named kernel-imbedded Gaussian process (KIGP) is developed under a unified Bayesian framework for binary disease classification problems using microarray gene expression data. In particular, based on a probit regression setting, an adaptive algorithm with a cascading structure is designed to find the appropriate kernel, to discover the potentially significant genes, and to make the optimal class prediction accordingly. A Gibbs sampler is built as the core of the algorithm to make Bayesian inferences. 
Simulation studies showed that, even without any knowledge of the underlying generative model, the KIGP performed very close to the theoretical Bayesian bound not only in the case of a linear Bayesian classifier but also in the case of a very non-linear Bayesian classifier. This sheds light on its broader usability for microarray data analysis problems, especially those for which linear methods work poorly. The KIGP was also applied to four published microarray datasets, and the results showed that it performed at least as well as the referenced state-of-the-art methods in all of these cases. Conclusion Mathematically built on the kernel-induced feature space concept under a Bayesian framework, the KIGP method presented in this paper provides a unified machine learning approach to explore both the linear and the possibly non-linear underlying relationship between the target features of a given binary disease classification problem and the related explanatory gene expression data. More importantly, it incorporates model parameter tuning into the framework. The model selection problem is addressed in the form of selecting a proper kernel type. The KIGP method also gives Bayesian probabilistic predictions for disease classification. These properties and features are beneficial to most real-world applications. The algorithm is naturally robust in numerical computation. The simulation studies and the published data studies demonstrated that the proposed KIGP performs satisfactorily and consistently. PMID:17328811
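The KIGP machinery rests on the kernel-induced feature space idea: a kernel replaces an explicit non-linear feature map, so the same fitting code handles both linear and non-linear decision boundaries. As a loose illustration of that idea only (not the paper's Gibbs-sampler probit model), here is a minimal RBF-kernel classifier in NumPy; all names, data, and parameter values are invented for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_classifier(X, y, gamma=1.0, lam=1e-2):
    """Kernel ridge regression on +/-1 labels; returns dual coefficients."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    """Classify by the sign of the kernel expansion."""
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ alpha)

# Toy non-linear problem: the label depends on distance from the origin,
# so no linear model in the raw inputs can separate the classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where((X ** 2).sum(1) < 1.0, 1.0, -1.0)
alpha = fit_kernel_classifier(X, y, gamma=2.0)
acc = (predict(X, alpha, X, gamma=2.0) == y).mean()
```

Swapping `rbf_kernel` for a linear kernel recovers a linear classifier from the identical fitting code, which is the unification the abstract emphasizes.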
Convex geometry of quantum resource quantification
NASA Astrophysics Data System (ADS)
Regula, Bartosz
2018-01-01
We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the \
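As a concrete instance of the kind of quantifier such a framework covers, the l1-norm of coherence, a standard basis-dependent measure from the resource theory of coherence, can be computed directly from a density matrix. The snippet below is illustrative background, not code from the paper.

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: the sum of absolute values of the
    off-diagonal entries of the density matrix in the chosen
    (incoherent) basis."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# Maximally coherent qubit state |+><+|:
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_plus = np.outer(plus, plus)      # density matrix [[0.5, 0.5], [0.5, 0.5]]

# Incoherent (diagonal) state carries zero coherence:
rho_diag = np.diag([0.5, 0.5])

c_plus = l1_coherence(rho_plus)      # 1.0
c_diag = l1_coherence(rho_diag)      # 0.0
```

Measures like this vanish exactly on the free (incoherent) states, the faithfulness property the paper characterizes geometrically.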
A Role for Language Analysis in Mathematics Textbook Analysis
ERIC Educational Resources Information Center
O'Keeffe, Lisa; O'Donoghue, John
2015-01-01
In current textbook analysis research, there is a strong focus on the content, structure and expectation presented by the textbook as elements for analysis. This research moves beyond such foci and proposes a framework for textbook language analysis which is intended to be integrated into an overall framework for mathematics textbook analysis. The…
ERIC Educational Resources Information Center
Muñiz-Rodríguez, Laura; Alonso, Pedro; Rodríguez-Muñiz, Luis J.; Valcke, Martin
2017-01-01
Initial teacher education programmes provide student teachers with the desired competences to develop themselves as teachers. Although a generic framework for teaching competences is available covering all school subjects in Spain, the initial teacher education programmes curriculum does not specify which competences secondary mathematics student…
Pedagogies of Practice and Opportunities to Learn about Classroom Mathematics Discussions
ERIC Educational Resources Information Center
Ghousseini, Hala; Herbst, Patricio
2016-01-01
In this paper, we argue that to prepare pre-service teachers for doing complex work of teaching like leading classroom mathematics discussions requires an implementation of different pedagogies of teacher education in deliberate ways. In supporting our argument, we use two frameworks: one curricular and one pedagogical. The curricular framework is…
Development of a Framework for Teaching Mathematics in Depth
ERIC Educational Resources Information Center
LaFramenta, Joanne Jensen
2011-01-01
This study illuminates the practice of teaching mathematics in depth by developing a framework to serve practicing teachers and those who educate teachers. A thorough reading of the literature that began with all of the volumes in the decades since the publication of the Standards (1989) identified six elements that were profitable for effective…
Negotiating Meaning in Cross-National Studies of Mathematics Teaching: Kissing Frogs to Find Princes
ERIC Educational Resources Information Center
Andrews, Paul
2007-01-01
This paper outlines the iterative processes by which a multinational team of researchers developed a low-inference framework for the analysis of video recordings of mathematics lessons drawn from Flemish Belgium, England, Finland, Hungary and Spain. Located within a theoretical framework concerning learning as the negotiation of meaning, we…
Mathematical Abstraction: Constructing Concept of Parallel Coordinates
NASA Astrophysics Data System (ADS)
Nurhasanah, F.; Kusumah, Y. S.; Sabandar, J.; Suryadi, D.
2017-09-01
Mathematical abstraction is an important process in teaching and learning mathematics, so pre-service mathematics teachers need to understand and experience this process. One of the theoretical-methodological frameworks for studying this process is Abstraction in Context (AiC). Based on this framework, the abstraction process comprises the observable epistemic actions Recognition, Building-With, Construction, and Consolidation, called the RBC + C model. This study investigates and analyzes how pre-service mathematics teachers constructed and consolidated the concept of Parallel Coordinates in a group discussion. It uses the AiC framework to analyze the mathematical abstraction of a group of four pre-service teachers learning Parallel Coordinates concepts. The data were collected through video recording, students' worksheets, a test, and field notes. The results show that the students' prior knowledge of the Cartesian coordinate system played a significant role in the process of constructing the Parallel Coordinates concept as new knowledge. The consolidation process was influenced by the social interaction between group members. The abstraction process in this group was dominated by empirical abstraction, which emphasizes identifying characteristics of manipulated or imagined objects during the processes of recognizing and building-with.
Food-web based unified model of macro- and microevolution.
Chowdhury, Debashish; Stauffer, Dietrich
2003-10-01
We incorporate the generic hierarchical architecture of food webs into a "unified" model that describes both micro- and macroevolution within a single theoretical framework. This model describes microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinctions. The distribution of lifetimes of species follows an approximate power law only over a limited regime.
Unified approach to redshift in cosmological/black hole spacetimes and synchronous frame
NASA Astrophysics Data System (ADS)
Toporensky, A. V.; Zaslavskii, O. B.; Popov, S. B.
2018-01-01
Usually, the interpretation of redshift in static spacetimes (for example, near black holes) is opposed to that in cosmology. In this methodological note, we show that both explanations are unified in a natural picture. This is achieved if, considering the static spacetime, one (i) makes a transition to a synchronous frame, and (ii) returns to the original frame by means of a local Lorentz boost. To reach our goal, we consider a rather general class of spherically symmetric spacetimes. In doing so, we construct frames that generalize the well-known Lemaitre and Painlevé-Gullstrand ones and elucidate the relation between them. This helps us to understand, in a unifying approach, how gravitation reveals itself in different branches of general relativity. This framework can be useful for general relativity university courses.
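For orientation, the two textbook limiting formulas being unified here are the static gravitational redshift and the special-relativistic Doppler shift (standard results, sketched in geometric units with $c = G = 1$; $r_e$ and $r_o$ denote emission and observation radii in the Schwarzschild case):

```latex
% Static gravitational redshift between radii r_e and r_o (Schwarzschild):
1 + z \;=\; \frac{\nu_e}{\nu_o}
      \;=\; \sqrt{\frac{1 - 2M/r_o}{1 - 2M/r_e}},
% Radial Doppler shift from a local Lorentz boost with velocity v:
1 + z \;=\; \sqrt{\frac{1 + v}{1 - v}}.
```

The note's construction shows how the first expression can be recovered from the second by passing through a synchronous (free-falling) frame and boosting back.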
Impact of Beads and Drops on a Repellent Solid Surface: A Unified Description
NASA Astrophysics Data System (ADS)
Arora, S.; Fromental, J.-M.; Mora, S.; Phou, Ty; Ramos, L.; Ligoure, C.
2018-04-01
We investigate freely expanding sheets formed by ultrasoft gel beads, and liquid and viscoelastic drops, produced by the impact of the bead or drop on a silicon wafer covered with a thin layer of liquid nitrogen that suppresses viscous dissipation thanks to an inverse Leidenfrost effect. Our experiments show a unified behavior for the impact dynamics that holds for solids, liquids, and viscoelastic fluids and that we rationalize by properly taking into account elastocapillary effects. In this framework, the classical impact dynamics of solids and liquids, as far as viscous dissipation is negligible, appears as the asymptotic limits of a universal theoretical description. A novel material-dependent characteristic velocity that includes both capillary and bulk elasticity emerges from this unified description of the physics of impact.
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments, each with its own proprietary data format, has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community lacked a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support the coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
Zenni, Rafael Dudeque; Dickie, Ian A; Wingfield, Michael J; Hirsch, Heidi; Crous, Casparus J; Meyerson, Laura A; Burgess, Treena I; Zimmermann, Thalita G; Klock, Metha M; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J
2016-12-30
Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics, and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand, and manage biological invasions. Published by Oxford University Press on behalf of the Annals of Botany Company.
NASA Astrophysics Data System (ADS)
Pathirana, A.; Radhakrishnan, M.; Zevenbergen, C.; Quan, N. H.
2016-12-01
The need to address the shortcomings of urban systems ('adaptation deficit') and shortcomings in response to climate change ('adaptation gap') are both major challenges in maintaining the livability and sustainability of cities. However, adaptation actions defined in terms of type I (addressing adaptation deficits) and type II (addressing adaptation gaps) often compete and conflict with each other in the secondary cities of the global south. Extending the concept of the environmental Kuznets curve, this paper argues that a unified framework calling for synergistic action on type I and type II adaptation is essential for these cities to maintain their livability, sustainability and resilience in the face of extreme rates of urbanization and the rapid onset of climate change. The proposed framework has been demonstrated in Can Tho, Vietnam, where there are significant adaptation deficits due to rapid urbanisation and adaptation gaps due to climate change and socio-economic changes. The analysis in Can Tho reveals a lack of integration between type I and type II measures that could be overcome by closer integration between various stakeholders in terms of planning, prioritising and implementing the adaptation measures.
Unified framework for automated iris segmentation using distantly acquired face images.
Tan, Chun-Wei; Kumar, Ajay
2012-09-01
Remote human identification using iris biometrics has important civilian and surveillance applications, and its success requires the development of robust segmentation algorithms to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired using near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye region pixels into iris or non-iris regions. Face and eye detection modules have been incorporated in the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop robust post-processing operations to effectively mitigate the noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant improvement in the average segmentation errors over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.
Trajectory optimization for lunar soft landing with complex constraints
NASA Astrophysics Data System (ADS)
Chu, Huiping; Ma, Lin; Wang, Kexin; Shao, Zhijiang; Song, Zhengyu
2017-11-01
A unified trajectory optimization framework with initialization strategies is proposed in this paper for lunar soft landing across various missions with specific requirements. The two main missions of interest are an Apollo-like landing from low lunar orbit and Vertical Takeoff Vertical Landing (a promising mobility method) on the lunar surface. The trajectory optimization is characterized by difficulties arising from discontinuous thrust, multi-phase connections, jumps in attitude angle, and obstacle avoidance. Here, an R-function is applied to deal with the discontinuities of thrust, checkpoint constraints are introduced to connect multiple landing phases, the attitude angular rate is constrained to avoid abrupt changes, and safeguards are imposed to avoid collision with obstacles. The resulting dynamic problems generally have complex constraints. A unified framework based on the Gauss Pseudospectral Method (GPM) and a Nonlinear Programming (NLP) solver is designed to solve the problems efficiently. Advanced initialization strategies are developed to enhance both convergence and computational efficiency. Numerical results demonstrate the adaptability of the framework for various landing missions and its ability to successfully solve difficult dynamic problems.
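The paper's GPM/NLP machinery handles full multi-phase problems; as a minimal, self-contained illustration of the "discontinuous thrust with phase connection" difficulty, the classic 1-D minimum-fuel soft landing has a coast-then-maximum-thrust (bang-bang) structure, and the single switching time can be found by bisection on the touchdown altitude. All numbers below are illustrative, not mission data.

```python
# 1-D minimum-fuel soft landing: coast (zero thrust), then burn at maximum
# thrust until the vehicle reaches zero velocity.  The only unknown is the
# switching time t1; we locate it by bisection on the final altitude.
g, h0, u_max = 1.62, 100.0, 4.86   # lunar gravity, drop height, max thrust accel (m/s^2)

def final_altitude(t1):
    """Altitude at the moment velocity returns to zero, given coast time t1."""
    v1 = -g * t1                    # velocity at ignition (upward positive)
    h1 = h0 - 0.5 * g * t1 ** 2     # altitude at ignition
    a = u_max - g                   # net upward acceleration during the burn
    t2 = -v1 / a                    # burn duration until v = 0
    return h1 + v1 * t2 + 0.5 * a * t2 ** 2

# Burning immediately stops the vehicle above the surface (positive residual);
# burning too late means crashing through h = 0 (negative residual).
lo, hi = 0.0, (2 * h0 / g) ** 0.5   # hi = free-fall impact time
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if final_altitude(mid) > 0:
        lo = mid
    else:
        hi = mid
t_switch = 0.5 * (lo + hi)          # approx. 9.07 s for these numbers
```

A pseudospectral method generalizes this idea: instead of one switching time, every state and control value at the collocation nodes becomes an NLP variable, with the dynamics enforced as constraints.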
Understanding the Chinese Approach to Creative Teaching in Mathematics Classrooms
ERIC Educational Resources Information Center
Niu, Weihua; Zhou, Zheng; Zhou, Xinlin
2017-01-01
Using Amabile's componential theory of creativity as a framework, this paper analyzes how Chinese mathematics teachers achieve creative teaching through acquiring in-depth domain-specific knowledge in mathematics, developing creativity-related skills, as well as stimulating student interest in learning mathematics, through well-crafted,…
ERIC Educational Resources Information Center
Powell, Sarah R.; Fuchs, Lynn S.; Fuchs, Doug
2013-01-01
The Common Core State Standards provide teachers with a framework of necessary mathematics skills across grades K-12, which vary considerably from previous mathematics standards. In this article, we discuss concerns about the implications of the Common Core for students with mathematics difficulties (MD), given that students with MD, by…
ERIC Educational Resources Information Center
Fielding-Wells, Jill
2016-01-01
One potential means to develop students' contextual and conceptual understanding of mathematics is through Inquiry Learning. However, introducing a problem context can distract from mathematical content. Incorporating argumentation practices into Inquiry may address this through providing a stronger reliance on mathematical evidence and reasoning.…
In the Middle of Nowhere: How a Textbook Can Position the Mathematics Learner
ERIC Educational Resources Information Center
Herbel-Eisenmann, Beth; Wagner, David
2005-01-01
We outline a framework for investigating how a mathematics textbook positions the mathematics learner. We use tools and concepts from discourse analysis, a field of linguistic scholarship, to illustrate the ways in which a textbook can position people in relation to mathematics and how the text can position the mathematics learner in relation to…
Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales
NASA Astrophysics Data System (ADS)
Traxl, Dominik; Boers, Niklas; Kurths, Jürgen
2016-06-01
Network theory has proven to be a powerful tool for describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks, and other intricacies. However, existing network representations still lack crucial features needed to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., networks of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner.
Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of multilayer networks from our framework and demonstrate the advantages of our representation. On the basis of the formal framework described here, we provide a rich, fully scalable (and self-explanatory) software package that integrates into the PyData ecosystem and offers interfaces to popular network packages, making it a powerful, general-purpose data analysis toolkit. We exemplify an application of deep graphs using a real world dataset, comprising 16 years of satellite-derived global precipitation measurements. We deduce a deep graph representation of these measurements in order to track and investigate local formations of spatio-temporal clusters of extreme precipitation events.
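The core supernode/superedge idea can be sketched in a few lines of plain Python: partition nodes by a property, then aggregate node and edge properties per group. This is an illustration of the partitioning concept only, not the API of the accompanying software package; all property names are invented.

```python
from collections import defaultdict

# Nodes as property dicts; the partition is induced by one property ("type").
nodes = [
    {"id": 0, "type": "station", "rain": 2.0},
    {"id": 1, "type": "station", "rain": 4.0},
    {"id": 2, "type": "satellite", "rain": 1.0},
]
edges = {(0, 1): {"weight": 0.5}, (1, 2): {"weight": 2.0}, (0, 2): {"weight": 1.0}}

# Supernodes: aggregate node properties within each partition group.
supernodes = defaultdict(lambda: {"n": 0, "rain": 0.0})
group_of = {}
for v in nodes:
    g = v["type"]
    group_of[v["id"]] = g
    supernodes[g]["n"] += 1
    supernodes[g]["rain"] += v["rain"]

# Superedges: aggregate edge properties between (and within) groups.
superedges = defaultdict(float)
for (i, j), e in edges.items():
    superedges[(group_of[i], group_of[j])] += e["weight"]
```

Refining or coarsening the grouping property moves up and down the partition lattice, which is exactly what lets one analysis address multiple scales of the same data.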
REVIEW: Internal models in sensorimotor integration: perspectives from adaptive control theory
NASA Astrophysics Data System (ADS)
Tin, Chung; Poon, Chi-Sang
2005-09-01
Internal models and adaptive controls are empirical and mathematical paradigms that have evolved separately to describe learning control processes in brain systems and engineering systems, respectively. This paper presents a comprehensive appraisal of the correlation between these paradigms with a view to forging a unified theoretical framework that may benefit both disciplines. It is suggested that the classic equilibrium-point theory of impedance control of arm movement is analogous to continuous gain-scheduling or high-gain adaptive control within or across movement trials, respectively, and that the recently proposed inverse internal model is akin to adaptive sliding control originally for robotic manipulator applications. Modular internal models' architecture for multiple motor tasks is a form of multi-model adaptive control. Stochastic methods, such as generalized predictive control, reinforcement learning, Bayesian learning and Hebbian feedback covariance learning, are reviewed and their possible relevance to motor control is discussed. Possible applicability of a Luenberger observer and an extended Kalman filter to state estimation problems—such as sensorimotor prediction or the resolution of vestibular sensory ambiguity—is also discussed. The important role played by vestibular system identification in postural control suggests an indirect adaptive control scheme whereby system states or parameters are explicitly estimated prior to the implementation of control. This interdisciplinary framework should facilitate the experimental elucidation of the mechanisms of internal models in sensorimotor systems and the reverse engineering of such neural mechanisms into novel brain-inspired adaptive control paradigms in future.
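As a minimal illustration of the state-estimation ideas surveyed here (observers and Kalman filtering applied to sensorimotor prediction), a scalar Kalman filter can estimate a constant hidden quantity from noisy observations. This is a toy model for orientation, not a claim about any of the reviewed neural architectures.

```python
import numpy as np

# Scalar Kalman filter: a constant hidden state observed through noise,
# standing in for sensorimotor estimation of a fixed quantity.
rng = np.random.default_rng(1)
true_x = 2.0
R = 0.5 ** 2                              # measurement noise variance
z = true_x + 0.5 * rng.normal(size=50)    # 50 noisy observations

x_hat, P = 0.0, 10.0                      # prior estimate and its variance
for zk in z:
    K = P / (P + R)                       # Kalman gain: trust data vs. prior
    x_hat = x_hat + K * (zk - x_hat)      # innovation (prediction-error) update
    P = (1 - K) * P                       # posterior variance shrinks each step
```

The innovation term `zk - x_hat` is the filter's prediction error, the same signal that forward-internal-model accounts propose the nervous system uses for learning.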
A Unified Framework for Periodic, On-Demand, and User-Specified Software Information
NASA Technical Reports Server (NTRS)
Kolano, Paul Z.
2004-01-01
Although grid computing can increase the number of resources available to a user, not all resources on the grid may have a software environment suitable for running a given application. To provide users with the necessary assistance for selecting resources with compatible software environments and/or for automatically establishing such environments, it is necessary to have an accurate source of information about the software installed across the grid. This paper presents a new OGSI-compliant software information service that has been implemented as part of NASA's Information Power Grid project. This service is built on top of a general framework for reconciling information from periodic, on-demand, and user-specified sources. Information is retrieved using standard XPath queries over a single unified namespace, independent of the information's source. Two consumers of the provided software information, the IPG Resource Broker and the IPG Neutralization Service, are briefly described.
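The retrieval model described (XPath queries over a unified namespace, regardless of whether a record came from a periodic, on-demand, or user-specified source) can be sketched with Python's standard library. The element and attribute names below are hypothetical, not the IPG service's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical unified software-information document.
doc = ET.fromstring("""
<software>
  <host name="node1">
    <package name="gcc" version="3.3"/>
    <package name="mpich" version="1.2.5"/>
  </host>
  <host name="node2">
    <package name="gcc" version="2.95"/>
  </host>
</software>
""")

# XPath-style queries over the single namespace (ElementTree supports a
# limited XPath subset, including attribute predicates).
gcc_versions = [p.get("version")
                for p in doc.findall(".//package[@name='gcc']")]
hosts_with_mpich = [h.get("name")
                    for h in doc.findall(".//host")
                    if h.find("package[@name='mpich']") is not None]
```

A broker could then match an application's software requirements against such query results to select compatible hosts.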
Semantically enabled image similarity search
NASA Astrophysics Data System (ADS)
Casterline, May V.; Emerick, Timothy; Sadeghi, Kolia; Gosse, C. A.; Bartlett, Brent; Casey, Jason
2015-05-01
Georeferenced data of various modalities are increasingly available for intelligence and commercial use; however, effectively exploiting these sources demands a unified data space capable of capturing the unique contribution of each input. This work presents a suite of software tools for representing geospatial vector data and overhead imagery in a shared high-dimensional vector, or "embedding," space that supports fused learning and similarity search across dissimilar modalities. While the approach is suitable for fusing arbitrary input types, including free text, the present work exploits the obvious but computationally difficult relationship between GIS and overhead imagery: GIS provides temporally-smoothed but information-limited content, while overhead imagery provides an information-rich but temporally-limited perspective. This processing framework includes some important extensions of concepts in the literature but, more critically, presents a means to accomplish them as a unified framework at scale on commodity cloud architectures.
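Once dissimilar modalities are mapped into a shared embedding space, similarity search reduces to nearest-neighbour lookup, commonly under cosine similarity. The sketch below is illustrative only (random vectors standing in for learned embeddings).

```python
import numpy as np

def cosine_search(query, embeddings, k=3):
    """Return indices of the k most similar embeddings by cosine similarity."""
    q = query / np.linalg.norm(query)
    E = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = E @ q                       # cosine similarity to every item
    return np.argsort(-sims)[:k]       # indices of the top-k matches

# Toy shared space: rows might come from GIS vectors or image chips alike.
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(100, 16))
query = embeddings[7] + 0.01 * rng.normal(size=16)   # near-duplicate of item 7
top = cosine_search(query, embeddings, k=3)
```

Because the query and the corpus live in the same space, the same lookup works whether the query originated as imagery, vector data, or text.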
Motor symptoms in Parkinson's disease: A unified framework.
Moustafa, Ahmed A; Chakravarthy, Srinivasa; Phillips, Joseph R; Gupta, Ankur; Keri, Szabolcs; Polner, Bertalan; Frank, Michael J; Jahanshahi, Marjan
2016-09-01
Parkinson's disease (PD) is characterized by a range of motor symptoms. Besides the cardinal symptoms (akinesia and bradykinesia, tremor and rigidity), PD patients show additional motor deficits, including: gait disturbance, impaired handwriting, grip force and speech deficits, among others. Some of these motor symptoms (e.g., deficits of gait, speech, and handwriting) have similar clinical profiles, neural substrates, and respond similarly to dopaminergic medication and deep brain stimulation (DBS). Here, we provide an extensive review of the clinical characteristics and neural substrates of each of these motor symptoms, to highlight precisely how PD and its medical and surgical treatments impact motor symptoms. In conclusion, we offer a unified framework for understanding the range of motor symptoms in PD. We argue that various motor symptoms in PD reflect dysfunction of neural structures responsible for action selection, motor sequencing, and coordination and execution of movement. Copyright © 2016 Elsevier Ltd. All rights reserved.
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Liu, Dan; Liu, Xuejun; Wu, Yiguang
2018-04-24
This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local detail, the relative depth trends of local regions are incorporated into the network. Combined with semantic information from the image, a continuous pairwise CRF is then established and used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and obtains satisfactory results.
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.
Discrete shearlet transform: faithful digitization concept and its applications
NASA Astrophysics Data System (ADS)
Lim, Wang-Q.
2011-09-01
Over the past years, various representation systems which sparsely approximate functions governed by anisotropic features, such as edges in images, have been proposed. Alongside the theoretical development of these systems, algorithmic realizations of the associated transforms were provided. However, one of the most common shortcomings of these frameworks is the lack of a unified treatment of the continuum and digital world, i.e., allowing a digital theory to be a natural digitization of the continuum theory. Shearlets were introduced as a means to sparsely encode anisotropic singularities of multivariate data while providing a unified treatment of the continuous and digital realm. In this paper, we introduce a discrete framework which allows a faithful digitization of the continuum-domain shearlet transform based on compactly supported shearlets. Finally, we show numerical experiments demonstrating the potential of the discrete shearlet transform in several image processing applications.
Some characteristics of supernetworks based on unified hybrid network theory framework
NASA Astrophysics Data System (ADS)
Liu, Qiang; Fang, Jin-Qing; Li, Yong
Compared with single complex networks, supernetworks are closer to the real world in some ways and have recently become a research hot spot in network science. Some progress has been made in the research of supernetworks, but the theoretical methods and complex-network characteristics of supernetwork models still need further exploration. In this paper, we propose three kinds of three-layer supernetwork models based on the unified hybrid network theory framework (UHNTF), and introduce preferential and random linking, respectively, between the upper and lower layers. We then compare the topological characteristics of single networks with those of the supernetwork models. To analyze the influence of the interlayer edges on network characteristics, the cross-degree is defined as a new important parameter. Some interesting new phenomena are found; the results imply that this supernetwork model has reference value and application potential.
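The cross-degree statistic described in this abstract can be illustrated with a small example. The sketch below is a generic illustration, not the authors' UHNTF models: it takes "cross-degree" to mean the number of interlayer edges incident to a node, computed from a biadjacency matrix linking an upper and a lower layer.

```python
import numpy as np

# Toy two-layer supernetwork: 4 upper-layer nodes, 5 lower-layer nodes.
# B[i, j] = 1 when upper node i has an interlayer edge to lower node j.
B = np.array([
    [1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 1],
])

# Cross-degree: count of interlayer edges incident to each node.
cross_degree_upper = B.sum(axis=1)  # per upper-layer node
cross_degree_lower = B.sum(axis=0)  # per lower-layer node

print(cross_degree_upper)  # [2 2 1 2]
print(cross_degree_lower)  # [1 2 1 2 1]
```

Since every interlayer edge contributes one to exactly one node in each layer, the two cross-degree vectors always sum to the same total, which is a quick sanity check when building larger multilayer models.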
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
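The qualitative place/transition semantics underlying tools like Snoopy can be sketched in a few lines. The toy class below is a conceptual illustration only; it does not reflect Snoopy's actual interface, and the example reaction network is an assumed textbook enzymatic scheme.

```python
# Minimal qualitative Petri net: places hold tokens, and a transition
# is enabled when every input place holds at least one token.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= 1 for p in inputs)

    def fire(self, name):
        # Firing consumes one token per input place, produces one per output.
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1

# Enzymatic toy example: E + S -> ES -> E + P
net = PetriNet({"E": 1, "S": 2, "ES": 0, "P": 0})
net.add_transition("bind", inputs=["E", "S"], outputs=["ES"])
net.add_transition("convert", inputs=["ES"], outputs=["E", "P"])
net.fire("bind")
net.fire("convert")
print(net.marking)  # {'E': 1, 'S': 1, 'ES': 0, 'P': 1}
```

The stochastic and continuous Petri net classes mentioned in the abstract share this structure but attach kinetic rates to transitions instead of firing them discretely.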
Achieving Quality Mathematics Classroom Instruction through Productive Pedagogies
ERIC Educational Resources Information Center
Bature, Iliya Joseph; Atweh, Bill
2016-01-01
This paper seeks to investigate the implementation of the Productive Pedagogies Framework in Nigerian mathematics classroom setting. The researcher adopted a qualitative case study approach to seeking data for the three research questions postulated for the study. Three mathematics teachers taught mathematics in two secondary schools in two…
Saussurian Linguistics Revisited: Can It Inform Our Interpretation of Mathematical Activity?
ERIC Educational Resources Information Center
McNamara, O.
1995-01-01
Examines the basic notions of Ferdinand de Saussure and proposes that language is fundamental to the process of learning mathematics. Investigates possible mathematical perspectives upon Saussure's ideas and explores the contribution his work can offer to enhance and enrich the interpretive framework through which mathematical activity is observed…
Learning Mathematical Concepts through Authentic Learning
ERIC Educational Resources Information Center
Koh, Noi Keng; Low, Hwee Kian
2010-01-01
This paper explores the infusion of financial literacy into the Mathematics curriculum in a secondary school in Singapore. By infusing financial literacy, a core theme in the 21st century framework, into mathematics education, this study investigated the impact of using financial literacy-rich mathematics lessons by using validated learning…
Mathematics University Teachers' Perception of Pedagogical Content Knowledge (PCK)
ERIC Educational Resources Information Center
Khakbaz, Azimehsadat
2016-01-01
Teaching mathematics in university levels is one of the most important fields of research in the area of mathematics education. Nevertheless, there is little information about teaching knowledge of mathematics university teachers. Pedagogical content knowledge (PCK) provides a suitable framework to study knowledge of teachers. The purpose of this…
Mathematics, Programming, and STEM
ERIC Educational Resources Information Center
Yeh, Andy; Chandra, Vinesh
2015-01-01
Learning mathematics is a complex and dynamic process. In this paper, the authors adopt a semiotic framework (Yeh & Nason, 2004) and highlight programming as one of the main aspects of the semiosis or meaning-making for the learning of mathematics. During a 10- week teaching experiment, mathematical meaning-making was enriched when primary…
ERIC Educational Resources Information Center
Getenet, Seyum Tekeher; Beswick, Kim
2013-01-01
This study describes the construction of a questionnaire instrument to measure mathematics teacher educators' knowledge for technology integrated mathematics teaching. The study was founded on a reconceptualisation of the generic Technological Pedagogical Content Knowledge framework in the specific context of mathematics teaching. Steps in the…
Secure Computer System: Unified Exposition and Multics Interpretation
1976-03-01
prearranged code to semaphore critical information to an undercleared subject/process. Neither of these topics is directly addressed by the mathematical...FURTHER CONSIDERATIONS. RULES OF OPERATION FOR A SECURE MULTICS Kernel primitives for a secure Multics will be derived from a higher level user...the Multics architecture as little as possible; this will account to a large extent for radical differences in form between actual kernel primitives
ERIC Educational Resources Information Center
Zandieh, Michelle; Ellis, Jessica; Rasmussen, Chris
2017-01-01
As part of a larger study of student understanding of concepts in linear algebra, we interviewed 10 university linear algebra students as to their conceptions of functions from high school algebra and linear transformation from their study of linear algebra. An overarching goal of this study was to examine how linear algebra students see linear…
Frontiers in Human Information Processing Conference
2008-02-25
Frontiers in Human Information Processing - Vision, Attention, Memory, and Applications: A Tribute to George Sperling, a Festschrift. We are grateful...with focus on the formal, computational, and mathematical approaches that unify the areas of vision, attention, and memory. The conference also...Information Processing Conference Final Report AFOSR GRANT # FA9550-07-1-0346 The AFOSR Grant # FA9550-07-1-0346 provided partial support for the Conference
ERIC Educational Resources Information Center
Grenier-Boley, Nicolas
2014-01-01
Certain mathematical concepts were not introduced to solve a specific open problem but rather to solve different problems with the same tools in an economic formal way or to unify several approaches: such concepts, as some of those of linear algebra, are presumably difficult to introduce to students as they are potentially interwoven with many…
ERIC Educational Resources Information Center
Paprzycki, Peter; Tuttle, Nicole; Czerniak, Charlene M.; Molitor, Scott; Kadervaek, Joan; Mendenhall, Robert
2017-01-01
This study investigates the effect of a Framework-aligned professional development program at the PreK-3 level. The NSF funded program integrated science with literacy and mathematics learning and provided teacher professional development, along with materials and programming for parents to encourage science investigations and discourse around…
Using a Framework for Three Levels of Sense Making in a Mathematics Classroom
ERIC Educational Resources Information Center
Moss, Diana L.; Lamberg, Teruni
2016-01-01
This discussion-based lesson is designed to support Year 6 students in their initial understanding of using letters to represent numbers, expressions, and equations in algebra. The three level framework is designed for: (1) making thinking explicit, (2) exploring each other's solutions, and (3) developing new mathematical insights. In each level…
ERIC Educational Resources Information Center
Cohrssen, Caroline; Tayler, Collette; Cloney, Dan
2015-01-01
The Early Years Learning Framework for Australia governs early childhood education in the years before school in Australia. Since this framework is not a curriculum, early childhood educators report uncertainty regarding what mathematical concepts to teach and how to teach them. This implementation study, positioned within the broader E4Kids…
ERIC Educational Resources Information Center
Agrawal, Jugnu; Morin, Lisa L.
2016-01-01
Students with mathematics disabilities (MD) experience difficulties with both conceptual and procedural knowledge of different math concepts across grade levels. Research shows that concrete representational abstract framework of instruction helps to bridge this gap for students with MD. In this article, we provide an overview of this strategy…
Ghosh, Avijit; Scott, Dennis O; Maurer, Tristan S
2014-02-14
In this work, we provide a unified theoretical framework describing how drug molecules can permeate across membranes in neutral and ionized forms for unstirred in vitro systems. The analysis provides a self-consistent basis for the origin of the unstirred water layer (UWL) within the Nernst-Planck framework in the fully unstirred limit, and further provides an accounting mechanism based simply on the bulk aqueous solvent diffusion constant of the drug molecule. Our framework makes no new assumptions about the underlying physics of molecular permeation. We hold simply that Nernst-Planck is a reasonable approximation at low concentrations and that all physical systems must conserve mass. The applicability of the derived framework has been examined with respect to the effects of both stirring and externally applied voltages on measured permeability. The analysis includes data for nine compounds extracted from the literature, representing a range of permeabilities and aqueous diffusion coefficients. Applicability with respect to ionized permeation is examined using literature data for the permanently charged cation crystal violet, suggesting that mobile counter-current flow underlies the ionized permeation of this molecule. Copyright © 2013 Elsevier B.V. All rights reserved.
A Unified Framework for Street-View Panorama Stitching
Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei
2016-01-01
In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured from cameras mounted on a mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Since the input images are captured without a precisely common projection center, from scenes whose depths vary with respect to the cameras to different extents, such images cannot be precisely aligned in geometry. Therefore, an efficient image warping method based on the dense optical flow field is first proposed to greatly suppress the influence of large geometric misalignment. Then, to lessen the influence of photometric inconsistencies caused by illumination variations and different exposure settings, we propose an efficient color correction algorithm, via matching extreme points of histograms, to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via the graph cut energy minimization framework. At last, the Laplacian pyramid blending algorithm is applied to further eliminate the stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
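The final blending step in this pipeline can be conveyed with a compact sketch. The code below is an illustrative implementation of generic Laplacian pyramid blending, not the authors' system; the helper names, the smoothing sigma, and the number of levels are all assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def down2(img):
    # Low-pass filter, then decimate by 2 in each dimension.
    return gaussian_filter(img, sigma=1.0)[::2, ::2]

def up2(img, shape):
    # Upsample by 2 with linear interpolation, cropped to a target shape.
    return zoom(img, 2, order=1)[:shape[0], :shape[1]]

def lap_pyramid(img, levels):
    # Laplacian pyramid: band-pass residuals plus a coarse base image.
    pyr, cur = [], img.astype(float)
    for _ in range(levels):
        small = down2(cur)
        pyr.append(cur - up2(small, cur.shape))
        cur = small
    pyr.append(cur)
    return pyr

def collapse(pyr):
    # Invert the pyramid: upsample coarse-to-fine, adding residuals back.
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = up2(cur, lap.shape) + lap
    return cur

def pyramid_blend(a, b, mask, levels=4):
    # Blend per frequency band; mask = 1 selects image `a`.
    pa, pb = lap_pyramid(a, levels), lap_pyramid(b, levels)
    m, out = mask.astype(float), []
    for la, lb in zip(pa, pb):
        out.append(m * la + (1.0 - m) * lb)
        m = down2(m)  # the mask softens at each coarser level
    return collapse(out)
```

Blending each frequency band with a progressively softened mask is what hides a seam line: low frequencies transition over a wide region while fine detail switches sharply, avoiding both visible edges and ghosting.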
New Methods in Non-Perturbative QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unsal, Mithat
2017-01-31
In this work, we investigate the properties of quantum chromodynamics (QCD) by using newly developing mathematics and physics formalisms. Almost all of the mass in the visible universe emerges from quantum chromodynamics (QCD), which has a completely negligible microscopic mass content. An intimately related issue in QCD is the quark confinement problem. Answers to non-perturbative questions in QCD remained largely elusive despite much effort over the years. It is also believed that the usual perturbation theory is inadequate to address these kinds of problems. Perturbation theory gives a divergent asymptotic series (even when the theory is properly renormalized), and there are non-perturbative phenomena which never appear at any order in perturbation theory. Recently, a fascinating bridge between perturbation theory and non-perturbative effects has been found: a formalism called resurgence theory in mathematics tells us that perturbative data and non-perturbative data are intimately related. Translating this to the language of quantum field theory, it turns out that non-perturbative information is present in a coded form in perturbation theory and it can be decoded. We take advantage of this feature, which is particularly useful to understand some unresolved mysteries of QCD from first principles. In particular, we use: a) circle compactifications, which provide a semi-classical window to study confinement and mass gap problems, and calculable prototypes of the deconfinement phase transition; b) resurgence theory and transseries, which provide a unified framework for perturbative and non-perturbative expansion; c) analytic continuation of path integrals and Lefschetz thimbles, which may be useful to address the sign problem in QCD at finite density.
Measuring the diversity of the human microbiota with targeted next-generation sequencing.
Finotello, Francesca; Mastrorilli, Eleonora; Di Camillo, Barbara
2016-12-26
The human microbiota is a complex ecological community of commensal, symbiotic and pathogenic microorganisms harboured by the human body. Next-generation sequencing (NGS) technologies, in particular targeted amplicon sequencing of the 16S ribosomal RNA gene (16S-seq), are enabling the identification and quantification of human-resident microorganisms at unprecedented resolution, providing novel insights into the role of the microbiota in health and disease. Once microbial abundances are quantified through NGS data analysis, diversity indices provide valuable mathematical tools to describe the ecological complexity of a single sample or to detect species differences between samples. However, diversity is not a determined physical quantity for which a consensus definition and unit of measure have been established, and several diversity indices are currently available. Furthermore, they were originally developed for macroecology, and their robustness to the possible bias introduced by sequencing has not been characterized so far. To assist the reader with the selection and interpretation of diversity measures, we review a panel of broadly used indices, describing their mathematical formulations, purposes and properties, and characterize their behaviour and criticalities as a function of the data features, using simulated data as ground truth. In addition, we make available an R package, DiversitySeq, which implements the full panel of diversity indices in a unified framework, along with a simulator of 16S-seq data, and thus represents a valuable resource for the analysis of diversity from NGS count data and for the benchmarking of computational methods for 16S-seq. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
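Two of the most broadly used indices in such panels, Shannon entropy and the Simpson index, can be computed directly from a vector of taxon counts. The sketch below is a generic Python illustration of these textbook formulas, not the DiversitySeq implementation (which is an R package):

```python
import math

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over nonzero taxa."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def simpson(counts):
    """Simpson diversity 1 - sum(p_i^2): the probability that two
    randomly drawn reads belong to different taxa."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

even = [25, 25, 25, 25]   # four equally abundant taxa
skewed = [97, 1, 1, 1]    # one dominant taxon

print(shannon(even))                    # ln(4) ≈ 1.386, maximal for 4 taxa
print(simpson(even))                    # 0.75
print(shannon(skewed) < shannon(even))  # True: both indices drop
print(simpson(skewed) < simpson(even))  # True  when evenness is lost
```

Both indices are maximized by a perfectly even community, which is why the even count vector scores higher; how robustly such indices behave under sequencing depth and bias is exactly what the review characterizes.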
Dynamics and function of the tear film in relation to the blink cycle.
Braun, R J; King-Smith, P E; Begley, C G; Li, Longfei; Gewecke, N R
2015-03-01
Great strides have recently been made in quantitative measurements of tear film thickness and thinning, mathematical modeling thereof and linking these to sensory perception. This paper summarizes recent progress in these areas and reports on new results. The complete blink cycle is used as a framework that attempts to unify the results that are currently available. Understanding of tear film dynamics is aided by combining information from different imaging methods, including fluorescence, retroillumination and a new high-speed stroboscopic imaging system developed for studying the tear film during the blink cycle. During the downstroke of the blink, lipid is compressed as a thick layer just under the upper lid which is often released as a narrow thick band of lipid at the beginning of the upstroke. "Rippling" of the tear film/air interface due to motion of the tear film over the corneal surface, somewhat like the flow of water in a shallow stream over a rocky streambed, was observed during lid motion and treated theoretically here. New mathematical predictions of tear film osmolarity over the exposed ocular surface and in tear breakup are presented; the latter is closely linked to new in vivo observations. Models include the effects of evaporation, osmotic flow through the cornea and conjunctiva, quenching of fluorescence, tangential flow of aqueous tears and diffusion of tear solutes and fluorescein. These and other combinations of experiment and theory increase our understanding of the fluid dynamics of the tear film and its potential impact on the ocular surface. Copyright © 2014 Elsevier Ltd. All rights reserved.
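The link between evaporative thinning and rising osmolarity can be conveyed with a minimal mass-conservation sketch. This is illustrative only: the parameter values are assumed, and the paper's models additionally include osmotic inflow through the cornea and conjunctiva, tangential flow and solute diffusion, all of which moderate the rise shown here.

```python
# Toy model: constant evaporation rate V_E thins the film; with no
# inflow or diffusion, solute mass (c * h) is conserved, so osmolarity
# varies inversely with film thickness h.

H0 = 3.5e-6   # initial film thickness, m (assumed typical value)
C0 = 300.0    # initial osmolarity, mOsm/L (assumed near-isotonic value)
V_E = 1.0e-7  # evaporative thinning rate, m/s (assumed)

def thickness(t):
    return max(H0 - V_E * t, 0.0)

def osmolarity(t):
    h = thickness(t)
    return C0 * H0 / h if h > 0 else float("inf")

# When the film has thinned to half its initial thickness,
# osmolarity has doubled:
t_half = H0 / (2 * V_E)
print(osmolarity(t_half))  # 600.0
```

The divergence of this toy solution as h approaches zero is the caricature of tear breakup: locally thinning regions concentrate solutes fastest, which is why the paper links osmolarity spikes in breakup to sensory perception.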
Cilliers, Cornelius; Guo, Hans; Liao, Jianshan; Christodolu, Nikolas; Thurber, Greg M
2016-09-01
Antibody-drug conjugates exhibit complex pharmacokinetics due to their combination of macromolecular and small molecule properties. These issues range from systemic concerns, such as deconjugation of the small molecule drug during the long antibody circulation time or rapid clearance from nonspecific interactions, to local tumor tissue heterogeneity, cell bystander effects, and endosomal escape. Mathematical models can be used to study the impact of these processes on overall distribution in an efficient manner, and several types of models have been used to analyze varying aspects of antibody distribution including physiologically based pharmacokinetic (PBPK) models and tissue-level simulations. However, these processes are quantitative in nature and cannot be handled qualitatively in isolation. For example, free antibody from deconjugation of the small molecule will impact the distribution of conjugated antibodies within the tumor. To incorporate these effects into a unified framework, we have coupled the systemic and organ-level distribution of a PBPK model with the tissue-level detail of a distributed parameter tumor model. We used this mathematical model to analyze new experimental results on the distribution of the clinical antibody-drug conjugate Kadcyla in HER2-positive mouse xenografts. This model is able to capture the impact of the drug-antibody ratio (DAR) on tumor penetration, the net result of drug deconjugation, and the effect of using unconjugated antibody to drive ADC penetration deeper into the tumor tissue. This modeling approach will provide quantitative and mechanistic support to experimental studies trying to parse the impact of multiple mechanisms of action for these complex drugs.
Unifying models of dialect spread and extinction using surface tension dynamics
2018-01-01
We provide a unified mathematical explanation of two classical forms of spatial linguistic spread. The wave model describes the radiation of linguistic change outwards from a central focus. Changes can also jump between population centres in a process known as hierarchical diffusion. It has recently been proposed that the spatial evolution of dialects can be understood using surface tension at linguistic boundaries. Here we show that the inclusion of long-range interactions in the surface tension model generates both wave-like spread, and hierarchical diffusion, and that it is surface tension that is the dominant effect in deciding the stable distribution of dialect patterns. We generalize the model to allow population mixing which can induce shrinkage of linguistic domains, or destroy dialect regions from within. PMID:29410847
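The surface-tension mechanism can be mimicked with a toy majority-rule update on a grid. This is a deliberately simplified illustration, not the authors' model (which includes long-range interactions and population mixing): each site adopts the locally dominant dialect, which smooths boundaries and eliminates enclosed minority pockets.

```python
import numpy as np

def step(grid):
    """One synchronous update of a binary dialect map: each site adopts
    the majority dialect in its 3x3 neighbourhood. This local majority
    rule behaves like surface tension at the linguistic boundary."""
    h, w = grid.shape
    padded = np.pad(grid, 1, mode="edge")
    # Count of dialect-1 speakers (including the site itself) per 3x3 window.
    ones = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return (ones > 4).astype(int)

# Two dialect regions with a few isolated "minority speaker" sites.
grid = np.zeros((20, 20), dtype=int)
grid[:, :10] = 1
for r, c in [(3, 15), (8, 2), (12, 17), (16, 5)]:
    grid[r, c] = 1 - grid[r, c]

for _ in range(3):
    grid = step(grid)
# The isolated sites have been absorbed; a clean straight boundary remains.
```

A straight boundary is a fixed point of this update, while any isolated pocket shrinks from its corners inward, which is the discrete analogue of curvature-driven (surface-tension) dynamics the abstract invokes.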
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf
2018-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.
Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.
Bricq, S; Collet, Ch; Armspach, J P
2008-12-01
In the frame of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images that includes partial volume effect modeling, bias field correction, and information given by a probabilistic atlas. The proposed method takes neighborhood information into account using a Hidden Markov Chain (HMC) model. Because of the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is modeled to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors are incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of different tissue classes. This atlas is treated as a complementary sensor, and the proposed method is extended to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images, for which the ground truth is available. Comparison with other commonly used techniques demonstrates the accuracy and robustness of this new Markovian segmentation scheme.
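The chain-based neighborhood modeling at the heart of such HMC segmentation can be illustrated with a minimal forward filter on a two-class hidden Markov chain. This is a sketch with toy likelihoods and transition probabilities, not the paper's multimodal brain-tissue model:

```python
import numpy as np

def forward_filter(obs_lik, trans, prior):
    """Forward pass of a hidden Markov chain.
    obs_lik: (T, K) per-site observation likelihoods along the chain;
    trans: (K, K) transition matrix; prior: (K,) initial distribution.
    Returns the filtered posterior P(z_T | o_1..T)."""
    alpha = prior * obs_lik[0]
    alpha /= alpha.sum()
    for lik in obs_lik[1:]:
        alpha = lik * (trans.T @ alpha)
        alpha /= alpha.sum()  # normalize each step for numerical stability
    return alpha

trans = np.array([[0.9, 0.1], [0.1, 0.9]])   # neighboring sites tend to share a class
prior = np.array([0.5, 0.5])
obs_lik = np.array([[0.8, 0.2]] * 5)          # observations consistently favor class 0
posterior = forward_filter(obs_lik, trans, prior)
```

Consistent evidence along the chain, combined with the smoothing transition matrix, pushes the posterior for class 0 well above the single-observation likelihood of 0.8.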
ERIC Educational Resources Information Center
Joseph, Christine M.
2012-01-01
The purpose of this study was to investigate how writing in mathematics is treated in one 4th grade National Science Foundation (NSF)-funded mathematics textbook titled "Everyday Mathematics" and one publisher-generated textbook titled "enVision MATH." The developed framework provided categories to support each of the research…
ERIC Educational Resources Information Center
Novikasari, Ifada; Darhim, Didi Suryadi
2015-01-01
This study explored the characteristics of pre-service primary teachers (PSTs) as influenced by mathematical belief and mathematical knowledge for teaching (MKT). A qualitative approach was used to investigate PSTs' levels of mathematical belief and MKT. The two research instruments used in this study were an interview-based task and a…
ERIC Educational Resources Information Center
Meiring, Steven P.; And Others
The 1989 document, "Curriculum and Evaluation Standards for School Mathematics," provides a vision and a framework for revising and strengthening the K-12 mathematics curriculum in North American schools and for evaluating both the mathematics curriculum and students' progress. When completed, it is expected that the Addenda Series will…
ERIC Educational Resources Information Center
Heyd-Metzuyanim, Einat
2015-01-01
This study uses a new communicational lens that conceptualizes the activity of learning mathematics as interplay between mathematizing and identifying in order to study how the emotional, social, and cognitive aspects of learning mathematics interact with one another. The proposed framework is used to analyze the case of Idit, a girl who started…
Brain-Mind Operational Architectonics Imaging: Technical and Methodological Aspects
Fingelkurts, Andrew A; Fingelkurts, Alexander A
2008-01-01
This review paper deals with the methodological and technical foundations of the Operational Architectonics framework of brain and mind functioning. This theory provides a framework for mapping and understanding important aspects of the brain mechanisms that constitute perception, cognition, and eventually consciousness. The methods utilized within the Operational Architectonics framework allow analyzing in remarkable detail the operational behavior of local neuronal assemblies and their joint activity in the form of unified and metastable operational modules, which constitute the whole hierarchy of brain operations, operations of cognition and phenomenal consciousness. PMID:19526071
A Unified Nonlinear Adaptive Approach for Detection and Isolation of Engine Faults
NASA Technical Reports Server (NTRS)
Tang, Liang; DeCastro, Jonathan A.; Zhang, Xiaodong; Farfan-Ramos, Luis; Simon, Donald L.
2010-01-01
A challenging problem in aircraft engine health management (EHM) system development is to detect and isolate faults in system components (i.e., compressor, turbine), actuators, and sensors. Existing nonlinear EHM methods often deal with component faults, actuator faults, and sensor faults separately, which may potentially lead to incorrect diagnostic decisions and unnecessary maintenance. Therefore, it would be ideal to address sensor faults, actuator faults, and component faults under one unified framework. This paper presents a systematic and unified nonlinear adaptive framework for detecting and isolating sensor faults, actuator faults, and component faults for aircraft engines. The fault detection and isolation (FDI) architecture consists of a parallel bank of nonlinear adaptive estimators. Adaptive thresholds are appropriately designed such that, in the presence of a particular fault, all components of the residual generated by the adaptive estimator corresponding to the actual fault type remain below their thresholds. If the faults are sufficiently different, then at least one component of the residual generated by each remaining adaptive estimator should exceed its threshold. Therefore, based on the specific response of the residuals, sensor faults, actuator faults, and component faults can be isolated. The effectiveness of the approach was evaluated using the NASA C-MAPSS turbofan engine model, and simulation results are presented.
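The bank-of-estimators isolation logic described above lends itself to a small sketch: a fault hypothesis survives only while every component of its estimator's residual stays below the corresponding adaptive threshold. Fault names, residual values and thresholds here are hypothetical illustrations, not C-MAPSS outputs:

```python
def isolate_fault(residuals, thresholds):
    # Keep only the hypotheses whose residual components all stay below threshold.
    surviving = [name for name, res in residuals.items()
                 if all(abs(r) <= t for r, t in zip(res, thresholds[name]))]
    # Unambiguous isolation requires exactly one surviving hypothesis.
    return surviving[0] if len(surviving) == 1 else None

residuals = {
    "sensor_fault":    [0.02, 0.05],  # all components below threshold
    "actuator_fault":  [0.40, 0.01],  # first component exceeds its threshold
    "component_fault": [0.09, 0.30],  # second component exceeds its threshold
}
thresholds = {name: [0.1, 0.1] for name in residuals}
diagnosis = isolate_fault(residuals, thresholds)
```

Returning `None` when zero or several hypotheses survive mirrors the paper's requirement that faults be "sufficiently different" for isolation.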
Chen, Duan; Wei, Guo-Wei
2010-01-01
The miniaturization of nano-scale electronic devices, such as metal oxide semiconductor field effect transistors (MOSFETs), has given rise to a pressing demand for new theoretical understanding of, and practical tactics for dealing with, quantum mechanical effects in integrated circuits. Modeling and simulation of this class of problems have emerged as an important topic in applied and computational mathematics. This work presents mathematical models and computational algorithms for the simulation of nano-scale MOSFETs. We introduce a unified two-scale energy functional to describe the electrons and the continuum electrostatic potential of the nano-electronic device. This framework enables us to put microscopic and macroscopic descriptions on an equal footing at the nano scale. By optimization of the energy functional, we derive consistently coupled Poisson-Kohn-Sham equations. Additionally, layered structures are crucial to the electrostatic and transport properties of nano transistors. A material interface model is proposed for more accurate description of the electrostatics governed by the Poisson equation. Finally, a new individual dopant model that utilizes the Dirac delta function is proposed to understand the random doping effect in nano electronic devices. Two mathematical algorithms, the matched interface and boundary (MIB) method and the Dirichlet-to-Neumann mapping (DNM) technique, are introduced to improve the computational efficiency of nano-device simulations. Electronic structures are computed via subband decomposition and the transport properties, such as the I-V curves and electron density, are evaluated via the non-equilibrium Green's functions (NEGF) formalism. Two distinct device configurations, a double-gate MOSFET and a four-gate MOSFET, are considered in our three-dimensional numerical simulations. For these devices, the current fluctuation and voltage threshold lowering effect induced by the discrete dopant model are explored. Numerical convergence and model well-posedness are also investigated in the present work. PMID:20396650
Reframing Information Literacy as a Metaliteracy
ERIC Educational Resources Information Center
Mackey, Thomas P.; Jacobson, Trudi E.
2011-01-01
Social media environments and online communities are innovative collaborative technologies that challenge traditional definitions of information literacy. Metaliteracy is an overarching and self-referential framework that integrates emerging technologies and unifies multiple literacy types. This redefinition of information literacy expands the…
Unified approach for incompressible flows
NASA Astrophysics Data System (ADS)
Chang, Tyne-Hsien
1995-07-01
A unified approach for solving incompressible flows has been investigated in this study. The numerical CTVD (Centered Total Variation Diminishing) scheme used here was developed by Sanders and Li for compressible flows, especially at high speed. The CTVD scheme possesses good mathematical properties for damping out spurious oscillations while providing high-order accuracy for high speed flows, which leads us to believe that it can apply equally well to incompressible flows. Because of the mathematical difference between the governing equations for incompressible and compressible flows, the scheme cannot be applied directly to incompressible flows. However, if one modifies the continuity equation for incompressible flows by introducing pseudo-compressibility, the governing equations for incompressible flows have the same mathematical character as those for compressible flows. The application of the algorithm to incompressible flows thus becomes feasible. In this study, the governing equations for incompressible flows comprise the continuity equation and the momentum equations. The continuity equation is modified by adding a time derivative of the pressure term containing the artificial compressibility. The modified continuity equation together with the unsteady momentum equations forms a hyperbolic-parabolic, time-dependent system of equations, so the CTVD scheme can be implemented. In addition, the physical and numerical boundary conditions are properly implemented via characteristic boundary conditions. Accordingly, a CFD code has been developed for this research and is currently under testing. Flow past a circular cylinder was chosen for numerical experiments to determine the accuracy and efficiency of the code. The code has shown some promising results.
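The pseudo-compressibility modification amounts to evolving pressure in pseudo-time as dp/dτ = -β ∇·u, which vanishes when the velocity field becomes divergence-free. The toy finite-difference update below illustrates just that modification on a periodic grid; β, grid spacing and the velocity field are hypothetical, and this is not the CTVD scheme itself:

```python
import numpy as np

def pseudo_compressibility_step(p, u, v, dx, dy, dtau, beta):
    # Modified continuity equation: dp/dtau + beta * (du/dx + dv/dy) = 0,
    # discretized with central differences on a periodic grid.
    div = ((np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / (2 * dx)
           + (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)) / (2 * dy))
    return p - dtau * beta * div

p = np.zeros((8, 8))
u = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))  # u varies in x -> nonzero divergence
v = np.zeros((8, 8))
p_new = pseudo_compressibility_step(p, u, v, dx=1.0, dy=1.0, dtau=0.1, beta=1.0)
```

In a full solver this pressure update would be coupled with the momentum equations and iterated in pseudo-time until the divergence (and hence the pressure increment) drops to zero.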
Understanding Understanding Mathematics. Artificial Intelligence Memo No. 488.
ERIC Educational Resources Information Center
Michener, Edwina Rissland
This document is concerned with the important extra-logical knowledge that is often outside of traditional discussions in mathematics, and looks at some of the ingredients and processes involved in the understanding of mathematics. The goal is to develop a conceptual framework in which to talk about mathematical knowledge and to understand the…
Mathematical String Sculptures: A Case Study in Computationally-Enhanced Mathematical Crafts
ERIC Educational Resources Information Center
Eisenberg, Michael
2007-01-01
Mathematical string sculptures constitute an extremely beautiful realm of mathematical crafts. This snapshot begins with a description of a marvelous (and no longer manufactured) toy called Space Spider, which provided a framework with which children could experiment with string sculptures. Using a computer-controlled laser cutter to create frames…
ERIC Educational Resources Information Center
del Prado Hill, Pixita; Friedland, Ellen S.; McMillen, Susan
2016-01-01
This article presents two innovative tools--the Mathematics-Literacy Planning Framework and Mathematics-Literacy Implementation Checklist--which are designed to help instructional coaches and specialists support teachers to meet the challenges of the mathematics-literacy integration goals of the Common Core. Developed with teacher input, these…
A Conceptual Metaphor Framework for the Teaching of Mathematics
ERIC Educational Resources Information Center
Danesi, Marcel
2007-01-01
Word problems in mathematics seem to constantly pose learning difficulties for all kinds of students. Recent work in math education (for example, [Lakoff, G. & Nunez, R. E. (2000). "Where mathematics comes from: How the embodied mind brings mathematics into being." New York: Basic Books]) suggests that the difficulties stem from an…
A Note on the Problem of Proper Time in Weyl Space-Time
NASA Astrophysics Data System (ADS)
Avalos, R.; Dahia, F.; Romero, C.
2018-02-01
We discuss the question of whether or not a general Weyl structure is a suitable mathematical model of space-time. This is an issue that has been in debate since Weyl formulated his unified field theory for the first time. We do not present the discussion from the point of view of a particular unification theory, but instead from a more general standpoint, in which the viability of such a structure as a model of space-time is investigated. Our starting point is the well known axiomatic approach to space-time given by Ehlers, Pirani and Schild (EPS). In this framework, we carry out an exhaustive analysis of what is required for a consistent definition for proper time and show that such a definition leads to the prediction of the so-called "second clock effect". We take the view that if, based on experience, we were to reject space-time models predicting this effect, this could be incorporated as the last axiom in the EPS approach. Finally, we provide a proof that, in this case, we are led to a Weyl integrable space-time as the most general structure that would be suitable to model space-time.
Finding the way with a noisy brain.
Cheung, Allen; Vickerstaff, Robert
2010-11-11
Successful navigation is fundamental to the survival of nearly every animal on earth, and achieved by nervous systems of vastly different sizes and characteristics. Yet surprisingly little is known of the detailed neural circuitry from any species which can accurately represent space for navigation. Path integration is one of the oldest and most ubiquitous navigation strategies in the animal kingdom. Despite a plethora of computational models, from equational to neural network form, there is currently no consensus, even in principle, of how this important phenomenon occurs neurally. Recently, all path integration models were examined according to a novel, unifying classification system. Here we combine this theoretical framework with recent insights from directed walk theory, and develop an intuitive yet mathematically rigorous proof that only one class of neural representation of space can tolerate noise during path integration. This result suggests many existing models of path integration are not biologically plausible due to their intolerance to noise. This surprising result imposes significant computational limitations on the neurobiological spatial representation of all successfully navigating animals, irrespective of species. Indeed, noise-tolerance may be an important functional constraint on the evolution of neuroarchitectural plans in the animal kingdom.
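Path integration of the kind classified in that framework can be sketched in a few lines: a Cartesian (vectorial) integrator accumulates a home vector from successive (heading, distance) steps. The step list and noise level below are hypothetical; this illustrates the strategy, not the paper's noise-tolerance proof:

```python
import math
import random

def integrate_path(steps, heading_noise_sd=0.0, seed=0):
    # Accumulate position in Cartesian coordinates from (heading, distance) steps;
    # each compass reading is corrupted by Gaussian noise of the given sd.
    rng = random.Random(seed)
    x = y = 0.0
    for heading, distance in steps:
        h = heading + rng.gauss(0.0, heading_noise_sd)  # noisy compass estimate
        x += distance * math.cos(h)
        y += distance * math.sin(h)
    return x, y  # the home vector is (-x, -y)

# A noiseless closed square path should return (essentially) to the origin.
square = [(0.0, 1.0), (math.pi / 2, 1.0), (math.pi, 1.0), (3 * math.pi / 2, 1.0)]
home = integrate_path(square)
```

With `heading_noise_sd > 0` the endpoint drifts from the origin, which is the kind of accumulating error whose neural tolerability the paper analyzes.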
A model of olfactory associative learning
NASA Astrophysics Data System (ADS)
Tavoni, Gaia; Balasubramanian, Vijay
We propose a mechanism, rooted in the known anatomy and physiology of the vertebrate olfactory system, by which presentations of rewarded and unrewarded odors lead to formation of odor-valence associations between piriform cortex (PC) and anterior olfactory nucleus (AON) which, in concert with neuromodulator release in the bulb, entrains a direct feedback from the AON representation of valence to a group of mitral cells (MCs). The model makes several predictions concerning MC activity during and after associative learning: (a) AON feedback produces synchronous divergent responses in a localized subset of MCs; (b) such divergence propagates to other MCs by lateral inhibition; (c) after learning, MC responses reconverge; (d) recall of the newly formed associations in the PC increases feedback inhibition in the MCs. These predictions have been confirmed in disparate experiments which we now explain in a unified framework. For cortex, our model further predicts that the response divergence developed during learning reshapes odor representations in the PC, with the effects of (a) decorrelating PC representations of odors with different valences, (b) increasing the size and reliability of those representations, and (c) enabling recall correction and redundancy reduction after learning. Simons Foundation for Mathematical Modeling of Living Systems.
Dissecting Embryonic Stem Cell Self-Renewal and Differentiation Commitment from Quantitative Models.
Hu, Rong; Dai, Xianhua; Dai, Zhiming; Xiang, Qian; Cai, Yanning
2016-10-01
To model quantitatively embryonic stem cell (ESC) self-renewal and differentiation by computational approaches, we developed a unified mathematical model for gene expression involved in cell fate choices. Our quantitative model comprised ESC master regulators and lineage-specific pivotal genes. It took the factors of multiple pathways as input and computed expression as a function of intrinsic transcription factors, extrinsic cues, epigenetic modifications, and antagonism between ESC master regulators and lineage-specific pivotal genes. In the model, the differential equations of expression of genes involved in cell fate choices from regulation relationship were established according to the transcription and degradation rates. We applied this model to the Murine ESC self-renewal and differentiation commitment and found that it modeled the expression patterns with good accuracy. Our model analysis revealed that Murine ESC was an attractor state in culture and differentiation was predominantly caused by antagonism between ESC master regulators and lineage-specific pivotal genes. Moreover, antagonism among lineages played a critical role in lineage reprogramming. Our results also uncovered that the ordered expression alteration of ESC master regulators over time had a central role in ESC differentiation fates. Our computational framework was generally applicable to most cell-type maintenance and lineage reprogramming.
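The antagonism mechanism central to this model can be illustrated with a toy two-gene mutual-repression ODE integrated by forward Euler: an ESC master regulator M and a lineage-specific gene L each repress the other's production. Parameters, Hill exponent and initial conditions are hypothetical, not the paper's fitted model:

```python
def simulate(m0, l0, a=1.0, k=0.5, n=4, dt=0.01, steps=5000):
    # Mutual repression with Hill kinetics and first-order degradation:
    #   dM/dt = a / (1 + L**n) - k*M
    #   dL/dt = a / (1 + M**n) - k*L
    m, l = m0, l0
    for _ in range(steps):
        dm = a / (1.0 + l ** n) - k * m
        dl = a / (1.0 + m ** n) - k * l
        m, l = m + dt * dm, l + dt * dl
    return m, l

# Start biased toward the master regulator: the system settles into the
# high-M / low-L attractor, a caricature of self-renewal as an attractor state.
m, l = simulate(1.5, 0.1)
```

Starting instead with `l0 > m0` drives the trajectory to the opposite attractor, the caricature of differentiation commitment by antagonism.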
Multiscale geometric modeling of macromolecules II: Lagrangian representation
Feng, Xin; Xia, Kelin; Chen, Zhan; Tong, Yiying; Wei, Guo-Wei
2013-01-01
Geometric modeling of biomolecules plays an essential role in the conceptualization of biomolecular structure, function, dynamics and transport. Qualitatively, geometric modeling offers a basis for molecular visualization, which is crucial for the understanding of molecular structure and interactions. Quantitatively, geometric modeling bridges the gap between molecular information, such as that from X-ray, NMR and cryo-EM, and theoretical/mathematical models, such as molecular dynamics, the Poisson-Boltzmann equation and the Nernst-Planck equation. In this work, we present a family of variational multiscale geometric models for macromolecular systems. Our models are able to combine multiresolution geometric modeling with multiscale electrostatic modeling in a unified variational framework. We discuss a suite of techniques for molecular surface generation, molecular surface meshing, molecular volumetric meshing, and the estimation of Hadwiger’s functionals. Emphasis is given to the multiresolution representations of biomolecules and the associated multiscale electrostatic analyses as well as multiresolution curvature characterizations. The resulting fine resolution representations of a biomolecular system enable the detailed analysis of solvent-solute interaction, and ion channel dynamics, while our coarse resolution representations highlight the compatibility of protein-ligand bindings and possibility of protein-protein interactions. PMID:23813599
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
Adaptive tracking control for active suspension systems with non-ideal actuators
NASA Astrophysics Data System (ADS)
Pan, Huihui; Sun, Weichao; Jing, Xingjian; Gao, Huijun; Yao, Jianyong
2017-07-01
As a critical component of transportation vehicles, active suspension systems are instrumental in the improvement of ride comfort and maneuverability. However, practical active suspensions commonly suffer from parameter uncertainties (e.g., the variations of payload mass and suspension component parameters), external disturbances and especially the unknown non-ideal actuators (i.e., dead-zone and hysteresis nonlinearities), which always significantly deteriorate the control performance in practice. To overcome these issues, this paper synthesizes an adaptive tracking control strategy for vehicle suspension systems to achieve suspension performance improvements. The proposed control algorithm is formulated by developing a unified framework of non-ideal actuators rather than a separate way, which is a simple yet effective approach to remove the unexpected nonlinear effects. From the perspective of practical implementation, the advantages of the presented controller for active suspensions include that the assumptions on the measurable actuator outputs, the prior knowledge of nonlinear actuator parameters and the uncertain parameters within a known compact set are not required. Furthermore, the stability of the closed-loop suspension system is theoretically guaranteed by rigorous mathematical analysis. Finally, the effectiveness of the presented adaptive control scheme is confirmed using comparative numerical simulation validations.
Information Transfer in the Brain: Insights from a Unified Approach
NASA Astrophysics Data System (ADS)
Marinazzo, Daniele; Wu, Guorong; Pellicoro, Mario; Stramaglia, Sebastiano
Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically treatable and amenable to encompass several methods. In this chapter we propose some approaches rooted in this framework for the analysis of neuroimaging data. First we explore how the transfer of information depends on the network structure, showing that for hierarchical networks the information flow pattern is characterized by an exponential distribution of the incoming information and a fat-tailed distribution of the outgoing information, as a signature of the law of diminishing marginal returns. This was reported to be true also for effective connectivity networks from human EEG data. Then we address the problem of partial conditioning on a limited subset of variables, chosen as the most informative ones for the driver node. We then propose a formal expansion of the transfer entropy to identify irreducible sets of variables which provide information for the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated with informational circuits present in the system, with an informational character (synergetic or redundant) indicated by the sign of the contribution. Applications are reported for EEG and fMRI data.
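For linear-Gaussian processes, transfer entropy reduces to half the log-ratio of regression residual variances, which makes its directionality easy to demonstrate. The lag-1 sketch below uses hypothetical coupled series, not any of the chapter's EEG/fMRI data:

```python
import numpy as np

def resid_var(target, *regs):
    # Variance of the least-squares residual of target on the given regressors.
    A = np.column_stack([np.ones_like(target)] + list(regs))
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    return (target - A @ beta).var()

def gaussian_te(x, y):
    # TE_{x->y} = 0.5 * ln( Var(y_t | y_{t-1}) / Var(y_t | y_{t-1}, x_{t-1}) )
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    return 0.5 * np.log(resid_var(yt, yp) / resid_var(yt, yp, xp))

rng = np.random.default_rng(0)
n = 4000
x = rng.standard_normal(n)                # driver: white noise
y = np.zeros(n)
for t in range(1, n):                     # target driven by the past of x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

te_xy, te_yx = gaussian_te(x, y), gaussian_te(y, x)  # information flows x -> y
```

Because `y` is driven by the past of `x` but not vice versa, `te_xy` comes out large while `te_yx` stays near zero, which is the asymmetry transfer-entropy methods exploit.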
Generalized Lorenz equations on a three-sphere
NASA Astrophysics Data System (ADS)
Saiki, Yoshitaka; Sander, Evelyn; Yorke, James A.
2017-06-01
Edward Lorenz is best known for one specific three-dimensional differential equation, but he actually created a variety of related N-dimensional models. In this paper, we discuss a unifying principle for these models and put them into an overall mathematical framework. Because this family of models is so large, we are forced to choose. We sample the variety of dynamics seen in these models by concentrating on a four-dimensional version of the Lorenz models for which there are three parameters and the norm of the solution vector is preserved. We can therefore restrict our focus to trajectories on the unit sphere S^3 in ℝ^4. Furthermore, we create a type of Poincaré return map. We choose the Poincaré surface to be the set where one of the variables is 0, i.e., the Poincaré surface is a two-sphere S^2 in ℝ^3. Examining different choices of our three parameters, we illustrate the wide variety of dynamical behaviors, including chaotic attractors, period doubling cascades, Standard-Map-like structures, and quasiperiodic trajectories. Note that neither Standard-Map-like structure nor quasiperiodicity has previously been reported for Lorenz models.
Grid cell hexagonal patterns formed by fast self-organized learning within entorhinal cortex.
Mhatre, Himanshu; Gorchetchnikov, Anatoli; Grossberg, Stephen
2012-02-01
Grid cells in the dorsal segment of the medial entorhinal cortex (dMEC) show remarkable hexagonal activity patterns, at multiple spatial scales, during spatial navigation. It has previously been shown how a self-organizing map can convert firing patterns across entorhinal grid cells into hippocampal place cells that are capable of representing much larger spatial scales. Can grid cell firing fields also arise during navigation through learning within a self-organizing map? This article describes a simple and general mathematical property of the trigonometry of spatial navigation which favors hexagonal patterns. The article also develops a neural model that can learn to exploit this trigonometric relationship. This GRIDSmap self-organizing map model converts path integration signals into hexagonal grid cell patterns of multiple scales. GRIDSmap creates only grid cell firing patterns with the observed hexagonal structure, predicts how these hexagonal patterns can be learned from experience, and can process biologically plausible neural input and output signals during navigation. These results support an emerging unified computational framework based on a hierarchy of self-organizing maps for explaining how entorhinal-hippocampal interactions support spatial navigation. Copyright © 2010 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Kachapova, Farida
2016-07-01
Mathematical and computational models in biology and medicine help to improve diagnostics and medical treatments. Modeling of pathological fibrosis is reviewed by M. Ben Amar and C. Bianca in [4]. Pathological fibrosis is the process in which excessive fibrous tissue is deposited on an organ or tissue during wound healing and can obliterate its normal function. In [4] the phenomena of fibrosis are briefly explained, including the causes, mechanism and management; research models of pathological fibrosis are described, compared and critically analyzed. Different models are suitable at different levels: molecular, cellular and tissue. The main goal of mathematical modeling of fibrosis is to predict the long-term behavior of the system depending on bifurcation parameters; there are two main trends: inhibition of fibrosis due to an active immune system, and swelling of fibrosis because of a weak immune system.
Simulating the evolution of non-point source pollutants in a shallow water environment.
Yan, Min; Kahawita, Rene
2007-03-01
Non-point source pollution, originating from surface-applied chemicals in either liquid or solid form as part of agricultural activities, appears in the surface runoff caused by rainfall. The infiltration and transport of these pollutants have a significant impact on subsurface and riverine water quality. The present paper describes the development of a unified 2-D mathematical model incorporating individual models for infiltration, adsorption, solubility rate, advection and diffusion, which significantly improves current practice in the mathematical modeling of pollutant evolution in shallow water. The governing equations have been solved numerically using cubic spline integration. Experiments were conducted at the Hydrodynamics Laboratory of the Ecole Polytechnique de Montreal to validate the mathematical model. Good correspondence between the computed results and experimental data has been obtained. The model may be used to predict the ultimate fate of surface-applied chemicals by evaluating the proportions that are dissolved, infiltrated into the subsurface or washed off.
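One explicit time step of the advection-diffusion part of such a model can be sketched on a periodic 1-D grid (upwind advection for positive velocity, central diffusion). Parameters are illustrative; the paper's cubic-spline integration and coupled infiltration/adsorption terms are not reproduced:

```python
import numpy as np

def adv_diff_step(c, u, D, dx, dt):
    # One explicit step of dc/dt + u * dc/dx = D * d2c/dx2 on a periodic grid.
    adv = -u * (c - np.roll(c, 1)) / dx                          # upwind (u > 0)
    diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # central diffusion
    return c + dt * (adv + diff)

c = np.zeros(50)
c[25] = 1.0                                   # initial pollutant pulse
c1 = adv_diff_step(c, u=0.5, D=0.1, dx=1.0, dt=0.1)
```

On the periodic grid the step conserves total pollutant mass exactly, while the pulse spreads by diffusion and is carried downstream by advection.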
A UML profile for framework modeling.
Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong
2004-01-01
The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extendability due to a lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.
ERIC Educational Resources Information Center
Kim, Rae Young
2009-01-01
This study is an initial analytic attempt to iteratively develop a conceptual framework informed by both theoretical and practical perspectives that may be used to analyze non-textual elements in mathematics textbooks. Despite the importance of visual representations in teaching and learning, little effort has been made to specify in any…
ERIC Educational Resources Information Center
Wu, Margaret
2010-01-01
This paper makes an in-depth comparison of the PISA (OECD) and TIMSS (IEA) mathematics assessments conducted in 2003. First, a comparison of survey methodologies is presented, followed by an examination of the mathematics frameworks in the two studies. The methodologies and the frameworks in the two studies form the basis for providing…
ERIC Educational Resources Information Center
Barnett, Janet Heine; Lodder, Jerry; Pengelley, David
2014-01-01
We analyze our method of teaching with primary historical sources within the context of theoretical frameworks for the role of history in teaching mathematics developed by Barbin, Fried, Jahnke, Jankvist, and Kjeldsen and Blomhøj, and more generally from the perspective of Sfard's theory of learning as communication. We present case studies…
Self-Efficacy: Toward a Unifying Theory of Behavioral Change
ERIC Educational Resources Information Center
Bandura, Albert
1977-01-01
This research presents an integrative theoretical framework to explain and to predict psychological changes achieved by different modes of treatment. This theory states that psychological procedures, whatever their form, alter the level and strength of "self-efficacy". (Editor/RK)
COMPLEMENTARITY OF ECOLOGICAL GOAL FUNCTIONS
This paper summarizes, in the framework of network environ analysis, a set of analyses of energy-matter flow and storage in steady state systems. The network perspective is used to codify and unify ten ecological orientors or extremal principles: maximum power (Lotka), maximum st...
ERIC Educational Resources Information Center
Badar, Lawrence J.
This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science with a process oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…
Chimaera simulation of complex states of flowing matter
2016-01-01
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031
Mathematical Education for Geographers
ERIC Educational Resources Information Center
Wilson, Alan
1978-01-01
Outlines mathematical topics of use to college geography students; identifies teaching methods for mathematical techniques in geography at the University of Leeds; and discusses the problem of providing students with a framework for synthesizing all content of geography education. For journal availability, see SO 506 593. (Author/AV)
A Framework for Teachers' Knowledge of Mathematical Reasoning
ERIC Educational Resources Information Center
Herbert, Sandra
2014-01-01
Exploring and developing primary teachers' understanding of mathematical reasoning was the focus of the "Mathematical Reasoning Professional Learning Research Program." Twenty-four primary teachers were interviewed after engagement in the first stage of the program incorporating demonstration lessons focused on reasoning conducted in…
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural / terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified and, thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. 
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
Proofs and Refutations in the Undergraduate Mathematics Classroom
ERIC Educational Resources Information Center
Larsen, Sean; Zandieh, Michelle
2008-01-01
In his 1976 book, "Proofs and Refutations," Lakatos presents a collection of case studies to illustrate methods of mathematical discovery in the history of mathematics. In this paper, we reframe these methods in ways that we have found make them more amenable for use as a framework for research on learning and teaching mathematics. We present an…
Language and Thought in Mathematics Staff Development: A Problem Probing Protocol
ERIC Educational Resources Information Center
Kabasakalian, Rita
2007-01-01
Background/Context: The theoretical framework of the paper comes from research on problem solving, considered by many to be the essence of mathematics; research on the importance of oral language in learning mathematics; and on the importance of the teacher as the primary instrument of learning mathematics for most students. As a nation, we are…
Using CAS to Solve a Mathematics Task: A Deconstruction
ERIC Educational Resources Information Center
Berger, Margot
2010-01-01
I investigate how and whether a heterogeneous group of first-year university mathematics students in South Africa harness the potential power of a computer algebra system (CAS) when doing a specific mathematics task. In order to do this, I develop a framework for deconstructing a mathematics task requiring the use of CAS, into its primary…
Applying a Universal Design for Learning Framework to Mediate the Language Demands of Mathematics
ERIC Educational Resources Information Center
Thomas, Cathy Newman; Van Garderen, Delinda; Scheuermann, Amy; Lee, Eun Ju
2015-01-01
This article provides information about the relationship between mathematics, language, and literacy and describes the difficulties faced by students with disabilities with math content based on the language demands of mathematics. We conceptualize mathematics language as a mode of discourse for math learning that can be thought of as receptive…
Mathematics Education in Singapore--An Insider's Perspective
ERIC Educational Resources Information Center
Kaur, Berinderjeet
2014-01-01
Singapore's Education System has evolved over time and so has Mathematics Education in Singapore. The present day School Mathematics Curricula can best be described as one that caters for the needs of every child in school. It is based on a framework that has mathematical problem solving as its primary focus. The developments from 1946 to 2012…
ERIC Educational Resources Information Center
Gonzalez, Marggie Denise
2016-01-01
This multiple case study examines four groups of secondary mathematics teachers engaged in a Lesson Study approach to professional development where they planned and taught lessons that integrate technology. Informed by current literature, a framework was developed to focus on the dimensions of teacher's knowledge to teach mathematics with…
ERIC Educational Resources Information Center
Burrill, Gail; And Others
The 1989 document, "Curriculum and Evaluation Standards for School Mathematics" (the "Standards"), provides a vision and a framework for revising and strengthening the K-12 mathematics curriculum in North American schools and for evaluating both the mathematics curriculum and students' progress. When completed, it is expected…
Pinto, Rogério M; da Silva, Sueli Bulhões; Soriano, Rafaela
2012-03-01
Community health workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis - how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed community-based participatory research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008-10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies--i.e., empathic communication and perseverance--to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. Copyright © 2012 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Chen, Hsin-liang; Doty, Philip
2005-01-01
This article is the first of two that present a six-part conceptual framework for the design and evaluation of digital libraries meant to support mathematics education in K-12 settings (see also pt. 2). This first article concentrates on (1) information organization, (2) information literacy, and (3) integrated learning with multimedia materials.…
ERIC Educational Resources Information Center
Önal, Nezih
2017-01-01
The purpose of the present research was to reveal students' perceptions regarding the use of the interactive whiteboard in the mathematics classroom within the framework of the Technology Acceptance Model. Semi-structured interviews were performed with 58 secondary school students (5th, 6th, 7th, and 8th grades) to collect data. The data obtained…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machnes, S.; Institute for Theoretical Physics, University of Ulm, D-89069 Ulm; Sander, U.
2011-08-15
For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods, which update all controls concurrently, and Krotov-type methods, which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
NASA Technical Reports Server (NTRS)
2005-01-01
A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature air frame and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials are assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.
A mathematical applications into the cells.
Tiwari, Manjul
2012-01-01
Biology has become the new "physics" of mathematics, one of the areas of greatest mathematical applications. In turn, mathematics has provided powerful tools and metaphors to approach the astonishing complexity of biological systems. This has allowed the development of sound theoretical frameworks. Here, in this review article, some of the most significant contributions of mathematics to biology, ranging from population genetics, to developmental biology, and to networks of species interactions are summarized.
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
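The first framework above, metastatic emission events at random times formalized via Poisson processes, reduces in the homogeneous case to sampling exponential inter-event gaps. A minimal sketch, with an assumed constant emission rate and time horizon:

```python
# Homogeneous Poisson process: emission events at random times, sampled via
# exponential inter-event gaps. The rate and horizon are assumed values for
# illustration, not parameters from the chapter.
import random

def emission_times(rate, horizon, rng):
    """Event times of a Poisson process with the given rate on (0, horizon]."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)  # waiting time to the next emission
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(0)  # fixed seed for reproducibility
times = emission_times(rate=2.0, horizon=100.0, rng=rng)
```

The event count over the horizon is Poisson-distributed with mean rate × horizon (here 200); secondary emission, as in the chapter's extension, would attach further processes to each sampled event.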
University students' achievement goals and approaches to learning in mathematics.
Cano, Francisco; Berbén, A B G
2009-03-01
Achievement goals (AG) and students' approaches to learning (SAL) are two research perspectives on student motivation and learning in higher education that have until now been pursued quite independently. This study sets out: (a) to explore the relationship between the most representative variables of SAL and AG; (b) to identify subgroups (clusters) of students with multiple AG; and (c) to examine the differences between these clusters with respect to various SAL and AG characteristics. The participants were 680 male and female 1st year university students studying different subjects (e.g. mathematics, physics, economics) but all enrolled on mathematics courses (e.g. algebra, calculus). Participants completed a series of questionnaires that measured their conceptions of mathematics, approaches to learning, course experience, personal 2 x 2 AG, and perceived AG. SAL and AG variables were moderately associated and related to both the way students perceived their academic environment and the way they conceived of the nature of mathematics (i.e. the perceptual-cognitive framework). Four clusters of students with distinctive multiple AG were identified and when the differences between clusters were analysed, we were able to attribute them to various constructs including perceptual-cognitive framework, learning approaches, and academic performance. This study reveals a consistent pattern of relationships between SAL and AG perspectives across different methods of analysis, supports the relevance of the 2 x 2 AG framework in a mathematics learning context and suggests that AG and SAL may be intertwined aspects of students' experience of learning mathematics at university.
Mathematics Framework, Kindergarten-Grade 12.
ERIC Educational Resources Information Center
Texas Education Agency, Austin.
This publication should help educators provide a mathematics program that emphasizes productive time on task and active involvement of students in mathematics activities. The focus on problem solving is stressed. Time allotments are stated, followed by descriptions of essential elements for kindergarten through grade 8: understanding numbers and…
Connecting Mathematics Learning through Spatial Reasoning
ERIC Educational Resources Information Center
Mulligan, Joanne; Woolcott, Geoffrey; Mitchelmore, Michael; Davis, Brent
2018-01-01
Spatial reasoning, an emerging transdisciplinary area of interest to mathematics education research, is proving integral to all human learning. It is particularly critical to science, technology, engineering and mathematics (STEM) fields. This project will create an innovative knowledge framework based on spatial reasoning that identifies new…
Considering Indigenous Knowledges and Mathematics Curriculum
ERIC Educational Resources Information Center
Sterenberg, Gladys
2013-01-01
Across Canada, significant program changes in school mathematics have been made that encourage teachers to consider Aboriginal perspectives. In this article, I investigate one Aboriginal teacher's approaches to integrating Indigenous knowledges and the mandated mathematics curriculum in a Blackfoot First Nation school. Using a framework that…
Promoting Mathematical Connections Using Three-Dimensional Manipulatives
ERIC Educational Resources Information Center
Safi, Farshid; Desai, Siddhi
2017-01-01
"Principles to Actions: Ensuring Mathematical Success for All" (NCTM 2014) gives teachers access to an insightful, research-informed framework that outlines ways to promote reasoning and sense making. Specifically, as students transition on their mathematical journey through middle school and beyond, their knowledge and use of…
ERIC Educational Resources Information Center
Marston, Jennifer L.; Muir, Tracey; Livy, Sharyn
2013-01-01
The National Council of Teachers of Mathematics (NCTM) and the Australian National Curriculum encourage the integration of literacy and numeracy, and "Teaching Children Mathematics" ("TCM") regularly includes articles on incorporating picture books into the mathematics program. Marston has developed a new framework (2010) to assist teachers in…
Classical Markov Chains: A Unifying Framework for Understanding Avian Reproductive Success
Traditional methods for monitoring and analysis of avian nesting success have several important shortcomings, including 1) inability to handle multiple classes of nest failure, and 2) inability to provide estimates of annual reproductive success (because birds can, and typically ...
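As a hedged illustration of the Markov-chain idea, a daily "active" nest state can feed several absorbing fates (fledged, depredated, other failure). The daily transition probabilities below are invented for demonstration, not estimates from the paper:

```python
# Toy absorbing Markov chain for daily nest fate: one transient state
# ("active") and three absorbing outcomes. All probabilities are assumptions.
def nest_outcome_probs(p_survive, p_fledge, p_depredate, days):
    """Probability of each fate after `days` days, starting from an active nest."""
    active, fledged, depredated, failed_other = 1.0, 0.0, 0.0, 0.0
    for _ in range(days):
        fledged += active * p_fledge
        depredated += active * p_depredate
        failed_other += active * (1.0 - p_survive - p_fledge - p_depredate)
        active *= p_survive  # nest remains active another day
    return {"active": active, "fledged": fledged,
            "depredated": depredated, "failed_other": failed_other}

probs = nest_outcome_probs(p_survive=0.95, p_fledge=0.02, p_depredate=0.02, days=30)
```

Unlike a single daily survival rate, the chain keeps failure classes separate, which is shortcoming (1) that the abstract highlights.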
Do changes in connectivity explain desertification?
USDA-ARS?s Scientific Manuscript database
Desertification, broad-scale land degradation in drylands, is a major environmental hazard facing inhabitants of the world’s deserts as well as an important component of global change. There is no unifying framework that simply and effectively explains different forms of desertification. Here we arg...
Cho, Kwang-Hyun; Choo, Sang-Mok; Wellstead, Peter; Wolkenhauer, Olaf
2005-08-15
We propose a unified framework for the identification of functional interaction structures of biomolecular networks in a way that leads to a new experimental design procedure. In developing our approach, we have built upon previous work. Thus we begin by pointing out some of the restrictions associated with existing structure identification methods and point out how these restrictions may be eased. In particular, existing methods use specific forms of experimental algebraic equations with which to identify the functional interaction structure of a biomolecular network. In our work, we employ an extended form of these experimental algebraic equations which, while retaining their merits, also overcomes some of their disadvantages. Experimental data are required in order to estimate the coefficients of the experimental algebraic equation set associated with the structure identification task. However, experimentalists are rarely provided with guidance on which parameters to perturb, and to what extent. When a model of network dynamics is required, there is also the vexed question of sample rate and sample time selection to be resolved. Supplying some answers to these questions is the main motivation of this paper. The approach is based on stationary and/or temporal data obtained from parameter perturbations, and unifies the previous approaches of Kholodenko et al. (PNAS 99 (2002) 12841-12846) and Sontag et al. (Bioinformatics 20 (2004) 1877-1886). By way of demonstration, we apply our unified approach to a network model which cannot be properly identified by existing methods. Finally, we propose an experiment design methodology which is not limited by the amount of parameter perturbations, and illustrate its use with an in numero example.
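The perturbation-based identification described above can be caricatured as estimating a response (sensitivity) matrix by finite differences: perturb one parameter at a time and record how each steady state shifts. The toy steady-state map below is an assumption for illustration, not one of the paper's network models:

```python
# Sketch of structure identification by parameter perturbation: estimate which
# state depends on which parameter from finite-difference responses of a
# steady-state map. The toy map is an assumed example.
def response_matrix(steady_state, p0, eps=1e-6):
    """Entry [i][j] approximates d(state_i)/d(parameter_j) at p0."""
    base = steady_state(p0)
    cols = []
    for j in range(len(p0)):
        p = list(p0)
        p[j] += eps  # perturb one parameter at a time
        pert = steady_state(p)
        cols.append([(x - b) / eps for x, b in zip(pert, base)])
    return [list(row) for row in zip(*cols)]  # transpose columns into rows

def toy_map(p):
    """Assumed toy steady state: x1 depends on p[0] only, x2 on both."""
    return [2.0 * p[0], p[0] + 3.0 * p[1]]

R = response_matrix(toy_map, [1.0, 1.0])
```

Near-zero entries of R (here R[0][1]) indicate the absence of a functional interaction, which is the structural information the experimental algebraic equations encode.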
Wolfrum, Ed (ORCID:0000000273618931); Knoshug, Eric (ORCID:000000025709914X); Laurens, Lieve (ORCID:0000000349303267); Harmon, Valerie; Dempster, Thomas (ORCID:000000029550488X); McGowan, John (ORCID:0000000266920518); Rosov, Theresa; Cardello, David; Arrowsmith, Sarah; Kempkes, Sarah; Bautista, Maria; Lundquist, Tryg; Crowe, Brandon; Murawsky, Garrett; Nicolai, Eric; Rowe, Egan; Knurek, Emily; Javar, Reyna; Saracco Alvarez, Marcela; Schlosser, Steve; Riddle, Mary; Withstandley, Chris; Chen, Yongsheng; Van Ginkel, Steven; Igou, Thomas; Xu, Chunyan; Hu, Zixuan
2017-10-20
ATP3 Unified Field Study Data
The Algae Testbed Public-Private Partnership (ATP3) was established with the goal of investigating open pond algae cultivation across different geographic, climatic, seasonal, and operational conditions while setting the benchmark for quality data collection, analysis, and dissemination. Identical algae cultivation systems and data analysis methodologies were established at testbed sites across the continental United States and Hawaii. Within this framework, the Unified Field Studies (UFS) were designed to characterize the cultivation of different algal strains during all 4 seasons across this testbed network. The dataset presented here is the complete, curated, climatic, cultivation, harvest, and biomass composition data for each season at each site. These data enable others to do in-depth cultivation, harvest, techno-economic, life cycle, resource, and predictive growth modeling analysis, as well as develop crop protection strategies for the nascent algae industry. NREL Subaward Number: DE-AC36-08-GO28308
A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.
Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D
2014-02-01
In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose from studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the use of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo-Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of the color moment invariants.
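The general recipe, using an orthogonal polynomial as the moment kernel, can be shown in miniature with real-valued Legendre moments of a 1-D signal; the paper's quaternion color moments extend this construction to color images. The test signal below is an assumption for illustration:

```python
# Miniature version of the recipe: an orthogonal polynomial (here Legendre)
# used as a moment kernel on a 1-D real signal.
def legendre(n, x):
    """Legendre polynomial P_n(x) via the three-term recurrence."""
    p_prev, p_curr = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p_curr = p_curr, ((2 * k + 1) * x * p_curr - k * p_prev) / (k + 1)
    return p_curr

def legendre_moment(signal, n):
    """Discrete order-n Legendre moment of a signal sampled on [-1, 1]."""
    N = len(signal)
    total = sum(f * legendre(n, -1.0 + 2.0 * i / (N - 1))
                for i, f in enumerate(signal))
    return (2 * n + 1) / (2.0 * N) * total  # continuous-moment normalization

sig = [x * x for x in range(-5, 6)]  # even (symmetric) test signal
```

Because the kernel is orthogonal, each moment captures an independent component of the signal; for the even signal above, all odd-order moments vanish.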
NASA Astrophysics Data System (ADS)
Codello, Alessandro; Jain, Rajeev Kumar
2018-05-01
We present a unified evolution of the universe from very early times until the present epoch by including both the leading local correction R^2 and the leading non-local term R (1/□^2) R in the classical gravitational action. We find that the inflationary phase driven by the R^2 term gracefully exits in a transitory regime characterized by coherent oscillations of the Hubble parameter. The universe then naturally enters into a radiation dominated epoch followed by a matter dominated era. At sufficiently late times after radiation-matter equality, the non-local term starts to dominate, inducing an accelerated expansion of the universe at the present epoch. We further exhibit the fact that both the leading local and non-local terms can be obtained within the covariant effective field theory of gravity. This scenario thus provides a unified picture of inflation and dark energy in a single framework by means of a purely gravitational action, without the usual need of a scalar field.
Multilayer network of language: A unified framework for structural analysis of linguistic subsystems
NASA Astrophysics Data System (ADS)
Martinčić-Ipšić, Sanda; Margan, Domagoj; Meštrović, Ana
2016-09-01
Recently, the focus of complex networks' research has shifted from the analysis of isolated properties of a system toward a more realistic modeling of multiple phenomena - multilayer networks. Motivated by the prosperity of multilayer approach in social, transport or trade systems, we introduce the multilayer networks for language. The multilayer network of language is a unified framework for modeling linguistic subsystems and their structural properties enabling the exploration of their mutual interactions. Various aspects of natural language systems can be represented as complex networks, whose vertices depict linguistic units, while links model their relations. The multilayer network of language is defined by three aspects: the network construction principle, the linguistic subsystem and the language of interest. More precisely, we construct a word-level (syntax and co-occurrence) and a subword-level (syllables and graphemes) network layers, from four variations of original text (in the modeled language). The analysis and comparison of layers at the word and subword-levels are employed in order to determine the mechanism of the structural influences between linguistic units and subsystems. The obtained results suggest that there are substantial differences between the networks' structures of different language subsystems, which are hidden during the exploration of an isolated layer. The word-level layers share structural properties regardless of the language (e.g. Croatian or English), while the syllabic subword-level expresses more language dependent structural properties. The preserved weighted overlap quantifies the similarity of word-level layers in weighted and directed networks. Moreover, the analysis of motifs reveals a close topological structure of the syntactic and syllabic layers for both languages. 
The findings corroborate that the multilayer network framework is a powerful, consistent and systematic approach to model several linguistic subsystems simultaneously and hence to provide a more unified view on language.
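The word-level co-occurrence layer described in this record can be sketched in a few lines. This is a minimal illustration under my own assumptions (a sliding window over tokens, directed weighted edges), not the authors' construction:

```python
from collections import Counter

def cooccurrence_layer(tokens, window=2):
    """Directed edges from each word to its successors inside the window;
    edge weight = co-occurrence count."""
    edges = Counter()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window, len(tokens))):
            edges[(w, tokens[j])] += 1
    return edges

layer = cooccurrence_layer("the cat sat on the mat".split())
print(len(layer))  # number of distinct directed edges
```

The same routine applied to syllable or grapheme sequences would yield the subword-level layers; the multilayer view then compares the resulting structures across layers.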
An initial framework for the language of higher-order thinking mathematics practices
NASA Astrophysics Data System (ADS)
Staples, Megan E.; Truxaw, Mary P.
2012-09-01
This article presents an examination of the language demands of cognitively demanding tasks and proposes an initial framework for the language demands of higher-order mathematics thinking practices. We articulate four categories for this framework: language of generalisation, language of comparison, language of proportional reasoning, and language of analysing impact. These categories were developed out of our collaborative work to design and implement higher-order thinking tasks with a group of Grade 9 (14- and 15-year-olds) teachers teaching in a linguistically diverse setting; analyses of student work samples on these tasks; and our knowledge of the literature. We describe each type of language demand and then analyse student work in each category to reveal linguistic challenges facing students as they engage these mathematical tasks. Implications for teaching and professional development are discussed.
NASA Technical Reports Server (NTRS)
1978-01-01
A unified framework for comparing intercity passenger and freight transportation systems is presented. Composite measures for cost, service/demand, energy, and environmental impact were determined. A set of 14 basic measures were articulated to form the foundation for computing the composite measures. A parameter dependency diagram, constructed to explicitly interrelate the composite and basic measures is discussed. Ground rules and methodology for developing the values of the basic measures are provided and the use of the framework with existing cost and service data is illustrated for various freight systems.
ERIC Educational Resources Information Center
Jones, Dustin L.; Tarr, James E.
2007-01-01
We analyze probability content within middle grades (6, 7, and 8) mathematics textbooks from a historical perspective. Two series, one popular and the other alternative, from four recent eras of mathematics education (New Math, Back to Basics, Problem Solving, and Standards) were analyzed using the Mathematical Tasks Framework (Stein, Smith,…
ERIC Educational Resources Information Center
Stohlmann, Micah Stephen
2012-01-01
This case study explored the impact of a standards-based mathematics and pedagogy class on preservice elementary teachers' beliefs and conceptual subject matter knowledge of linear functions. The framework for the standards-based mathematics and pedagogy class in this study involved the National Council of Teachers of Mathematics Standards,…
ERIC Educational Resources Information Center
Perry, Bob; Hampshire, Ann; Gervasoni, Ann; O'Neill, Will
2016-01-01
"Let's Count" is a preschool mathematics intervention implemented by The Smith Family from 2012 to the present in "disadvantaged" communities across Australia. It is based on current mathematics and early childhood education research and aligns with the Early Years Learning Framework. Let's Count has been shown to be effective…
ERIC Educational Resources Information Center
Miheso-O'Connor Khakasa, Marguerite; Berger, Margot
2016-01-01
Mathematical knowledge for teaching (MKT), defined by Ball ("Elementary Journal," 93, 373-397, 1993) as knowledge that is needed to teach mathematics, has been used as a framework by researchers to interrogate various aspects of teaching and learning mathematics. In this article, which draws from a larger study, we show how an in-depth…
An Emergent Framework: Views of Mathematical Processes
ERIC Educational Resources Information Center
Sanchez, Wendy B.; Lischka, Alyson E.; Edenfield, Kelly W.; Gammill, Rebecca
2015-01-01
The findings reported in this paper were generated from a case study of teacher leaders at a state-level mathematics conference. Investigation focused on how participants viewed the mathematical processes of communication, connections, representations, problem solving, and reasoning and proof. Purposeful sampling was employed to select nine…
Reconstructing Mathematics Pedagogy from a Constructivist Perspective.
ERIC Educational Resources Information Center
Simon, Martin A.
1995-01-01
Begins with an overview of the constructivist perspective and the pedagogical theory development upon which a constructivist teaching experiment with 20 prospective elementary teachers was based. Derives a theoretical framework for mathematics pedagogy with a focus on decisions about content and mathematical tasks. (49 references) (Author/DDD)
Mathematical Problem Solving through Sequential Process Analysis
ERIC Educational Resources Information Center
Codina, A.; Cañadas, M. C.; Castro, E.
2015-01-01
Introduction: The macroscopic perspective is one of the frameworks for research on problem solving in mathematics education. Coming from this perspective, our study addresses the stages of thought in mathematical problem solving, offering an innovative approach because we apply sequential relations and global interrelations between the different…
NASA Astrophysics Data System (ADS)
Perfors, Amy
2014-09-01
There is much to approve of in this provocative and interesting paper. I strongly agree with many of its points, especially that dichotomies like nature/nurture are actively detrimental to the field. I also appreciate the idea that cognitive scientists should take the "biological wetware" of the cell (rather than the network) more seriously.
Chimaera simulation of complex states of flowing matter.
Succi, S
2016-11-13
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro-meso-micro levels through suitable 'mutations' of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
A unified framework for gesture recognition and spatiotemporal gesture segmentation.
Alon, Jonathan; Athitsos, Vassilis; Yuan, Quan; Sclaroff, Stan
2009-09-01
Within the context of hand gesture recognition, spatiotemporal gesture segmentation is the task of determining, in a video sequence, where the gesturing hand is located and when the gesture starts and ends. Existing gesture recognition methods typically assume either known spatial segmentation or known temporal segmentation, or both. This paper introduces a unified framework for simultaneously performing spatial segmentation, temporal segmentation, and recognition. In the proposed framework, information flows both bottom-up and top-down. A gesture can be recognized even when the hand location is highly ambiguous and when information about when the gesture begins and ends is unavailable. Thus, the method can be applied to continuous image streams where gestures are performed in front of moving, cluttered backgrounds. The proposed method consists of three novel contributions: a spatiotemporal matching algorithm that can accommodate multiple candidate hand detections in every frame, a classifier-based pruning framework that enables accurate and early rejection of poor matches to gesture models, and a subgesture reasoning algorithm that learns which gesture models can falsely match parts of other longer gestures. The performance of the approach is evaluated on two challenging applications: recognition of hand-signed digits gestured by users wearing short-sleeved shirts, in front of a cluttered background, and retrieval of occurrences of signs of interest in a video database containing continuous, unsegmented signing in American Sign Language (ASL).
Danylov, Iu V; Motkov, K V; Shevchenko, T I
2014-01-01
The morphometric estimation of parenchyma and stroma condition included the determination of 25 parameters in the prostate glands of 27 persons. A mathematical model of prostate gland morphogenesis was created by Bayes' method. A method for the differential diagnosis of prostate gland tissue changes caused by the influence of the Chernobyl factor and/or unfavorable working conditions in underground coal mines has been developed. Its practical use provides diagnostic accuracy and reliability of at least 95%, independence from the doctor's level of qualification and personal experience, and allows the diagnostic algorithms to be unified, optimized and individualized, meeting the requirements of evidence-based medicine.
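The record names Bayes' method but gives no model details, so the following is only a generic Gaussian naive-Bayes sketch over morphometric parameters; the class labels, priors and feature values are invented for illustration and are not the authors' 25-parameter model:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def classify(sample, classes):
    """classes: {label: (prior, [(mu, sigma), ...])} -> most probable label
    under a naive (feature-independence) Bayes model."""
    best, best_p = None, -1.0
    for label, (prior, params) in classes.items():
        p = prior
        for x, (mu, sigma) in zip(sample, params):
            p *= gauss_pdf(x, mu, sigma)
        if p > best_p:
            best, best_p = label, p
    return best

# Two hypothetical diagnostic classes described by two morphometric features.
classes = {
    "radiation-related": (0.5, [(4.0, 1.0), (10.0, 2.0)]),
    "occupational":      (0.5, [(6.0, 1.0), (14.0, 2.0)]),
}
print(classify([4.2, 10.5], classes))
```

With 25 parameters per case the same loop simply iterates over 25 (mean, spread) pairs per class; the posterior comparison is unchanged.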
Danylov, Iu V; Motkov, K V; Shevchenko, T I
2014-01-01
The morphometric estimation of parenchyma and stroma condition included the determination of 29 parameters in the testicles of 27 persons. A mathematical model of testicular morphogenesis was created by Bayes' method. A method for the differential diagnosis of testicular tissue changes caused by the influence of the Chernobyl factor and/or unfavorable working conditions in underground coal mines has been developed. Its practical use provides diagnostic accuracy and reliability of at least 95%, independence from the doctor's level of qualification and personal experience, and allows the diagnostic algorithms to be unified, optimized and individualized, meeting the requirements of evidence-based medicine.
A Mathematical Framework for Image Analysis
1991-08-01
The results reported here were derived from the research project 'A Mathematical Framework for Image Analysis', supported by the Office of Naval Research, contract N00014-88-K-0289 to Brown University. A common theme for the work reported is the use of probabilistic methods for problems in image analysis and image reconstruction. Five areas of research are described: rigid body recognition using a decision tree/combinatorial approach; nonrigid…
ERIC Educational Resources Information Center
Kalinec-Craig, Crystal A.
2017-01-01
An elementary mathematics teacher once argued that she and her students held four Rights of the Learner in the classroom: (1) the right to be confused; (2) the right to claim a mistake; (3) the right to speak, listen and be heard; and (4) the right to write, do, and represent only what makes sense. Written as an emerging framework to promote…
Can quantum approaches benefit biology of decision making?
Takahashi, Taiki
2017-11-01
Human decision making has recently been focused in the emerging fields of quantum decision theory and neuroeconomics. The former discipline utilizes mathematical formulations developed in quantum theory, while the latter combines behavioral economics and neurobiology. In this paper, the author speculates on possible future directions unifying the two approaches, by contrasting the roles of quantum theory in the birth of molecular biology of the gene. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Vermont Department of Education, 2004
2004-01-01
This document, "Grade Expectations for Vermont's Framework of Standards and Learning Opportunities" (hereafter "Vermont's Grade Expectations"), is an important companion to "Vermont's Framework." These Grade Expectations (GEs) serve the same purposes as "Vermont's Framework," but articulate learning…
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
The semiotics of medical image Segmentation.
Baxter, John S H; Gibson, Eli; Eagleson, Roy; Peters, Terry M
2018-02-01
As the interaction between clinicians and computational processes increases in complexity, more nuanced mechanisms are required to describe how their communication is mediated. Medical image segmentation in particular affords a large number of distinct loci for interaction which can act on a deep, knowledge-driven level which complicates the naive interpretation of the computer as a symbol processing machine. Using the perspective of the computer as dialogue partner, we can motivate the semiotic understanding of medical image segmentation. Taking advantage of Peircean semiotic traditions and new philosophical inquiry into the structure and quality of metaphors, we can construct a unified framework for the interpretation of medical image segmentation as a sign exchange in which each sign acts as an interface metaphor. This allows for a notion of finite semiosis, described through a schematic medium, that can rigorously describe how clinicians and computers interpret the signs mediating their interaction. Altogether, this framework provides a unified approach to the understanding and development of medical image segmentation interfaces. Copyright © 2017 Elsevier B.V. All rights reserved.
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as a data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
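The core construction behind such climate networks can be sketched simply: grid points become vertices, and an edge links two points whose time series correlate strongly. This is a hedged illustration under my own assumptions (Pearson correlation, a fixed absolute threshold), not the OSTI authors' pipeline:

```python
import numpy as np

def climate_network(series, threshold=0.5):
    """series: (n_points, n_steps) array of time series per grid point.
    Returns a boolean adjacency matrix linking strongly correlated points."""
    corr = np.corrcoef(series)          # pairwise Pearson correlations
    adj = np.abs(corr) > threshold      # keep strong links only
    np.fill_diagonal(adj, False)        # no self-loops
    return adj

# Four noisy copies of one signal plus one independent series:
rng = np.random.default_rng(0)
base = rng.normal(size=100)
series = np.stack([base + 0.1 * rng.normal(size=100) for _ in range(4)]
                  + [rng.normal(size=100)])
adj = climate_network(series)
print(adj.sum() // 2)  # count of linked pairs
```

Clusters extracted from the resulting graph (e.g. connected components or community detection) would then play the role of the candidate climate indices described above.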
The thermodynamics of dense granular flow and jamming
NASA Astrophysics Data System (ADS)
Lu, Shih Yu
The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated but at the moment, no self-consistent framework can apply to grains, foam and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation-of-state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy provides a crucial road map for a unifying theoretical framework in condensed matter, for example, ranging from sand to fire retardants to toothpaste.
Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve
2017-12-01
In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure so that the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag, (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification, and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of the M²IL. Experiments and analyses in many practical applications prove the effectiveness of the M²IL.
Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data
Yang, Yan; Simpson, Douglas
2010-01-01
Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
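A concrete instance of the inflated-mixture class this record discusses is the zero-inflated Poisson. The sketch below is my own minimal EM loop for it, under assumptions (a single inflation point at zero, no covariates) far simpler than the article's regression setting:

```python
import math

def fit_zip(counts, iters=200):
    """EM for a zero-inflated Poisson.
    Returns (pi, lam): structural-zero probability and Poisson rate."""
    n = len(counts)
    pi, lam = 0.5, sum(counts) / n + 1e-9   # crude starting values
    for _ in range(iters):
        # E-step: posterior probability that an observed zero is structural
        p0 = pi / (pi + (1 - pi) * math.exp(-lam))
        z = [p0 if y == 0 else 0.0 for y in counts]
        # M-step: update mixing weight and Poisson rate
        pi = sum(z) / n
        lam = sum(counts) / sum(1 - zi for zi in z)
    return pi, lam

# Inflated zeros plus Poisson-like counts:
data = [0] * 60 + [0, 1, 2, 3, 4] * 8
pi, lam = fit_zip(data)
print(round(pi, 2), round(lam, 2))
```

The article's broader point is that the same E/M skeleton, with the zero component replaced or covariates added, covers the whole family of bound-inflated models; standard errors then come from the numerical Hessian or the generalized Louis method after convergence.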
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, here the domain anomaly detection methodology is applied to the problem of anomaly detection for a video annotation system.
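Two facets from the taxonomy above, a point outlier versus a distribution drift, can be separated even in a toy setting. The sketch below is my own minimal illustration (a reference Gaussian, a z-score test for outliers, a window-mean test for drift), not the authors' Bayesian apparatus:

```python
import statistics

def detect(reference, window, point, k=3.0):
    """Flag a point outlier and/or a drift of the recent window
    relative to reference data assumed roughly Gaussian."""
    mu = statistics.mean(reference)
    sd = statistics.stdev(reference)
    facets = []
    if abs(point - mu) > k * sd:                              # point outlier
        facets.append("outlier")
    if abs(statistics.mean(window) - mu) > 2 * sd / len(window) ** 0.5:
        facets.append("distribution drift")                   # window shift
    return facets

ref = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
print(detect(ref, window=[11.0, 11.1, 10.9, 11.2], point=15.0))
```

In the paper's terms, the point test and the window test correspond to different anomaly facets; the unified framework's contribution is treating such tests, plus novelty and incongruence detection, within one Bayesian taxonomy rather than ad hoc.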