Sample records for generally applicable model

  1. An Application of Unfolding and Cumulative Item Response Theory Models for Noncognitive Scaling: Examining the Assumptions and Applicability of the Generalized Graded Unfolding Model

    ERIC Educational Resources Information Center

    Sgammato, Adrienne N.

    2009-01-01

    This study examined the applicability of a relatively new unidimensional, unfolding item response theory (IRT) model called the generalized graded unfolding model (GGUM; Roberts, Donoghue, & Laughlin, 2000). A total of four scaling methods were applied. Two commonly used cumulative IRT models for polytomous data, the Partial Credit Model and…

  2. Modeling Answer Change Behavior: An Application of a Generalized Item Response Tree Model

    ERIC Educational Resources Information Center

    Jeon, Minjeong; De Boeck, Paul; van der Linden, Wim

    2017-01-01

    We present a novel application of a generalized item response tree model to investigate test takers' answer change behavior. The model allows us to simultaneously model the observed patterns of the initial and final responses after an answer change as a function of a set of latent traits and item parameters. The proposed application is illustrated…

  3. 40 CFR 600.501-12 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy § 600.501-12 General applicability. The provisions of this subpart are applicable to 2012 and later model year passenger automobiles...

  4. Formulation and Application of the Generalized Multilevel Facets Model

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Liu, Chih-Yu

    2007-01-01

    In this study, the authors develop a generalized multilevel facets model, which is not only a multilevel and two-parameter generalization of the facets model, but also a multilevel and facet generalization of the generalized partial credit model. Because the new model is formulated within a framework of nonlinear mixed models, no efforts are…

  5. A connectionist model for dynamic control

    NASA Technical Reports Server (NTRS)

    Whitfield, Kevin C.; Goodall, Sharon M.; Reggia, James A.

    1989-01-01

    The application of a connectionist modeling method known as competition-based spreading activation to a camera tracking task is described. The potential for automating control and planning applications using connectionist technology is explored. The emphasis is on applications suitable for use in the NASA Space Station and in related space activities. The results are quite general and could be applicable to other control systems.
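    A minimal sketch of the named technique, competition-based spreading activation (the update rule, decay value, and weights below are illustrative assumptions, not the system described in the report):

```python
import numpy as np

# One step of competition-based spreading activation: each node spreads its
# activation along outgoing links, and competition is modeled by giving every
# sender a fixed output budget that its targets must share (row normalization).
def spread_step(a, W, decay=0.1):
    row_sums = np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    Wn = W / row_sums                  # competing targets split the sender's output
    return (1 - decay) * a + Wn.T @ a  # decayed activation plus received input

# Tiny 3-node network: node 0 feeds nodes 1 and 2; node 1 feeds node 2.
W = np.array([[0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
a = np.array([1.0, 0.0, 0.0])          # activation starts at node 0
for _ in range(5):
    a = spread_step(a, W)
```

    After a few steps, activation has propagated from the source node to its neighbors while the source itself decays.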

  6. 40 CFR 600.301-12 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and Later Model Year Automobiles-Labeling § 600.301-12 General applicability. (a) Unless otherwise... 40 Protection of Environment 29 2010-07-01 2010-07-01 false General applicability. 600.301-12 Section 600.301-12 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY...

  7. Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)

    DTIC Science & Technology

    2009-05-01

    Figures: Figure 2-1, General Flowchart of Software Application; Figure 2-2, Overview of the Genetic Algorithm Approach; Figure 2-3, Example of a...The software components (...and Model Builder) are highlighted on Figure 2-1, a general flowchart illustrating the application of the software. The software is applied...monitoring event (e.g., contaminant mass based on interpolation); that modeling is provided by Model Builder.

  8. Application of a Cognitive Diagnostic Model to a High-Stakes Reading Comprehension Test

    ERIC Educational Resources Information Center

    Ravand, Hamdollah

    2016-01-01

    General cognitive diagnostic models (CDM) such as the generalized deterministic input, noisy, "and" gate (G-DINA) model are flexible in that they allow for both compensatory and noncompensatory relationships among the subskills within the same test. Most of the previous CDM applications in the literature have been add-ons to simulation…

  9. 40 CFR 600.501-85 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy § 600.501-85 General... applicable to 1985 and later model year gasoline-fueled and diesel automobiles. (b)(1) Manufacturers that...

  10. A generalized target theory and its applications.

    PubMed

    Zhao, Lei; Mi, Dong; Hu, Bei; Sun, Yeqing

    2015-09-28

    Different radiobiological models have been proposed to estimate cell-killing effects, which are very important in radiotherapy and radiation risk assessment. However, most applied models have their own scopes of application. In this work, by generalizing the relationship between "hit" and "survival" in traditional target theory with the Yager negation operator from fuzzy mathematics, we propose a generalized target model of radiation-induced cell inactivation that takes into account both cellular repair effects and indirect effects of radiation. The simulation results of the model, and a rethinking of "the number of targets in a cell" and "the number of hits per target", suggest that it is only necessary to investigate the generalized single-hit single-target (GSHST) case in the present theoretical frame. Analysis shows that the GSHST model reduces to the linear-quadratic model and the multitarget model in the low-dose and high-dose regions, respectively. The fitting results show that the GSHST model agrees well with the usual experimental observations. In addition, the present model can be used to effectively predict cellular repair capacity, radiosensitivity, and target size, and especially the biologically effective dose for treatment planning in clinical applications.
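    For orientation, the two classical limits named in the abstract have standard textbook forms (the parameter values below are arbitrary illustrations, not fitted values from the paper):

```python
import numpy as np

# Classic multi-target survival: the cell dies only when all n targets are hit;
# each target is hit with probability 1 - exp(-D/D0) at dose D.
def survival_multitarget(D, D0, n):
    return 1.0 - (1.0 - np.exp(-D / D0)) ** n

# Linear-quadratic (LQ) survival, the abstract's stated low-dose reduction.
def survival_lq(D, alpha, beta):
    return np.exp(-(alpha * D + beta * D ** 2))

D = np.linspace(0.0, 8.0, 81)           # dose grid (illustrative, in Gy)
s_mt = survival_multitarget(D, D0=1.5, n=3)
s_lq = survival_lq(D, alpha=0.2, beta=0.05)
```

    Both curves start at full survival and decay with dose; the multi-target curve shows the characteristic shoulder at low doses.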

  11. Search algorithm complexity modeling with application to image alignment and matching

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2014-05-01

    Search algorithm complexity modeling, in the form of penetration rate estimation, provides a useful way to estimate search efficiency in application domains which involve searching over a hypothesis space of reference templates or models, as in model-based object recognition, automatic target recognition, and biometric recognition. The penetration rate quantifies the expected portion of the database that must be searched, and is useful for estimating search algorithm computational requirements. In this paper we perform mathematical modeling to derive general equations for penetration rate estimates that are applicable to a wide range of recognition problems. We extend previous penetration rate analyses to use more general probabilistic modeling assumptions. In particular we provide penetration rate equations within the framework of a model-based image alignment application domain in which a prioritized hierarchical grid search is used to rank subspace bins based on matching probability. We derive general equations, and provide special cases based on simplifying assumptions. We show how previously-derived penetration rate equations are special cases of the general formulation. We apply the analysis to model-based logo image alignment in which a hierarchical grid search is used over a geometric misalignment transform hypothesis space. We present numerical results validating the modeling assumptions and derived formulation.
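    A simplified illustration of the core quantity (a toy formulation, not the paper's derivation): if a prioritized search examines bins in order of decreasing match probability, the penetration rate is the expected fraction of bins examined before the match is found.

```python
import numpy as np

# Toy penetration-rate estimate for a prioritized hypothesis search.
def penetration_rate(match_probs):
    p = np.sort(np.asarray(match_probs, dtype=float))[::-1]  # search best-first
    p = p / p.sum()                       # normalize to a distribution over bins
    ranks = np.arange(1, p.size + 1)      # bin k is the k-th one examined
    return float(np.sum(ranks * p) / p.size)

# A sharply peaked prior searches far less of the database on average than a
# flat prior, whose penetration rate is (N+1)/(2N):
r_peaked = penetration_rate([0.7, 0.2, 0.05, 0.03, 0.02])
r_flat = penetration_rate([0.2, 0.2, 0.2, 0.2, 0.2])
```

    Here r_flat is 0.6 (the uninformed average for five bins), while the peaked prior yields a substantially smaller value.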

  12. Adding Temporal Characteristics to Geographical Schemata and Instances: A General Framework

    NASA Astrophysics Data System (ADS)

    Ota, Morishige

    2018-05-01

    This paper proposes the temporal general feature model (TGFM) as a meta-model for application schemata representing changes of real-world phenomena. It is not very easy to determine history directly from the current application schemata, even if the revision notes are attached to the specification. To solve this problem, the rules for description of the succession between previous and posterior components are added to the general feature model, thus resulting in TGFM. After discussing the concepts associated with the new model, simple examples of application schemata are presented as instances of TGFM. Descriptors for changing properties, the succession of changing properties in moving features, and the succession of features and associations are introduced. The modeling methods proposed in this paper will contribute to the acquisition of consistent and reliable temporal geospatial data.

  13. Dimensions for hearing-impaired mobile application usability model

    NASA Astrophysics Data System (ADS)

    Nathan, Shelena Soosay; Hussain, Azham; Hashim, Nor Laily; Omar, Mohd Adan

    2017-10-01

    This paper discusses the dimensions derived for a usability model for hearing-impaired mobile applications. General usability models provide general dimensions for evaluating mobile applications; however, the requirements of the hearing-impaired are overlooked and often scanted, so mobile applications developed for the hearing-impaired are left unused. It is also apparent that these usability models do not consider accessibility dimensions that reflect the requirements of these special users. This complicates the work of usability practitioners, as well as of academics who research usability, when applications are developed for specific user needs. To overcome this issue, the dimensions chosen for the hearing-impaired were checked for alignment with the real needs of hearing-impaired mobile applications. Besides literature studies, requirements for hearing-impaired mobile applications were identified through interviews with hearing-impaired mobile application users, which were recorded as video and analyzed using NVivo. Finally, 6 of the 15 dimensions gathered were chosen for the proposed model and are presented.

  14. Energy Savings Forecast of Solid-State Lighting in General Illumination Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Penning, Julie; Stober, Kelsey; Taylor, Victor

    2016-09-01

    The DOE report, Energy Savings Forecast of Solid-State Lighting in General Illumination Applications, is a biannual report which models the adoption of LEDs in the U.S. general-lighting market, along with associated energy savings, based on the full potential DOE has determined to be technically feasible over time. This version of the report uses an updated 2016 U.S. lighting-market model that is more finely calibrated and granular than previous models, and extends the forecast period to 2035 from the 2030 limit that was used in previous editions.

  15. NON-SPATIAL CALIBRATIONS OF A GENERAL UNIT MODEL FOR ECOSYSTEM SIMULATIONS. (R825792)

    EPA Science Inventory

    General Unit Models simulate system interactions aggregated within one spatial unit of resolution. For unit models to be applicable to spatial computer simulations, they must be formulated generally enough to simulate all habitat elements within the landscape. We present the d...

  16. NON-SPATIAL CALIBRATIONS OF A GENERAL UNIT MODEL FOR ECOSYSTEM SIMULATIONS. (R827169)

    EPA Science Inventory

    General Unit Models simulate system interactions aggregated within one spatial unit of resolution. For unit models to be applicable to spatial computer simulations, they must be formulated generally enough to simulate all habitat elements within the landscape. We present the d...

  17. The Random-Threshold Generalized Unfolding Model and Its Application of Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Liu, Chen-Wei; Wu, Shiu-Lien

    2013-01-01

    The random-threshold generalized unfolding model (RTGUM) was developed by treating the thresholds in the generalized unfolding model as random effects rather than fixed effects to account for the subjective nature of the selection of categories in Likert items. The parameters of the new model can be estimated with the JAGS (Just Another Gibbs…

  18. A simple and exploratory way to determine the mean-variance relationship in generalized linear models.

    PubMed

    Tsou, Tsung-Shan

    2007-03-30

    This paper introduces an exploratory way to determine how the variance relates to the mean in generalized linear models. This novel method employs the robust likelihood technique introduced by Royall and Tsou. A urinary data set collected by Ginsberg et al. and the fabric data set analysed by Lee and Nelder are considered to demonstrate the applicability and simplicity of the proposed technique. Application of the proposed method could easily reveal a mean-variance relationship that would generally go unnoticed, or that would require more complex modelling to detect. Copyright (c) 2006 John Wiley & Sons, Ltd.

  19. Derivation of the linear-logistic model and Cox's proportional hazard model from a canonical system description.

    PubMed

    Voit, E O; Knapp, R G

    1997-08-15

    The linear-logistic regression model and Cox's proportional hazard model are widely used in epidemiology. Their successful application leaves no doubt that they are accurate reflections of observed disease processes and their associated risks or incidence rates. In spite of their prominence, it is not a priori evident why these models work. This article presents a derivation of the two models from the framework of canonical modeling. It begins with a general description of the dynamics between risk sources and disease development, formulates this description in the canonical representation of an S-system, and shows how the linear-logistic model and Cox's proportional hazard model follow naturally from this representation. The article interprets the model parameters in terms of epidemiological concepts as well as in terms of general systems theory and explains the assumptions and limitations generally accepted in the application of these epidemiological models.
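    For reference, the two models derived in the article take their conventional forms (standard notation, not necessarily the article's):

```latex
% Linear-logistic regression: log-odds of disease linear in the risk factors x_i
\log\frac{p(\mathbf{x})}{1-p(\mathbf{x})} \;=\; \beta_0 + \sum_{i=1}^{k}\beta_i x_i
% Cox proportional hazards: a baseline hazard scaled by a log-linear risk score
\lambda(t \mid \mathbf{x}) \;=\; \lambda_0(t)\,\exp\!\Big(\sum_{i=1}^{k}\beta_i x_i\Big)
```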

  20. Robust integration schemes for generalized viscoplasticity with internal-state variables. Part 1: Theoretical developments and applications

    NASA Technical Reports Server (NTRS)

    Saleeb, Atef F.; Li, Wei

    1995-01-01

    This two-part report is concerned with the development of a general framework for implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, to address in detail the algorithmic and numerical-analysis aspects involved in its finite element implementation, and to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, representative of the present state of the art in metal viscoplasticity, are considered in the applications reported here: the fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both models are directly applicable to both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as in subspace applications (i.e., plane stress/strain, axisymmetric, and generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for the residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both the GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (and independent of the number of viscoplastic internal-state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence. In this first part of the report, we focus on the theoretical developments and discuss the results of numerical-performance studies using the integration schemes for the GVIPS and NAV models.

  1. Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN

    NASA Technical Reports Server (NTRS)

    Griffis, H.

    1985-01-01

    Large general-purpose finite element programs require users to develop large quantities of input data. General-purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by application-specific pre-processors; the Automatic Dynamic Aircraft Modeler (ADAM) is one such pre-processor. General-purpose pre-processors use points, lines and surfaces to describe geometric shapes. Because ADAM is used only for aircraft structures, generic structural sections, wing boxes and bodies, can be pre-defined. Hence, with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions, a complete model of an aircraft can be created.

  2. Improved Doubly Robust Estimation when Data are Monotonely Coarsened, with Application to Longitudinal Studies with Dropout

    PubMed Central

    Tsiatis, Anastasios A.; Davidian, Marie; Cao, Weihua

    2010-01-01

    A routine challenge is that of making inference on parameters in a statistical model of interest from longitudinal data subject to dropout, which are a special case of the more general setting of monotonely coarsened data. Considerable recent attention has focused on doubly robust estimators, which in this context involve positing models for both the missingness (more generally, coarsening) mechanism and aspects of the distribution of the full data, and which have the appealing property of yielding consistent inferences if only one of these models is correctly specified. Doubly robust estimators have been criticized for potentially disastrous performance when both of these models are even only mildly misspecified. We propose a doubly robust estimator applicable in general monotone coarsening problems that achieves comparable or improved performance relative to existing doubly robust methods, which we demonstrate via simulation studies and by application to data from an AIDS clinical trial. PMID:20731640

  3. The General Ensemble Biogeochemical Modeling System (GEMS) and its applications to agricultural systems in the United States: Chapter 18

    USGS Publications Warehouse

    Liu, Shuguang; Tan, Zhengxi; Chen, Mingshi; Liu, Jinxun; Wein, Anne; Li, Zhengpeng; Huang, Shengli; Oeding, Jennifer; Young, Claudia; Verma, Shashi B.; Suyker, Andrew E.; Faulkner, Stephen P.

    2012-01-01

    The General Ensemble Biogeochemical Modeling System (GEMS) was designed to address uncertainty in two ways. First, to account for uncertainties in individual models, it uses multiple site-scale biogeochemical models to perform model simulations. Second, it adopts Monte Carlo ensemble simulations of each simulation unit (one site/pixel or group of sites/pixels with similar biophysical conditions) to incorporate uncertainties and variability (as measured by variances and covariance) of input variables into model simulations. In this chapter, we illustrate the applications of GEMS at the site and regional scales with an emphasis on incorporating agricultural practices. Challenges in modeling soil carbon dynamics and greenhouse gas emissions are also discussed.

  4. Generalized random sign and alert delay models for imperfect maintenance.

    PubMed

    Dijoux, Yann; Gaudoin, Olivier

    2014-04-01

    This paper considers the modelling of the process of corrective and condition-based preventive maintenance for complex repairable systems. In order to take into account the dependency between both types of maintenance and the possibility of imperfect maintenance, Generalized Competing Risks models were introduced in Doyen and Gaudoin (J Appl Probab 43:825-839, 2006). In this paper, we study two classes of these models, the Generalized Random Sign and Generalized Alert Delay models. A Generalized Competing Risks model can be built as a generalization of a particular Usual Competing Risks model, either by using a virtual age framework or not. The models' properties are studied and their parameterizations are discussed. Finally, simulation results and an application to real data are presented.

  5. A generalized predictive model for direct gain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Givoni, B.

    In the correlational model for direct gain developed by the Los Alamos National Laboratory, a list of constants applicable to different types of buildings or passive solar systems was specified separately for each type. In its original form, the model was applicable only to buildings similar in their heat capacity, type of glazing, or night insulation to the types specified by the model. While maintaining the general form of the predictive equations, the new model, the predictive model for direct gain (PMDG), replaces the constants with functions dependent upon the thermal properties of the building, or the components of the solar system, or both. By this transformation, the LANL model for direct gain becomes a generalized one. The new model predicts the performance of buildings heated by direct gain with any heat capacity, glazing, and night insulation as functions of their thermophysical properties and climatic conditions.

  6. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, Antoine; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
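    The GLM fitting machinery these papers build on can be sketched in a few lines of numpy (a hedged illustration of iteratively reweighted least squares for a logistic GLM; the simulated data and parameter values are assumptions, not anything from the workshop papers):

```python
import numpy as np

# Iteratively reweighted least squares (IRLS) for a logistic-regression GLM.
def fit_logistic_irls(X, y, n_iter=25):
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))   # inverse logit link
        w = mu * (1.0 - mu)               # GLM working weights (binomial variance)
        z = eta + (y - mu) / w            # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_beta = np.array([-0.5, 1.5])
y = (rng.uniform(size=500) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
beta_hat = fit_logistic_irls(X, y)        # should land near (-0.5, 1.5)
```

    A GAM replaces each linear term with a smooth function of the predictor, fitted by the same weighted-least-squares inner loop.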

  7. A Generalized Partial Credit Model: Application of an EM Algorithm.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1992-01-01

    The partial credit model with a varying slope parameter is developed and called the generalized partial credit model (GPCM). Analysis results for simulated data by this and other polytomous item-response models demonstrate that the rating formulation of the GPCM is adaptable to the analysis of polytomous item responses. (SLD)
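    The GPCM's category response function has a closed form; a small numpy sketch (the parameter values are made up for illustration):

```python
import numpy as np

# Generalized partial credit model: probability of each score category 0..m for
# a person of ability theta on an item with slope a and step difficulties b.
def gpcm_probs(theta, a, b):
    # exponents are cumulative sums of a*(theta - b_v); the empty sum (category 0) is 0
    z = np.concatenate(([0.0], np.cumsum(a * (theta - np.asarray(b)))))
    ez = np.exp(z - z.max())              # subtract the max for numerical stability
    return ez / ez.sum()

p = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.0])   # 4 category probabilities
```

    Setting a = 1 recovers the ordinary partial credit model as a special case.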

  8. Variable Complexity Structural Optimization of Shells

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Venkataraman, Satchi

    1999-01-01

    Structural designers today face both opportunities and challenges in a vast array of available analysis and optimization programs. Some programs, such as NASTRAN, are very general, permitting the designer to model any structure to any degree of accuracy, but often at a higher computational cost. Additionally, such general procedures often do not allow easy implementation of all constraints of interest to the designer. Other programs, based on algebraic expressions used by designers one generation ago, have limited applicability for general structures with modern materials. However, when applicable, they provide easy understanding of design decision trade-offs. Finally, designers can also use specialized programs suitable for efficiently designing a subset of structural problems. For example, PASCO and PANDA2 are panel design codes, which calculate response and estimate failure much more efficiently than general-purpose codes, but are narrowly applicable in terms of geometry and loading. Therefore, the problem of optimizing structures based on simultaneous use of several models and computer programs is a subject of considerable interest. The problem of using several levels of models in optimization has been dubbed variable complexity modeling. Work under NASA grant NAG1-2110 has been concerned with the development of variable complexity modeling strategies, with special emphasis on response surface techniques. In addition, several modeling issues for the design of shells of revolution were studied.

  9. Variable Complexity Structural Optimization of Shells

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Venkataraman, Satchi

    1998-01-01

    Structural designers today face both opportunities and challenges in a vast array of available analysis and optimization programs. Some programs, such as NASTRAN, are very general, permitting the designer to model any structure to any degree of accuracy, but often at a higher computational cost. Additionally, such general procedures often do not allow easy implementation of all constraints of interest to the designer. Other programs, based on algebraic expressions used by designers one generation ago, have limited applicability for general structures with modern materials. However, when applicable, they provide easy understanding of design decision trade-offs. Finally, designers can also use specialized programs suitable for efficiently designing a subset of structural problems. For example, PASCO and PANDA2 are panel design codes, which calculate response and estimate failure much more efficiently than general-purpose codes, but are narrowly applicable in terms of geometry and loading. Therefore, the problem of optimizing structures based on simultaneous use of several models and computer programs is a subject of considerable interest. The problem of using several levels of models in optimization has been dubbed variable complexity modeling. Work under NASA grant NAG1-1808 has been concerned with the development of variable complexity modeling strategies, with special emphasis on response surface techniques. In addition, several modeling issues for the design of shells of revolution were studied.

  10. Labyrinth, An Abstract Model for Hypermedia Applications. Description of its Static Components.

    ERIC Educational Resources Information Center

    Diaz, Paloma; Aedo, Ignacio; Panetsos, Fivos

    1997-01-01

    The model for hypermedia applications called Labyrinth allows: (1) the design of platform-independent hypermedia applications; (2) the categorization, generalization and abstraction of sparse unstructured heterogeneous information in multiple and interconnected levels; (3) the creation of personal views in multiuser hyperdocuments for both groups…

  11. The Need for a Contemporary Theory of Job Design.

    ERIC Educational Resources Information Center

    Martelli, Joseph T.

    1982-01-01

    Presents a critique of Taylor's scientific management theory and the negative consequences of work simplification. Compares this method with Maslow's, Herzberg's, and Thorsrud's theories of motivation, and contrasts the experiences of General Motors' application of Taylor's model and General Foods' application of Thorsrud's. (SK)

  12. Generalized Ordinary Differential Equation Models

    PubMed Central

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-01-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method. PMID:25544787
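    The underlying estimation task can be illustrated with a toy version (a hedged sketch of fitting one ODE parameter to noisy discrete data by grid-search least squares; the GODE paper's actual likelihood machinery is more sophisticated):

```python
import numpy as np

# Forward-Euler solution of dy/dt = -k*y on the grid t (a deliberately crude
# solver, to echo the paper's point that numerical error enters the estimate).
def euler_solve(k, y0, t):
    y = np.empty_like(t)
    y[0] = y0
    for i in range(1, t.size):
        y[i] = y[i - 1] + (t[i] - t[i - 1]) * (-k * y[i - 1])
    return y

t = np.linspace(0.0, 5.0, 51)
rng = np.random.default_rng(1)
y_obs = 2.0 * np.exp(-0.8 * t) + rng.normal(scale=0.02, size=t.size)  # true k = 0.8

ks = np.linspace(0.1, 2.0, 191)           # candidate decay rates
sse = [np.sum((euler_solve(k, y_obs[0], t) - y_obs) ** 2) for k in ks]
k_hat = float(ks[int(np.argmin(sse))])    # least-squares estimate of k
```

    Both the measurement noise and the Euler discretization bias k_hat away from 0.8, the two error sources the paper treats jointly.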

  13. Generalized Ordinary Differential Equation Models.

    PubMed

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-10-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method.

  14. Working covariance model selection for generalized estimating equations.

    PubMed

    Carey, Vincent J; Wang, You-Gan

    2011-11-20

    We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.
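    The first criterion can be sketched directly (a simplified illustration with known unit variances and centered residuals; the clustered data here are simulated, not the clinical dataset):

```python
import numpy as np

# Gaussian pseudolikelihood of clustered residuals under a working correlation R.
def gaussian_pseudologlik(resid_by_cluster, R):
    Rinv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    return sum(-0.5 * (logdet + r @ Rinv @ r) for r in resid_by_cluster)

def exchangeable(m, rho):
    # all pairs within a cluster share correlation rho
    return (1.0 - rho) * np.eye(m) + rho * np.ones((m, m))

# Simulate 200 clusters of size 4 with true exchangeable correlation 0.5.
rng = np.random.default_rng(2)
m, rho = 4, 0.5
L = np.linalg.cholesky(exchangeable(m, rho))
clusters = [L @ rng.normal(size=m) for _ in range(200)]

ll_indep = gaussian_pseudologlik(clusters, np.eye(m))
ll_exch = gaussian_pseudologlik(clusters, exchangeable(m, rho))
```

    The correctly specified working structure attains the higher pseudolikelihood, which is the basis of the selection rule.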

  15. 40 CFR 86.005-1 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Emission Regulations for 1977 and Later Model Year New Light-Duty Vehicles, Light-Duty Trucks and Heavy-Duty Engines, and for 1985 and Later Model Year New Gasoline Fueled, Natural Gas-Fueled, Liquefied... this subpart generally apply to 2005 and later model year new Otto-cycle heavy-duty engines used in...

  16. A haptic model of vibration modes in spherical geometry and its application in atomic physics, nuclear physics and beyond

    NASA Astrophysics Data System (ADS)

    Ubben, Malte; Heusler, Stefan

    2018-07-01

    Vibration modes in spherical geometry can be classified based on the number and position of nodal planes. However, the geometry of these planes is non-trivial and cannot be easily displayed in two dimensions. We present 3D-printed models of these vibration modes, enabling a haptic approach to understanding essential features of bound states in quantum physics and beyond. In particular, when applied to atomic physics, atomic orbitals are obtained in a natural manner. Applied to nuclear physics, the same patterns of vibration modes emerge as a cornerstone of the nuclear shell model. These applications of the very same model across more than 5 orders of magnitude in length scale lead to a general discussion of the applicability and limits of validity of physical models in general.

  17. Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application

    NASA Astrophysics Data System (ADS)

    Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei

    2016-04-01

In centralized decision problems, it is not complicated for decision-makers to make modelling technique selections under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary with different decision-makers because of their particular risk tolerances and management requirements. Considering the general differences among the decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrating different modelling methods at different levels). Specifically, we discuss two of these models which may have wide applications in many fields. Furthermore, we apply the two proposed models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. In order to solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, neural network and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.

  18. 40 CFR 86.201-94 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for 1994 and Later Model Year Gasoline-Fueled New Light-Duty Vehicles, New Light-Duty Trucks and New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.201-94 General applicability. (a) This...

  19. Application of General Regression Neural Network to the Prediction of LOD Change

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Hong; Wang, Qi-Jie; Zhu, Jian-Jun; Zhang, Hao

    2012-01-01

Traditional methods for predicting the change in length of day (LOD change) are mainly based on linear models, such as the least squares model and the autoregression model. However, the LOD change comprises complicated non-linear factors, and the prediction performance of linear models is often not ideal. Thus, a non-linear method, the general regression neural network (GRNN) model, is applied to the prediction of the LOD change, and the result is compared with the predictions obtained from the BP (back propagation) neural network model and other models. The comparison shows that the application of the GRNN to the prediction of the LOD change is highly effective and feasible.
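The GRNN prediction at a query point is a Gaussian-kernel-weighted average of the training targets; a minimal sketch on synthetic data standing in for an LOD series (the data and smoothing parameter are illustrative assumptions, not the paper's):

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    """General regression neural network (Nadaraya-Watson form):
    the prediction is a Gaussian-kernel-weighted average of training targets."""
    # Squared distances from each query point to each training point.
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy 1-D noisy series standing in for an LOD-change record (hypothetical).
t = np.linspace(0, 10, 50)
y = np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=50)
y_hat = grnn_predict(t, y, np.array([5.0]))
print(y_hat)  # kernel-smoothed estimate near sin(5)
```

The single smoothing parameter sigma is what makes the GRNN attractive for small geodetic series: there is no iterative training, only a choice of kernel width.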

  20. Modelling uncertainty with generalized credal sets: application to conjunction and decision

    NASA Astrophysics Data System (ADS)

    Bronevich, Andrey G.; Rozenberg, Igor N.

    2018-01-01

To model conflict, non-specificity and contradiction in information, upper and lower generalized credal sets are introduced. Any upper generalized credal set is a convex subset of plausibility measures interpreted as lower probabilities whose bodies of evidence consist of singletons and a certain event. Analogously, contradiction is modelled in the theory of evidence by a belief function that is greater than zero at the empty set. Based on generalized credal sets, we extend the conjunctive rule for contradictory sources of information, introduce constructions like the natural extension in the theory of imprecise probabilities, and show that the model of generalized credal sets coincides with the model of imprecise probabilities if the profile of a generalized credal set consists of probability measures. We show how the introduced model can be applied to decision problems.

  1. A generalized right truncated bivariate Poisson regression model with applications to health data.

    PubMed

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or underdispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  2. A generalized right truncated bivariate Poisson regression model with applications to health data

    PubMed Central

    Islam, M. Ataharul; Chowdhury, Rafiqul I.

    2017-01-01

A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or underdispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model. PMID:28586344

  3. A General-applications Direct Global Matrix Algorithm for Rapid Seismo-acoustic Wavefield Computations

    NASA Technical Reports Server (NTRS)

    Schmidt, H.; Tango, G. J.; Werby, M. F.

    1985-01-01

A new matrix method for rapid wave propagation modeling in generalized stratified media, which has recently been applied to numerical simulations in diverse areas of underwater acoustics, solid earth seismology, and nondestructive ultrasonic scattering, is explained and illustrated. A portion of recent efforts jointly undertaken by the NATO SACLANT and NORDA Numerical Modeling groups in developing, implementing, and testing a new fast general-applications wave propagation algorithm, SAFARI, formulated at SACLANT, is summarized. The present general-applications SAFARI program uses a Direct Global Matrix Approach to multilayer Green's function calculation. A rapid and unconditionally stable solution is readily obtained via simple Gaussian elimination on the resulting sparsely banded block system, precisely analogous to that arising in the Finite Element Method. The resulting gains in accuracy and computational speed allow consideration of much larger multilayered air/ocean/Earth/engineering material media models, for many more source-receiver configurations than previously possible. The validity and versatility of the SAFARI-DGM method is demonstrated by reviewing three practical examples of engineering interest, drawn from ocean acoustics, engineering seismology and ultrasonic scattering.

  4. 40 CFR 600.501-86 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy and Manufacturer's... subpart are applicable to 1986 and later model year gasoline-fueled and diesel automobiles. (b)(1...

  5. Estimating Neutral Atmosphere Drivers using a Physical Model

    DTIC Science & Technology

    2009-03-30

Araujo-Pradere, M. Fedrizzi, 2007, Memory effects in the ionosphere storm response. EGU General Assembly, Vienna, Austria Codrescu, M., T.J. Fuller...Strickland, D, 2007: Application of thermospheric general circulation models for space weather operations. J. Adv. Space Res., edited by Schmidtke

  6. Modeling Learning in Doubly Multilevel Binary Longitudinal Data Using Generalized Linear Mixed Models: An Application to Measuring and Explaining Word Learning.

    PubMed

    Cho, Sun-Joo; Goodwin, Amanda P

    2016-04-01

When word learning is supported by instruction in experimental studies for adolescents, word knowledge outcomes tend to be collected from complex data structures, such as multiple aspects of word knowledge, multilevel reader data, multilevel item data, longitudinal designs, and multiple groups. This study illustrates how generalized linear mixed models can be used to measure and explain word learning for data having such complexity. Results from this application provide deeper understanding of word knowledge than could be attained from simpler models and show that word knowledge is multidimensional and depends on word characteristics and instructional contexts.

  7. Spatial Double Generalized Beta Regression Models: Extensions and Application to Study Quality of Education in Colombia

    ERIC Educational Resources Information Center

    Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente

    2013-01-01

In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly review the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…

  8. Robust integration schemes for generalized viscoplasticity with internal-state variables. Part 2: Algorithmic developments and implementation

    NASA Technical Reports Server (NTRS)

    Li, Wei; Saleeb, Atef F.

    1995-01-01

This two-part report is concerned with the development of a general framework for the implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, and to address in detail the algorithmic and numerical analysis aspects involved in its finite element implementation, as well as to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally-stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, which are representative of the present state-of-the-art in metal viscoplasticity, are considered in applications reported here; i.e., fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both these models are directly applicable for both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as subspace applications (i.e., plane stress/strain, axisymmetric, generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (but independent of the number of viscoplastic internal state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence.
In the present second part of the report, we focus on the specific details of the numerical schemes, and associated computer algorithms, for the finite-element implementation of GVIPS and NAV models.
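The local iteration described above, a Newton-Raphson solve of the backward-Euler residual safeguarded by a line search, can be sketched generically; this toy example is not the GVIPS/NAV update itself, only the control flow on an assumed scalar residual:

```python
import numpy as np

def newton_line_search(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Local Newton-Raphson iteration with a backtracking line search,
    as used to drive the backward-Euler residual r(x) = 0 to convergence
    at each integration point (generic sketch, not the GVIPS/NAV equations)."""
    x = x0.copy()
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        dx = np.linalg.solve(jacobian(x), -r)   # full Newton step
        step, norm0 = 1.0, np.linalg.norm(r)
        # Backtrack until the residual norm decreases (robustness safeguard).
        while np.linalg.norm(residual(x + step * dx)) >= norm0 and step > 1e-8:
            step *= 0.5
        x = x + step * dx
    return x

# Usage: a cubic residual standing in for the viscoplastic update equation.
residual = lambda x: np.array([x[0] ** 3 - 2.0 * x[0] + 2.0])
jacobian = lambda x: np.array([[3.0 * x[0] ** 2 - 2.0]])
root = newton_line_search(residual, jacobian, np.array([-2.0]))
print(root)  # → approx. -1.7693, the real root of x^3 - 2x + 2
```

The line search only shortens the Newton step when the full step would increase the residual norm, so quadratic convergence near the solution is preserved.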

  9. Generalized plasma skimming model for cells and drug carriers in the microvasculature.

    PubMed

    Lee, Tae-Rin; Yoo, Sung Sic; Yang, Jiho

    2017-04-01

In microvascular transport, where both blood and drug carriers are involved, plasma skimming plays a key role in changing the hematocrit level and drug carrier concentration in capillary beds after continuous vessel bifurcation in the microvasculature. While there have been numerous studies on modeling the plasma skimming of blood, previous works lacked consideration of its interaction with drug carriers. In this paper, a generalized plasma skimming model is suggested to predict the redistributions of both the cells and drug carriers at each bifurcation. In order to examine its applicability, this new model was applied to a single bifurcation system to predict the redistribution of red blood cells and drug carriers. Furthermore, this model was tested at the microvascular network level under different plasma skimming conditions for predicting the concentration of drug carriers. Based on these results, the applicability of this generalized plasma skimming model is fully discussed and future works along with the model's limitations are summarized.

  10. 40 CFR 600.501-93 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy and Manufacturer's... subpart are applicable to 1993 and later model year passenger automobiles and light trucks, and to the...

  11. 40 CFR 600.501-12 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy and Manufacturer's... subpart are applicable to 2012 and later model year passenger automobiles and light trucks and to the...

  12. Predicting diameters inside bark for 10 important hardwood species

    Treesearch

    Donald E. Hilt; Everette D. Rast; Herman J. Bailey

    1983-01-01

    General models for predicting DIB/DOB ratios up the stem, applicable over wide geographic areas, have been developed for 10 important hardwood species. Results indicate that the ratios either decrease or remain constant up the stem. Methods for adjusting the general models to local conditions are presented. The prediction models can be used in conjunction with optical...

  13. The Cost of Publishing an Electronic Journal: A General Model and a Case Study.

    ERIC Educational Resources Information Center

    Bot, Marjolein; Burgemeester, Johan; Roes, Hans

    1998-01-01

Describes the Electronic Journal of Comparative Law (EJCL) project from Tilburg University and Utrecht University (Netherlands). A general costing model was developed to chart shared and direct costs of producing electronic journals. Data from developing and publishing the EJCL were used to illustrate the application of this model and to assess the…

  14. Efficient numerical modeling of the cornea, and applications

    NASA Astrophysics Data System (ADS)

    Gonzalez, L.; Navarro, Rafael M.; Hdez-Matamoros, J. L.

    2004-10-01

Corneal topography has been shown to be an essential tool in the ophthalmology clinic, both in diagnosis and in custom treatments (refractive surgery, keratoplasty), and it also has strong potential in optometry. The post-processing and analysis of corneal elevation, or local curvature data, is a necessary step to refine the data and also to extract relevant information for the clinician. In this context a parametric cornea model is proposed consisting of a surface described mathematically by two terms: one general ellipsoid corresponding to a regular base surface, expressed by a general quadric term located at an arbitrary position and free orientation in 3D space, and a second term, described by a Zernike polynomial expansion, which accounts for irregularities and departures from the basic geometry. The model has been validated, obtaining better adjustment of experimental data than other previous models. Among other potential applications, here we present the determination of the optical axis of the cornea by transforming the general quadric to its canonical form. This has permitted us to perform 3D registration of corneal topographical maps to improve the signal-to-noise ratio. Other basic and clinical applications are also explored.

  15. Computational Flow Analysis of Ultra High Pressure Firefighting Technology with Application to Long Range Nozzle Design

    DTIC Science & Technology

    2010-03-01

release; distribution unlimited. Ref AFRL/RXQ Public Affairs Case # 10-100. Document contains color images. Although aqueous fire fighting agent...in conjunction with the standard Eulerian multiphase flow model. The two-equation k-ε model was selected due to its wide industrial application in...energy (k) and its dissipation rate (ε). Because of their heuristic development, RANS models have applicable limitations and in general must be

  16. Application of the Human Activity Assistive Technology model for occupational therapy research.

    PubMed

    Giesbrecht, Ed

    2013-08-01

    Theoretical models provide a framework for describing practice and integrating evidence into systematic research. There are few models that relate specifically to the provision of assistive technology in occupational therapy practice. The Human Activity Assistive Technology model is an enduring example that has continued to develop by integrating a social model of disability, concepts from occupational therapy theory and principles of assistive technology adoption and abandonment. This study first describes the core concepts of the Human Activity Assistive Technology model and reviews its development over three successive published versions. A review of the research literature reflects application of the model to clinical practice, study design, outcome measure selection and interpretation of results, particularly among occupational therapists. An evaluative framework is used to critique the adequacy of the Human Activity Assistive Technology model for practice and research, exploring attributes of clarity, simplicity, generality, accessibility and importance. Finally, recommendations are proposed for continued development of the model and research applications. Most of the existing research literature employs the Human Activity Assistive Technology model for background and study design; there is emerging evidence to support the core concepts as predictive factors. Although the concepts are generally simple, clear and applicable to occupational therapy practice and research, evolving terminology and outcomes become more complex with the conflation of integrated theories. The development of the Human Activity Assistive Technology model offers enhanced access and application for occupational therapists, but poses challenges to clarity among concepts. Suggestions are made for further development and applications of the model. © 2013 Occupational Therapy Australia.

  17. Developing Pre-Algebraic Thinking in Generalizing Repeating Pattern Using SOLO Model

    ERIC Educational Resources Information Center

    Lian, Lim Hooi; Yew, Wun Thiam

    2011-01-01

    In this paper, researchers discussed the application of the generalization perspective in helping the primary school pupils to develop their pre-algebraic thinking in generalizing repeating pattern. There are two main stages of the generalization perspective had been adapted, namely investigating and generalizing the pattern. Since the Biggs and…

  18. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  19. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  20. Digital Simulation and Modelling.

    ERIC Educational Resources Information Center

    Hawthorne, G. B., Jr.

    A basically tutorial point of view is taken in this general discussion. The author examines the basic concepts and principles of simulation and modelling and the application of digital computers to these tasks. Examples of existing simulations, a discussion of the applicability and feasibility of simulation studies, a review of simulation…

  1. Scaling Limit for a Generalization of the Nelson Model and its Application to Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Suzuki, Akito

    We study a mathematically rigorous derivation of a quantum mechanical Hamiltonian in a general framework. We derive such a Hamiltonian by taking a scaling limit for a generalization of the Nelson model, which is an abstract interaction model between particles and a Bose field with some internal degrees of freedom. Applying it to a model for the field of the nuclear force with isospins, we obtain a Schrödinger Hamiltonian with a matrix-valued potential, the one pion exchange potential, describing an effective interaction between nucleons.

  2. BioVapor Model Evaluation

    EPA Science Inventory

    General background on modeling and specifics of modeling vapor intrusion are given. Three classical model applications are described and related to the problem of petroleum vapor intrusion. These indicate the need for model calibration and uncertainty analysis. Evaluation of Bi...

  3. Model-based metabolism design: constraints for kinetic and stoichiometric models

    PubMed Central

    Stalidzans, Egils; Seiman, Andrus; Peebo, Karl; Komasilovs, Vitalijs; Pentjuss, Agris

    2018-01-01

    The implementation of model-based designs in metabolic engineering and synthetic biology may fail. One of the reasons for this failure is that only a part of the real-world complexity is included in models. Still, some knowledge can be simplified and taken into account in the form of optimization constraints to improve the feasibility of model-based designs of metabolic pathways in organisms. Some constraints (mass balance, energy balance, and steady-state assumption) serve as a basis for many modelling approaches. There are others (total enzyme activity constraint and homeostatic constraint) proposed decades ago, but which are frequently ignored in design development. Several new approaches of cellular analysis have made possible the application of constraints like cell size, surface, and resource balance. Constraints for kinetic and stoichiometric models are grouped according to their applicability preconditions in (1) general constraints, (2) organism-level constraints, and (3) experiment-level constraints. General constraints are universal and are applicable for any system. Organism-level constraints are applicable for biological systems and usually are organism-specific, but these constraints can be applied without information about experimental conditions. To apply experimental-level constraints, peculiarities of the organism and the experimental set-up have to be taken into account to calculate the values of constraints. The limitations of applicability of particular constraints for kinetic and stoichiometric models are addressed. PMID:29472367

  4. Renewal processes based on generalized Mittag-Leffler waiting times

    NASA Astrophysics Data System (ADS)

    Cahoy, Dexter O.; Polito, Federico

    2013-03-01

    The fractional Poisson process has recently attracted experts from several fields of study. Its natural generalization of the ordinary Poisson process made the model more appealing for real-world applications. In this paper, we generalized the standard and fractional Poisson processes through the waiting time distribution, and showed their relations to an integral operator with a generalized Mittag-Leffler function in the kernel. The waiting times of the proposed renewal processes have the generalized Mittag-Leffler and stretched-squashed Mittag-Leffler distributions. Note that the generalizations naturally provide greater flexibility in modeling real-life renewal processes. Algorithms to simulate sample paths and to estimate the model parameters are derived. Note also that these procedures are necessary to make these models more usable in practice. State probabilities and other qualitative or quantitative features of the models are also discussed.
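Sample paths of such renewal processes can be simulated by drawing the waiting times directly; a sketch using the standard two-uniform transformation for Mittag-Leffler random variables (parameter values are illustrative, and this is a textbook simulation device rather than the authors' specific algorithm):

```python
import numpy as np

def mittag_leffler_waits(beta, gamma, size, rng):
    """Sample Mittag-Leffler waiting times via the standard two-uniform
    transformation; at beta = 1 this reduces to exponential waiting times
    and the renewal process becomes the ordinary Poisson process."""
    u = rng.uniform(size=size)
    v = rng.uniform(size=size)
    factor = np.sin(beta * np.pi) / np.tan(beta * np.pi * v) - np.cos(beta * np.pi)
    return -gamma * np.log(u) * factor ** (1.0 / beta)

rng = np.random.default_rng(42)
waits = mittag_leffler_waits(0.8, 1.0, 10_000, rng)
arrivals = np.cumsum(waits)            # sample path of the renewal process
n_t = np.searchsorted(arrivals, 50.0)  # number of renewals up to time t = 50
print(n_t)
```

The heavy right tail of the Mittag-Leffler distribution for beta < 1 is what distinguishes these paths from ordinary Poisson paths: occasional very long waiting times interrupt bursts of closely spaced renewals.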

  5. A generalized Benford's law for JPEG coefficients and its applications in image forensics

    NASA Astrophysics Data System (ADS)

    Fu, Dongdong; Shi, Yun Q.; Su, Wei

    2007-02-01

In this paper, a novel statistical model based on Benford's law for the probability distributions of the first digits of the block-DCT and quantized JPEG coefficients is presented. A parametric logarithmic law, i.e., the generalized Benford's law, is formulated. Furthermore, some potential applications of this model in image forensics are discussed in this paper, which include the detection of JPEG compression for images in bitmap format, the estimation of the JPEG compression Q-factor for a JPEG-compressed bitmap image, and the detection of double-compressed JPEG images. The results of our extensive experiments demonstrate the effectiveness of the proposed statistical model.
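The generalized Benford's law referred to above has the parametric form p(d) = N log10(1 + 1/(s + d^q)) for leading digits d = 1, ..., 9, with N a normalizing constant; a sketch of the law and of first-digit extraction (parameter values illustrative):

```python
import numpy as np

def generalized_benford(s, q):
    """First-digit law p(d) = N * log10(1 + 1/(s + d^q)), d = 1..9,
    with N chosen so the nine probabilities sum to one."""
    d = np.arange(1, 10)
    p = np.log10(1.0 + 1.0 / (s + d ** q))
    return p / p.sum()

def first_digits(values):
    """Leading decimal digit of each nonzero value (for DCT coefficients,
    drop the zeros and take absolute values first)."""
    v = np.abs(values[values != 0]).astype(float)
    return (v / 10.0 ** np.floor(np.log10(v))).astype(int)

# With s = 0 and q = 1 the law reduces to the classical Benford distribution.
p = generalized_benford(0.0, 1.0)
print(p[0])  # log10(2) ≈ 0.301 for digit 1
```

Fitting s and q to the observed first-digit histogram of block-DCT coefficients, and checking how well the fitted law holds, is the basic forensic test: recompression disturbs the fit in a detectable way.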

  6. Euromech 579 Arpino 3-8 April 2017: Generalized and microstructured continua: new ideas in modeling and/or applications to structures with (nearly)inextensible fibers—a review of presentations and discussions

    NASA Astrophysics Data System (ADS)

    Laudato, Marco; Di Cosmo, Fabio

    2018-04-01

    In the present paper, a rational report on Euromech 579, Generalized and Microstructured Continua: New ideas in modeling and/or Applications to Structures with (nearly)inextensible fibers (Arpino 3-8 April 2017), is provided. The main aim of the colloquium was to provide a forum for experts in generalized and microstructured continua with inextensible fibers to exchange ideas and get informed about the latest research trends in the domain. The interested reader will find more details about the colloquium at the dedicated web page http://www.memocsevents.eu/euromech579.

  7. Application of Complex Adaptive Systems in Portfolio Management

    ERIC Educational Resources Information Center

    Su, Zheyuan

    2017-01-01

    Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in Agent-based Modeling (ABM) approach.…

  8. 40 CFR 600.101-08 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... provisions of this subpart are applicable to 2008 and later model year automobiles, except medium duty passenger vehicles, manufactured on or after January 26, 2007, and to 2011 and later model year medium-duty... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1978...

  9. 40 CFR 600.001-08 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... provisions of this subpart are applicable to 2008 and later model year automobiles, except medium duty passenger vehicles, manufactured on or after January 26, 2007, and to 2011 and later model year medium-duty... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977...

  10. Applications of Diagnostic Classification Models: A Literature Review and Critical Commentary

    ERIC Educational Resources Information Center

    Sessoms, John; Henson, Robert A.

    2018-01-01

    Diagnostic classification models (DCMs) classify examinees based on the skills they have mastered given their test performance. This classification enables targeted feedback that can inform remedial instruction. Unfortunately, applications of DCMs have been criticized (e.g., no validity support). Generally, these evaluations have been brief and…

  11. A bivariate gamma probability distribution with application to gust modeling. [for the ascent flight of the space shuttle

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.

    1982-01-01

A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluations for the general BGD and for special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.

  12. The ABC (in any D) of logarithmic CFT

    NASA Astrophysics Data System (ADS)

    Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro

    2017-10-01

    Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.

  13. Generalized Born-Oppenheimer treatment of Jahn-Teller systems in Hilbert spaces of arbitrary dimension: theory and application to a three-state model potential.

    PubMed

    Varandas, A J C; Sarkar, B

    2011-05-14

    Generalized Born-Oppenheimer equations including the geometrical phase effect are derived for three- and four-fold electronic manifolds in Jahn-Teller systems near the degeneracy seam. The method is readily extendable to N-fold systems of arbitrary dimension. An application is reported for a model threefold system, and the results are compared with Born-Oppenheimer (geometrical phase ignored), extended Born-Oppenheimer, and coupled three-state calculations. The theory shows unprecedented simplicity while depicting all features of more elaborated ones.

  14. Development of the general interpolants method for the CYBER 200 series of supercomputers

    NASA Technical Reports Server (NTRS)

    Stalnaker, J. F.; Robinson, M. A.; Spradley, L. W.; Kurzius, S. C.; Thoenes, J.

    1988-01-01

    The General Interpolants Method (GIM) is a 3-D, time-dependent, hybrid procedure for generating numerical analogs of the conservation laws. This study is directed toward the development and application of the GIM computer code for fluid dynamic research applications as implemented for the Cyber 200 series of supercomputers. Elliptic and quasi-parabolic versions of the GIM code are discussed. Turbulence models, both algebraic and differential, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme are also included.

  15. Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.

    PubMed

    Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique

    2015-05-01

    The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model. © 2014 Society for Risk Analysis.
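    For readers unfamiliar with the distribution, a minimal sketch of the hyper-Poisson pmf (Bardwell-Crow form, with the normalizer computed as a truncated series); the paper's GLM layer with observation-specific dispersion is not reproduced here.

```python
def hyper_poisson_pmf(x, lam, beta, terms=200):
    """Hyper-Poisson pmf: P(X = x) = t_x / Phi with t_k = lam**k / (beta)_k
    ((beta)_k is the rising factorial) and normalizer Phi = sum_k t_k,
    i.e. the confluent hypergeometric function 1F1(1; beta; lam).
    beta > 1: overdispersed; beta < 1: underdispersed; beta = 1: Poisson."""
    t, phi = 1.0, 1.0                  # t_0 = 1, running normalizer
    tx = 1.0 if x == 0 else None
    for k in range(1, terms):
        t *= lam / (beta + k - 1)      # recursive term avoids overflow
        phi += t
        if k == x:
            tx = t
    return tx / phi
```

    At beta = 1 the rising factorial becomes x!, the normalizer becomes e**lam, and the pmf collapses to the ordinary Poisson.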

  16. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
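    IMS learns clusters of nominal sensor vectors from archived operations data and scores live data by distance to the nearest cluster. The following is a much-simplified, hypothetical sketch of that idea using per-dimension min/max envelopes; it is not the actual IMS algorithm.

```python
def box_distance(box, v):
    """Euclidean distance from vector v to a per-dimension (lo, hi) box
    (zero when v lies inside the box on every dimension)."""
    d2 = 0.0
    for (lo, hi), x in zip(box, v):
        if x < lo:
            d2 += (lo - x) ** 2
        elif x > hi:
            d2 += (x - hi) ** 2
    return d2 ** 0.5

def learn_envelopes(nominal, max_dist=1.0):
    """Greedy pass over nominal training vectors: each vector joins (and
    widens) the first box within max_dist, otherwise it seeds a new box."""
    boxes = []
    for v in nominal:
        for box in boxes:
            if box_distance(box, v) <= max_dist:
                for i, x in enumerate(v):
                    box[i] = (min(box[i][0], x), max(box[i][1], x))
                break
        else:
            boxes.append([(x, x) for x in v])
    return boxes

def anomaly_score(boxes, v):
    """Distance to the nearest learned envelope; large values flag anomalies."""
    return min(box_distance(b, v) for b in boxes)
```

    Real-time monitoring then reduces to thresholding anomaly_score on each incoming sensor vector.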

  17. Latent Trait Theory in the Affective Domain--Applications of the Rasch Model.

    ERIC Educational Resources Information Center

    Curry, Allen R.; Riegel, N. Blyth

    The Rasch model of test theory is described in general terms, compared with latent trait theory, and shown to have interesting applications for the measurement of affective as well as cognitive traits. Three assumptions of the Rasch model are stated to support the conclusion that calibration of the items and tests is independent of the examinee…

  18. Development of a GIS interface for WEPP Model application to Great Lakes forested watersheds

    Treesearch

    J. R. Frankenberger; S. Dun; D. C. Flanagan; J. Q. Wu; W. J. Elliot

    2011-01-01

    This presentation will highlight efforts on development of a new online WEPP GIS interface, targeted toward application in forested regions bordering the Great Lakes. The key components and algorithms of the online GIS system will be outlined. The general procedures used to provide input to the WEPP model and to display model output will be demonstrated.

  19. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  20. 40 CFR 600.201-93 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Procedures for Calculating Fuel Economy Values § 600.201-93 General...

  1. 40 CFR 600.401-77 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.401-77 General...

  2. 40 CFR 600.201-12 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Procedures for Calculating Fuel Economy Values § 600.201-12 General...

  3. 40 CFR 600.201-86 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Procedures for Calculating Fuel Economy Values § 600.201-86 General...

  4. 40 CFR 600.201-08 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Procedures for Calculating Fuel Economy Values § 600.201-08 General...

  5. 40 CFR 600.401-77 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.401-77 General...

  6. Privacy-Preserving Evaluation of Generalization Error and Its Application to Model and Attribute Selection

    NASA Astrophysics Data System (ADS)

    Sakuma, Jun; Wright, Rebecca N.

    Privacy-preserving classification is the task of learning or training a classifier on the union of privately distributed datasets without sharing the datasets. The emphasis of existing studies in privacy-preserving classification has primarily been put on the design of privacy-preserving versions of particular data mining algorithms. However, in classification problems, preprocessing and postprocessing steps, such as model selection or attribute selection, play a prominent role in achieving higher classification accuracy. In this paper, we show that the generalization error of classifiers in privacy-preserving classification can be securely evaluated without sharing prediction results. Our main technical contribution is a new generalized Hamming distance protocol that is universally applicable to the preprocessing and postprocessing of various privacy-preserving classification problems, such as model selection in support vector machines and attribute selection in naive Bayes classification.
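    Stripped of the cryptography, the quantity the protocol evaluates is the held-out misclassification rate, which in the clear is just a normalized Hamming distance between predicted and true label vectors. A plain (non-private) sketch of that quantity:

```python
def hamming(a, b):
    """Number of positions where two label vectors disagree."""
    return sum(x != y for x, y in zip(a, b))

def empirical_error(pred, true_labels):
    """Held-out misclassification rate: the value the paper's secure
    protocol computes without either party revealing its vector."""
    return hamming(pred, true_labels) / len(true_labels)
```

    The paper's contribution is evaluating this same statistic under encryption; the secure protocol itself is beyond a short sketch.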

  7. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.

  8. Solving Disparities Through Payment And Delivery System Reform: A Program To Achieve Health Equity.

    PubMed

    DeMeester, Rachel H; Xu, Lucy J; Nocon, Robert S; Cook, Scott C; Ducas, Andrea M; Chin, Marshall H

    2017-06-01

    Payment systems generally do not directly encourage or support the reduction of health disparities. In 2013 the Finding Answers: Solving Disparities through Payment and Delivery System Reform program of the Robert Wood Johnson Foundation sought to understand how alternative payment models might intentionally incorporate a disparities-reduction component to promote health equity. A qualitative analysis of forty proposals to the program revealed that applicants generally did not link payment reform tightly to disparities reduction. Most proposed general pay-for-performance, global payment, or shared savings plans, combined with multicomponent system interventions. None of the applicants proposed making any financial payments contingent on having successfully reduced disparities. Most applicants did not address how they would optimize providers' intrinsic and extrinsic motivation to reduce disparities. A better understanding of how payment and care delivery models might be designed and implemented to reduce health disparities is essential. Project HOPE—The People-to-People Health Foundation, Inc.

  9. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    PubMed

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.

  10. A Generalized Least Squares Regression Approach for Computing Effect Sizes in Single-Case Research: Application Examples

    ERIC Educational Resources Information Center

    Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.

    2011-01-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…
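    A hedged sketch of the idea, not the authors' exact estimator: under AR(1) errors, GLS can be approximated by prewhitening both the outcome and the phase indicator (Cochrane-Orcutt/Prais-Winsten style) and then running OLS, with the slope on a 0/1 phase dummy serving as the level-change effect.

```python
def ar1_prewhiten(series, rho):
    """Prais-Winsten-style transform for AR(1) errors:
    z_1 = sqrt(1 - rho**2) * y_1,  z_t = y_t - rho * y_{t-1}."""
    out = [(1 - rho ** 2) ** 0.5 * series[0]]
    out += [series[t] - rho * series[t - 1] for t in range(1, len(series))]
    return out

def ols_slope(x, y):
    """OLS slope of y on x (intercept handled via centering); with a 0/1
    phase indicator this is the baseline-vs-intervention level difference."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def gls_effect(y, phase, rho):
    """Approximate GLS effect: prewhiten outcome and design, then OLS."""
    return ols_slope(ar1_prewhiten([float(p) for p in phase], rho),
                     ar1_prewhiten([float(v) for v in y], rho))
```

    With rho = 0 the transform is the identity and the estimate reduces to the ordinary between-phase mean difference.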

  11. General Training System; GENTRAS. Final Report.

    ERIC Educational Resources Information Center

    International Business Machines Corp., Gaithersburg, MD. Federal Systems Div.

    GENTRAS (General Training System) is a computer-based training model for the Marine Corps which makes use of a systems approach. The model defines the skill levels applicable for career growth and classifies and defines the training needed for this growth. It also includes a training cost subsystem which will provide a more efficient means of…

  12. Generalized Kapchinskij-Vladimirskij Distribution and Beam Matrix for Phase-Space Manipulations of High-Intensity Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, Moses; Qin, Hong; Davidson, Ronald C.

    In an uncoupled linear lattice system, the Kapchinskij-Vladimirskij (KV) distribution formulated on the basis of the single-particle Courant-Snyder invariants has served as a fundamental theoretical basis for the analyses of the equilibrium, stability, and transport properties of high-intensity beams for the past several decades. Recent applications of high-intensity beams, however, require beam phase-space manipulations by intentionally introducing strong coupling. Here in this Letter, we report the full generalization of the KV model by including all of the linear (both external and space-charge) coupling forces, beam energy variations, and arbitrary emittance partition, which all form essential elements for phase-space manipulations. The new generalized KV model yields spatially uniform density profiles and corresponding linear self-field forces as desired. Finally, the corresponding matrix envelope equations and beam matrix for the generalized KV model provide important new theoretical tools for the detailed design and analysis of high-intensity beam manipulations, for which previous theoretical models are not easily applicable.

  13. Generalized Kapchinskij-Vladimirskij Distribution and Beam Matrix for Phase-Space Manipulations of High-Intensity Beams

    DOE PAGES

    Chung, Moses; Qin, Hong; Davidson, Ronald C.; ...

    2016-11-23

    In an uncoupled linear lattice system, the Kapchinskij-Vladimirskij (KV) distribution formulated on the basis of the single-particle Courant-Snyder invariants has served as a fundamental theoretical basis for the analyses of the equilibrium, stability, and transport properties of high-intensity beams for the past several decades. Recent applications of high-intensity beams, however, require beam phase-space manipulations by intentionally introducing strong coupling. Here in this Letter, we report the full generalization of the KV model by including all of the linear (both external and space-charge) coupling forces, beam energy variations, and arbitrary emittance partition, which all form essential elements for phase-space manipulations. The new generalized KV model yields spatially uniform density profiles and corresponding linear self-field forces as desired. Finally, the corresponding matrix envelope equations and beam matrix for the generalized KV model provide important new theoretical tools for the detailed design and analysis of high-intensity beam manipulations, for which previous theoretical models are not easily applicable.

  14. A generalized Jaynes-Cummings model: The relativistic parametric amplifier and a single trapped ion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ojeda-Guillén, D., E-mail: dojedag@ipn.mx; Mota, R. D.; Granados, V. D.

    2016-06-15

    We introduce a generalization of the Jaynes-Cummings model and study some of its properties. We obtain the energy spectrum and eigenfunctions of this model by using the tilting transformation and the squeezed number states of the one-dimensional harmonic oscillator. As physical applications, we connect this new model to two important and novelty problems: the relativistic parametric amplifier and the quantum simulation of a single trapped ion.

  15. A general health policy model: update and applications.

    PubMed Central

    Kaplan, R M; Anderson, J P

    1988-01-01

    This article describes the development of a General Health Policy Model that can be used for program evaluation, population monitoring, clinical research, and policy analysis. An important component of the model, the Quality of Well-being scale (QWB) combines preference-weighted measures of symptoms and functioning to provide a numerical point-in-time expression of well-being, ranging from 0 for death to 1.0 for asymptomatic optimum functioning. The level of wellness at particular points in time is governed by the prognosis (transition rates or probabilities) generated by the underlying disease or injury under different treatment (control) variables. Well-years result from integrating the level of wellness, or health-related quality of life, over the life expectancy. Several issues relevant to the application of the model are discussed. It is suggested that a quality of life measure need not have separate components for social and mental health. Social health has been difficult to define; social support may be a poor criterion for resource allocation; and some evidence suggests that aspects of mental health are captured by the general measure. Although it has been suggested that measures of child health should differ from those used for adults, we argue that a separate conceptualization of child health creates new problems for policy analysis. After offering several applications of the model for the evaluation of prevention programs, we conclude that many of the advantages of general measures have been overlooked and should be given serious consideration in future studies. PMID:3384669
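    The well-year computation described above reduces to integrating QWB-weighted time over the life expectancy; a minimal illustration:

```python
def well_years(qwb_by_year):
    """Sum QWB (0 = death .. 1.0 = asymptomatic optimum functioning) over
    each year lived: e.g. ten years at QWB 0.8 yields 8 well-years."""
    return sum(qwb_by_year)

def well_years_trapezoid(qwb_points):
    """Trapezoidal version when QWB is sampled at year boundaries."""
    return sum((a + b) / 2.0 for a, b in zip(qwb_points, qwb_points[1:]))
```

    Program evaluation then compares well-years produced per dollar across interventions; the preference weights inside QWB itself are the empirically derived part of the model.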

  16. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  17. 40 CFR 93.159 - Procedures for conformity determinations of general Federal actions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... based on the applicable air quality models, data bases, and other requirements specified in the most... applicable air quality models, data bases, and other requirements specified in the most recent version of the... data are available, such as actual stack test data from stationary sources which are part of the...

  18. Using Mathematics, Mathematical Applications, Mathematical Modelling, and Mathematical Literacy: A Theoretical Study

    ERIC Educational Resources Information Center

    Mumcu, Hayal Yavuz

    2016-01-01

    The purpose of this theoretical study is to explore the relationships between the concepts of using mathematics in the daily life, mathematical applications, mathematical modelling, and mathematical literacy. As these concepts are generally taken as independent concepts in the related literature, they are confused with each other and it becomes…

  19. Applying the scientific method to small catchment studies: A review of the Panola Mountain experience

    USGS Publications Warehouse

    Hooper, R.P.

    2001-01-01

    A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions. Published in 2001 by John Wiley & Sons, Ltd.
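    A simple mixing model of the kind applied iteratively at Panola can be stated in a few lines; this two-component sketch (with hypothetical tracer concentrations) solves the standard mass balance for end-member fractions:

```python
def end_member_fractions(c_stream, c_a, c_b):
    """Two-component tracer mixing: solve f_a*c_a + (1 - f_a)*c_b = c_stream
    for the fraction of streamflow contributed by end member A."""
    f_a = (c_stream - c_b) / (c_a - c_b)
    return f_a, 1.0 - f_a
```

    Treated as a hypothesis, the model's structure is testable: if computed fractions fall outside [0, 1] or vary with the choice of tracer, the conceptual model of the catchment needs revision.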

  20. Decision Support Tool Evaluation Report for General NOAA Oil Modeling Environment(GNOME) Version 2.0

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Callie; Zanoni, Vicki; Blonski, Slawomir; D'Sa, Eurico; Estep, Lee; Holland, Donald; Moore, Roxzana F.; Pagnutti, Mary; Terrie, Gregory

    2004-01-01

    NASA's Earth Science Applications Directorate evaluated the potential of NASA remote sensing data and modeling products to enhance the General NOAA Oil Modeling Environment (GNOME) decision support tool. NOAA's Office of Response and Restoration (OR&R) Hazardous Materials (HAZMAT) Response Division is interested in enhancing GNOME with near-realtime (NRT) NASA remote sensing products on oceanic winds and ocean circulation. The NASA SeaWinds sea surface wind and Jason-1 sea surface height NRT products have potential, as do sea surface temperature and reflectance products from the Moderate Resolution Imaging Spectroradiometer and sea surface reflectance products from Landsat and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer. HAZMAT is also interested in the Advanced Circulation model and the Ocean General Circulation Model. Certain issues must be considered, including lack of data continuity, marginal data redundancy, and data formatting problems. Spatial resolution is an issue for near-shore GNOME applications. Additional work will be needed to incorporate NASA inputs into GNOME, including verification and validation of data products, algorithms, models, and NRT data.

  1. Theoretical models for application in school health education research.

    PubMed

    Parcel, G S

    1984-01-01

    Theoretical models that may be useful to research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple theory models including models of health-related behavior, the PRECEDE Framework, social-psychological approaches and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education including applications to basic research, development and design of interventions, program evaluation, and program utilization.

  2. Developing a Fundamental Model for an Integrated GPS/INS State Estimation System with Kalman Filtering

    NASA Technical Reports Server (NTRS)

    Canfield, Stephen

    1999-01-01

    This work will demonstrate the integration of sensor and system dynamic data and their appropriate models using an optimal filter to create a robust, adaptable, easily reconfigurable state (motion) estimation system. This state estimation system will clearly show the application of fundamental modeling and filtering techniques. These techniques are presented at a general, first-principles level that can easily be adapted to specific applications. An example of such an application is demonstrated through the development of an integrated GPS/INS navigation system. This system acquires both global position data and inertial body data, to provide optimal estimates of current position and attitude states. The optimal states are estimated using a Kalman filter. The state estimation system will include appropriate error models for the measurement hardware. The results of this work will lead to the development of a "black-box" state estimation system that supplies current motion information (position and attitude states) that can be used to carry out guidance and control strategies. This black-box state estimation system is developed independent of the vehicle dynamics and therefore is directly applicable to a variety of vehicles. Issues in system modeling and application of Kalman filtering techniques are investigated and presented. These issues include linearized models of equations of state, models of the measurement sensors, and appropriate application and parameter setting (tuning) of the Kalman filter. The general model and subsequent algorithm is developed in Matlab for numerical testing. The results of this system are demonstrated through application to data from the X-33 Michael's 9A8 mission and are presented in plots and simple animations.
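    A first-principles sketch in the spirit described above: a constant-velocity Kalman filter fusing noisy GPS-like position fixes with a motion model. The matrices and noise values are illustrative assumptions, not the report's actual error models:

```python
def kalman_1d(zs, dt=1.0, q=0.01, r=4.0):
    """Constant-velocity Kalman filter; state = [position, velocity].
    q: simple diagonal process noise, r: position-measurement variance."""
    x = [0.0, 0.0]                       # state estimate
    P = [[100.0, 0.0], [0.0, 100.0]]     # state covariance (vague prior)
    for z in zs:
        # predict: x <- F x, P <- F P F' + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # update with position measurement z (H = [1, 0])
        s = P[0][0] + r
        k0, k1 = P[0][0] / s, P[1][0] / s
        innov = z - x[0]
        x = [x[0] + k0 * innov, x[1] + k1 * innov]
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return x, P
```

    In a full GPS/INS system the same predict/update cycle runs on a larger linearized state (position, velocity, attitude, sensor biases), which is the tuning problem the report investigates.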

  3. AIR QUALITY MODELING OF AMMONIA: A REGIONAL MODELING PERSPECTIVE

    EPA Science Inventory

    The talk will address the status of modeling of ammonia from a regional modeling perspective, yet the observations and comments should have general applicability. The air quality modeling system components that are central to modeling ammonia will be noted and a perspective on ...

  4. Unitary n -designs via random quenches in atomic Hubbard and spin models: Application to the measurement of Rényi entropies

    NASA Astrophysics Data System (ADS)

    Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n -designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.

  5. Generalized Parameter-Adjusted Stochastic Resonance of Duffing Oscillator and Its Application to Weak-Signal Detection.

    PubMed

    Lai, Zhi-Hui; Leng, Yong-Gang

    2015-08-28

    A two-dimensional Duffing oscillator which can produce stochastic resonance (SR) is studied in this paper. We introduce its SR mechanism and present a generalized parameter-adjusted SR (GPASR) model of this oscillator to address the need for parameter adjustments. The Kramers rate is chosen as the theoretical basis to establish a judgmental function for judging the occurrence of SR in this model; and to analyze and summarize the parameter-adjusted rules under unmatched signal amplitude, frequency, and/or noise intensity. Furthermore, we propose the weak-signal detection approach based on this GPASR model. Finally, we employ two practical examples to demonstrate the feasibility of the proposed approach in practical engineering applications.
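    The underlying dynamics can be sketched by Euler-Maruyama integration of the bistable Duffing oscillator; the parameters below are arbitrary illustrations, not the GPASR parameter-adjustment rules derived in the paper:

```python
import math, random

def duffing_sr(a=1.0, b=1.0, delta=0.5, amp=0.3, omega=0.05,
               noise=0.4, dt=0.01, steps=20000, seed=0):
    """Euler-Maruyama integration of the bistable Duffing oscillator
    x'' + delta*x' - a*x + b*x**3 = amp*cos(omega*t) + noise; stochastic
    resonance appears as noise-assisted hopping between the wells at
    x = +/- sqrt(a/b) in step with the weak periodic drive."""
    rng = random.Random(seed)
    x, v, xs = 1.0, 0.0, []
    for n in range(steps):
        t = n * dt
        force = a * x - b * x ** 3 - delta * v + amp * math.cos(omega * t)
        v += force * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x += v * dt
        xs.append(x)
    return xs
```

    Weak-signal detection then amounts to tuning (a, b, noise) so the inter-well hopping rate — the Kramers rate used in the paper's judgmental function — matches the drive frequency.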

  6. Dynamic regulation of erythropoiesis: A computer model of general applicability

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1979-01-01

    A mathematical model for the control of erythropoiesis was developed based on the balance between oxygen supply and demand at a renal oxygen detector which controls erythropoietin release and red cell production. Feedback regulation of tissue oxygen tension is accomplished by adjustments of hemoglobin levels resulting from the output of a renal-bone marrow controller. Special consideration was given to the determinants of tissue oxygenation including evaluation of the influence of blood flow, capillary diffusivity, oxygen uptake and oxygen-hemoglobin affinity. A theoretical analysis of the overall control system is presented. Computer simulations of altitude hypoxia, red cell infusion, hyperoxia, and hemolytic anemia demonstrate the validity of the model for general human application in health and disease.
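    The feedback structure described (a renal O2 detector driving red-cell production) can be caricatured as a one-state negative-feedback loop. This is an illustrative toy with made-up gains and units, not the paper's validated model:

```python
def simulate_erythropoiesis(o2_supply, days=400.0, dt=0.1):
    """Toy negative-feedback loop: tissue O2 delivery ~ hemoglobin * supply;
    the 'renal detector' raises red-cell production when delivery falls
    below a set point, so hemoglobin rises to compensate hypoxia."""
    setpoint, hb = 15.0, 15.0        # arbitrary illustrative units
    gain, turnover = 0.05, 0.05      # controller gain / red-cell turnover
    basal = turnover * 15.0          # production holding hb = 15 at normoxia
    for _ in range(int(days / dt)):
        delivery = hb * o2_supply
        dhb = basal + gain * (setpoint - delivery) - turnover * hb
        hb += dhb * dt
    return hb
```

    Reduced oxygen supply (simulated altitude) drives hemoglobin above baseline, qualitatively reproducing the compensatory polycythemia the full model simulates.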

  7. A large-grain mapping approach for multiprocessor systems through data flow model. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, Hwa-Soo

    1991-01-01

    A large-grain level mapping method is presented of numerical oriented applications onto multiprocessor systems. The method is based on the large-grain data flow representation of the input application and it assumes a general interconnection topology of the multiprocessor system. The large-grain data flow model was used because such representation best exhibits inherited parallelism in many important applications, e.g., CFD models based on partial differential equations can be presented in large-grain data flow format, very effectively. A generalized interconnection topology of the multiprocessor architecture is considered, including such architectural issues as interprocessor communication cost, with the aim to identify the 'best matching' between the application and the multiprocessor structure. The objective is to minimize the total execution time of the input algorithm running on the target system. The mapping strategy consists of the following: (1) large-grain data flow graph generation from the input application using compilation techniques; (2) data flow graph partitioning into basic computation blocks; and (3) physical mapping onto the target multiprocessor using a priority allocation scheme for the computation blocks.

  8. Commentary on the statistical properties of noise and its implication on general linear models in functional near-infrared spectroscopy.

    PubMed

    Huppert, Theodore J

    2016-01-01

    Functional near-infrared spectroscopy (fNIRS) is a noninvasive neuroimaging technique that uses low levels of light to measure changes in cerebral blood oxygenation levels. In the majority of NIRS functional brain studies, analysis of this data is based on a statistical comparison of hemodynamic levels between a baseline and task or between multiple task conditions by means of a linear regression model: the so-called general linear model. Although these methods are similar to their implementation in other fields, particularly for functional magnetic resonance imaging, the specific application of these methods in fNIRS research differs in several key ways related to the sources of noise and artifacts unique to fNIRS. In this brief communication, we discuss the application of linear regression models in fNIRS and the modifications needed to generalize these models in order to deal with structured (colored) noise due to systemic physiology and noise heteroscedasticity due to motion artifacts. The objective of this work is to present an overview of these noise properties in the context of the linear model as it applies to fNIRS data. This work is aimed at explaining these mathematical issues to the general fNIRS experimental researcher but is not intended to be a complete mathematical treatment of these concepts.
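
    A concrete illustration of handling structured (colored) noise in the general linear model is AR(1) prewhitening: fit ordinary least squares, estimate the autoregressive coefficient from the residuals, and refit on whitened data. The sketch below uses synthetic data with an invented boxcar regressor and noise parameters; it is not the full fNIRS pipeline, which must also address motion-related heteroscedasticity.

```python
import numpy as np

def ar1_prewhitened_glm(y, X):
    """OLS fit, AR(1) coefficient estimated from the residuals, then a
    refit on whitened data (a Cochrane-Orcutt-style step)."""
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_ols
    rho = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
    yw = y[1:] - rho * y[:-1]     # whitened response
    Xw = X[1:] - rho * X[:-1]     # whitened design matrix
    beta = np.linalg.lstsq(Xw, yw, rcond=None)[0]
    return beta, rho

# Synthetic session: boxcar task regressor plus AR(1) "systemic" noise.
rng = np.random.default_rng(0)
n = 2000
task = (np.arange(n) % 100 < 50).astype(float)
X = np.column_stack([np.ones(n), task])
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.8 * noise[t - 1] + 0.5 * rng.normal()
y = 1.0 + 2.0 * task + noise
beta, rho = ar1_prewhitened_glm(y, X)
```

    The whitened refit recovers the task effect while the estimated rho captures the serial correlation that would otherwise inflate the apparent significance of the effect.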

  9. Applications of General Systems Theory to the Development of an Adjustable Tutorial Software Machine.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1994-01-01

    Describes the construction of a model of computer-assisted instruction using a qualitative block diagram based on general systems theory (GST) as a framework. Subject matter representation is discussed, and appendices include system variables and system equations of the GST model, as well as an example of developing flexible courseware. (Contains…

  10. Aeronautical Engineering. A Continuing Bibliography with Indexes

    DTIC Science & Technology

    1987-09-01

    engines 482 01 AERONAUTICS (GENERAL) i-10 aircraft equipped with turbine engine ...rate adaptive control with applications to lateral Statistics on aircraft gas turbine engine rotor failures Unified model for the calculation of blade ...PUMPS p 527 A87-35669 to test data for a composite prop-tan model Gas turbine combustor and engine augmentor tube GENERAL AVIATION AIRCRAFT

  11. Smart Climatology Applications for Undersea Warfare

    DTIC Science & Technology

    2008-09-01

    Comparisons of these climatologies with existing Navy climatologies based on the Generalized Digital Environmental Model ( GDEM ) reveal differences in sonic...undersea warfare. 15. NUMBER OF PAGES 117 14. SUBJECT TERMS antisubmarine warfare, climate variations, climatology, GDEM , ocean, re...climatologies based on the Generalized Digital Environmental Model ( GDEM ) to our smart ocean climatologies reveal a number of differences. The

  12. A generalized development model for testing GPS user equipment

    NASA Technical Reports Server (NTRS)

    Hemesath, N.

    1978-01-01

    The generalized development model (GDM) program, which was intended to establish how well GPS user equipment can perform under a combination of jamming and dynamics, is described. The systems design and the characteristics of the GDM are discussed. The performance aspects of the GDM are listed and the application of the GDM to civil aviation is examined.

  13. Entropy maximization under the constraints on the generalized Gini index and its application in modeling income distributions

    NASA Astrophysics Data System (ADS)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2015-11-01

    In economics and the social sciences, inequality measures such as the Gini index, the Pietra index, etc., are commonly used to measure statistical dispersion. There is a generalization of the Gini index that includes it as a special case. In this paper, we use the principle of maximum entropy to approximate the model of income distribution with a given mean and generalized Gini index. Many distributions have been used as descriptive models for the distribution of income. The most widely known of these models are the generalized beta of the second kind and its subclass distributions. The obtained maximum entropy distributions are fitted to US family total money income in 2009, 2011, and 2013, and their relative performances with respect to the generalized beta of the second kind family are compared.

  14. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    PubMed

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process models are often deteriorated by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Compared with traditional MSE-criterion based parameter learning, it considers the whole distribution structure of the training data set during parameter learning, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization, and robustness, making it a promising tool for GSH fermentation process modeling.

  15. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
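
    A common concrete form for nonlinear Wiener degradation uses a time-scale transformation, X(t) = drift * t**b + sigma * B(t**b). The sketch below (with invented parameters, not the paper's case studies) simulates sample paths and recovers the drift from their endpoints.

```python
import numpy as np

def simulate_degradation(drift, sigma, b, t_end=100.0, n_steps=1000, rng=None):
    """One sample path of a time-transformed Wiener degradation process
    X(t) = drift * t**b + sigma * B(t**b). Parameters are illustrative."""
    rng = rng or np.random.default_rng()
    t = np.linspace(0.0, t_end, n_steps + 1)
    lam = t ** b                           # time-scale transformation
    dlam = np.diff(lam)                    # increments of transformed time
    increments = drift * dlam + sigma * np.sqrt(dlam) * rng.normal(size=n_steps)
    return t, np.concatenate([[0.0], np.cumsum(increments)])

# Since E[X(T)] = drift * T**b, the drift can be estimated from endpoints.
rng = np.random.default_rng(1)
endpoints = [simulate_degradation(0.05, 0.2, 1.3, rng=rng)[1][-1]
             for _ in range(200)]
drift_hat = np.mean(endpoints) / 100.0 ** 1.3
```

    With b = 1 this reduces to the linear Wiener model; b != 1 captures the nonlinear paths the abstract refers to, without linearizing the data first.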

  16. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  17. An Interdisciplinary Model for Teaching Evolutionary Ecology.

    ERIC Educational Resources Information Center

    Coletta, John

    1992-01-01

    Describes a general systems evolutionary model and demonstrates how a previously established ecological model is a function of its past development based on the evolution of the rock, nutrient, and water cycles. Discusses the applications of the model in environmental education. (MDH)

  18. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) a survey of neural network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially for equivalent continuum models (Appendix A); (2) application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars, and ribs, with calculation of a range of test cases and comparison with measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, a discussion of sensitivity evaluation approaches, and some results (Appendix D); (5) establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points, with the two methods compared through examples (Appendix E); and (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs: the NN-aided Equivalent Plate Analysis, including training of the neural networks for several design spaces, applicable to the actual design of complex wings (Appendix F).

  19. Statistically Modeling I-V Characteristics of CNT-FET with LASSO

    NASA Astrophysics Data System (ADS)

    Ma, Dongsheng; Ye, Zuochang; Wang, Yan

    2017-08-01

    With the advent of the internet of things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally, compact models for transistors are built on the basis of physics. But physical models are expensive to derive and need a very long time to adjust for non-ideal effects. When the intended applications of a novel device are still uncertain or its manufacturing process is not mature, deriving generalized, accurate physical models is very laborious, whereas statistical modeling is becoming a viable alternative because it is data-oriented and fast to implement. In this paper, a classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data, together with the simulation results, shows that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to general devices.
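
    LASSO's role here is sparse linear regression: an L1 penalty drives the coefficients of irrelevant candidate features exactly to zero. The self-contained coordinate-descent sketch below uses synthetic data, not the CNT-FET measurements; feature choices and penalty strength are invented for illustration.

```python
import numpy as np

def lasso_cd(X, y, alpha=0.05, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding,
    minimizing (1/2n)||y - Xb||^2 + alpha * ||b||_1. A small stand-in
    for an off-the-shelf solver."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            beta[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / z
    return beta

# Synthetic "I-V" data: the response depends on 2 of 4 candidate features.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.05 * rng.normal(size=300)
beta = lasso_cd(X, y)
```

    The two active features keep coefficients near their true values (slightly shrunk by the penalty), while the two spurious ones are zeroed out, which is the behavior that makes LASSO attractive when the relevant device physics is unknown.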

  20. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
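
    A widely used concrete choice for the acceleration part of such a model is the empirical Prokopowicz-Vaskas form, which combines a voltage power law with an Arrhenius temperature term. The sketch below uses illustrative values for the voltage exponent and activation energy; they are not taken from this paper.

```python
import math

def acceleration_factor(v_use, v_test, t_use_k, t_test_k, n=3.0, ea_ev=1.3):
    """Prokopowicz-Vaskas acceleration factor commonly applied to
    BaTiO3 MLCCs: life ratio = (V_test/V_use)**n
    * exp(Ea/k * (1/T_use - 1/T_test)). Defaults are illustrative."""
    k_boltzmann = 8.617e-5  # Boltzmann constant in eV/K
    voltage_term = (v_test / v_use) ** n
    thermal_term = math.exp(ea_ev / k_boltzmann
                            * (1.0 / t_use_k - 1.0 / t_test_k))
    return voltage_term * thermal_term

# Example: test at 2x rated voltage and 125 C versus use at 85 C.
af = acceleration_factor(v_use=50.0, v_test=100.0,
                         t_use_k=358.15, t_test_k=398.15)
```

    Doubling the voltage alone contributes a factor of 2**n = 8 here; the 40 K temperature rise multiplies that by roughly another two orders of magnitude, which is why modest stress increases yield practical test times.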

  1. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of the MLCCs' reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed in this paper. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.

  2. Rhombic micro-displacement amplifier for piezoelectric actuator and its linear and hybrid model

    NASA Astrophysics Data System (ADS)

    Chen, Jinglong; Zhang, Chunlin; Xu, Minglong; Zi, Yanyang; Zhang, Xinong

    2015-01-01

    This paper proposes a rhombic micro-displacement amplifier (RMDA) for a piezoelectric actuator (PA). First, the geometric amplification relations are analyzed and a linear model is built to analyze the mechanical and electrical properties of this amplifier. Next, an accurate modeling method for the amplifier is studied for the important application of precise servo control. The classical Preisach model (CPM) is generally implemented using a numerical technique based on first-order reversal curves (FORCs), and its accuracy mainly depends on the number of FORCs. However, it is generally difficult to obtain a sufficient number of FORCs in practice. A support vector machine (SVM) is therefore employed in this work to circumvent this deficiency of the CPM. A hybrid model, based on the discrete CPM and the SVM, is then developed to account for hysteresis and dynamic effects. Finally, experimental validation is carried out. The results show that this amplifier with the hybrid model is suitable for control applications.

  3. Assessment of Automated Measurement and Verification (M&V) Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  4. 40 CFR 85.2101 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) CONTROL OF AIR POLLUTION FROM MOBILE SOURCES Emissions Control System Performance Warranty Regulations and... through 85.2111 are applicable to all 1981 and later model year light-duty vehicles and light-duty trucks... apply to durability groups and test groups as applicable for manufacturers certifying new light-duty...

  5. Large-Scale Partial-Duplicate Image Retrieval and Its Applications

    DTIC Science & Technology

    2016-04-23

    SECURITY CLASSIFICATION OF: The explosive growth of Internet Media (partial-duplicate/similar images, 3D objects, 3D models, etc.) sheds bright...light on many promising applications in forensics, surveillance, 3D animation, mobile visual search, and 3D model/object search. Compared with the...and stable spatial configuration. Compared with the general 2D objects, 3D models/objects consist of 3D data information (typically a list of

  6. Generalization Technique for 2D+SCALE Dhe Data Model

    NASA Astrophysics Data System (ADS)

    Karim, Hairi; Rahman, Alias Abdul; Boguslawski, Pawel

    2016-10-01

    Different users or applications need models at different scales, especially in computer applications such as game visualization and GIS modelling. Issues have been raised about fulfilling the GIS requirement of retaining detail while minimizing the redundancy of multi-scale datasets. Previous researchers have suggested and attempted to add another dimension, such as scale and/or time, to a 3D model, but the implementation of a scale dimension faces problems due to the limitations and availability of data structures and data models. Various data structures and data models have been proposed to support a variety of applications and dimensionalities, but little research has been conducted on supporting a scale dimension. Generally, the Dual Half Edge (DHE) data structure was designed to work with any perfect 3D spatial object, such as buildings. In this paper, we attempt to expand the capability of the DHE data structure toward integration with a scale dimension. The description of the concept and implementation of generating 3D-scale (2D spatial + scale dimension) models with the DHE data structure forms the major discussion of this paper. We believe that advantages such as local modification and topological operations (navigation, query, and semantic information) in the scale dimension could be useful for future 3D-scale applications.

  7. The Topp-Leone generalized Rayleigh cure rate model and its application

    NASA Astrophysics Data System (ADS)

    Nanthaprut, Pimwarat; Bodhisuwan, Winai; Patummasut, Mena

    2017-11-01

    The cure rate model is a survival analysis model that accounts for a cured proportion among the censored data. In clinical trials, data representing the time to recurrence of an event or death of patients are used to evaluate the efficacy of treatments. Each dataset can be separated into two groups: censored and uncensored data. In this work, a new mixture cure rate model based on the Topp-Leone generalized Rayleigh distribution is introduced. A Bayesian approach is employed to estimate its parameters. In addition, a breast cancer dataset is analyzed for model illustration purposes. According to the deviance information criterion, the Topp-Leone generalized Rayleigh cure rate model yields better results than the Weibull and exponential cure rate models.
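
    The mixture cure structure itself is simple: the population survival function is S_pop(t) = pi + (1 - pi) * S_u(t), where pi is the cured fraction and S_u the survival of the uncured group. The sketch below uses a plain exponential S_u for illustration rather than the Topp-Leone generalized Rayleigh distribution of the paper.

```python
import math

def mixture_cure_survival(t, cure_fraction, baseline_survival):
    """Population survival in a mixture cure model:
    S_pop(t) = pi + (1 - pi) * S_u(t)."""
    return cure_fraction + (1.0 - cure_fraction) * baseline_survival(t)

# Illustrative uncured-group survival (exponential with rate 0.1).
exp_survival = lambda t: math.exp(-0.1 * t)

s0 = mixture_cure_survival(0.0, 0.3, exp_survival)       # starts at 1
s_inf = mixture_cure_survival(1000.0, 0.3, exp_survival)  # plateaus at pi
```

    The defining feature, visible in the example, is that S_pop(t) does not tend to zero but plateaus at the cure fraction pi, which is what distinguishes a cure rate model from an ordinary survival model.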

  8. Modeling Evaporation of Drops of Different Kerosenes

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Harstad, Kenneth

    2007-01-01

    A mathematical model describes the evaporation of drops of a hydrocarbon liquid composed of as many as hundreds of chemical species. The model is intended especially for application to any of several types of kerosenes commonly used as fuels. It is based on the concept of continuous thermodynamics, according to which the chemical composition of the evaporating multicomponent liquid is described by a probability distribution function (PDF). However, the present model is more generally applicable than its immediate predecessor.

  9. Multipolar Ewald Methods, 2: Applications Using a Quantum Mechanical Force Field

    PubMed Central

    2015-01-01

    A fully quantum mechanical force field (QMFF) based on a modified “divide-and-conquer” (mDC) framework is applied to a series of molecular simulation applications, using a generalized Particle Mesh Ewald method extended to multipolar charge densities. Simulation results are presented for three example applications: liquid water, p-nitrophenylphosphate reactivity in solution, and crystalline N,N-dimethylglycine. Simulations of liquid water using a parametrized mDC model are compared to TIP3P and TIP4P/Ew water models and experiment. The mDC model is shown to be superior for cluster binding energies and generally comparable for bulk properties. Examination of the dissociative pathway for dephosphorylation of p-nitrophenylphosphate shows that the mDC method evaluated with the DFTB3/3OB and DFTB3/OPhyd semiempirical models bracket the experimental barrier, whereas DFTB2 and AM1/d-PhoT QM/MM simulations exhibit deficiencies in the barriers, the latter of which is related, in part, to the anomalous underestimation of the p-nitrophenylate leaving group pKa. Simulations of crystalline N,N-dimethylglycine are performed, and the overall structure and atomic fluctuations are compared with experiment and with the general AMBER force field (GAFF). The QMFF, which was not parametrized for this application, was shown to be in better agreement with crystallographic data than GAFF. Our simulations highlight some of the application areas that may benefit from using new QMFFs, and they demonstrate progress toward the development of accurate QMFFs using the recently developed mDC framework. PMID:25691830

  10. Entanglement model of homeopathy as an example of generalized entanglement predicted by weak quantum theory.

    PubMed

    Walach, H

    2003-08-01

    Homeopathy is scientifically contested, both for lack of consistent empirical findings and, even more, for lack of a sound theoretical model to explain its purported effects. This paper makes an attempt to introduce an explanatory idea based on a generalized version of quantum mechanics (QM), the weak quantum theory (WQT). WQT uses the algebraic formalism of QM proper, but drops some restrictions and definitions typical for QM. This results in a general axiomatic framework similar to QM, but more generalized and applicable to all possible systems. Most notably, WQT predicts entanglement, which in QM is known as Einstein-Podolsky-Rosen (EPR) correlatedness within quantum systems. According to WQT, this entanglement is not only tied to quantum systems, but is to be expected whenever a global and a local variable describing a system are complementary. This idea is used here to reconstruct homeopathy as an exemplification of generalized entanglement as predicted by WQT. It transpires that homeopathy uses two instances of generalized entanglement: one between the remedy and the original substance (potentiation principle) and one between the individual symptoms of a patient and the general symptoms of a remedy picture (similarity principle). By bringing these two elements together, double entanglement ensues, which is reminiscent of cryptographic and teleportation applications of entanglement in QM proper. Homeopathy could be a macroscopic analogue to quantum teleportation. This model is exemplified and some predictions are derived, which make it possible to test the model. Copyright 2003 S. Karger GmbH, Freiburg

  11. Application of Two-Dimensional AWE Algorithm in Training Multi-Dimensional Neural Network Model

    DTIC Science & Technology

    2003-07-01

    hybrid scheme . the general neural network method (Table 3.1). The training process of the software- ACKNOWLEDGMENT "Neuralmodeler" is shown in Fig. 3.2...engineering. Artificial neural networks (ANNs) have emerged Training a neural network model is the key of as a powerful technique for modeling general neural...coefficients am, the derivatives method of moments (MoM). The variables in the of matrix I have to be generated . A closed form model are frequency

  12. A general graphical user interface for automatic reliability modeling

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  13. A protocol for better design, application, and communication of population viability analyses.

    PubMed

    Pe'er, Guy; Matsinos, Yiannis G; Johst, Karin; Franz, Kamila W; Turlure, Camille; Radchuk, Viktoriia; Malinowska, Agnieszka H; Curtis, Janelle M R; Naujokaitis-Lewis, Ilona; Wintle, Brendan A; Henle, Klaus

    2013-08-01

    Population viability analyses (PVAs) contribute to conservation theory, policy, and management. Most PVAs focus on single species within a given landscape and address a specific problem. This specificity often is reflected in the organization of published PVA descriptions. Many lack structure, making them difficult to understand, assess, repeat, or use for drawing generalizations across PVA studies. In an assessment comparing published PVAs and existing guidelines, we found that model selection was rarely justified; important parameters remained neglected or their implementation was described vaguely; limited details were given on parameter ranges, sensitivity analysis, and scenarios; and results were often reported too inconsistently to enable repeatability and comparability. Although many guidelines exist on how to design and implement reliable PVAs and standards exist for documenting and communicating ecological models in general, there is a lack of organized guidelines for designing, applying, and communicating PVAs that account for their diversity of structures and contents. To fill this gap, we integrated published guidelines and recommendations for PVA design and application, protocols for documenting ecological models in general and individual-based models in particular, and our collective experience in developing, applying, and reviewing PVAs. We devised a comprehensive protocol for the design, application, and communication of PVAs (DAC-PVA), which has 3 primary elements. The first defines what a useful PVA is; the second element provides a workflow for the design and application of a useful PVA and highlights important aspects that need to be considered during these processes; and the third element focuses on communication of PVAs to ensure clarity, comprehensiveness, repeatability, and comparability. Thereby, DAC-PVA should strengthen the credibility and relevance of PVAs for policy and management, and improve the capacity to generalize PVA findings across studies. © 2013 Society for Conservation Biology.

  14. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Subramaniam, D. Rajan; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2014-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  15. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2015-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  16. General Education Courses at the University of Botswana: Application of the Theory of Reasoned Action in Measuring Course Outcomes

    ERIC Educational Resources Information Center

    Garg, Deepti; Garg, Ajay K.

    2007-01-01

    This study applied the Theory of Reasoned Action and the Technology Acceptance Model to measure outcomes of general education courses (GECs) under the University of Botswana Computer and Information Skills (CIS) program. An exploratory model was validated for responses from 298 students. The results suggest that resources currently committed to…

  17. Dynamic GSCA (Generalized Structured Component Analysis) with Applications to the Analysis of Effective Connectivity in Functional Neuroimaging Data

    ERIC Educational Resources Information Center

    Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S.

    2012-01-01

    We propose a new method of structural equation modeling (SEM) for longitudinal and time series data, named Dynamic GSCA (Generalized Structured Component Analysis). The proposed method extends the original GSCA by incorporating a multivariate autoregressive model to account for the dynamic nature of data taken over time. Dynamic GSCA also…

  18. BUMPER: the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction

    NASA Astrophysics Data System (ADS)

    Holden, Phil; Birks, John; Brooks, Steve; Bush, Mark; Hwang, Grace; Matthews-Bird, Frazer; Valencia, Bryan; van Woesik, Robert

    2017-04-01

    We describe the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. The principal motivation for a Bayesian approach is that the palaeoenvironment is treated probabilistically, and can be updated as additional data become available. Bayesian approaches therefore provide a reconstruction-specific quantification of the uncertainty in the data and in the model parameters. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring 2 seconds to build a 100-taxon model from a 100-site training-set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training-sets under ideal assumptions. We then use these to demonstrate both the general applicability of the model and the sensitivity of reconstructions to the characteristics of the training-set, considering assemblage richness, taxon tolerances, and the number of training sites. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. In all of these applications an identically configured model is used, the only change being the input files that provide the training-set environment and taxon-count data.
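
    The core transfer-function idea can be sketched as a grid-based Bayesian update: each taxon has a response curve over the environmental variable, and the posterior over that variable is proportional to the product of count likelihoods under a flat prior. The Gaussian response curves and Poisson counts below are a toy stand-in, not BUMPER's actual likelihood or calibration procedure.

```python
import numpy as np

def posterior_environment(counts, optima, tolerances, env_grid):
    """Posterior over an environmental variable given taxon counts,
    assuming Gaussian taxon response curves and Poisson count noise,
    with a flat prior over env_grid. Illustrative only."""
    log_post = np.zeros_like(env_grid)
    for c, opt, tol in zip(counts, optima, tolerances):
        # Expected count for this taxon at each candidate environment.
        rate = 10.0 * np.exp(-0.5 * ((env_grid - opt) / tol) ** 2) + 1e-9
        log_post += c * np.log(rate) - rate   # Poisson log-likelihood
    log_post -= log_post.max()                # normalize for stability
    post = np.exp(log_post)
    return post / post.sum()

grid = np.linspace(0.0, 20.0, 201)
# Two taxa with optima at 8 and 12, both abundant -> posterior near 10.
post = posterior_environment([9, 9], [8.0, 12.0], [2.0, 2.0], grid)
estimate = grid[np.argmax(post)]
```

    Because the result is a full posterior rather than a point estimate, it can be updated as additional assemblage data become available, which is the motivation for the Bayesian approach stated in the abstract.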

  19. A Generalized Model for Transport of Contaminants in Soil by Electric Fields

    PubMed Central

    Paz-Garcia, Juan M.; Baek, Kitae; Alshawabkeh, Iyad D.; Alshawabkeh, Akram N.

    2012-01-01

    A generalized model applicable to soils contaminated with multiple species under enhanced boundary conditions during treatment by electric fields is presented. The partial differential equations describing species transport are developed by applying the law of mass conservation to their fluxes. Transport, due to migration, advection and diffusion, of each aqueous component and complex species are combined to produce one partial differential equation that describes transport of the total analytical concentrations of component species, which are the primary dependent variables. This transport couples with geochemical reactions such as aqueous equilibrium, sorption, precipitation and dissolution. The enhanced model is used to simulate electrokinetic cleanup of lead and copper contaminants at an Army Firing Range. Acid enhancement is achieved by the use of adipic acid to neutralize the basic front produced by the cathode electrochemical reaction. The model is able to simulate enhanced application of the process by modifying the boundary conditions. The model showed that kinetics of geochemical reactions, such as metals dissolution/leaching and redox reactions, might be significant for realistic prediction of enhanced electrokinetic extraction of metals in real world applications. PMID:22242884
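
    The transport part of such a model reduces, in one dimension and for a single species, to an advection-diffusion equation in which the velocity lumps together electromigration and electroosmotic advection. The explicit finite-difference sketch below shows only that transport step, with invented parameters and no geochemical coupling.

```python
import numpy as np

def transport_step(c, dx, dt, diffusion, velocity):
    """One explicit finite-difference step of dc/dt = D d2c/dx2 - v dc/dx,
    with upwind advection (valid for v > 0) and fixed boundary values."""
    d2c = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    dcdx = (c - np.roll(c, 1)) / dx          # upwind difference
    c_new = c + dt * (diffusion * d2c - velocity * dcdx)
    c_new[0], c_new[-1] = c[0], c[-1]        # Dirichlet boundaries
    return c_new

# A contaminant slug migrating toward the cathode end of the domain.
nx = 200
c = np.zeros(nx)
c[20:40] = 1.0                               # initial normalized slug
for _ in range(500):
    c = transport_step(c, dx=1.0, dt=0.2, diffusion=0.5, velocity=1.0)
```

    The time step respects both the diffusion stability limit (D*dt/dx**2 <= 0.5) and the CFL condition (v*dt/dx <= 1); the slug advects downstream while spreading, and total mass is conserved until the plume reaches a boundary.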

  20. 76 FR 39256 - Airworthiness Directives; Dassault Aviation Model FALCON 7X Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-06

    ... include modifying the applicable wiring and layout, a general visual inspection for absence of marks of... March 3, 2010. (2) Modify the applicable wiring and layout, in accordance with the Accomplishment... modifying the applicable wiring and layout, in accordance with Dassault Mandatory Service Bulletin 7X- 006...

  1. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  2. Generalized Parameter-Adjusted Stochastic Resonance of Duffing Oscillator and Its Application to Weak-Signal Detection

    PubMed Central

    Lai, Zhi-Hui; Leng, Yong-Gang

    2015-01-01

    A two-dimensional Duffing oscillator which can produce stochastic resonance (SR) is studied in this paper. We introduce its SR mechanism and present a generalized parameter-adjusted SR (GPASR) model of this oscillator to account for the necessity of parameter adjustments. The Kramers rate is chosen as the theoretical basis to establish a criterion function for judging the occurrence of SR in this model, and to analyze and summarize the parameter-adjustment rules under unmatched signal amplitude, frequency, and/or noise intensity. Furthermore, we propose a weak-signal detection approach based on this GPASR model. Finally, we employ two practical examples to demonstrate the feasibility of the proposed approach in practical engineering applications. PMID:26343671
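
    A minimal sketch of the underlying dynamics: Euler-Maruyama integration of a bistable Duffing oscillator driven by a weak periodic signal plus noise. The parameter values are assumptions, not the tuned GPASR settings.

```python
import math, random

random.seed(1)
# Bistable Duffing oscillator x'' = -gamma*x' + a*x - b*x**3 + A*cos(omega*t)
# plus additive white noise (illustrative parameters).
gamma, a, b = 0.5, 1.0, 1.0
A, omega = 0.3, 0.05       # weak signal amplitude and frequency
sigma = 0.6                # noise intensity
dt, steps = 0.01, 200_000

x, v = 1.0, 0.0            # start in the right-hand potential well
t = 0.0
xs = []
for _ in range(steps):
    acc = -gamma * v + a * x - b * x ** 3 + A * math.cos(omega * t)
    x += v * dt
    v += acc * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    t += dt
    xs.append(x)

# inter-well hops (zero crossings of x) are the signature of SR
crossings = sum(1 for x0, x1 in zip(xs, xs[1:]) if x0 * x1 < 0)
```

    With suitably matched noise intensity the trajectory hops between the wells at x = ±1 roughly in step with the weak signal; the GPASR parameter rules aim to place the system in that regime.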

  3. Estimating daily time series of streamflow using hydrological model calibrated based on satellite observations of river water surface width: Toward real world applications.

    PubMed

    Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan

    2015-05-01

    A lack of observation data for calibration constrains applications of hydrological models to estimate daily time series of streamflow. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making possible the tracking of streamflow from space. In this study, a method for calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as a tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulation with satellite observations. The uncertainty band of the streamflow simulation can span most of the 10-year average monthly observed streamflow for moderate and high flow conditions. Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for applications in the real world are addressed to promote future use of the proposed method in more ungauged basins. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
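
    The GLUE procedure itself is compact enough to sketch: sample parameter sets from a prior, score each simulation against observations, and keep the "behavioral" sets above a likelihood threshold. The toy linear-reservoir model, the threshold, and the synthetic data below are all invented for illustration.

```python
import random

random.seed(0)
# GLUE sketch on a toy linear-reservoir "hydrological model".
def simulate(k, rain):
    s, flows = 0.0, []
    for r in rain:
        s += r                 # rainfall enters storage
        q = k * s              # linear-reservoir outflow
        s -= q
        flows.append(q)
    return flows

rain = [1.0, 0.0, 2.0, 0.0, 0.0, 1.0, 0.0, 0.0]
observed = simulate(0.3, rain)             # pretend truth: k = 0.3

def nash_sutcliffe(sim, obs):
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

behavioral = []
for _ in range(5000):
    k = random.uniform(0.05, 0.95)         # sample from a uniform prior
    ns = nash_sutcliffe(simulate(k, rain), observed)
    if ns > 0.9:                           # "behavioral" threshold
        behavioral.append((k, ns))
```

    In the paper's setting the observations are satellite-derived river widths rather than flows, and the spread of the behavioral simulations provides the uncertainty band.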

  4. A Generalized Model of E-trading for GSR Fair Exchange Protocol

    NASA Astrophysics Data System (ADS)

    Konar, Debajyoti; Mazumdar, Chandan

    In this paper we propose a generalized model of E-trading for the development of GSR Fair Exchange Protocols. Based on the model, a method is narrated to implement E-trading protocols that ensure fairness in true sense without using an additional trusted third party for which either party has to pay. The model provides the scope to include the correctness of the product, money atomicity and customer's anonymity properties within E-trading protocol. We conclude this paper by indicating the area of applicability for our model.

  5. A General Approach for Specifying Informative Prior Distributions for PBPK Model Parameters

    EPA Science Inventory

    Characterization of uncertainty in model predictions is receiving more interest as more models are being used in applications that are critical to human health. For models in which parameters reflect biological characteristics, it is often possible to provide estimates of paramet...

  6. Energy Savings Forecast of SSL in General Illumination Report Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-09-30

    Summary of the DOE report Energy Savings Forecast of Solid-State Lighting in General Illumination Applications, a biannual report that models the adoption of LEDs in the U.S. general-lighting market, along with associated energy savings, based on the full potential DOE has determined to be technically feasible over time.

  7. Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications

    PubMed Central

    Huang, Jian; Zhang, Cun-Hui

    2013-01-01

    The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including the generalized linear models. We study the estimation, prediction, selection and sparsity properties of the weighted ℓ1-penalized estimator in sparse, high-dimensional settings where the number of predictors p can be much larger than the sample size n. Adaptive Lasso is considered as a special case. A multistage method is developed to approximate concave regularized estimation by applying an adaptive Lasso recursively. We provide prediction and estimation oracle inequalities for single- and multi-stage estimators, a general selection consistency theorem, and an upper bound for the dimension of the Lasso estimator. Important models including the linear regression, logistic regression and log-linear models are used throughout to illustrate the applications of the general results. PMID:24348100
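
    A concrete instance of the weighted ℓ1 estimator for squared-error loss, with one multistage (adaptive) reweighting step, can be sketched with proximal gradient descent (ISTA). The data, penalty level, and weight rule below are illustrative assumptions, not the paper's general convex-loss setting.

```python
import random

random.seed(42)
# Toy sparse linear model: n = 60 observations, p = 10 predictors.
n, p = 60, 10
beta_true = [2.0, 0.0, -1.5] + [0.0] * 7
X = [[random.gauss(0.0, 1.0) for _ in range(p)] for _ in range(n)]
y = [sum(xi[j] * beta_true[j] for j in range(p)) + 0.1 * random.gauss(0.0, 1.0)
     for xi in X]

def soft(z, t):
    # soft-thresholding: proximal operator of the l1 penalty
    return z - t if z > t else z + t if z < -t else 0.0

def ista(weights, lam, iters=500, step=1.0 / (4 * n)):
    # proximal gradient on 0.5*||y - Xb||^2 + lam * sum_j w_j |b_j|
    b = [0.0] * p
    for _ in range(iters):
        resid = [sum(X[i][j] * b[j] for j in range(p)) - y[i] for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) for j in range(p)]
        b = [soft(b[j] - step * grad[j], step * lam * weights[j])
             for j in range(p)]
    return b

lam = 6.0
b1 = ista([1.0] * p, lam)                    # first stage: plain Lasso
w = [1.0 / (abs(bj) + 1e-3) for bj in b1]    # adaptive reweighting
b2 = ista(w, lam)                            # second stage: adaptive Lasso
```

    The reweighting step is the multistage idea in miniature: coefficients the first stage left near zero receive a large weight and are pushed exactly to zero, while large coefficients are penalized less, reducing shrinkage bias.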

  8. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models.

    PubMed

    van Elburg, Ronald A J; van Ooyen, Arjen

    2009-07-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on the time constants of the synaptic currents, which hamper its general applicability. This letter addresses this problem in two ways. First, we provide physical arguments demonstrating why these constraints on the time constants can be relaxed. Second, we give a formal proof showing which constraints can be abolished. As part of our formal proof, we introduce the generalized Carnevale-Hines lemma, a new tool for comparing double exponentials as they naturally occur in many cascaded decay systems, including receptor-neurotransmitter dissociation followed by channel closing. Through repeated application of the generalized lemma, we lift most of the original constraints on the time constants. Thus, we show that the Carnevale-Hines integration scheme for the integrate-and-fire model can be employed for simulating a much wider range of neuron and synapse types than was previously thought.
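
    The core trick of event-based schemes of this kind is that between events the subthreshold equations have a closed-form solution, so the state can be advanced in a single jump. Below is a sketch for a single-exponential synapse driving a leaky membrane, a simplification of the double-exponential case treated in the letter; the parameter values are assumptions.

```python
import math

# Leaky membrane driven by a single-exponential synaptic current:
#   v' = -v/tau_m + g,   g' = -g/tau_s        (tau_m != tau_s)
# Between events this has a closed-form solution, so the state can be
# advanced in one jump instead of many small time steps.
tau_m, tau_s = 20.0, 5.0   # membrane and synaptic time constants (ms)

def advance(v, g, dt):
    em, es = math.exp(-dt / tau_m), math.exp(-dt / tau_s)
    k = tau_m * tau_s / (tau_m - tau_s)
    return em * v + k * g * (em - es), es * g

# one synaptic event of weight 0.1 at t = 0, read out 50 ms later
v50, g50 = advance(0.0, 0.1, 50.0)

# cross-check against brute-force Euler integration with tiny steps
v_e, g_e, dt = 0.0, 0.1, 0.001
for _ in range(50_000):
    v_e, g_e = v_e + dt * (-v_e / tau_m + g_e), g_e + dt * (-g_e / tau_s)
```

    Note the (em - es)/(tau_m - tau_s) structure: this is exactly the kind of double-exponential difference the generalized Carnevale-Hines lemma is used to bound, and it is where constraints on the time constants enter.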

  9. Beta Regression Finite Mixture Models of Polarization and Priming

    ERIC Educational Resources Information Center

    Smithson, Michael; Merkle, Edgar C.; Verkuilen, Jay

    2011-01-01

    This paper describes the application of finite-mixture general linear models based on the beta distribution to modeling response styles, polarization, anchoring, and priming effects in probability judgments. These models, in turn, enhance our capacity for explicitly testing models and theories regarding the aforementioned phenomena. The mixture…

  10. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  11. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  12. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  13. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  14. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  15. Brazil for Sale? Does Sino-Brazilian Trade or Investment Significantly Influence Brazil’s United Nations General Assembly (UNGA) Voting Pattern?

    DTIC Science & Technology

    2011-12-01

    ...FDI Matter for Trade in Brazil? An Application of the Gravity Model. Anais do XXXI Encontro Nacional de Economia [Proceedings of the 31st Brazilian Economics...

  16. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; E Shukhman, Alexander

    2017-01-01

    The aim of this study is to analyze the structure and interaction mechanisms of typical cloud applications and to suggest approaches to optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing the application data placement together with the state of the virtual environment, taking into account the network topology. A model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experiments showed that the algorithm decreases cloud application response time and improves the performance of user request processing. The use of software-defined data storages also reduces the number of physical storage devices, which demonstrates the efficiency of our algorithm.

  17. Formulation of strongly non-local, non-isothermal dynamics for heterogeneous solids based on the GENERIC with application to phase-field modeling

    NASA Astrophysics Data System (ADS)

    Hütter, Markus; Svendsen, Bob

    2017-12-01

    The purpose of the current work is the formulation of models for conservative and non-conservative dynamics in solid systems with the help of the General Equation for the Non-Equilibrium Reversible-Irreversible Coupling (GENERIC: e.g., Grmela and Öttinger, Phys. Rev. E 56(6), 6620 (1997); Öttinger and Grmela, Phys. Rev. E 56(6), 6633 (1997)). In this context, the resulting models are inherently spatially strongly non-local (i.e., functional) and non-isothermal in character. They are applicable in particular to the modeling of phase transitions as well as mass and heat transport in multiphase, multicomponent solids. In the last part of the work, the strongly non-local model formulation is reduced to weakly non-local form with the help of generalized gradient approximation of the energy and entropy functionals. On this basis, the current model formulation is shown to be consistent with and reduce to a recent non-isothermal generalization (Gladkov et al., J. Non-Equilib. Thermodyn. 41(2), 131 (2016)) of the well-known phase-field models of Cahn and Hilliard (J. Chem. Phys. 28(2), 258 (1958)) for conservative dynamics and of Allen and Cahn (Acta Metall. 27(6), 1085 (1979)) for non-conservative dynamics. Finally, the current approach is applied to derive a non-isothermal generalization of a phase-field crystal model for binary alloys (see, e.g., Elder et al., Phys. Rev. B 75(6), 064107 (2007)).

  18. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, as shown by means of an example.

  19. Logit Models for the Analysis of Two-Way Categorical Data

    ERIC Educational Resources Information Center

    Draxler, Clemens

    2011-01-01

    This article discusses the application of logit models for the analyses of 2-way categorical observations. The models described are generalized linear models using the logit link function. One of the models is the Rasch model (Rasch, 1960). The objective is to test hypotheses of marginal and conditional independence between explanatory quantities…

  20. Quantization and symmetry in periodic coverage patterns with applications to earth observation. [for satellite ground tracks

    NASA Technical Reports Server (NTRS)

    King, J. C.

    1975-01-01

    The general orbit-coverage problem in a simplified physical model is investigated by application of numerical approaches derived from basic number theory. A system of basic and general properties is defined by which idealized periodic coverage patterns may be characterized, classified, and delineated. The principal common features of these coverage patterns are their longitudinal quantization, determined by the revolution number R, and their overall symmetry.

  1. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials.

    PubMed

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-05-01

    Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models which assume risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressants treatment in clinical trials.

  2. Application of SIGGS to Project PRIME: A General Systems Approach to Evaluation of Mainstreaming.

    ERIC Educational Resources Information Center

    Frick, Ted

    The use of the systems approach in educational inquiry is not new, and the models of input/output, input/process/product, and cybernetic systems have been widely used. The general systems model is an extension of all these, adding the dimension of environmental influence on the system as well as system influence on the environment. However, if the…

  3. SIMULATION MODEL FOR WATERSHED MANAGEMENT PLANNING. VOLUME 2. MODEL USER MANUAL

    EPA Science Inventory

    This report provides a user manual for the hydrologic, nonpoint source pollution simulation of the generalized planning model for evaluating forest and farming management alternatives. The manual contains an explanation of application of specific code and indicates changes that s...

  4. 40 CFR 86.401-2006 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to 1990 and later model year, new methanol-fueled motorcycles built after December 31, 1989 and to... after December 31, 1996 and to 2006 and later model year new motorcycles, regardless of fuel. [69 FR...

  5. Defining a Technical Basis for Comparing and Contrasting Emerging Dynamic Discovery Protocols

    DTIC Science & Technology

    2001-05-02

    UPnP, SLP, Bluetooth, and HAVi • Projected specific UML models for Jini, UPnP, and SLP • Completed a Rapide Model of Jini structure, function, and...narrow application focus but targeting a different application domain. (e.g., HAVi, Salutation Consortium, and Bluetooth Service Discovery) • Sun has...Our General Approach?...Particulars of Our Approach Define a Generic UML Model that Encompasses Jini, UPnP, SLP, HAVi, and Bluetooth

  6. Validated linear dynamic model of electrically-shunted magnetostrictive transducers with application to structural vibration control

    NASA Astrophysics Data System (ADS)

    Scheidler, Justin J.; Asnani, Vivake M.

    2017-03-01

    This paper presents a linear model of the fully-coupled electromechanical behavior of a generally-shunted magnetostrictive transducer. The impedance and admittance representations of the model are reported. The model is used to derive the effect of the shunt’s electrical impedance on the storage modulus and loss factor of the transducer without neglecting the inherent resistance of the transducer’s coil. The expressions are normalized and then shown to also represent generally-shunted piezoelectric materials that have a finite leakage resistance. The generalized expressions are simplified for three shunts: resistive, series resistive-capacitive, and inductive, which are considered for shunt damping, resonant shunt damping, and stiffness tuning, respectively. For each shunt, the storage modulus and loss factor are plotted for a wide range of the normalized parameters. Then, important trends and their impact on different applications are discussed. An experimental validation of the transducer model is presented for the case of resistive and resonant shunts. The model closely predicts the measured response for a variety of operating conditions. This paper also introduces a model for the dynamic compliance of a vibrating structure that is coupled to a magnetostrictive transducer for shunt damping and resonant shunt damping applications. This compliance is normalized and then shown to be analogous to that of a structure that is coupled to a piezoelectric material. The derived analogies allow for the observations and equations in the existing literature on structural vibration control using shunted piezoelectric materials to be directly applied to the case of shunted magnetostrictive transducers.

  7. Advanced Spectral Modeling Development

    DTIC Science & Technology

    1992-09-14

    above, the AFGL line-by-line code already possesses many of the attributes desired of a generally applicable transmittance/radiance simulation code, it...transmittance calculations, (b) perform generalized multiple scattering calculations, (c) calculate both heating and dissociative fluxes, (d) provide...This report is subdivided into task specific subsections. The following section describes our general approach to address these technical issues (Section

  8. 40 CFR 600.501-86 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy § 600.501-86 General... and diesel automobiles. (b)(1) Manufacturers that produce only electric vehicles are exempt from the...

  9. 37 CFR 1.94 - Return of models, exhibits or specimens.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Models... business before the Office and will be returned, applicant must arrange for the return of the model... model, exhibit or specimen is no longer necessary for the conduct of business before the Office. (b...

  10. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  11. Mathematical foundations of hybrid data assimilation from a synchronization perspective

    NASA Astrophysics Data System (ADS)

    Penny, Stephen G.

    2017-12-01

    The state-of-the-art data assimilation methods used today in operational weather prediction centers around the world can be classified as generalized one-way coupled impulsive synchronization. This classification permits the investigation of hybrid data assimilation methods, which combine dynamic error estimates of the system state with long time-averaged (climatological) error estimates, from a synchronization perspective. Illustrative results show how dynamically informed formulations of the coupling matrix (via an Ensemble Kalman Filter, EnKF) can lead to synchronization when observing networks are sparse and how hybrid methods can lead to synchronization when those dynamic formulations are inadequate (due to small ensemble sizes). A large-scale application with a global ocean general circulation model is also presented. Results indicate that the hybrid methods also have useful applications in generalized synchronization, in particular, for correcting systematic model errors.
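
    The hybrid idea reduces, in its simplest form, to blending a static climatological covariance with a flow-dependent ensemble estimate before the update. A toy scalar-observation sketch follows; the dimensions, hybrid weight, and error variances are invented for illustration.

```python
import random

random.seed(3)
# Blend a static (climatological) covariance with one estimated from a
# 5-member ensemble, then assimilate one observation of component 0.
n = 4
B_clim = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

x_b = [0.0] * n                      # background state
ens = [[x_b[i] + random.gauss(0.0, 1.0) for i in range(n)] for _ in range(5)]
mean = [sum(m[i] for m in ens) / len(ens) for i in range(n)]
B_ens = [[sum((m[i] - mean[i]) * (m[j] - mean[j]) for m in ens) / (len(ens) - 1)
          for j in range(n)] for i in range(n)]

alpha = 0.5                          # hybrid weight on the static part
B = [[alpha * B_clim[i][j] + (1 - alpha) * B_ens[i][j] for j in range(n)]
     for i in range(n)]

y, r = 1.2, 0.25                     # observation and its error variance
K = [B[i][0] / (B[0][0] + r) for i in range(n)]   # Kalman gain for H = e_0
x_a = [x_b[i] + K[i] * (y - x_b[0]) for i in range(n)]
```

    With a small ensemble the sampled covariance is rank-deficient and noisy; the static term keeps the gain well-conditioned, which mirrors the paper's point about hybrid methods rescuing synchronization when the dynamic formulation alone is inadequate.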

  12. Vector calculus in non-integer dimensional space and its applications to fractal media

    NASA Astrophysics Data System (ADS)

    Tarasov, Vasily E.

    2015-02-01

    We suggest a generalization of vector calculus for the case of non-integer dimensional space. First- and second-order operations, such as the gradient, divergence, and the scalar and vector Laplace operators, are defined for non-integer dimensional space. For simplification we consider scalar and vector fields that are independent of angles. We formulate a generalization of vector calculus for rotationally covariant scalar and vector functions. This generalization allows us to describe fractal media and materials in the framework of continuum models with non-integer dimensional space. As examples of application of the suggested calculus, we consider the elasticity of fractal materials (a fractal hollow ball and a fractal cylindrical pipe with pressure inside and outside), the steady distribution of heat in fractal media, and the electric field of a fractal charged cylinder. We solve the corresponding equations for non-integer dimensional space models.
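
    For rotationally symmetric scalar fields, the Laplacian in non-integer dimension D takes the radial form Δf = f''(r) + (D − 1)/r · f'(r), so the steady heat problem between two shells has the closed-form solution T(r) = a + b·r^(2−D) for D ≠ 2. A quick numerical check of that claim (boundary values are arbitrary):

```python
# Radially symmetric Laplacian in non-integer dimension D:
#   Lap f = f''(r) + (D - 1)/r * f'(r)
# Steady heat between two shells solves Lap T = 0, i.e.
#   T(r) = a + b * r**(2 - D)   for D != 2.
D = 2.5
r1, r2, T1, T2 = 1.0, 2.0, 100.0, 0.0    # shell radii and temperatures

b = (T1 - T2) / (r1 ** (2 - D) - r2 ** (2 - D))
a = T1 - b * r1 ** (2 - D)

def T(r):
    return a + b * r ** (2 - D)

def laplacian(f, r, h=1e-4):
    d2 = (f(r + h) - 2.0 * f(r) + f(r - h)) / h ** 2
    d1 = (f(r + h) - f(r - h)) / (2.0 * h)
    return d2 + (D - 1.0) / r * d1

# the closed form should satisfy the equation to finite-difference accuracy
residual = max(abs(laplacian(T, r)) for r in (1.2, 1.5, 1.8))
```

    Setting D to an integer recovers the familiar planar, cylindrical, and spherical solutions, which is a convenient sanity check on the generalized operators.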

  13. Mathematical foundations of hybrid data assimilation from a synchronization perspective.

    PubMed

    Penny, Stephen G

    2017-12-01

    The state-of-the-art data assimilation methods used today in operational weather prediction centers around the world can be classified as generalized one-way coupled impulsive synchronization. This classification permits the investigation of hybrid data assimilation methods, which combine dynamic error estimates of the system state with long time-averaged (climatological) error estimates, from a synchronization perspective. Illustrative results show how dynamically informed formulations of the coupling matrix (via an Ensemble Kalman Filter, EnKF) can lead to synchronization when observing networks are sparse and how hybrid methods can lead to synchronization when those dynamic formulations are inadequate (due to small ensemble sizes). A large-scale application with a global ocean general circulation model is also presented. Results indicate that the hybrid methods also have useful applications in generalized synchronization, in particular, for correcting systematic model errors.

  14. Efficient polarimetric BRDF model.

    PubMed

    Renhorn, Ingmar G E; Hallberg, Tomas; Boreman, Glenn D

    2015-11-30

    The purpose of the present manuscript is to present a polarimetric bidirectional reflectance distribution function (BRDF) model suitable for hyperspectral and polarimetric signature modelling. The model is based on a further development of a previously published four-parameter model that has been generalized in order to account for different types of surface structures (generalized Gaussian distribution). A generalization of the Lambertian diffuse model is presented. The pBRDF functions are normalized using numerical integration. Using directional-hemispherical reflectance (DHR) measurements, three of the four basic parameters can be determined for any wavelength. This considerably simplifies the development of multispectral polarimetric BRDF applications. The scattering parameter has to be determined from at least one BRDF measurement. The model deals with linearly polarized radiation; as in, e.g., the facet model, depolarization is not included. The model is very general and can inherently model extreme surfaces such as mirrors and Lambertian surfaces. The complex mixture of sources is described by the sum of two basic models, a generalized Gaussian/Fresnel model and a generalized Lambertian model. Although the physics-inspired model has some ad hoc features, its predictive power is impressive over a wide range of angles and scattering magnitudes. The model has been applied successfully to painted surfaces, both dull and glossy, and also to metallic bead-blasted surfaces. The simple and efficient model should be attractive for polarimetric simulations and polarimetric remote sensing.
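
    The normalization-by-numerical-integration step can be illustrated by computing a directional-hemispherical reflectance for a toy BRDF: a Lambertian term plus a Gaussian lobe about the mirror direction. This toy kernel and its kd, ks, and width values stand in for, and are not, the paper's four-parameter model.

```python
import math

def direction(theta, phi):
    # unit vector from polar angle theta and azimuth phi
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def brdf(wi, wo, kd=0.2, ks=0.4, width=0.15):
    mirror = (-wi[0], -wi[1], wi[2])          # specular direction of wi
    c = max(-1.0, min(1.0, sum(a * b for a, b in zip(mirror, wo))))
    return kd / math.pi + ks * math.exp(-0.5 * (math.acos(c) / width) ** 2)

def dhr(theta_i, n=200):
    # directional-hemispherical reflectance by midpoint quadrature:
    #   DHR = integral of brdf * cos(theta) over the outgoing hemisphere
    wi = direction(theta_i, 0.0)
    dth, dph = (math.pi / 2) / n, (2 * math.pi) / (2 * n)
    total = 0.0
    for it in range(n):
        theta = (it + 0.5) * dth
        w = math.sin(theta) * math.cos(theta) * dth * dph
        for ip in range(2 * n):
            total += brdf(wi, direction(theta, (ip + 0.5) * dph)) * w
    return total

r20 = dhr(math.radians(20.0))   # should lie in [0, 1] for energy conservation
```

    Fitting such a DHR integral to measured DHR curves, wavelength by wavelength, is what allows most of the model parameters to be determined without full BRDF measurements.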

  15. Nucleation and growth in one dimension. I. The generalized Kolmogorov-Johnson-Mehl-Avrami model

    NASA Astrophysics Data System (ADS)

    Jun, Suckjoon; Zhang, Haiyang; Bechhoefer, John

    2005-01-01

    Motivated by a recent application of the Kolmogorov-Johnson-Mehl-Avrami (KJMA) model to the study of DNA replication, we consider the one-dimensional (1D) version of this model. We generalize previous work to the case where the nucleation rate is an arbitrary function I(t) and obtain analytical results for the time-dependent distributions of various quantities (such as the island distribution). We also present improved computer simulation algorithms to study the 1D KJMA model. The analytical results and simulations are in excellent agreement.
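
    For the special case of a constant nucleation rate I and growth speed v, the 1D KJMA result is that the untransformed fraction is S(t) = exp(−I·v·t²): a point survives if and only if its space-time cone of area v·t² contains no nucleation event. A Monte Carlo check of this case (domain size and sample counts are arbitrary choices):

```python
import math, random

random.seed(7)
# 1D KJMA: nuclei appear at rate I per unit length per unit time and grow
# in both directions at speed v.
I, v = 1.0, 1.0
L, t_end = 200.0, 1.0

# Poisson nucleation events in the space-time rectangle [0, L] x [0, t_end]
events, t = [], 0.0
while True:
    t += random.expovariate(I * L)
    if t >= t_end:
        break
    events.append((random.uniform(0.0, L), t))

def untransformed(x, t):
    # x is still a "hole" at time t iff no nucleus has reached it
    return all(abs(x - xe) > v * (t - te) for xe, te in events)

samples = 2000
holes = sum(untransformed(random.uniform(0.0, L), t_end) for _ in range(samples))
frac = holes / samples
predicted = math.exp(-I * v * t_end ** 2)    # exp(-1), about 0.368
```

    The paper's generalization replaces the constant I with an arbitrary I(t), turning the exponent into an integral over the same space-time cone.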

  16. A Wind Tunnel Model to Explore Unsteady Circulation Control for General Aviation Applications

    NASA Technical Reports Server (NTRS)

    Cagle, Christopher M.; Jones, Gregory S.

    2002-01-01

    Circulation Control airfoils have been demonstrated to provide substantial improvements in lift over conventional airfoils. The General Aviation Circulation Control model is an attempt to address some of the concerns with this technique. The primary focus is to substantially reduce the amount of air mass flow by implementing unsteady flow. This paper describes a wind tunnel model that implements unsteady circulation control by pulsing internal pneumatic valves and details some preliminary results from the first test entry.

  17. Hamilton's Equations with Euler Parameters for Rigid Body Dynamics Modeling. Chapter 3

    NASA Technical Reports Server (NTRS)

    Shivarama, Ravishankar; Fahrenthold, Eric P.

    2004-01-01

    A combination of Euler parameter kinematics and Hamiltonian mechanics provides a rigid body dynamics model well suited for use in strongly nonlinear problems involving arbitrarily large rotations. The model is unconstrained, free of singularities, includes a general potential energy function and a minimum set of momentum variables, and takes an explicit state space form convenient for numerical implementation. The general formulation may be specialized to address particular applications, as illustrated in several three dimensional example problems.
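
    The singularity-free kinematics at the heart of such formulations is the Euler-parameter (unit quaternion) rate equation q̇ = ½ q ⊗ (0, ω). The sketch below integrates a constant spin and checks the accumulated angle; it uses simple Euler stepping with renormalization, not the chapter's Hamiltonian integrator.

```python
import math

def quat_mul(a, b):
    # Hamilton product of quaternions (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def normalize(q):
    nrm = math.sqrt(sum(c * c for c in q))
    return tuple(c / nrm for c in q)

omega = (0.0, 0.0, 1.0)          # body-frame angular velocity (rad/s)
q = (1.0, 0.0, 0.0, 0.0)         # identity attitude
dt, steps = 1e-4, 31_416         # integrate for 3.1416 s

for _ in range(steps):
    dq = quat_mul(q, (0.0,) + omega)   # qdot = 0.5 * q (x) (0, omega)
    q = normalize(tuple(c + 0.5 * dt * d for c, d in zip(q, dq)))

# a rotation by theta about z gives q = (cos(theta/2), 0, 0, sin(theta/2))
theta = 2.0 * math.atan2(q[3], q[0])
```

    Because the four Euler parameters carry a single unit-norm constraint instead of the trigonometric singularities of Euler angles, arbitrarily large rotations pose no difficulty, which is the property the chapter exploits.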

  18. Hope grounded in belief: Influences of reward for application and social cynicism on dispositional hope.

    PubMed

    Bernardo, Allan B I

    2013-12-01

    Two studies explore whether general beliefs about the social world or social axioms may be antecedents of dispositional hope. Social axioms are generalized cognitive representations that provide frames for constructing individuals' hope-related cognitions. Considering social axioms' instrumental and ego-defensive functions, two social axioms, social cynicism and reward for application are hypothesized to be negative and positive predictors of hope, respectively. Study 1 used multiple regression analysis to test the hypothesis. Study 2 used structural equation modeling to test the model with a pathway linking reward for application with hope, and another pathway linking social cynicism and hope that is mediated by self-esteem. The results are discussed in terms of extending the range of psychological constructs and processes that foster the development of hope. © 2013 The Scandinavian Psychological Associations.

  19. Socrates Meets the 21st Century

    ERIC Educational Resources Information Center

    Lege, Jerry

    2005-01-01

    An inquiry-based approach called the "modelling discussion" is introduced for structuring beginning modelling activity, teaching new mathematics from examining its applications in contextual situations, and as a general classroom management technique when students are engaged in mathematical modelling. An example which illustrates the style and…

  20. Identification of aerodynamic models for maneuvering aircraft

    NASA Technical Reports Server (NTRS)

    Lan, C. Edward; Hu, C. C.

    1992-01-01

    The method based on Fourier functional analysis and indicial formulation for aerodynamic modeling, as proposed by Chin and Lan, is extensively examined and improved for general application to realistic airplane configurations. Improvements are made to automate the calculation of model coefficients and to evaluate the indicial integral more accurately. Test data over large angle-of-attack ranges for two different models, a 70 deg. delta wing and an F-18 model, are used to further verify the applicability of Fourier functional analysis and to validate the indicial formulation. The results show that the general expression for harmonic motions throughout a range of k can accurately model the nonlinear responses with large phase lag, except in regions where hysteresis behavior is inconsistent from one frequency to another. The results from the indicial formulation indicate that more accurate results are obtained when the motion starts from a low angle of attack, where the hysteresis effect is not important.

  1. A complete dynamic model of primary sedimentation.

    PubMed

    Paraskevas, P; Kolokithas, G; Lekkas, T

    1993-11-01

    A dynamic mathematical model for the primary clarifier of a wastewater treatment plant is described. The clarifier is represented by a general tanks-in-series model to simulate non-ideal mixing. The model successfully quantifies the diurnal response of both suspended and dissolved species, and it is general enough that the parameter values can be replaced with those applicable to a specific case. The model was verified with data from the Biological Centre of Metamorfosi in Athens, Greece, and can be used to assist in the design of new plants or in the analysis and output prediction of existing ones.
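    The tanks-in-series idea used in the entry above can be illustrated with a short, self-contained sketch (the parameter values and the explicit Euler scheme are illustrative assumptions, not the authors' implementation): a chain of equal completely-mixed tanks approximates non-ideal mixing, and with more tanks the outlet step response approaches plug flow.

```python
# Hedged sketch (not the paper's code): a generic tanks-in-series
# approximation of non-ideal mixing. N equal stirred tanks in series;
# each tank obeys dC_i/dt = (Q / V_i) * (C_{i-1} - C_i).

def tanks_in_series_step(n_tanks, flow, total_volume, c_in, t_end, dt=0.01):
    """Outlet concentration after a step change in inlet concentration."""
    v = total_volume / n_tanks          # volume of each tank
    c = [0.0] * n_tanks                 # tanks start clean
    t = 0.0
    while t < t_end:
        upstream = c_in                 # inlet feeds the first tank
        new_c = []
        for ci in c:
            new_c.append(ci + dt * (flow / v) * (upstream - ci))
            upstream = ci               # this tank feeds the next one
        c = new_c
        t += dt
    return c[-1]                        # outlet = last tank
```

With more tanks the early outlet response is suppressed (closer to plug flow), while after many residence times every configuration converges to the inlet concentration.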

  2. Successful Application of Adaptive Emotion Regulation Skills Predicts the Subsequent Reduction of Depressive Symptom Severity but neither the Reduction of Anxiety nor the Reduction of General Distress during the Treatment of Major Depressive Disorder

    PubMed Central

    Wirtz, Carolin M.; Radkovsky, Anna; Ebert, David D.; Berking, Matthias

    2014-01-01

    Objective Deficits in general emotion regulation (ER) skills have been linked to symptoms of depression and are thus considered a promising target in the treatment of Major depressive disorder (MDD). However, at this point, the extent to which such skills are relevant for coping with depression and whether they should instead be considered a transdiagnostic factor remain unclear. Therefore, the present study aimed to investigate whether successful ER skills application is associated with changes in depressive symptom severity (DSS), anxiety symptom severity (ASS), and general distress severity (GDS) over the course of treatment for MDD. Methods Successful ER skills application, DSS, ASS, and GDS were assessed four times during the first three weeks of treatment in 175 inpatients who met the criteria for MDD. We computed Pearson correlations to test whether successful ER skills application and the three indicators of psychopathology are cross-sectionally associated. We then performed latent growth curve modelling to test whether changes in successful ER skills application are negatively associated with a reduction of DSS, ASS, or GDS. Finally, we utilized latent change score models to examine whether successful ER skills application predicts subsequent reduction of DSS, ASS, or GDS. Results Successful ER skills application was cross-sectionally associated with lower levels of DSS, ASS, and GDS at all points of assessment. An increase in successful skills application during treatment was associated with a decrease in DSS and GDS but not ASS. Finally, successful ER skills application predicted changes in subsequent DSS but neither changes in ASS nor changes in GDS. Conclusions Although general ER skills might be relevant for a broad range of psychopathological symptoms, they might be particularly important for the maintenance and treatment of depressive symptoms. PMID:25330159

  3. Modeling pilot interaction with automated digital avionics systems: Guidance and control algorithms for contour and nap-of-the-Earth flight

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.

    1990-01-01

    A collection of technical papers are presented that cover modeling pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers presented are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.

  4. Applicability of the Risk-Need-Responsivity Model to Persons With Mental Illness Involved in the Criminal Justice System.

    PubMed

    Skeem, Jennifer L; Steadman, Henry J; Manchak, Sarah M

    2015-09-01

    National efforts to improve responses to persons with mental illness involved with the criminal justice system have traditionally focused on providing mental health services under court supervision. However, a new policy emphasis has emerged that focuses on providing correctional treatment services consistent with the risk-need-responsivity (RNR) model to reduce recidivism. The objective of this review was to evaluate empirical support for following the RNR model (developed with general offenders) with this group and to pose major questions that the field needs to address. A comprehensive search using PubMed and PsycINFO yielded 18 studies that addressed the applicability of the RNR model to the target population. The results of these studies were synthesized. There is strong support for using general risk assessment tools to assess this group's risk of recidivism. Preliminary evidence indicates that cognitive-behavioral programs targeting general risk factors are more effective than psychiatric treatment alone. However, there is as yet no direct support for the applicability of the three core RNR principles to treat this population. Although the new policy emphasis shows substantial promise, the field must avoid rushing to the next "evidence base" too rapidly and with too little data. There must be explicit recognition that RNR principles are being applied to a new population with unique characteristics (mental illness combined with justice system involvement), such that generalizability from general offender samples is uncertain. Moreover, public safety goals for the target population should not eclipse those related to public health. This group's unique features may affect both the process and outcomes of treatment.

  5. 47 CFR 18.207 - Technical report.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL INDUSTRIAL, SCIENTIFIC, AND MEDICAL EQUIPMENT... the application. (c) The full name and mailing address of the manufacturer of the device and/or applicant filing for the equipment authorization. (d) The FCC Identifier, trade name(s), and/or model number...

  6. 47 CFR 18.207 - Technical report.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL INDUSTRIAL, SCIENTIFIC, AND MEDICAL EQUIPMENT... the application. (c) The full name and mailing address of the manufacturer of the device and/or applicant filing for the equipment authorization. (d) The FCC Identifier, trade name(s), and/or model number...

  7. A multitasking general executive for compound continuous tasks.

    PubMed

    Salvucci, Dario D

    2005-05-06

    As cognitive architectures move to account for increasingly complex real-world tasks, one of the most pressing challenges involves understanding and modeling human multitasking. Although a number of existing models now perform multitasking in real-world scenarios, these models typically employ customized executives that schedule tasks for the particular domain but do not generalize easily to other domains. This article outlines a general executive for the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture that, given independent models of individual tasks, schedules and interleaves the models' behavior into integrated multitasking behavior. To demonstrate the power of the proposed approach, the article describes an application to the domain of driving, showing how the general executive can interleave component subtasks of the driving task (namely, control and monitoring) and interleave driving with in-vehicle secondary tasks (radio tuning and phone dialing). 2005 Lawrence Erlbaum Associates, Inc.

  8. Aspects of job scheduling

    NASA Technical Reports Server (NTRS)

    Phillips, K.

    1976-01-01

    A mathematical model for job scheduling in a specified context is presented. The model uses both linear programming and combinatorial methods. While designed with a view toward optimization of scheduling of facility and plant operations at the Deep Space Communications Complex, the context is sufficiently general to be widely applicable. The general scheduling problem including options for scheduling objectives is discussed and fundamental parameters identified. Mathematical algorithms for partitioning problems germane to scheduling are presented.

  9. 40 CFR 600.501-93 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Year 1978 Passenger Automobiles and for 1979 and Later Model Year Automobiles (Light Trucks and Passenger Automobiles)-Procedures for Determining Manufacturer's Average Fuel Economy § 600.501-93 General... automobiles and light trucks, and to the manufacturers of passenger automobiles and light trucks as determined...

  10. Computational Toxicology: Application in Environmental Chemicals

    EPA Science Inventory

    This chapter provides an overview of computational models that describe various aspects of the source-to-health effect continuum. Fate and transport models describe the release, transportation, and transformation of chemicals from sources of emission throughout the general envir...

  11. RECEPTOR MODEL DEVELOPMENT AND APPLICATION

    EPA Science Inventory

    Source apportionment (receptor) models are mathematical procedures for identifying and quantifying the sources of ambient air pollutants and their effects at a site (the receptor), primarily on the basis of species concentration measurements at the receptor, and generally without...

  12. Application of remote sensing to thermal pollution analysis. [satellite sea surface temperature measurement assessment

    NASA Technical Reports Server (NTRS)

    Hiser, H. W.; Lee, S. S.; Veziroglu, T. N.; Sengupta, S.

    1975-01-01

    A comprehensive numerical model development program for near-field thermal plume discharge and far-field general circulation in coastal regions is being carried out at the University of Miami Clean Energy Research Institute. The objective of the program is to develop a generalized, three-dimensional, predictive model for thermal pollution studies. Two regions of specific application of the model are the power plant sites in the Biscayne Bay and Hutchinson Island areas along the Florida coastline. Remote sensing from aircraft as well as satellites is used in parallel with in situ measurements to provide information needed for the development and verification of the mathematical model. This paper describes the efforts that have been made to identify problems and limitations of the presently available satellite data and to develop methods for enhancing and enlarging thermal infrared displays for mesoscale sea surface temperature measurements.

  13. A generalized groundwater fluctuation model based on precipitation for estimating water table levels of deep unconfined aquifers

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Shik Han, Weon; Kim, Kue-Young; Suk, Heejun; Beom Jo, Si

    2018-07-01

    A generalized water table fluctuation model based on precipitation was developed using a statistical conceptualization of unsaturated infiltration fluxes. A gamma distribution function was adopted as a transfer function due to its versatility in representing recharge rates with temporally dispersed infiltration fluxes, and a Laplace transformation was used to obtain an analytical solution. To prove the general applicability of the model, convergences with previous water table fluctuation models were shown as special cases. For validation, a few hypothetical cases were developed, where the applicability of the model to a wide range of unsaturated zone conditions was confirmed. For further validation, the model was applied to water table level estimations of three monitoring wells with considerably thick unsaturated zones on Jeju Island. The results show that the developed model represented the pattern of hydrographs from the two monitoring wells fairly well. The lag times from precipitation to recharge estimated from the developed system transfer function were found to agree with those from a conventional cross-correlation analysis. The developed model has the potential to be adopted for the hydraulic characterization of both saturated and unsaturated zones by being calibrated to actual data when extraneous and exogenous causes of water table fluctuation are limited. In addition, as it provides reference estimates, the model can be adopted as a tool for surveilling groundwater resources under hydraulically stressed conditions.
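    The core mechanism described above, a gamma-distribution transfer function mapping precipitation to temporally dispersed recharge, can be sketched as a discrete convolution. The function names, daily time step, and parameter values below are illustrative assumptions, not the authors' model.

```python
import math

# Hedged sketch of the transfer-function idea: recharge is precipitation
# convolved with a discretized gamma kernel, which spreads each rainfall
# pulse over a delayed infiltration response.

def gamma_kernel(shape, scale, n):
    """Gamma pdf evaluated at daily steps t = 1..n, renormalized."""
    k = [(t ** (shape - 1)) * math.exp(-t / scale)
         / (math.gamma(shape) * scale ** shape)
         for t in range(1, n + 1)]
    total = sum(k)
    return [v / total for v in k]       # conserve total mass

def recharge_series(precip, shape=2.0, scale=3.0, memory=30):
    """Convolve a daily precipitation series with the gamma kernel."""
    kernel = gamma_kernel(shape, scale, memory)
    out = []
    for i in range(len(precip)):
        r = sum(kernel[j] * precip[i - j]
                for j in range(min(i + 1, memory)))
        out.append(r)
    return out
```

A single rainfall pulse then produces a delayed, spread-out recharge peak (here at day 3, since the gamma pdf with these parameters peaks at (shape - 1) * scale), while total recharge equals total rainfall.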

  14. Interactive access and management for four-dimensional environmental data sets using McIDAS

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Tripoli, Gregory J.

    1991-01-01

    Significant accomplishments in the following areas are presented: (1) enhancements to the visualization of 5-D data sets (VIS-5D); (2) development of the visualization of global images (VIS-GI) application; (3) design of the Visualization for Algorithm Development (VIS-AD) System; and (4) numerical modeling applications. The focus of current research and future research plans is presented and the following topics are addressed: (1) further enhancements to VIS-5D; (2) generalization and enhancement of the VIS-GI application; (3) the implementation of the VIS-AD System; and (4) plans for modeling applications.

  15. Multilevel modelling: Beyond the basic applications.

    PubMed

    Wright, Daniel B; London, Kamala

    2009-05-01

    Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
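    The article's own examples are written in R; as a language-neutral illustration of the basic multilevel idea (pupils nested in classrooms), the following Python sketch simulates hierarchical data and recovers the intraclass correlation with the one-way ANOVA moment estimator. All names and parameter values are illustrative.

```python
import random

# Hedged illustration (not from the article): simulate pupils nested in
# classrooms, then estimate the intraclass correlation (ICC) -- the share
# of outcome variance at the classroom level.

def simulate(n_class=200, n_pupil=30, sd_class=2.0, sd_pupil=1.0, seed=1):
    rng = random.Random(seed)
    data = []
    for _ in range(n_class):
        u = rng.gauss(0, sd_class)      # classroom random intercept
        data.append([u + rng.gauss(0, sd_pupil) for _ in range(n_pupil)])
    return data

def icc(groups):
    """One-way ANOVA moment estimator of the ICC (balanced design)."""
    n, k = len(groups[0]), len(groups)
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((y - m) ** 2
              for g, m in zip(groups, means) for y in g) / (k * (n - 1))
    var_between = (msb - msw) / n
    return var_between / (var_between + msw)

# With sd_class=2 and sd_pupil=1 the true ICC is 4 / (4 + 1) = 0.8.
```

A high ICC like this is exactly the situation where ignoring the hierarchical structure (treating pupils as independent) badly understates standard errors, which is the motivation for multilevel modelling.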

  16. Land Surface Process and Air Quality Research and Applications at MSFC

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale; Khan, Maudood

    2007-01-01

    This viewgraph presentation provides an overview of land surface process and air quality research at MSFC including atmospheric modeling and ongoing research whose objective is to undertake a comprehensive spatiotemporal analysis of the effects of accurate land surface characterization on atmospheric modeling results, and public health applications. Land use maps as well as 10 meter air temperature, surface wind, PBL mean difference heights, NOx, ozone, and O3+NO2 plots as well as spatial growth model outputs are included. Emissions and general air quality modeling are also discussed.

  17. Global stability of a multiple delayed viral infection model with general incidence rate and an application to HIV infection.

    PubMed

    Ji, Yu

    2015-06-01

    In this paper, the dynamical behavior of a viral infection model with a general incidence rate and two time delays is studied. Using Lyapunov functionals and the LaSalle invariance principle, the global stability of the infection-free equilibrium and of the endemic equilibrium is established. We obtain a threshold for the global stability of the uninfected equilibrium, below which the disease will eventually be brought under control. These results apply to a variety of viral diseases and can help in devising optimal treatment strategies. Numerical simulations with an application to HIV infection are given to verify the analytical results.
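    A minimal sketch of the kind of model the paper generalizes, a standard viral dynamics model without delays and with bilinear incidence f(T, V) = beta * T * V, shows the threshold behavior numerically. All parameter values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: target cells T, infected cells I, free virus V, with
#   dT/dt = lam - d*T - beta*T*V
#   dI/dt = beta*T*V - delta*I
#   dV/dt = p*I - c*V
# The basic reproduction number is R0 = beta * p * (lam/d) / (delta * c):
# infection dies out below the threshold R0 = 1 and takes off above it.

def simulate_virus(beta, t_end=30.0, dt=0.001):
    lam, d, delta, p, c = 10.0, 0.1, 1.0, 10.0, 10.0
    T, I, V = 100.0, 0.0, 1.0          # start at uninfected equilibrium
    peak = V
    t = 0.0
    while t < t_end:
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
        peak = max(peak, V)
        t += dt
    return V, peak
```

With these parameters R0 = 100 * beta, so beta = 0.001 (R0 = 0.1) gives clearance while beta = 0.05 (R0 = 5) gives an outbreak.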

  18. Finite element implementation of Robinson's unified viscoplastic model and its application to some uniaxial and multiaxial problems

    NASA Technical Reports Server (NTRS)

    Arya, V. K.; Kaufman, A.

    1989-01-01

    A description of the finite element implementation of Robinson's unified viscoplastic model into the General Purpose Finite Element Program (MARC) is presented. To demonstrate its application, the implementation is applied to some uniaxial and multiaxial problems. A comparison of the results for the multiaxial problem of a thick internally pressurized cylinder, obtained using the finite element implementation and an analytical solution, is also presented. The excellent agreement obtained confirms the correct finite element implementation of Robinson's model.

  19. Finite element implementation of Robinson's unified viscoplastic model and its application to some uniaxial and multiaxial problems

    NASA Technical Reports Server (NTRS)

    Arya, V. K.; Kaufman, A.

    1987-01-01

    A description of the finite element implementation of Robinson's unified viscoplastic model into the General Purpose Finite Element Program (MARC) is presented. To demonstrate its application, the implementation is applied to some uniaxial and multiaxial problems. A comparison of the results for the multiaxial problem of a thick internally pressurized cylinder, obtained using the finite element implementation and an analytical solution, is also presented. The excellent agreement obtained confirms the correct finite element implementation of Robinson's model.

  20. On unified modeling, theory, and method for solving multi-scale global optimization problems

    NASA Astrophysics Data System (ADS)

    Gao, David Yang

    2016-10-01

    A unified model is proposed for general optimization problems in multi-scale complex systems. Based on this model and necessary assumptions in physics, the canonical duality theory is presented in a precise way to include traditional duality theories and popular methods as special applications. Two conjectures on NP-hardness are proposed, which should play important roles for correctly understanding and efficiently solving challenging real-world problems. Applications are illustrated for both nonconvex continuous optimization and mixed integer nonlinear programming.

  1. Curriculum Development for Business and Industry.

    ERIC Educational Resources Information Center

    Stolovitch, Harold D.; Keeps, Erica J.

    1988-01-01

    Defines the concept of curriculum for industrial personnel development needs, explains the concept of professionalism, and presents a model for developing curricula for business and industry called the Professional Development Curriculum (PDC) model. Training needs are discussed and two applications of the model in General Motors are described.…

  2. 10 CFR 429.12 - General requirements applicable to certification reports.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... distribution transformers, the basic model number or kVA grouping model number (depending on the certification... signal modules, Pedestrian modules, and Distribution transformers June 1. Room air conditioners... within one year after the first date of new model manufacture. (3) For distribution transformers, the...

  3. 10 CFR 429.12 - General requirements applicable to certification reports.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... distribution transformers, the basic model number or kVA grouping model number (depending on the certification... signal modules, Pedestrian modules, and Distribution transformers June 1. Room air conditioners... within one year after the first date of new model manufacture. (3) For distribution transformers, the...

  4. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  5. Remotely piloted vehicle: Application of the GRASP analysis method

    NASA Technical Reports Server (NTRS)

    Andre, W. L.; Morris, J. B.

    1981-01-01

    The application of General Reliability Analysis Simulation Program (GRASP) to the remotely piloted vehicle (RPV) system is discussed. The model simulates the field operation of the RPV system. By using individual component reliabilities, the overall reliability of the RPV system is determined. The results of the simulations are given in operational days. The model represented is only a basis from which more detailed work could progress. The RPV system in this model is based on preliminary specifications and estimated values. The use of GRASP from basic system definition, to model input, and to model verification is demonstrated.
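    The component-to-system reliability step described above can be sketched with a toy Monte Carlo simulation. The system structure and component reliabilities below are invented for illustration and are not the RPV system modeled by GRASP.

```python
import random

# Hedged toy version of the idea behind a reliability simulation:
# draw component up/down states from individual reliabilities, apply the
# system's success logic, and estimate overall reliability by frequency.

def simulate_system(p_components, n_trials=100000, seed=42):
    """Hypothetical system: succeeds if component 0 works AND
    at least one of components 1 and 2 (redundant pair) works."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_trials):
        up = [rng.random() < p for p in p_components]
        if up[0] and (up[1] or up[2]):
            successes += 1
    return successes / n_trials
```

For reliabilities [0.9, 0.8, 0.8] the analytic answer is 0.9 * (1 - 0.2 * 0.2) = 0.864, and the Monte Carlo estimate converges to it as trials increase.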

  6. Auditory models for speech analysis

    NASA Astrophysics Data System (ADS)

    Maybury, Mark T.

    This paper reviews the psychophysical basis for auditory models and discusses their application to automatic speech recognition. First, an overview of the human auditory system is presented, followed by a review of current knowledge gleaned from neurological and psychoacoustic experimentation. Next, a general framework describes established peripheral auditory models, which are based on well-understood properties of the peripheral auditory system. This is followed by a discussion of current enhancements to these models to include nonlinearities and synchrony information, as well as other higher auditory functions. Finally, the initial performance of auditory models in the task of speech recognition is examined and additional applications are mentioned.

  7. Goodness of Model-Data Fit and Invariant Measurement

    ERIC Educational Resources Information Center

    Engelhard, George, Jr.; Perkins, Aminah

    2013-01-01

    In this commentary, Engelhard and Perkins remark that Maydeu-Olivares has presented a framework for evaluating the goodness of model-data fit for item response theory (IRT) models and correctly points out that overall goodness-of-fit evaluations of IRT models and data are not generally explored within most applications in educational and…

  8. AERMOD: A DISPERSION MODEL FOR INDUSTRIAL SOURCE APPLICATIONS PART I: GENERAL MODEL FORMULATION AND BOUNDARY LAYER CHARACTERIZATION

    EPA Science Inventory

    The formulations of the AMS/EPA Regulatory Model Improvement Committee's applied air dispersion model (AERMOD) as related to the characterization of the planetary boundary layer are described. This is the first in a series of three articles. Part II describes the formulation of...

  9. Application of Three Cognitive Diagnosis Models to ESL Reading and Listening Assessments

    ERIC Educational Resources Information Center

    Lee, Yong-Won; Sawaki, Yasuyo

    2009-01-01

    The present study investigated the functioning of three psychometric models for cognitive diagnosis--the general diagnostic model, the fusion model, and latent class analysis--when applied to large-scale English as a second language listening and reading comprehension assessments. Data used in this study were scored item responses and incidence…

  10. Review of forest landscape models: types, methods, development and applications

    Treesearch

    Weimin Xi; Robert N. Coulson; Andrew G. Birt; Zong-Bo Shang; John D. Waldron; Charles W. Lafon; David M. Cairns; Maria D. Tchakerian; Kier D. Klepzig

    2009-01-01

    Forest landscape models simulate forest change through time using spatially referenced data across a broad spatial scale (i.e. landscape scale) generally larger than a single forest stand. Spatial interactions between forest stands are a key component of such models. These models can incorporate other spatio-temporal processes such as...

  11. Diagnostic Classification Models: Thoughts and Future Directions

    ERIC Educational Resources Information Center

    Henson, Robert A.

    2009-01-01

    The paper by Drs. Rupp and Templin provides a much needed step toward the general application of diagnostic classification modeling (DCMs). The authors have provided a summary of many of the concepts that one must consider to properly apply a DCM (which ranges from model selection and estimation, to assessing the appropriateness of the model using…

  12. APPLYING THE PATUXENT LANDSCAPE UNIT MODEL TO HUMAN DOMINATED ECOSYSTEMS: THE CASE OF AGRICULTURE. (R827169)

    EPA Science Inventory

    Non-spatial dynamics are core to landscape simulations. Unit models simulate system interactions aggregated within one space unit of resolution used within a spatial model. For unit models to be applicable to spatial simulations they have to be formulated in a general enough w...

  13. A quantum–quantum Metropolis algorithm

    PubMed Central

    Yung, Man-Hong; Aspuru-Guzik, Alán

    2012-01-01

    The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
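    The classical Metropolis method that this work generalizes can be stated in a few lines: propose a move and accept it with probability min(1, exp(-beta * dE)), which drives the chain toward the Boltzmann distribution. The discrete-state setup below is an illustrative sketch, not taken from the paper.

```python
import math
import random

# Hedged sketch of classical Metropolis sampling of a thermal (Boltzmann)
# distribution over a small set of discrete energy levels.

def metropolis(energies, beta, n_steps, seed=0):
    rng = random.Random(seed)
    state = 0
    counts = [0] * len(energies)
    for _ in range(n_steps):
        proposal = rng.randrange(len(energies))   # symmetric proposal
        d_e = energies[proposal] - energies[state]
        # Metropolis rule: always accept downhill, accept uphill
        # with probability exp(-beta * dE).
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            state = proposal
        counts[state] += 1
    return [c / n_steps for c in counts]
```

For two levels with energies 0 and 1 at beta = 1, the empirical frequencies converge to the Boltzmann weights 1/(1 + e^-1) ≈ 0.731 and e^-1/(1 + e^-1) ≈ 0.269; the quantum difficulty the paper addresses is that for a general quantum Hamiltonian the analogous eigenstates cannot be prepared efficiently on a classical computer.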

  14. Finite elements of nonlinear continua.

    NASA Technical Reports Server (NTRS)

    Oden, J. T.

    1972-01-01

    The finite element method is extended to a broad class of practical nonlinear problems, treating both theory and applications from a general and unifying point of view. The thermomechanical principles of continuous media and the properties of the finite element method are outlined, and are brought together to produce discrete physical models of nonlinear continua. The mathematical properties of the models are analyzed, and the numerical solution of the equations governing the discrete models is examined. The application of the models to nonlinear problems in finite elasticity, viscoelasticity, heat conduction, and thermoviscoelasticity is discussed. Other specific topics include the topological properties of finite element models, applications to linear and nonlinear boundary value problems, convergence, continuum thermodynamics, finite elasticity, solutions to nonlinear partial differential equations, and discrete models of the nonlinear thermomechanical behavior of dissipative media.

  15. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis is very limited due to lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to CG model with more general topology and the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  16. 40 CFR 86.016-1 - General applicability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-Duty Engines, and for 1985 and Later Model Year New Gasoline Fueled, Natural Gas-Fueled, Liquefied...) of this section. (h) Turbine engines. Turbine engines are deemed to be compression-ignition engines... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES General Provisions for...

  17. 40 CFR 86.701-94 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... medium duty passenger vehicles. (b) References in this subpart to engine families and emission control... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) General Provisions for In-Use Emission Regulations for 1994 and Later Model Year Light-Duty Vehicles and Light-Duty...

  18. Meta-analysis in Stata using gllamm.

    PubMed

    Bagos, Pantelis G

    2015-12-01

    There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: Stata Corp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with correlated estimates, for multilevel or hierarchical meta-analysis, or for meta-analysis of longitudinal data. In this work, we show with practical applications that many disparate models, including but not limited to the ones mentioned earlier, can be fitted using gllamm. The software is very versatile and can handle a wide variety of models with applications in a wide range of disciplines. The method presented here takes advantage of these modeling capabilities and makes use of appropriate transformations, based on the Cholesky decomposition of the inverse of the covariance matrix, known as generalized least squares, in order to handle correlated data. The models described earlier can be thought of as special instances of a general linear mixed-model formulation, but to the author's knowledge, a general exposition incorporating all the available models for meta-analysis as special cases, together with instructions to fit them in Stata, has not been presented so far. Source code is available at http://www.compgen.org/tools/gllamm. Copyright © 2015 John Wiley & Sons, Ltd.
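    The Cholesky-based generalized least squares transformation mentioned above can be sketched in a few lines (this is an illustration of the general technique, not the gllamm code): whitening correlated estimates with the inverse Cholesky factor of their covariance matrix reduces GLS for a common effect to ordinary least squares.

```python
# Hedged sketch: GLS estimation of a common mean from correlated
# estimates y with covariance V, via Cholesky whitening (solve L z = y
# and L w = 1, then run OLS of z on w).

def cholesky(V):
    """Lower-triangular L with L @ L.T == V (V symmetric positive definite)."""
    n = len(V)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = ((V[i][i] - s) ** 0.5 if i == j
                       else (V[i][j] - s) / L[j][j])
    return L

def forward_solve(L, b):
    """Solve L x = b for lower-triangular L."""
    x = []
    for i in range(len(b)):
        x.append((b[i] - sum(L[i][k] * x[k] for k in range(i))) / L[i][i])
    return x

def gls_common_mean(y, V):
    """GLS estimate of a common effect underlying correlated estimates y."""
    L = cholesky(V)
    z = forward_solve(L, y)                  # whitened responses
    w = forward_solve(L, [1.0] * len(y))     # whitened intercept column
    return sum(zi * wi for zi, wi in zip(z, w)) / sum(wi * wi for wi in w)
```

With a diagonal covariance this reduces to the familiar inverse-variance weighted mean; with off-diagonal terms it correctly discounts estimates that share information.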

  19. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Building on structural mean models, considerable work has recently been done on consistent estimation of the causal relative risk and the causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provides valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158
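
    The two-stage least squares estimator that the abstract uses as a baseline can be sketched on synthetic data. This is not the authors' concordance test; the instruments, confounder, and coefficients below are all invented, and the point is only that 2SLS removes the confounding bias that ordinary least squares retains.

```python
import numpy as np

# Minimal two-stage least squares (2SLS) sketch with two instruments
# (e.g. genetic variants), an unmeasured confounder U, and a linear
# outcome model. Synthetic data; illustrative only.

rng = np.random.default_rng(1)
n = 5000
Z = rng.normal(size=(n, 2))                              # instruments
U = rng.normal(size=n)                                   # unmeasured confounder
X = Z @ np.array([0.8, 0.5]) + U + rng.normal(size=n)    # exposure
beta_true = 0.3
Y = beta_true * X + U + rng.normal(size=n)               # outcome

def add_intercept(M):
    return np.column_stack([np.ones(len(M)), M])

# Stage 1: regress exposure on instruments, keep fitted values.
Z1 = add_intercept(Z)
X_hat = Z1 @ np.linalg.lstsq(Z1, X, rcond=None)[0]

# Stage 2: regress outcome on fitted exposure; slope is the 2SLS estimate.
beta_2sls = np.linalg.lstsq(add_intercept(X_hat[:, None]), Y, rcond=None)[0][1]

# Naive OLS slope is biased upward by the confounder U.
beta_ols = np.linalg.lstsq(add_intercept(X[:, None]), Y, rcond=None)[0][1]
```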

  20. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  1. Naturalistic driving data for the analysis of car-following models.

    DOT National Transportation Integrated Search

    2013-02-01

    The first research effort investigates the general application of naturalistic driving data to the modeling of car-following behavior. The driver-specific data available from naturalistic driving studies provides a unique perspective from which to te...

  2. A toy terrestrial carbon flow model

    NASA Technical Reports Server (NTRS)

    Parton, William J.; Running, Steven W.; Walker, Brian

    1992-01-01

    A generalized carbon flow model for the major terrestrial ecosystems of the world is reported. The model is a simplification of the Century model and the Forest-Biogeochemical model. Topics covered include plant production, decomposition and nutrient cycling, biomes, the utility of the carbon flow model for predicting carbon dynamics under global change, and possible applications to state-and-transition models and environmentally driven global vegetation models.

  3. General existence principles for Stieltjes differential equations with applications to mathematical biology

    NASA Astrophysics Data System (ADS)

    López Pouso, Rodrigo; Márquez Albés, Ignacio

    2018-04-01

    Stieltjes differential equations, which contain equations with impulses and equations on time scales as particular cases, simply consist in replacing the usual derivative by a derivative with respect to a nondecreasing function. In this paper we prove new existence results for functional and discontinuous Stieltjes differential equations, and we show that such general results have real-world applications. Specifically, we show that Stieltjes differential equations are especially suitable for studying populations which exhibit dormant states and/or very short (impulsive) periods of reproduction. In particular, we construct two mathematical models for the evolution of a silkworm population. Our first model can be solved explicitly, as it consists of a linear Stieltjes equation. Our second model, which is more realistic, is nonlinear, discontinuous, and functional; we deduce the existence of solutions by means of a result proven in this paper.

  4. Modeling Two-Phase Flow and Vapor Cycles Using the Generalized Fluid System Simulation Program

    NASA Technical Reports Server (NTRS)

    Smith, Amanda D.; Majumdar, Alok K.

    2017-01-01

    This work presents three new applications for the general purpose fluid network solver code GFSSP developed at NASA's Marshall Space Flight Center: (1) cooling tower, (2) vapor-compression refrigeration system, and (3) vapor-expansion power generation system. These systems are widely used across engineering disciplines in a variety of energy systems, and these models expand the capabilities and the use of GFSSP to include fluids and features that are not part of its present set of provided examples. GFSSP provides pressure, temperature, and species concentrations at designated locations, or nodes, within a fluid network based on a finite volume formulation of thermodynamics and conservation laws. This paper describes the theoretical basis for the construction of the models, their implementation in the current GFSSP modeling system, and a brief evaluation of the usefulness of the model results, as well as their applicability toward a broader spectrum of analytical problems in both university teaching and engineering research.
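
    The node-based network formulation that GFSSP uses can be illustrated with a toy linearized example. Assuming fixed boundary pressures and constant conductances (topology and all values invented here), mass conservation at the internal nodes reduces to a small linear system; a real solver such as GFSSP additionally iterates on nonlinear resistances and on the energy and species equations.

```python
import numpy as np

# Toy fluid network: flow between nodes is linearized as
# flow = conductance * (pressure difference), so requiring net flow
# into each internal node to be zero gives a linear system in the
# unknown internal pressures.

boundary = {0: 100.0, 3: 50.0}       # fixed-pressure boundary nodes (kPa)
internal = [1, 2]                    # nodes with unknown pressure
G = {(0, 1): 2.0, (1, 2): 1.0,       # branch conductances (invented units)
     (2, 3): 2.0, (1, 3): 0.5}

index = {node: i for i, node in enumerate(internal)}
A = np.zeros((len(internal), len(internal)))
b = np.zeros(len(internal))
for (i, j), g in G.items():
    for node, other in ((i, j), (j, i)):
        if node in index:
            A[index[node], index[node]] += g
            if other in index:
                A[index[node], index[other]] -= g
            else:
                b[index[node]] += g * boundary[other]

# Solve mass conservation for the internal node pressures.
p_internal = np.linalg.solve(A, b)
```

    By construction the solved pressures lie between the two boundary values, which is a quick sanity check on any such network solve.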

  5. Finite element meshing of ANSYS (trademark) solid models

    NASA Technical Reports Server (NTRS)

    Kelley, F. S.

    1987-01-01

    A large scale, general purpose finite element computer program, ANSYS, developed and marketed by Swanson Analysis Systems, Inc., is discussed. ANSYS was perhaps the first commercially available program to offer truly interactive finite element model generation. Its solid modeling application is briefly discussed and illustrated.

  6. An Illustration of Diagnostic Classification Modeling in Student Learning Outcomes Assessment

    ERIC Educational Resources Information Center

    Jurich, Daniel P.; Bradshaw, Laine P.

    2014-01-01

    The assessment of higher-education student learning outcomes is an important component in understanding the strengths and weaknesses of academic and general education programs. This study illustrates the application of diagnostic classification models, a burgeoning set of statistical models, in assessing student learning outcomes. To facilitate…

  7. Bayesian Estimation of Multivariate Latent Regression Models: Gauss versus Laplace

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew; Park, Trevor

    2017-01-01

    A latent multivariate regression model is developed that employs a generalized asymmetric Laplace (GAL) prior distribution for regression coefficients. The model is designed for high-dimensional applications where an approximate sparsity condition is satisfied, such that many regression coefficients are near zero after accounting for all the model…

  8. 40 CFR 86.401-97 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applies to 1978 and later model year, new, gasoline-fueled motorcycles built after 31 December, 1977, and to 1990 and later model year, new, methanol-fueled motorcycles built after 31 December, 1989 and to 1997 and later model year, new, natural gas-fueled and liquefied petroleum gas-fueled motorcycles built...

  9. 40 CFR 86.401-2006 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applies to 1978 and later model year, new, gasoline-fueled motorcycles built after December 31, 1977, and to 1990 and later model year, new methanol-fueled motorcycles built after December 31, 1989 and to 1997 and later model year, new natural gas-fueled and liquefied petroleum gas-fueled motorcycles built...

  10. Receptor Surface Models in the Classroom: Introducing Molecular Modeling to Students in a 3-D World

    ERIC Educational Resources Information Center

    Geldenhuys, Werner J.; Hayes, Michael; Van der Schyf, Cornelis J.; Allen, David D.; Malan, Sarel F.

    2007-01-01

    A simple, novel and generally applicable method to demonstrate structure-activity associations of a group of biologically interesting compounds in relation to receptor binding is described. This method is useful for undergraduates and graduate students in medicinal chemistry and computer modeling programs.

  11. Extension of the general thermal field equation for nanosized emitters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyritsakis, A., E-mail: akyritsos1@gmail.com; Xanthakis, J. P.

    2016-01-28

    During the previous decade, Jensen et al. developed a general analytical model that successfully describes electron emission from metals in both the field and thermionic regimes, as well as in the transition region. In that development, the standard image-corrected triangular potential barrier was used. This barrier model is valid only for planar surfaces and therefore cannot be used in general for modern nanometric emitters. In a recent publication, the authors showed that standard Fowler-Nordheim theory can be generalized for highly curved emitters if a quadratic term is included in the potential model. In this paper, we extend this generalization to high temperatures and include both the thermal and intermediate regimes. This is achieved by applying the general method developed by Jensen to the quadratic barrier model of our previous publication. We obtain results that are in good agreement with fully numerical calculations for radii R > 4 nm, while our calculated current density differs by a factor of up to 27 from the one predicted by Jensen's standard General-Thermal-Field (GTF) equation. Our extended GTF equation has applications to modern sharp electron sources, beam simulation models, and vacuum breakdown theory.

  12. Capturing nonlocal interaction effects in the Hubbard model: Optimal mappings and limits of applicability

    NASA Astrophysics Data System (ADS)

    van Loon, E. G. C. P.; Schüler, M.; Katsnelson, M. I.; Wehling, T. O.

    2016-10-01

    We investigate the Peierls-Feynman-Bogoliubov variational principle to map Hubbard models with nonlocal interactions to effective models with only local interactions. We study the renormalization of the local interaction induced by nearest-neighbor interaction and assess the quality of the effective Hubbard models in reproducing observables of the corresponding extended Hubbard models. We compare the renormalization of the local interactions as obtained from numerically exact determinant quantum Monte Carlo to approximate but more generally applicable calculations using dual boson, dynamical mean field theory, and the random phase approximation. These more approximate approaches are crucial for any application with real materials in mind. Furthermore, we use the dual boson method to calculate observables of the extended Hubbard models directly and benchmark these against determinant quantum Monte Carlo simulations of the effective Hubbard model.

  13. A class of fractional differential hemivariational inequalities with application to contact problem

    NASA Astrophysics Data System (ADS)

    Zeng, Shengda; Liu, Zhenhai; Migorski, Stanislaw

    2018-04-01

    In this paper, we study a class of generalized differential hemivariational inequalities of parabolic type involving the time-fractional order derivative operator in Banach spaces. We use the Rothe method combined with the surjectivity of multivalued pseudomonotone operators and properties of the Clarke generalized gradient to establish the existence of a solution to the abstract inequality. As an illustrative application, a frictional quasistatic contact problem for viscoelastic materials with adhesion is investigated, in which the friction and contact conditions are described by the Clarke generalized gradient of nonconvex and nonsmooth functionals, and the constitutive relation is modeled by the fractional Kelvin-Voigt law.

  14. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  15. Sparse gammatone signal model optimized for English speech does not match the human auditory filters.

    PubMed

    Strahl, Stefan; Mertins, Alfred

    2008-07-18

    Evidence that neurosensory systems use sparse signal representations, as well as the improved performance of signal processing algorithms using sparse signal models, has raised interest in sparse signal coding in recent years. For natural audio signals like speech and environmental sounds, gammatone atoms have been derived as expansion functions that generate a nearly optimal sparse signal model (Smith, E., Lewicki, M., 2006. Efficient auditory coding. Nature 439, 978-982). Furthermore, gammatone functions are established models for the human auditory filters. Thus far, a practical application of a sparse gammatone signal model has been prevented by the fact that deriving the sparsest representation is, in general, computationally intractable. In this paper, we applied an accelerated version of the matching pursuit algorithm for gammatone dictionaries, allowing real-time and large data set applications. We show that a sparse signal model in general has advantages in audio coding and that a sparse gammatone signal model encodes speech more efficiently, in terms of sparseness, than a sparse modified discrete cosine transform (MDCT) signal model. We also show that the optimal gammatone parameters derived for English speech do not match the human auditory filters, suggesting that signal processing applications should derive the parameters individually for each applied signal class instead of using psychometrically derived parameters. For brain research, this means that care should be taken when directly transferring findings of optimality from technical to biological systems.
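
    The matching pursuit algorithm at the core of the abstract can be sketched with a tiny dictionary. The gammatone-like atoms below (envelope times cosine) and their parameters are invented for illustration and are not the psychoacoustically derived atoms of the paper, and this is the plain greedy algorithm, not the accelerated version.

```python
import numpy as np

# Toy matching pursuit: repeatedly pick the unit-norm dictionary atom
# most correlated with the residual and subtract its projection.

def gammatone_atom(n, fc, fs=16000.0, order=4, b=100.0):
    """Gammatone-like atom: t^(order-1) * exp(-2*pi*b*t) * cos(2*pi*fc*t)."""
    t = np.arange(n) / fs
    g = t ** (order - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * fc * t)
    return g / np.linalg.norm(g)

n = 512
dictionary = np.stack([gammatone_atom(n, fc) for fc in (300, 600, 1200, 2400)])

# Test signal: sparse combination of two atoms plus a little noise.
rng = np.random.default_rng(2)
signal = 2.0 * dictionary[1] + 0.7 * dictionary[3] + 0.005 * rng.normal(size=n)

def matching_pursuit(x, D, n_iter=10):
    residual = x.copy()
    coeffs = np.zeros(len(D))
    for _ in range(n_iter):
        corr = D @ residual              # correlations (atoms are unit norm)
        k = np.argmax(np.abs(corr))      # greedy atom selection
        coeffs[k] += corr[k]
        residual -= corr[k] * D[k]
    return coeffs, residual

coeffs, residual = matching_pursuit(signal, dictionary)
```

    After a few iterations the residual drops toward the noise floor and the largest coefficient lands on the dominant atom, which is the sparse-coding behavior the abstract exploits.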

  16. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    PubMed

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework to the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.

  17. 40 CFR 600.001-12 - General applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the related exhaust emissions of CO2, HC, and CO, and where applicable for alternative fuel vehicles... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977... of 2012 and later model year automobiles. (b) Fuel economy and related emissions data. Unless stated...

  18. Computer and control applications in a vegetable processing plant

    USDA-ARS?s Scientific Manuscript database

    There are many advantages to the use of computers and control in the food industry. Software in the food industry takes two forms: general-purpose commercial computer software and software for specialized applications, such as drying and thermal processing of foods. Many applied simulation models for d...

  19. Statistical study of air pollutant concentrations via generalized gamma distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marani, A.; Lavagnini, I.; Buttazzoni, C.

    1986-11-01

    This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
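
    The model-comparison idea in the abstract can be sketched with scipy, whose `gengamma` distribution implements the generalized gamma family (the standard gamma is the special case c = 1). The data below are synthetic, positively skewed "concentration-like" values; the Venice sulfur dioxide data themselves are not available here.

```python
import numpy as np
from scipy import stats

# Fit a generalized gamma distribution (ggd) to synthetic non-negative
# data, fixing loc = 0 since concentrations cannot be negative, and
# compare its log-likelihood with a standard gamma fit.

rng = np.random.default_rng(3)
data = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=2000, random_state=rng)

a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(data, floc=0)

ll_gg = np.sum(stats.gengamma.logpdf(data, a_hat, c_hat, loc_hat, scale_hat))
ll_gamma = np.sum(stats.gamma.logpdf(data, *stats.gamma.fit(data, floc=0)))
```

    Because gamma, Weibull, and lognormal-like shapes all sit inside (or at the boundary of) the generalized gamma family, comparing fitted log-likelihoods within this one family is the simplification the abstract points to.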

  20. Renormalization Group Studies and Monte Carlo Simulation for Quantum Spin Systems.

    NASA Astrophysics Data System (ADS)

    Pan, Ching-Yan

    We have discussed the extended application of various real-space renormalization group methods to quantum spin systems. At finite temperature, we extended both the reliability and the range of application of the decimation renormalization group (DRG) method for calculating the thermal and magnetic properties of low-dimensional quantum spin chains, for which we have proposed general models of the three-state Potts model and the general Heisenberg model. Some interesting finite-temperature behavior of the models has been obtained. We also propose a general formula for the critical properties of the n-dimensional q-state Potts model using a modified Migdal-Kadanoff approach, which is in very good agreement with all available results for general q and d. For high-spin systems, we have investigated Haldane's famous prediction by using a modified block renormalization group approach in the spin-1/2, spin-1, and spin-3/2 cases. Our result supports Haldane's prediction, and a novel property of the spin-1 Heisenberg antiferromagnet has been predicted. A modified quantum Monte Carlo simulation approach has been developed in this study, which we use to treat quantum interacting problems (restricted here to quantum spin systems) without the "negative sign problem". We also obtain the numerical derivative directly with the Monte Carlo approach. Furthermore, using this approach we have obtained the energy spectrum and the thermodynamic properties of the antiferromagnetic q-state Potts model, and have studied the q-color problem, with results that support Mattis' recent conjecture on the entropy of the n-dimensional q-state Potts antiferromagnet. We also find a general solution for the q-color problem in d dimensions.

  1. A Generalized Hybrid Multiscale Modeling Approach for Flow and Reactive Transport in Porous Media

    NASA Astrophysics Data System (ADS)

    Yang, X.; Meng, X.; Tang, Y. H.; Guo, Z.; Karniadakis, G. E.

    2017-12-01

    Using emerging understanding of biological and environmental processes at fundamental scales to advance predictions of the larger system behavior requires the development of multiscale approaches, and there is strong interest in coupling models at different scales together in a hybrid multiscale simulation framework. A limited number of hybrid multiscale simulation methods have been developed for subsurface applications, mostly using application-specific approaches for model coupling. The proposed generalized hybrid multiscale approach is designed with minimal intrusiveness to the pre-selected at-scale simulators and provides a set of lightweight C++ scripts to manage a complex multiscale workflow utilizing a concurrent coupling approach. The workflow includes at-scale simulators (using the lattice-Boltzmann method, LBM, at the pore and Darcy scales, respectively), scripts for boundary treatment (coupling and kriging), and a multiscale universal interface (MUI) for data exchange. The current study aims to apply the generalized hybrid multiscale modeling approach to couple pore- and Darcy-scale models for flow and mixing-controlled reaction with precipitation/dissolution in heterogeneous porous media. The model domain is heterogeneously packed, so that the mixing-front geometry is more complex and not known a priori. To address those challenges, the generalized hybrid multiscale modeling approach is further developed to 1) adaptively define the locations of pore-scale subdomains, 2) provide a suite of physical boundary coupling schemes, and 3) consider the dynamic change of the pore structures due to mineral precipitation/dissolution. The results are validated and evaluated by comparing with single-scale simulations in terms of velocities, reactive concentrations, and computing cost.

  2. Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.

    PubMed

    Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong

    2010-12-01

    Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models (including condition-specific models) from users' own data. In addition, with its easily extensible open source application programming interface, Musite is aimed at being an open platform for community-based development of machine learning-based phosphorylation site prediction applications. Musite is available at http://musite.sourceforge.net/.

  3. A CONSISTENT APPROACH FOR THE APPLICATION OF PHARMACOKINETIC MODELING IN CANCER RISK ASSESSMENT

    EPA Science Inventory

    Physiologically based pharmacokinetic (PBPK) modeling provides important capabilities for improving the reliability of the extrapolations across dose, species, and exposure route that are generally required in chemical risk assessment regardless of the toxic endpoint being consid...

  4. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    NASA Astrophysics Data System (ADS)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on the data-driven character of AI, the advantages of complementary models, the relevant literature, and their possible future application in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling inflow, a novel method for preprocessing time-series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  5. Methodological quality and reporting of generalized linear mixed models in clinical medicine (2000-2012): a systematic review.

    PubMed

    Casals, Martí; Girabent-Farrés, Montserrat; Carrasco, Josep L

    2014-01-01

    Modeling count and binary data collected in hierarchical designs has increased the use of Generalized Linear Mixed Models (GLMMs) in medicine. This article presents a systematic review of the application and quality of results and information reported from GLMMs in the field of clinical medicine. A search using the Web of Science database was performed for original articles published in medical journals from 2000 to 2012. The search strategy included the topics "generalized linear mixed models", "hierarchical generalized linear models", and "multilevel generalized linear model", and the research domain was refined to science technology. Papers reporting methodological considerations without application, as well as those not involving clinical medicine or not written in English, were excluded. A total of 443 articles were detected, with an increase over time in the number of articles. In total, 108 articles fit the inclusion criteria. Of these, 54.6% were declared to be longitudinal studies, whereas 58.3% and 26.9% were defined as repeated measurements and multilevel designs, respectively. Twenty-two articles belonged to environmental and occupational public health, 10 articles to clinical neurology, 8 to oncology, and 7 to infectious diseases and pediatrics. The distribution of the response variable was reported in 88% of the articles, predominantly Binomial (n = 64) or Poisson (n = 22). Most of the useful information about GLMMs was not reported in most cases. Variance estimates of random effects were described in only 8 articles (9.2%). The model validation, the method of covariate selection, and the method of goodness of fit were reported in only 8.0%, 36.8%, and 14.9% of the articles, respectively. During recent years, the use of GLMMs in the medical literature has increased to take into account the correlation of data when modeling qualitative data or counts. According to the current recommendations, the quality of reporting has room for improvement regarding the characteristics of the analysis, estimation method, validation, and selection of the model.

  6. Experience with a vectorized general circulation weather model on Star-100

    NASA Technical Reports Server (NTRS)

    Soll, D. B.; Habra, N. R.; Russell, G. L.

    1977-01-01

    A version of an atmospheric general circulation model was vectorized to run on a CDC STAR 100. The numerical model was coded and run in two different vector languages, CDC and LRLTRAN. A factor of 10 speed improvement over an IBM 360/95 was realized. Efficient use of the STAR machine required some redesigning of algorithms and logic. This precludes the application of vectorizing compilers on the original scalar code to achieve the same results. Vector languages permit a more natural and efficient formulation for such numerical codes.

  7. Women’s Sexuality: Behaviors, Responses, and Individual Differences

    PubMed Central

    Andersen, Barbara L.; Cyranowski, Jill M.

    2009-01-01

    Classic and contemporary approaches to the assessment of female sexuality are discussed. General approaches, assessment strategies, and models of female sexuality are organized within the conceptual domains of sexual behaviors, sexual responses (desire, excitement, orgasm, and resolution), and individual differences, including general and sex-specific personality models. Where applicable, important trends and relationships are highlighted in the literature with both existing reports and previously unpublished data. The present conceptual overview highlights areas in sexual assessment and model building that are in need of further research and theoretical clarification. PMID:8543712

  8. An overview of topic modeling and its current applications in bioinformatics.

    PubMed

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models that originated from the field of natural language processing have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with the description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application in a topic model and how to develop a topic model. Meanwhile, the literature on application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the concept of document-topic-word and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) and enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, the studies on topic modeling in biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
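
    The document-topic-word decomposition that the review describes can be illustrated with a minimal numpy sketch. Plain non-negative matrix factorization (Lee-Seung multiplicative updates) stands in here for a full topic model such as LDA: a tiny term-frequency matrix V (documents x words) is factored as V ≈ W H, where W holds document-topic weights and H holds topic-word weights. The corpus and vocabulary are made up.

```python
import numpy as np

# NMF as a topic-model stand-in: factor V (documents x words) into
# W (documents x topics) and H (topics x words), all non-negative.

rng = np.random.default_rng(4)

# 6 documents, 8 words; two obvious "topics" (words 0-3 vs words 4-7).
V = np.array([
    [5, 4, 3, 4, 0, 0, 1, 0],
    [4, 5, 4, 3, 1, 0, 0, 0],
    [3, 4, 5, 4, 0, 1, 0, 0],
    [0, 0, 1, 0, 5, 4, 3, 4],
    [1, 0, 0, 0, 4, 5, 4, 3],
    [0, 1, 0, 0, 3, 4, 5, 4],
], dtype=float)

k, eps = 2, 1e-9
W = rng.random((V.shape[0], k))
H = rng.random((k, V.shape[1]))

err0 = np.linalg.norm(V - W @ H)
for _ in range(200):
    # Lee-Seung multiplicative updates preserve non-negativity.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H)
```

    Probabilistic topic models replace this least-squares factorization with a generative model over word counts, but the interpretability argument in the review (rows of H as topics, rows of W as per-document topic mixtures) already shows up in this simple form.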

  9. A Unidimensional Item Response Model for Unfolding Responses from a Graded Disagree-Agree Response Scale.

    ERIC Educational Resources Information Center

    Roberts, James S.; Laughlin, James E.

    1996-01-01

    A parametric item response theory model for unfolding binary or graded responses is developed. The graded unfolding model (GUM) is a generalization of the hyperbolic cosine model for binary data of D. Andrich and G. Luo (1993). Applicability of the GUM to attitude testing is illustrated with real data. (SLD)

  10. Photogrammetric techniques for aerospace applications

    NASA Astrophysics Data System (ADS)

    Liu, Tianshu; Burner, Alpheus W.; Jones, Thomas W.; Barrows, Danny A.

    2012-10-01

    Photogrammetric techniques have been used for measuring the important physical quantities in both ground and flight testing including aeroelastic deformation, attitude, position, shape and dynamics of objects such as wind tunnel models, flight vehicles, rotating blades and large space structures. The distinct advantage of photogrammetric measurement is that it is a non-contact, global measurement technique. Although the general principles of photogrammetry are well known particularly in topographic and aerial survey, photogrammetric techniques require special adaptation for aerospace applications. This review provides a comprehensive and systematic summary of photogrammetric techniques for aerospace applications based on diverse sources. It is useful mainly for aerospace engineers who want to use photogrammetric techniques, but it also gives a general introduction for photogrammetrists and computer vision scientists to new applications.

  11. Identifying model error in metabolic flux analysis - a generalized least squares approach.

    PubMed

    Sokolenko, Stanislav; Quattrociocchi, Marco; Aucoin, Marc G

    2016-09-13

    The estimation of intracellular flux through traditional metabolic flux analysis (MFA) using an overdetermined system of equations is a well established practice in metabolic engineering. Despite the continued evolution of the methodology since its introduction, there has been little focus on validation and identification of poor model fit beyond the detection of "gross measurement error". The growing complexity of metabolic models, which are increasingly generated from genome-level data, has necessitated robust validation that can directly assess model fit. In this work, MFA calculation is framed as a generalized least squares (GLS) problem, highlighting the applicability of the common t-test for model validation. To differentiate between measurement and model error, we simulate ideal flux profiles directly from the model, perturb them with estimated measurement error, and compare their validation to real data. Application of this strategy to an established Chinese Hamster Ovary (CHO) cell model shows how fluxes validated by traditional means may be largely non-significant due to a lack of model fit. With further simulation, we explore how t-test significance relates to calculation error and show that fluxes found to be non-significant have 2-4 fold larger error (when measurement uncertainty is in the 5-10% range). The proposed validation method goes beyond traditional detection of "gross measurement error" to identify lack of fit between model and data. Although the focus of this work is on t-test validation and traditional MFA, the presented framework is readily applicable to other regression analysis methods and MFA formulations.
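
    The GLS framing described above can be sketched in a few lines. This is a hedged illustration with a made-up 4x2 stoichiometric mapping, flux vector, and error covariance, not the authors' CHO model:

```python
import numpy as np

# Hypothetical overdetermined MFA system: measured rates m relate to
# unknown fluxes v through a stoichiometric mapping S, with known
# (heteroscedastic) measurement-error covariance Sigma.
rng = np.random.default_rng(0)
S = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [2.0, 1.0]])                  # 4 measurements, 2 unknown fluxes
v_true = np.array([1.5, 0.8])
Sigma = np.diag([0.05, 0.05, 0.02, 0.08])   # measurement-error variances

m = S @ v_true + rng.multivariate_normal(np.zeros(4), Sigma)

# GLS estimate: v_hat = (S' W S)^{-1} S' W m, with W = Sigma^{-1}
W = np.linalg.inv(Sigma)
cov_v = np.linalg.inv(S.T @ W @ S)          # covariance of the flux estimates
v_hat = cov_v @ S.T @ W @ m

# Per-flux t-statistic: estimate divided by its standard error. In the
# paper's spirit, an unexpectedly small |t| flags lack of fit rather than
# "gross measurement error".
se = np.sqrt(np.diag(cov_v))
t_stats = v_hat / se
```

    Because the weighting matrix W downweights noisy measurements, the GLS estimate and its standard errors come directly from the same linear algebra, which is what makes the t-test applicable.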

  12. Effects of Interdisciplinary Education on Technology-Driven Application Design

    ERIC Educational Resources Information Center

    Tafa, Z.; Rakocevic, G.; Mihailovic, D.; Milutinovic, V.

    2011-01-01

    This paper describes the structure and the underlying rationale of a new course dedicated to capability maturity model integration (CMMI)-directed design of wireless sensor networks (WSNs)-based biomedical applications that stresses: 1) engineering-, medico-engineering-, and informatics-related issues; 2) design for general- and special-purpose…

  13. [Impact analysis of shuxuetong injection on abnormal changes of ALT based on generalized boosted models propensity score weighting].

    PubMed

    Yang, Wei; Yi, Dan-Hui; Xie, Yan-Ming; Yang, Wei; Dai, Yi; Zhi, Ying-Jie; Zhuang, Yan; Yang, Hu

    2013-09-01

    The aim was to estimate the treatment effects of Shuxuetong injection on abnormal changes in the ALT index, that is, to explore whether Shuxuetong injection harms liver function in clinical settings, and to provide guidance for its safe clinical application. Clinical information on traditional Chinese medicine (TCM) injections was gathered from the hospital information systems (HIS) of eighteen general hospitals. This retrospective cohort study used abnormal change in the ALT index as the outcome. A large number of confounding biases were accounted for using generalized boosted models (GBM) and a multiple logistic regression model (MLRM) to estimate the treatment effects of Shuxuetong injection on abnormal ALT changes and to explore possible influencing factors. The advantages and application procedure of GBM are demonstrated with examples; the method eliminates biases from most confounding variables between groups and thereby refines the estimated treatment effect of Shuxuetong injection on the ALT index, making the results more reliable. Based on large-scale clinical observational data from the HIS database, no significant effect of Shuxuetong injection on abnormal changes in ALT was found.

  14. Modified Petri net model sensitivity to workload manipulations

    NASA Technical Reports Server (NTRS)

    White, S. A.; Mackinnon, D. P.; Lyman, J.

    1986-01-01

    Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. The general hypothesis here is that, in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. The results are reported of the first of a series of experiments designed to develop and test an MPN system of workload estimation and prediction. This first experiment is a screening test of the MPN model's general sensitivity to changes in workload. Positive results from this experiment would justify the more complicated analyses and techniques necessary for developing a workload prediction system.

  15. Survey of Airport Access Analysis Techniques - Models, Data and a Research Program

    DOT National Transportation Integrated Search

    1972-06-01

    The report points up the differences and similarities between airport access travel and general urban trip making. Models and surveys developed for, or applicable to, airport access planning are reviewed. A research program is proposed which would ge...

  16. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
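
    The limited fluctuation (full credibility) standard the abstract refers to can be computed directly. This sketch uses the classical actuarial formula for Poisson claim counts with illustrative, hypothetical numbers, not the paper's Turkish insurance data:

```python
from statistics import NormalDist

# Full-credibility standard: for Poisson claim counts, full credibility at
# confidence level p within relative tolerance k requires at least
# (z_{(1+p)/2} / k)^2 expected claims.
p, k = 0.95, 0.05
z = NormalDist().inv_cdf((1 + p) / 2)     # two-sided standard normal quantile
n_full = (z / k) ** 2                     # ≈ 1536.6 expected claims

# Partial credibility via the square-root rule for a (hypothetical)
# risk class with 800 expected claims.
lam_obs = 800
Z = min(1.0, (lam_obs / n_full) ** 0.5)   # credibility factor in [0, 1]
```

    The factor Z would then weight the class's own experience against the GLM-based collective estimate, which is the combination the paper studies.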

  17. Inverse scattering method and soliton double solution family for the general symplectic gravity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao Yajun

    A previously established Hauser-Ernst-type extended double-complex linear system is slightly modified and used to develop an inverse scattering method for the stationary axisymmetric general symplectic gravity model. The reduction procedures in this inverse scattering method are found to be fairly simple, which makes the method straightforward and effective to apply. As an application, a concrete family of soliton double solutions for the considered theory is obtained.

  18. An Application to the Prediction of LOD Change Based on General Regression Neural Network

    NASA Astrophysics Data System (ADS)

    Zhang, X. H.; Wang, Q. J.; Zhu, J. J.; Zhang, H.

    2011-07-01

    Traditional prediction of LOD (length of day) change was based on linear models, such as the least squares model and the autoregressive technique. Due to the complex non-linear features of the LOD variation, the performance of linear predictors is not fully satisfactory. This paper applies a non-linear model, the general regression neural network (GRNN), to forecast LOD change, and the results are analyzed and compared with those obtained with the back-propagation neural network and other models. The comparison shows that the GRNN model is an efficient and feasible predictor of LOD change.
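
    A GRNN is equivalent to Nadaraya-Watson kernel regression, which makes the core idea easy to sketch. The LOD series and the paper's configuration are not reproduced here; the toy signal, two-lag embedding, and smoothing parameter sigma are illustrative assumptions:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    # Gaussian kernel weight between each query point and each training
    # pattern; the prediction is the kernel-weighted mean of the targets.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy task: one-step-ahead prediction of a smooth series from two lags.
t = np.linspace(0.0, 4.0 * np.pi, 400)
x = np.sin(t)
X = np.stack([x[:-2], x[1:-1]], axis=1)   # inputs (x_{t-1}, x_t)
y = x[2:]                                  # target x_{t+1}
pred = grnn_predict(X[:300], y[:300], X[300:], sigma=0.1)
rmse = np.sqrt(np.mean((pred - y[300:]) ** 2))
```

    Unlike a back-propagation network, there is no iterative training: the only free parameter is the kernel width, which is one reason GRNNs are attractive for this kind of forecasting.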

  19. Evaluating targeted interventions via meta-population models with multi-level mixing.

    PubMed

    Feng, Zhilan; Hill, Andrew N; Curns, Aaron T; Glasser, John W

    2017-05-01

    Among the several means by which heterogeneity can be modeled, Levins' (1969) meta-population approach preserves the most analytical tractability, a virtue to the extent that generality is desirable. When model populations are stratified, contacts among their respective sub-populations must be described. Using a simple meta-population model, Feng et al. (2015) showed that mixing among sub-populations, as well as heterogeneity in characteristics affecting sub-population reproduction numbers, must be considered when evaluating public health interventions to prevent or control infectious disease outbreaks. They employed the convex combination of preferential within- and proportional among-group contacts first described by Nold (1980) and subsequently generalized by Jacquez et al. (1988). As the utility of meta-population modeling depends on more realistic mixing functions, the authors added preferential contacts between parents and children and among co-workers (Glasser et al., 2012). Here they further generalize this function by including preferential contacts between grandparents and grandchildren, but omit workplace contacts. They also describe a general multi-level mixing scheme, provide three two-level examples, and apply two of them. In their first application, the authors describe age- and gender-specific patterns in face-to-face conversations (Mossong et al., 2008), proxies for contacts by which respiratory pathogens might be transmitted, that are consistent with everyday experience. This suggests that meta-population models with inter-generational mixing could be employed to evaluate prolonged school closures, a proposed pandemic mitigation measure that could expose grandparents, and other elderly surrogate caregivers for working parents, to infectious children. In their second application, the authors use a meta-population SEIR model stratified by 7 age groups and 50 states plus the District of Columbia, to compare actual with optimal vaccination during the 2009-2010 influenza pandemic in the United States. They also show that vaccination efforts could have been adjusted month-to-month during the fall of 2009 to ensure maximum impact. Such applications inspire confidence in the reliability of meta-population modeling in support of public health policymaking. Published by Elsevier Inc.

  20. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, George J.

    1990-01-01

    The development of a general mathematical model and solution methodologies for analyzing structural response of thin, metallic shell-like structures under dynamic and/or static thermomechanical loads is examined. In the mathematical model, geometric as well as material-type of nonlinearities are considered. Traditional as well as novel approaches are reported and detailed applications are presented in the appendices. The emphasis for the mathematical model, the related solution schemes, and the applications, is on thermal viscoelastic and viscoplastic phenomena, which can predict creep and ratchetting.

  1. Analysis of shell-type structures subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.

    1991-01-01

    This report deals with the development of a general mathematical model and solution methodology for analyzing the structural response of thin, metallic shell-like structures under dynamic and/or static thermomechanical loads. In the mathematical model, geometric as well as the material-type of nonlinearities are considered. Traditional as well as novel approaches are reported and detailed applications are presented in the appendices. The emphasis for the mathematical model, the related solution schemes, and the applications, is on thermal viscoelastic and viscoplastic phenomena, which can predict creep and ratchetting.

  2. Application of a metabolic balancing technique to the analysis of microbial fermentation data.

    PubMed

    de Hollander, J A

    1991-01-01

    A general method for the development of fermentation models, based on elemental and metabolic balances, is illustrated with three examples from the literature. Physiological parameters such as the (maximal) yield on ATP, the energetic maintenance coefficient, the P/O ratio and others are estimated by fitting model equations to experimental data. Further, phenomenological relations concerning kinetics of product formation and limiting enzyme activities are assessed. The results are compared with the conclusions of the original articles, and differences due to the application of improved models are discussed.

  3. Convergence dynamics and pseudo almost periodicity of a class of nonautonomous RFDEs with applications

    NASA Astrophysics Data System (ADS)

    Fan, Meng; Ye, Dan

    2005-09-01

    This paper studies the dynamics of a system of retarded functional differential equations (RFDEs) that generalizes the Hopfield neural network models, the bidirectional associative memory neural networks, the hybrid network models of the cellular neural network type, and some population growth models. Sufficient criteria are established for global exponential stability and for the existence and uniqueness of a pseudo almost periodic solution. The approaches are based on constructing suitable Lyapunov functionals and on the well-known Banach contraction mapping principle. The paper ends with applications of the main results to some neural network and population growth models, together with numerical simulations.

  4. Application of 6D Building Information Model (6D BIM) for Business-storage Building in Slovenia

    NASA Astrophysics Data System (ADS)

    Pučko, Zoran; Vincek, Dražen; Štrukelj, Andrej; Šuman, Nataša

    2017-10-01

    The aim of this paper is to present an application of 6D building information modelling (6D BIM) to a real business-storage building in Slovenia. First, the features of building maintenance in general are described according to current Slovenian legislation, and a general outline of BIM is given. Then the step-by-step activities for building the 6D BIM model are presented: compiling the element list for maintenance, determining element lifetimes and service measures, cost analysis, time analysis, and finally 6D BIM modelling. The presented 6D BIM model is designed in a unique way: the cost analysis is performed as a 5D BIM model with data linked to BIM construction project management software (Vico Office) and integrated with the 3D BIM model, whereas the time analysis (the 4D BIM model) is carried out as non-linked data with the help of Excel, without a connection to the 3D BIM model. The paper is intended to serve as a guide for building owners preparing a 6D BIM model and to provide insight into the relevant dynamic information about the intervals and costs of maintenance works over the whole building lifecycle.

  5. Modeling the brain morphology distribution in the general aging population

    NASA Astrophysics Data System (ADS)

    Huizinga, W.; Poot, D. H. J.; Roshchupkin, G.; Bron, E. E.; Ikram, M. A.; Vernooij, M. W.; Rueckert, D.; Niessen, W. J.; Klein, S.

    2016-03-01

    Both normal aging and neurodegenerative diseases such as Alzheimer's disease cause morphological changes of the brain. To better distinguish between normal and abnormal cases, it is necessary to model changes in brain morphology owing to normal aging. To this end, we developed a method for analyzing and visualizing these changes for the entire brain morphology distribution in the general aging population. The method is applied to 1000 subjects from a large population imaging study in the elderly, from which 900 were used to train the model and 100 were used for testing. The results of the 100 test subjects show that the model generalizes to subjects outside the model population. Smooth percentile curves showing the brain morphology changes as a function of age and spatiotemporal atlases derived from the model population are publicly available via an interactive web application at agingbrain.bigr.nl.

  6. Conditions for the cosmological viability of the most general scalar-tensor theories and their applications to extended Galileon dark energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felice, Antonio De; Tsujikawa, Shinji, E-mail: antoniod@nu.ac.th, E-mail: shinji@rs.kagu.tus.ac.jp

    2012-02-01

    In Horndeski's most general scalar-tensor theories with second-order field equations, we derive the conditions for the avoidance of ghosts and Laplacian instabilities associated with scalar, tensor, and vector perturbations in the presence of two perfect fluids on the flat Friedmann-Lemaître-Robertson-Walker (FLRW) background. Our general results are useful for the construction of theoretically consistent models of dark energy. We apply our formulas to extended Galileon models in which a tracker solution with an equation of state smaller than -1 is present. We clarify the allowed parameter space in which ghosts and Laplacian instabilities are absent and numerically confirm that such models are indeed cosmologically viable.

  7. Version Control in Project-Based Learning

    ERIC Educational Resources Information Center

    Milentijevic, Ivan; Ciric, Vladimir; Vojinovic, Oliver

    2008-01-01

    This paper deals with the development of a generalized model for version control systems application as a support in a range of project-based learning methods. The model is given as UML sequence diagram and described in detail. The proposed model encompasses a wide range of different project-based learning approaches by assigning a supervisory…

  8. Power Analysis for Complex Mediational Designs Using Monte Carlo Methods

    ERIC Educational Resources Information Center

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2010-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex…

  9. General Circulation Model Output for Forest Climate Change Research and Applications

    Treesearch

    Ellen J. Cooter; Brian K. Eder; Sharon K. LeDuc; Lawrence Truppi

    1993-01-01

    This report reviews technical aspects of and summarizes output from four climate models. Recommendations concerning the use of these outputs in forest impact assessments are made.

  10. EVALUATION OF THE REAL-TIME AIR-QUALITY MODEL USING THE RAPS (REGIONAL AIR POLLUTION STUDY) DATA BASE. VOLUME 1. OVERVIEW

    EPA Science Inventory

    The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four report volumes. Moreover, the tests are generally applicable to other model evaluation problem...

  11. A generalized system of models forecasting Central States tree growth.

    Treesearch

    Stephen R. Shifley

    1987-01-01

    Describes the development and testing of a system of individual tree-based growth projection models applicable to species in Indiana, Missouri, and Ohio. Annual tree basal area growth is estimated as a function of tree size, crown ratio, stand density, and site index. Models are compatible with the STEMS and TWIGS Projection System.

  12. Application of Local Discretization Methods in the NASA Finite-Volume General Circulation Model

    NASA Technical Reports Server (NTRS)

    Yeh, Kao-San; Lin, Shian-Jiann; Rood, Richard B.

    2002-01-01

    We present the basic ideas of the dynamics system of the finite-volume General Circulation Model developed at NASA Goddard Space Flight Center for climate simulations and other applications in meteorology. The dynamics of this model is designed with emphasis on conservative and monotonic transport, where the property of Lagrangian conservation is used to maintain the physical consistency of the computational fluid for long-term simulations. While the model benefits from the noise-free solutions of monotonic finite-volume transport schemes, the property of Lagrangian conservation also partly compensates for the loss of transport accuracy caused by the diffusion that the monotonicity treatment introduces. By faithfully maintaining the fundamental laws of physics during the computation, this model is able to achieve sufficient accuracy for the global consistency of climate processes. Because the computing algorithms are based on local memory, this model has the advantage of efficiency in parallel computation with distributed memory. Further research is still desirable to reduce the diffusion effects of monotonic transport for better accuracy, and to mitigate the limitation due to fast-moving gravity waves for better efficiency.
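
    The two transport properties the abstract emphasizes, conservation and monotonicity, can be demonstrated with a generic textbook scheme. This is a first-order upwind finite-volume step on a periodic 1D grid, not the NASA model's actual transport operator:

```python
import numpy as np

nx, c = 100, 0.5              # number of cells, Courant number (must be <= 1)
q = np.zeros(nx)
q[40:60] = 1.0                # square pulse of tracer
mass0 = q.sum()

# First-order upwind advection with periodic boundaries: each update is a
# convex combination q_new[i] = (1 - c) q[i] + c q[i-1].
for _ in range(50):
    q = q - c * (q - np.roll(q, 1))

# Conservative: interface fluxes cancel in pairs, so total mass is unchanged.
# Monotone: values remain within the initial [0, 1] range; the pulse is
# diffused (the accuracy cost the abstract mentions) but never oscillates.
```

    The trade-off visible here, exact conservation and no new extrema at the price of numerical diffusion, is precisely the one the abstract says Lagrangian conservation partly compensates for.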

  13. On a nonlinear model for tumour growth with drug application

    NASA Astrophysics Data System (ADS)

    Donatelli, Donatella; Trivisa, Konstantina

    2015-05-01

    We investigate the dynamics of a nonlinear system modelling tumour growth with drug application. The tumour is viewed as a mixture consisting of proliferating, quiescent and dead cells as well as a nutrient in the presence of a drug. The system is given by a multi-phase flow model: the densities of the different cells are governed by a set of transport equations, the density of the nutrient and the density of the drug are governed by rather general diffusion equations, while the velocity of the tumour is given by Brinkman's equation. The domain occupied by the tumour in this setting is a growing continuum Ω with boundary ∂Ω both of which evolve in time. Global-in-time weak solutions are obtained using an approach based on penalization of the boundary behaviour, diffusion and viscosity in the weak formulation. Both the solutions and the domain are rather general, no symmetry assumption is required and the result holds for large initial data. This article is part of a research programme whose aim is the investigation of the effect of drug application in tumour growth.

  14. Prediction of 1-octanol solubilities using data from the Open Notebook Science Challenge.

    PubMed

    Buonaiuto, Michael A; Lang, Andrew S I D

    2015-12-01

    1-Octanol solubility is important in a variety of applications involving pharmacology and environmental chemistry. Current models are linear in nature and often require foreknowledge of either melting point or aqueous solubility. Here we extend the range of applicability of 1-octanol solubility models by creating a random forest model that can predict 1-octanol solubilities directly from structure. We created a random forest model using CDK descriptors that has an out-of-bag (OOB) R² value of 0.66 and an OOB mean squared error of 0.34. The model has been deployed for general use as a Shiny application. The 1-octanol solubility model provides reasonably accurate predictions of the 1-octanol solubility of organic solutes directly from structure. The model was developed under Open Notebook Science conditions, which makes it open, reproducible, and as useful as possible.
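
    The out-of-bag (OOB) R² the abstract reports is a built-in validation trick of bagged ensembles: each learner is scored only on the points its bootstrap sample left out. A hedged sketch of the mechanism, with a 1-nearest-neighbour base learner standing in for the random forest and synthetic data in place of the CDK descriptors and Challenge measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-2.0, 2.0, (200, 2))                     # fake descriptors
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.standard_normal(200)

n, B = len(X), 200
oob_sum, oob_cnt = np.zeros(n), np.zeros(n)
for _ in range(B):
    idx = rng.integers(0, n, n)                 # bootstrap resample
    oob = np.setdiff1d(np.arange(n), idx)       # points this learner never saw
    Xb, yb = X[idx], y[idx]
    for i in oob:                               # 1-NN prediction from the bag
        j = int(np.argmin(((Xb - X[i]) ** 2).sum(axis=1)))
        oob_sum[i] += yb[j]
        oob_cnt[i] += 1

# OOB prediction = average over the learners that did not train on the point,
# giving an honest R² without a separate hold-out set.
mask = oob_cnt > 0
oob_pred = oob_sum[mask] / oob_cnt[mask]
ss_res = float(((y[mask] - oob_pred) ** 2).sum())
ss_tot = float(((y[mask] - y[mask].mean()) ** 2).sum())
oob_r2 = 1.0 - ss_res / ss_tot
```
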

  15. Empirical comparison study of approximate methods for structure selection in binary graphical models.

    PubMed

    Viallon, Vivian; Banerjee, Onureena; Jougla, Eric; Rey, Grégoire; Coste, Joel

    2014-03-01

    Looking for associations among multiple variables is a topical issue in statistics due to the increasing amount of data encountered in biology, medicine, and many other domains involving statistical applications. Graphical models have recently gained popularity for this purpose in the statistical literature. In the binary case, however, exact inference is generally very slow or even intractable because of the form of the so-called log-partition function. In this paper, we review various approximate methods for structure selection in binary graphical models that have recently been proposed in the literature and compare them through an extensive simulation study. We also propose a modification of one existing method, that is shown to achieve good performance and to be generally very fast. We conclude with an application in which we search for associations among causes of death recorded on French death certificates. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Volterra model of the parametric array loudspeaker operating at ultrasonic frequencies.

    PubMed

    Shi, Chuang; Kajikawa, Yoshinobu

    2016-11-01

    The parametric array loudspeaker (PAL) is an application of the parametric acoustic array in air, which can be applied to transmit a narrow audio beam from an ultrasonic emitter. However, nonlinear distortion is very perceptible in the audio beam. Modulation methods to reduce the nonlinear distortion are available for on-axis far-field applications; for other applications, preprocessing techniques are still lacking. In order to develop a preprocessing technique with general applicability to a wide range of operating conditions, the Volterra filter is investigated as a nonlinear model of the PAL in this paper. Limitations of the standard audio-to-audio Volterra filter are elaborated. An improved ultrasound-to-ultrasound Volterra filter is proposed and empirically demonstrated to be a more generic Volterra model of the PAL.

  17. 78 FR 50320 - Airworthiness Directives; General Electric Company Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... Airworthiness Directives; General Electric Company Turbofan Engines AGENCY: Federal Aviation Administration (FAA... Electric Company (GE) model GEnx-2B67B turbofan engines with booster anti-ice (BAI) air duct, part number...-2B67 turbofan engine be removed from the Applicability section of this AD. The commenters noted that...

  18. Formulation and Application of the Hierarchical Generalized Random-Situation Random-Weight MIRID

    ERIC Educational Resources Information Center

    Hung, Lai-Fa

    2011-01-01

    The process-component approach has become quite popular for examining many psychological concepts. A typical example is the model with internal restrictions on item difficulty (MIRID) described by Butter (1994) and Butter, De Boeck, and Verhelst (1998). This study proposes a hierarchical generalized random-situation random-weight MIRID. The…

  19. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and the process of the harmonization process between the eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.

  20. Generalized trajectory surface hopping method based on the Zhu-Nakamura theory

    NASA Astrophysics Data System (ADS)

    Oloyede, Ponmile; Mil'nikov, Gennady; Nakamura, Hiroki

    2006-04-01

    We present a generalized formulation of the trajectory surface hopping method applicable to a general multidimensional system. The method is based on the Zhu-Nakamura theory of a nonadiabatic transition and therefore includes the treatment of classically forbidden hops. The method uses a generalized recipe for the conservation of angular momentum after forbidden hops and an approximation for determining a nonadiabatic transition direction which is crucial when the coupling vector is unavailable. This method also eliminates the need for a rigorous location of the seam surface, thereby ensuring its applicability to a wide class of chemical systems. In a test calculation, we implement the method for the DH2+ system, and it shows a remarkable agreement with the previous results of C. Zhu, H. Kamisaka, and H. Nakamura, [J. Chem. Phys. 116, 3234 (2002)]. We then apply it to a diatomic-in-molecule model system with a conical intersection, and the results compare well with exact quantum calculations. The successful application to the conical intersection system confirms the possibility of directly extending the present method to an arbitrary potential of general topology.

  1. Generalized logistic map and its application in chaos based cryptography

    NASA Astrophysics Data System (ADS)

    Lawnik, M.

    2017-12-01

    The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not permit a safe construction of encryption algorithms. Thus, the scope of the paper is a proposed generalization of the logistic map by means of a well-recognized family of chaotic maps. Next, the Lyapunov exponent and the distribution of the iterated variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace the classic logistic map in applications involving chaotic cryptography.
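
    The Lyapunov-exponent analysis mentioned above is straightforward to sketch numerically for the classic (ungeneralized) logistic map; the paper's generalized family is not reproduced here. A positive exponent is the usual chaos criterion relied on in chaos-based cryptography:

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=20000):
    # Estimate the Lyapunov exponent of x -> r*x*(1-x) as the long-run
    # average of log |f'(x)| = log |r*(1 - 2x)| along the orbit.
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
    return acc / n_iter

# At r = 4 the map is fully chaotic and the exponent approaches ln 2.
lam = lyapunov_logistic(4.0)
```

    A generalized map is considered an acceptable replacement only if, like this case, its exponent stays positive across the parameter range used for key generation.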

  2. Detection of generalized synchronization using echo state networks

    NASA Astrophysics Data System (ADS)

    Ibáñez-Soria, D.; Garcia-Ojalvo, J.; Soria-Frisch, A.; Ruffini, G.

    2018-03-01

    Generalized synchronization between coupled dynamical systems is a phenomenon of relevance in applications that range from secure communications to physiological modelling. Here, we test the capabilities of reservoir computing and, in particular, echo state networks for the detection of generalized synchronization. A nonlinear dynamical system consisting of two coupled Rössler chaotic attractors is used to generate temporal series consisting of time-locked generalized synchronized sequences interleaved with unsynchronized ones. Correctly tuned, echo state networks are able to efficiently discriminate between unsynchronized and synchronized sequences even in the presence of relatively high levels of noise. Compared to other state-of-the-art techniques of synchronization detection, the online capabilities of the proposed echo state network-based methodology make it a promising choice for real-time applications aiming to monitor dynamical synchronization changes in continuous signals.

  3. JIGSAW-GEO (1.0): Locally Orthogonal Staggered Unstructured Grid Generation for General Circulation Modelling on the Sphere

    NASA Technical Reports Server (NTRS)

    Engwirda, Darren

    2017-01-01

    An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
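As a side illustration of the primal/dual pairing the abstract relies on (the Voronoi diagram as the locally orthogonal staggered dual of a Delaunay triangulation), here is a minimal planar sketch using SciPy; it is not the JIGSAW-GEO algorithm, which works on the sphere with Frontal-Delaunay refinement:

```python
import numpy as np
from scipy.spatial import Delaunay, Voronoi

# For a generic point set, the Voronoi diagram is the orthogonal dual of the
# Delaunay triangulation: Voronoi vertices sit at triangle circumcentres, and
# each Voronoi edge crosses its corresponding Delaunay edge at a right angle.
rng = np.random.default_rng(1)
pts = rng.random((50, 2))
tri = Delaunay(pts)   # primal triangulation
vor = Voronoi(pts)    # staggered polygonal dual
print("triangles:", len(tri.simplices), "dual vertices:", len(vor.vertices))
```

C-grid-type finite-volume models stagger their variables between exactly such a primal mesh and its dual, which is why the orthogonality of the pairing matters.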

  4. JIGSAW-GEO (1.0): locally orthogonal staggered unstructured grid generation for general circulation modelling on the sphere

    NASA Astrophysics Data System (ADS)

    Engwirda, Darren

    2017-06-01

    An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.

  5. MODEL DEVELOPMENT AND APPLICATION FOR ASSESSING HUMAN EXPOSURE AND DOSE TO TOXIC CHEMICALS AND POLLUTANTS

    EPA Science Inventory

    This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment processes by developing state-of-the-art exposure-to-dose computational models. This research will produce physiologically-based pharmacokinetic (PBPK) and pharmacodynamic (PD)...

  6. Lumped Parameter Modeling for Rapid Vibration Response Prototyping and Test Correlation for Electronic Units

    NASA Technical Reports Server (NTRS)

    Van Dyke, Michael B.

    2013-01-01

    Present preliminary work using lumped parameter models to approximate dynamic response of electronic units to random vibration; Derive a general N-DOF model for application to electronic units; Illustrate parametric influence of model parameters; Implication of coupled dynamics for unit/board design; Demonstrate use of model to infer printed wiring board (PWB) dynamics from external chassis test measurement.

  7. Modeling Enclosure Design in Above-Grade Walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lstiburek, J.; Ueno, K.; Musunuru, S.

    2016-03-01

    Building Science Corporation modeled typically well-performing wall assemblies using Wärme und Feuchte instationär (WUFI) Version 5.3 software and demonstrated that these models agree with historic experience when calibrated and modeled correctly. This technical report provides a library of WUFI modeling input data and results. Within the limits of existing experience, this information can be generalized for applications to a broad population of houses.

  8. Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model

    NASA Astrophysics Data System (ADS)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2014-02-01

    Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. 
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
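The FORMIND application itself is far richer, but the core idea of a parametric likelihood approximation inside a conventional MCMC sampler can be sketched on a toy stochastic model; the simulator, summary statistic, and tuning constants below are all invented for illustration (in the spirit of Wood's synthetic likelihood, which the paper's approach resembles):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, n=50):
    """Stand-in stochastic simulator (a Poisson model here, not FORMIND)."""
    return rng.poisson(theta, size=n)

def synthetic_loglik(theta, obs_stat, n_rep=100):
    """Fit a normal distribution to the simulated summary statistic
    (here: the sample mean) and evaluate the observed statistic under it."""
    sims = np.array([simulate(theta).mean() for _ in range(n_rep)])
    mu, sd = sims.mean(), sims.std() + 1e-9
    return -0.5 * ((obs_stat - mu) / sd) ** 2 - np.log(sd)

true_theta = 4.0                       # known parameter to be retrieved
obs_stat = simulate(true_theta).mean() # "observed" summary statistic

# Plain Metropolis sampler over theta > 0 using the approximate likelihood.
theta, ll = 1.0, synthetic_loglik(1.0, obs_stat)
chain = []
for _ in range(2000):
    prop = abs(theta + rng.normal(0.0, 0.3))
    ll_prop = synthetic_loglik(prop, obs_stat)
    if np.log(rng.random()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    chain.append(theta)

print("posterior mean:", np.mean(chain[500:]))
```

The likelihood is re-estimated from fresh simulations at every proposal, so the data-generating stochasticity of the model itself, rather than an assumed error distribution, drives the inference.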

  9. Lessons learned in using IPE model for IPEEE study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guey, C.

    1995-12-31

    This paper summarizes lessons learned in applying the plant model developed in the Individual Plant Examination (IPE) to the IPE for External Events (IPEEE). Both core damage frequency and containment performance features are addressed. The IPE model applications are discussed for internal fires, hurricanes, and tornadoes. Areas in which the IPE model may be improved and general findings are described.

  10. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    USGS Publications Warehouse

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.
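As background to the bivariate, semiparametric model proposed here, the univariate zero-inflated Poisson building block it extends can be written down directly; this is a generic textbook sketch, not the authors' hierarchical Bayes model:

```python
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under a zero-inflated Poisson: with probability pi the count
    is a structural zero, otherwise it is drawn from Poisson(lam)."""
    pois = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi + (1 - pi) * pois if y == 0 else (1 - pi) * pois

# Zero inflation produces more zeros than a plain Poisson with the same rate:
print(zip_pmf(0, lam=2.0, pi=0.3))   # 0.3 + 0.7 * exp(-2) ≈ 0.3947
print(math.exp(-2.0))                # plain Poisson zero probability ≈ 0.1353
```

The bivariate extension couples two such counts (e.g. two species) and, in the paper, lets lam and pi depend on covariates through semiparametric hierarchical terms.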

  11. Application of Bayesian Networks to hindcast barrier island morphodynamics

    USGS Publications Warehouse

    Wilson, Kathleen E.; Adams, Peter N.; Hapke, Cheryl J.; Lentz, Erika E.; Brenner, Owen T.

    2015-01-01

    We refine a preliminary Bayesian Network by 1) increasing model experience through additional observations, 2) including anthropogenic modification history, and 3) replacing parameterized wave impact values with maximum run-up elevation. Further, we develop and train a pair of generalized models with an additional dataset encompassing a different storm event, which expands the observations beyond our hindcast objective. We compare the skill of the generalized models against the Nor'Ida specific model formulation, balancing the reduced skill with an expectation of increased transferability. Results of Nor'Ida hindcasts ranged in skill from 0.37 to 0.51 and accuracy of 65.0 to 81.9%.

  12. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Application of Model View Definition Attributes

    DTIC Science & Technology

    2013-06-01

    Building information exchange (COBie), Building Information Modeling (BIM)...to develop a life-cycle building model have resulted in the definition of a “core” building information model that contains general information de...develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the

  13. Development and application of damage assessment modeling: example assessment for the North Cape oil spill.

    PubMed

    McCay, Deborah French

    2003-01-01

    Natural resource damage assessment (NRDA) models for oil spills have been under development since 1984. Generally applicable (simplified) versions with built-in data sets are included in US government regulations for NRDAs in US waters. The most recent version of these models is SIMAP (Spill Impact Model Application Package), which contains oil fates and effects models that may be applied to any spill event and location in marine or freshwater environments. It is often not cost-effective or even possible to quantify spill impacts using field data collections. Modeling allows quantification of spill impacts using as much site-specific data as available, either as input or as validation of model results. SIMAP was used for the North Cape oil spill in Rhode Island (USA) in January 1996, for injury quantification in the first and largest NRDA case to be performed under the 1996 Oil Pollution Act NRDA regulations. The case was successfully settled in 1999. This paper, which contains a description of the model and application to the North Cape spill, delineates and demonstrates the approach.

  14. Probabilistic modeling of the indoor climates of residential buildings using EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buechler, Elizabeth D.; Pallin, Simon B.; Boudreaux, Philip R.

    The indoor air temperature and relative humidity in residential buildings significantly affect material moisture durability, HVAC system performance, and occupant comfort. Therefore, indoor climate data is generally required to define boundary conditions in numerical models that evaluate envelope durability and equipment performance. However, indoor climate data obtained from field studies is influenced by weather, occupant behavior and internal loads, and is generally unrepresentative of the residential building stock. Likewise, whole-building simulation models typically neglect stochastic variables and yield deterministic results that are applicable to only a single home in a specific climate. The

  15. Solar Occultation Retrieval Algorithm Development

    NASA Technical Reports Server (NTRS)

    Lumpe, Jerry D.

    2004-01-01

    This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms. Work to date includes initial development of generalized forward model algorithms capable of simulating transmission data from the POAM II/III and SAGE II/III instruments. Work in the 2nd quarter will focus on: completion of forward model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual level 1 instrument data for specific occultation events.

  16. Continuity-based model interfacing for plant-wide simulation: a general approach.

    PubMed

    Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A

    2006-08-01

    In plant-wide simulation studies of wastewater treatment facilities, often existing models from different origin need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking into account conservation principles. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined and particular issues when coupling models in which pH is considered as a state variable, are pointed out.

  17. GSTARS computer models and their applications, Part II: Applications

    USGS Publications Warehouse

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here with more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  18. On a class of integrals of Legendre polynomials with complicated arguments--with applications in electrostatics and biomolecular modeling.

    PubMed

    Yu, Yi-Kuo

    2003-08-15

    The exact analytical result for a class of integrals involving (associated) Legendre polynomials of complicated argument is presented. The method employed can in principle be generalized to integrals involving other special functions. This class of integrals also proves useful in the electrostatic problems in which dielectric spheres are involved, which is of importance in modeling the dynamics of biological macromolecules. In fact, with this solution, a more robust foundation is laid for the Generalized Born method in modeling the dynamics of biomolecules. © 2003 Elsevier B.V. All rights reserved.

  19. Probabilistic modeling of the indoor climates of residential buildings using EnergyPlus

    DOE PAGES

    Buechler, Elizabeth D.; Pallin, Simon B.; Boudreaux, Philip R.; ...

    2017-04-25

    The indoor air temperature and relative humidity in residential buildings significantly affect material moisture durability, HVAC system performance, and occupant comfort. Therefore, indoor climate data is generally required to define boundary conditions in numerical models that evaluate envelope durability and equipment performance. However, indoor climate data obtained from field studies is influenced by weather, occupant behavior and internal loads, and is generally unrepresentative of the residential building stock. Likewise, whole-building simulation models typically neglect stochastic variables and yield deterministic results that are applicable to only a single home in a specific climate. The

  20. Extending the Diffuse Layer Model of Surface Acidity Behavior: III. Estimating Bound Site Activity Coefficients

    EPA Science Inventory

    Although detailed thermodynamic analyses of the 2-pK diffuse layer surface complexation model generally specify bound site activity coefficients for the purpose of accounting for those non-ideal excess free energies contributing to bound site electrochemical potentials, in applic...

  1. A Quantitative Model for the Exchange Current of Porous Molybdenum Electrodes on Sodium Beta-Alumina in Sodium Vapor

    NASA Technical Reports Server (NTRS)

    Williams, R. M.; Ryan, M. A.; LeDuc, H.; Cortez, R. H.; Saipetch, C.; Shields, V.; Manatt, K.; Homer, M. L.

    1998-01-01

    This paper presents a model of the exchange current developed for porous molybdenum electrodes on sodium beta-alumina ceramics in low pressure sodium vapor, but which has general applicability to gas/porous metal electrodes on solid electrolytes.

  2. A Multidimensional Partial Credit Model with Associated Item and Test Statistics: An Application to Mixed-Format Tests

    ERIC Educational Resources Information Center

    Yao, Lihua; Schwarz, Richard D.

    2006-01-01

    Multidimensional item response theory (IRT) models have been proposed for better understanding the dimensional structure of data or to define diagnostic profiles of student learning. A compensatory multidimensional two-parameter partial credit model (M-2PPC) for constructed-response items is presented that is a generalization of those proposed to…

  3. Epistemology and the Curriculum: The Bio-Ecocultural Cognitive Spiral as Model.

    ERIC Educational Resources Information Center

    Gagne, Raymond C.

    Described is a curriculum theory and construction within an epistemological model whose specific purpose is to serve as a general guide to the Amerindian peoples (Indians and Inuit) in their search for solutions to the problem of cultural survival. The model is also meant to have universal application, especially where there are cultures in…

  4. An Application of Variational Theory to an Integrated Walrasian Model of Exchange, Consumption and Production

    NASA Astrophysics Data System (ADS)

    Donato, M. B.; Milasi, M.; Vitanza, C.

    2010-09-01

    An existence result of a Walrasian equilibrium for an integrated model of exchange, consumption and production is obtained. The equilibrium model is characterized in terms of a suitable generalized quasi-variational inequality; so the existence result comes from an original technique which takes into account tools of convex and set-valued analysis.

  5. The Convergence Model of Communication. Papers of the East-West Communication Institute, No. 18.

    ERIC Educational Resources Information Center

    Kincaid, D. Lawrence

    Expressing the need for a description of communication that is equally applicable to all the social sciences, this report develops a general model of the communication process based upon the principle of convergence as derived from basic information theory and cybernetics. It criticizes the linear, one-way models of communication that have…

  6. EVALUATION OF THE REAL-TIME AIR-QUALITY MODEL USING THE RAPS (REGIONAL AIR POLLUTION STUDY) DATA BASE. VOLUME 3. PROGRAM USER'S GUIDE

    EPA Science Inventory

    The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four volumes. Moreover, the tests are generally applicable to other model evaluation problems. Volu...

  7. EVALUATION OF THE REAL-TIME AIR-QUALITY MODEL USING THE RAPS (REGIONAL AIR POLLUTION STUDY) DATA BASE. VOLUME 4. EVALUATION GUIDE

    EPA Science Inventory

    The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four volumes. Moreover, the tests are generally applicable to other model evaluation problems. Volu...

  8. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  9. Investigation of traveler acceptance factors in short haul air carrier operations

    NASA Technical Reports Server (NTRS)

    Kuhlthau, A. R.; Jacobson, I. D.

    1972-01-01

    The development of a mathematical model for human reaction to variables involved in transportation systems is discussed. The techniques, activities, and results related to defining certain specific inputs to the model are presented. A general schematic diagram of the problem solution is developed. The application of the model to short haul air carrier operations is examined.

  10. Microengineering in cardiovascular research: new developments and translational applications.

    PubMed

    Chan, Juliana M; Wong, Keith H K; Richards, Arthur Mark; Drum, Chester L

    2015-04-01

    Microfluidic, cellular co-cultures that approximate macro-scale biology are important tools for refining the in vitro study of organ-level function and disease. In recent years, advances in technical fabrication and biological integration have provided new insights into biological phenomena, improved diagnostic measurements, and made major steps towards de novo tissue creation. Here we review applications of these technologies specific to the cardiovascular field, emphasizing three general categories of use: reductionist vascular models, tissue-engineered vascular models, and point-of-care diagnostics. With continued progress in the ability to purposefully control microscale environments, the detailed study of both primary and cultured cells may find new relevance in the general cardiovascular research community. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.

  11. Parameter Optimization of Pseudo-Rigid-Body Models of MRI-Actuated Catheters

    PubMed Central

    Greigarn, Tipakorn; Liu, Taoming; Çavuşoğlu, M. Cenk

    2016-01-01

    Simulation and control of a system containing compliant mechanisms such as cardiac catheters often incur high computational costs. One way to reduce the costs is to approximate the mechanisms with Pseudo-Rigid-Body Models (PRBMs). A PRBM generally consists of rigid links connected by spring-loaded revolute joints. The lengths of the rigid links and the stiffnesses of the springs are usually chosen to minimize the tip deflection differences between the PRBM and the compliant mechanism. In most applications, only the relationship between end load and tip deflection is considered. This is not applicable to MRI-actuated catheters, which are actuated by coils attached along the body. This paper generalizes PRBM parameter optimization to include loading and reference points along the body. PMID:28261009

  12. The Baldwin-Lomax model for separated and wake flows using the entropy envelope concept

    NASA Technical Reports Server (NTRS)

    Brock, J. S.; Ng, W. F.

    1992-01-01

    Implementation of the Baldwin-Lomax algebraic turbulence model is difficult and ambiguous within flows characterized by strong viscous-inviscid interactions and flow separations. A new method of implementation is proposed which uses an entropy envelope concept and is demonstrated to ensure the proper evaluation of modeling parameters. The method is simple, computationally fast, and applicable to both wake and boundary layer flows. The method is general, making it applicable to any turbulence model which requires the automated determination of the proper maxima of a vorticity-based function. The new method is evaluated within two test cases involving strong viscous-inviscid interaction.

  13. Application of Zebrafish Model to Environmental Toxicology.

    PubMed

    Komoike, Yuta; Matsuoka, Masato

    2016-01-01

    Recently, a tropical freshwater fish, the zebrafish, has been widely used as a model organism in various fields of life science worldwide. The zebrafish model has also been applied to environmental toxicology; however, in Japan, it has not yet become widely used. In this review, we will introduce the biological and historical backgrounds of zebrafish as an animal model and their breeding. We then present the current status of toxicological experiments using zebrafish that were treated with some important environmental contaminants, including cadmium, organic mercury, 2,3,7,8-tetrachlorodibenzo-p-dioxin, and tributyltin. Finally, the future possible application of genetically modified zebrafish to the study of environmental toxicology is discussed.

  14. Application of Mouse Models to Research in Hearing and Balance.

    PubMed

    Ohlemiller, Kevin K; Jones, Sherri M; Johnson, Kenneth R

    2016-12-01

    Laboratory mice (Mus musculus) have become the major model species for inner ear research. The major uses of mice include gene discovery, characterization, and confirmation. Every application of mice is founded on assumptions about what mice represent and how the information gained may be generalized. A host of successes support the continued use of mice to understand hearing and balance. Depending on the research question, however, some mouse models and research designs will be more appropriate than others. Here, we recount some of the history and successes of the use of mice in hearing and vestibular studies and offer guidelines to those considering how to apply mouse models.

  15. A dynamic subgrid scale model for Large Eddy Simulations based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric J.; Duraisamy, Karthik

    2017-11-01

    The development of reduced models for complex multiscale problems remains one of the principal challenges in computational physics. The optimal prediction framework of Chorin et al. [1], which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived reduced models of dynamical systems. Several promising models have emerged from the optimal prediction community and have found application in molecular dynamics and turbulent flows. In this work, a new M-Z-based closure model that addresses some of the deficiencies of existing methods is developed. The model is constructed by exploiting similarities between two levels of coarse-graining via the Germano identity of fluid mechanics and by assuming that memory effects have a finite temporal support. The appeal of the proposed model, which will be referred to as the 'dynamic-MZ-τ' model, is that it is parameter-free and has a structural form imposed by the mathematics of the coarse-graining process (rather than the phenomenological assumptions made by the modeler, such as in classical subgrid scale models). To promote the applicability of M-Z models in general, two procedures are presented to compute the resulting model form, helping to bypass the tedious error-prone algebra that has proven to be a hindrance to the construction of M-Z-based models for complex dynamical systems. While the new formulation is applicable to the solution of general partial differential equations, demonstrations are presented in the context of Large Eddy Simulation closures for the Burgers equation, decaying homogeneous turbulence, and turbulent channel flow. The performance of the model and validity of the underlying assumptions are investigated in detail.

  16. A new approach to modeling temperature-related mortality: Non-linear autoregressive models with exogenous input.

    PubMed

    Lee, Cameron C; Sheridan, Scott C

    2018-07-01

    Temperature-mortality relationships are nonlinear, time-lagged, and can vary depending on the time of year and geographic location, all of which limits the applicability of simple regression models in describing these associations. This research demonstrates the utility of an alternative method for modeling such complex relationships that has gained recent traction in other environmental fields: nonlinear autoregressive models with exogenous input (NARX models). All-cause mortality data and multiple temperature-based data sets were gathered from 41 different US cities, for the period 1975-2010, and subjected to ensemble NARX modeling. Models generally performed better in larger cities and during the winter season. Across the US, median absolute percentage errors were 10% (ranging from 4% to 15% in various cities), the average improvement in the r-squared over that of a simple persistence model was 17% (6-24%), and the hit rate for modeling spike days in mortality (>80th percentile) was 54% (34-71%). Mortality responded acutely to hot summer days, peaking at 0-2 days of lag before dropping precipitously, and there was an extended mortality response to cold winter days, peaking at 2-4 days of lag and dropping slowly and continuing for multiple weeks. Spring and autumn showed both of the aforementioned temperature-mortality relationships, but generally to a lesser magnitude than what was seen in summer or winter. When compared to distributed lag nonlinear models, NARX model output was nearly identical. These results highlight the applicability of NARX models for use in modeling complex and time-dependent relationships for various applications in epidemiology and environmental sciences. Copyright © 2018 Elsevier Inc. All rights reserved.
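NARX models pass lagged outputs and lagged exogenous inputs through a nonlinear map; the sketch below shows only how such a lagged design matrix is assembled, fitted with the linear (ARX) special case on a synthetic series. All series, lag orders, and coefficients are invented for illustration and bear no relation to the paper's mortality data:

```python
import numpy as np

def lag_matrix(y, x, ny, nx):
    """Rows are [y_{t-1}..y_{t-ny}, x_{t-1}..x_{t-nx}]; targets are y_t."""
    start = max(ny, nx)
    rows = [np.concatenate([y[t - ny:t][::-1], x[t - nx:t][::-1]])
            for t in range(start, len(y))]
    return np.array(rows), y[start:]

# Synthetic series: an "outcome" with persistence plus a lag-2 response to an
# exogenous driver (think mortality responding to temperature two days back).
rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

X, z = lag_matrix(y, x, ny=2, nx=3)
coef, *_ = np.linalg.lstsq(X, z, rcond=None)
print(np.round(coef, 2))  # the y_{t-1} and x_{t-2} coefficients dominate
```

A NARX model replaces the least-squares fit with a nonlinear learner (typically a neural network) over the same lagged regressors, which is what lets it capture the asymmetric, multi-day lag structures described above.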

  17. Relations between dissipated work and Rényi divergences in the generalized Gibbs ensemble

    NASA Astrophysics Data System (ADS)

    Wei, Bo-Bo

    2018-04-01

    In this work, we show that the dissipation in a many-body system under an arbitrary nonequilibrium process is related to the Rényi divergences between two states along the forward and reversed dynamics, under a very general family of initial conditions. This relation extends the known links between dissipated work and Rényi divergences to quantum systems with conserved quantities, whose equilibrium state is described by the generalized Gibbs ensemble, and it can be applied to protocols driving the system between integrable and chaotic regimes. We demonstrate these ideas on the one-dimensional transverse-field quantum Ising model and the Jaynes-Cummings model, both driven out of equilibrium.

  18. Efficiency of Finnish General Upper Secondary Schools: An Application of Stochastic Frontier Analysis with Panel Data

    ERIC Educational Resources Information Center

    Kirjavainen, Tanja

    2012-01-01

    Different stochastic frontier models for panel data are used to estimate education production functions and the efficiency of Finnish general upper secondary schools. Grades in the matriculation examination are used as an output and explained with the comprehensive school grade point average, parental socio-economic background, school resources,…

  19. Shifting attention from objective risk factors to patients' self-assessed health resources: a clinical model for general practice.

    PubMed

    Hollnagel, H; Malterud, K

    1995-12-01

    The study was designed to present and apply theoretical and empirical knowledge for the construction of a clinical model intended to shift the attention of the general practitioner from objective risk factors to self-assessed health resources in male and female patients. Selected theoretical models of personal health resources were reviewed, discussed, and analyzed, assessing existing theories according to their emphasis on self-assessed vs. doctor-assessed health resources, specific health resources vs. life and coping in general, abstract vs. clinically applicable theory, and whether a gender perspective was explicitly included. Relevant theoretical models of health and coping (salutogenesis, coping and social support, control/demand, locus of control, the health belief model, quality of life) and the perspective of the underprivileged Other (critical theory, feminist standpoint theory, the patient-centred clinical method) were presented and assessed. Components from Antonovsky's salutogenetic perspective and McWhinney's patient-centred clinical method, supported by gender perspectives, were integrated into the clinical model presented here. General practitioners are recommended to shift their attention from objective risk factors to self-assessed health resources by means of this clinical model. The relevance and feasibility of the model should be explored in empirical research.

  20. Differential renormalization-group generators for static and dynamic critical phenomena

    NASA Astrophysics Data System (ADS)

    Chang, T. S.; Vvedensky, D. D.; Nicoll, J. F.

    1992-09-01

    The derivation of differential renormalization-group (DRG) equations for applications to static and dynamic critical phenomena is reviewed. The DRG approach provides a self-contained closed-form representation of the Wilson renormalization group (RG) and should be viewed as complementary to the Callan-Symanzik equations used in field-theoretic approaches to the RG. The various forms of DRG equations are derived to illustrate the general mathematical structure of each approach and to point out the advantages and disadvantages for performing practical calculations. Otherwise, the review focuses upon the one-particle-irreducible DRG equations derived by Nicoll and Chang and by Chang, Nicoll, and Young; no attempt is made to provide a general treatise of critical phenomena. A few specific examples are included to illustrate the utility of the DRG approach: the large-n limit of the classical n-vector model (the spherical model), multi- or higher-order critical phenomena, and critical dynamics far from equilibrium. The large-n limit of the n-vector model is used to introduce the application of DRG equations to a well-known example, with exact solutions obtained for the nonlinear trajectories, generating functions for nonlinear scaling fields, and the equation of state. Trajectory integrals and nonlinear scaling fields within the framework of ɛ-expansions are then discussed for tricritical crossover, and briefly for certain aspects of multi- or higher-order critical points, including the derivation of the Helmholtz free energy and the equation of state. The discussion then turns to critical dynamics with a development of the path-integral formulation for general dynamic processes. This is followed by an application to a model far-from-equilibrium system that undergoes a phase transformation analogous to a second-order critical point, the Schlögl model for a chemical instability.

  1. Use of instrumental variables in the analysis of generalized linear models in the presence of unmeasured confounding with applications to epidemiological research.

    PubMed

    Johnston, K M; Gustafson, P; Levy, A R; Grootendorst, P

    2008-04-30

    A major, often unstated, concern of researchers carrying out epidemiological studies of medical therapy is the potential impact on validity if estimates of treatment effects are biased due to unmeasured confounders. One technique for obtaining consistent estimates of treatment effects in the presence of unmeasured confounders is instrumental variables analysis (IVA). This technique is well developed in the econometrics literature and is being increasingly used in epidemiological studies. However, the approach to IVA most commonly used in such studies is based on linear models, while many epidemiological applications make use of non-linear models, specifically generalized linear models (GLMs) such as logistic or Poisson regression. Here we present a simple method for applying IVA within the class of GLMs using the generalized method of moments approach. We explore some of the theoretical properties of the method and illustrate its use within both a simulation example and an epidemiological study where unmeasured confounding is suspected to be present. We estimate the effects of beta-blocker therapy on one-year all-cause mortality after an incident hospitalization for heart failure, in the absence of data describing disease severity, which is believed to be a confounder.
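The core idea of instrumental variables can be sketched in its simplest linear (two-stage least squares) form. The paper's actual contribution is the generalized-method-of-moments extension to GLMs; the simulation below is only an illustrative assumption showing why an instrument removes confounding bias.

```python
import numpy as np

# Linear IV sketch with a simulated unmeasured confounder u.
# z affects the treatment x but influences the outcome y only through x,
# which is what makes it a valid instrument.
rng = np.random.default_rng(1)
n = 50_000
u = rng.normal(size=n)                      # unmeasured confounder
z = rng.normal(size=n)                      # instrument
x = 0.8 * z + u + rng.normal(size=n)        # treatment, confounded by u
y = 2.0 * x + 3.0 * u + rng.normal(size=n)  # outcome; true effect = 2.0

beta_ols = (x @ y) / (x @ x)   # naive OLS slope: biased upward by u
beta_iv = (z @ y) / (z @ x)    # IV slope: cov(z, y) / cov(z, x), consistent
print(round(beta_ols, 2), round(beta_iv, 2))
```

The OLS estimate absorbs the confounder's effect, while the IV ratio recovers a value close to the true coefficient of 2.0.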

  2. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk assessment in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013, and three risk assessment models were used: the EPA inhalation risk assessment model, HQ = EC/RfC; the Singapore semi-quantitative risk assessment model, Risk = (HR × ER)^(1/2); and the occupational hazards risk assessment index, Index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method, and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model gave workshop risk levels for the dry method, wet method, and printing of 3.5 (high), 3.5 (high), and 2.8 (general), and position risk levels for pasting, burdening, unreeling, rolling, and assisting of 4 (high), 4 (high), 2.8 (general), 2.8 (general), and 2.8 (general). The occupational hazards risk assessment index method gave position risk indices for pasting, burdening, unreeling, rolling, and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle), and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk.
Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions, and can comprehensively and accurately evaluate the occupational health risk caused by DMF.
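The three formulas quoted in the Methods can be written out directly. The function names and example inputs below are illustrative, not taken from the study, and the exponent reading of the garbled index formula (2 raised to the health effect level and to the exposure ratio) is an assumption.

```python
# Hedged sketch of the three risk indices quoted in the abstract.

def epa_hazard_quotient(ec: float, rfc: float) -> float:
    """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
    return ec / rfc

def singapore_risk(hazard_rating: float, exposure_rating: float) -> float:
    """Singapore semi-quantitative model: Risk = (HR * ER)^(1/2)."""
    return (hazard_rating * exposure_rating) ** 0.5

def hazard_risk_index(health_effect_level: float, exposure_ratio: float,
                      operation_condition_level: float) -> float:
    """Index = 2^(health effect level) * 2^(exposure ratio) * condition level."""
    return (2 ** health_effect_level) * (2 ** exposure_ratio) * operation_condition_level

# Illustrative inputs only (not values from the study):
print(epa_hazard_quotient(2.0, 1.0))   # exposure twice the reference -> high risk
print(singapore_risk(4, 4))            # maps to the "high" band in the abstract
```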

  3. Stochastic Analysis and Probabilistic Downscaling of Soil Moisture

    NASA Astrophysics Data System (ADS)

    Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.

    2017-12-01

    Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. 
The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.

  4. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds.

    PubMed

    de Jong, Maarten; Chen, Wei; Notestine, Randy; Persson, Kristin; Ceder, Gerbrand; Jain, Anubhav; Asta, Mark; Gamst, Anthony

    2016-10-03

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials.

  5. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds

    PubMed Central

    de Jong, Maarten; Chen, Wei; Notestine, Randy; Persson, Kristin; Ceder, Gerbrand; Jain, Anubhav; Asta, Mark; Gamst, Anthony

    2016-01-01

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials. PMID:27694824

  6. A Statistical Learning Framework for Materials Science: Application to Elastic Moduli of k-nary Inorganic Polycrystalline Compounds

    DOE PAGES

    de Jong, Maarten; Chen, Wei; Notestine, Randy; ...

    2016-10-03

    Materials scientists increasingly employ machine or statistical learning (SL) techniques to accelerate materials discovery and design. Such pursuits benefit from pooling training data across, and thus being able to generalize predictions over, k-nary compounds of diverse chemistries and structures. This work presents a SL framework that addresses challenges in materials science applications, where datasets are diverse but of modest size, and extreme values are often of interest. Our advances include the application of power or Hölder means to construct descriptors that generalize over chemistry and crystal structure, and the incorporation of multivariate local regression within a gradient boosting framework. The approach is demonstrated by developing SL models to predict bulk and shear moduli (K and G, respectively) for polycrystalline inorganic compounds, using 1,940 compounds from a growing database of calculated elastic moduli for metals, semiconductors and insulators. The usefulness of the models is illustrated by screening for superhard materials.

  7. EEG-Based Quantification of Cortical Current Density and Dynamic Causal Connectivity Generalized across Subjects Performing BCI-Monitored Cognitive Tasks

    PubMed Central

    Courellis, Hristos; Mullen, Tim; Poizner, Howard; Cauwenberghs, Gert; Iversen, John R.

    2017-01-01

    Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a “reach/saccade to spatial target” cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interaction among the identified ROIs using the Short-time direct Directed Transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications, such as diagnostics and BCI. PMID:28566997

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell, Kathryn, E-mail: kfarrell@ices.utexas.edu; Oden, J. Tinsley, E-mail: oden@ices.utexas.edu; Faghihi, Danial, E-mail: danial@ices.utexas.edu

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.

  9. Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; Callahan, John O.

    Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.

  10. An Estimating Equations Approach for the LISCOMP Model.

    ERIC Educational Resources Information Center

    Reboussin, Beth A.; Liang, Kung-Lee

    1998-01-01

    A quadratic estimating equations approach for the LISCOMP model is proposed that only requires specification of the first two moments. This method is compared with a three-stage generalized least squares approach through a numerical study and application to a study of life events and neurotic illness. (SLD)

  11. Improving software maintenance through measurement

    NASA Technical Reports Server (NTRS)

    Rombach, H. Dieter; Ulery, Bradford T.

    1989-01-01

    A practical approach to improving software maintenance through measurements is presented. This approach is based on general models for measurement and improvement. Both models, their integration, and practical guidelines for transferring them into industrial maintenance settings are presented. Several examples of applications of the approach to real-world maintenance environments are discussed.

  12. A Storytelling Learning Model for Legal Education

    ERIC Educational Resources Information Center

    Capuano, Nicola; De Maio, Carmen; Gaeta, Angelo; Mangione, Giuseppina Rita; Salerno, Saverio; Fratesi, Eleonora

    2014-01-01

    The purpose of this paper is to describe a learning model based on "Storytelling" and its application in the context of legal education helping build challenging training resources that explain, to common citizens with little or no background about legal topics, concepts related to "Legal Mediation" in general and in specific…

  13. Amnesic patients show superior generalization in category learning.

    PubMed

    O'Connell, Garret; Myers, Catherine E; Hopkins, Ramona O; McLaren, R P; Gluck, Mark A; Wills, Andy J

    2016-11-01

    Generalization is the application of existing knowledge to novel situations. Questions remain about the precise role of the hippocampus in this facet of learning, but a connectionist model by Gluck and Myers (1993) predicts that generalization should be enhanced following hippocampal damage. In a two-category learning task, a group of amnesic patients (n = 9) learned the training items to a similar level of accuracy as matched controls (n = 9). Both groups then classified new items at various levels of distortion. The amnesic group showed significantly more accurate generalization to high-distortion novel items, a difference also present compared to a larger group of unmatched controls (n = 33). The model prediction of a broadening of generalization gradients in amnesia, at least for items near category boundaries, was supported by the results. Our study shows for the first time that amnesia can sometimes improve generalization.

  14. Self-Ordered Titanium Dioxide Nanotube Arrays: Anodic Synthesis and Their Photo/Electro-Catalytic Applications

    PubMed Central

    Smith, York R.; Ray, Rupashree S.; Carlson, Krista; Sarma, Biplab; Misra, Mano

    2013-01-01

    Metal oxide nanotubes, in particular self-organized titania nanotube arrays synthesized by electrochemical anodization, have become widely investigated materials. Although this material has a wide gamut of applications, the majority of published literature focuses on its solar-based applications. This review summarizes some of the recent advances made using metal oxide nanotube arrays formed via anodization in solar-based applications. A general methodology for theoretical modeling of titania surfaces in solar applications is also presented. PMID:28811415

  15. Magic cards: a new augmented-reality approach.

    PubMed

    Demuynck, Olivier; Menendez, José Manuel

    2013-01-01

    Augmented reality (AR) commonly uses markers for detection and tracking. Such multimedia applications associate each marker with a virtual 3D model stored in the memory of the camera-equipped device running the application. Application users are limited in their interactions, which require knowing how to design and program 3D objects. This generally prevents them from developing their own entertainment AR applications. The Magic Cards application solves this problem by offering an easy way to create and manage an unlimited number of virtual objects that are encoded on special markers.

  16. Widening access? Characteristics of applicants to medical and dental schools, compared with UCAS.

    PubMed

    Gallagher, J E; Niven, V; Donaldson, N; Wilson, N H F

    2009-11-14

    The aim of this paper is to compare the demography (age, sex, ethnicity, social status) and academic experience (school type, tariff scores) of focused and successful applicants to preclinical dentistry with preclinical medicine, and with higher education in general in the UK. Retrospective analyses of anonymised University and College Admissions Services (UCAS) data for focused applicants whose preferred subject was preclinical dentistry or medicine, and accepted (successful) applicants to the same programmes in 2006. These data were compared with publicly available data on applicants and accepted applicants through UCAS. Information for each medical, dental and general UCAS applicant included age, sex, ethnicity, socio-economic group, region, school type and tariff score. Logistic regression was used to model the probability of being accepted in relation to all explanatory variables and interactions. In total there were 2,577 focused applicants to dentistry; 1,114 applicants were accepted, 4% (n = 46) of whom did not have it as their preferred subject choice. There were seven times as many focused applicants for medicine (18,943) when compared with dentistry; 8,011 applicants were accepted, 2.7% of whom did not have medicine as their preferred subject choice (n = 218). Just over half of the applicants to dentistry were from minority ethnic backgrounds (50.5%), exceeding medicine (29.5%), and higher education in general (19%). The proportion of female applicants was similar across all three groups at around 55%. Only one fifth (21%) of focused applicants to dentistry were mature compared with one third (33%) to medicine and one quarter (25.5%) of all UCAS applicants. Greater proportions of applicants to medicine (25.8%) and dentistry (23.5%) were from upper socio-economic backgrounds, compared with higher education in general (15.5%). 
When all other factors are controlled, the odds of being accepted for medicine, and for dentistry, are lower if mature, male, from a lower social class, from a minority ethnic group and have attended a further/higher education college. Focused and successful applicants for preclinical medicine and dentistry are more likely to be from higher social classes and a minority ethnic background than applicants to higher education in general. Dentistry attracts twice the level of Asian applicants as medicine and four times that of universities in general. Controlling for other factors, there is evidence that gender, ethnicity, maturity, and school type are associated with probability of acceptance for medicine and dentistry. Higher social status is particularly associated with acceptance for medicine. The implications of these findings are discussed in terms of widening access and social justice.

  17. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

    One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) to the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u that a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds. 
Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State.
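The peaks-over-threshold step underlying the methods compared above can be sketched as follows: fix a candidate threshold u, keep the excesses above it, and fit a GPD. The threshold choice (a simple percentile) and the synthetic data are illustrative assumptions; the paper compares far more careful selection methods.

```python
import numpy as np

# Toy daily rainfall: exponential data, for which excesses above any
# threshold are again exponential (a GPD with shape 0, scale 10).
rng = np.random.default_rng(42)
rain = rng.exponential(scale=10.0, size=20_000)

u = np.quantile(rain, 0.95)      # candidate threshold (95th percentile)
excesses = rain[rain > u] - u    # rainfall excesses above u

# Method-of-moments GPD estimators: with mean m and variance v of the
# excesses, shape xi = (1 - m^2 / v) / 2 and scale sigma = m * (1 - xi).
m, v = excesses.mean(), excesses.var()
xi = 0.5 * (1.0 - m * m / v)
sigma = m * (1.0 - xi)
print(round(xi, 2), round(sigma, 1))  # expect xi near 0, sigma near 10
```

Threshold-detection methods of the kind reviewed in the record would repeat this fit over a range of u values and examine the stability of xi and the modified scale.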

  18. A Quasi-Linear Behavioral Model and an Application to Self-Directed Learning

    NASA Technical Reports Server (NTRS)

    Ponton, Michael K.; Carr, Paul B.

    1999-01-01

    A model is presented that describes the relationship between one's knowledge of the world and the concomitant personal behaviors that serve as a mechanism to obtain desired outcomes. Integrated within this model are the differing roles that outcomes serve as motivators and as modifiers to one's worldview. The model is dichotomized between general and contextual applications. Because learner self-directedness (a personal characteristic) involves cognition and affection while self-directed learning (a pedagogic process) encompasses conation, behavior and introspection, the model can be dichotomized again in another direction. Presented also are the roles that cognitive motivation theories play in moving an individual through this behavioral model and the roles of wishes, self-efficacy, opportunity and self-influence.

  19. Generalized Beer-Lambert model for near-infrared light propagation in thick biological tissues

    NASA Astrophysics Data System (ADS)

    Bhatt, Manish; Ayyalasomayajula, Kalyan R.; Yalavarthy, Phaneendra K.

    2016-07-01

    The attenuation of near-infrared (NIR) light intensity as it propagates in a turbid medium such as biological tissue is described by the modified Beer-Lambert law (MBLL). The MBLL is generally used to quantify changes in tissue chromophore concentrations in NIR spectroscopic data analysis. Even though the MBLL is effective for qualitative comparison, its applicability suffers across tissue types and tissue dimensions. In this work, we introduce Lambert-W function-based modeling of light propagation in biological tissues, a generalized version of the Beer-Lambert model. The proposed model parametrizes tissue properties through two attenuation coefficients, μ0 and η. We validated our model against Monte Carlo simulation, the gold standard for modeling NIR light propagation in biological tissue. We included numerous human and animal tissues to validate the proposed empirical model, including an inhomogeneous adult human head model. The proposed model, which has a closed (analytical) form, is the first of its kind to provide accurate modeling of NIR light propagation in biological tissues.
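The standard MBLL computation that this work generalizes can be sketched for a single chromophore: an attenuation change is converted to a concentration change via the extinction coefficient, source-detector distance, and differential pathlength factor (DPF). All numeric values below are hypothetical, for illustration only.

```python
import math

def mbll_delta_conc(i0: float, i: float, epsilon: float,
                    distance_cm: float, dpf: float) -> float:
    """Single-chromophore MBLL: delta_A = log10(I0 / I) = epsilon * delta_c * d * DPF,
    solved for the concentration change delta_c."""
    delta_a = math.log10(i0 / i)          # change in optical density
    return delta_a / (epsilon * distance_cm * dpf)

# Hypothetical measurement: intensity drops from 1.0 to 0.8 at a detector
# 3 cm from the source, with illustrative epsilon and DPF values.
dc = mbll_delta_conc(i0=1.0, i=0.8, epsilon=2.0, distance_cm=3.0, dpf=6.0)
print(round(dc, 4))
```

The Lambert-W generalization in the record replaces this linear relation with a closed-form expression parametrized by the two attenuation coefficients μ0 and η.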

  20. Updraft Fixed Bed Gasification Aspen Plus Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2007-09-27

    The updraft fixed bed gasification model provides predictive modeling capabilities for updraft fixed bed gasifiers, when devolatilization data is available. The fixed bed model is constructed using Aspen Plus, process modeling software, coupled with a FORTRAN user kinetic subroutine. Current updraft gasification models created in Aspen Plus have limited predictive capabilities and must be "tuned" to reflect a generalized gas composition as specified in literature or by the gasifier manufacturer. This limits the applicability of the process model.

  1. Online Survey, Enrollment, and Examination: Special Internet Applications in Teacher Education.

    ERIC Educational Resources Information Center

    Tu, Jho-Ju; Babione, Carolyn; Chen, Hsin-Chu

    The Teachers College at Emporia State University in Kansas is now utilizing World Wide Web technology for automating the application procedure for student teaching. The general concepts and some of the key terms that are important for understanding the process involved in this project include: a client-server model, HyperText Markup Language,…

  2. Application of the generalized vertical coordinate ocean model for better representing satellite data

    NASA Technical Reports Server (NTRS)

    Song, Y. T.

    2002-01-01

It is found that two adaptive parametric functions can be introduced into the basic ocean equations for utilizing the optimal or hybrid features of commonly used z-level, terrain-following, isopycnal, and pressure coordinates in numerical ocean models. The two parametric functions are formulated by combining three techniques: the arbitrary vertical coordinate system of Kasahara (1974), the Jacobian pressure gradient formulation of Song (1998), and a newly developed metric factor that permits both compressible (non-Boussinesq) and incompressible (Boussinesq) approximations. Based on the new formulation, an adaptive modeling strategy is proposed and a staggered finite volume method is designed to ensure conservation of important physical properties and numerical accuracy. Implementation of the combined techniques in SCRUM (Song and Haidvogel 1994) shows that the adaptive modeling strategy can be applied to any existing ocean model without incurring computational expense or altering the original numerical schemes. Such a generalized coordinate model is expected to benefit diverse ocean modelers by allowing them to easily choose optimal vertical structures and share modeling resources on a common model platform. Several representative oceanographic problems with different scales and characteristics, such as coastal canyons, basin-scale circulation, and global ocean circulation, are used to demonstrate the model's capability for multiple applications. New results show that the model is capable of simultaneously resolving both Boussinesq and non-Boussinesq, and both small- and large-scale, processes. This talk will focus on applications of multiple satellite sensing datasets in eddy-resolving simulations of the Asian marginal seas and the Kuroshio. Attention will be given to how TOPEX/Poseidon SSH, TRMM SST, and GRACE ocean bottom pressure can be correctly represented in a non-Boussinesq model.

  3. Systems concept for speech technology application in general aviation

    NASA Technical Reports Server (NTRS)

    North, R. A.; Bergeron, H.

    1984-01-01

The application potential of voice recognition and synthesis circuits for general aviation, single-pilot IFR (SPIFR) situations is examined. The viewpoint of the pilot was central to the workload analyses and to assessment of the effectiveness of the voice systems. A twin-engine, high-performance general aviation aircraft on a cross-country fixed route was employed as the study model. No actual control movements were considered, and other possible functions were scored by three IFR-rated instructors. The voice systems were concluded to be helpful in alleviating visual and manual workloads during take-off, approach, and landing, particularly for data retrieval and entry tasks. Voice synthesis was an aid in alerting a pilot to in-flight problems. It is expected that usable systems will be available within 5 years.

  4. Generalized filtering of laser fields in optimal control theory: application to symmetry filtering of quantum gate operations

    NASA Astrophysics Data System (ADS)

    Schröder, Markus; Brown, Alex

    2009-10-01

We present a modified version of a previously published algorithm (Gollub et al 2008 Phys. Rev. Lett. 101 073002) for obtaining an optimized laser field with more general restrictions on the search space of the optimal field. The modification leads to enforcement of the constraints on the optimal field while maintaining good convergence behaviour in most cases. We demonstrate the general applicability of the algorithm by imposing constraints on the temporal symmetry of the optimal fields. The temporal symmetry is used to reduce the number of transitions that have to be optimized for quantum gate operations that involve inversion (NOT gate) or partial inversion (Hadamard gate) of the qubits in a three-dimensional model of ammonia.

  5. Telerobotic system performance measurement - Motivation and methods

    NASA Technical Reports Server (NTRS)

    Kondraske, George V.; Khoury, George J.

    1992-01-01

A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network is presented, along with initial efforts to implement the strategy. Consideration is given to general systems performance theory (GSPT), originally developed to tackle human performance problems, as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and characterization of the performance of the subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of the distributed telerobotics network testbed. Insight into the design of test protocols that elicit application-independent data is provided.

  6. A matlab framework for estimation of NLME models using stochastic differential equations: applications for estimation of insulin secretion rates.

    PubMed

    Mortensen, Stig B; Klim, Søren; Dammann, Bernd; Kristensen, Niels R; Madsen, Henrik; Overgaard, Rune V

    2007-10-01

The non-linear mixed-effects model based on stochastic differential equations (SDEs) provides an attractive residual error model that is able to handle serially correlated residuals typically arising from structural mis-specification of the true underlying model. The use of SDEs also opens up new tools for model development and easily allows for tracking of unknown inputs and parameters over time. An algorithm for maximum likelihood estimation of the model has earlier been proposed, and the present paper presents the first general implementation of this algorithm. The implementation is done in Matlab and also demonstrates the use of parallel computing for improved estimation times. The use of the implementation is illustrated by two applications which focus on the ability of the model to estimate unknown inputs, facilitated by the extension to SDEs. The first application is a deconvolution-type estimation of the insulin secretion rate based on a linear two-compartment model for C-peptide measurements. In the second application the model is extended to also give an estimate of the time-varying liver extraction based on both C-peptide and insulin measurements.
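
The core idea of tracking an unknown input by augmenting the state can be sketched far more simply than the paper's Matlab framework: below, a one-compartment stand-in (an assumption; the paper uses a two-compartment C-peptide model) treats the unknown secretion rate as a random walk and recovers it with a discrete Kalman filter.

```python
import numpy as np

# State x = [C, S]: concentration C with elimination rate k, and an unknown
# input S modeled as a random walk. Simplified illustration of input tracking,
# not the paper's NLME/SDE estimator.
def kalman_input_tracking(y, dt, k, q_s, r):
    A = np.array([[1.0 - k * dt, dt],
                  [0.0,          1.0]])      # Euler-discretized dynamics
    Q = np.diag([1e-8, q_s])                 # process noise drives S only
    H = np.array([[1.0, 0.0]])               # we observe C only
    x = np.zeros(2)
    P = np.eye(2)
    est = []
    for yk in y:
        x = A @ x                            # predict
        P = A @ P @ A.T + Q
        S_ = H @ P @ H.T + r                 # innovation variance
        K = (P @ H.T) / S_                   # Kalman gain
        x = x + K[:, 0] * (yk - H @ x)       # update
        P = (np.eye(2) - K @ H) @ P
        est.append(x.copy())
    return np.array(est)

rng = np.random.default_rng(0)
dt, k = 0.1, 0.5
t = np.arange(0.0, 20.0, dt)
S_true = 1.0 + 0.5 * np.sin(0.5 * t)         # "unknown" secretion input
C = np.zeros_like(t)
for i in range(1, len(t)):
    C[i] = C[i - 1] + dt * (S_true[i - 1] - k * C[i - 1])
y = C + 0.01 * rng.standard_normal(len(t))   # noisy concentration data
est = kalman_input_tracking(y, dt, k, q_s=1e-3, r=1e-4)
print(est[-1, 1])                            # final estimate of the unknown input
```

The SDE-based residual model in the paper plays an analogous role: system noise absorbs structural misspecification, and smoothing the augmented state yields the unknown input over time.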

  7. CAD-CAM database management at Bendix Kansas City

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.R.

    1985-05-01

The Bendix Kansas City Division of Allied Corporation began integrating mechanical CAD-CAM capabilities into its operations in June 1980. The primary capabilities include a wireframe modeling application, a solid modeling application, and the Bendix Integrated Computer Aided Manufacturing (BICAM) System application, a set of software programs and procedures which provides user-friendly access to graphic applications and data, and user-friendly sharing of data between applications and users. BICAM also provides for enforcement of corporate/enterprise policies. Three access categories, private, local, and global, are realized through the implementation of data-management metaphors: the desk, reading rack, file cabinet, and library are for the storage, retrieval, and sharing of drawings and models. Access is provided through menu selections; searching for designs is done by a paging method or a search-by-attribute-value method. The sharing of designs between all users of Part Data is key. The BICAM System supports 375 unique users per quarter and manages over 7500 drawings and models. The BICAM System demonstrates the need for generalized models, a high-level system framework, prototyping, information-modeling methods, and an understanding of the entire enterprise. Future BICAM System implementations are planned to take advantage of this knowledge.

  8. Parallel plan execution with self-processing networks

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.

    1989-01-01

    A critical issue for space operations is how to develop and apply advanced automation techniques to reduce the cost and complexity of working in space. In this context, it is important to examine how recent advances in self-processing networks can be applied for planning and scheduling tasks. For this reason, the feasibility of applying self-processing network models to a variety of planning and control problems relevant to spacecraft activities is being explored. Goals are to demonstrate that self-processing methods are applicable to these problems, and that MIRRORS/II, a general purpose software environment for implementing self-processing models, is sufficiently robust to support development of a wide range of application prototypes. Using MIRRORS/II and marker passing modelling techniques, a model of the execution of a Spaceworld plan was implemented. This is a simplified model of the Voyager spacecraft which photographed Jupiter, Saturn, and their satellites. It is shown that plan execution, a task usually solved using traditional artificial intelligence (AI) techniques, can be accomplished using a self-processing network. The fact that self-processing networks were applied to other space-related tasks, in addition to the one discussed here, demonstrates the general applicability of this approach to planning and control problems relevant to spacecraft activities. It is also demonstrated that MIRRORS/II is a powerful environment for the development and evaluation of self-processing systems.

  9. Variable selection for marginal longitudinal generalized linear models.

    PubMed

    Cantoni, Eva; Flemming, Joanna Mills; Ronchetti, Elvezio

    2005-06-01

Variable selection is an essential part of any statistical analysis and yet has been somewhat neglected in the context of longitudinal data analysis. In this article, we propose a generalized version of Mallows's C(p) (GC(p)) suitable for use with both parametric and nonparametric models. GC(p) provides an estimate of a measure of a model's adequacy for prediction. We examine its performance with popular marginal longitudinal models (fitted using GEE) and contrast results with what is typically done in practice: variable selection based on Wald-type or score-type tests. An application to real data further demonstrates the merits of our approach while at the same time emphasizing some important robust features inherent to GC(p).
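
As background for GC(p), the classical Mallows's C(p) it generalizes can be computed in a few lines for OLS submodels (the paper's GC(p) extends this idea to marginal longitudinal models fitted by GEE; the sketch below is only the classical version):

```python
import numpy as np

def mallows_cp(X_sub, X_full, y):
    """Classical Mallows's Cp for an OLS submodel:
    Cp = RSS_sub / sigma2_full - n + 2*p, with sigma2 from the full model."""
    n = len(y)
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return r @ r
    sigma2 = rss(X_full) / (n - X_full.shape[1])
    p = X_sub.shape[1]
    return rss(X_sub) / sigma2 - n + 2 * p

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])
y = X[:, :3] @ np.array([1.0, 2.0, -1.0]) + rng.standard_normal(n)  # last column irrelevant
print(mallows_cp(X[:, :3], X, y))  # should be close to p = 3 in expectation
```

A submodel that omits a truly relevant predictor yields a much larger Cp, while the full model's Cp equals its own parameter count by construction; well-chosen submodels score near p.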

  10. Dimensional Reduction for the General Markov Model on Phylogenetic Trees.

    PubMed

    Sumner, Jeremy G

    2017-03-01

    We present a method of dimensional reduction for the general Markov model of sequence evolution on a phylogenetic tree. We show that taking certain linear combinations of the associated random variables (site pattern counts) reduces the dimensionality of the model from exponential in the number of extant taxa, to quadratic in the number of taxa, while retaining the ability to statistically identify phylogenetic divergence events. A key feature is the identification of an invariant subspace which depends only bilinearly on the model parameters, in contrast to the usual multi-linear dependence in the full space. We discuss potential applications including the computation of split (edge) weights on phylogenetic trees from observed sequence data.

  11. Application of Climate Assessment Tool (CAT) to estimate climate variability impacts on nutrient loading from local watersheds

    Treesearch

    Ying Ouyang; Prem B. Parajuli; Gary Feng; Theodor D. Leininger; Yongshan Wan; Padmanava Dash

    2018-01-01

    A vast amount of future climate scenario datasets, created by climate models such as general circulation models (GCMs), have been used in conjunction with watershed models to project future climate variability impact on hydrological processes and water quality. However, these low spatial-temporal resolution datasets are often difficult to downscale spatially and...

  12. Economic Planning for Multicounty Rural Areas: Application of a Linear Programming Model in Northwest Arkansas. Technical Bulletin No. 1653.

    ERIC Educational Resources Information Center

    Williams, Daniel G.

    Planners in multicounty rural areas can use the Rural Development, Activity Analysis Planning (RDAAP) model to try to influence the optimal growth of their areas among different general economic goals. The model implies that best industries for rural areas have: high proportion of imported inputs; low transportation costs; high value added/output…

  13. Building out a Measurement Model to Incorporate Complexities of Testing in the Language Domain

    ERIC Educational Resources Information Center

    Wilson, Mark; Moore, Stephen

    2011-01-01

    This paper provides a summary of a novel and integrated way to think about the item response models (most often used in measurement applications in social science areas such as psychology, education, and especially testing of various kinds) from the viewpoint of the statistical theory of generalized linear and nonlinear mixed models. In addition,…

  14. Parsimonious Structural Equation Models for Repeated Measures Data, with Application to the Study of Consumer Preferences

    ERIC Educational Resources Information Center

    Elrod, Terry; Haubl, Gerald; Tipps, Steven W.

    2012-01-01

    Recent research reflects a growing awareness of the value of using structural equation models to analyze repeated measures data. However, such data, particularly in the presence of covariates, often lead to models that either fit the data poorly, are exceedingly general and hard to interpret, or are specified in a manner that is highly data…

  15. Information visualisation based on graph models

    NASA Astrophysics Data System (ADS)

    Kasyanov, V. N.; Kasyanova, E. V.

    2013-05-01

Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the basis of hierarchical graph models.

  16. Application of the coastal generalized ecosystem model (CGEM) to assess the impacts of a potential future climate scenario on northern Gulf of Mexico hypoxia

    EPA Science Inventory

    Mechanistic hypoxia models for the northern Gulf of Mexico are being used to guide policy goals for Mississippi River nutrient loading reductions. However, to date, these models have not examined the effects of both nutrient loads and future climate. Here, we simulate a future c...

  17. Mathematical modeling of high-pH chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.; Lake, L.W.; Pope, G.A.

    1990-05-01

    This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.

  18. A general rough-surface inversion algorithm: Theory and application to SAR data

    NASA Technical Reports Server (NTRS)

    Moghaddam, M.

    1993-01-01

Rough-surface inversion has significant applications in interpretation of SAR data obtained over bare soil surfaces and agricultural lands. Due to the sparsity of data and the large pixel size in SAR applications, it is not feasible to carry out inversions based on numerical scattering models. The alternative is to use parameter estimation techniques based on approximate analytical or empirical models. Hence, there are two issues to be addressed, namely, what model to choose and what estimation algorithm to apply. Here, a small perturbation model (SPM) is used to express the backscattering coefficients of the rough surface in terms of three surface parameters. The algorithm used to estimate these parameters is based on a nonlinear least-squares criterion. The least-squares optimization methods are widely used in estimation theory, but the distinguishing factor for SAR applications is incorporating the stochastic nature of both the unknown parameters and the data into the formulation, which will be discussed in detail. The algorithm is tested with synthetic data, and several Newton-type least-squares minimization methods are discussed to compare their convergence characteristics. Finally, the algorithm is applied to multifrequency polarimetric SAR data obtained over some bare soil and agricultural fields. Results will be shown and compared to ground-truth measurements obtained from these areas. The strength of this general approach to inversion of SAR data is that it can be easily modified for use with any scattering model without changing any of the inversion steps. Note also that, for the same reason, it is not limited to inversion of rough surfaces, and can be applied to any parameterized scattering process.
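
The estimation step described above, fitting a parameterized forward model to backscatter data by nonlinear least squares, can be sketched with a toy forward model (the function below is NOT the small perturbation model, just an illustrative parameterized curve):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy inversion: recover "surface parameters" from synthetic backscatter.
# forward() is an assumed illustrative model, not the SPM from the paper.
def forward(theta, s, l, eps):
    return s * np.exp(-l * theta**2) + eps * np.cos(theta)

incidence = np.linspace(0.2, 1.0, 15)      # incidence angles (rad)
true = np.array([0.8, 2.0, 0.1])
data = forward(incidence, *true)           # noise-free synthetic data

def residuals(p):
    return forward(incidence, *p) - data

fit = least_squares(residuals, x0=[0.5, 1.0, 0.0], method="lm")
print(fit.x)   # ≈ [0.8, 2.0, 0.1]
```

Because only `forward()` encodes the physics, swapping in a different scattering model leaves the inversion machinery untouched, which is exactly the modularity the abstract emphasizes.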

  19. CityGML - Interoperable semantic 3D city models

    NASA Astrophysics Data System (ADS)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantical aspects of 3D city models, its structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into use worldwide: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML is also having a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research.
Furthermore, its relationship to other standards from the fields of computer graphics and computer-aided architectural design and to the prospective INSPIRE model is discussed, as well as the impact CityGML has had and is having on the software industry, on applications of 3D city models, and on science generally.

  20. The Generalized Born solvation model: What is it?

    NASA Astrophysics Data System (ADS)

    Onufriev, Alexey

    2004-03-01

Implicit solvation models provide, for many applications, an effective way of describing the electrostatic effects of aqueous solvation. Here we outline the main approximations behind the popular Generalized Born solvation model. We show how its accuracy, relative to the Poisson-Boltzmann treatment, can be significantly improved in a computationally inexpensive manner to make the model useful in the studies of large-scale conformational transitions at the atomic level. The improved model is tested in a molecular dynamics simulation of folding of a 46-residue (three helix bundle) protein. Starting from an extended structure at 450 K, the protein folds to the lowest energy conformation within 6 ns of simulation time, and the predicted structure differs from the native one by 2.4 Å (backbone RMSD).
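
The model the abstract outlines is conventionally written with Still et al.'s pairwise expression, ΔG ≈ -½(1 - 1/ε_w)·Σ q_i q_j / f_GB, with f_GB = sqrt(r² + R_i R_j exp(-r²/(4 R_i R_j))). A minimal sketch (the unit conversion 332.0636 kcal·Å/(mol·e²) and ε_w = 80 are conventional choices, and effective Born radii are taken as given inputs):

```python
import numpy as np

def gb_energy(q, pos, born_radii, eps_w=80.0):
    """Generalized Born solvation energy via Still et al.'s pairwise formula.
    Charges in e, positions and effective Born radii in angstroms;
    returns kcal/mol. The i == j terms recover the Born self-energies."""
    E = 0.0
    n = len(q)
    for i in range(n):
        for j in range(n):
            r2 = np.sum((pos[i] - pos[j])**2)
            RiRj = born_radii[i] * born_radii[j]
            f_gb = np.sqrt(r2 + RiRj * np.exp(-r2 / (4.0 * RiRj)))
            E += q[i] * q[j] / f_gb
    return -0.5 * (1.0 - 1.0 / eps_w) * 332.0636 * E

# Single unit charge with Born radius 2 A: recovers the Born ion formula
q = np.array([1.0])
pos = np.array([[0.0, 0.0, 0.0]])
R = np.array([2.0])
print(gb_energy(q, pos, R))   # ≈ -82 kcal/mol
```

The interpolation in f_GB is the key approximation: it smoothly bridges the Coulomb limit (large separation, f_GB → r) and the Born self-energy limit (r → 0, f_GB → sqrt(R_i R_j)). The accuracy improvements the abstract refers to concern how the effective radii themselves are computed.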

  1. The applicability of turbulence models to aerodynamic and propulsion flowfields at McDonnell-Douglas Aerospace

    NASA Technical Reports Server (NTRS)

    Kral, Linda D.; Ladd, John A.; Mani, Mori

    1995-01-01

The objective of this viewgraph presentation is to evaluate turbulence models for integrated aircraft components such as the forebody, wing, inlet, diffuser, nozzle, and afterbody. The one-equation models have replaced the algebraic models as the baseline turbulence models. The Spalart-Allmaras one-equation model consistently performs better than the Baldwin-Barth model, particularly in the log layer and in free shear layers. Also, the Spalart-Allmaras model is not grid dependent like the Baldwin-Barth model. No general turbulence model exists for all engineering applications. The Spalart-Allmaras one-equation model and the Chien k-epsilon model are the preferred turbulence models. Although the two-equation models often better predict the flow field, they may take from two to five times the CPU time. Future directions include further benchmarking of the Menter blended k-omega/k-epsilon model and algorithmic improvements to reduce the CPU time of the two-equation models.

  2. General three-state model with biased population replacement: Analytical solution and application to language dynamics

    NASA Astrophysics Data System (ADS)

    Colaiori, Francesca; Castellano, Claudio; Cuskley, Christine F.; Loreto, Vittorio; Pugliese, Martina; Tria, Francesca

    2015-01-01

    Empirical evidence shows that the rate of irregular usage of English verbs exhibits discontinuity as a function of their frequency: the most frequent verbs tend to be totally irregular. We aim to qualitatively understand the origin of this feature by studying simple agent-based models of language dynamics, where each agent adopts an inflectional state for a verb and may change it upon interaction with other agents. At the same time, agents are replaced at some rate by new agents adopting the regular form. In models with only two inflectional states (regular and irregular), we observe that either all verbs regularize irrespective of their frequency, or a continuous transition occurs between a low-frequency state, where the lemma becomes fully regular, and a high-frequency one, where both forms coexist. Introducing a third (mixed) state, wherein agents may use either form, we find that a third, qualitatively different behavior may emerge, namely, a discontinuous transition in frequency. We introduce and solve analytically a very general class of three-state models that allows us to fully understand these behaviors in a unified framework. Realistic sets of interaction rules, including the well-known naming game (NG) model, result in a discontinuous transition, in agreement with recent empirical findings. We also point out that the distinction between speaker and hearer in the interaction has no effect on the collective behavior. The results for the general three-state model, although discussed in terms of language dynamics, are widely applicable.
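
An agent-based dynamic of the kind described can be sketched in a few lines. The interaction and replacement rules below are illustrative assumptions, not the paper's solved model (in particular, not the naming game rules):

```python
import numpy as np

# Agents hold state 0 (regular), 1 (mixed), or 2 (irregular) for one verb.
# At each step two random agents may interact, at a rate set by verb
# frequency f; the hearer moves one step toward the speaker's form.
# Independently, agents are replaced at rate mu by regular-form newcomers.
def simulate(n=500, f=0.9, mu=0.001, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    state = np.full(n, 2)                      # start fully irregular
    for _ in range(steps):
        if rng.random() < f:                   # frequency-dependent interaction
            s, h = rng.integers(0, n, 2)
            if state[h] < state[s]:
                state[h] += 1
            elif state[h] > state[s]:
                state[h] -= 1
        if rng.random() < mu * n:              # biased replacement
            state[rng.integers(0, n)] = 0
    return np.bincount(state, minlength=3) / n

print(simulate())      # fractions of regular / mixed / irregular users
```

Sweeping f while holding mu fixed is the numerical analogue of the paper's frequency axis; the analytical solution in the paper classifies which rule sets make the transition in f continuous versus discontinuous.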

  3. The use of tetrahedral mesh geometries in Monte Carlo simulation of applicator based brachytherapy dose distributions

    NASA Astrophysics Data System (ADS)

    Paiva Fonseca, Gabriel; Landry, Guillaume; White, Shane; D'Amours, Michel; Yoriyaz, Hélio; Beaulieu, Luc; Reniers, Brigitte; Verhaegen, Frank

    2014-10-01

Accounting for brachytherapy applicator attenuation is part of the recommendations from the recent report of AAPM Task Group 186. To do so, model-based dose calculation algorithms require accurate modelling of the applicator geometry. This can be non-trivial in the case of irregularly shaped applicators such as the Fletcher Williamson gynaecological applicator or balloon applicators with possibly irregular shapes employed in accelerated partial breast irradiation (APBI) performed using electronic brachytherapy sources (EBS). While many of these applicators can be modelled using constructive solid geometry (CSG), the latter may be difficult and time-consuming. Alternatively, these complex geometries can be modelled using tessellated geometries such as tetrahedral meshes (mesh geometries (MG)). Recent versions of the Monte Carlo (MC) codes Geant4 and MCNP6 allow for the use of MG. The goal of this work was to model a series of applicators relevant to brachytherapy using MG. Applicators designed for 192Ir sources and 50 kV EBS were studied: a shielded vaginal applicator, a shielded Fletcher Williamson applicator, and an APBI balloon applicator. All applicators were modelled in Geant4 and MCNP6 using MG and CSG for dose calculations. CSG-derived dose distributions were considered as reference and used to validate MG models by comparing dose distribution ratios. In general, agreement within 1% was observed for all applicators between MG and CSG, and between codes, when considering volumes inside the 25% isodose surface. When compared to CSG, MG required longer computation times by a factor of at least 2 for MC simulations using the same code. MCNP6 calculation times were more than ten times shorter than Geant4 in some cases. In conclusion, we presented methods allowing for high-fidelity modelling with results equivalent to CSG. To the best of our knowledge, MG offers the most accurate representation of an irregular APBI balloon applicator.

  4. Exponentiated power Lindley distribution.

    PubMed

    Ashour, Samir K; Eltehiwy, Mahmoud A

    2015-11-01

A new generalization of the Lindley distribution was recently proposed by Ghitany et al. [1], called the power Lindley distribution. Another generalization of the Lindley distribution was introduced by Nadarajah et al. [2], named the generalized Lindley distribution. This paper proposes a further generalization of the Lindley distribution which subsumes both. We refer to this new generalization as the exponentiated power Lindley distribution. The new distribution is important since it contains as special sub-models some widely known distributions in addition to the above two models, such as the Lindley distribution among many others. It also provides more flexibility for analyzing complex real data sets. We study some statistical properties of the new distribution. We discuss maximum likelihood estimation of the distribution parameters. Least squares estimation is also used to evaluate the parameters. Three algorithms are proposed for generating random data from the proposed distribution. An application of the model to a real data set is analyzed using the new distribution, which shows that the exponentiated power Lindley distribution can be used quite effectively in analyzing real lifetime data.
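
Since the construction is exponentiation of the power Lindley CDF, the distribution function is short to write down. The form below follows from the published power Lindley CDF of Ghitany et al. raised to the power alpha; treat it as a sketch of the construction rather than a verbatim reproduction of the paper:

```python
import numpy as np

# CDF of the exponentiated power Lindley (EPL) distribution:
#   F(x) = [1 - (1 + theta*x**beta/(theta+1)) * exp(-theta*x**beta)]**alpha
# Reduces to the power Lindley CDF at alpha = 1 and to the
# one-parameter Lindley CDF at alpha = beta = 1.
def epl_cdf(x, alpha, beta, theta):
    x = np.asarray(x, dtype=float)
    t = theta * x**beta
    return (1.0 - (1.0 + t / (theta + 1.0)) * np.exp(-t))**alpha

x = np.linspace(0.0, 10.0, 6)
print(epl_cdf(x, alpha=2.0, beta=1.5, theta=0.8))
```

With the CDF in closed form, inverse-transform sampling needs only a numerical root-find per draw, which is one natural basis for the random-generation algorithms the abstract mentions.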

  5. Comparative study of generalized born models: Born radii and peptide folding.

    PubMed

    Zhu, Jiang; Alexov, Emil; Honig, Barry

    2005-02-24

    In this study, we have implemented four analytical generalized Born (GB) models and investigated their performance in conjunction with the GROMOS96 force field. The four models include that of Still and co-workers, the HCT model of Cramer, Truhlar, and co-workers, a modified form of the AGB model of Levy and co-workers, and the GBMV2 model of Brooks and co-workers. The models were coded independently and implemented in the GROMOS software package and in TINKER. They were compared in terms of their ability to reproduce the results of Poisson-Boltzmann (PB) calculations and in their performance in the ab initio peptide folding of two peptides, one that forms a beta-hairpin in solution and one that forms an alpha-helix. In agreement with previous work, the GBMV2 model is most successful in reproducing PB results while the other models tend to underestimate the effective Born radii of buried atoms. In contrast, stochastic dynamics simulations on the folding of the two peptides, the C-terminus beta-hairpin of the B1 domain of protein G and the alanine-based alpha-helical peptide 3K(I), suggest that the simpler GB models are more effective in sampling conformational space. Indeed, the Still model used in conjunction with the GROMOS96 force field is able to fold the hairpin peptide to a native-like structure without the benefit of enhanced sampling techniques. This is due in part to the properties of the united-atom GROMOS96 force field which appears to be more flexible, and hence to sample more efficiently, than force fields such as OPLSAA. Our results suggest a general strategy which involves using different combinations of force fields and solvent models in different applications, for example, using GROMOS96 and a simple GB model in sampling and OPLSAA and a more accurate GB model in refinement. 
The fact that various methods have been implemented in a unified way should facilitate the testing and subsequent use of different methods to evaluate conformational free energies in different applications. Our results also bear on some general issues involved in peptide folding and structure prediction which are addressed in the Discussion.

  6. Indoor Semi-volatile Organic Compounds (i-SVOC) Version 1.0

    EPA Pesticide Factsheets

    i-SVOC Version 1.0 is a general-purpose software application for dynamic modeling of the emission, transport, sorption, and distribution of semi-volatile organic compounds (SVOCs) in indoor environments.

  7. STRUCTURE-ACTIVITY RELATIONSHIPS--COMPUTERIZED SYSTEMS

    EPA Science Inventory

This report discusses some important general strategies and issues relative to the application of computational SAR techniques for modeling genotoxicity and carcinogenicity endpoints. Problems particular to the SAR modeling of such endpoints pertain to: the complexity of the carcinogen...

  8. Maximum Marginal Likelihood Estimation of a Monotonic Polynomial Generalized Partial Credit Model with Applications to Multiple Group Analysis.

    PubMed

    Falk, Carl F; Cai, Li

    2016-06-01

    We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.
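
The "lowest order polynomial" case the abstract refers to is the regular generalized partial credit model, whose category probabilities are a cumulative-logit softmax. A minimal sketch (parameter values are arbitrary illustrations):

```python
import numpy as np

# Category response probabilities of the generalized partial credit model:
#   P(X = k | theta) ∝ exp( sum_{j<=k} a*(theta - b_j) ),  k = 0..K
# The paper replaces the linear term a*(theta - b_j) with a monotonic
# polynomial; at the lowest order it reduces to this model.
def gpcm_probs(theta, a, b):
    b = np.asarray(b, dtype=float)               # step parameters b_1..b_K
    z = np.concatenate([[0.0], np.cumsum(a * (theta - b))])
    ez = np.exp(z - z.max())                     # numerically stable softmax
    return ez / ez.sum()

p = gpcm_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.5])
print(p)       # probabilities of the 4 ordered categories
```

Marginal maximum likelihood then integrates these probabilities over the latent trait distribution within the Bock-Aitkin EM loop, which is what makes the multiple-group analyses in the paper tractable.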

  9. Maximum profile likelihood estimation of differential equation parameters through model based smoothing state estimates.

    PubMed

    Campbell, D A; Chkrebtii, O

    2013-12-01

    Statistical inference for biochemical models often faces a variety of characteristic challenges. In this paper we examine state and parameter estimation for the JAK-STAT intracellular signalling mechanism, which exemplifies the implementation intricacies common in many biochemical inference problems. We introduce an extension to the Generalized Smoothing approach for estimating delay differential equation models, addressing selection of complexity parameters, choice of the basis system, and appropriate optimization strategies. Motivated by the JAK-STAT system, we further extend the generalized smoothing approach to consider a nonlinear observation process with additional unknown parameters, and highlight how the approach handles unobserved states and unevenly spaced observations. The methodology developed is generally applicable to problems of estimation for differential equation models with delays, unobserved states, nonlinear observation processes, and partially observed histories. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.
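The gradient-matching idea behind generalized smoothing (fit a basis expansion to the data, then choose parameters so the smoothed derivative satisfies the differential equation) can be sketched on the toy model dx/dt = -theta*x. This is a simplified stand-in for the paper's penalized B-spline machinery and the JAK-STAT system; all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated observations of x(t) = 2*exp(-0.7*t) with measurement noise.
t = np.linspace(0.0, 4.0, 40)
x_obs = 2.0 * np.exp(-0.7 * t) + rng.normal(0.0, 0.02, t.size)

# Step 1: smooth the data with a basis expansion (a plain polynomial
# stands in for the B-spline basis used in generalized smoothing).
coef = np.polyfit(t, x_obs, deg=6)
x_hat = np.polyval(coef, t)
dx_hat = np.polyval(np.polyder(coef), t)

# Step 2: choose theta so the smoothed derivative matches the model
# dx/dt = -theta*x; this least-squares problem has a closed form.
theta_hat = -(x_hat @ dx_hat) / (x_hat @ x_hat)
```

With the true rate 0.7, the recovered `theta_hat` lands close to it; in the full approach the smoothing and parameter steps are iterated with a model-fidelity penalty rather than solved once.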

  10. A Non-Gaussian Stock Price Model: Options, Credit and a Multi-Timescale Memory

    NASA Astrophysics Data System (ADS)

    Borland, L.

We review a recently proposed model of stock prices, based on a statistical feedback model that results in a non-Gaussian distribution of price changes. Applications to option pricing and the pricing of debt are discussed. A generalization to account for feedback effects over multiple timescales is also presented. This model reproduces most of the stylized facts (i.e., statistical anomalies) observed in real financial markets.

  11. Strategy Guideline: Modeling Enclosure Design in Above-Grade Walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lstiburek, J.; Ueno, K.; Musunuru, S.

    2016-02-24

The Strategy Guideline describes how to model above-grade walls and how to interpret the results of those models. The Measure Guideline analyzes the failure thresholds and criteria for above-grade walls. A library of above-grade walls with historically successful performance was used to calibrate WUFI (Wärme und Feuchte instationär) software models. The information is generalized for application to a broad population of houses within the limits of existing experience.

  12. Susceptible-infected-susceptible epidemics on networks with general infection and cure times.

    PubMed

    Cator, E; van de Bovenkamp, R; Van Mieghem, P

    2013-06-01

    The classical, continuous-time susceptible-infected-susceptible (SIS) Markov epidemic model on an arbitrary network is extended to incorporate infection and curing or recovery times each characterized by a general distribution (rather than an exponential distribution as in Markov processes). This extension, called the generalized SIS (GSIS) model, is believed to have a much larger applicability to real-world epidemics (such as information spread in online social networks, real diseases, malware spread in computer networks, etc.) that likely do not feature exponential times. While the exact governing equations for the GSIS model are difficult to deduce due to their non-Markovian nature, accurate mean-field equations are derived that resemble our previous N-intertwined mean-field approximation (NIMFA) and so allow us to transfer the whole analytic machinery of the NIMFA to the GSIS model. In particular, we establish the criterion to compute the epidemic threshold in the GSIS model. Moreover, we show that the average number of infection attempts during a recovery time is the more natural key parameter, instead of the effective infection rate in the classical, continuous-time SIS Markov model. The relative simplicity of our mean-field results enables us to treat more general types of SIS epidemics, while offering an easier key parameter to measure the average activity of those general viral agents.
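In the Markovian limit, the first-order NIMFA epidemic threshold referred to above reduces to the reciprocal of the spectral radius of the network's adjacency matrix, which is straightforward to compute. A sketch of that classical criterion (not the GSIS generalization itself):

```python
import numpy as np

def nimfa_threshold(adj):
    """First-order NIMFA epidemic threshold tau_c = 1 / lambda_max(A),
    where lambda_max is the largest eigenvalue (spectral radius) of the
    symmetric adjacency matrix A of the contact network."""
    eigvals = np.linalg.eigvalsh(np.asarray(adj, dtype=float))
    return 1.0 / eigvals[-1]          # eigvalsh returns ascending order

# Complete graph on 5 nodes: lambda_max = N - 1 = 4, so tau_c = 0.25.
K5 = np.ones((5, 5)) - np.eye(5)
tau_c = nimfa_threshold(K5)
```

Above the threshold (effective infection rate greater than `tau_c`) the mean-field epidemic persists; below it, it dies out.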

  13. Susceptible-infected-susceptible epidemics on networks with general infection and cure times

    NASA Astrophysics Data System (ADS)

    Cator, E.; van de Bovenkamp, R.; Van Mieghem, P.

    2013-06-01

    The classical, continuous-time susceptible-infected-susceptible (SIS) Markov epidemic model on an arbitrary network is extended to incorporate infection and curing or recovery times each characterized by a general distribution (rather than an exponential distribution as in Markov processes). This extension, called the generalized SIS (GSIS) model, is believed to have a much larger applicability to real-world epidemics (such as information spread in online social networks, real diseases, malware spread in computer networks, etc.) that likely do not feature exponential times. While the exact governing equations for the GSIS model are difficult to deduce due to their non-Markovian nature, accurate mean-field equations are derived that resemble our previous N-intertwined mean-field approximation (NIMFA) and so allow us to transfer the whole analytic machinery of the NIMFA to the GSIS model. In particular, we establish the criterion to compute the epidemic threshold in the GSIS model. Moreover, we show that the average number of infection attempts during a recovery time is the more natural key parameter, instead of the effective infection rate in the classical, continuous-time SIS Markov model. The relative simplicity of our mean-field results enables us to treat more general types of SIS epidemics, while offering an easier key parameter to measure the average activity of those general viral agents.

  14. Concept of dynamic memory in economics

    NASA Astrophysics Data System (ADS)

    Tarasova, Valentina V.; Tarasov, Vasily E.

    2018-02-01

    In this paper we discuss a concept of dynamic memory and an application of fractional calculus to describe the dynamic memory. The concept of memory is considered from the standpoint of economic models in the framework of continuous time approach based on fractional calculus. We also describe some general restrictions that can be imposed on the structure and properties of dynamic memory. These restrictions include the following three principles: (a) the principle of fading memory; (b) the principle of memory homogeneity on time (the principle of non-aging memory); (c) the principle of memory reversibility (the principle of memory recovery). Examples of different memory functions are suggested by using the fractional calculus. To illustrate an application of the concept of dynamic memory in economics we consider a generalization of the Harrod-Domar model, where the power-law memory is taken into account.
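The principle of fading memory has a concrete discrete-time expression: the Grünwald-Letnikov binomial weights that define a fractional-order difference decay monotonically in magnitude, so states further in the past contribute progressively less. A small sketch (the order alpha = 0.5 is chosen arbitrarily for illustration):

```python
import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights (-1)^k * C(alpha, k): the discrete memory
    kernel of a fractional derivative of order alpha, computed by the
    standard recurrence w_k = w_{k-1} * (1 - (alpha + 1)/k)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

w = gl_weights(alpha=0.5, n=10)
```

A fractional-difference model (such as the memory-augmented Harrod-Domar dynamics mentioned above) would convolve these fading weights with the history of the state variable.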

  15. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    NASA Astrophysics Data System (ADS)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.
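The core numerical step, locating an inflection point in a microcanonical quantity, can be illustrated on synthetic data: take a stand-in for the inverse temperature beta(E) = dS/dE and find where its second derivative changes sign. This is a toy sketch, not the authors' minimal-sensitivity machinery:

```python
import numpy as np

# Synthetic inverse temperature beta(E) = dS/dE with a least-sensitive
# (inflection) point at E = 0, mimicking the transition signal probed by
# microcanonical inflection-point analysis.
E = np.linspace(-2.0, 2.0, 401)
beta = 1.0 + np.tanh(E)              # stand-in for dS/dE

# Inflection point: sign change of the second derivative of beta(E).
d2 = np.gradient(np.gradient(beta, E), E)
idx = np.where(np.sign(d2[:-1]) != np.sign(d2[1:]))[0]
E_transition = E[idx[0]]
```

In the generalized method, the same sign-change test is applied to successively higher derivatives of the microcanonical entropy to classify independent and dependent transitions of any order.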

  16. Fast model updating coupling Bayesian inference and PGD model reduction

    NASA Astrophysics Data System (ADS)

    Rubio, Paul-Baptiste; Louf, François; Chamoin, Ludovic

    2018-04-01

    The paper focuses on a coupled Bayesian-Proper Generalized Decomposition (PGD) approach for the real-time identification and updating of numerical models. The purpose is to use the most general case of Bayesian inference theory in order to address inverse problems and to deal with different sources of uncertainties (measurement and model errors, stochastic parameters). In order to do so with a reasonable CPU cost, the idea is to replace the direct model called for Monte-Carlo sampling by a PGD reduced model, and in some cases directly compute the probability density functions from the obtained analytical formulation. This procedure is first applied to a welding control example with the updating of a deterministic parameter. In the second application, the identification of a stochastic parameter is studied through a glued assembly example.
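The offline/online split described above (an expensive forward model replaced by a cheap reduced representation before sampling) can be mimicked with a lookup-table surrogate inside a random-walk Metropolis loop. A toy sketch with synthetic data; the quadratic forward model and the interpolation table stand in for the PGD reduced solution:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Truth": expensive forward model g(mu) = mu**2, observed with noise.
true_mu = 1.5
y_obs = true_mu**2 + rng.normal(0.0, 0.05)

# Offline stage: tabulate the forward model once on a parameter grid, the
# way a PGD reduced model provides a cheap parametric solution.
grid = np.linspace(0.0, 3.0, 301)
table = grid**2
surrogate = lambda mu: np.interp(mu, grid, table)

# Online stage: random-walk Metropolis calling only the cheap surrogate.
def log_post(mu):
    if not 0.0 <= mu <= 3.0:          # uniform prior on [0, 3]
        return -np.inf
    return -0.5 * ((y_obs - surrogate(mu)) / 0.05) ** 2

mu, chain = 1.0, []
lp = log_post(mu)
for _ in range(5000):
    prop = mu + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain.append(mu)
posterior_mean = np.mean(chain[1000:])
```

The point of the split is that every Monte Carlo iteration costs an interpolation rather than a full forward solve, which is what makes real-time updating feasible.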

  17. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    PubMed

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
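The two-step structure, nonparametric marginals first and then a pseudo-likelihood fit of the copula parameter, can be sketched for uncensored data with a Clayton copula. Censoring and the KLIC machinery are omitted here; the sample is simulated by conditional inversion, and all values are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
n, theta_true = 2000, 2.0

# Simulate Clayton(theta) pairs (u, v) by conditional inversion.
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u**(-theta_true) * (w**(-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)

# Step 1: nonparametric marginals via rank-based pseudo-observations
# (the uncensored analogue of plugging in estimated survival functions).
def pseudo_obs(x):
    return (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)

pu, pv = pseudo_obs(u), pseudo_obs(v)

# Step 2: maximize the Clayton copula log-density over a parameter grid.
def clayton_loglik(theta):
    s = pu**(-theta) + pv**(-theta) - 1
    return np.sum(np.log(1 + theta)
                  - (1 + theta) * (np.log(pu) + np.log(pv))
                  - (2 + 1 / theta) * np.log(s))

grid = np.linspace(0.2, 5.0, 97)
theta_hat = grid[np.argmax([clayton_loglik(t) for t in grid])]
```

Under misspecification, the same second step converges to the pseudo-true parameter (the KLIC minimizer) rather than a "true" copula value, which is the sense in which the paper's asymptotics are stated.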

  18. Estimation and model selection of semiparametric multivariate survival functions under general censorship

    PubMed Central

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2013-01-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided. PMID:24790286

  19. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to finding synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary to obtain models small enough for verification; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  20. Clinical application of the five-factor model.

    PubMed

    Widiger, Thomas A; Presnall, Jennifer Ruth

    2013-12-01

    The Five-Factor Model (FFM) has become the predominant dimensional model of general personality structure. The purpose of this paper is to suggest a clinical application. A substantial body of research indicates that the personality disorders included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) can be understood as extreme and/or maladaptive variants of the FFM (the acronym "DSM" refers to any particular edition of the APA DSM). In addition, the current proposal for the forthcoming fifth edition of the DSM (i.e., DSM-5) is shifting closely toward an FFM dimensional trait model of personality disorder. Advantages of this shifting conceptualization are discussed, including treatment planning. © 2012 Wiley Periodicals, Inc.

  1. Artificial Neural Networks for Processing Graphs with Application to Image Understanding: A Survey

    NASA Astrophysics Data System (ADS)

    Bianchini, Monica; Scarselli, Franco

In graphical pattern recognition, each datum is represented as an arrangement of elements that encodes both the properties of each element and the relations among them. Hence, patterns are modelled as labelled graphs where, in general, labels can be attached to both nodes and edges. Artificial neural networks able to process graphs are a powerful tool for addressing a great variety of real-world problems where the information is naturally organized in entities and relationships among entities; indeed, they have been widely used in computer vision, e.g., in logo recognition, in similarity retrieval, and for object detection. In this chapter, we propose a survey of neural network models able to process structured information, with a particular focus on architectures tailored to image understanding applications. Starting from the original recursive model (RNNs), we subsequently present different ways to represent images - by trees, forests of trees, multiresolution trees, directed acyclic graphs with labelled edges, general graphs - and, correspondingly, the neural network architectures appropriate to process such structures.

  2. Item Response Theory Using Hierarchical Generalized Linear Models

    ERIC Educational Resources Information Center

    Ravand, Hamdollah

    2015-01-01

    Multilevel models (MLMs) are flexible in that they can be employed to obtain item and person parameters, test for differential item functioning (DIF) and capture both local item and person dependence. Papers on the MLM analysis of item response data have focused mostly on theoretical issues where applications have been add-ons to simulation…

  3. Forestry sector analysis for developing countries: issues and methods.

    Treesearch

    R.W. Haynes

    1993-01-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...

  4. A General Multivariate Latent Growth Model with Applications to Student Achievement

    ERIC Educational Resources Information Center

    Bianconcini, Silvia; Cagnone, Silvia

    2012-01-01

    The evaluation of the formative process in the University system has been assuming an ever increasing importance in the European countries. Within this context, the analysis of student performance and capabilities plays a fundamental role. In this work, the authors propose a multivariate latent growth model for studying the performances of a…

  5. An Ethnographic Case Study of the Administrative Organization, Processes, and Behavior in a Model Comprehensive High School.

    ERIC Educational Resources Information Center

    Zimman, Richard N.

Using ethnographic case study methodology (involving open-ended interviews, participant observation, and document analysis), theories of administrative organization, processes, and behavior were tested during a three-week observation of a model comprehensive (experimental) high school. Although the study is limited in its general application, it…

  6. Poisson Growth Mixture Modeling of Intensive Longitudinal Data: An Application to Smoking Cessation Behavior

    ERIC Educational Resources Information Center

    Shiyko, Mariya P.; Li, Yuelin; Rindskopf, David

    2012-01-01

    Intensive longitudinal data (ILD) have become increasingly common in the social and behavioral sciences; count variables, such as the number of daily smoked cigarettes, are frequently used outcomes in many ILD studies. We demonstrate a generalized extension of growth mixture modeling (GMM) to Poisson-distributed ILD for identifying qualitatively…

  7. A Generalized Process Model of Human Action Selection and Error and its Application to Error Prediction

    DTIC Science & Technology

    2014-07-01

Macmillan & Creelman, 2005). This is a quite high degree of discriminability and it means that when the decision model predicts a probability of...ROC analysis. Pattern Recognition Letters, 27(8), 861-874. Retrieved from Google Scholar. Macmillan, N. A., & Creelman, C. D. (2005). Detection

  8. Intraclass Correlation Coefficients in Hierarchical Designs: Evaluation Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    Interval estimation of intraclass correlation coefficients in hierarchical designs is discussed within a latent variable modeling framework. A method accomplishing this aim is outlined, which is applicable in two-level studies where participants (or generally lower-order units) are clustered within higher-order units. The procedure can also be…

  9. Improving component interoperability and reusability with the java connection framework (JCF): overview and application to the ages-w environmental model

    USDA-ARS?s Scientific Manuscript database

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...

  10. Battling the War for Talent: An Application in a Military Context

    ERIC Educational Resources Information Center

    Schreurs, Bert H. J.; Syed, Fariya

    2011-01-01

    Purpose: The purpose of this paper is to introduce a comprehensive new recruitment model that brings together research findings in the different areas of recruitment. This model may serve as a general framework for further recruitment research, and is intended to support Human Resource managers in developing their recruitment policy. To highlight…

  11. Determination of the key parameters affecting historic communications satellite trends

    NASA Technical Reports Server (NTRS)

    Namkoong, D.

    1984-01-01

Data representing 13 series of commercial communications satellites procured between 1968 and 1982 were analyzed to determine the factors that have contributed to the general reduction over time of the per-circuit cost of communications satellites. The model by which the data were analyzed was derived from a general telecommunications application and modified to be more directly applicable to communications satellites. In this model satellite mass, bandwidth-years, and technological change were the variable parameters. A linear, least-squares, multiple regression routine was used to obtain the measure of significance of the model. Correlation was measured by the coefficient of determination (R²) and the t-statistic. The results showed that no correlation could be established with satellite mass. Bandwidth-years, however, did show a significant correlation. Technological change in the bandwidth-year case was a significant factor in the model. This analysis and the conclusions derived are based on mature technologies, i.e., satellite designs that are evolutions of earlier designs rather than the first of a new generation. The findings, therefore, are appropriate to future satellites only if they are a continuation of design evolution.

  12. A general model-based design of experiments approach to achieve practical identifiability of pharmacokinetic and pharmacodynamic models.

    PubMed

    Galvanin, Federico; Ballan, Carlo C; Barolo, Massimiliano; Bezzo, Fabrizio

    2013-08-01

    The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK-PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK-PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.
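A numerical assessment of practical identifiability typically builds on the Fisher information matrix (FIM) assembled from parameter sensitivities: a singular or ill-conditioned FIM flags an experimental design from which the parameters cannot be estimated reliably. A toy sketch for a two-parameter exponential model y(t) = A*exp(-k*t), not the paper's PK-PD case studies:

```python
import numpy as np

def fim(times, A, k, sigma=0.1):
    """Fisher information matrix for y(t) = A*exp(-k*t) under iid Gaussian
    measurement noise, built from the analytic parameter sensitivities."""
    t = np.asarray(times, dtype=float)
    dA = np.exp(-k * t)                 # dy/dA
    dk = -A * t * np.exp(-k * t)        # dy/dk
    J = np.column_stack([dA, dk])       # sensitivity (Jacobian) matrix
    return J.T @ J / sigma**2

# Sampling only at t = 0 leaves k unidentifiable (singular FIM), while
# spreading samples over the decay makes both parameters estimable.
poor = fim([0.0, 0.0, 0.0], A=10.0, k=0.5)
good = fim([0.0, 1.0, 2.0, 4.0], A=10.0, k=0.5)
det_poor = np.linalg.det(poor)
det_good = np.linalg.det(good)
```

Model-based design of experiments then optimizes the sampling schedule (and other controls) against a scalar function of the FIM, e.g. maximizing its determinant for D-optimality.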

  13. The application of SVR model in the improvement of QbD: a case study of the extraction of podophyllotoxin.

    PubMed

    Zhai, Chun-Hui; Xuan, Jian-Bang; Fan, Hai-Liu; Zhao, Teng-Fei; Jiang, Jian-Lan

    2018-05-03

To further optimize process design by increasing the stability of the design space, we introduced the Support Vector Regression (SVR) model. In this work, the extraction of podophyllotoxin was researched as a case study based on Quality by Design (QbD). We compared the fitting performance of SVR with the quadratic polynomial model (QPM) most commonly used in QbD, and analyzed the two design spaces obtained by SVR and the QPM. As a result, SVR outperformed the QPM in prediction accuracy, model stability, and generalization ability. The introduction of SVR into QbD made the extraction process of podophyllotoxin well designed and easier to control. The better fitting performance of SVR improved the application of QbD, and the universal applicability of SVR, especially for non-linear, complicated, and weakly regular problems, widened the application field of QbD.

  14. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    PubMed

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.

  15. Inversion of Gravity Data to Define the Pre-Cenozoic Surface and Regional Structures Possibly Influencing Groundwater Flow in the Rainier Mesa Region, Nye County, Nevada.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas G. Hildenbrand; Geoffrey A. Phelps; Edward A. Mankinen

    2006-09-21

A three-dimensional inversion of gravity data from the Rainier Mesa area and surrounding regions reveals a topographically complex pre-Cenozoic basement surface. This model of the depth to pre-Cenozoic basement rocks is intended for use in a 3D hydrogeologic model being constructed for the Rainier Mesa area. Prior to this study, our knowledge of the depth to pre-Cenozoic basement rocks was based on a regional model, applicable to general studies of the greater Nevada Test Site area but inappropriate for higher resolution modeling of ground-water flow across the Rainier Mesa area. The new model incorporates several changes that lead to significant improvements over the previous regional view. First, the addition of constraining wells, encountering old volcanic rocks lying above but near pre-Cenozoic basement, prevents modeled basement from being too shallow. Second, an extensive literature and well data search has led to an increased understanding of the change of rock density with depth in the vicinity of Rainier Mesa. The third, and most important, change relates to the application of several depth-density relationships in the study area instead of a single generalized relationship, thereby improving the overall model fit. In general, the pre-Cenozoic basement surface deepens in the western part of the study area, delineating collapses within the Silent Canyon and Timber Mountain caldera complexes, and shallows in the east in the Eleana Range and Yucca Flat regions, where basement crops out. In the Rainier Mesa study area, basement is generally shallow (< 1 km). The new model identifies previously unrecognized structures within the pre-Cenozoic basement that may influence ground-water flow, such as a shallow basement ridge related to an inferred fault extending northward from Rainier Mesa into Kawich Valley.

  16. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for the impact of the space environment on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and in automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is high. Most such standards relate to the production and characterization of nanostructures; however, there are no ISO documents concerning the behavior of nanomaterials in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e., materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which span different space and time scales, to modeling processes occurring in nanostructured materials under space environment impact. The document emphasizes the necessity of a multiscale simulation approach and presents recommendations for the choice of the most appropriate methods (or groups of methods) for computer modeling of the various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes a description of possible approximations and limitations of the proposed simulation methods, as well as of widely used software codes. This TS may be used as a base for developing a new standard devoted to nanomaterials applications for spacecraft.

  17. The algebra of the general Markov model on phylogenetic trees and networks.

    PubMed

    Sumner, J G; Holland, B R; Jarvis, P D

    2012-04-01

    It is known that the Kimura 3ST model of sequence evolution on phylogenetic trees can be extended quite naturally to arbitrary split systems. However, this extension relies heavily on mathematical peculiarities of the associated Hadamard transformation, and providing an analogous augmentation of the general Markov model has thus far been elusive. In this paper, we rectify this shortcoming by showing how to extend the general Markov model on trees to include incompatible edges; and even further to more general network models. This is achieved by exploring the algebra of the generators of the continuous-time Markov chain together with the “splitting” operator that generates the branching process on phylogenetic trees. For simplicity, we proceed by discussing the two state case and then show that our results are easily extended to more states with little complication. Intriguingly, upon restriction of the two state general Markov model to the parameter space of the binary symmetric model, our extension is indistinguishable from the Hadamard approach only on trees; as soon as any incompatible splits are introduced the two approaches give rise to differing probability distributions with disparate structure. Through exploration of a simple example, we give an argument that our extension to more general networks has desirable properties that the previous approaches do not share. In particular, our construction allows for convergent evolution of previously divergent lineages; a property that is of significant interest for biological applications.
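For the two-state case discussed in the paper, the continuous-time Markov transition matrix has a simple closed form, and leaf-pattern probabilities on a tree follow by summing over root states. A minimal sketch for a two-leaf ("cherry") tree; the rates and branch lengths are arbitrary, and the splitting operator and network extension are not shown:

```python
import numpy as np

def transition(a, b, t):
    """Closed-form matrix exponential expm(Q*t) for the two-state general
    Markov rate matrix Q = [[-a, a], [b, -b]]."""
    s = a + b
    e = np.exp(-s * t)
    return np.array([[b + a * e, a * (1 - e)],
                     [b * (1 - e), a + b * e]]) / s

# Joint leaf-pattern distribution on a two-leaf tree: the root state is
# drawn from pi, then evolves independently down each pendant edge.
pi = np.array([0.6, 0.4])
M1 = transition(0.3, 0.5, t=1.0)
M2 = transition(0.2, 0.7, t=2.0)
P = np.einsum('r,ri,rj->ij', pi, M1, M2)   # P[i, j] = Pr(leaf1 = i, leaf2 = j)
```

The extension to incompatible splits works at the level of the generators of this process, which is why it cannot be reproduced by simply composing edge transition matrices along a tree.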

  18. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty expressed as a percent of full scale. However, in some applications we seek enhanced performance at the low end of the range, and expressing the accuracy as a percent of reading should then be considered as a modeling strategy. For example, it is common to want to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
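The percent-of-reading idea corresponds to fitting the calibration model under a relative-error criterion, for example weighted least squares with weights proportional to 1/y², instead of the ordinary (percent-of-full-scale) criterion. A toy sketch with synthetic calibration data, not the paper's balance models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic calibration data: true response y = 2*x, with noise that grows
# with the reading, so low-range accuracy suffers under an ordinary fit.
x = np.linspace(0.1, 10.0, 50)
y = 2.0 * x * (1.0 + rng.normal(0.0, 0.01, x.size)) + rng.normal(0.0, 0.05, x.size)

X = np.column_stack([np.ones_like(x), x])   # intercept + slope design matrix

# Ordinary least squares: minimizes absolute error (percent of full scale).
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares with weights 1/y^2: minimizes relative error,
# i.e. a percent-of-reading criterion that protects the low range.
W = 1.0 / y                                  # sqrt of the 1/y^2 weights
beta_wls, *_ = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)
```

Both fits recover a slope near 2 here; the difference shows up in how residuals are distributed, with the weighted fit trading a little full-scale accuracy for tighter relative accuracy at low readings.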

  19. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analysis of the fluid flow system is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of general system theory. Results obtained are applied to a simple experimental fluid flow system as a test case, with particular emphasis on the understanding of fluid flow instability, transition, and turbulence.

  20. Figures of merit for self-beating filtered microwave photonic systems.

    PubMed

    Pérez, Daniel; Gasulla, Ivana; Capmany, José; Fandiño, Javier S; Muñoz, Pascual; Alavi, Hossein

    2016-05-02

    We present a model to compute the figures of merit of self-beating microwave photonic systems, a novel class of systems that operate in a self-homodyne fashion by sharing the same laser source for information-bearing and local-oscillator tasks. General and simplified expressions are given and, as an example, we consider their application to the design of a tunable RF MWP BS/UE front end for band selection, based on a Chebyshev Type-II optical filter. The applicability and usefulness of the model are also discussed.

  1. Application of a General Polytomous Testlet Model to the Reading Section of a Large-Scale English Language Assessment. Research Report. ETS RR-10-21

    ERIC Educational Resources Information Center

    Li, Yanmei; Li, Shuhong; Wang, Lin

    2010-01-01

    Many standardized educational tests include groups of items based on a common stimulus, known as "testlets". Standard unidimensional item response theory (IRT) models are commonly used to model examinees' responses to testlet items. However, it is known that local dependence among testlet items can lead to biased item parameter estimates…

  2. Bioaccumulation and Aquatic System Simulator (BASS) User's Manual Beta Test Version 2.1. EPA/600/R-01/035

    EPA Pesticide Factsheets

    this report describes the theoretical development, parameterization, and application software of a generalized, community-based, bioaccumulation model called BASS (Bioaccumulation and Aquatic System Simulator).

  3. A risk evaluation model and its application in online retailing trustfulness

    NASA Astrophysics Data System (ADS)

    Ye, Ruyi; Xu, Yingcheng

    2017-08-01

    Building a general risk evaluation model in advance can improve the convenience, standardization, and comparability of repeated risk evaluations when those evaluations concern the same area and serve a similar purpose. One of the most convenient and common forms of risk evaluation model is an index system consisting of several indices, their corresponding weights, and a crediting method. In this article, we propose a method for building a risk evaluation index system that guarantees a proportional relationship between the resulting credit and the expected risk loss, and we provide an application example in online retailing.

  4. Application of a forest-simulation model to assess the energy yield and ecological impact of forest utilization for energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, T W; Shugart, H H; West, D C

    1981-01-01

    This study examines the utilization and management of natural forest lands to meet growing wood-energy demands. An application of a forest simulation model is described for assessing energy returns and long-term ecological impacts of wood-energy harvesting under four general silvicultural practices. Results indicate that moderate energy yields could be expected from mild cutting operations that would not significantly affect the commercial timber market or the composition, structure, or diversity of these forests. Forest models can provide an effective tool for determining optimal management strategies that maximize energy returns, minimize environmental detriment, and complement existing land-use plans.

  5. A DGTD method for the numerical modeling of the interaction of light with nanometer scale metallic structures taking into account non-local dispersion effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Nikolai; Technische Universitaet Darmstadt, Institut fuer Theorie Elektromagnetischer Felder; Scheid, Claire

    2016-07-01

    The interaction of light with metallic nanostructures is increasingly attracting interest because of numerous potential applications. Sub-wavelength metallic structures, when illuminated with a frequency close to the plasma frequency of the metal, present resonances that cause extreme local field enhancements. Exploiting the latter in applications of interest requires a detailed knowledge of the occurring fields, which cannot be obtained analytically. Numerical tools are thus an absolute necessity; the insight they provide is very often the only way to gain a deep enough understanding of the very rich physics at play. For the numerical modeling of light-structure interaction on the nanoscale, the choice of an appropriate material model is a crucial point. Approaches adopted in the first instance are based on local (i.e., with no interaction between electrons) dispersive models, e.g., Drude or Drude–Lorentz models. From the mathematical point of view, when a time-domain modeling is considered, these models lead to an additional system of ordinary differential equations coupled to Maxwell's equations. However, recent experiments have shown that the repulsive interaction between electrons inside the metal makes the response of metals intrinsically non-local, and that this effect cannot generally be overlooked. Technological achievements have enabled the consideration of metallic structures in a regime where such non-localities have a significant influence on the structures' optical response. This leads to an additional, in general non-linear, system of partial differential equations which, when coupled to Maxwell's equations, is significantly more difficult to treat. Nevertheless, dealing with a linearized non-local dispersion model already opens the route to numerous practical applications of plasmonics. In this work, we present a Discontinuous Galerkin Time-Domain (DGTD) method able to solve the system of Maxwell's equations coupled to a linearized non-local dispersion model relevant to plasmonics. While the method is presented in the general 3D case, numerical results are given for 2D simulation settings.

  6. Multilocality and fusion rules on the generalized structure functions in two-dimensional and three-dimensional Navier-Stokes turbulence.

    PubMed

    Gkioulekas, Eleftherios

    2016-09-01

    Using the fusion-rules hypothesis for three-dimensional and two-dimensional Navier-Stokes turbulence, we generalize a previous nonperturbative locality proof to multiple applications of the nonlinear interactions operator on generalized structure functions of velocity differences. We call this generalization of nonperturbative locality to multiple applications of the nonlinear interactions operator "multilocality." The resulting cross terms pose a new challenge requiring a new argument and the introduction of a new fusion rule that takes advantage of rotational symmetry. Our main result is that the fusion-rules hypothesis implies both locality and multilocality in both the IR and UV limits for the downscale energy cascade of three-dimensional Navier-Stokes turbulence and the downscale enstrophy cascade and inverse energy cascade of two-dimensional Navier-Stokes turbulence. We stress that these claims relate to nonperturbative locality of generalized structure functions on all orders and not the term-by-term perturbative locality of diagrammatic theories or closure models that involve only two-point correlation and response functions.

  7. Application of geometric approximation to the CPMG experiment: Two- and three-site exchange.

    PubMed

    Chao, Fa-An; Byrd, R Andrew

    2017-04-01

    The Carr-Purcell-Meiboom-Gill (CPMG) experiment is one of the most classical and well-known relaxation dispersion (RD) experiments in NMR spectroscopy, and it has been successfully applied to characterize biologically relevant conformational dynamics in many cases. Although data analysis of the CPMG experiment for the 2-site exchange model can be facilitated by analytical solutions, data analysis for a more complex exchange model generally requires computationally intensive numerical analysis. Recently, a powerful computational strategy, geometric approximation, has been proposed to provide approximate numerical solutions for the adiabatic relaxation dispersion experiments where analytical solutions are neither available nor feasible. Here, we demonstrate the general potential of geometric approximation by providing a data analysis solution of the CPMG experiment for both the traditional 2-site model and a linear 3-site exchange model. The approximate numerical solution deviates less than 0.5% from the numerical solution on average, and the new approach is computationally 60,000-fold more efficient than the numerical approach. Moreover, we find that accurate dynamic parameters can be determined in most cases and that, for a range of experimental conditions, the relaxation can be assumed to follow mono-exponential decay. The method is general and applicable to any CPMG RD experiment (e.g. N, C', Cα, Hα, etc.). The approach forms a foundation for building solution surfaces to analyze the CPMG experiment for different models of 3-site exchange. Thus, geometric approximation is a general strategy to analyze relaxation dispersion data in any system (biological or chemical) if the appropriate library can be built in a physically meaningful domain. Published by Elsevier Inc.
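
    The numerical route that geometric approximation replaces can be sketched by propagating the two-site Bloch-McConnell equations through a CPMG echo train. The parameter values, the equal intrinsic R2 for both sites, the ideal 180° pulses, and the νCPMG = 1/(4τ) convention below are illustrative assumptions, not the paper's implementation:

    ```python
    import numpy as np
    from scipy.linalg import expm

    def r2eff_cpmg(nu_cpmg, dw, kex, pb, r20, T=0.04):
        """Effective R2 from a numerical two-site Bloch-McConnell CPMG
        propagation (rates in 1/s, dw in rad/s, constant time T in s)."""
        pa = 1.0 - pb
        kab, kba = pb * kex, pa * kex
        # Complex transverse magnetization (M_A, M_B); site B resonates dw away.
        L = np.array([[-r20 - kab, kba],
                      [kab, -r20 - kba - 1j * dw]], dtype=complex)
        n = max(1, int(round(2 * T * nu_cpmg)))  # (tau-180-tau) blocks, nu = 1/(4 tau)
        tau = T / (2.0 * n)
        U = expm(L * tau)
        M = np.array([pa, pb], dtype=complex)
        for _ in range(n):
            # tau evolution, ideal 180(x) pulse (complex conjugation), tau evolution
            M = U @ np.conj(U @ M)
        return -np.log(np.abs(M[0]) / pa) / T

    # Dispersion curve: R2eff relaxes toward r20 as the pulsing rate increases.
    curve = [r2eff_cpmg(nu, dw=1500.0, kex=800.0, pb=0.05, r20=10.0)
             for nu in (50, 100, 200, 400, 800, 1000)]
    print([round(r, 2) for r in curve])
    ```

    The cost of the matrix exponentials at every grid point of a parameter search is what makes a precomputed library of solution surfaces attractive.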

  8. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate cases. The support vector machine (SVM) is a machine learning system that can provide optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problems of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented, and other possible applications of the system will be discussed.
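
    A minimal illustration of how an SVM can cope with severe class imbalance is the class-weighted soft margin. The synthetic 2% rare-event data below is a toy stand-in, not the substorm or market data used in the study:

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import recall_score

    rng = np.random.default_rng(0)

    # Toy imbalanced data: ~2% "rare events" shifted in a 10-D feature space.
    n_common, n_rare = 2000, 40
    X = np.vstack([rng.normal(0.0, 1.0, size=(n_common, 10)),
                   rng.normal(1.5, 1.0, size=(n_rare, 10))])
    y = np.array([0] * n_common + [1] * n_rare)

    Xtr, Xte, ytr, yte = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)

    # class_weight="balanced" scales the misclassification penalty inversely to
    # class frequency, so the optimizer cannot ignore the rare class outright.
    clf = SVC(kernel="rbf", class_weight="balanced").fit(Xtr, ytr)
    rec = recall_score(yte, clf.predict(Xte))
    print("rare-event recall:", rec)
    ```

    Because the decision function depends only on the support vectors, a handful of rare-event examples can still anchor the boundary.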

  9. Exploring behavior of an unusual megaherbivore: A spatially explicit foraging model of the hippopotamus

    USGS Publications Warehouse

    Lewison, R.L.; Carter, J.

    2004-01-01

    Herbivore foraging theories have been developed for and tested on herbivores across a range of sizes. Due to logistical constraints, however, little research has focused on the foraging behavior of megaherbivores. Here we present a research approach that explores megaherbivore foraging behavior and assesses the applicability to megafauna of foraging theories developed on smaller herbivores. With simulation models as reference points for the analysis of empirical data, we investigate foraging strategies of the common hippopotamus (Hippopotamus amphibius). Using a spatially explicit individual-based foraging model, we apply traditional herbivore foraging strategies to a model hippopotamus, compare model output, and then relate these results to field data from wild hippopotami. Hippopotami appear to employ foraging strategies that respond to vegetation characteristics, such as vegetation quality, as well as spatial reference information, namely distance to a water source. Model predictions, field observations, and comparisons of the two support the conclusion that hippopotami generally conform to the central place foraging construct. These analyses point to the applicability of general herbivore foraging concepts to megaherbivores, but also to important differences between hippopotami and other herbivores. Our synergistic approach of models as reference points for empirical data highlights a useful method of behavioral analysis for hard-to-study megafauna. © 2003 Elsevier B.V. All rights reserved.

  10. A Critical Survey of Optimization Models for Tactical and Strategic Aspects of Air Traffic Flow Management

    NASA Technical Reports Server (NTRS)

    Bertsimas, Dimitris; Odoni, Amedeo

    1997-01-01

    This document presents a critical review of the principal existing optimization models that have been applied to Air Traffic Flow Management (TFM). Emphasis will be placed on two problems, the Generalized Tactical Flow Management Problem (GTFMP) and the Ground Holding Problem (GHP), as well as on some of their variations. To perform this task, we have carried out an extensive literature review that has covered more than 40 references, most of them very recent. Based on the review of this emerging field, our objectives were to: (i) identify the best available models; (ii) describe typical contexts for applications of the models; (iii) provide illustrative model formulations; and (iv) identify the methodologies that can be used to solve the models. We shall begin our presentation below by providing a brief context for the models that we are reviewing. In Section 3 we shall offer a taxonomy and identify four classes of models for review. In Sections 4, 5, and 6 we shall then review, respectively, models for the Single-Airport Ground Holding Problem, the Generalized Tactical FMP, and the Multi-Airport Ground Holding Problem (for the definition of these problems see Section 3 below). In each section, we identify the best available models and discuss briefly their computational performance and applications, if any, to date. Section 7 summarizes our conclusions about the state of the art.

  11. Forest management applications of Landsat data in a geographic information system

    NASA Technical Reports Server (NTRS)

    Maw, K. D.; Brass, J. A.

    1982-01-01

    The utility of land-cover data resulting from Landsat MSS classification can be greatly enhanced by use in combination with ancillary data. A demonstration forest management applications data base was constructed for Santa Cruz County, California, to demonstrate geographic information system applications of classified Landsat data. The data base contained detailed soils, digital terrain, land ownership, jurisdictional boundaries, fire events, and generalized land-use data, all registered to a UTM grid base. Applications models were developed from problems typical of fire management and reforestation planning.

  12. The Power Prior: Theory and Applications

    PubMed Central

    Ibrahim, Joseph G.; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang

    2015-01-01

    The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied to a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A to Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established, and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. PMID:26346180
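
    The core construction reviewed here can be written compactly. Given historical data \(D_0\), an initial prior \(\pi_0(\theta)\), and a discounting parameter \(a_0\), the power prior and the resulting posterior given current data \(D\) are

    ```latex
    \pi(\theta \mid D_0, a_0) \propto L(\theta \mid D_0)^{a_0}\,\pi_0(\theta),
    \qquad 0 \le a_0 \le 1,
    \qquad
    \pi(\theta \mid D, D_0, a_0) \propto L(\theta \mid D)\,L(\theta \mid D_0)^{a_0}\,\pi_0(\theta),
    ```

    so that \(a_0 = 0\) discards the historical data entirely and \(a_0 = 1\) pools it fully with the current data; intermediate values downweight the historical likelihood.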

  13. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors.

    PubMed

    Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th out of the 22 countries hardest hit by TB. Although much research has been carried out on this subject, this paper goes a step further by incorporating past knowledge into the model, using a Bayesian approach with informative priors. The Bayesian approach is becoming popular in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted under the classical approach and under Bayesian approaches with both non-informative and informative priors, using South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with informative priors, GHS data for the years 2011 to 2013 are used to construct priors for the 2014 model.

  14. Beyond heat baths II: framework for generalized thermodynamic resource theories

    NASA Astrophysics Data System (ADS)

    Yunger Halpern, Nicole

    2018-03-01

    Thermodynamics, which describes vast systems, has been reconciled with small scales, relevant to single-molecule experiments, in resource theories. Resource theories have been used to model exchanges of energy and information. Recently, particle exchanges were modeled; and an umbrella family of thermodynamic resource theories was proposed to model diverse baths, interactions, and free energies. This paper motivates and details the family’s structure and prospective applications. How to model electrochemical, gravitational, magnetic, and other thermodynamic systems is explained. Szilárd’s engine and Landauer’s Principle are generalized, as resourcefulness is shown to be convertible not only between information and gravitational energy, but also among diverse degrees of freedom. Extensive variables are associated with quantum operators that might fail to commute, introducing extra nonclassicality into thermodynamic resource theories. An early version of this paper partially motivated the later development of noncommutative thermalization. This generalization expands the theories’ potential for modeling realistic systems with which small-scale statistical mechanics might be tested experimentally.

  15. Chronic heart failure management in Australia -- time for general practice centred models of care?

    PubMed

    Scott, Ian; Jackson, Claire

    2013-05-01

    Chronic heart failure (CHF) is an increasingly prevalent problem within ageing populations and accounts for thousands of hospitalisations and deaths annually in Australia. Disease management programs for CHF (CHF-DMPs) aim to optimise care, with the predominant model being cardiologist-led, hospital-based multidisciplinary clinics with cardiac nurse outreach. However, findings from contemporary observational studies and clinical trials raise uncertainty about the effectiveness and sustainability of traditional CHF-DMPs in real-world clinical practice. We suggest an alternative model of care that involves general practitioners with a special interest in CHF liaising with, and being up-skilled by, specialists within community-based, multidisciplinary general practice settings. Preliminary data from trials evaluating primary care based CHF-DMPs are encouraging, and further studies are underway comparing this model of care with traditional hospital-based, specialist-led CHF-DMPs. Results of studies of similar primary care models targeting diabetes and other chronic diseases suggest potential for application to CHF.

  16. Application of the Conway-Maxwell-Poisson generalized linear model for analyzing motor vehicle crashes.

    PubMed

    Lord, Dominique; Guikema, Seth D; Geedipally, Srinivas Reddy

    2008-05-01

    This paper documents the application of the Conway-Maxwell-Poisson (COM-Poisson) generalized linear model (GLM) for modeling motor vehicle crashes. The COM-Poisson distribution, originally developed in 1962, has recently been re-introduced by statisticians for analyzing count data subject to over- and under-dispersion. This innovative distribution is an extension of the Poisson distribution. The objectives of this study were to evaluate the application of the COM-Poisson GLM for analyzing motor vehicle crashes and to compare the results with the traditional negative binomial (NB) model. The comparison analysis was carried out using the most common functional forms employed by transportation safety analysts, which link crashes to the entering flows at intersections or on segments. To accomplish the objectives of the study, several NB and COM-Poisson GLMs were developed and compared using two datasets. The first dataset contained crash data collected at signalized four-legged intersections in Toronto, Ont. The second dataset included data collected for rural four-lane divided and undivided highways in Texas. Several methods were used to assess the statistical fit and predictive performance of the models. The results of this study show that COM-Poisson GLMs perform as well as NB models in terms of goodness-of-fit (GOF) statistics and predictive performance. Given that the COM-Poisson distribution can also handle under-dispersed data (while the NB distribution cannot, or has difficulty converging), which have sometimes been observed in crash databases, the COM-Poisson GLM offers a better alternative to the NB model for modeling motor vehicle crashes, especially given the important limitations recently documented in the safety literature about the latter type of model.
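
    The distribution's mechanics are easy to state: the COM-Poisson pmf is P(Y = y) ∝ λ^y / (y!)^ν, where ν < 1 gives over-dispersion, ν > 1 under-dispersion, and ν = 1 recovers the Poisson. A log-space sketch of the pmf follows; the truncation point jmax for the infinite-series normalizer is an illustrative choice:

    ```python
    import math

    def com_poisson_pmf(y, lam, nu, jmax=100):
        """COM-Poisson pmf P(Y=y) = lam^y / (y!)^nu / Z(lam, nu), with the
        infinite-series normalizer Z truncated at jmax and summed in log space."""
        log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1)
                     for j in range(jmax + 1)]
        m = max(log_terms)  # log-sum-exp shift for numerical stability
        log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
        return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

    # nu = 1 recovers the ordinary Poisson pmf: exp(-4) * 4^2 / 2!
    print(com_poisson_pmf(2, lam=4.0, nu=1.0))
    ```

    A GLM built on this pmf links covariates to λ (or to a location reparameterization of it) and estimates ν from the data, which is what lets one fitted family cover both dispersion regimes.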

  17. General Method for Constructing Local Hidden Variable Models for Entangled Quantum States

    NASA Astrophysics Data System (ADS)

    Cavalcanti, D.; Guerini, L.; Rabelo, R.; Skrzypczyk, P.

    2016-11-01

    Entanglement allows for the nonlocality of quantum theory, which is the resource behind device-independent quantum information protocols. However, not all entangled quantum states display nonlocality. A central question is to determine the precise relation between entanglement and nonlocality. Here we present the first general test to decide whether a quantum state is local, and show that the test can be implemented by semidefinite programing. This method can be applied to any given state and for the construction of new examples of states with local hidden variable models for both projective and general measurements. As applications, we provide a lower-bound estimate of the fraction of two-qubit local entangled states and present new explicit examples of such states, including those that arise from physical noise models, Bell-diagonal states, and noisy Greenberger-Horne-Zeilinger and W states.

  18. Spacecraft dynamics characterization and control system failure detection. Volume 3: Control system failure monitoring

    NASA Technical Reports Server (NTRS)

    Vanschalkwyk, Christiaan M.

    1992-01-01

    We discuss the application of Generalized Parity Relations to two experimental flexible space structures: the NASA Langley Mini-Mast and the Marshall Space Flight Center ACES mast. We concentrate on the generation of residuals and make no attempt to implement the Decision Function; it should be clear from the examples presented whether it would be possible to detect the failure of a specific component. We derive the equations for Generalized Parity Relations. Two special cases are treated, namely Single Sensor Parity Relations (SSPR) and Double Sensor Parity Relations (DSPR). Generalized Parity Relations for actuators are also derived. The NASA Langley Mini-Mast and the application of SSPR and DSPR to a set of displacement sensors located at the tip of the Mini-Mast are discussed. The performance of a reduced-order model that includes the first five modes of the mast is compared to a set of parity relations identified on a set of input-output data. Both time-domain and frequency-domain comparisons are made. The effects of the sampling period and model order on the performance of the Residual Generators are also discussed. Failure detection experiments in which the sensor set consisted of two gyros and an accelerometer are presented. The effects of model order and sampling frequency are again illustrated. The detection of actuator failures is discussed. We use Generalized Parity Relations to monitor control system component failures on the ACES mast. An overview is given of the Failure Detection Filter, and experimental results are discussed. Conclusions and directions for future research are given.

  19. CRISPR-Cas in Medicinal Chemistry: Applications and Regulatory Concerns.

    PubMed

    Duardo-Sanchez, Aliuska

    2017-01-01

    A rapid search of scientific publication databases shows how considerably the use of the CRISPR-Cas genome-editing technique has expanded, and its growing importance in modern molecular biology. In the PubMed platform alone, a search for the term returns more than 3000 results. In drug discovery, medicinal chemistry, and chemical biology in general, the CRISPR method may have multiple applications. Some of these applications are: resistance-selection studies of antimalarial lead organic compounds; investigation of druggability; development of animal models for testing chemical compounds; etc. In this paper, we offer a review of the most relevant scientific literature, illustrated with specific examples of the application of the CRISPR technique to medicinal chemistry and chemical biology. We also present a general overview of the main legal and ethical trends regarding this method of genome editing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  20. A general system for automatic biomedical image segmentation using intensity neighborhoods.

    PubMed

    Chen, Cheng; Ozolek, John A; Wang, Wei; Rohde, Gustavo K

    2011-01-01

    Image segmentation is important, with applications to several problems in biology and medicine. While extensively researched, current segmentation methods generally perform adequately in the applications for which they were designed, but often require extensive modifications or calibration before being used in a different application. We describe an approach that, with few modifications, can be used in a variety of image segmentation problems. The approach is based on a supervised learning strategy that utilizes intensity neighborhoods to assign each pixel in a test image its correct class based on training data. We describe methods for modeling rotations and variations in scale, as well as a subset selection for training the classifiers. We show that the performance of our approach in tissue segmentation tasks in magnetic resonance and histopathology microscopy images, as well as in nuclei segmentation from fluorescence microscopy images, is similar to or better than that of several algorithms specifically designed for each of these applications.
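
    The intensity-neighborhood idea can be sketched as patch-based pixel classification. The toy image, the neighborhood radius, and the random forest standing in for the paper's classifier are all illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def neighborhoods(img, r=1):
        """Stack the (2r+1) x (2r+1) intensity neighborhood of each interior
        pixel into one feature vector per pixel."""
        h, w = img.shape
        return np.array([img[i - r:i + r + 1, j - r:j + r + 1].ravel()
                         for i in range(r, h - r) for j in range(r, w - r)])

    rng = np.random.default_rng(1)
    # Toy "tissue" image: a bright square region on a darker, noisy background.
    img = rng.normal(0.2, 0.05, size=(32, 32))
    img[8:24, 8:24] += 0.6
    labels = np.zeros((32, 32), dtype=int)
    labels[8:24, 8:24] = 1

    feats = neighborhoods(img)           # (30*30, 9) neighborhood features
    target = labels[1:-1, 1:-1].ravel()  # classes for the interior pixels

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(feats, target)
    acc = clf.score(feats, target)       # training accuracy on the toy image
    print("pixel accuracy:", acc)
    ```

    In the approach described above, the training patches would come from labeled reference images and be augmented over rotations and scales before fitting.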

  1. A Fractional Cartesian Composition Model for Semi-Spatial Comparative Visualization Design.

    PubMed

    Kolesar, Ivan; Bruckner, Stefan; Viola, Ivan; Hauser, Helwig

    2017-01-01

    The study of spatial data ensembles leads to substantial visualization challenges in a variety of applications. In this paper, we present a model for comparative visualization that supports the design of corresponding ensemble visualization solutions through partial automation. We focus on applications where the user is interested in preserving selected spatial characteristics of the data as much as possible, even when many ensemble members are to be jointly studied using comparative visualization. In our model, we separate the design challenge into a minimal set of user-specified parameters and an optimization component for the automatic configuration of the remaining design variables. We provide an illustrated formal description of our model and exemplify our approach in the context of several application examples from different domains in order to demonstrate its generality within the class of comparative visualization problems for spatial data ensembles.

  2. GRace: a MATLAB-based application for fitting the discrimination-association model.

    PubMed

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  3. Statistical Time Series Models of Pilot Control with Applications to Instrument Discrimination

    NASA Technical Reports Server (NTRS)

    Altschul, R. E.; Nagel, P. M.; Oliver, F.

    1984-01-01

    This report contains a general description of the methodology used to obtain the transfer function models and to verify model fidelity, frequency-domain plots of the modeled transfer functions, numerical results from an analysis of poles and zeros obtained from z-plane to s-plane conversions of the transfer functions, and the results of a study on the sequential introduction of other variables, both exogenous and endogenous, into the loop.

  4. Model structure of the stream salmonid simulator (S3)—A dynamic model for simulating growth, movement, and survival of juvenile salmonids

    USGS Publications Warehouse

    Perry, Russell W.; Plumb, John M.; Jones, Edward C.; Som, Nicholas A.; Hetrick, Nicholas J.; Hardy, Thomas B.

    2018-04-06

    Fisheries and water managers often use population models to aid in understanding the effect of alternative water management or restoration actions on anadromous fish populations. We developed the Stream Salmonid Simulator (S3) to help resource managers evaluate the effect of management alternatives on juvenile salmonid populations. S3 is a deterministic stage-structured population model that tracks daily growth, movement, and survival of juvenile salmon. A key theme of the model is that river flow affects habitat availability and capacity, which in turn drives density dependent population dynamics. To explicitly link population dynamics to habitat quality and quantity, the river environment is constructed as a one-dimensional series of linked habitat units, each of which has an associated daily time series of discharge, water temperature, and usable habitat area or carrying capacity. The physical characteristics of each habitat unit and the number of fish occupying each unit, in turn, drive survival and growth within each habitat unit and movement of fish among habitat units. The purpose of this report is to outline the underlying general structure of the S3 model that is common among different applications of the model. We have developed applications of the S3 model for juvenile fall Chinook salmon (Oncorhynchus tshawytscha) in the lower Klamath River. Thus, this report is a companion to the current application of the S3 model to the Trinity River (in review). The general S3 model structure provides a biological and physical framework for the salmonid freshwater life cycle. This framework captures important demographics of juvenile salmonids aimed at translating management alternatives into simulated population responses. Although the S3 model is built on this common framework, the model has been constructed to allow much flexibility in application of the model to specific river systems. The ability for practitioners to include system-specific information for the physical stream structure, survival, growth, and movement processes ensures that simulations provide results that are relevant to the questions asked about the population under study.

  5. The Expected Sample Variance of Uncorrelated Random Variables with a Common Mean and Some Applications in Unbalanced Random Effects Models

    ERIC Educational Resources Information Center

    Vardeman, Stephen B.; Wendelberger, Joanne R.

    2005-01-01

    There is a little-known but very simple generalization of the standard result that for uncorrelated random variables with common mean μ and variance σ², the expected value of the sample variance is σ². The generalization justifies the use of the usual standard error of the sample mean in possibly…
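The standard result quoted above is easy to check numerically. A minimal Monte Carlo sketch (using independent normal draws as a special case of uncorrelated variables; all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
n, reps = 5, 200_000

# Independent draws with common mean mu and variance sigma^2.
x = rng.normal(mu, sigma, size=(reps, n))

# Unbiased sample variance of each replicate; its average estimates
# E[S^2], which the standard result says equals sigma^2.
s2 = x.var(axis=1, ddof=1)
print(s2.mean())  # close to sigma**2 = 9
```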

  6. Mobile applications in oncology: is it possible for patients and healthcare professionals to easily identify relevant tools?

    PubMed

    Brouard, Benoit; Bardo, Pascale; Bonnet, Clément; Mounier, Nicolas; Vignot, Marina; Vignot, Stéphane

    2016-11-01

    Mobile applications represent promising tools in the management of chronic diseases, both for patients and healthcare professionals, and especially in oncology. Among the large number of mobile health (mhealth) applications available in mobile stores, it can be difficult for users to identify the most relevant ones. This study evaluated the business model and the scientific validation for mobile applications related to oncology. A systematic review was performed of the two major marketplaces. Purpose, scientific validation, and source of funding were evaluated according to the description of applications in stores. Results were stratified according to targeted audience (general population/patients/healthcare professionals). Five hundred and thirty-nine applications related to oncology were identified: 46.8% dedicated to healthcare professionals, 31.5% to general population, and 21.7% to patients. A lack of information about healthcare professionals' involvement in the development process was noted since only 36.5% of applications mentioned explicit scientific validation. Most apps were free (72.2%) and without explicit support by industry (94.2%). There is a need to enforce independent review of mhealth applications in oncology. The economic model could be questioned and the source of funding should be clarified. Meanwhile, patients and healthcare professionals should remain cautious about applications' contents. Key messages: A systematic review was performed to describe the mobile applications related to oncology and it revealed a lack of information on scientific validation and funding. Independent scientific review and the reporting of conflicts of interest should be encouraged. Users, and all health professionals, should be aware that health applications, whatever the quality of their content, do not actually embrace such an approach.

  7. Using color histogram normalization for recovering chromatic illumination-changed images.

    PubMed

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates. We propose a simplified affine model for applications with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing simple linear-algebra operations, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model, without the use of skew rotation, is better than the general affine model for such applications.
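As a toy illustration of recovering illumination parameters by matching color-distribution moments (this sketch covers only the translation and scaling parts of a simplified affine model, not the rotation term or the authors' covariance/tensor machinery; all values are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "R-G-B" pixel samples of a reference image.
x = rng.normal([120, 100, 80], [20, 15, 10], size=(10_000, 3))

# Illumination change modelled as per-channel scaling plus translation
# (the rotation part of the simplified affine model is omitted here).
scale_true = np.array([1.3, 0.9, 1.1])
shift_true = np.array([10.0, -5.0, 8.0])
y = x * scale_true + shift_true

# Moment matching: scaling from std ratios, translation from means.
scale_hat = y.std(axis=0) / x.std(axis=0)
shift_hat = y.mean(axis=0) - scale_hat * x.mean(axis=0)
```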

  8. Finite-Size Scaling of a First-Order Dynamical Phase Transition: Adaptive Population Dynamics and an Effective Model

    NASA Astrophysics Data System (ADS)

    Nemoto, Takahiro; Jack, Robert L.; Lecomte, Vivien

    2017-03-01

    We analyze large deviations of the time-averaged activity in the one-dimensional Fredrickson-Andersen model, both numerically and analytically. The model exhibits a dynamical phase transition, which appears as a singularity in the large deviation function. We analyze the finite-size scaling of this phase transition numerically, by generalizing an existing cloning algorithm to include a multicanonical feedback control: this significantly improves the computational efficiency. Motivated by these numerical results, we formulate an effective theory for the model in the vicinity of the phase transition, which accounts quantitatively for the observed behavior. We discuss potential applications of the numerical method and the effective theory in a range of more general contexts.

  9. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heartbeat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
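The generalized likelihood ratio idea can be sketched for the simplest template family, a piecewise-constant rate on fixed bins tested against a constant rate (a simplified stand-in for the paper's template scheme; bin edges and data below are made up):

```python
import numpy as np

def glr_piecewise_vs_constant(events, edges):
    """2*(log-lik of piecewise-constant MLE rate - log-lik of constant MLE
    rate) for a Poisson process observed on [edges[0], edges[-1]]."""
    counts, _ = np.histogram(events, bins=edges)
    widths = np.diff(edges)
    lam_hat = counts / widths            # per-bin MLE rates
    lam0 = counts.sum() / widths.sum()   # constant-rate MLE
    with np.errstate(divide="ignore", invalid="ignore"):
        ll_alt = np.where(counts > 0, counts * np.log(lam_hat), 0.0).sum()
    # The -sum(lam*width) terms equal -N under both MLEs and cancel.
    ll_null = counts.sum() * np.log(lam0)
    return 2.0 * (ll_alt - ll_null)

rng = np.random.default_rng(2)
# Homogeneous events: GLR should be small; a rate jump makes it large.
t_homog = rng.uniform(0, 10, 200)
t_jump = np.concatenate([rng.uniform(0, 5, 50), rng.uniform(5, 10, 300)])
edges = np.linspace(0, 10, 6)
```

Since the piecewise-constant family nests the constant rate, the statistic is always nonnegative, and it grows sharply when the true rate is nonconstant.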

  10. Non-extensitivity vs. informative moments for financial models —A unifying framework and empirical results

    NASA Astrophysics Data System (ADS)

    Herrmann, K.

    2009-11-01

    Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, two very similar approaches have evolved in recent years, one in so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework, which allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models, with similar flexibility in terms of skewness and kurtosis, lead to very similar results.

  11. Network representations of angular regions for electromagnetic scattering

    PubMed Central

    2017-01-01

    Network modeling in electromagnetics is an effective technique for treating scattering problems involving canonical and complex structures. Geometries constituted of angular regions (wedges) together with planar layers can now be approached with the Generalized Wiener-Hopf Technique supported by network representations in the spectral domain. Although these spectral-domain network representations are important in their own right, the aim of this paper is to present a theoretical basis and a general procedure for formulating complex scattering problems using network representations for the Generalized Wiener-Hopf Technique, starting from the wave equation. In particular, while spectral network representations are relatively well known for planar layers, network modeling of an angular region requires a new theory, which is developed in this paper. With this theory we complete the formulation of a network methodology whose effectiveness is demonstrated by application to a complex scattering problem, with practical solutions given in terms of GTD/UTD diffraction coefficients and total far fields for engineering applications. The methodology can also be applied to other fields of physics. PMID:28817573

  12. Adaptive transmission disequilibrium test for family trio design.

    PubMed

    Yuan, Min; Tian, Xin; Zheng, Gang; Yang, Yaning

    2009-01-01

    The transmission disequilibrium test (TDT) is a standard method to detect association using the family trio design. It is optimal for an additive genetic model. Other TDT-type tests optimal for recessive and dominant models have also been developed. Association tests using family data, including the TDT-type statistics, have been unified into a class of more comprehensive and flexible family-based association tests (FBAT). TDT-type tests have high efficiency when the genetic model is known or correctly specified, but may lose power if the model is mis-specified. Hence tests that are robust to genetic model mis-specification yet efficient are preferred. The constrained likelihood ratio test (CLRT) and the MAX-type test have been shown to be efficiency robust. In this paper we propose a new efficiency-robust procedure, referred to as the adaptive TDT (aTDT). It uses the Hardy-Weinberg disequilibrium coefficient to identify the potential genetic model underlying the data and then applies the TDT-type test (or FBAT for general applications) corresponding to the selected model. Simulation demonstrates that aTDT is efficiency robust to model mis-specification and generally outperforms the MAX test and CLRT in terms of power. We also show that aTDT has power close to that of the optimal TDT-type test based on a single genetic model while being much more robust. Applications to real and simulated data from the Genetic Analysis Workshop (GAW) illustrate the use of our adaptive TDT.
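The model-selection step can be sketched as follows. The Hardy-Weinberg disequilibrium coefficient is computed from genotype counts in affected offspring; the sign-to-model mapping (positive toward recessive, negative toward dominant) and the threshold value are illustrative assumptions here, not the paper's calibrated rule:

```python
def hwd_coefficient(n_AA, n_Aa, n_aa):
    """Hardy-Weinberg disequilibrium coefficient: delta = p_AA - p_A**2,
    computed from genotype counts."""
    n = n_AA + n_Aa + n_aa
    p_AA = n_AA / n
    p_A = (2 * n_AA + n_Aa) / (2 * n)
    return p_AA - p_A ** 2

def select_genetic_model(n_AA, n_Aa, n_aa, c=0.01):
    """Illustrative selection rule (the threshold c is a made-up tuning
    value): excess homozygosity points toward a recessive model, a deficit
    toward dominant, and values near zero default to the additive TDT."""
    d = hwd_coefficient(n_AA, n_Aa, n_aa)
    if d > c:
        return "recessive"
    if d < -c:
        return "dominant"
    return "additive"
```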

  13. Development of the GEOS-5 Atmospheric General Circulation Model: Evolution from MERRA to MERRA2.

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Takacs, Lawrence; Suarez, Max; Bacmeister, Julio

    2014-01-01

    The Modern-Era Retrospective Analysis for Research and Applications-2 (MERRA2) version of the GEOS-5 (Goddard Earth Observing System Model - 5) Atmospheric General Circulation Model (AGCM) is currently in use in the NASA Global Modeling and Assimilation Office (GMAO) at a wide range of resolutions for a variety of applications. Details of the changes in parameterizations subsequent to the version in the original MERRA reanalysis are presented here. Results of a series of atmosphere-only sensitivity studies are shown to demonstrate changes in simulated climate associated with specific changes in physical parameterizations, and the impact of the newly implemented resolution-aware behavior on simulations at different resolutions is demonstrated. The GEOS-5 AGCM presented here is the model used as part of the GMAO's MERRA2 reanalysis, the global mesoscale "nature run", the real-time numerical weather prediction system, and for atmosphere-only, coupled ocean-atmosphere and coupled atmosphere-chemistry simulations. The seasonal mean climate of the MERRA2 version of the GEOS-5 AGCM represents a substantial improvement over the simulated climate of the MERRA version at all resolutions and for all applications. Fundamental improvements in simulated climate are associated with the increased re-evaporation of frozen precipitation and cloud condensate, resulting in a wetter atmosphere. Improvements in simulated climate are also shown to be attributable to changes in the background gravity wave drag, and to upgrades in the relationship between the ocean surface stress and the ocean roughness. The series of "resolution aware" parameters related to the moist physics were shown to result in improvements at higher resolutions, and result in AGCM simulations that exhibit seamless behavior across different resolutions and applications.

  14. Biohybrid Control of General Linear Systems Using the Adaptive Filter Model of Cerebellum.

    PubMed

    Wilson, Emma D; Assaf, Tareq; Pearson, Martin J; Rossiter, Jonathan M; Dean, Paul; Anderson, Sean R; Porrill, John

    2015-01-01

    The adaptive filter model of the cerebellar microcircuit has been successfully applied to biological motor control problems, such as the vestibulo-ocular reflex (VOR), and to sensory processing problems, such as the adaptive cancelation of reafferent noise. It has also been successfully applied to problems in robotics, such as adaptive camera stabilization and sensor noise cancelation. In previous applications to inverse control problems, the algorithm was applied to the velocity control of a plant dominated by viscous and elastic elements. Naive application of the adaptive filter model to the displacement (as opposed to velocity) control of this plant results in unstable learning and control. To be more generally useful in engineering problems, it is essential to remove this restriction to enable the stable control of plants of any order. We address this problem here by developing a biohybrid model reference adaptive control (MRAC) scheme, which stabilizes the control algorithm for strictly proper plants. We evaluate the performance of this novel cerebellar-inspired algorithm with MRAC scheme in the experimental control of a dielectric electroactive polymer, a class of artificial muscle. The results show that the augmented cerebellar algorithm is able to accurately control the displacement response of the artificial muscle. The proposed solution not only greatly extends the practical applicability of the cerebellar-inspired algorithm, but may also shed light on cerebellar involvement in a wider range of biological control tasks.
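At its core, the adaptive filter model is a linear filter trained by a gradient (LMS-style) rule. A minimal sketch of LMS identifying an unknown FIR "plant" (a simplified illustration of the filter component only, not the paper's biohybrid MRAC scheme; all values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
plant = np.array([0.5, -0.3, 0.2])   # unknown FIR "plant" to identify
w = np.zeros(3)                      # adaptive filter weights
mu = 0.05                            # learning rate

x = rng.standard_normal(20_000)
for t in range(3, len(x)):
    u = x[t-3:t][::-1]               # most recent 3 inputs, newest first
    d = plant @ u                    # desired (plant) output
    e = d - w @ u                    # prediction error
    w += mu * e * u                  # LMS weight update
```

With white input and no measurement noise, the weights converge to the plant coefficients.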

  15. Safe Upper-Bounds Inference of Energy Consumption for Java Bytecode Applications

    NASA Technical Reports Server (NTRS)

    Navas, Jorge; Mendez-Lojo, Mario; Hermenegildo, Manuel V.

    2008-01-01

    Many space applications such as sensor networks, on-board satellite-based platforms, on-board vehicle monitoring systems, etc. handle large amounts of data, and analysis of such data is often critical for the scientific mission. Transmitting such large amounts of data to the remote control station for analysis is usually too expensive for time-critical applications. Instead, modern space applications are increasingly relying on autonomous on-board data analysis. All these applications face many resource constraints. A key requirement is to minimize energy consumption. Several approaches have been developed for estimating the energy consumption of such applications (e.g. [3, 1]) based on measuring actual consumption at run-time for large sets of random inputs. However, this approach has the limitation that it is in general not possible to cover all possible inputs. Using formal techniques offers the potential for inferring safe energy consumption bounds, thus being especially interesting for space exploration and safety-critical systems. We have proposed and implemented a general framework for resource usage analysis of Java bytecode [2]. The user defines a set of resource(s) of interest to be tracked and some annotations that describe the cost of some elementary elements of the program for those resources. These values can be constants or, more generally, functions of the input data sizes. The analysis then statically derives an upper bound on the amount of those resources that the program as a whole will consume or provide, also as functions of the input data sizes. This article develops a novel application of the analysis of [2] to inferring safe upper bounds on the energy consumption of Java bytecode applications. We first use a resource model that describes the cost of each bytecode instruction in terms of the joules it consumes. With this resource model, we then generate energy consumption cost relations, which are then used to infer safe upper bounds.
How energy consumption for each bytecode instruction is measured is beyond the scope of this paper. Instead, this paper is about how to infer safe energy consumption estimations assuming that those energy consumption costs are provided. For concreteness, we use a simplified version of an existing resource model [1] in which an energy consumption cost for individual Java opcodes is defined.
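The flavor of such a cost relation can be sketched with a toy loop. The opcode names and nanojoule costs below are invented for illustration and are not taken from any real resource model; the framework described above derives such bounds automatically as functions of input data sizes:

```python
# Hypothetical per-opcode energy costs in nanojoules (illustrative values).
OPCODE_NJ = {"iload": 1.0, "iadd": 1.2, "istore": 1.1,
             "if_icmplt": 1.5, "iinc": 0.9, "aload": 1.0, "iaload": 2.0}

# Instruction sequences for a toy array-summing loop.
LOOP_BODY = ["iload", "aload", "iinc", "iaload", "iadd", "istore", "if_icmplt"]
HEADER = ["iload", "istore"]

def energy_upper_bound(n):
    """Closed-form upper bound E(n) = n * cost(body) + cost(header)
    for a loop that runs at most n iterations."""
    body = sum(OPCODE_NJ[op] for op in LOOP_BODY)
    header = sum(OPCODE_NJ[op] for op in HEADER)
    return n * body + header
```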

  16. Bridge deck service life prediction and costs.

    DOT National Transportation Integrated Search

    2007-01-01

    The service life of Virginia's concrete bridge decks is generally controlled by chloride-induced corrosion of the reinforcing steel as a result of the application of winter maintenance deicing salts. A chloride corrosion model accounting for the vari...

  17. The Electrocardiogram as an Example of Electrostatics

    ERIC Educational Resources Information Center

    Hobbie, Russell K.

    1973-01-01

    Develops a simplified electrostatic model of the heart with conduction within the torso neglected to relate electrocardiogram patterns to the charge distribution within the myocardium. Suggests its application to explanation of Coulomb's law in general physics. (CC)

  18. Multichannel ECG and Noise Modeling: Application to Maternal and Fetal ECG Signals

    NASA Astrophysics Data System (ADS)

    Sameni, Reza; Clifford, Gari D.; Jutten, Christian; Shamsollahi, Mohammad B.

    2007-12-01

    A three-dimensional dynamic model of the electrical activity of the heart is presented. The model is based on the single dipole model of the heart and is later related to the body surface potentials through a linear model which accounts for the temporal movements and rotations of the cardiac dipole, together with a realistic ECG noise model. The proposed model is also generalized to maternal and fetal ECG mixtures recorded from the abdomen of pregnant women in single and multiple pregnancies. The applicability of the model for the evaluation of signal processing algorithms is illustrated using independent component analysis. Considering the difficulties and limitations of recording long-term ECG data, especially from pregnant women, the model described in this paper may serve as an effective means of simulation and analysis of a wide range of ECGs, including adults and fetuses.
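The linear projection at the heart of the single-dipole formulation can be sketched as follows; the dipole trajectory, lead-field matrix, and noise level are arbitrary stand-ins rather than the paper's fitted dynamic model:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 500)           # one cardiac cycle, arbitrary units

# Toy 3-D dipole trajectory (illustrative shapes only).
s = np.stack([np.sin(2 * np.pi * t) ** 3,
              np.sin(2 * np.pi * t + 0.5),
              0.3 * np.cos(2 * np.pi * t)])    # shape (3, T)

H = rng.standard_normal((8, 3))      # fixed lead-field (projection) matrix
noise = 0.01 * rng.standard_normal((8, t.size))
y = H @ s + noise                    # 8 surface "leads"
```

Because the noiseless leads are linear mixtures of a 3-D source, they span at most a rank-3 subspace, which is what makes source-separation methods such as ICA applicable to such mixtures.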

  19. Different Manhattan project: automatic statistical model generation

    NASA Astrophysics Data System (ADS)

    Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore

    2002-03-01

    We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.

  20. Fuzzy set methods for object recognition in space applications

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1991-01-01

    During the reporting period, the development of the theory and application of methodologies for decision making under uncertainty was addressed. Two subreports are included: the first covers properties of general hybrid operators, while the second considers new research on generalized threshold logic units. In the first part, the properties of the additive gamma-model, in which the intersection part is taken to be the product of the input values and the union part is obtained by extending De Morgan's law to fuzzy sets, are explored. Then Yager's class of union and intersection operators is used in the additive gamma-model. The inputs are weighted to some power that represents their importance and thus their contribution to the compensation process. In the second part, the extension of binary logic synthesis methods to multiple-valued logic, enabling the synthesis of decision networks whose input/output variables are not binary, is discussed.
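The additive gamma-model can be sketched directly. Here gamma interpolates between a product intersection and its De Morgan dual union; the importance-weighting powers mentioned above are omitted for brevity:

```python
import numpy as np

def additive_gamma(x, gamma):
    """Additive gamma-model: a convex mix of a product intersection and
    its De Morgan dual union. gamma = 0 gives pure AND, gamma = 1 pure OR."""
    x = np.asarray(x, dtype=float)
    t = x.prod()                   # intersection (product t-norm)
    s = 1.0 - (1.0 - x).prod()     # union via De Morgan's law
    return (1.0 - gamma) * t + gamma * s
```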

  1. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  2. A High-Level Language for Rule-Based Modelling

    PubMed Central

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D.

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  3. Unraveling the Mystery of the Origin of Mathematical Problems: Using a Problem-Posing Framework with Prospective Mathematics Teachers

    ERIC Educational Resources Information Center

    Contreras, Jose

    2007-01-01

    In this article, I model how a problem-posing framework can be used to enhance our abilities to systematically generate mathematical problems by modifying the attributes of a given problem. The problem-posing model calls for the application of the following fundamental mathematical processes: proving, reversing, specializing, generalizing, and…

  4. Computation of turbulent flows-state-of-the-art, 1970

    NASA Technical Reports Server (NTRS)

    Reynolds, W. C.

    1972-01-01

    The state-of-the-art of turbulent flow computation is surveyed. The formulations were generalized to increase the range of their applicability, and the excitement of current debate on equation models was brought into the review. Some new ideas on the modeling of the pressure-strain term in the Reynolds stress equations are also suggested.

  5. Analyzing Company Economics Using the Leontief Open Production Model

    ERIC Educational Resources Information Center

    Laumakis, Paul J.

    2008-01-01

    This article details the application of an economic theory to the fiscal operation of a small engineering consulting firm. Nobel Prize-winning economist Wassily Leontief developed his general input-output economic theory in the mid-twentieth century to describe the flow of goods and services in the U.S. economy. We use one mathematical model that…
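Leontief's open model solves x = Ax + d for gross sector outputs x, given an inter-industry consumption matrix A and external demand d. A minimal numpy sketch with a hypothetical three-sector economy (the coefficients are invented for illustration):

```python
import numpy as np

# Hypothetical consumption matrix: A[i, j] is the dollar amount of sector
# i's output needed to produce one dollar of sector j's output.
A = np.array([[0.2, 0.1, 0.0],
              [0.1, 0.3, 0.2],
              [0.0, 0.2, 0.1]])
d = np.array([100.0, 50.0, 80.0])    # external (open-sector) demand

# Leontief open model: total output x satisfies x = A x + d,
# i.e. x = (I - A)^{-1} d.
x = np.linalg.solve(np.eye(3) - A, d)
```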

  6. Diagnostic Classification Models and Multidimensional Adaptive Testing: A Commentary on Rupp and Templin

    ERIC Educational Resources Information Center

    Frey, Andreas; Carstensen, Claus H.

    2009-01-01

    On a general level, the objective of diagnostic classification models (DCMs) lies in a classification of individuals regarding multiple latent skills. In this article, the authors show that this objective can be achieved by multidimensional adaptive testing (MAT) as well. The authors discuss whether or not the restricted applicability of DCMs can…

  7. Modeling Expert Behavior in Support of an Adaptive Psychomotor Training Environment: A Marksmanship Use Case

    ERIC Educational Resources Information Center

    Goldberg, Benjamin; Amburn, Charles; Ragusa, Charlie; Chen, Dar-Wei

    2018-01-01

    The U.S. Army is interested in extending the application of intelligent tutoring systems (ITS) beyond cognitive problem spaces and into psychomotor skill domains. In this paper, we present a methodology and validation procedure for creating expert model representations in the domain of rifle marksmanship. GIFT (Generalized Intelligent Framework…

  8. On the accuracy of models for predicting sound propagation in fitted rooms.

    PubMed

    Hodgson, M

    1990-08-01

    The objective of this article is to make a contribution to the evaluation of the accuracy and applicability of models for predicting the sound propagation in fitted rooms such as factories, classrooms, and offices. The models studied are 1:50 scale models; the method-of-image models of Jovicic, Lindqvist, Hodgson, Kurze, and of Lemire and Nicolas; the empirical formula of Friberg; and Ondet and Barbry's ray-tracing model. Sound propagation predictions by the analytic models are compared with the results of sound propagation measurements in a 1:50 scale model and in a warehouse, both containing various densities of approximately isotropically distributed, rectangular-parallelepipedic fittings. The results indicate that the models of Friberg and of Lemire and Nicolas are fundamentally incorrect. While more generally applicable versions exist, the versions of the models of Jovicic and Kurze studied here are found to be of limited applicability since they ignore vertical-wall reflections. The Hodgson and Lindqvist models appear to be accurate in certain limited cases. This preliminary study found the ray-tracing model of Ondet and Barbry to be the most accurate of all the cases studied. Furthermore, it has the necessary flexibility with respect to room geometry, surface-absorption distribution, and fitting distribution. It appears to be the model with the greatest applicability to fitted-room sound propagation prediction.

  9. Aspects géométriques et intégrables des modèles de matrices aléatoires

    NASA Astrophysics Data System (ADS)

    Marchal, Olivier

    2010-12-01

    This thesis deals with the geometric and integrable aspects associated with random matrix models. Its purpose is to provide various applications of random matrix theory, from algebraic geometry to partial differential equations of integrable systems. The variety of these applications shows why matrix models are important from a mathematical point of view. First, the thesis will focus on the study of the merging of two intervals of the eigenvalues density near a singular point. Specifically, we will show why this special limit gives universal equations from the Painlevé II hierarchy of integrable systems theory. Then, following the approach of (bi) orthogonal polynomials introduced by Mehta to compute partition functions, we will find Riemann-Hilbert and isomonodromic problems connected to matrix models, making the link with the theory of Jimbo, Miwa and Ueno. In particular, we will describe how the Hermitian two-matrix models provide a degenerate case of Jimbo-Miwa-Ueno's theory that we will generalize in this context. Furthermore, the loop equations method, with its central notions of spectral curve and topological expansion, will lead to the symplectic invariants of algebraic geometry recently proposed by Eynard and Orantin. This last point will be generalized to the case of non-Hermitian matrix models (arbitrary beta), paving the way to "quantum algebraic geometry" and to the generalization of symplectic invariants to "quantum curves". Finally, this set-up will be applied to combinatorics in the context of topological string theory, with the explicit computation of a Hermitian random matrix model enumerating the Gromov-Witten invariants of a toric Calabi-Yau threefold.

  10. An Analysis of the INGRES Database Management System Applications Program Development Tools and Programming Environment

    DTIC Science & Technology

    1986-12-01

    Position cursor over the name of a report, then use the appropriate menu item to perform an operation on that report. Name Owner RBF? Last changed...LANGUAGE-INDEPENDENT, PORTABLE FILE ACCESS SYSTEM A MODEL FOR AUTOMATIC FILE AND PROGRAM DESIGN IN BUSINESS APPLICATION SYSTEM GENERALLY APPLICABLE...Article Description Year: 1988 Title: FLASH: A LANGUAGE-INDEPENDENT, PORTABLE FILE ACCESS SYSTEM Authors: ALLCHIN, J.E., KELLER, A.M., WIEDERHOLD, G.

  11. Fifth Conference on Artificial Intelligence for Space Applications

    NASA Technical Reports Server (NTRS)

    Odell, Steve L. (Compiler)

    1990-01-01

    The Fifth Conference on Artificial Intelligence for Space Applications brings together diverse technical and scientific work in order to help those who employ AI methods in space applications to identify common goals and to address issues of general interest in the AI community. Topics include the following: automation for Space Station; intelligent control, testing, and fault diagnosis; robotics and vision; planning and scheduling; simulation, modeling, and tutoring; development tools and automatic programming; knowledge representation and acquisition; and knowledge base/data base integration.

  12. Soliton Gases and Generalized Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien

    2018-01-01

    We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.

  13. Overview of NASARTI (NASA Radiation Track Image) Program: Highlights of the Model Improvement and the New Results

    NASA Technical Reports Server (NTRS)

    Ponomarev, Artem L.; Plante, I.; George, Kerry; Cornforth, M. N.; Loucas, B. D.; Wu, Honglu

    2014-01-01

    This presentation summarizes several years of research by the co-authors in developing the NASARTI (NASA Radiation Track Image) program and supporting it with scientific data. The goal of the program is to support NASA's mission of safe human space travel despite the perils of space radiation. The program focuses on selected topics in radiation biology that were deemed important throughout this period, both for the NASA human space flight program and for academic radiation research. Besides scientific support for developing strategies to protect humans from exposure to deep space radiation during space missions, and for understanding the health effects of space radiation on astronauts, other important ramifications of ionizing radiation were studied for their applicability to broader human needs: understanding the origins of cancer, the impact on the human genome, and the application of computer technology to biological research addressing the health of the general population. The models under the NASARTI project include: the general properties of ionizing radiation, such as particle track structure; the effects of radiation on human DNA; visualization and the statistical properties of DSBs (DNA double-strand breaks); models of DNA damage and repair pathways and cell phenotypes; chromosomal aberrations; microscopy data analysis; and applications to human tissue damage and cancer models. The development of the GUI and the interactive website, as deliverables to NASA operations teams and tools for a broader research community, is discussed. The most recent findings in the area of chromosomal aberrations and the application of the stochastic track structure are also presented.

  14. A general structure-property relationship to predict the enthalpy of vaporisation at ambient temperatures.

    PubMed

    Oberg, T

    2007-01-01

    The vapour pressure is the most important property of an anthropogenic organic compound in determining its partitioning between the atmosphere and the other environmental media. The enthalpy of vaporisation quantifies the temperature dependence of the vapour pressure and its value around 298 K is needed for environmental modelling. The enthalpy of vaporisation can be determined by different experimental methods, but estimation methods are needed to extend the current database and several approaches are available from the literature. However, these methods have limitations, such as a need for other experimental results as input data, a limited applicability domain, a lack of domain definition, and a lack of predictive validation. Here we have attempted to develop a quantitative structure-property relationship (QSPR) that has general applicability and is thoroughly validated. Enthalpies of vaporisation at 298 K were collected from the literature for 1835 pure compounds. The three-dimensional (3D) structures were optimised and each compound was described by a set of computationally derived descriptors. The compounds were randomly assigned into a calibration set and a prediction set. Partial least squares regression (PLSR) was used to estimate a low-dimensional QSPR model with 12 latent variables. The predictive performance of this model, within the domain of application, was estimated at n=560, q2Ext=0.968 and s=0.028 (log transformed values). The QSPR model was subsequently applied to a database of 100,000+ structures, after a similar 3D optimisation and descriptor generation. Reliable predictions can be reported for compounds within the previously defined applicability domain.
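
    The calibration/external-validation workflow described above can be sketched with a minimal NIPALS PLS1 regression in numpy. The synthetic descriptor matrix, noise level, and component count below are illustrative assumptions, not the paper's actual data or its 12-latent-variable model:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1: returns regression coefficients for centred data."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector for this latent variable
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # loadings
        q = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)        # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))                       # synthetic descriptor matrix
y = X[:, :5].sum(axis=1) + 0.05 * rng.normal(size=200)  # surrogate log-enthalpy

X_cal, y_cal = X[:150], y[:150]                      # calibration set
X_ext, y_ext = X[150:], y[150:]                      # external prediction set

b = pls1_fit(X_cal, y_cal, n_components=5)
y_hat = (X_ext - X_cal.mean(0)) @ b + y_cal.mean()
q2_ext = 1 - np.sum((y_ext - y_hat) ** 2) / np.sum((y_ext - y_ext.mean()) ** 2)
```

    Here `q2_ext` plays the role of the externally validated predictive R² reported in the abstract: it is computed only on compounds held out of calibration.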

  15. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially for modelling the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. Rich varieties of advanced and recent statistical models are mostly available in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical models are readily available, accessible, and applicable on the web. We have previously made interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate model for the data.

  16. Modeling Piezoelectric Stack Actuators for Control of Micromanipulation

    NASA Technical Reports Server (NTRS)

    Goldfarb, Michael; Celanovic, Nikola

    1997-01-01

    A nonlinear lumped-parameter model of a piezoelectric stack actuator has been developed to describe actuator behavior for purposes of control system analysis and design, and, in particular, for microrobotic applications requiring accurate position and/or force control. In formulating this model, the authors propose a generalized Maxwell resistive capacitor as a lumped-parameter causal representation of rate-independent hysteresis. Model formulation is validated by comparing results of numerical simulations to experimental data. Validation is followed by a discussion of model implications for purposes of actuator control.
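
    The generalized Maxwell resistive capacitor models rate-independent hysteresis; its mechanical analogue, a bank of elasto-slide (Maxwell-slip) elements, can be sketched as follows. The element stiffnesses, break-away limits, and drive profile are arbitrary illustrative values, not parameters from the paper:

```python
import numpy as np

k = np.array([1.0, 1.0, 1.0])      # element stiffnesses (illustrative)
f = np.array([0.2, 0.5, 1.0])      # break-away force of each element

def step(dx, z):
    """Advance element deformations by input increment dx, slipping at limits."""
    return np.clip(z + dx, -f / k, f / k)

z = np.zeros(3)                    # internal deformation states
for _ in range(100):               # load from x = 0.0 to x = 1.0
    z = step(0.01, z)
force_up = (k * z).sum()           # total force at x = 1.0 while loading

for _ in range(100):               # continue loading to x = 2.0
    z = step(0.01, z)
for _ in range(100):               # unload back to x = 1.0
    z = step(-0.01, z)
force_down = (k * z).sum()         # force at the same x = 1.0 while unloading
```

    Because the slipped element states depend on the input history but not on its rate, `force_up` and `force_down` differ at the same displacement, which is exactly the rate-independent hysteresis such a lumped-parameter element is meant to capture.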

  17. TLS from fundamentals to practice

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Adams, Paul D.

    2014-01-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements, introduced in crystallography by Schomaker & Trueblood (1968), is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples, simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as select algorithmic details for practical application. An extensive list of references to applications of TLS in macromolecular crystallography refinement is provided. PMID:25249713

  18. Are Delinquents Different? Predictive Patterns for Low, Mid and High Delinquency Levels in a General Youth Sample via the HEW Youth Development Model's Impact Scales.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The Health, Education and Welfare (HEW) Office of Youth Development's National Strategy for Youth Development model was promoted as a community-based planning and procedural tool for enhancing positive youth development and reducing delinquency. To test the applicability of the model as a function of delinquency level, the program's Impact Scales…

  19. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
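
    A hypothetical token-game sketch (place and transition names invented for illustration, not taken from the paper) shows the marking bookkeeping at the core of such a Petri-net model of data/control flow:

```python
# Marking: tokens per place; transition: (tokens consumed, tokens produced).
marking = {"data_ready": 1, "processor_free": 1, "result": 0}
transitions = {
    "compute": ({"data_ready": 1, "processor_free": 1},
                {"result": 1, "processor_free": 1}),
}

def enabled(name, m):
    """A transition is enabled when every input place holds enough tokens."""
    inputs, _ = transitions[name]
    return all(m[p] >= n for p, n in inputs.items())

def fire(name, m):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    inputs, outputs = transitions[name]
    m = dict(m)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] += n
    return m

after = fire("compute", marking)
```

    In a data-driven architecture model, token availability encodes when an operation may execute concurrently with others; here `compute` is enabled before firing and disabled afterwards because its input data token has been consumed.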

  20. The Prediction of Length-of-day Variations Based on Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Lei, Y.; Zhao, D. N.; Gao, Y. P.; Cai, H. B.

    2015-01-01

    Due to the complicated time-varying characteristics of the length-of-day (LOD) variations, the accuracies of traditional strategies for the prediction of the LOD variations, such as the least squares extrapolation model and the time-series analysis model, have not met the requirements of real-time and high-precision applications. In this paper, a new machine learning algorithm, the Gaussian process (GP) model, is employed to forecast the LOD variations. Its prediction precision is analyzed and compared with those of the back propagation neural network (BPNN) and general regression neural network (GRNN) models, and with the results of the Earth Orientation Parameters Prediction Comparison Campaign (EOP PCC). The results demonstrate that the application of the GP model to the prediction of the LOD variations is efficient and feasible.
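
    The heart of GP-based forecasting is the posterior mean m(t*) = k(t*, T)(K + σ²I)⁻¹y. A minimal numpy sketch on a synthetic periodic series illustrates it; the RBF kernel, length-scale, and noise level are assumed here for illustration and are not the paper's actual choices:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Synthetic LOD-like series: smooth periodic signal plus small noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 60)
y = np.sin(t) + 0.01 * rng.normal(size=t.size)

sigma2 = 1e-4                        # assumed observation-noise variance
K = rbf(t, t) + sigma2 * np.eye(t.size)
alpha = np.linalg.solve(K, y)        # (K + sigma^2 I)^{-1} y

t_star = np.array([5.05, 10.2])      # one interpolation, one short extrapolation
y_star = rbf(t_star, t) @ alpha      # GP posterior mean at the query times
```

    Interpolated values are recovered essentially to the noise level, while short-range extrapolation degrades gracefully, which is the regime relevant to short-term LOD prediction.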

  1. Generalized multiscale finite-element method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai; Fu, Shubin; Gibson, Richard L.

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as the finite-difference method and the finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine-scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with the conventional finite-element method for the wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulations of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.

  2. Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai, E-mail: kaigao87@gmail.com; Fu, Shubin, E-mail: shubinfu89@gmail.com; Gibson, Richard L., E-mail: gibson@tamu.edu

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as the finite-difference method and the finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine-scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with the conventional finite-element method for the wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulations of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.

  3. Generalized multiscale finite-element method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media

    DOE PAGES

    Gao, Kai; Fu, Shubin; Gibson, Richard L.; ...

    2015-04-14

    It is important to develop fast yet accurate numerical methods for seismic wave propagation to characterize complex geological structures and oil and gas reservoirs. However, the computational cost of conventional numerical modeling methods, such as the finite-difference method and the finite-element method, becomes prohibitively expensive when applied to very large models. We propose a Generalized Multiscale Finite-Element Method (GMsFEM) for elastic wave propagation in heterogeneous, anisotropic media, where we construct basis functions from multiple local problems for both the boundaries and interior of a coarse node support or coarse element. The application of multiscale basis functions can capture the fine-scale medium property variations, and allows us to greatly reduce the degrees of freedom that are required to implement the modeling compared with the conventional finite-element method for the wave equation, while restricting the error to low values. We formulate the continuous Galerkin and discontinuous Galerkin formulations of the multiscale method, both of which have pros and cons. Applications of the multiscale method to three heterogeneous models show that our multiscale method can effectively model elastic wave propagation in anisotropic media with a significant reduction in the degrees of freedom in the modeling system.

  4. Statistical inference for template aging

    NASA Astrophysics Data System (ADS)

    Schuckers, Michael E.

    2006-04-01

    A change in classification error rates for a biometric device over time is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first is the use of a generalized linear model to determine whether these error rates change linearly over time; this approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
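
    The first approach can be sketched as a logit-link GLM for match-error rates with time as a covariate, fitted by Newton-Raphson (equivalently, iteratively reweighted least squares). The drift rate and sample sizes below are fabricated for illustration and are not taken from the NIST score set:

```python
import numpy as np

# Simulate monthly biometric trials whose error rate drifts upward with time.
rng = np.random.default_rng(2)
months = np.repeat(np.arange(12), 200)               # time since enrolment
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 0.15 * months)))
errors = rng.random(months.size) < p_true            # simulated match errors

# Logit-link GLM: logit(p) = beta0 + beta1 * month, fitted by Newton-Raphson.
X = np.column_stack([np.ones(months.size), months.astype(float)])
beta = np.zeros(2)
for _ in range(50):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))             # fitted error probabilities
    W = mu * (1.0 - mu)                              # IRLS weights
    grad = X.T @ (errors - mu)                       # score vector
    H = X.T @ (X * W[:, None])                       # Fisher information
    beta += np.linalg.solve(H, grad)
```

    A significance test of template aging then amounts to testing whether the slope `beta[1]` differs from zero, e.g. via its Wald statistic or the likelihood ratio mentioned as the second approach.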

  5. Application of surface complexation models to anion adsorption by natural materials.

    PubMed

    Goldberg, Sabine

    2014-10-01

    Various chemical models of ion adsorption are presented and discussed. Chemical models, such as surface complexation models, provide a molecular description of anion adsorption reactions using an equilibrium approach. Two such models, the constant capacitance model and the triple layer model, are described in the present study. Characteristics common to all the surface complexation models are equilibrium constant expressions, mass and charge balances, and surface activity coefficient electrostatic potential terms. Methods for determining parameter values for surface site density, capacitances, and surface complexation constants also are discussed. Spectroscopic experimental methods of establishing ion adsorption mechanisms include vibrational spectroscopy, nuclear magnetic resonance spectroscopy, electron spin resonance spectroscopy, X-ray absorption spectroscopy, and X-ray reflectivity. Experimental determinations of point of zero charge shifts and ionic strength dependence of adsorption results and molecular modeling calculations also can be used to deduce adsorption mechanisms. Applications of the surface complexation models to heterogeneous natural materials, such as soils, using the component additivity and the generalized composite approaches are described. Emphasis is on the generalized composite approach for predicting anion adsorption by soils. Continuing research is needed to develop consistent and realistic protocols for describing ion adsorption reactions on soil minerals and soils. The availability of standardized model parameter databases for use in chemical speciation-transport models is critical. Published 2014 Wiley Periodicals Inc. on behalf of SETAC. This article is a US Government work and as such, is in the public domain in the United States of America.

  6. On domain modelling of the service system with its application to enterprise information systems

    NASA Astrophysics Data System (ADS)

    Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.

    2016-01-01

    Information systems are a kind of service system, and they run throughout every element of a modern industrial and business system, much like blood in our body. Types of information systems are heterogeneous because of extreme uncertainty in the changes affecting modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined based on application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely function, context, behaviour, principle, state and structure, together with system decomposition. Different from many other applications of FCBPSS in systems engineering, here the FCBPSS is applied to both infrastructure and substance systems, which is novel and effective for modelling service systems, including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is a key to integrating heterogeneous systems and to coping with unanticipated situations facing such systems.

  7. General Model for Multicomponent Ablation Thermochemistry

    NASA Technical Reports Server (NTRS)

    Milos, Frank S.; Marschall, Jochen; Rasky, Daniel J. (Technical Monitor)

    1994-01-01

    A previous paper (AIAA 94-2042) presented equations and numerical procedures for modeling the thermochemical ablation and pyrolysis of thermal protection materials which contain multiple surface species. This work describes modifications and enhancements to the Multicomponent Ablation Thermochemistry (MAT) theory and code for application to the general case which includes surface area constraints, rate limited surface reactions, and non-thermochemical mass loss (failure). Detailed results and comparisons with data are presented for the Shuttle Orbiter reinforced carbon-carbon oxidation protection system which contains a mixture of sodium silicate (Na2SiO3), silica (SiO2), silicon carbide (SiC), and carbon (C).

  8. Real-time simulation of an F110/STOVL turbofan engine

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Ouzts, Peter J.

    1989-01-01

    A traditional F110-type turbofan engine model was extended to include a ventral nozzle and two thrust-augmenting ejectors for Short Take-Off Vertical Landing (STOVL) aircraft applications. Development of the real-time F110/STOVL simulation required special attention to the modeling approach for component performance maps, the low pressure turbine exit mixing region, and the tailpipe dynamic approximation. The simulation was validated by comparing output from the ADSIM simulation with output from a validated F110/STOVL General Electric Aircraft Engines FORTRAN deck. General Electric substantiated basic engine component characteristics through factory testing and full-scale ejector data.

  9. A Prior for Neural Networks utilizing Enclosing Spheres for Normalization

    NASA Astrophysics Data System (ADS)

    v. Toussaint, U.; Gori, S.; Dose, V.

    2004-11-01

    Neural networks are famous for their advantageous flexibility for problems where there is insufficient knowledge to set up a proper model. On the other hand, this flexibility can cause over-fitting and can hamper the generalization properties of neural networks. Many approaches to regularizing neural networks have been suggested, but most of them are based on ad hoc arguments. Employing the principle of transformation invariance, we derive a general prior in accordance with Bayesian probability theory for a class of feedforward networks. Optimal networks are determined by Bayesian model comparison, verifying the applicability of this approach.

  10. Facial first impressions and partner preference models: Comparable or distinct underlying structures?

    PubMed

    South Palomares, Jennifer K; Sutherland, Clare A M; Young, Andrew W

    2017-12-17

    Given the frequency with which relationships are now initiated online, where impressions from face photographs may influence relationship initiation, it is important to understand how facial first impressions might be used in such contexts. We therefore examined the applicability of a leading model of verbally expressed partner preferences to impressions derived from real face images, and investigated how the factor structure of first impressions based on potential partner preference-related traits might relate to a more general model of facial first impressions. Participants rated 1,000 everyday face photographs on 12 traits selected to represent Fletcher et al.'s (1999, Journal of Personality and Social Psychology, 76, 72) verbal model of partner preferences. Facial trait judgements showed an underlying structure that largely paralleled the tripartite structure of Fletcher et al.'s verbal preference model, regardless of either face gender or participant gender. Furthermore, there was close correspondence between the verbal partner preference model and a more general tripartite model of facial first impressions derived from a different literature (Sutherland et al., 2013, Cognition, 127, 105), suggesting an underlying correspondence between verbal conceptual models of romantic preferences and more general models of facial first impressions. © 2017 The British Psychological Society.

  11. A multiscale approach to modelling electrochemical processes occurring across the cell membrane with application to transmission of action potentials.

    PubMed

    Richardson, G

    2009-09-01

    By application of matched asymptotic expansions, a simplified partial differential equation (PDE) model for the dynamic electrochemical processes occurring in the vicinity of a membrane, as ions selectively permeate across it, is formally derived from the Poisson-Nernst-Planck equations of electrochemistry. It is demonstrated that this simplified model reduces, in the limit of a long thin axon, to the cable equation used by Hodgkin and Huxley to describe the propagation of action potentials in the unmyelinated squid giant axon. The asymptotic reduction from the simplified PDE model to the cable equation leads to insights that are not otherwise apparent, including an explanation of why the squid giant axon attains a diameter in the region of 1 mm. The simplified PDE model has more general application than the Hodgkin-Huxley cable equation and can, for example, be used to describe action potential propagation in myelinated axons and neuronal cell bodies.

  12. A general description of detachment for multidimensional modelling of biofilms.

    PubMed

    Xavier, Joao de Bivar; Picioreanu, Cristian; van Loosdrecht, Mark C M

    2005-09-20

    A general method for describing biomass detachment in multidimensional biofilm modelling is introduced. Biomass losses from processes acting on the entire surface of the biofilm, such as erosion, are modelled using a continuous detachment speed function F(det). Discrete detachment events, i.e. sloughing, are implicitly derived from simulations. The method is flexible to allow F(det) to take several forms, including expressions dependent on any state variables such as the local biofilm density. This methodology for biomass detachment was integrated with multidimensional (2D and 3D) particle-based multispecies biofilm models by using a novel application of the level set method. Application of the method is illustrated by trends in the dynamics of biofilms structure and activity derived from simulations performed on a simple model considering uniform biomass (case study I) and a model discriminating biomass composition in heterotrophic active mass, extracellular polymeric substances (EPS) and inert mass (case study II). Results from case study I demonstrate the effect of applied detachment forces as a fundamental factor influencing steady-state biofilm activity and structure. Trends from experimental observations reported in literature were correctly described. For example, simulation results indicated that biomass sloughing is reduced when erosion forces are increased. Case study II illustrates the application of the detachment methodology to systems with non-uniform biomass composition. Simulations carried out at different bulk concentrations of substrate show changes in biofilm structure (in terms of shape, density and spatial distribution of biomass components) and activity (in terms of oxygen and substrate consumption) as a consequence of either oxygen-limited or substrate-limited growth. (c) 2005 Wiley Periodicals, Inc.

  13. A radio-frequency sheath model for complex waveforms

    NASA Astrophysics Data System (ADS)

    Turner, M. M.; Chabert, P.

    2014-04-01

    Plasma sheaths driven by radio-frequency voltages occur in contexts ranging from plasma processing to magnetically confined fusion experiments. An analytical understanding of such sheaths is therefore important, both intrinsically and as an element in more elaborate theoretical structures. Radio-frequency sheaths are commonly excited by highly anharmonic waveforms, but no analytical model exists for this general case. We present a mathematically simple sheath model that is in good agreement with earlier models for single frequency excitation, yet can be solved for arbitrary excitation waveforms. As examples, we discuss dual-frequency and pulse-like waveforms. The model employs the ansatz that the time-averaged electron density is a constant fraction of the ion density. In the cases we discuss, the error introduced by this approximation is small, and in general it can be quantified through an internal consistency condition of the model. This simple and accurate model is likely to have wide application.

  14. A Three-Component Model for Magnetization Transfer. Solution by Projection-Operator Technique, and Application to Cartilage

    NASA Astrophysics Data System (ADS)

    Adler, Ronald S.; Swanson, Scott D.; Yeung, Hong N.

    1996-01-01

    A projection-operator technique is applied to a general three-component model for magnetization transfer, extending our previous two-component model [R. S. Adler and H. N. Yeung,J. Magn. Reson. A104,321 (1993), and H. N. Yeung, R. S. Adler, and S. D. Swanson,J. Magn. Reson. A106,37 (1994)]. The PO technique provides an elegant means of deriving a simple, effective rate equation in which there is natural separation of relaxation and source terms and allows incorporation of Redfield-Provotorov theory without any additional assumptions or restrictive conditions. The PO technique is extended to incorporate more general, multicomponent models. The three-component model is used to fit experimental data from samples of human hyaline cartilage and fibrocartilage. The fits of the three-component model are compared to the fits of the two-component model.

  15. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.

  16. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
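
    For a linear response the role of input correlations is explicit: Var(Y) = a²σ₁² + b²σ₂² + 2abρσ₁σ₂. A Monte Carlo sketch with assumed coefficients (these numbers are illustrative, not from the paper) shows that ignoring the correlation term misstates the output uncertainty:

```python
import numpy as np

a, b = 2.0, 3.0                  # illustrative linear model Y = a*X1 + b*X2
s1, s2, rho = 1.0, 0.5, 0.8      # input standard deviations and correlation

# Analytic output variance, with and without the correlation term.
var_analytic = a**2 * s1**2 + b**2 * s2**2 + 2 * a * b * rho * s1 * s2
var_indep = a**2 * s1**2 + b**2 * s2**2

# Monte Carlo check with correlated normal inputs.
rng = np.random.default_rng(3)
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
var_mc = np.var(X @ np.array([a, b]))
```

    The Monte Carlo estimate agrees with the full analytic variance, while the independence assumption underestimates it whenever a, b, and ρ share the same sign.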

  17. Individual heterogeneity in life histories and eco-evolutionary dynamics

    PubMed Central

    Vindenes, Yngvild; Langangen, Øystein

    2015-01-01

    Individual heterogeneity in life history shapes eco-evolutionary processes, and unobserved heterogeneity can affect demographic outputs characterising life history and population dynamical properties. Demographic frameworks like matrix models or integral projection models represent powerful approaches to disentangle mechanisms linking individual life histories and population-level processes. Recent developments have provided important steps towards their application to study eco-evolutionary dynamics, but so far individual heterogeneity has largely been ignored. Here, we present a general demographic framework that incorporates individual heterogeneity in a flexible way, by separating static and dynamic traits (discrete or continuous). First, we apply the framework to derive the consequences of ignoring heterogeneity for a range of widely used demographic outputs. A general conclusion is that besides the long-term growth rate lambda, all parameters can be affected. Second, we discuss how the framework can help advance current demographic models of eco-evolutionary dynamics, by incorporating individual heterogeneity. For both applications numerical examples are provided, including an empirical example for pike. For instance, we demonstrate that predicted demographic responses to climate warming can be reversed by increased heritability. We discuss how applications of this demographic framework incorporating individual heterogeneity can help answer key biological questions that require a detailed understanding of eco-evolutionary dynamics. PMID:25807980

  18. Global attractivity of an almost periodic N-species nonlinear ecological competitive model

    NASA Astrophysics Data System (ADS)

    Xia, Yonghui; Han, Maoan; Huang, Zhenkun

    2008-01-01

    By using a comparison theorem and constructing a suitable Lyapunov functional, we study the following almost periodic nonlinear N-species competitive Lotka-Volterra model: A set of sufficient conditions is obtained for the existence and global attractivity of a unique positive almost periodic solution of the above model. As applications, some special competition models are studied again; our new results improve and generalize earlier results. Examples and their simulations show the feasibility of our main results.
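
    A minimal constant-coefficient, two-species instance of the competitive Lotka-Volterra system can be integrated directly to illustrate global attractivity (the parameter values are illustrative; the paper treats the almost periodic N-species case):

```python
def lv_competition(x, y, steps=20_000, dt=0.01,
                   r=(1.0, 1.0), a=((1.0, 0.5), (0.5, 1.0))):
    """Forward-Euler integration of a two-species competitive LV system:
    dx/dt = x (r1 - a11 x - a12 y),  dy/dt = y (r2 - a21 x - a22 y)."""
    for _ in range(steps):
        dx = x * (r[0] - a[0][0] * x - a[0][1] * y)
        dy = y * (r[1] - a[1][0] * x - a[1][1] * y)
        x, y = x + dt * dx, y + dt * dy
    return x, y

# Weak interspecific competition (a12 a21 < a11 a22): trajectories from
# different initial states are attracted to the coexistence equilibrium (2/3, 2/3).
x_final, y_final = lv_competition(0.1, 0.9)
```

    Here the positive equilibrium is globally attractive because intraspecific competition dominates, the constant-coefficient analogue of the paper's sufficient conditions.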

  19. Prediction of dynamical systems by symbolic regression

    NASA Astrophysics Data System (ADS)

    Quade, Markus; Abel, Markus; Shafi, Kamran; Niven, Robert K.; Noack, Bernd R.

    2016-07-01

    We study the modeling and prediction of dynamical systems based on conventional models derived from measurements. Such algorithms are highly desirable in situations where the underlying dynamics are hard to model from physical principles or simplified models need to be found. We focus on symbolic regression methods as a part of machine learning. These algorithms are capable of learning an analytically tractable model from data, a highly valuable property. Symbolic regression methods can be considered as generalized regression methods. We investigate two particular algorithms, the so-called fast function extraction which is a generalized linear regression algorithm, and genetic programming which is a very general method. Both are able to combine functions in a certain way such that a good model for the prediction of the temporal evolution of a dynamical system can be identified. We illustrate the algorithms by finding a prediction for the evolution of a harmonic oscillator based on measurements, by detecting an arriving front in an excitable system, and as a real-world application, the prediction of solar power production based on energy production observations at a given site together with the weather forecast.
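
    The fast-function-extraction idea, generalized linear regression over a library of candidate basis functions, can be sketched as follows (the data-generating law and the library are illustrative, not from the paper's examples):

```python
import numpy as np

x = np.linspace(0, 4 * np.pi, 400)
y = 2.0 * np.sin(x) + 0.5 * x          # "measurements" from a known law

# Library of candidate basis functions; the fit is linear in their coefficients.
library = {"x": x, "x^2": x ** 2, "sin(x)": np.sin(x), "cos(x)": np.cos(x)}
Phi = np.column_stack(list(library.values()))
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
model = dict(zip(library.keys(), coef))    # symbolic model: name -> coefficient
```

    Least squares recovers the active terms (sin(x) and x) and assigns near-zero weight to the rest; genetic programming differs in that it also searches over the structure of the expressions rather than a fixed library.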

  20. A generalized partially linear mean-covariance regression model for longitudinal proportional data, with applications to the analysis of quality of life data from cancer clinical trials.

    PubMed

    Zheng, Xueying; Qin, Guoyou; Tu, Dongsheng

    2017-05-30

    Motivated by the analysis of quality of life data from a clinical trial on early breast cancer, we propose in this paper a generalized partially linear mean-covariance regression model for longitudinal proportional data, which are bounded in a closed interval. Cholesky decomposition of the covariance matrix for within-subject responses and generalized estimation equations are used to estimate unknown parameters and the nonlinear function in the model. Simulation studies are performed to evaluate the performance of the proposed estimation procedures. Our new model is also applied to analyze the data from the cancer clinical trial that motivated this research. In comparison with available models in the literature, the proposed model does not require specific parametric assumptions on the density function of the longitudinal responses and the probability function of the boundary values and can capture dynamic changes of time or other interested variables on both mean and covariance of the correlated proportional responses. Copyright © 2017 John Wiley & Sons, Ltd.

  1. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial

    2015-08-01

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
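
    Computing posterior model plausibilities from model evidences, one ingredient of such a Bayesian selection framework, can be sketched as follows (the log-evidence values are hypothetical, and equal prior model probabilities are assumed):

```python
import math

# Hypothetical log-evidences p(data | model) for three candidate coarse-grained models.
log_evidence = {"M1": -10.2, "M2": -9.1, "M3": -14.7}

# Posterior plausibilities via Bayes' rule with uniform model priors;
# subtracting the max log-evidence keeps the exponentials well scaled.
m = max(log_evidence.values())
w = {k: math.exp(v - m) for k, v in log_evidence.items()}
z = sum(w.values())
plausibility = {k: v / z for k, v in w.items()}
```

    The model with the largest evidence (here the hypothetical M2) receives the highest plausibility; grouping models by parsimony before this comparison is the role of the Occam Categories.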

  2. Application of artificial neural networks in hydrological modeling: A case study of runoff simulation of a Himalayan glacier basin

    NASA Technical Reports Server (NTRS)

    Buch, A. M.; Narain, A.; Pandey, P. C.

    1994-01-01

    The simulation of runoff from a Himalayan glacier basin using an Artificial Neural Network (ANN) is presented. The performance of the ANN model is found to be superior to the Energy Balance Model and the Multiple Regression model. The RMS Error is used as the figure of merit for judging the performance of the three models, and the RMS Error for the ANN model is the lowest of the three. The ANN is faster in learning and exhibits excellent system generalization characteristics.
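
    The RMS-error figure of merit used to rank the three models can be sketched as follows (all observed and simulated values are illustrative, not the basin data):

```python
import numpy as np

def rmse(observed, simulated):
    """Root-mean-square error between observed and simulated runoff."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return np.sqrt(np.mean((observed - simulated) ** 2))

obs = [10.0, 12.0, 9.0, 11.0]                      # hypothetical runoff series
scores = {"ANN": rmse(obs, [10.1, 11.8, 9.2, 11.1]),
          "regression": rmse(obs, [11.0, 13.0, 8.0, 12.0])}
best = min(scores, key=scores.get)                 # lowest RMSE wins
```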

  3. Application of a Near-Field Water Quality Model.

    DTIC Science & Technology

    1979-07-01

    [Partially legible OCR text.] The model assumes profile forms along the (plume) axis; these profiles are integrated within the basic conservation equations, reducing the problem to a one-dimensional formulation. Recoverable contents list verification topics: centerline temperature decrease, lateral variation of constituents, variation of plume width, results of varying the entrainment and other coefficients, and general plume characteristics.

  4. Mean energy of some interacting bosonic systems derived by virtue of the generalized Hellmann-Feynman theorem

    NASA Astrophysics Data System (ADS)

    Fan, Hong-yi; Xu, Xue-xiang

    2009-06-01

    By virtue of the generalized Hellmann-Feynman theorem [H. Y. Fan and B. Z. Chen, Phys. Lett. A 203, 95 (1995)], we derive the mean energy of some interacting bosonic systems for some Hamiltonian models without proceeding with diagonalizing the Hamiltonians. Our work extends the field of applications of the Hellmann-Feynman theorem and may enrich the theory of quantum statistics.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram

    Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.

  6. Modeling error PDF optimization based wavelet neural network modeling of dynamic system and its application in blast furnace ironmaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Wang, Chenyu; Li, Mingjie

    In general, the modeling errors of a dynamic system model are a set of random variables. Traditional modeling performance indices such as mean square error (MSE) and root mean square error (RMSE) cannot fully express the connotation of modeling errors with stochastic characteristics in both the time and space dimensions. Therefore, the probability density function (PDF) is introduced to completely describe the modeling errors on both time and space scales. Based on it, a novel wavelet neural network (WNN) modeling method is proposed by minimizing the two-dimensional (2D) PDF shaping of modeling errors. First, the modeling error PDF of the traditional WNN is estimated using the data-driven kernel density estimation (KDE) technique. Then, the quadratic sum of the 2D deviation between the modeling error PDF and the target PDF is utilized as the performance index to optimize the WNN model parameters by gradient descent. Since the WNN has strong nonlinear approximation and adaptive capability, and all the parameters are well optimized by the proposed method, the developed WNN model can make the modeling error PDF track the target PDF. Simulation examples and an application in a blast furnace ironmaking process show that the proposed method has higher modeling precision and better generalization ability than conventional WNN modeling based on MSE criteria. Furthermore, the proposed method yields a more desirable modeling error PDF, approximating a Gaussian distribution whose shape is high and narrow.
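
    The kernel density estimation step, estimating the modeling-error PDF from data, can be sketched in one dimension (a hand-rolled Gaussian KDE with Silverman's bandwidth rule; the error samples are synthetic):

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth=None):
    """Gaussian kernel density estimate of `samples` evaluated on `grid`."""
    samples = np.asarray(samples, float)
    if bandwidth is None:                       # Silverman's rule of thumb
        bandwidth = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs ** 2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

rng = np.random.default_rng(0)
errors = rng.normal(0.0, 0.5, 2000)             # stand-in modeling errors
grid = np.linspace(-3, 3, 121)
pdf_hat = gaussian_kde(errors, grid)            # estimated error PDF on the grid
```

    The estimated PDF (rather than a single MSE number) is what the PDF-shaping objective compares against the narrow Gaussian target.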

  9. A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations

    DOE PAGES

    Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; ...

    2015-06-01

    Extensive research efforts have been invested in reducing model errors to improve the predictive ability of biogeochemical earth and environmental system simulators, with applications ranging from contaminant transport and remediation to impacts of biogeochemical elemental cycling (e.g., carbon and nitrogen) on local ecosystems and regional to global climate. While the bulk of this research has focused on improving model parameterizations in the face of observational limitations, the more challenging type of model error/uncertainty to identify and quantify is model structural error, which arises from incorrect mathematical representations of (or failure to consider) important physical, chemical, or biological processes, properties, or system states in model formulations. While improved process understanding can be achieved through scientific study, such understanding is usually developed at small scales. Process-based numerical models are typically designed for a particular characteristic length and time scale. For application-relevant scales, it is generally necessary to introduce approximations and empirical parameterizations to describe complex systems or processes. This single-scale approach has been the best available to date because of limited understanding of process coupling combined with practical limitations on system characterization and computation. While computational power is increasing significantly and our understanding of biological and environmental processes at fundamental scales is accelerating, using this information to advance our knowledge of the larger system behavior requires the development of multiscale simulators. Accordingly, there has been much recent interest in novel multiscale methods in which microscale and macroscale models are explicitly coupled in a single hybrid multiscale simulation. A limited number of hybrid multiscale simulations have been developed for biogeochemical earth systems, but they mostly utilize application-specific and sometimes ad hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater-river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.

  10. Probabilistic arithmetic automata and their applications.

    PubMed

    Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven

    2012-01-01

    We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
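
    The DAA-to-PAA construction for application 1 (occurrence-count distributions) can be sketched for an i.i.d. text model. This is a hand-rolled stand-in, not the MoSDi implementation; the pattern and letter probabilities are illustrative:

```python
def count_distribution(pattern, n, probs):
    """Exact distribution of the number of (possibly overlapping) occurrences
    of `pattern` in an i.i.d. random text of length n.

    The automaton state is the length of the longest pattern prefix matching
    the current text suffix (KMP-style); the arithmetic operation is +1 on
    each completed match, in the spirit of a probabilistic arithmetic automaton.
    """
    def step(state, c):
        s = pattern[:state] + c
        while s and not pattern.startswith(s):
            s = s[1:]                       # shrink to longest matching prefix
        return len(s), int(s == pattern)

    dist = {(0, 0): 1.0}                    # (automaton state, count) -> probability
    for _ in range(n):
        new = {}
        for (state, k), pr in dist.items():
            for c, pc in probs.items():
                s2, hit = step(state, c)
                key = (s2, k + hit)
                new[key] = new.get(key, 0.0) + pr * pc
        dist = new

    out = {}                                # marginalize out the automaton state
    for (_, k), pr in dist.items():
        out[k] = out.get(k, 0.0) + pr
    return out

# Occurrences of "ab" in a uniform random text of length 3 over {a, b}:
counts = count_distribution("ab", 3, {"a": 0.5, "b": 0.5})
```

    Of the 8 equally likely texts of length 3, exactly 4 (aab, aba, abb, bab) contain "ab" once, so the exact distribution puts probability 0.5 on count 1.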

  11. First experience with multiple mini interview for medical school admission in Brazil: Does it work in a different cultural scenario?

    PubMed

    Daniel-Filho, Durval Anibal; Pires, Elda Maria Stafuzza Gonçalves; Paes, Angela Tavares; Troster, Eduardo Juan; Silva, Simone Cristina Azevedo B S; Granato, Mariana Fachini; Couto, Thomaz Bittencourt; Barreto, Joyce Kelly Silva; Campos, Alexandre Holthausen; Monte, Julio C Martins; Schvartsman, Claudio

    2017-10-01

    Evaluation of non-cognitive skills has never before been used in medical school admission in Brazil. This study aims to evaluate the Multiple Mini Interview (MMI) in the admission process of a School of Medicine in São Paulo, Brazil. The population of the study comprised 240 applicants summoned for the interviews and 96 raters. The MMI contributed 25% of the applicants' final grade. Eight scenarios were created with the aim of evaluating different non-cognitive skills; each had two raters. At the end of the interviews, the applicants and raters described their impressions of the MMI. The reliability of the MMI was analyzed using Generalizability Theory and the Many-Facet Rasch Model (MFRM). The G-study showed that the general reliability of the process was satisfactory (G coefficient = 0.743). The MMI grades were not affected by the raters' profile, time of interview (p = 0.715), or randomization group (p = 0.353). The Rasch analysis showed no misfit effects or inconsistent stations or raters. A significant majority of the applicants (98%) and all the raters believed MMIs were important in selecting students with a more adequate profile to study medicine. The general reliability of the selection process was excellent, and it was fully accepted by the applicants and raters.

  12. Application of numerical optimization techniques to control system design for nonlinear dynamic models of aircraft

    NASA Technical Reports Server (NTRS)

    Lan, C. Edward; Ge, Fuying

    1989-01-01

    Control system design for general nonlinear flight dynamic models is considered through numerical simulation. The design is accomplished through a numerical optimizer coupled with analysis of the flight dynamic equations. The general flight dynamic equations are numerically integrated, and dynamic characteristics are then identified from the dynamic response. The design variables are determined iteratively by the optimizer to optimize a prescribed objective function related to the desired dynamic characteristics. The generality of the method allows nonlinear aerodynamic effects and dynamic coupling to be considered in the design process. To demonstrate the method, nonlinear simulation models of F-5A and F-16 configurations are used to design dampers that satisfy flying-quality specifications and control systems that prevent departure. The results indicate that the present method is simple in formulation and effective in satisfying the design objectives.

  13. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
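
    The monotone-hazard restriction of the Weibull model, which the spline-based framework relaxes, can be made concrete (the parameter values are illustrative):

```python
import math

def weibull_hazard(t, lam, gamma):
    """Weibull hazard h(t) = lam * gamma * t**(gamma - 1).

    For gamma > 1 this is monotonically increasing (and decreasing for
    gamma < 1), so it cannot represent e.g. a hump-shaped clinical hazard;
    restricted cubic splines on the log-hazard scale remove this restriction.
    """
    return lam * gamma * t ** (gamma - 1)

def weibull_survival(t, lam, gamma):
    """Survival S(t) = exp(-lam * t**gamma), the analytic cumulative form."""
    return math.exp(-lam * t ** gamma)
```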

  14. Bayesian analysis of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
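
    The gamma-mixed Poisson construction, whose marginal count distribution is negative binomial, can be checked by simulation (the gamma parameters are illustrative, not fitted to the Mauna Loa or Etna data):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha, beta = 2.0, 1.0                       # gamma prior on the eruptive rate

# Draw an eruptive rate per period, then the eruption count given that rate.
lam = rng.gamma(shape=alpha, scale=1.0 / beta, size=200_000)
eruptions = rng.poisson(lam)

# Gamma-mixed Poisson (negative binomial) moments:
# mean = alpha/beta, variance = alpha/beta + alpha/beta**2 > mean.
mean, var = eruptions.mean(), eruptions.var()
```

    The sample variance exceeds the sample mean, exactly the overdispersion relative to a constant-rate Poisson that motivates the NBD model.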

  15. Metadata for selecting or submitting generic seismic vulnerability functions via GEM's vulnerability database

    USGS Publications Warehouse

    Jaiswal, Kishor

    2013-01-01

    This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit the vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to provide users guidance on limitations and applicability by providing the associated modeling assumptions and applicability of each vulnerability or fragility function.

  16. Monte Carlo skin dose simulation in intraoperative radiotherapy of breast cancer using spherical applicators.

    PubMed

    Moradi, F; Ung, N M; Khandaker, M U; Mahdiraji, G A; Saad, M; Abdul Malik, R; Bustam, A Z; Zaili, Z; Bradley, D A

    2017-07-28

    The relatively new treatment modality of electronic intraoperative radiotherapy (IORT) is gaining popularity, primarily for early-stage breast cancer: irradiation is delivered within a surgically produced cavity via a low-energy x-ray source and spherical applicators. Because of the dramatic dose-rate fall-off with radial distance from the source, and effects related to changes in beam quality of the low-keV photon spectra, dosimetric account of the Intrabeam system is rather complex. Skin dose monitoring in IORT is important due to the high dose prescription per treatment fraction. In this study, the x-ray source and related applicators were modeled using the Monte Carlo N-Particle transport code. The dosimetric characteristics of the model were validated against measured data obtained using an ionization chamber and EBT3 film as dosimeters. Using a simulated breast phantom, absorbed doses to the skin were calculated for different combinations of applicator size (1.5-5 cm) and treatment depth (0.5-3 cm). Simulation results showed overdosing of the skin (>30% of the prescribed dose) at a treatment depth of 0.5 cm for applicator sizes larger than 1.5 cm. Skin dose increased significantly with applicator size, reaching 12 Gy (60% of the prescribed dose) for the largest applicator (5 cm diameter) at a treatment depth of 0.5 cm. It is concluded that the recommended 0.5-1 cm distance between the skin and applicator surface does not guarantee skin safety, and that skin dose is generally more significant with the larger applicators.
    • Intrabeam x-ray source and spherical applicators were simulated and skin dose was calculated.
    • Skin dose for a constant skin-to-applicator distance strongly depends on applicator size.
    • Use of larger applicators generally results in higher skin dose.
    • The recommended 0.5-1 cm skin-to-applicator distance does not guarantee skin safety.
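
    The qualitative dependence of skin dose on applicator size at a fixed skin-to-applicator distance can be illustrated with a toy inverse-power fall-off: a pure geometric model that ignores attenuation and beam-quality effects, not the paper's Monte Carlo calculation:

```python
def relative_skin_dose(applicator_radius_cm, skin_distance_cm, falloff=2.0):
    """Toy estimate: dose is prescribed at the applicator surface (radius r0),
    and the skin sits at r0 + d; a pure inverse-power law gives the fraction
    of the prescription reaching the skin. Illustration only."""
    r0 = applicator_radius_cm
    return (r0 / (r0 + skin_distance_cm)) ** falloff

small = relative_skin_dose(0.75, 0.5)   # 1.5 cm diameter applicator, 0.5 cm gap
large = relative_skin_dose(2.5, 0.5)    # 5 cm diameter applicator, same gap
```

    Even this crude geometry reproduces the study's trend: at the same 0.5 cm gap, the larger applicator delivers a substantially higher fraction of the prescription to the skin, because the relative fall-off over a fixed distance is gentler for a larger source-to-surface radius.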

  17. 3D-printing techniques in a medical setting: a systematic literature review.

    PubMed

    Tack, Philip; Victor, Jan; Gemmel, Paul; Annemans, Lieven

    2016-10-21

    Three-dimensional (3D) printing has numerous applications and has gained much interest in the medical world. The constantly improving quality of 3D-printing applications has contributed to their increased use on patients. This paper summarizes the literature on surgical 3D-printing applications used on patients, with a focus on reported clinical and economic outcomes. Three major literature databases were screened for case series (more than three cases described in the same study) and trials of surgical applications of 3D printing in humans. 227 surgical papers were analyzed and summarized using an evidence table. The papers described the use of 3D printing for surgical guides, anatomical models, and custom implants. 3D printing is used in multiple surgical domains, such as orthopedics, maxillofacial surgery, cranial surgery, and spinal surgery. In general, the advantages of 3D-printed parts are said to include reduced surgical time, improved medical outcome, and decreased radiation exposure. The costs of printing and additional scans generally increase the overall cost of the procedure. 3D printing is well integrated in surgical practice and research. Applications vary from anatomical models mainly intended for surgical planning to surgical guides and implants. Our research suggests that there are several advantages to 3D-printed applications, but that further research is needed to determine whether the increased intervention costs can be balanced with the observable advantages of this new technology. There is a need for a formal cost-effectiveness analysis.

  18. A general framework of automorphic inflation

    NASA Astrophysics Data System (ADS)

    Schimmrigk, Rolf

    2016-05-01

    Automorphic inflation is an application of the framework of automorphic scalar field theory, based on the theory of automorphic forms and representations. In this paper the general framework of automorphic and modular inflation is described in some detail, with emphasis on the resulting stratification of the space of scalar field theories in terms of the group theoretic data associated to the shift symmetry, as well as the automorphic data that specifies the potential. The class of theories based on Eisenstein series provides a natural generalization of the model of j-inflation considered previously.

  19. Real-time Adaptive Control Using Neural Generalized Predictive Control

    NASA Technical Reports Server (NTRS)

    Haley, Pam; Soloway, Don; Gold, Brian

    1999-01-01

    The objective of this paper is to demonstrate the feasibility of a Nonlinear Generalized Predictive Control algorithm by showing real-time adaptive control on a plant with relatively fast time-constants. Generalized Predictive Control has classically been used in process control, where linear control laws were formulated for plants with relatively slow time-constants. The plant of interest for this paper is a magnetic levitation device that is nonlinear and open-loop unstable. In this application, the reference model of the plant is a neural network with a nominal linear model embedded in the network weights. Control based on the linear model provides initial stability at the beginning of network training. With a neural network, the control laws are nonlinear, and online adaptation of the model is possible to capture unmodeled or time-varying dynamics. Newton-Raphson is the minimization algorithm. Newton-Raphson requires calculation of the Hessian, but even with this computational expense the low iteration count makes this a viable algorithm for real-time control.
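
    The Newton-Raphson update at the heart of the cost minimization can be sketched for a scalar control input (the quadratic cost is illustrative; the real controller minimizes a predictive cost over a horizon through the neural network model):

```python
def newton_minimize(grad, hess, u0, iters=10):
    """Newton-Raphson on a scalar control input: u <- u - J'(u) / J''(u).

    `grad` and `hess` are the first and second derivatives of the cost J;
    the Hessian is exactly the extra quantity the paper notes must be computed.
    """
    u = u0
    for _ in range(iters):
        u -= grad(u) / hess(u)
    return u

# Illustrative quadratic cost J(u) = (u - 3)**2 + 1, minimized at u = 3;
# on a quadratic, Newton-Raphson converges in a single iteration.
u_star = newton_minimize(lambda u: 2 * (u - 3), lambda u: 2.0, u0=0.0)
```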

  20. Testing goodness of fit in regression: a general approach for specified alternatives.

    PubMed

    Solari, Aldo; le Cessie, Saskia; Goeman, Jelle J

    2012-12-10

    When fitting generalized linear models or the Cox proportional hazards model, it is important to have tools to test for lack of fit. Because lack of fit comes in all shapes and sizes, distinguishing among different types of lack of fit is of practical importance. We argue that an adequate diagnosis of lack of fit requires a specified alternative model. Such specification identifies the type of lack of fit the test is directed against, so that if we reject the null hypothesis, we know the direction of the departure from the model. The goodness-of-fit approach of this paper allows different types of lack of fit to be treated within a unified general framework and many existing tests to be considered as special cases. Connections with penalized likelihood and random effects are discussed, and the application of the proposed approach is illustrated with medical examples. Tailored functions for goodness-of-fit testing have been implemented in the R package globaltest. Copyright © 2012 John Wiley & Sons, Ltd.

Top