Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
Hierarchical Bayesian Modeling of Fluid-Induced Seismicity
NASA Astrophysics Data System (ADS)
Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.
2017-11-01
In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We test the framework on the Basel 2006 fluid-induced seismicity case study to show that the hierarchical Bayesian model coherently encodes both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
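The core mechanism described in this abstract, a nonhomogeneous Poisson process whose event rate is proportional to the injection rate, can be sketched in a few lines. The injection profile and the proportionality constant `b` below are invented for illustration and are not taken from the paper; in the hierarchical version, `b` would itself receive a prior distribution and be updated from observed counts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical injection profile (m^3/day over ten days) and an assumed
# proportionality constant b; neither value comes from the paper.
flow = np.array([100.0, 200.0, 300.0, 300.0, 200.0, 100.0, 0.0, 0.0, 0.0, 0.0])
dt = 1.0    # time step, days
b = 0.05    # expected induced events per injected m^3 (illustrative only)

# Nonhomogeneous Poisson process: the per-step mean tracks the flow rate.
lam = b * flow * dt                 # expected events in each time step
counts = rng.poisson(lam)           # one simulated induced-seismicity catalogue
total_expected = lam.sum()          # equals b times the total injected volume
```

Once injection stops (zero flow), the modeled rate drops to zero; real post-shut-in seismicity would need an additional decay term.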
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered.
Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
A hierarchical spatial framework for forest landscape planning.
Pete Bettinger; Marie Lennette; K. Norman Johnson; Thomas A. Spies
2005-01-01
A hierarchical spatial framework for large-scale, long-term forest landscape planning is presented along with example policy analyses for a 560,000 ha area of the Oregon Coast Range. The modeling framework suggests utilizing the detail provided by satellite imagery to track forest vegetation condition and for representation of fine-scale features, such as riparian...
A Hierarchical Learning Control Framework for an Aerial Manipulation System
NASA Astrophysics Data System (ADS)
Ma, Le; Chi, Yanxun; Li, Jiapeng; Li, Zhongsheng; Ding, Yalei; Liu, Lixing
2017-07-01
A hierarchical learning control framework for an aerial manipulation system is proposed. Firstly, the mechanical design of the aerial manipulation system is introduced and analyzed, and the kinematics and dynamics are modeled based on the Newton-Euler equations. Secondly, the hierarchical learning framework for this system is presented, in which the flight platform and the manipulator are controlled by separate controllers. RBF (Radial Basis Function) neural networks are employed for parameter estimation and control. Simulations and experiments demonstrate that the proposed methods are effective.
A conceptual modeling framework for discrete event simulation using hierarchical control structures.
Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D
2015-08-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a models' system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
A conceptual modeling framework for discrete event simulation using hierarchical control structures
Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.
2015-01-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a models’ system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940
van Rijn, Peter W; Ali, Usama S
2017-05-01
We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied a linear and adaptive testing mode to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures. © 2017 The British Psychological Society.
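The second framework discussed above, van der Linden's hierarchical model, pairs a two-parameter logistic (2PL) model for response accuracy with a log-normal model for response time. A minimal sketch of the two level-1 measurement models follows; the parameter values used in testing are invented, and the hierarchical part (multivariate normal priors tying person and item parameters together) is omitted.

```python
import math

def two_pl(theta, a, b):
    """2PL probability of a correct response for person ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def lognormal_rt_logpdf(t, tau, beta, alpha):
    """Log-density of response time t when ln T ~ Normal(beta - tau, alpha**-2):
    beta is item time intensity, tau is person speed, alpha is time discrimination."""
    z = alpha * (math.log(t) - (beta - tau))
    return (math.log(alpha) - math.log(t)
            - 0.5 * math.log(2.0 * math.pi) - 0.5 * z * z)
```

A person at the item's difficulty (theta equal to b) answers correctly with probability 0.5, and faster persons (larger tau) have shorter expected log response times.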
Semantic Image Segmentation with Contextual Hierarchical Models.
Seyedhosseini, Mojtaba; Tasdizen, Tolga
2016-05-01
Semantic segmentation is the problem of assigning an object label to each pixel. It unifies the image segmentation and object recognition problems. The importance of using contextual information in semantic segmentation frameworks has been widely realized in the field. We propose a contextual framework, called contextual hierarchical model (CHM), which learns contextual information in a hierarchical framework for semantic segmentation. At each level of the hierarchy, a classifier is trained based on downsampled input images and outputs of previous levels. Our model then incorporates the resulting multi-resolution contextual information into a classifier to segment the input image at original resolution. This training strategy allows for optimization of a joint posterior probability at multiple resolutions through the hierarchy. The contextual hierarchical model is based purely on the input image patches and does not make use of any fragments or shape examples. Hence, it is applicable to a variety of problems such as object segmentation and edge detection. We demonstrate that CHM performs on par with the state of the art on the Stanford background and Weizmann horse datasets. It also outperforms state-of-the-art edge detection methods on the NYU depth dataset and achieves state-of-the-art results on the Berkeley segmentation dataset (BSDS 500).
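The multi-resolution scheme described above can be caricatured as follows: each level scores a further-downsampled copy of the image, and the upsampled outputs are stacked as context features for the full-resolution classifier. The thresholding "classifier" below is only a stand-in for the trained per-level classifiers in CHM, and the pooling factors are assumptions for illustration.

```python
import numpy as np

def pool2(x):
    """2x2 average pooling (assumes even height and width)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def unpool2(x, times=1):
    """Nearest-neighbour upsampling by 2 per application."""
    for _ in range(times):
        x = np.kron(x, np.ones((2, 2)))
    return x

def chm_features(img, n_levels=3):
    """Stack per-pixel scores from successively downsampled copies of the
    image as multi-resolution context for a final full-resolution classifier.
    A fixed threshold stands in for each level's trained classifier."""
    feats, cur = [img], img
    for lvl in range(1, n_levels):
        cur = pool2(cur)
        score = (cur > cur.mean()).astype(float)   # stand-in classifier output
        feats.append(unpool2(score, times=lvl))    # back to full resolution
    return np.stack(feats, axis=-1)                # shape (H, W, n_levels)
```

In the real model, the stacked context features would be fed to a learned classifier rather than returned directly.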
NASA Astrophysics Data System (ADS)
Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong
2017-06-01
In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to decide self-adaptively which hierarchy of the structure should be made equivalent, according to the critical buckling mode rapidly predicted by the NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are carried out to demonstrate its efficiency and effectiveness in searching for the global optimum, in contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy are demonstrated by comparison with the single equivalent strategy.
Hierarchical models of animal abundance and occurrence
Royle, J. Andrew; Dorazio, R.M.
2006-01-01
Much of animal ecology is devoted to studies of abundance and occurrence of species, based on surveys of spatially referenced sample units. These surveys frequently yield sparse counts that are contaminated by imperfect detection, making direct inference about abundance or occurrence based on observational data infeasible. This article describes a flexible hierarchical modeling framework for estimation and inference about animal abundance and occurrence from survey data that are subject to imperfect detection. Within this framework, we specify models of abundance and detectability of animals at the level of the local populations defined by the sample units. Information at the level of the local population is aggregated by specifying models that describe variation in abundance and detection among sites. We describe likelihood-based and Bayesian methods for estimation and inference under the resulting hierarchical model. We provide two examples of the application of hierarchical models to animal survey data, the first based on removal counts of stream fish and the second based on avian quadrat counts. For both examples, we provide a Bayesian analysis of the models using the software WinBUGS.
Hierarchical species distribution models
Hefley, Trevor J.; Hooten, Mevin B.
2016-01-01
Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
A hierarchical approach for simulating northern forest dynamics
Don C. Bragg; David W. Roberts; Thomas R. Crow
2004-01-01
Complexity in ecological systems has challenged forest simulation modelers for years, resulting in a number of approaches with varying degrees of success. Arguments in favor of hierarchical modeling are made, especially for considering a complex environmental issue like widespread eastern hemlock regeneration failure. We present the philosophy and basic framework for...
ERIC Educational Resources Information Center
Anderson, Daniel
2012-01-01
This manuscript provides an overview of hierarchical linear modeling (HLM), as part of a series of papers covering topics relevant to consumers of educational research. HLM is tremendously flexible, allowing researchers to specify relations across multiple "levels" of the educational system (e.g., students, classrooms, schools, etc.).…
Bayesian Variable Selection for Hierarchical Gene-Environment and Gene-Gene Interactions
Liu, Changlu; Ma, Jianzhong; Amos, Christopher I.
2014-01-01
We propose a Bayesian hierarchical mixture model framework that allows us to investigate the genetic and environmental effects, gene by gene interactions and gene by environment interactions in the same model. Our approach incorporates the natural hierarchical structure between the main effects and interaction effects into a mixture model, such that our methods tend to remove the irrelevant interaction effects more effectively, resulting in more robust and parsimonious models. We consider both strong and weak hierarchical models. For a strong hierarchical model, both of the main effects between interacting factors must be present for the interactions to be considered in the model development, while for a weak hierarchical model, only one of the two main effects is required to be present for the interaction to be evaluated. Our simulation results show that the proposed strong and weak hierarchical mixture models work well in controlling false positive rates and provide a powerful approach for identifying the predisposing effects and interactions in gene-environment interaction studies, in comparison with the naive model that does not impose this hierarchical constraint in most of the scenarios simulated. We illustrated our approach using data for lung cancer and cutaneous melanoma. PMID:25154630
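The strong/weak hierarchy constraint described above amounts to a simple admissibility rule on candidate interactions: "strong" requires both main effects to be present, "weak" at least one. A sketch of that rule follows; the variable names are hypothetical and the Bayesian mixture machinery that enforces the constraint probabilistically is not shown.

```python
def admissible_interactions(main_effects, pairs, rule="strong"):
    """Filter candidate pairwise interactions by the hierarchy principle:
    'strong' keeps a pair only if both main effects are in the model,
    'weak' keeps it if at least one main effect is present."""
    keep = []
    for a, b in pairs:
        present = (a in main_effects) + (b in main_effects)
        if (rule == "strong" and present == 2) or (rule == "weak" and present >= 1):
            keep.append((a, b))
    return keep
```

Usage: with main effects {x1, x2} selected, a strong hierarchy admits only (x1, x2), while a weak hierarchy also admits pairs touching a single selected effect such as (x1, x3).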
Hierarchical Boltzmann simulations and model error estimation
NASA Astrophysics Data System (ADS)
Torrilhon, Manuel; Sarna, Neeraj
2017-08-01
A hierarchical simulation approach for Boltzmann's equation should provide a single numerical framework in which a coarse representation can be used to compute gas flows as accurately and efficiently as in computational fluid dynamics, while subsequent refinement allows the result to be improved successively toward the full Boltzmann solution. We use Hermite discretization, or moment equations, for the steady linearized Boltzmann equation as a proof-of-concept of such a framework. All representations of the hierarchy are rotationally invariant, and the numerical method is formulated on fully unstructured triangular and quadrilateral meshes using an implicit discontinuous Galerkin formulation. We demonstrate the performance of the numerical method on model problems, which in particular highlights the relevance of stable boundary conditions on curved domains. The hierarchical nature of the method also allows model error estimates to be obtained by comparing subsequent representations. We present various model errors for a flow through a curved channel with obstacles.
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
Interval estimation of intraclass correlation coefficients in hierarchical designs is discussed within a latent variable modeling framework. A method accomplishing this aim is outlined, which is applicable in two-level studies where participants (or generally lower-order units) are clustered within higher-order units. The procedure can also be…
Model-based hierarchical reinforcement learning and human action control
Botvinick, Matthew; Weinstein, Ari
2014-01-01
Recent work has reawakened interest in goal-directed or ‘model-based’ choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour. PMID:25267822
ERIC Educational Resources Information Center
Patz, Richard J.; Junker, Brian W.; Johnson, Matthew S.; Mariano, Louis T.
2002-01-01
Discusses the hierarchical rater model (HRM) of R. Patz (1996) and shows how it can be used to scale examinees and items, model aspects of consensus among raters, and model individual rater severity and consistency effects. Also shows how the HRM fits into the generalizability theory framework. Compares the HRM to the conventional item response…
Chad Babcock; Andrew O. Finley; John B. Bradford; Randy Kolka; Richard Birdsey; Michael G. Ryan
2015-01-01
Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both...
Towards a hierarchical optimization modeling framework for ...
Background: Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursues their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel problems, are especially difficult because the optimization is nested, meaning that the objectives of one level depend on solutions to the other levels. We introduce a hierarchical optimization framework for spatially targeting multiobjective green infrastructure (GI) incentive policies under uncertainties related to policy budget, compliance, and GI effectiveness. We demonstrate the utility of the framework using a hypothetical urban watershed, where the levels are characterized by multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities), and objectives include minimization of policy cost, implementation cost, and risk; reduction of combined sewer overflow (CSO) events; and improvement in environmental benefits such as reduced nutrient run-off and water availability. Conclusions: While computationally expensive, this hierarchical optimization framework explicitly simulates the interaction between multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities) and is especially useful for constructing and evaluating environmental and ecological policy. Using the framework with a hypothetical urban watershed...
Hierarchical models for informing general biomass equations with felled tree data
Brian J. Clough; Matthew B. Russell; Christopher W. Woodall; Grant M. Domke; Philip J. Radtke
2015-01-01
We present a hierarchical framework that uses a large multispecies felled tree database to inform a set of general models for predicting tree foliage biomass, with accompanying uncertainty, within the FIA database. Results suggest significant prediction uncertainty for individual trees and reveal higher errors when predicting foliage biomass for larger trees and for...
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-05-01
We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model at the item side. This result is valuable as it enables all well-developed modelling tools and extensions that come with these methods. We show that the restriction we impose on the hierarchical model does not influence parameter recovery under realistic circumstances. In addition, we present two illustrative real data analyses to demonstrate the practical benefits of our approach. © 2014 The British Psychological Society.
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. The framework unites the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Royle, J. Andrew; Dorazio, Robert M.
2008-01-01
A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including:
* occurrence or occupancy models for estimating species distribution
* abundance models based on many sampling protocols, including distance sampling
* capture-recapture models with individual effects
* spatial capture-recapture models based on camera trapping and related methods
* population and metapopulation dynamic models
* models of biodiversity, community structure and dynamics
Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun
2008-01-01
A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, analytical closed-form continuum model and elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction. PMID:18390626
Bayesian hierarchical model for large-scale covariance matrix estimation.
Zhu, Dongxiao; Hero, Alfred O
2007-12-01
Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approach over the traditional approaches using simulations and OMICS data analysis.
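The abstract gives no formulas, but the overfitting problem it targets can be illustrated with the simplest regularized estimator in this spirit: shrinking the sample covariance toward its diagonal. The fixed shrinkage weight below stands in for the Bayesian machinery that ties covariance parameters together, and the data are synthetic.

```python
import numpy as np

def shrink_cov(X, alpha):
    """Convex shrinkage of the sample covariance toward its diagonal;
    alpha in [0, 1] plays the role of the prior's strength."""
    S = np.cov(X, rowvar=False)
    return (1.0 - alpha) * S + alpha * np.diag(np.diag(S))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))        # illustrative data, not from the paper
S = np.cov(X, rowvar=False)         # unregularized sample covariance
S_hat = shrink_cov(X, alpha=0.5)    # off-diagonals halved, variances kept
```

Shrinkage trades a little bias for a large variance reduction when the number of variables is large relative to the number of samples, which is the regime the paper addresses.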
On the application of multilevel modeling in environmental and ecological studies
Qian, Song S.; Cuffney, Thomas F.; Alameddine, Ibrahim; McMahon, Gerard; Reckhow, Kenneth H.
2010-01-01
This paper illustrates the advantages of a multilevel/hierarchical approach for predictive modeling, including flexibility of model formulation, explicitly accounting for hierarchical structure in the data, and the ability to predict the outcome of new cases. As a generalization of the classical approach, the multilevel modeling approach explicitly models the hierarchical structure in the data by considering both the within- and between-group variances leading to a partial pooling of data across all levels in the hierarchy. The modeling framework provides means for incorporating variables at different spatiotemporal scales. The examples used in this paper illustrate the iterative process of model fitting and evaluation, a process that can lead to improved understanding of the system being studied.
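The partial pooling described above can be written down directly for the simplest case: group means are shrunk toward the grand mean with weights set by the within-group sampling variance and the between-group variance. The sketch below assumes known variance components; all numbers in the test are illustrative.

```python
import numpy as np

def partial_pool(means, sizes, sigma2_within, tau2_between):
    """Shrink each group mean toward the grand mean; groups with little
    data (large sampling variance of their mean) are pooled more strongly."""
    means = np.asarray(means, dtype=float)
    se2 = sigma2_within / np.asarray(sizes, dtype=float)  # sampling variance of each mean
    w = tau2_between / (tau2_between + se2)               # per-group pooling weight
    grand = np.average(means, weights=sizes)
    return w * means + (1.0 - w) * grand
```

The two extremes recover the classical estimators: a very large between-group variance gives no pooling (each group keeps its own mean), while a between-group variance of zero gives complete pooling (every group gets the grand mean).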
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
Dorazio, R.M.; Jelks, H.L.; Jordan, F.
2005-01-01
A statistical modeling framework is described for estimating the abundances of spatially distinct subpopulations of animals surveyed using removal sampling. To illustrate this framework, hierarchical models are developed using the Poisson and negative-binomial distributions to model variation in abundance among subpopulations and using the beta distribution to model variation in capture probabilities. These models are fitted to the removal counts observed in a survey of a federally endangered fish species. The resulting estimates of abundance have similar or better precision than those computed using the conventional approach of analyzing the removal counts of each subpopulation separately. Extension of the hierarchical models to include spatial covariates of abundance is straightforward and may be used to identify important features of an animal's habitat or to predict the abundance of animals at unsampled locations.
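For context, the classical two-pass removal estimator is the single-population special case that the hierarchical models above generalize by pooling across subpopulations. This sketch assumes a constant capture probability across passes, which is exactly the assumption the paper's beta-distributed capture probabilities relax.

```python
def two_pass_removal(c1, c2):
    """Moment estimator for abundance from two removal passes.

    Assumes constant capture probability p across passes; then
    E[c1] = N*p and E[c2] = N*(1-p)*p, giving the classic estimates
    below (valid only when c1 > c2).
    """
    if c1 <= c2:
        raise ValueError("estimator requires c1 > c2")
    p_hat = (c1 - c2) / c1
    n_hat = c1 ** 2 / (c1 - c2)
    return n_hat, p_hat

# First pass removes 60 fish, second pass 30:
# p_hat = 0.5 and N_hat = 120.
print(two_pass_removal(60, 30))
```

Analyzing each subpopulation separately with this estimator is the "conventional approach" the abstract compares against; the hierarchical version shares information across subpopulations and typically gains precision.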
ERIC Educational Resources Information Center
De los Santos, Saturnino; Norland, Emmalou Van Tilburg
A study evaluated the cacao farmer training program in the Dominican Republic by testing hypothesized relationships among reactions, knowledge and skills, attitudes, aspirations, and some selected demographic characteristics of farmers who attended programs. Bennett's hierarchical model of program evaluation was used as the framework of the study.…
Royle, J. Andrew; Converse, Sarah J.
2014-01-01
Capture–recapture studies are often conducted on populations that are stratified by space, time or other factors. In this paper, we develop a Bayesian spatial capture–recapture (SCR) modelling framework for stratified populations – when sampling occurs within multiple distinct spatial and temporal strata. We describe a hierarchical model that integrates distinct models for both the spatial encounter history data from capture–recapture sampling, and also for modelling variation in density among strata. We use an implementation of data augmentation to parameterize the model in terms of a latent categorical stratum or group membership variable, which provides a convenient implementation in popular BUGS software packages. We provide an example application to an experimental study involving small-mammal sampling on multiple trapping grids over multiple years, where the main interest is in modelling a treatment effect on population density among the trapping grids. Many capture–recapture studies involve some aspect of spatial or temporal replication that requires some attention to modelling variation among groups or strata. We propose a hierarchical model that allows explicit modelling of group or strata effects. Because the model is formulated for individual encounter histories and is easily implemented in the BUGS language and other free software, it also provides a general framework for modelling individual effects, such as are present in SCR models.
Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C
2015-01-01
Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175
NASA Astrophysics Data System (ADS)
Montillo, Albert; Song, Qi; Das, Bipul; Yin, Zhye
2015-03-01
Parsing volumetric computed tomography (CT) into 10 or more salient organs simultaneously is a challenging task with many applications such as personalized scan planning and dose reporting. In the clinic, pre-scan data can come in the form of very low dose volumes acquired just prior to the primary scan or from an existing primary scan. To localize organs in such diverse data, we propose a new learning based framework that we call hierarchical pictorial structures (HPS), which builds multiple levels of models in a tree-like hierarchy that mirrors the natural decomposition of human anatomy from gross structures to finer structures. Each node of our hierarchical model learns (1) the local appearance and shape of structures, and (2) a generative global model that learns probabilistic, structural arrangement. Our main contribution is twofold. First, we embed the pictorial structures approach in a hierarchical framework, which reduces test time image interpretation and allows for the incorporation of additional geometric constraints that robustly guide model fitting in the presence of noise. Second, we guide our HPS framework with probabilistic cost maps extracted using random decision forests on volumetric 3D HOG features, which makes our model fast to train and apply to novel test data and gives it a high degree of invariance to shape distortion and imaging artifacts. All steps require approximately 3 min to compute, and all organs are located with suitably high accuracy for our clinical applications, such as personalized scan planning for radiation dose reduction. We assess our method using a database of volumetric CT scans from 81 subjects with widely varying age and pathology and with simulated ultra-low dose cadaver pre-scan data.
Montijn, Jorrit Steven; Klink, P. Christiaan; van Wezel, Richard J. A.
2012-01-01
Divisive normalization models of covert attention commonly use spike rate modulations as indicators of the effect of top-down attention. In addition, an increasing number of studies have shown that top-down attention increases the synchronization of neuronal oscillations as well, particularly in gamma-band frequencies (25–100 Hz). Although modulations of spike rate and synchronous oscillations are not mutually exclusive as mechanisms of attention, there has thus far been little effort to integrate these concepts into a single framework of attention. Here, we aim to provide such a unified framework by expanding the normalization model of attention with a multi-level hierarchical structure and a time dimension; allowing the simulation of a recently reported backward progression of attentional effects along the visual cortical hierarchy. A simple cascade of normalization models simulating different cortical areas is shown to cause signal degradation and a loss of stimulus discriminability over time. To negate this degradation and ensure stable neuronal stimulus representations, we incorporate a kind of oscillatory phase entrainment into our model that has previously been proposed as the “communication-through-coherence” (CTC) hypothesis. Our analysis shows that divisive normalization and oscillation models can complement each other in a unified account of the neural mechanisms of selective visual attention. The resulting hierarchical normalization and oscillation (HNO) model reproduces several additional spatial and temporal aspects of attentional modulation and predicts a latency effect on neuronal responses as a result of cued attention. PMID:22586372
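The core divisive-normalization operation in this record can be sketched in a few lines; the toy drive and attention values are invented, and the two-stage cascade only illustrates the signal compression that motivates the paper's phase-entrainment mechanism, not the full HNO model.

```python
import numpy as np

def normalize(drive, attention, sigma=1.0):
    """Divisive normalization: each unit's attention-gained drive is
    divided by the pooled activity of the whole population."""
    gained = drive * attention
    return gained / (sigma + gained.sum())

drive = np.array([10.0, 10.0])        # two stimuli of equal contrast
attend_first = np.array([2.0, 1.0])   # top-down attention boosts unit 0
r = normalize(drive, attend_first)    # attended unit wins the competition

# Cascading the same operation across simulated areas compresses the
# response difference, i.e. the degradation the HNO model counteracts:
r2 = normalize(r, np.ones(2))
print(r, r2)
```

After one extra normalization stage the absolute difference between the two responses shrinks, which is the "loss of stimulus discriminability over time" the abstract describes.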
Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation
Froyen, Vicky; Feldman, Jacob; Singh, Manish
2015-01-01
We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548
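A bare-bones 1-D EM mixture fit conveys the "grouping as mixture estimation" idea: each point gets a posterior probability of being "owned" by each component. Note BHG itself uses Bayesian hierarchical clustering rather than plain EM, and the two-cluster dot data here are synthetic.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50, seed=0):
    """Minimal EM for a 1-D Gaussian mixture: estimates which component
    "owns" each point, loosely analogous to BHG's ownership estimates."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior "ownership" probabilities for each point
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: reestimate mixture parameters from ownerships
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi, resp

x = np.concatenate([np.random.default_rng(1).normal(0, 1, 100),
                    np.random.default_rng(2).normal(8, 1, 100)])
mu, var, pi, resp = em_gmm_1d(x)
print(np.sort(mu))  # component means recovered near 0 and 8
```

The `resp` matrix is the per-point ownership estimate; comparing fits with different `k` (e.g. by marginal likelihood, as BHG does in its Bayesian form) is what grades candidate decompositions.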
A Lightweight Hierarchical Activity Recognition Framework Using Smartphone Sensors
Han, Manhyung; Bang, Jae Hun; Nugent, Chris; McClean, Sally; Lee, Sungyoung
2014-01-01
Activity recognition for the purposes of recognizing a user's intentions using multimodal sensors is becoming a widely researched topic largely based on the prevalence of the smartphone. Previous studies have reported the difficulty in recognizing life-logs by only using a smartphone due to the challenges with activity modeling and real-time recognition. In addition, recognizing life-logs is difficult due to the absence of an established framework which enables the use of different sources of sensor data. In this paper, we propose a smartphone-based Hierarchical Activity Recognition Framework which extends the Naïve Bayes approach for the processing of activity modeling and real-time activity recognition. The proposed algorithm demonstrates higher accuracy than the Naïve Bayes approach and also enables the recognition of a user's activities within a mobile environment. The proposed algorithm has the ability to classify fifteen activities with an average classification accuracy of 92.96%. PMID:25184486
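The flat Naïve Bayes baseline that the proposed hierarchical framework extends can be sketched as a tiny Gaussian Naïve Bayes classifier; the accelerometer-style features and the two toy activities below are invented for illustration and are not the paper's fifteen-activity setup.

```python
import numpy as np

class GaussianNB:
    """Tiny Gaussian Naive Bayes, the baseline extended by the paper's
    hierarchical framework. Features are assumed conditionally
    independent given the activity class."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-6
                             for c in self.classes])
        self.logprior = np.log(np.array([np.mean(y == c)
                                         for c in self.classes]))
        return self
    def predict(self, X):
        # Per-class Gaussian log-likelihood summed over features
        ll = -0.5 * (np.log(2 * np.pi * self.var) +
                     (X[:, None, :] - self.mu) ** 2 / self.var).sum(axis=2)
        return self.classes[np.argmax(ll + self.logprior, axis=1)]

# Toy accelerometer-magnitude features: [mean, variance] per window.
X = np.array([[1.0, 0.01], [1.1, 0.02],   # class 0: "still"
              [1.5, 0.9], [1.6, 1.1]])    # class 1: "walking"
y = np.array([0, 0, 1, 1])
model = GaussianNB().fit(X, y)
print(model.predict(np.array([[1.05, 0.015], [1.55, 1.0]])))  # -> [0 1]
```

A hierarchical extension would first classify a coarse state (e.g. stationary vs. moving) and then dispatch to a finer classifier, which is the kind of staged recognition the abstract describes.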
Loops in hierarchical channel networks
NASA Astrophysics Data System (ADS)
Katifori, Eleni; Magnasco, Marcelo
2012-02-01
Nature provides us with many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture. Although a number of methods have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated and natural graphs extracted from digitized images of dicotyledonous leaves and animal vasculature. We calculate various metrics on the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
Risk Assessment for Mobile Systems Through a Multilayered Hierarchical Bayesian Network.
Li, Shancang; Tryfonas, Theo; Russell, Gordon; Andriotis, Panagiotis
2016-08-01
Mobile systems are facing a number of application vulnerabilities that can be combined together and utilized to penetrate systems with devastating impact. When assessing the overall security of a mobile system, it is important to assess the security risks posed by each mobile application (app), thus gaining a stronger understanding of any vulnerabilities present. This paper aims at developing a three-layer framework that assesses the potential risks which apps introduce within Android mobile systems. A Bayesian risk graphical model is proposed to evaluate risk propagation in a layered risk architecture. By integrating static analysis, dynamic analysis, and behavior analysis in a hierarchical framework, the risks and their propagation through each layer are well modeled by the Bayesian risk graph, which can quantitatively analyze the risks faced by both apps and mobile systems. The proposed hierarchical Bayesian risk graph model offers a novel way to investigate the security risks in a mobile environment and enables users and administrators to evaluate the potential risks. This strategy allows users to strengthen both app security and the security of the entire system.
Background:Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursue their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel, are especially difficu...
unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance
Fiske, Ian J.; Chandler, Richard B.
2011-01-01
Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
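The simplest model in this family, the single-season occupancy model with constant occupancy ψ and detection probability p, has a likelihood compact enough to sketch directly. This Python version is roughly what unmarked's occu() fits in R, restricted to the no-covariate case with simulated data.

```python
import numpy as np
from scipy.optimize import minimize

def occupancy_nll(params, det, J):
    """Negative log-likelihood of the basic site-occupancy model:
    a latent occupancy state (psi) and a conditional detection
    probability (p). det is a sites x J binary detection matrix."""
    psi, p = 1 / (1 + np.exp(-params))   # logit scale keeps probs in (0,1)
    d = det.sum(axis=1)
    detected = d > 0
    ll = np.where(
        detected,
        # occupied for sure: psi * p^d * (1-p)^(J-d)
        np.log(psi) + d * np.log(p) + (J - d) * np.log(1 - p),
        # never detected: occupied-but-missed, or truly unoccupied
        np.log(psi * (1 - p) ** J + (1 - psi)),
    )
    return -ll.sum()

rng = np.random.default_rng(42)
n_sites, J, psi_true, p_true = 200, 5, 0.6, 0.4
z = rng.random(n_sites) < psi_true                    # latent occupancy
det = (rng.random((n_sites, J)) < p_true) & z[:, None]
fit = minimize(occupancy_nll, x0=np.zeros(2), args=(det, J))
print(1 / (1 + np.exp(-fit.x)))  # estimates near (0.6, 0.4)
```

The second branch of the likelihood is what separates true absence from non-detection, which is the whole point of modeling the detection process explicitly.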
On the use of a PM2.5 exposure simulator to explain birthweight
Berrocal, Veronica J.; Gelfand, Alan E.; Holland, David M.; Burke, Janet; Miranda, Marie Lynn
2010-01-01
In relating pollution to birth outcomes, maternal exposure has usually been described using monitoring data. Such characterization provides a misrepresentation of exposure as it (i) does not take into account the spatial misalignment between an individual’s residence and monitoring sites, and (ii) ignores the fact that individuals spend most of their time indoors and typically in more than one location. In this paper, we break with previous studies by using a stochastic simulator to describe personal exposure (to particulate matter) and then relate simulated exposures at the individual level to the health outcome (birthweight) rather than aggregating to a selected spatial unit. We propose a hierarchical model that, at the first stage, specifies a linear relationship between birthweight and personal exposure, adjusting for individual risk factors and introduces random spatial effects for the census tract of maternal residence. At the second stage, our hierarchical model specifies the distribution of each individual’s personal exposure using the empirical distribution yielded by the stochastic simulator as well as a model for the spatial random effects. We have applied our framework to analyze birthweight data from 14 counties in North Carolina in years 2001 and 2002. We investigate whether there are certain aspects and time windows of exposure that are more detrimental to birthweight by building different exposure metrics which we incorporate, one by one, in our hierarchical model. To assess the difference in relating ambient exposure to birthweight versus personal exposure to birthweight, we compare estimates of the effect of air pollution obtained from hierarchical models that linearly relate ambient exposure and birthweight versus those obtained from our modeling framework. Our analysis does not show a significant effect of PM2.5 on birthweight for reasons which we discuss.
However, our modeling framework serves as a template for analyzing the relationship between personal exposure and longer term health endpoints. PMID:21691413
Food-web based unified model of macro- and microevolution.
Chowdhury, Debashish; Stauffer, Dietrich
2003-10-01
We incorporate the generic hierarchical architecture of foodwebs into a "unified" model that describes both micro- and macroevolutions within a single theoretical framework. This model describes the microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinctions. The distribution of lifetimes of species follows an approximate power law only over a limited regime.
A hierarchical framework of aquatic ecological units in North America (Nearctic Zone).
James R. Maxwell; Clayton J. Edwards; Mark E. Jensen; Steven J. Paustian; Harry Parrott; Donley M. Hill
1995-01-01
Proposes a framework for classifying and mapping aquatic systems at various scales using ecologically significant physical and biological criteria. Classification and mapping concepts follow tenets of hierarchical theory, pattern recognition, and driving variables. Criteria are provided for the hierarchical classification and mapping of aquatic ecological units of...
Impacts of forest fragmentation on species richness: a hierarchical approach to community modelling
Zipkin, Elise F.; DeWan, Amielle; Royle, J. Andrew
2009-01-01
1. Species richness is often used as a tool for prioritizing conservation action. One method for predicting richness and other summaries of community structure is to develop species-specific models of occurrence probability based on habitat or landscape characteristics. However, this approach can be challenging for rare or elusive species for which survey data are often sparse. 2. Recent developments have allowed for improved inference about community structure based on species-specific models of occurrence probability, integrated within a hierarchical modelling framework. This framework offers advantages to inference about species richness over typical approaches by accounting for both species-level effects and the aggregated effects of landscape composition on a community as a whole, thus leading to increased precision in estimates of species richness by improving occupancy estimates for all species, including those that were observed infrequently. 3. We developed a hierarchical model to assess the community response of breeding birds in the Hudson River Valley, New York, to habitat fragmentation and analysed the model using a Bayesian approach. 4. The model was designed to estimate species-specific occurrence and the effects of fragment area and edge (as measured through the perimeter and the perimeter/area ratio, P/A), while accounting for imperfect detection of species. 5. We used the fitted model to make predictions of species richness within forest fragments of variable morphology. The model revealed that species richness of the observed bird community was maximized in small forest fragments with a high P/A. However, the number of forest interior species, a subset of the community with high conservation value, was maximized in large fragments with low P/A. 6. Synthesis and applications. 
Our results demonstrate the importance of understanding the responses of both individual, and groups of species, to environmental heterogeneity while illustrating the utility of hierarchical models for inference about species richness for conservation. This framework can be used to investigate the impacts of land-use change and fragmentation on species or assemblage richness, and to further understand trade-offs in species-specific occupancy probabilities associated with landscape variability.
Towards a multilevel cognitive probabilistic representation of space
NASA Astrophysics Data System (ADS)
Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland
2005-03-01
This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological perspective, in terms of objects and the relationships between them. The hierarchical representation that we propose permits efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of real-world images validate the approach. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.
A Statistical Test for Comparing Nonnested Covariance Structure Models.
ERIC Educational Resources Information Center
Levy, Roy; Hancock, Gregory R.
While statistical procedures are well known for comparing hierarchically related (nested) covariance structure models, statistical tests for comparing nonhierarchically related (nonnested) models have proven more elusive. While isolated attempts have been made, none exists within the commonly used maximum likelihood estimation framework, thereby…
Quantifying loopy network architectures.
Katifori, Eleni; Magnasco, Marcelo O
2012-01-01
Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
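The Strahler machinery mentioned in this record is easy to make concrete. This sketch computes the Horton-Strahler order of a binary tree such as the ones the hierarchical loop decomposition produces; the nested-tuple tree encoding is my own choice for brevity, not the paper's representation.

```python
def strahler(node):
    """Horton-Strahler order of a binary tree.

    node is either a leaf (None) or a pair (left, right).
    Leaves have order 1; a parent of two equal-order children is one
    order higher, otherwise it takes the larger child's order.
    """
    if node is None:
        return 1
    a, b = (strahler(child) for child in node)
    return a + 1 if a == b else max(a, b)

# A balanced hierarchy gains order with depth, while a maximally
# unbalanced "vine" saturates at order 2 regardless of its depth:
balanced = ((None, None), (None, None))
vine = (None, (None, (None, None)))
print(strahler(balanced), strahler(vine))
```

Statistics built on these orders (e.g. bifurcation ratios between counts of branches at successive orders) are among the tree metrics the authors use to compare hierarchical organization across networks.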
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Jacob; Edgar, Thomas W.; Daily, Jeffrey A.
With an ever-evolving power grid, concerns regarding how to maintain system stability, efficiency, and reliability remain constant because of increasing uncertainties and decreasing rotating inertia. To alleviate some of these concerns, demand response represents a viable solution and is virtually an untapped resource in the current power grid. This work describes a hierarchical control framework that allows coordination between distributed energy resources and demand response. This control framework is composed of two control layers: a coordination layer that ensures aggregations of resources are coordinated to achieve system objectives and a device layer that controls individual resources to assure the predetermined power profile is tracked in real time. Large-scale simulations are executed to study the hierarchical control, requiring advancements in simulation capabilities. Technical advancements necessary to investigate and answer control interaction questions, including the Framework for Network Co-Simulation platform and Arion modeling capability, are detailed. Insights into the interdependencies of controls across a complex system and how they must be tuned, as well as validation of the effectiveness of the proposed control framework, are yielded using a large-scale integrated transmission system model coupled with multiple distribution systems.
Fermion hierarchy from sfermion anarchy
Altmannshofer, Wolfgang; Frugiuele, Claudia; Harnik, Roni
2014-12-31
We present a framework to generate the hierarchical flavor structure of Standard Model quarks and leptons from loops of superpartners. The simplest model consists of the minimal supersymmetric standard model with tree level Yukawa couplings for the third generation only and anarchic squark and slepton mass matrices. Agreement with constraints from low energy flavor observables, in particular Kaon mixing, is obtained for supersymmetric particles with masses at the PeV scale or above. In our framework both the second and the first generation fermion masses are generated at 1-loop. Despite this, a novel mechanism generates a hierarchy among the first and second generations without imposing a symmetry or small parameters. A second-to-first generation mass ratio of order 100 is typical. The minimal supersymmetric standard model thus includes all the necessary ingredients to realize a fermion spectrum that is qualitatively similar to observation, with hierarchical masses and mixing. The minimal framework produces only a few quantitative discrepancies with observation, most notably the muon mass is too low. Furthermore, we discuss simple modifications which resolve this and also investigate the compatibility of our model with gauge and Yukawa coupling unification.
NASA Astrophysics Data System (ADS)
Fijani, E.; Chitsazan, N.; Nadiri, A.; Tsai, F. T.; Asghari Moghaddam, A.
2012-12-01
Artificial Neural Networks (ANNs) have been widely used to estimate concentrations of chemicals in groundwater systems. However, estimation uncertainty is rarely discussed in the literature. Uncertainty in ANN output stems from three sources: ANN inputs, ANN parameters (weights and biases), and ANN structures. Uncertainty in ANN inputs may come from input data selection and/or input data error. ANN parameters are naturally uncertain because they are maximum-likelihood estimated. ANN structure is also uncertain because there is no unique ANN model for a given case. Therefore, multiple plausible ANN models generally result for a study. One might ask why good models have to be ignored in favor of the best model in traditional estimation. What is the ANN estimation variance? How do the variances from different ANN models accumulate into the total estimation variance? To answer these questions we propose a Hierarchical Bayesian Model Averaging (HBMA) framework. Instead of choosing one ANN model (the best ANN model) for estimation, HBMA averages the outputs of all plausible ANN models. The model weights are based on the evidence of the data. Therefore, HBMA avoids overconfidence in the single best ANN model. In addition, HBMA is able to analyze uncertainty propagation through the aggregation of ANN models in a hierarchical framework. This method is applied to the estimation of fluoride concentration in the Poldasht plain and the Bazargan plain in Iran, where unusually high fluoride concentrations have had negative effects on public health. Management of this anomaly requires estimation of the fluoride concentration distribution in the area. The results show that HBMA provides a knowledge-decision-based framework that facilitates analyzing and quantifying ANN estimation uncertainties from different sources.
In addition, HBMA allows comparative evaluation of the realizations for each source of uncertainty by segregating the uncertainty sources in a hierarchical framework. Fluoride concentration estimates obtained with the HBMA method show better agreement with the observation data in the testing step because they are not based on a single model whose weight may not dominate.
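The averaging step at the heart of HBMA can be illustrated with a minimal Bayesian model averaging sketch. Everything below is illustrative, not from the study: `bma` is a hypothetical helper, and the evidences and predictions are made-up numbers. Predictions from several plausible models are weighted by the evidence of the data, and the total estimation variance decomposes into a within-model and a between-model part.

```python
import math

def bma(predictions, variances, log_evidences):
    """Bayesian model averaging: combine predictions from plausible models,
    weighting each by the (log) evidence of the data."""
    m = max(log_evidences)
    unnorm = [math.exp(le - m) for le in log_evidences]   # stable exponentiation
    total = sum(unnorm)
    weights = [u / total for u in unnorm]                 # posterior model weights
    mean = sum(w * p for w, p in zip(weights, predictions))
    within = sum(w * v for w, v in zip(weights, variances))                   # averaged per-model variance
    between = sum(w * (p - mean) ** 2 for w, p in zip(weights, predictions))  # spread across models
    return mean, within + between, weights
```

With equal evidences the weights reduce to a simple mean; unequal evidences shift weight toward the better-supported models, which is how HBMA avoids committing to a single best model.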
Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.
2015-01-01
Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992
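The recycling idea can be sketched with self-normalized importance sampling, a simplified stand-in for the dynamic iterative reweighting MCMC algorithm the abstract describes. The function `recycle` and the Gaussian densities in the usage are illustrative assumptions, not the authors' implementation: realizations drawn under a stratified analysis (the proposal) are reweighted into estimates under a joint target.

```python
import math

def recycle(draws, log_target, log_proposal):
    """Recycle realizations drawn under one model (the proposal) into estimates
    under another (the joint target) via self-normalized importance weights."""
    lw = [log_target(x) - log_proposal(x) for x in draws]
    m = max(lw)                                          # subtract max for numerical stability
    w = [math.exp(v - m) for v in lw]
    s = sum(w)
    weights = [v / s for v in w]
    est = sum(wi * x for wi, x in zip(weights, draws))   # estimate of E_target[X]
    ess = 1.0 / sum(wi * wi for wi in weights)           # effective sample size
    return est, ess
```

A low effective sample size signals that the stratified realizations overlap poorly with the joint target, in which case reweighting alone is not enough.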
unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance
Fiske, I.J.; Chandler, R.B.
2011-01-01
Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
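The simplest model class in this family, the single-season occupancy model, can be sketched as follows. Python is used here only for illustration (the package itself is in R), and `occupancy_loglik` is a hypothetical helper: the key feature is that occupancy probability psi and detection probability p are modeled separately, because an all-zero detection history can arise either from an unoccupied site or from repeated missed detections.

```python
import math

def occupancy_loglik(histories, psi, p):
    """Log-likelihood of a single-season occupancy model: each site is occupied
    with probability psi; if occupied, each visit detects with probability p."""
    ll = 0.0
    for h in histories:                       # h: tuple of 0/1 detections per visit
        d, n = sum(h), len(h)
        if d > 0:                             # detected at least once => site occupied
            ll += math.log(psi) + d * math.log(p) + (n - d) * math.log(1.0 - p)
        else:                                 # never detected: occupied-but-missed OR empty
            ll += math.log(psi * (1.0 - p) ** n + (1.0 - psi))
    return ll
```

Maximizing this likelihood over (psi, p), with covariates entering through link functions, is essentially what the package's occupancy-fitting routine does under the hood.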
NASA Astrophysics Data System (ADS)
Eadie, Gwendolyn M.; Springford, Aaron; Harris, William E.
2017-02-01
We present a hierarchical Bayesian method for estimating the total mass and mass profile of the Milky Way Galaxy. The new hierarchical Bayesian approach further improves the framework presented by Eadie et al. and Eadie and Harris and builds upon the preliminary reports by Eadie et al. The method uses a distribution function f(E, L) to model the Galaxy and kinematic data from satellite objects, such as globular clusters (GCs), to trace the Galaxy's gravitational potential. A major advantage of the method is that it not only includes complete and incomplete data simultaneously in the analysis, but also incorporates measurement uncertainties in a coherent and meaningful way. We first test the hierarchical Bayesian framework, which includes measurement uncertainties, using the same data and power-law model assumed in Eadie and Harris and find the results are similar but more strongly constrained. Next, we take advantage of the new statistical framework and incorporate all possible GC data, finding a cumulative mass profile with Bayesian credible regions. This profile implies a mass within 125 kpc of 4.8 × 10^11 M_⊙ with a 95% Bayesian credible region of (4.0–5.8) × 10^11 M_⊙. Our results also provide estimates of the true specific energies of all the GCs. By comparing these estimated energies to the measured energies of GCs with complete velocity measurements, we observe that (the few) remote tracers with complete measurements may play a large role in determining a total mass estimate of the Galaxy. Thus, our study stresses the need for more remote tracers with complete velocity measurements.
A hierarchical competing systems model of the emergence and early development of executive function
Marcovitch, Stuart; Zelazo, Philip David
2010-01-01
The hierarchical competing systems model (HCSM) provides a framework for understanding the emergence and early development of executive function – the cognitive processes underlying the conscious control of behavior – in the context of search for hidden objects. According to this model, behavior is determined by the joint influence of a developmentally invariant habit system and a conscious representational system that becomes increasingly influential as children develop. This article describes a computational formalization of the HCSM, reviews behavioral and computational research consistent with the model, and suggests directions for future research on the development of executive function. PMID:19120405
Documentation of the Douglas-fir tussock moth outbreak-population model.
J.J. Colbert; W. Scott Overton; Curtis. White
1979-01-01
Documentation of three model versions: the Douglas-fir tussock moth population-branch model on (1) daily temporal resolution and (2) instar temporal resolution, and (3) the Douglas-fir tussock moth stand-outbreak model; the hierarchical framework and the conceptual paradigm used are described. The coupling of the model with a normal-stand model is discussed. The modeling...
We introduce a hierarchical optimization framework for spatially targeting green infrastructure (GI) incentive policies in order to meet objectives related to cost and environmental effectiveness. The framework explicitly simulates the interaction between multiple levels of polic...
A study of microindentation hardness tests by mechanism-based strain gradient plasticity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Y.; Xue, Z.; Gao, H.
2000-08-01
We recently proposed a theory of mechanism-based strain gradient (MSG) plasticity to account for the size dependence of plastic deformation at micron- and submicron-length scales. The MSG plasticity theory connects micron-scale plasticity to dislocation theories via a multiscale, hierarchical framework linking Taylor's dislocation hardening model to strain gradient plasticity. Here we show that the theory of MSG plasticity, when used to study microindentation, indeed reproduces the linear dependence observed in experiments, thus providing an important self-consistent check of the theory. The effects of pileup, sink-in, and the radius of the indenter tip have been taken into account in the indentation model. In accomplishing this objective, we have generalized the MSG plasticity theory to include the elastic deformation in the hierarchical framework. (c) 2000 Materials Research Society.
Kotliar, Natasha B.; Wiens, John A.
1990-01-01
We develop a hierarchical model of heterogeneity that provides a framework for classifying patch structure across a range of scales. Patches at lower levels in the hierarchy are simpler and correspond to the traditional view of patches. At levels approaching the upper bounds of the hierarchy, the internal structure becomes more heterogeneous and boundaries more ambiguous. At each level in the hierarchy, patch structure is influenced both by contrast among patches and by the degree of aggregation of patches at lower levels. We apply this model to foraging theory, but it has wider applications, as in the study of habitat selection, population dynamics, and habitat fragmentation. It may also be useful in expanding the realm of landscape ecology beyond the current focus on anthropocentric scales.
Understanding seasonal variability of uncertainty in hydrological prediction
NASA Astrophysics Data System (ADS)
Li, M.; Wang, Q. J.
2012-04-01
Understanding uncertainty in hydrological prediction can be highly valuable for improving the reliability of streamflow prediction. In this study, a monthly water balance model, WAPABA, is combined with three error models in a Bayesian joint probability framework to investigate the seasonal dependency of the prediction error structure. A seasonally invariant error model, analogous to traditional time series analysis, uses constant parameters for model error and accounts for no seasonal variation. In contrast, a seasonally variant error model uses a different set of parameters for bias, variance, and autocorrelation for each individual calendar month. Potential connections among model parameters from similar months are not considered within the seasonally variant model, which could result in over-fitting and over-parameterization. A hierarchical error model further applies distributional restrictions on the model parameters within a Bayesian hierarchical framework. An iterative algorithm is implemented to expedite the maximum a posteriori (MAP) estimation of the hierarchical error model. The three error models are applied to forecasting streamflow at a catchment in southeast Australia in a cross-validation analysis. This study also presents a number of statistical measures and graphical tools to compare the predictive skills of the different error models. From probability integral transform histograms and other diagnostic graphs, the hierarchical error model is better calibrated than the seasonally invariant error model. The hierarchical error model also generally provides the most accurate mean prediction in terms of the Nash-Sutcliffe model efficiency coefficient and the best probabilistic prediction in terms of the continuous ranked probability score (CRPS). The model parameters of the seasonally variant error model are very sensitive to each cross-validation fold, whereas the hierarchical error model produces much more robust and reliable parameter estimates.
Furthermore, the results of the hierarchical error model show that most model parameters are not seasonally variant, with the exception of the error bias. The seasonally variant error model is likely to use more parameters than necessary to maximize the posterior likelihood. This flexibility and robustness indicate that the hierarchical error model has great potential for future streamflow predictions.
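The advantage of the hierarchical error model over the seasonally variant one can be sketched with a simple partial-pooling shrinkage rule. This is illustrative only: `partial_pool` and its inputs are hypothetical, not the paper's MAP algorithm. Per-month bias estimates are pulled toward a common mean, with the amount of pooling governed by the assumed between-month variance tau2.

```python
def partial_pool(monthly_means, monthly_vars, tau2):
    """Hierarchical shrinkage: pull noisy per-month bias estimates toward their
    common mean. tau2 is the between-month variance; tau2 = 0 forces complete
    pooling (one shared bias), very large tau2 recovers the per-month estimates."""
    grand = sum(monthly_means) / len(monthly_means)
    pooled = []
    for m, v in zip(monthly_means, monthly_vars):
        k = tau2 / (tau2 + v)            # shrinkage factor in [0, 1)
        pooled.append(grand + k * (m - grand))
    return pooled
```

Months with noisy estimates borrow strength from the rest, which is why the hierarchical model's parameters are more stable across cross-validation folds than those of the fully seasonally variant model.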
NASA Astrophysics Data System (ADS)
Sahai, Swupnil
This thesis includes three parts. The overarching theme is how to analyze structured hierarchical data, with applications to astronomy and sociology. The first part discusses how expectation propagation can be used to parallelize the computation when fitting big hierarchical Bayesian models. This methodology is then used to fit a novel, nonlinear mixture model to ultraviolet radiation from various regions of the observable universe. The second part discusses how the Stan probabilistic programming language can be used to numerically integrate terms in a hierarchical Bayesian model. This technique is demonstrated on supernovae data to significantly speed up convergence to the posterior distribution compared to a previous study that used a Gibbs-type sampler. The third part builds a formal latent kernel representation for aggregate relational data as a way to more robustly estimate the mixing characteristics of agents in a network. In particular, the framework is applied to sociology surveys to estimate, as a function of ego age, the age and sex composition of the personal networks of individuals in the United States.
NASA Astrophysics Data System (ADS)
Kashim, Rosmaini; Kasim, Maznah Mat; Rahman, Rosshairy Abd
2015-12-01
Measuring university performance is essential for efficient allocation and utilization of educational resources. Most previous studies of performance measurement in universities emphasized operational efficiency and resource utilization without investigating the university's ability to fulfill the needs of its stakeholders and society. Therefore, assessment of university performance should be separated into two stages, namely efficiency and effectiveness. In conventional DEA analysis, a decision making unit (DMU), or in this context a university, is generally treated as a black box, which ignores the operation and interdependence of the internal processes; when this happens, the results obtained can be misleading. Thus, this paper suggests an alternative framework for measuring the overall performance of a university that incorporates both efficiency and effectiveness and applies a network DEA model. Network DEA models are recommended because this approach takes into account the interrelationship between the efficiency and effectiveness processes in the system. The framework also focuses on a university structure that is expanded from a purely hierarchical one to include a series of horizontal relationships between subordinate units, by assuming that both an intermediate unit and its subordinate units can generate output(s). Three conceptual models are proposed to evaluate the performance of a university. An efficiency model is developed at the first stage by using a hierarchical network model. It is followed by an effectiveness model, which takes output(s) from the first-stage hierarchical structure as input(s) at the second stage. As a result, a new overall performance model is proposed by combining the efficiency and effectiveness models. Once this overall model is realized and utilized, the university's top management can determine the overall performance of each unit more accurately and systematically.
In addition, results from the network DEA model offer superior benchmarking power over conventional models.
A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction.
Yan, Yiming; Gao, Fengjiao; Deng, Shupei; Su, Nan
2017-01-24
In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), which is used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods overrely on the completeness of offline-constructed building models, and that completeness is not easily guaranteed, since buildings in modern cities can be of a wide variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings is introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and a recently proposed 'occlusions of random textures model' is then used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour guided interpolation of building facades. The 3D reconstruction results achieved with airborne-like and satellite images are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily carried out within our framework, and that better visualization results are obtained with airborne-like images, which can further be replaced by UAV images.
ERIC Educational Resources Information Center
Rockwell, S. Kay; Albrecht, Julie A.; Nugent, Gwen C.; Kunz, Gina M.
2012-01-01
Targeting Outcomes of Programs (TOP) is a seven-step hierarchical programming model in which the program development and performance sides are mirror images of each other. It served as a framework to identify a simple method for targeting photographic events in nonformal education programs, indicating why, when, and how photographs would be useful…
Hierarchic models for laminated plates. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Actis, Ricardo Luis
1991-01-01
Structural plates and shells are three-dimensional bodies, one dimension of which happens to be much smaller than the other two. Thus, the quality of a plate or shell model must be judged on the basis of how well its exact solution approximates the corresponding three-dimensional problem. Of course, the exact solution depends not only on the choice of the model but also on the topology, material properties, loading and constraints. The desired degree of approximation depends on the analyst's goals in performing the analysis. For these reasons models have to be chosen adaptively. Hierarchic sequences of models make possible the adaptive selection of the model best suited for the purposes of a particular analysis. The principles governing the formulation of hierarchic models for laminated plates are presented. The essential features of the hierarchic models described are: (1) the exact solutions corresponding to the hierarchic sequence of models converge to the exact solution of the corresponding problem of elasticity for a fixed laminate thickness; and (2) the exact solution of each model converges to the same limit as the exact solution of the corresponding problem of elasticity as the laminate thickness approaches zero. The formulation is based on one parameter (beta) which characterizes the hierarchic sequence of models, and a set of constants whose influence was assessed by a numerical sensitivity study. The recommended selection of these constants results in the number of fields increasing by three for each increment in the power of beta. Numerical examples analyzed with the proposed sequence of models are included, and good correlation with the reference solutions was found. Results were obtained for laminated strips (plates in cylindrical bending) and for square and rectangular plates with uniform loading and with homogeneous boundary conditions.
Cross-ply and angle-ply laminates were evaluated and the results compared with those of MSC/PROBE. Hierarchic models make the computation of any engineering data possible to an arbitrary level of precision within the framework of the theory of elasticity.
ERIC Educational Resources Information Center
Zhu, Xiaoshu
2013-01-01
The current study introduced a general modeling framework, multilevel mixture IRT (MMIRT) which detects and describes characteristics of population heterogeneity, while accommodating the hierarchical data structure. In addition to introducing both continuous and discrete approaches to MMIRT, the main focus of the current study was to distinguish…
Using Response Times for Item Selection in Adaptive Testing
ERIC Educational Resources Information Center
van der Linden, Wim J.
2008-01-01
Response times on items can be used to improve item selection in adaptive testing provided that a probabilistic model for their distribution is available. In this research, the author used a hierarchical modeling framework with separate first-level models for the responses and response times and a second-level model for the distribution of the…
NASA Astrophysics Data System (ADS)
Graham, James; Ternovskiy, Igor V.
2013-06-01
We applied a two-stage unsupervised hierarchical learning system to model complex dynamic surveillance and cyberspace monitoring systems using a non-commercial version of the NeoAxis visualization software. The hierarchical scene learning and recognition approach is based on hierarchical expectation maximization and was linked to a 3D graphics engine for validating learning and classification results and for understanding the human-autonomous system relationship. Scene recognition is performed by taking synthetically generated data and feeding it to a dynamic logic algorithm. The algorithm performs hierarchical recognition of the scene by first examining the features of the objects to determine which objects are present, and then determining the scene based on the objects present. This paper presents a framework within which low-level data linked to higher-level visualization can provide support to a human operator and be evaluated in a detailed and systematic way.
Verbal Neuropsychological Functions in Aphasia: An Integrative Model
ERIC Educational Resources Information Center
Vigliecca, Nora Silvana; Báez, Sandra
2015-01-01
A theoretical framework which considers the verbal functions of the brain under a multivariate and comprehensive cognitive model was statistically analyzed. A confirmatory factor analysis was performed to verify whether some recognized aphasia constructs can be hierarchically integrated as latent factors from a homogenously verbal test. The Brief…
Theory Learning as Stochastic Search in the Language of Thought
ERIC Educational Resources Information Center
Ullman, Tomer D.; Goodman, Noah D.; Tenenbaum, Joshua B.
2012-01-01
We present an algorithmic model for the development of children's intuitive theories within a hierarchical Bayesian framework, where theories are described as sets of logical laws generated by a probabilistic context-free grammar. We contrast our approach with connectionist and other emergentist approaches to modeling cognitive development. While…
Why environmental scientists are becoming Bayesians
James S. Clark
2005-01-01
Advances in computational statistics provide a general framework for the high dimensional models typically needed for ecological inference and prediction. Hierarchical Bayes (HB) represents a modelling structure with capacity to exploit diverse sources of information, to accommodate influences that are unknown (or unknowable), and to draw inference on large numbers of...
The hierarchical structure of self-reported impulsivity
Kirby, Kris N.; Finch, Julia C.
2010-01-01
The hierarchical structure of 95 self-reported impulsivity items, along with delay-discount rates for money, was examined. A large sample of college students participated in the study (N = 407). Items represented every previously proposed dimension of self-reported impulsivity. Exploratory PCA yielded at least 7 interpretable components: Prepared/Careful, Impetuous, Divertible, Thrill and Risk Seeking, Happy-Go-Lucky, Impatiently Pleasure Seeking, and Reserved. Discount rates loaded on Impatiently Pleasure Seeking, and correlated with the impulsiveness and venturesomeness scales from the I7 (Eysenck, Pearson, Easting, & Allsopp, 1985). The hierarchical emergence of the components was explored, and we show how this hierarchical structure may help organize conflicting dimensions found in previous analyses. Finally, we argue that the discounting model (Ainslie, 1975) provides a qualitative framework for understanding the dimensions of impulsivity. PMID:20224803
Judge, Timothy A; Rodell, Jessica B; Klinger, Ryan L; Simon, Lauren S; Crawford, Eean R
2013-11-01
Integrating 2 theoretical perspectives on predictor-criterion relationships, the present study developed and tested a hierarchical framework in which each five-factor model (FFM) personality trait comprises 2 DeYoung, Quilty, and Peterson (2007) facets, which in turn comprise 6 Costa and McCrae (1992) NEO facets. Both theoretical perspectives, the bandwidth-fidelity dilemma and construct correspondence, suggest that lower order traits would better predict facets of job performance (task performance and contextual performance). They differ, however, as to the relative merits of broad and narrow traits in predicting a broad criterion (overall job performance). We first meta-analyzed the relationship of the 30 NEO facets to overall job performance and its facets. Overall, 1,176 correlations from 410 independent samples (combined N = 406,029) were coded and meta-analyzed. We then formed the 10 DeYoung et al. facets from the NEO facets, and 5 broad traits from those facets. Overall, results provided support for the 6-2-1 framework in general and the importance of the NEO facets in particular. (c) 2013 APA, all rights reserved.
Decomposition and extraction: a new framework for visual classification.
Fang, Yuqiang; Chen, Qiang; Sun, Lin; Dai, Bin; Yan, Shuicheng
2014-08-01
In this paper, we present a novel framework for visual classification based on hierarchical image decomposition and hybrid midlevel feature extraction. Unlike most midlevel feature learning methods, which focus on the process of coding or pooling, we emphasize that the mechanism of image composition also strongly influences feature extraction. To effectively explore the image content for feature extraction, we model a multiplicity feature representation mechanism through meaningful hierarchical image decomposition followed by a fusion step. In particular, we first propose a new hierarchical image decomposition approach in which each image is decomposed into a series of hierarchical semantic components, i.e., structure and texture images. Then, different feature extraction schemes can be adopted to match the decomposed structure and texture processes in a dissociative manner. Here, two schemes are explored to produce property-related feature representations. One is based on a single-stage network over hand-crafted features and the other is based on a multistage network, which can learn features from raw pixels automatically. Finally, these multiple midlevel features are incorporated by solving a multiple kernel learning task. Extensive experiments are conducted on several challenging data sets for visual classification, and the results demonstrate the effectiveness of the proposed method.
Modeling Bivariate Longitudinal Hormone Profiles by Hierarchical State Space Models
Liu, Ziyue; Cappola, Anne R.; Crofford, Leslie J.; Guo, Wensheng
2013-01-01
The hypothalamic-pituitary-adrenal (HPA) axis is crucial in coping with stress and maintaining homeostasis. Hormones produced by the HPA axis exhibit both complex univariate longitudinal profiles and complex relationships among different hormones. Consequently, modeling these multivariate longitudinal hormone profiles is a challenging task. In this paper, we propose a bivariate hierarchical state space model, in which each hormone profile is modeled by a hierarchical state space model, with both population-average and subject-specific components. The bivariate model is constructed by concatenating the univariate models based on the hypothesized relationship. Because of the flexible framework of state space form, the resultant models not only can handle complex individual profiles, but also can incorporate complex relationships between two hormones, including both concurrent and feedback relationship. Estimation and inference are based on marginal likelihood and posterior means and variances. Computationally efficient Kalman filtering and smoothing algorithms are used for implementation. Application of the proposed method to a study of chronic fatigue syndrome and fibromyalgia reveals that the relationships between adrenocorticotropic hormone and cortisol in the patient group are weaker than in healthy controls. PMID:24729646
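The computational core the abstract mentions, Kalman filtering, can be sketched for the simplest univariate local-level state space model. This is an illustrative building block only; the paper's bivariate hierarchical model with population-average and subject-specific components is far richer.

```python
def kalman_filter(ys, q, r, x0=0.0, p0=1.0):
    """Univariate local-level Kalman filter: q is the state-noise variance,
    r the observation-noise variance. Returns the filtered state estimate
    after each observation."""
    x, p = x0, p0
    filtered = []
    for y in ys:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain balances prediction vs. observation
        x = x + k * (y - x)       # update the state toward the new observation
        p = (1.0 - k) * p         # update the state variance
        filtered.append(x)
    return filtered
```

Concatenating two such models, with cross terms encoding the hypothesized concurrent and feedback relationships, gives the bivariate structure the paper estimates by marginal likelihood.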
Feng, Liang; Yuan, Shuai; Zhang, Liang-Liang; Tan, Kui; Li, Jia-Luo; Kirchon, Angelo; Liu, Ling-Mei; Zhang, Peng; Han, Yu; Chabal, Yves J; Zhou, Hong-Cai
2018-02-14
Sufficient pore size, appropriate stability, and hierarchical porosity are three prerequisites for open frameworks designed for drug delivery, enzyme immobilization, and catalysis involving large molecules. Herein, we report a powerful and general strategy, linker thermolysis, to construct ultrastable hierarchically porous metal-organic frameworks (HP-MOFs) with tunable pore size distribution. Linker instability, usually an undesirable trait of MOFs, was exploited to create mesopores by generating crystal defects throughout a microporous MOF crystal via thermolysis. The crystallinity and stability of HP-MOFs remain after thermolabile linkers are selectively removed from multivariate metal-organic frameworks (MTV-MOFs) through a decarboxylation process. A domain-based linker spatial distribution was found to be critical for creating hierarchical pores inside MTV-MOFs. Furthermore, linker thermolysis promotes the formation of ultrasmall metal oxide nanoparticles immobilized in an open framework that exhibits high catalytic activity for Lewis acid-catalyzed reactions. Most importantly, this work provides fresh insights into the connection between linker apportionment and vacancy distribution, which may shed light on probing the disordered linker apportionment in multivariate systems, a long-standing challenge in the study of MTV-MOFs.
Shi, Chengxiang; Wang, Wenxuan; Liu, Ni; Xu, Xueyan; Wang, Danhong; Zhang, Minghui; Sun, Pingchuan; Chen, Tiehong
2015-07-21
Hierarchically porous Ti-SBA-2 with high framework Ti content (up to 5 wt%) was synthesized for the first time by employing organic mesomorphous complexes of a cationic surfactant (CTAB) and an anionic polyelectrolyte (PAA) as templates. The material exhibited excellent performance in oxidative desulfurization of diesel fuel at low temperature (40 °C or 25 °C) due to the unique hierarchically porous structure and high framework Ti content.
Hierarchical mark-recapture models: a framework for inference about demographic processes
Link, W.A.; Barker, R.J.
2004-01-01
The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities w (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly described in the model.
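The hierarchical idea described here, parameters as realizations of a process governed by hyperparameters, can be sketched with a toy simulation in which year-specific survival probabilities are logit-normal draws from a hyperdistribution. This is an illustrative sketch with hypothetical names and parameters, not the authors' Cormack-Jolly-Seber machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hierarchical_survival(n_years, mu, sigma, n_marked):
    """Draw year-specific survival probabilities from a logit-normal
    hyperdistribution (mu, sigma), then binomial numbers of survivors:
    'true value equals signal plus noise' on the logit scale."""
    logit_phi = rng.normal(mu, sigma, size=n_years)   # signal plus noise
    phi = 1.0 / (1.0 + np.exp(-logit_phi))            # survival probabilities
    survivors = rng.binomial(n_marked, phi)           # observed survivors
    return phi, survivors

phi, survivors = simulate_hierarchical_survival(20, mu=1.0, sigma=0.3, n_marked=500)
```

Inference in the hierarchical setting targets `mu` and `sigma` rather than the twenty individual survival rates.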
Image Search Reranking With Hierarchical Topic Awareness.
Tian, Xinmei; Yang, Linjun; Lu, Yijuan; Tian, Qi; Tao, Dacheng
2015-10-01
With much attention from both academia and industrial communities, visual search reranking has recently been proposed to refine image search results obtained from text-based image search engines. Most traditional reranking methods cannot capture both the relevance and the diversity of the search results at the same time, or they ignore the hierarchical topic structure of the search results, treating each topic equally and independently. However, in real applications, images returned for certain queries are naturally in a hierarchical organization rather than a simple parallel relation. In this paper, a new reranking method, "topic-aware reranking (TARerank)", is proposed. TARerank describes the hierarchical topic structure of the search results in one model and seamlessly captures both the relevance and the diversity of the image search results. Through a structured learning framework, relevance and diversity are modeled in TARerank by a set of carefully designed features, and the model is then learned from human-labeled training samples. The learned model is expected to predict reranking results with high relevance and diversity for testing queries. To verify the effectiveness of the proposed method, we collect an image search dataset and conduct comparison experiments on it. The experimental results demonstrate that the proposed TARerank outperforms existing relevance-based and diversified reranking methods.
Chalmers, Eric; Luczak, Artur; Gruber, Aaron J.
2016-01-01
The mammalian brain is thought to use a version of Model-based Reinforcement Learning (MBRL) to guide “goal-directed” behavior, wherein animals consider goals and make plans to acquire desired outcomes. However, conventional MBRL algorithms do not fully explain animals' ability to rapidly adapt to environmental changes, or learn multiple complex tasks. They also require extensive computation, suggesting that goal-directed behavior is cognitively expensive. We propose here that key features of processing in the hippocampus support a flexible MBRL mechanism for spatial navigation that is computationally efficient and can adapt quickly to change. We investigate this idea by implementing a computational MBRL framework that incorporates features inspired by computational properties of the hippocampus: a hierarchical representation of space, “forward sweeps” through future spatial trajectories, and context-driven remapping of place cells. We find that a hierarchical abstraction of space greatly reduces the computational load (mental effort) required for adaptation to changing environmental conditions, and allows efficient scaling to large problems. It also allows abstract knowledge gained at high levels to guide adaptation to new obstacles. Moreover, a context-driven remapping mechanism allows learning and memory of multiple tasks. Simulating dorsal or ventral hippocampal lesions in our computational framework qualitatively reproduces behavioral deficits observed in rodents with analogous lesions. The framework may thus embody key features of how the brain organizes model-based RL to efficiently solve navigation and other difficult tasks. PMID:28018203
Hierarchical spatial models for predicting pygmy rabbit distribution and relative abundance
Wilson, T.L.; Odei, J.B.; Hooten, M.B.; Edwards, T.C.
2010-01-01
Conservationists routinely use species distribution models to plan conservation, restoration and development actions, while ecologists use them to infer process from pattern. These models tend to work well for common or easily observable species, but are of limited utility for rare and cryptic species. This may be because honest accounting of known observation bias and spatial autocorrelation are rarely included, thereby limiting statistical inference of resulting distribution maps. We specified and implemented a spatially explicit Bayesian hierarchical model for a cryptic mammal species (pygmy rabbit Brachylagus idahoensis). Our approach used two levels of indirect sign that are naturally hierarchical (burrows and faecal pellets) to build a model that allows for inference on regression coefficients as well as spatially explicit model parameters. We also produced maps of rabbit distribution (occupied burrows) and relative abundance (number of burrows expected to be occupied by pygmy rabbits). The model demonstrated statistically rigorous spatial prediction by including spatial autocorrelation and measurement uncertainty. We demonstrated flexibility of our modelling framework by depicting probabilistic distribution predictions using different assumptions of pygmy rabbit habitat requirements. Spatial representations of the variance of posterior predictive distributions were obtained to evaluate heterogeneity in model fit across the spatial domain. Leave-one-out cross-validation was conducted to evaluate the overall model fit. Synthesis and applications. Our method draws on the strengths of previous work, thereby bridging and extending two active areas of ecological research: species distribution models and multi-state occupancy modelling. Our framework can be extended to encompass both larger extents and other species for which direct estimation of abundance is difficult. © 2010 The Authors. Journal compilation © 2010 British Ecological Society.
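The two-level structure this abstract describes, latent occupancy observed through imperfect detection, can be sketched with a toy simulation. `simulate_occupancy` and its parameters are hypothetical illustrations, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_occupancy(n_sites, psi, p, n_visits):
    """Two-level hierarchy: latent occupancy z ~ Bernoulli(psi); detections
    y ~ Binomial(n_visits, p) only where z = 1 (imperfect detection)."""
    z = rng.binomial(1, psi, n_sites)     # latent occupancy state
    y = rng.binomial(n_visits, p * z)     # detections; always 0 if unoccupied
    return z, y

z, y = simulate_occupancy(n_sites=2000, psi=0.4, p=0.3, n_visits=4)
naive = (y > 0).mean()   # naive estimate: fraction of sites with any detection
```

Because detection fails at some occupied sites, the naive estimate underestimates true occupancy, which is exactly the bias a hierarchical occupancy model corrects.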
Power to Detect Intervention Effects on Ensembles of Social Networks
ERIC Educational Resources Information Center
Sweet, Tracy M.; Junker, Brian W.
2016-01-01
The hierarchical network model (HNM) is a framework introduced by Sweet, Thomas, and Junker for modeling interventions and other covariate effects on ensembles of social networks, such as what would be found in randomized controlled trials in education research. In this article, we develop calculations for the power to detect an intervention…
ERIC Educational Resources Information Center
Tchumtchoua, Sylvie; Dey, Dipak K.
2012-01-01
This paper proposes a semiparametric Bayesian framework for the analysis of associations among multivariate longitudinal categorical variables in high-dimensional data settings. This type of data is frequent, especially in the social and behavioral sciences. A semiparametric hierarchical factor analysis model is developed in which the…
Elemental Learning as a Framework for E-Learning
ERIC Educational Resources Information Center
Dempsey, John V.; Litchfield, Brenda C.
2013-01-01
Analysis of learning outcomes can be a complex and esoteric instructional design process that is often ignored by educators and e-learning designers. This paper describes a model of analysis that fosters the real-life application of learning outcomes and explains why the model may be needed. The Elemental Learning taxonomy is a hierarchical model…
Technology and Participation in Japanese Factories: The Consequences for Morale and Productivity.
ERIC Educational Resources Information Center
Hull, Frank; Azumi, Koya
1988-01-01
By fully using their human resources, Japanese factories mass produce goods of low cost and high quality. Participation in Japanese factories occurs in a more hierarchical framework than advocated in the Western model of worker democracy. (JOW)
NASA Astrophysics Data System (ADS)
Hsieh, Chang-Yu; Cao, Jianshu
2018-01-01
We extend a standard stochastic theory to study open quantum systems coupled to a generic quantum environment. We exemplify the general framework by studying a two-level quantum system coupled bilinearly to the three fundamental classes of non-interacting particles: bosons, fermions, and spins. In this unified stochastic approach, the generalized stochastic Liouville equation (SLE) formally captures the exact quantum dissipations when noise variables with appropriate statistics for different bath models are applied. Anharmonic effects of a non-Gaussian bath are precisely encoded in the bath multi-time correlation functions that noise variables have to satisfy. Starting from the SLE, we devise a family of generalized hierarchical equations by averaging out the noise variables and expand bath multi-time correlation functions in a complete basis of orthonormal functions. The general hierarchical equations constitute systems of linear equations that provide numerically exact simulations of quantum dynamics. For bosonic bath models, our general hierarchical equation of motion reduces exactly to an extended version of hierarchical equation of motion which allows efficient simulation for arbitrary spectral densities and temperature regimes. Similar efficiency and flexibility can be achieved for the fermionic bath models within our formalism. The spin bath models can be simulated with two complementary approaches in the present formalism. (I) They can be viewed as an example of non-Gaussian bath models and be directly handled with the general hierarchical equation approach given their multi-time correlation functions. (II) Alternatively, each bath spin can be first mapped onto a pair of fermions and be treated as fermionic environments within the present formalism.
Estimation and Application of Ecological Memory Functions in Time and Space
NASA Astrophysics Data System (ADS)
Itter, M.; Finley, A. O.; Dawson, A.
2017-12-01
A common goal in quantitative ecology is the estimation or prediction of ecological processes as a function of explanatory variables (or covariates). Frequently, the ecological process of interest and associated covariates vary in time, space, or both. Theory indicates many ecological processes exhibit memory to local, past conditions. Despite such theoretical understanding, few methods exist to integrate observations from the recent past or within a local neighborhood as drivers of these processes. We build upon recent methodological advances in ecology and spatial statistics to develop a Bayesian hierarchical framework to estimate so-called ecological memory functions; that is, weight-generating functions that specify the relative importance of local, past covariate observations to ecological processes. Memory functions are estimated using a set of basis functions in time and/or space, allowing for flexible ecological memory based on a reduced set of parameters. Ecological memory functions are entirely data driven under the Bayesian hierarchical framework—no a priori assumptions are made regarding functional forms. Memory function uncertainty follows directly from posterior distributions for model parameters allowing for tractable propagation of error to predictions of ecological processes. We apply the model framework to simulated spatio-temporal datasets generated using memory functions of varying complexity. The framework is also applied to estimate the ecological memory of annual boreal forest growth to local, past water availability. Consistent with ecological understanding of boreal forest growth dynamics, memory to past water availability peaks in the year previous to growth and slowly decays to zero in five to eight years. The Bayesian hierarchical framework has applicability to a broad range of ecosystems and processes allowing for increased understanding of ecosystem responses to local and past conditions and improved prediction of ecological processes.
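The basis-function construction of memory weights can be sketched as follows. The Gaussian basis and softmax normalization here are illustrative assumptions, not the paper's exact formulation; the weight function is controlled by a small coefficient vector, as in the abstract's reduced-parameter setup.

```python
import numpy as np

def memory_weights(coeffs, max_lag):
    """Build ecological-memory weights over lags 0..max_lag-1 from a small
    Gaussian basis; softmax normalization keeps the weights positive and
    summing to one (an illustrative choice of basis and link)."""
    lags = np.arange(max_lag)
    centers = np.linspace(0, max_lag - 1, len(coeffs))
    width = max_lag / len(coeffs)
    basis = np.exp(-0.5 * ((lags[:, None] - centers[None, :]) / width) ** 2)
    scores = basis @ np.asarray(coeffs)
    w = np.exp(scores - scores.max())
    return w / w.sum()

def memory_effect(covariate, weights):
    """Weighted sum of lagged covariate values (lag 0 = current year)."""
    max_lag = len(weights)
    return np.array([
        weights @ covariate[t - max_lag + 1:t + 1][::-1]
        for t in range(max_lag - 1, len(covariate))
    ])
```

A coefficient vector that favors recent lags produces weights that peak near lag zero and decay with lag, mirroring the boreal-forest memory pattern described above.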
NASA Astrophysics Data System (ADS)
Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean
2016-04-01
A framework is presented within which we provide rigorous estimates of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence, reliable estimation of model parameters and their uncertainties is possible while avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By combining new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.
Stability and structural properties of gene regulation networks with coregulation rules.
Warrell, Jonathan; Mhlanga, Musa
2017-05-07
Coregulation of the expression of groups of genes has been extensively demonstrated empirically in bacterial and eukaryotic systems. Such coregulation can arise through the use of shared regulatory motifs, which allow the coordinated expression of modules (and module groups) of functionally related genes across the genome. Coregulation can also arise through the physical association of multi-gene complexes through chromosomal looping, which are then transcribed together. We present a general formalism for modeling coregulation rules in the framework of Random Boolean Networks (RBN), and develop specific models for transcription factor networks with modular structure (including module groups, and multi-input modules (MIM) with autoregulation) and multi-gene complexes (including hierarchical differentiation between multi-gene complex members). We develop a mean-field approach to analyse the dynamical stability of large networks incorporating coregulation, and show that autoregulated MIM and hierarchical gene-complex models can achieve greater stability than networks without coregulation whose rules have matching activation frequency. We provide further analysis of the stability of small networks of both kinds through simulations. We also characterize several general properties of the transients and attractors in the hierarchical coregulation model, and show using simulations that the steady-state distribution factorizes hierarchically as a Bayesian network in a Markov Jump Process analogue of the RBN model. Copyright © 2017. Published by Elsevier Ltd.
Shankle, William R; Pooley, James P; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D
2013-01-01
Determining how cognition affects functional abilities is important in Alzheimer disease and related disorders. A total of 280 patients (normal or Alzheimer disease and related disorders) received a total of 1514 assessments using the functional assessment staging test (FAST) procedure and the MCI Screen. A hierarchical Bayesian cognitive processing model was created by embedding a signal detection theory model of the MCI Screen-delayed recognition memory task into a hierarchical Bayesian framework. The signal detection theory model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the 6 FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. Hierarchical Bayesian cognitive processing models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition into a continuous measure of functional severity for both individuals and FAST groups. Such a translation links 2 levels of brain information processing and may enable more accurate correlations with other levels, such as those characterized by biomarkers.
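The signal detection theory core of the model, discriminability and response bias estimated from recognition data, can be sketched with the standard equal-variance Gaussian formulas and a common 0.5 correction. This is a non-hierarchical sketch of the embedded SDT component, not the authors' full Bayesian model.

```python
from statistics import NormalDist

def dprime_and_bias(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian signal detection: discriminability d' and
    response bias c from a 2x2 recognition table; adding 0.5 to each cell
    keeps hit and false-alarm rates away from 0 and 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = z(hit_rate) - z(fa_rate)           # memory discriminability
    bias = -0.5 * (z(hit_rate) + z(fa_rate))     # response bias (criterion)
    return d_prime, bias
```

In the hierarchical version described above, these two latent parameters are estimated jointly for each patient and each FAST severity group rather than computed point-wise.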
Evaluating scaling models in biology using hierarchical Bayesian approaches
Price, Charles A; Ogle, Kiona; White, Ethan P; Weitz, Joshua S
2009-01-01
Theoretical models for allometric relationships between organismal form and function are typically tested by comparing a single predicted relationship with empirical data. Several prominent models, however, predict more than one allometric relationship, and comparisons among alternative models have not taken this into account. Here we evaluate several different scaling models of plant morphology within a hierarchical Bayesian framework that simultaneously fits multiple scaling relationships to three large allometric datasets. The scaling models include: inflexible universal models derived from biophysical assumptions (e.g. elastic similarity or fractal networks), a flexible variation of a fractal network model, and a highly flexible model constrained only by basic algebraic relationships. We demonstrate that variation in intraspecific allometric scaling exponents is inconsistent with the universal models, and that more flexible approaches that allow for biological variability at the species level outperform universal models, even when accounting for relative increases in model complexity. PMID:19453621
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development but also Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo
2004-06-01
In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
ERIC Educational Resources Information Center
Arens, A. Katrin; Morin, Alexandre J. S.
2017-01-01
This study illustrates an integrative psychometric framework to investigate two sources of construct-relevant multidimensionality in answers to the Self-Perception Profile for Children (SPPC). Using a sample of 2,353 German students attending Grades 3 to 6, we contrasted: (a) first-order versus hierarchical and bifactor models to investigate…
A Hierarchical and Contextual Model for Learning and Recognizing Highly Variant Visual Categories
2010-01-01
…neighboring pattern primitives, to create our model. We also present a minimax entropy framework for automatically learning which contextual constraints are…
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
Semantic-based surveillance video retrieval.
Hu, Weiming; Xie, Dan; Fu, Zhouyu; Zeng, Wenrong; Maybank, Steve
2007-04-01
Visual surveillance produces large amounts of video data. Effective indexing and retrieval from surveillance video databases are very important. Although there are many ways to represent the content of video clips in current video retrieval algorithms, there still exists a semantic gap between users and retrieval systems. Visual surveillance systems supply a platform for investigating semantic-based video retrieval. In this paper, a semantic-based video retrieval framework for visual surveillance is proposed. A cluster-based tracking algorithm is developed to acquire motion trajectories. The trajectories are then clustered hierarchically using the spatial and temporal information, to learn activity models. A hierarchical structure of semantic indexing and retrieval of object activities, where each individual activity automatically inherits all the semantic descriptions of the activity model to which it belongs, is proposed for accessing video clips and individual objects at the semantic level. The proposed retrieval framework supports various queries including queries by keywords, multiple object queries, and queries by sketch. For multiple object queries, succession and simultaneity restrictions, together with depth and breadth first orders, are considered. For sketch-based queries, a method for matching trajectories drawn by users to spatial trajectories is proposed. The effectiveness and efficiency of our framework are tested in a crowded traffic scene.
Tang, Yuye; Chen, Xi; Yoo, Jejoong; Yethiraj, Arun; Cui, Qiang
2010-01-01
A hierarchical simulation framework that integrates information from all-atom simulations into a finite element model at the continuum level is established to study the mechanical response of a mechanosensitive channel of large conductance (MscL) in the bacterium Escherichia coli (E. coli) embedded in a vesicle formed by the dipalmitoylphosphatidylcholine (DPPC) lipid bilayer. Sufficient structural details of the protein are built into the continuum model, with key parameters and material properties derived from molecular mechanics simulations. The multi-scale framework is used to analyze the gating of MscL when the lipid vesicle is subjected to nanoindentation and patch clamp experiments, and the detailed structural transitions of the protein are obtained explicitly as a function of external load; it is currently impossible to derive such information based solely on all-atom simulations. The gating pathways of E. coli MscL qualitatively agree with results from previous patch clamp experiments. The gating mechanisms under complex indentation-induced deformation are also predicted. This versatile hierarchical multi-scale framework may be further extended to study the mechanical behaviors of cells and biomolecules, as well as to guide and stimulate biomechanics experiments. PMID:21874098
Bonding Social Capital in Low-Income Neighborhoods
ERIC Educational Resources Information Center
Brisson, Daniel S.; Usher, Charles L.
2005-01-01
Social capital has recently become a guiding theoretical framework for family interventions in low-income neighborhoods. In the context of the Annie E. Casey Foundation's Making Connections initiative, this research uses hierarchical linear modeling to examine how neighborhood characteristics and resident participation affect bonding social…
Hierarchically organized behavior and its neural foundations: A reinforcement-learning perspective
Botvinick, Matthew M.; Niv, Yael; Barto, Andrew C.
2009-01-01
Research on human and animal behavior has long emphasized its hierarchical structure — the divisibility of ongoing behavior into discrete tasks, which are comprised of subtask sequences, which in turn are built of simple actions. The hierarchical structure of behavior has also been of enduring interest within neuroscience, where it has been widely considered to reflect prefrontal cortical functions. In this paper, we reexamine behavioral hierarchy and its neural substrates from the point of view of recent developments in computational reinforcement learning. Specifically, we consider a set of approaches known collectively as hierarchical reinforcement learning, which extend the reinforcement learning paradigm by allowing the learning agent to aggregate actions into reusable subroutines or skills. A close look at the components of hierarchical reinforcement learning suggests how they might map onto neural structures, in particular regions within the dorsolateral and orbital prefrontal cortex. It also suggests specific ways in which hierarchical reinforcement learning might provide a complement to existing psychological models of hierarchically structured behavior. A particularly important question that hierarchical reinforcement learning brings to the fore is that of how learning identifies new action routines that are likely to provide useful building blocks in solving a wide range of future problems. Here and at many other points, hierarchical reinforcement learning offers an appealing framework for investigating the computational and neural underpinnings of hierarchically structured behavior. PMID:18926527
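The core idea of treating a temporally extended subroutine as a single unit of learning can be sketched with the SMDP Q-learning backup used in the options formulation of hierarchical reinforcement learning. The function and its arguments are illustrative, not tied to any specific model in the abstract.

```python
def smdp_q_update(q, state, option, rewards, next_state, gamma=0.9, alpha=0.1):
    """One SMDP Q-learning backup for a temporally extended option:
    the option ran for len(rewards) primitive steps, so its discounted
    return plus the discounted value of the landing state back up into
    Q(state, option) in a single update."""
    ret = sum(gamma ** i * r for i, r in enumerate(rewards))
    target = ret + gamma ** len(rewards) * max(q[next_state].values())
    q[state][option] += alpha * (target - q[state][option])
    return q
```

A primitive action is the special case `len(rewards) == 1`, so options and actions share one update rule, which is what lets the agent aggregate actions into reusable skills.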
Delineating the Structure of Normal and Abnormal Personality: An Integrative Hierarchical Approach
Markon, Kristian E.; Krueger, Robert F.; Watson, David
2008-01-01
Increasing evidence indicates that normal and abnormal personality can be treated within a single structural framework. However, identification of a single integrated structure of normal and abnormal personality has remained elusive. Here, a constructive replication approach was used to delineate an integrative hierarchical account of the structure of normal and abnormal personality. This hierarchical structure, which integrates many Big Trait models proposed in the literature, replicated across a meta-analysis as well as an empirical study, and across samples of participants as well as measures. The proposed structure resembles previously suggested accounts of personality hierarchy and provides insight into the nature of personality hierarchy more generally. Potential directions for future research on personality and psychopathology are discussed. PMID:15631580
A general science-based framework for dynamical spatio-temporal models
Wikle, C.K.; Hooten, M.B.
2010-01-01
Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case.
We then develop a general nonlinear spatio-temporal framework that we call general quadratic nonlinearity and demonstrate that it accommodates many different classes of scientific-based parameterizations as special cases. The model is presented in a hierarchical Bayesian framework and is illustrated with examples from ecology and oceanography. © 2010 Sociedad de Estadística e Investigación Operativa.
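The linear first-order building block of such models can be sketched directly: a latent process stage whose transition matrix is motivated by a discretized PDE, plus a data stage with measurement error. The grid size, diffusion rate, and noise levels below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal linear dynamical spatio-temporal model (DSTM) of the kind the
# framework above generalizes: a latent process Y_t with a transition
# matrix M motivated by a finite-difference discretization of a 1-D
# diffusion PDE, observed with measurement error. All parameters are
# illustrative assumptions.

rng = np.random.default_rng(1)
n, T, delta = 20, 50, 0.2        # spatial sites, time steps, diffusion rate

# Process stage: Y_t = M Y_{t-1} + eta_t, with M from the discretized PDE
M = (1 - 2 * delta) * np.eye(n) + delta * (np.eye(n, k=1) + np.eye(n, k=-1))

Y = np.zeros((T, n))
Y[0, n // 2] = 10.0              # initial pulse at the central site
for t in range(1, T):
    Y[t] = M @ Y[t - 1] + rng.normal(0.0, 0.05, n)

# Data stage: Z_t = Y_t + eps_t (identity observation map, measurement error)
Z = Y + rng.normal(0.0, 0.1, (T, n))

print(Y[0].max(), Y[-1].max())   # the pulse spreads and damps over time
```

In the hierarchical Bayesian setting, `delta` and the noise variances would be given priors rather than fixed, which is exactly where the science-based parameterization enters.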
Performability modeling with continuous accomplishment sets
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1979-01-01
A general modeling framework that permits the definition, formulation, and evaluation of performability is described. It is shown that performability relates directly to system effectiveness, and is a proper generalization of both performance and reliability. A hierarchical modeling scheme is used to formulate the capability function used to evaluate performability. The case in which performance variables take values in a continuous accomplishment set is treated explicitly.
Developing a Framework for Communication Management Competencies
ERIC Educational Resources Information Center
Jeffrey, Lynn Maud; Brunton, Margaret Ann
2011-01-01
Using a hierarchical needs assessment model developed by Hunt, we identified the essential competencies of communication management practitioners for the purpose of curriculum development and selection. We found that the underlying values of the profession were embodied in two superordinate goals. Six major competencies were identified, which were…
Understanding movement data and movement processes: current and emerging directions.
Schick, Robert S; Loarie, Scott R; Colchero, Fernando; Best, Benjamin D; Boustany, Andre; Conde, Dalia A; Halpin, Patrick N; Joppa, Lucas N; McClellan, Catherine M; Clark, James S
2008-12-01
Animal movement has been the focus of much theoretical and empirical work in ecology over the last 25 years. By studying the causes and consequences of individual movement, ecologists have gained greater insight into the behavior of individuals and the spatial dynamics of populations at increasingly higher levels of organization. In particular, ecologists have focused on the interaction between individuals and their environment in an effort to understand future impacts from habitat loss and climate change. Tools to examine this interaction have included: fractal analysis, first passage time, Lévy flights, multi-behavioral analysis, hidden Markov models, and state-space models. Concurrent with the development of movement models has been an increase in the sophistication and availability of hierarchical Bayesian models. In this review we bring these two threads together by using hierarchical structures as a framework for reviewing individual models. We synthesize emerging themes in movement ecology, and propose a new hierarchical model for animal movement that builds on these emerging themes. This model moves away from traditional random walks, and instead focuses inference on how moving animals with complex behavior interact with their landscape and make choices about its suitability.
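The multi-behavioral models the review surveys can be illustrated with a simulated two-state movement process. The state-specific step and turning parameters below are invented for illustration; a real analysis would estimate them with a hidden Markov or state-space model rather than fix them.

```python
import math
import random

# Toy two-state ('forage'/'transit') movement process of the kind this
# review surveys: short, tortuous steps while foraging; long, directed
# steps while transiting. All parameters are illustrative inventions.

random.seed(7)
PARAMS = {'forage': (0.2, math.pi), 'transit': (1.5, 0.2)}  # (mean step, turn sd)
SWITCH = 0.1                        # per-step probability of changing behavior

x = y = heading = 0.0
state = 'forage'
track, states = [(x, y)], [state]
for _ in range(500):
    if random.random() < SWITCH:    # behavioral switching (a 2-state Markov chain)
        state = 'transit' if state == 'forage' else 'forage'
    mean_step_len, turn_sd = PARAMS[state]
    heading += random.gauss(0.0, turn_sd)            # correlated random walk turn
    step = random.expovariate(1.0 / mean_step_len)   # state-dependent step length
    x += step * math.cos(heading)
    y += step * math.sin(heading)
    track.append((x, y))
    states.append(state)

def mean_step(label):
    d = [math.dist(track[i], track[i + 1])
         for i in range(500) if states[i + 1] == label]
    return sum(d) / len(d)

print(mean_step('forage'), mean_step('transit'))   # transit steps are much longer
```

Fitting such a model in a hierarchical framework would place the inverse problem on top of this generative sketch: infer the hidden state sequence and parameters from an observed `track`.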
Model architecture of intelligent data mining oriented urban transportation information
NASA Astrophysics Data System (ADS)
Yang, Bogang; Tao, Yingchun; Sui, Jianbo; Zhang, Feizhou
2007-06-01
Aiming at solving practical problems in urban traffic, the paper presents model architecture of intelligent data mining from hierarchical view. With artificial intelligent technologies used in the framework, the intelligent data mining technology improves, which is more suitable for the change of real-time road condition. It also provides efficient technology support for the urban transport information distribution, transmission and display.
ERIC Educational Resources Information Center
Calabrese, William R.; Rudick, Monica M.; Simms, Leonard J.; Clark, Lee Anna
2012-01-01
Recently, integrative, hierarchical models of personality and personality disorder (PD)--such as the Big Three, Big Four, and Big Five trait models--have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality…
ERIC Educational Resources Information Center
Black, Ryan A.; Yang, Yanyun; Beitra, Danette; McCaffrey, Stacey
2015-01-01
Estimation of composite reliability within a hierarchical modeling framework has recently become of particular interest given the growing recognition that the underlying assumptions of coefficient alpha are often untenable. Unfortunately, coefficient alpha remains the prominent estimate of reliability when estimating total scores from a scale with…
Fully Bayesian Estimation of Data from Single Case Designs
ERIC Educational Resources Information Center
Rindskopf, David
2013-01-01
Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…
A Unified Theoretical Framework for Cognitive Sequencing.
Savalia, Tejas; Shukla, Anuj; Bapi, Raju S
2016-01-01
The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops (basal ganglia-frontal cortex and hippocampus-frontal cortex) mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks. PMID:27917146
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsai, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.
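The compilation target, a hierarchical state machine, and the kind of safety property checked over it can be sketched compactly. The machine and the property below are invented for illustration, not drawn from the toolbox itself.

```python
from collections import deque

# A minimal hierarchical state machine of the kind Stateflow models are
# compiled into, with a brute-force reachability check of a safety
# property. The machine and property are illustrative inventions.

# States are (superstate, substate) pairs; 'Off' has no substates.
TRANSITIONS = {
    (('Off', None), 'power_on'): ('On', 'Idle'),
    (('On', 'Idle'), 'start'): ('On', 'Running'),
    (('On', 'Running'), 'pause'): ('On', 'Idle'),
    (('On', 'Idle'), 'power_off'): ('Off', None),
    (('On', 'Running'), 'power_off'): ('Off', None),
}

def reachable(initial):
    """Breadth-first enumeration of all states reachable from `initial`."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for (src, _event), dst in TRANSITIONS.items():
            if src == s and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

states = reachable(('Off', None))
# Safety property: 'Running' only ever occurs inside the 'On' superstate.
assert all(sup == 'On' for sup, sub in states if sub == 'Running')
print(sorted(states, key=str))
```

A logic-based engine like the one the paper describes would discharge such properties symbolically rather than by enumeration, but the hierarchical encoding of states is the same idea.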
Balzer, Christian; Waag, Anna M; Gehret, Stefan; Reichenauer, Gudrun; Putz, Florian; Hüsing, Nicola; Paris, Oskar; Bernstein, Noam; Gor, Gennady Y; Neimark, Alexander V
2017-06-06
The goal of this work is to understand adsorption-induced deformation of hierarchically structured porous silica exhibiting well-defined cylindrical mesopores. For this purpose, we performed an in situ dilatometry measurement on a calcined and sintered monolithic silica sample during the adsorption of N2 at 77 K. To analyze the experimental data, we extended the adsorption stress model to account for the anisotropy of cylindrical mesopores, i.e., we explicitly derived the adsorption stress tensor components in the axial and radial direction of the pore. For quantitative predictions of stresses and strains, we applied the theoretical framework of Derjaguin, Broekhoff, and de Boer for adsorption in mesopores and two mechanical models of silica rods with axially aligned pore channels: an idealized cylindrical tube model, which can be described analytically, and an ordered hexagonal array of cylindrical mesopores, whose mechanical response to adsorption stress was evaluated by 3D finite element calculations. The adsorption-induced strains predicted by both mechanical models are in good quantitative agreement making the cylindrical tube the preferable model for adsorption-induced strains due to its simple analytical nature. The theoretical results are compared with the in situ dilatometry data on a hierarchically structured silica monolith composed by a network of mesoporous struts of MCM-41 type morphology. Analyzing the experimental adsorption and strain data with the proposed theoretical framework, we find the adsorption-induced deformation of the monolithic sample being reasonably described by a superposition of axial and radial strains calculated on the mesopore level. The structural and mechanical parameters obtained from the model are in good agreement with expectations from independent measurements and literature, respectively. PMID:28547995
Temperament, Personality and Achievement Goals among Chinese Adolescent Students
ERIC Educational Resources Information Center
Chen, Chen; Zhang, Li-Fang
2011-01-01
Temperament and personality have been presumed to affect achievement goals based on the hierarchical model of achievement motivation. This research investigated the relationships of temperament dimensions and the Big Five personality traits to achievement goals based on the 2 x 2 achievement goal framework among 775 Chinese adolescent students.…
University Student Satisfaction: An Empirical Analysis
ERIC Educational Resources Information Center
Clemes, Michael D.; Gan, Christopher E. C.; Kao, Tzu-Hui
2008-01-01
The purpose of this research is to gain an empirical understanding of students' overall satisfaction with their academic university experiences. A hierarchical model is used as a framework for this analysis. Fifteen hypotheses are formulated and tested, in order to identify the dimensions of service quality as perceived by university students, to…
A HIERARCHICAL MODELING FRAMEWORK FOR GEOLOGICAL STORAGE OF CARBON DIOXIDE
Carbon Capture and Storage, or CCS, is likely to be an important technology in a carbon-constrained world. CCS will involve subsurface injection of massive amounts of captured CO2, on a scale that has not previously been approached. The unprecedented scale of t...
Using Multilevel Modeling to Examine the Effects of Multitiered Interventions
ERIC Educational Resources Information Center
Clements, Melissa A.; Bolt, Daniel; Hoyt, William; Kratochwill, Thomas R.
2007-01-01
Data collected in school settings are inherently hierarchical. At the same time, it is becoming increasingly common for interventions to be implemented within the context of a similar multitiered intervention framework where different interventions are provided at different levels of the hierarchy, often simultaneously. This prevalence of…
Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach
ERIC Educational Resources Information Center
Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…
Anatomical Entity Recognition with a Hierarchical Framework Augmented by External Resources
Xu, Yan; Hua, Ji; Ni, Zhaoheng; Chen, Qinlang; Fan, Yubo; Ananiadou, Sophia; Chang, Eric I-Chao; Tsujii, Junichi
2014-01-01
References to anatomical entities in medical records consist not only of explicit references to anatomical locations, but also other diverse types of expressions, such as specific diseases, clinical tests, clinical treatments, which constitute implicit references to anatomical entities. In order to identify these implicit anatomical entities, we propose a hierarchical framework, in which two layers of named entity recognizers (NERs) work in a cooperative manner. Each of the NERs is implemented using the Conditional Random Fields (CRF) model, which uses a range of external resources to generate features. We constructed a dictionary of anatomical entity expressions by exploiting four existing resources, i.e., UMLS, MeSH, RadLex and BodyPart3D, and supplemented information from two external knowledge bases, i.e., Wikipedia and WordNet, to improve inference of anatomical entities from implicit expressions. Experiments conducted on 300 discharge summaries showed a micro-averaged performance of 0.8509 Precision, 0.7796 Recall and 0.8137 F1 for explicit anatomical entity recognition, and 0.8695 Precision, 0.6893 Recall and 0.7690 F1 for implicit anatomical entity recognition. The use of the hierarchical framework, which combines the recognition of named entities of various types (diseases, clinical tests, treatments) with information embedded in external knowledge bases, resulted in a 5.08% increase in F1. The resources constructed for this research will be made publicly available. PMID:25343498
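The two-layer idea can be illustrated with a toy pipeline: a first layer tags clinical entities (here by dictionary lookup, standing in for the CRFs the paper trains), and a second layer infers the implicit anatomical entity behind each one via a small knowledge map. All dictionaries below are illustrative stand-ins for UMLS/MeSH-scale resources.

```python
# Toy two-layer anatomical entity recognizer mirroring the hierarchical
# framework above. Dictionary lookup stands in for the paper's CRF layers,
# and the tiny maps stand in for UMLS/MeSH/RadLex/BodyPart3D resources.

CLINICAL = {'pneumonia': 'disease', 'echocardiogram': 'test',
            'appendectomy': 'treatment'}
ANATOMY_OF = {'pneumonia': 'lung', 'echocardiogram': 'heart',
              'appendectomy': 'appendix'}
EXPLICIT = {'lung', 'heart', 'appendix', 'liver'}

def recognize(text):
    """Return (surface form, anatomical entity, explicit/implicit) triples."""
    entities = []
    for token in text.lower().replace('.', '').split():
        if token in EXPLICIT:            # layer 1: explicit anatomical mention
            entities.append((token, token, 'explicit'))
        elif token in CLINICAL:          # layer 1: clinical entity mention
            # layer 2: resolve the implicit anatomical entity behind it
            entities.append((token, ANATOMY_OF[token], 'implicit'))
    return entities

note = "Echocardiogram normal. Treated for pneumonia."
print(recognize(note))
# [('echocardiogram', 'heart', 'implicit'), ('pneumonia', 'lung', 'implicit')]
```

The paper's contribution is doing both layers statistically with CRFs and large external resources; the cooperation between layers is the part this sketch preserves.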
ERIC Educational Resources Information Center
Savage, Robert; Burgos, Giovani; Wood, Eileen; Piquette, Noella
2015-01-01
The Simple View of Reading (SVR) describes Reading Comprehension as the product of distinct child-level variance in decoding (D) and linguistic comprehension (LC) component abilities. When used as a model for educational policy, distinct classroom-level influences of each of the components of the SVR model have been assumed, but have not yet been…
NASA Astrophysics Data System (ADS)
Yin, Ping; Mu, Lan; Madden, Marguerite; Vena, John E.
2014-10-01
Lung cancer is the second most commonly diagnosed cancer in both men and women in Georgia, USA. However, the spatio-temporal patterns of lung cancer risk in Georgia have not been fully studied. Hierarchical Bayesian models are used here to explore the spatio-temporal patterns of lung cancer incidence risk by race and gender in Georgia for the period of 2000-2007. With the census tract level as the spatial scale and the 2-year period aggregation as the temporal scale, we compare a total of seven Bayesian spatio-temporal models including two under a separate modeling framework and five under a joint modeling framework. One joint model outperforms others based on the deviance information criterion. Results show that the northwest region of Georgia has consistently high lung cancer incidence risk for all population groups during the study period. In addition, there are inverse relationships between the socioeconomic status and the lung cancer incidence risk among all Georgian population groups, and the relationships in males are stronger than those in females. By mapping more reliable variations in lung cancer incidence risk at a relatively fine spatio-temporal scale for different Georgian population groups, our study aims to better support healthcare performance assessment, etiological hypothesis generation, and health policy making.
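The shrinkage behavior that motivates hierarchical disease mapping of this kind can be shown with a minimal empirical-Bayes sketch: tract-level counts follow a Poisson model with a conjugate Gamma prior on relative risks, so posterior means pull unstable small-tract rates toward the reference level. All counts, expected cases, and prior values below are invented for illustration.

```python
# Minimal empirical-Bayes sketch of small-area disease mapping in the
# spirit of the hierarchical Bayesian models above: y_i ~ Poisson(E_i * r_i)
# with a conjugate Gamma(a, b) prior on relative risks r_i. All numbers
# are invented for illustration.

y = [0, 2, 1, 8, 3]              # observed cases per census tract
E = [0.5, 1.8, 1.2, 6.0, 2.9]    # expected cases given population and age mix
a, b = 2.0, 2.0                  # Gamma prior with mean a/b = 1 (reference risk)

raw = [yi / Ei for yi, Ei in zip(y, E)]                    # raw incidence ratios
post_mean = [(a + yi) / (b + Ei) for yi, Ei in zip(y, E)]  # Gamma posterior means

for t, (r, p) in enumerate(zip(raw, post_mean)):
    print(f"tract {t}: raw ratio {r:.2f} -> shrunk {p:.2f}")
```

The full models compared in the paper add spatial and temporal random effects (and a joint structure across population groups) on top of this conjugate core, with model choice by DIC.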
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.
2011-03-01
To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendents representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.
NASA Astrophysics Data System (ADS)
Li, Bin
Spatial control behaviors account for a large proportion of human everyday activities, from normal daily tasks, such as reaching for objects, to specialized tasks, such as driving, surgery, or operating equipment. These behaviors involve intensive interactions among internal processes (i.e., cognitive, perceptual, and motor control) and with the physical world. This dissertation builds on a concept of interaction pattern and a hierarchical functional model. An interaction pattern represents a type of behavioral synergy by which humans coordinate cognitive, perceptual, and motor control processes. It contributes to the construction of the hierarchical functional model, which delineates human spatial control behaviors as the coordination of three functional subsystems: planning, guidance, and tracking/pursuit. This dissertation formalizes and validates these two theories and extends them for the investigation of human spatial control skills, encompassing development and assessment. Specifically, this dissertation first presents an overview of studies in human spatial control skills, encompassing definition, characteristics, development, and assessment, to provide theoretical evidence for the concept of interaction pattern and the hierarchical functional model. Next, the human experiments for collecting motion and gaze data, and the techniques used to register and classify gaze data, are described. This dissertation then elaborates and mathematically formalizes the hierarchical functional model and the concept of interaction pattern. These theories then enable the construction of a succinct simulation model that can reproduce a variety of human performance with a minimal set of hypotheses. This validates the hierarchical functional model as a normative framework for interpreting human spatial control behaviors. The dissertation then investigates human skill development and captures the emergence of interaction patterns.
The final part of the dissertation applies the hierarchical functional model to skill assessment and introduces techniques to capture interaction patterns both from the top down, using their geometric features, and from the bottom up, using their dynamical characteristics. The validity and generality of the skill assessment are illustrated using two training experiments: remote-control flight and laparoscopic surgical training.
A Hierarchical Model for Simultaneous Detection and Estimation in Multi-subject fMRI Studies
Degras, David; Lindquist, Martin A.
2014-01-01
In this paper we introduce a new hierarchical model for the simultaneous detection of brain activation and estimation of the shape of the hemodynamic response in multi-subject fMRI studies. The proposed approach circumvents a major stumbling block in standard multi-subject fMRI data analysis, in that it both allows the shape of the hemodynamic response function to vary across regions and subjects, while still providing a straightforward way to estimate population-level activation. An efficient estimation algorithm is presented, as is an inferential framework that not only allows for tests of activation, but also for tests for deviations from some canonical shape. The model is validated through simulations and application to a multi-subject fMRI study of thermal pain. PMID:24793829
Marwan, Wolfgang; Sujatha, Arumugam; Starostzik, Christine
2005-10-21
We reconstruct the regulatory network controlling commitment and sporulation of Physarum polycephalum from experimental results using a hierarchical Petri Net-based modelling and simulation framework. The stochastic Petri Net consistently describes the structure and simulates the dynamics of the molecular network as analysed by genetic, biochemical and physiological experiments within a single coherent model. The Petri Net then is extended to simulate time-resolved somatic complementation experiments performed by mixing the cytoplasms of mutants altered in the sporulation response, to systematically explore the network structure and to probe its dynamics. This reverse engineering approach presumably can be employed to explore other molecular or genetic signalling systems where the activity of genes or their products can be experimentally controlled in a time-resolved manner.
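A stochastic Petri net of the kind used here can be simulated with a Gillespie-style algorithm: places hold token counts, and each transition fires at an exponential rate proportional to its available input tokens. The tiny Signal -> Active -> Committed net and its rate constants below are illustrative, not the published P. polycephalum network.

```python
import random

# Minimal Gillespie-style simulator for a stochastic Petri net, in the
# spirit of the modelling framework above. The three-place signalling
# net and rate constants are illustrative inventions.

random.seed(3)
marking = {'Signal': 20, 'Active': 0, 'Committed': 0}
TRANSITIONS = [                       # (input places, output places, rate const)
    ({'Signal': 1}, {'Active': 1}, 0.5),
    ({'Active': 1}, {'Committed': 1}, 0.2),
]

t = 0.0
while t < 100.0:
    # propensity = rate constant * product of input token counts (mass action)
    props = []
    for ins, _outs, c in TRANSITIONS:
        a = c
        for place in ins:
            a *= marking[place]
        props.append(a)
    total = sum(props)
    if total == 0:
        break                         # dead marking: no transition is enabled
    t += random.expovariate(total)    # exponential waiting time to next firing
    r, pick = random.uniform(0, total), 0
    while r > props[pick]:
        r -= props[pick]
        pick += 1
    ins, outs, _ = TRANSITIONS[pick]  # fire: consume inputs, produce outputs
    for place, n in ins.items():
        marking[place] -= n
    for place, n in outs.items():
        marking[place] += n

print(marking)   # by t=100 most tokens should have reached 'Committed'
```

The somatic complementation experiments in the paper correspond to merging the markings of two such nets mid-run and observing the combined dynamics.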
A Hierarchical Framework for Demand-Side Frequency Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moya, Christian; Zhang, Wei; Lian, Jianming
2014-06-02
With large-scale plans to integrate renewable generation, more resources will be needed to compensate for the uncertainty associated with intermittent generation resources. Under such conditions, performing frequency control using only supply-side resources becomes not only prohibitively expensive but also technically difficult. It is therefore important to explore how a sufficient proportion of the loads could assume a routine role in frequency control to maintain the stability of the system at an acceptable cost. In this paper, a novel hierarchical decentralized framework for frequency-based load control is proposed. The framework involves two decision layers. The top decision layer determines the optimal droop gain required from the aggregated load response on each bus using a robust decentralized control approach. The second layer consists of a large number of devices, which switch probabilistically during contingencies so that the aggregated power change matches the desired droop amount according to the updated gains. The proposed framework is based on the classical nonlinear multi-machine power system model, and can deal with time-varying system operating conditions while respecting the physical constraints of individual devices. Realistic simulation results based on a 68-bus system are provided to demonstrate the effectiveness of the proposed strategy.
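The device layer can be sketched directly: given a target aggregate power reduction derived from a droop gain and a frequency deviation, each load switches off independently with a probability chosen so the expected aggregate response matches the target. The device count, ratings, and gain below are illustrative assumptions, not values from the paper.

```python
import random

# Sketch of the probabilistic device layer described above: each load
# switches off independently so that the expected aggregate shed matches
# the droop target. Device count, ratings, and the droop gain are
# illustrative assumptions.

random.seed(42)
ratings = [random.uniform(1.0, 3.0) for _ in range(5000)]  # kW, all devices on

droop_gain = 2000.0   # kW of load shed per Hz of frequency deviation
freq_dev = -0.2       # Hz (under-frequency contingency)
target = droop_gain * abs(freq_dev)        # desired aggregate shed (kW)

total = sum(ratings)
p_off = min(1.0, target / total)           # per-device switch-off probability
shed = sum(r for r in ratings if random.random() < p_off)

print(f"target {target:.0f} kW, shed {shed:.0f} kW")
```

With thousands of devices the realized shed concentrates tightly around the target by the law of large numbers, which is what makes the decentralized, communication-light scheme viable.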
A Hierarchical Security Architecture for Cyber-Physical Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quanyan Zhu; Tamer Basar
2011-08-01
Security of control systems is becoming a pivotal concern in critical national infrastructures such as the power grid and nuclear plants. In this paper, we adopt a hierarchical viewpoint to these security issues, addressing security concerns at each level and emphasizing a holistic cross-layer philosophy for developing security solutions. We propose a bottom-up framework that establishes a model from the physical and control levels to the supervisory level, incorporating concerns from network and communication levels. We show that the game-theoretical approach can yield cross-layer security strategy solutions to the cyber-physical systems.
CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.
2011-11-15
We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain specific information. Designed based on an inspirational metaphor of the pattern recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes that are driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation and energy consumption analysis.
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1987-01-01
A new approach to building geographic data models that is based on the fundamental characteristics of the data is presented. An overall theoretical framework for representing geographic data is proposed. An example of utilizing this framework in a Geographic Information System (GIS) context by combining artificial intelligence techniques with recent developments in spatial data processing techniques is given. Elements of data representation discussed include hierarchical structure, separation of locational and conceptual views, and the ability to store knowledge at variable levels of completeness and precision.
A management-oriented classification of pinyon-juniper woodlands of the Great Basin
Neil E. West; Robin J. Tausch; Paul T. Tueller
1998-01-01
A hierarchical framework for the classification of Great Basin pinyon-juniper woodlands was based on a systematic sample of 426 stands from a random selection of 66 of the 110 mountain ranges in the region. That is, mountain ranges were randomly selected, but stands were systematically located on mountain ranges. The National Hierarchical Framework of Ecological Units...
NASA Astrophysics Data System (ADS)
Yang, Xiaoli; Wu, Suilan; Wang, Panhao; Yang, Lin
2018-02-01
The synthesis of well-ordered hierarchical metal-organic frameworks (MOFs) in an efficient manner is a great challenge. Here, a 3D regular ordered meso-/macroporous MOF of Cu-TATAB (referred to as MM-MOF) was synthesized through a facile template-free self-assembly process with pore sizes of 31 nm and 119 nm.
Hierarchical modeling of cluster size in wildlife surveys
Royle, J. Andrew
2008-01-01
Clusters or groups of individuals are the fundamental unit of observation in many wildlife sampling problems, including aerial surveys of waterfowl, marine mammals, and ungulates. Explicit accounting of cluster size in models for estimating abundance is necessary because detection of individuals within clusters is not independent and detectability of clusters is likely to increase with cluster size. This induces a cluster size bias in which the average cluster size in the sample is larger than in the population at large. Thus, failure to account for the relationship between detectability and cluster size will tend to yield a positive bias in estimates of abundance or density. I describe a hierarchical modeling framework for accounting for cluster-size bias in animal sampling. The hierarchical model consists of models for the observation process conditional on the cluster size distribution and the cluster size distribution conditional on the total number of clusters. Optionally, a spatial model can be specified that describes variation in the total number of clusters per sample unit. Parameter estimation, model selection, and criticism may be carried out using conventional likelihood-based methods. An extension of the model is described for the situation where measurable covariates at the level of the sample unit are available. Several candidate models within the proposed class are evaluated for aerial survey data on mallard ducks (Anas platyrhynchos).
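The size-bias mechanism is easy to demonstrate by simulation. The sketch below assumes a hypothetical detection model, p(detect | size s) = 1 - 0.5**s, and a simple skewed cluster-size distribution; neither is taken from Royle's model, they merely make the bias visible.

```python
import random

def simulate_cluster_size_bias(n_clusters=20000, mean_size=3.0, seed=1):
    """Detection probability grows with cluster size, so the mean size of
    *detected* clusters exceeds the true population mean (size bias)."""
    rng = random.Random(seed)
    # skewed cluster sizes >= 1 (illustrative, not the paper's distribution)
    sizes = [1 + int(rng.expovariate(1.0 / (mean_size - 1))) for _ in range(n_clusters)]
    # hypothetical detection model: p(detect | size s) = 1 - 0.5**s
    detected = [s for s in sizes if rng.random() < 1 - 0.5 ** s]
    pop_mean = sum(sizes) / len(sizes)
    sample_mean = sum(detected) / len(detected)
    return pop_mean, sample_mean

pop, samp = simulate_cluster_size_bias()
print(pop < samp)  # True: detected clusters are larger on average than the population
```

Ignoring this gap and using the sample mean cluster size directly would inflate abundance estimates, which is the bias the hierarchical model corrects.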
A multiscale, hierarchical model of pulse dynamics in arid-land ecosystems
Collins, Scott L.; Belnap, Jayne; Grimm, N. B.; Rudgers, J. A.; Dahm, Clifford N.; D'Odorico, P.; Litvak, M.; Natvig, D. O.; Peters, Douglas C.; Pockman, W. T.; Sinsabaugh, R. L.; Wolf, B. O.
2014-01-01
Ecological processes in arid lands are often described by the pulse-reserve paradigm, in which rain events drive biological activity until moisture is depleted, leaving a reserve. This paradigm is frequently applied to processes stimulated by one or a few precipitation events within a growing season. Here we expand the original framework in time and space and include other pulses that interact with rainfall. This new hierarchical pulse-dynamics framework integrates space and time through pulse-driven exchanges, interactions, transitions, and transfers that occur across individual to multiple pulses extending from micro to watershed scales. Climate change will likely alter the size, frequency, and intensity of precipitation pulses in the future, and arid-land ecosystems are known to be highly sensitive to climate variability. Thus, a more comprehensive understanding of arid-land pulse dynamics is needed to determine how these ecosystems will respond to, and be shaped by, increased climate variability.
Grieve, Richard; Nixon, Richard; Thompson, Simon G
2010-01-01
Cost-effectiveness analyses (CEA) may be undertaken alongside cluster randomized trials (CRTs) where randomization is at the level of the cluster (for example, the hospital or primary care provider) rather than the individual. Costs (and outcomes) within clusters may be correlated so that the assumption made by standard bivariate regression models, that observations are independent, is incorrect. This study develops a flexible modeling framework to acknowledge the clustering in CEA that use CRTs. The authors extend previous Bayesian bivariate models for CEA of multicenter trials to recognize the specific form of clustering in CRTs. They develop new Bayesian hierarchical models (BHMs) that allow mean costs and outcomes, and also variances, to differ across clusters. They illustrate how each model can be applied using data from a large (1732 cases, 70 primary care providers) CRT evaluating alternative interventions for reducing postnatal depression. The analyses compare cost-effectiveness estimates from BHMs with standard bivariate regression models that ignore the data hierarchy. The BHMs show high levels of cost heterogeneity across clusters (intracluster correlation coefficient, 0.17). Compared with standard regression models, the BHMs yield substantially increased uncertainty surrounding the cost-effectiveness estimates, and altered point estimates. The authors conclude that ignoring clustering can lead to incorrect inferences. The BHMs that they present offer a flexible modeling framework that can be applied more generally to CEA that use CRTs.
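As a small illustration of why clustering matters here, the one-way ANOVA estimator of the intracluster correlation coefficient (ICC) can be computed from simulated clustered costs. The provider count, cost scale, and variance components below are invented for the sketch (they are not the trial's data), and a balanced design is assumed.

```python
import random

def icc(clusters):
    """One-way ANOVA estimate of the intracluster correlation:
    sigma_between^2 / (sigma_between^2 + sigma_within^2), balanced design."""
    k = len(clusters)
    n = len(clusters[0])
    grand = sum(sum(c) for c in clusters) / (k * n)
    means = [sum(c) / n for c in clusters]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((x - m) ** 2 for c, m in zip(clusters, means) for x in c) / (k * (n - 1))
    sb2 = max(0.0, (msb - msw) / n)
    return sb2 / (sb2 + msw)

rng = random.Random(0)
# hypothetical costs: 70 providers, 25 patients each, cluster effect sd 450, noise sd 1000
data = [[1000 + u + rng.gauss(0, 1000) for _ in range(25)]
        for u in (rng.gauss(0, 450) for _ in range(70))]
print(icc(data) > 0.05)  # True: costs are correlated within providers
```

A non-trivial ICC like this is exactly the situation in which standard bivariate regression understates uncertainty in the cost-effectiveness estimates.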
Cao, Yu; Wu, Zhuofu; Wang, Tao; Xiao, Yu; Huo, Qisheng; Liu, Yunling
2016-04-28
Bacillus subtilis lipase (BSL2) has been successfully immobilized into a Cu-BTC based hierarchically porous metal-organic framework material for the first time. The Cu-BTC hierarchically porous MOF material with large mesopore apertures is prepared conveniently by using a template-free strategy under mild conditions. The immobilized BSL2 presents high enzymatic activity and perfect reusability during the esterification reaction. After 10 cycles, the immobilized BSL2 still exhibits 90.7% of its initial enzymatic activity and 99.6% of its initial conversion.
Li, Ben; Li, Yunxiao; Qin, Zhaohui S
2017-06-01
Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, giving rise to the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
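The over-correction the authors describe can be seen in the simplest normal-normal hierarchical model, where every feature's estimate is pulled toward the shared mean by the same factor, so a genuinely large effect is shrunk just like noise. This is a textbook illustration of the phenomenon, not the adaptiveHM method:

```python
def shrink(estimates, prior_mean, prior_var, obs_var):
    """Normal-normal hierarchical posterior mean: each feature is pulled
    toward the shared prior mean by factor B = obs_var / (obs_var + prior_var)."""
    b = obs_var / (obs_var + prior_var)
    return [prior_mean + (1 - b) * (x - prior_mean) for x in estimates]

# hypothetical features: most effects near 0, one genuinely large effect at 10
observed = [0.2, -0.1, 0.3, 0.0, 10.0]
post = shrink(observed, prior_mean=0.0, prior_var=1.0, obs_var=1.0)
print(post[-1])  # 5.0: the real outlier is halved, i.e. over-corrected
```

A strategy such as the one proposed here would instead use historical data to recognize that the last feature behaves differently and shrink it less aggressively.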
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.
2016-08-10
This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for a decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
Geomechanical Modeling of Gas Hydrate Bearing Sediments
NASA Astrophysics Data System (ADS)
Sanchez, M. J.; Gai, X., Sr.
2015-12-01
This contribution focuses on an advanced geomechanical model for methane hydrate-bearing soils based on concepts of elasto-plasticity for strain hardening/softening soils that incorporates bonding and damage effects. The core of the proposed model includes: a hierarchical single surface critical state framework, sub-loading concepts for modeling the plastic strains generally observed inside the yield surface, and a hydrate enhancement factor to account for the cementing effects provided by the presence of hydrates in sediments. The proposed framework has been validated against recently published experiments involving both synthetic and natural hydrate soils, as well as different sediment types (i.e., different hydrate saturations and hydrate morphologies) and confinement conditions. The performance of the model in these different case studies was very satisfactory.
Forbes, Miriam K; Kotov, Roman; Ruggero, Camilo J; Watson, David; Zimmerman, Mark; Krueger, Robert F
2017-11-01
A large body of research has focused on identifying the optimal number of dimensions - or spectra - to model individual differences in psychopathology. Recently, it has become increasingly clear that ostensibly competing models with varying numbers of spectra can be synthesized in empirically derived hierarchical structures. We examined the convergence between top-down (bass-ackwards or sequential principal components analysis) and bottom-up (hierarchical agglomerative cluster analysis) statistical methods for elucidating hierarchies to explicate the joint hierarchical structure of clinical and personality disorders. Analyses examined 24 clinical and personality disorders based on semi-structured clinical interviews in an outpatient psychiatric sample (n=2900). The two methods of hierarchical analysis converged on a three-tier joint hierarchy of psychopathology. At the lowest tier, there were seven spectra - disinhibition, antagonism, core thought disorder, detachment, core internalizing, somatoform, and compulsivity - that emerged in both methods. These spectra were nested under the same three higher-order superspectra in both methods: externalizing, broad thought dysfunction, and broad internalizing. In turn, these three superspectra were nested under a single general psychopathology spectrum, which represented the top tier of the hierarchical structure. The hierarchical structure mirrors and extends upon past research, with the inclusion of a novel compulsivity spectrum, and the finding that psychopathology is organized in three superordinate domains. This hierarchy can thus be used as a flexible and integrative framework to facilitate psychopathology research with varying levels of specificity (i.e., focusing on the optimal level of detailed information, rather than the optimal number of factors). Copyright © 2017 Elsevier Inc. All rights reserved.
Hierarchical Aligned Cluster Analysis for Temporal Clustering of Human Motion.
Zhou, Feng; De la Torre, Fernando; Hodgins, Jessica K
2013-03-01
Temporal segmentation of human motion into plausible motion primitives is central to understanding and building computational models of human motion. Several issues contribute to the challenge of discovering motion primitives: the exponential nature of all possible movement combinations, the variability in the temporal scale of human actions, and the complexity of representing articulated motion. We pose the problem of learning motion primitives as one of temporal clustering, and derive an unsupervised hierarchical bottom-up framework called hierarchical aligned cluster analysis (HACA). HACA finds a partition of a given multidimensional time series into m disjoint segments such that each segment belongs to one of k clusters. HACA combines kernel k-means with the generalized dynamic time alignment kernel to cluster time series data. Moreover, it provides a natural framework to find a low-dimensional embedding for time series. HACA is efficiently optimized with a coordinate descent strategy and dynamic programming. Experimental results on motion capture and video data demonstrate the effectiveness of HACA for segmenting complex motions and as a visualization tool. We also compare the performance of HACA to state-of-the-art algorithms for temporal clustering on data of a honey bee dance. The HACA code is available online.
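HACA's alignment step builds on the dynamic time alignment kernel (DTAK); the closely related classic dynamic time warping recurrence conveys the core idea in a few lines. The sketch below is a simplified stand-in for illustration, not the kernelized distance HACA actually optimizes:

```python
def dtw(a, b):
    """Classic dynamic-programming alignment cost between two 1-D series;
    HACA uses a kernelized variant (DTAK) inside kernel k-means."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the alignment by insertion, deletion, or match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0: same shape, different timing
```

Because the alignment absorbs timing differences, two executions of the same motion primitive at different speeds can land in the same cluster.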
Predictors of College Readiness: An Analysis of the Student Readiness Inventory
ERIC Educational Resources Information Center
Wilson, James K., III
2012-01-01
The purpose of this study was to better predict how a first semester college freshman becomes prepared for college. The theoretical framework guiding this study is Vroom's expectancy theory, in which motivation plays a key role in success. This study used a hierarchical multiple regression model. The independent variables of interest included high school…
A Hierarchical Approach to Examine Personal and School Effect on Teacher Motivation
ERIC Educational Resources Information Center
Wei, Yi-En
2012-01-01
In order to depict a better picture of teacher motivation, the researcher developed the theoretical framework based on Deci and Ryan's (1985) self-determination theory (SDT) and examined factors affecting teachers' autonomous motivation at both the personal and school level. Several multilevel structural equation models (ML-SEM) were…
Examining Heterogeneity in Residual Variance to Detect Differential Response to Treatments
ERIC Educational Resources Information Center
Kim, Jinok; Seltzer, Michael
2011-01-01
Individual differences in response to treatments have been a long-standing interest in education, psychology, and related fields. This article presents a conceptual framework and hierarchical modeling strategies that may help identify the subgroups for whom, or the conditions under which, a particular treatment is associated with better outcomes.…
ERIC Educational Resources Information Center
Botvinick, Matthew; Plaut, David C.
2004-01-01
In everyday tasks, selecting actions in the proper sequence requires a continuously updated representation of temporal context. Previous models have addressed this problem by positing a hierarchy of processing units, mirroring the roughly hierarchical structure of naturalistic tasks themselves. The present study considers an alternative framework,…
Predictors of Service Utilization among Youth Diagnosed with Mood Disorders
ERIC Educational Resources Information Center
Mendenhall, Amy N.
2012-01-01
In this study, I investigated patterns and predictors of service utilization for children with mood disorders. The Behavioral Model for Health Care Utilization was used as an organizing framework for identifying predictors of the number and quality of services utilized. Hierarchical regression was used in secondary data analyses of the…
Variability, Negative Evidence, and the Acquisition of Verb Argument Constructions
ERIC Educational Resources Information Center
Perfors, Amy; Tenenbaum, Joshua B.; Wonnacott, Elizabeth
2010-01-01
We present a hierarchical Bayesian framework for modeling the acquisition of verb argument constructions. It embodies a domain-general approach to learning higher-level knowledge in the form of inductive constraints (or overhypotheses), and has been used to explain other aspects of language development such as the shape bias in learning object…
Complex optimization for big computational and experimental neutron datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bao, Feng; Oak Ridge National Lab.; Archibald, Richard; ...
2016-11-07
Here, we present a framework to use high performance computing to determine accurate solutions to the inverse optimization problem of big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase the solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. Finally, we use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first principles calculations to better describe the experimental data.
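Mathematical regularization in inverse problems is commonly of the Tikhonov form, where a small penalty on the solution norm stabilizes an ill-conditioned fit. The toy solver below is a generic illustration under invented data, not part of the SIMPHONIES workflow:

```python
def tikhonov_2x2(A, y, lam):
    """Solve min ||A x - y||^2 + lam ||x||^2 for two unknowns via the
    normal equations (A^T A + lam I) x = A^T y, with an explicit 2x2 inverse."""
    m00 = sum(r[0] * r[0] for r in A) + lam
    m01 = sum(r[0] * r[1] for r in A)
    m11 = sum(r[1] * r[1] for r in A) + lam
    b0 = sum(r[0] * yi for r, yi in zip(A, y))
    b1 = sum(r[1] * yi for r, yi in zip(A, y))
    det = m00 * m11 - m01 * m01
    return ((m11 * b0 - m01 * b1) / det, (m00 * b1 - m01 * b0) / det)

# nearly collinear design matrix: the unregularized fit is ill-conditioned
A = [[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]]
y = [2.0, 2.0, 2.0]
x0 = tikhonov_2x2(A, y, 0.0)   # unregularized least squares
x1 = tikhonov_2x2(A, y, 0.1)   # small ridge penalty
print(x1[0] ** 2 + x1[1] ** 2 < x0[0] ** 2 + x0[1] ** 2)  # True: penalty damps the solution
```

In the large-scale setting of the abstract the same trade-off is navigated with iterative solvers, but the role of the penalty term is identical.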
Pan, Mei; Zhu, Yi-Xuan; Wu, Kai; Chen, Ling; Hou, Ya-Jun; Yin, Shao-Yun; Wang, Hai-Ping; Fan, Ya-Nan; Su, Cheng-Yong
2017-11-13
Core-shell or striped heteroatomic lanthanide metal-organic framework hierarchical single crystals were obtained by liquid-phase anisotropic epitaxial growth, maintaining identical periodic organization while simultaneously exhibiting spatially segregated structure. Different types of domain and orientation-controlled multicolor photophysical models are presented, which show either visually distinguishable or visible/near infrared (NIR) emissive colors. This provides a new bottom-up strategy toward the design of hierarchical molecular systems, offering high-throughput and multiplexed luminescence color tunability and readability. The unique capability of combining spectroscopic coding with 3D (three-dimensional) microscale spatial coding is established, providing potential applications in anti-counterfeiting, color barcoding, and other types of integrated and miniaturized optoelectronic materials and devices. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ecoregions of the conterminous United States: evolution of a hierarchical spatial framework
Omernik, James M.; Griffith, Glenn E.
2014-01-01
A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.
Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.
Stankov, L
1979-07-01
The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.
Fleury, Guillaume; Steele, Julian A; Gerber, Iann C; Jolibois, F; Puech, P; Muraoka, Koki; Keoh, Sye Hoe; Chaikittisilp, Watcharop; Okubo, Tatsuya; Roeffaers, Maarten B J
2018-04-05
The direct synthesis of hierarchically intergrown silicalite-1 can be achieved using a specific diquaternary ammonium agent. However, the location of these molecules in the zeolite framework, which is critical to understand the formation of the material, remains unclear. Where traditional characterization tools have previously failed, herein we use polarized stimulated Raman scattering (SRS) microscopy to resolve molecular organization inside few-micron-sized crystals. Through a combination of experiment and first-principles calculations, our investigation reveals the preferential location of the templating agent inside the linear pores of the MFI framework. Besides illustrating the attractiveness of SRS microscopy in the field of material science to study and spatially resolve local molecular distribution as well as orientation, these results can be exploited in the design of new templating agents for the preparation of hierarchical zeolites.
Hierarchical probabilistic Gabor and MRF segmentation of brain tumours in MRI volumes.
Subbanna, Nagesh K; Precup, Doina; Collins, D Louis; Arbel, Tal
2013-01-01
In this paper, we present a fully automated hierarchical probabilistic framework for segmenting brain tumours from multispectral human brain magnetic resonance images (MRIs) using multiwindow Gabor filters and an adapted Markov Random Field (MRF) framework. In the first stage, a customised Gabor decomposition is developed, based on the combined-space characteristics of the two classes (tumour and non-tumour) in multispectral brain MRIs in order to optimally separate tumour (including edema) from healthy brain tissues. A Bayesian framework then provides a coarse probabilistic texture-based segmentation of tumours (including edema) whose boundaries are then refined at the voxel level through a modified MRF framework that carefully separates the edema from the main tumour. This customised MRF is not only built on the voxel intensities and class labels as in traditional MRFs, but also models the intensity differences between neighbouring voxels in the likelihood model, along with employing a prior based on local tissue class transition probabilities. The second inference stage is shown to resolve local inhomogeneities and impose a smoothing constraint, while also maintaining the appropriate boundaries as supported by the local intensity difference observations. The method was trained and tested on the publicly available MICCAI 2012 Brain Tumour Segmentation Challenge (BRATS) Database [1] on both synthetic and clinical volumes (low grade and high grade tumours). Our method performs well compared to state-of-the-art techniques, outperforming the results of the top methods in cases of clinical high grade and low grade tumour core segmentation by 40% and 45% respectively.
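A Gabor filter of the kind such decompositions are built from is a Gaussian envelope multiplied by an oriented sinusoid. The sketch below constructs the real part of one kernel; the size and parameter values are arbitrary, and the paper's customised combined-space design is not reproduced here:

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a 2-D Gabor filter: an isotropic Gaussian envelope
    times a sinusoidal carrier oriented at angle theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)   # rotated coordinate
            env = math.exp(-(x * x + y * y) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
print(k[4][4])  # 1.0 at the centre: envelope and carrier both peak there
```

A bank of such kernels at several wavelengths and orientations, convolved with each MRI channel, yields the texture features the coarse Bayesian stage classifies.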
Cable deformation simulation and a hierarchical framework for Nb3Sn Rutherford cables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbelaez, D.; Prestemon, S. O.; Ferracin, P.
2009-09-13
Knowledge of the three-dimensional strain state induced in the superconducting filaments due to loads on Rutherford cables is essential to analyze the performance of Nb3Sn magnets. Due to the large range of length scales involved, we develop a hierarchical computational scheme that includes models at both the cable and strand levels. At the Rutherford cable level, where the strands are treated as a homogeneous medium, a three-dimensional computational model is developed to determine the deformed shape of the cable that can subsequently be used to determine the strain state under specified loading conditions, which may be of thermal, magnetic, and mechanical origins. The results can then be transferred to the model at the strand/macro-filament level for rod restack process (RRP) strands, where the geometric details of the strand are included. This hierarchical scheme can be used to estimate the three-dimensional strain state in the conductor as well as to determine the effective properties of the strands and cables from the properties of individual components. Examples of the modeling results obtained for the orthotropic mechanical properties of the Rutherford cables are presented.
A novel algorithm for delineating wetland depressions and ...
In traditional watershed delineation and topographic modeling, surface depressions are generally treated as spurious features and simply removed from a digital elevation model (DEM) to enforce flow continuity of water across the topographic surface to the watershed outlets. In reality, however, many depressions in the DEM are actual wetland landscape features that are seldom fully filled with water. For instance, wetland depressions in the Prairie Pothole Region (PPR) are seasonally to permanently flooded wetlands characterized by nested hierarchical structures with dynamic filling-spilling-merging surface-water hydrological processes. The objectives of this study were to delineate hierarchical wetland catchments and model their hydrologic connectivity using high-resolution LiDAR data and aerial imagery. We proposed a novel algorithm to delineate the hierarchical wetland catchments and characterize their geometric and topological properties. Potential hydrologic connectivity between wetlands and streams was simulated using the least-cost path algorithm. The resulting flow network delineated putative temporary or seasonal flow paths connecting wetland depressions to each other or to the river network at scales finer than available through the National Hydrography Dataset. The results demonstrated that our proposed framework is promising for improving overland flow modeling and hydrologic connectivity analysis. Presentation at AWRA Spring Specialty Conference in Sn
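The least-cost path step can be sketched with Dijkstra's algorithm over a raster cost surface. The tiny grid and 4-neighbour moves below are illustrative assumptions, not the study's LiDAR-derived surface:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a raster cost surface with 4-neighbour moves; the
    accumulated cost includes the start cell. Returns the minimal total cost."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# hypothetical cost surface: low cost along a channel, high cost on uplands
grid = [[1, 9, 9],
        [1, 1, 9],
        [9, 1, 1]]
print(least_cost_path(grid, (0, 0), (2, 2)))  # 5: the path follows the low-cost channel
```

Applied between wetland depressions and stream cells, the same search traces the putative spill paths that make up the simulated connectivity network.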
Xue, Alexander T; Hickerson, Michael J
2017-11-01
Population genetic data from multiple taxa can address comparative phylogeographic questions about community-scale response to environmental shifts, and a useful strategy to this end is to employ hierarchical co-demographic models that directly test multi-taxa hypotheses within a single, unified analysis. This approach has been applied to classical phylogeographic data sets such as mitochondrial barcodes as well as reduced-genome polymorphism data sets that can yield 10,000s of SNPs, produced by emergent technologies such as RAD-seq and GBS. A strategy for the latter had been accomplished by adapting the site frequency spectrum to a novel summarization of population genomic data across multiple taxa called the aggregate site frequency spectrum (aSFS), which potentially can be deployed under various inferential frameworks including approximate Bayesian computation, random forest and composite likelihood optimization. Here, we introduce the r package multi-dice, a wrapper program that exploits existing simulation software for flexible execution of hierarchical model-based inference using the aSFS, which is derived from reduced genome data, as well as mitochondrial data. We validate several novel software features such as applying alternative inferential frameworks, enforcing a minimal threshold of time surrounding co-demographic pulses and specifying flexible hyperprior distributions. In sum, multi-dice provides comparative analysis within the familiar R environment while allowing a high degree of user customization, and will thus serve as a tool for comparative phylogeography and population genomics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
Law, Jane
2016-01-01
Intrinsic conditional autoregressive modeling in a Bayesian hierarchical framework has been increasingly applied in small-area ecological studies. This study explores the specifications of spatial structure in this Bayesian framework in two aspects: adjacency, i.e., the set of neighbor(s) for each area; and (spatial) weight for each pair of neighbors. Our analysis was based on a small-area study of falling injuries among people age 65 and older in Ontario, Canada, which aimed to estimate risks and identify risk factors of such falls. In the case study, we observed incorrect adjacency information caused by deficiencies in the digital map itself. Further, when equal weights were replaced by weights based on a variable of expected count, the range of estimated risks increased, the number of areas whose probability of estimated risk greater than one exceeded different probability thresholds increased, and model fit improved. More importantly, the significance of a risk factor diminished. Further research is recommended to thoroughly investigate different methods of variable weights; quantify the influence of specifications of spatial weights; and develop strategies for better defining the spatial structure of a map in small-area analysis in Bayesian hierarchical spatial modeling. PMID:29546147
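The two spatial-structure choices discussed in this abstract, adjacency sets and pairwise weights, can be sketched in a few lines. The weighting scheme below, which scales each neighbor's weight by an expected-count variable and row-normalizes, is an illustrative stand-in for the paper's specification, not a reproduction of it; all area names and counts are hypothetical:

```python
def car_weights(adjacency, expected):
    """Row-normalised spatial weights for a CAR-style prior.

    adjacency: dict mapping each area to its list of neighbours.
    expected:  dict mapping each area to an expected count (the weighting
               variable; hypothetical values in the example below).
    Each neighbour's weight is proportional to its expected count, and the
    weights for each area sum to one.
    """
    w = {}
    for area, nbrs in adjacency.items():
        total = sum(expected[j] for j in nbrs)
        w[area] = {j: expected[j] / total for j in nbrs}
    return w
```

Replacing `expected` with a constant recovers the equal-weights specification the study compares against.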
Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun
2017-08-01
Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD, spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model achieves a CV result (R² = 0.81) reflecting higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
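For readers unfamiliar with the Gaussian-process machinery underlying this model, a minimal sketch of the posterior predictive mean under a squared-exponential (RBF) kernel follows. This is a generic one-dimensional illustration, not the paper's spatial PM2.5 model; the length-scale and noise values are arbitrary assumptions:

```python
import math

def rbf(x, y, ls=1.0):
    """Squared-exponential (RBF) kernel in one dimension."""
    return math.exp(-0.5 * ((x - y) / ls) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(xs, ys, x_new, noise=1e-6):
    """Posterior mean of a zero-mean GP: m(x*) = k(x*, X) (K + noise I)^-1 y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_new, xs[i]) * alpha[i] for i in range(n))
```

In the hierarchical setting described above, this predictive mean would sit inside a larger model with regression terms for AOD and other covariates; the sketch shows only the spatial-process core.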
Mollenhauer, Robert; Brewer, Shannon K.
2017-01-01
Failure to account for variable detection across survey conditions constrains progressive stream ecology and can lead to erroneous stream fish management and conservation decisions. In addition to variable detection’s confounding long-term stream fish population trends, reliable abundance estimates across a wide range of survey conditions are fundamental to establishing species–environment relationships. Despite major advancements in accounting for variable detection when surveying animal populations, these approaches remain largely ignored by stream fish scientists, and CPUE remains the most common metric used by researchers and managers. One notable advancement for addressing the challenges of variable detection is the multinomial N-mixture model. Multinomial N-mixture models use a flexible hierarchical framework to model the detection process across sites as a function of covariates; they also accommodate common fisheries survey methods, such as removal and capture–recapture. Effective monitoring of stream-dwelling Smallmouth Bass Micropterus dolomieu populations has long been challenging; therefore, our objective was to examine the use of multinomial N-mixture models to improve the applicability of electrofishing for estimating absolute abundance. We sampled Smallmouth Bass populations by using tow-barge electrofishing across a range of environmental conditions in streams of the Ozark Highlands ecoregion. Using an information-theoretic approach, we identified effort, water clarity, wetted channel width, and water depth as covariates that were related to variable Smallmouth Bass electrofishing detection. Smallmouth Bass abundance estimates derived from our top model consistently agreed with baseline estimates obtained via snorkel surveys. 
Additionally, confidence intervals from the multinomial N-mixture models were consistently more precise than those of unbiased Petersen capture–recapture estimates due to the dependency among data sets in the hierarchical framework. We demonstrate the application of this contemporary population estimation method to address a longstanding stream fish management issue. We also detail the advantages and trade-offs of hierarchical population estimation methods relative to CPUE and estimation methods that model each site separately.
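The removal design accommodated by multinomial N-mixture models can be illustrated with a single-site sketch: each electrofishing pass captures a binomial share of the fish still present, and abundance N and capture probability p are estimated jointly. The grid-search MLE below is a didactic stand-in, not the hierarchical, covariate-driven model used in the study; the pass counts in the example are invented:

```python
import math

def removal_loglik(counts, N, p):
    """Log-likelihood of a removal design: each pass captures a Binomial
    share (probability p) of the individuals still present."""
    ll, remaining = 0.0, N
    for c in counts:
        if c > remaining:
            return float("-inf")
        ll += (math.lgamma(remaining + 1) - math.lgamma(c + 1)
               - math.lgamma(remaining - c + 1)
               + c * math.log(p) + (remaining - c) * math.log(1.0 - p))
        remaining -= c
    return ll

def removal_mle(counts, N_max=500):
    """Grid-search maximum-likelihood estimate of abundance N and
    capture probability p for one site (coarse but transparent)."""
    best = (float("-inf"), None, None)
    for N in range(sum(counts), N_max + 1):
        for p in (i / 100.0 for i in range(1, 100)):
            ll = removal_loglik(counts, N, p)
            if ll > best[0]:
                best = (ll, N, p)
    return best[1], best[2]
```

The hierarchical version replaces the single (N, p) pair with site-level abundances and a detection model driven by covariates such as effort, water clarity, width, and depth.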
Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.
2015-01-01
Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.
NASA Astrophysics Data System (ADS)
Lowman, L.; Barros, A. P.
2014-12-01
Computational modeling of surface erosion processes is inherently difficult because of the four-dimensional nature of the problem and the multiple temporal and spatial scales that govern individual mechanisms. Landscapes are modified via surface and fluvial erosion and exhumation, each of which takes place over a range of time scales. Traditional field measurements of erosion/exhumation rates are scale dependent, often valid for a single point-wise location or averaging over large areal extents and periods with intense and mild erosion. We present a method of remotely estimating erosion rates using a Bayesian hierarchical model based upon the stream power erosion law (SPEL). A Bayesian approach allows for estimating erosion rates using the deterministic relationship given by the SPEL and data on channel slopes and precipitation at the basin and sub-basin scale. The spatial scale associated with this framework is the elevation class, where each class is characterized by distinct morphologic behavior observed through different modes in the distribution of basin outlet elevations. Interestingly, the distributions of first-order outlets are similar in shape and extent to the distribution of precipitation events (i.e. individual storms) over a 14-year period between 1998 and 2011. We demonstrate an application of the Bayesian hierarchical modeling framework for five basins and one intermontane basin located in the central Andes between 5°S and 20°S. Using remotely sensed data of current annual precipitation rates from the Tropical Rainfall Measuring Mission (TRMM) and topography from a high resolution (3 arc-seconds) digital elevation map (DEM), our erosion rate estimates are consistent with decadal-scale estimates based on landslide mapping and sediment flux observations and 1-2 orders of magnitude larger than most millennial and million year timescale estimates from thermochronology and cosmogenic nuclides.
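The deterministic core of this framework is the stream power erosion law, E = K A^m S^n, with drainage area A, channel slope S, erodibility K, and empirical exponents m and n; the Bayesian layer places priors on these parameters and propagates their uncertainty. A direct numeric sketch (the parameter values below are illustrative, not from the study):

```python
def stream_power_erosion(K, area, slope, m=0.5, n=1.0):
    """Stream power erosion law: E = K * A**m * S**n.

    K: erodibility; area: upstream drainage area; slope: channel slope;
    m, n: empirical exponents (the defaults are common textbook choices,
    not values fitted in the study).
    """
    return K * area ** m * slope ** n
```

In the hierarchical treatment, K, m, and n would be random variables with priors, and this function would be evaluated inside the likelihood for each elevation class.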
Inferring on the Intentions of Others by Hierarchical Bayesian Learning
Diaconescu, Andreea O.; Mathys, Christoph; Weber, Lilian A. E.; Daunizeau, Jean; Kasper, Lars; Lomakina, Ekaterina I.; Fehr, Ernst; Stephan, Klaas E.
2014-01-01
Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to “player” or “adviser” roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition. PMID:25187943
High-Dimensional Bayesian Geostatistics
Banerjee, Sudipto
2017-01-01
With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models unfeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as “priors” for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity is ~n floating point operations (flops) per iteration, where n is the number of spatial locations. We compare these methods and provide some insight into their methodological underpinnings. PMID:29391920
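The NNGP construction can be sketched to show where the sparsity comes from: under a fixed ordering of the locations, each location conditions only on its m nearest predecessors, which caps the size of every conditioning set and hence the per-location cost. The code below is a schematic of neighbor-set construction only, not a full NNGP fit:

```python
import math

def neighbor_sets(coords, m):
    """NNGP-style conditioning sets.

    Under a fixed ordering of locations, location i conditions only on its
    m nearest predecessors. The resulting Cholesky factor of the precision
    matrix then has at most m + 1 nonzeros per row, which is the source of
    the ~n-flop scaling mentioned in the abstract.
    """
    sets = []
    for i, (xi, yi) in enumerate(coords):
        preds = sorted(range(i),
                       key=lambda j: math.hypot(xi - coords[j][0],
                                                yi - coords[j][1]))
        sets.append(preds[:m])
    return sets
```

A dense Gaussian process would condition each location on all predecessors; truncating to m neighbors is the modeling choice that trades a small approximation for linear scaling.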
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Zhijie; Pan, Wenxiao
2016-01-01
To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems with increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among different unit problems performed within the said hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibrations at each unit problem level. A Bayesian calibration procedure is employed and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
A Graph-Embedding Approach to Hierarchical Visual Word Mergence.
Wang, Lei; Liu, Lingqiao; Zhou, Luping
2017-02-01
Appropriately merging visual words is an effective dimension reduction method for the bag-of-visual-words model in image classification. The approach of hierarchically merging visual words has been extensively employed, because it gives a fully determined merging hierarchy. Existing supervised hierarchical merging methods take different approaches and realize the merging process with various formulations. In this paper, we propose a unified hierarchical merging approach built upon the graph-embedding framework. Our approach is able to merge visual words for any scenario where a preferred structure and an undesired structure are defined, and, therefore, can effectively attend to all kinds of requirements for the word-merging process. In terms of computational efficiency, we show that our algorithm can seamlessly integrate a fast search strategy developed in our previous work and, thus, well maintain the state-of-the-art merging speed. To the best of our knowledge, the proposed approach is the first one that addresses hierarchical visual word mergence in such a flexible and unified manner. As demonstrated, it can maintain excellent image classification performance even after a significant dimension reduction, and outperform all the existing comparable visual word-merging methods. In a broad sense, our work provides an open platform for applying, evaluating, and developing new criteria for hierarchical word-merging tasks.
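A generic agglomerative word-merging loop conveys the flavor of such methods. The similarity criterion here (cosine similarity between per-class count vectors) is a simple placeholder, not the graph-embedding objective proposed in the paper, and the toy vocabulary is invented:

```python
import math

def merge_words(word_class_counts, target_k):
    """Greedy hierarchical merging of visual words.

    Repeatedly merge the two words whose per-class count vectors are most
    similar (cosine similarity) until target_k words remain.
    Returns the merge history and the surviving (merged) words.
    """
    def cos(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return num / den if den else 0.0

    words = {i: list(v) for i, v in enumerate(word_class_counts)}
    history = []
    while len(words) > target_k:
        keys = sorted(words)
        pairs = ((cos(words[i], words[j]), i, j)
                 for ai, i in enumerate(keys) for j in keys[ai + 1:])
        _, i, j = max(pairs, key=lambda t: t[0])
        words[i] = [a + b for a, b in zip(words[i], words[j])]  # merge j into i
        del words[j]
        history.append((i, j))
    return history, words
```

The sequence of merges recorded in `history` is exactly the "fully determined merging hierarchy" the abstract refers to; supervised methods differ mainly in the criterion used to pick the pair.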
Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos
2017-11-09
Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
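The ridge (L2) penalty mentioned above can be illustrated on an ordinary least-squares core. The gradient-descent sketch below shows only the penalized-fitting idea, not the microkinetic model, the DAE simulation layer, or the multistart strategy; the design matrix and targets in the example are arbitrary:

```python
def ridge_fit(X, y, lam=0.1, lr=0.01, steps=5000):
    """Gradient descent on the ridge objective ||X w - y||^2 + lam * ||w||^2."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(steps):
        grad = [2.0 * lam * wj for wj in w]        # penalty gradient
        for i in range(n):                          # data-misfit gradient
            r = sum(X[i][k] * w[k] for k in range(d)) - y[i]
            for k in range(d):
                grad[k] += 2.0 * r * X[i][k]
        w = [wj - lr * g for wj, g in zip(w, grad)]
    return w
```

With an identity design matrix, the minimizer is y / (1 + lam), which shows how the penalty shrinks coefficients toward zero; in the paper's setting the coefficients would be model parameters constrained by the microkinetic model.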
BioASF: a framework for automatically generating executable pathway models specified in BioPAX.
Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap
2016-06-15
Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
Facilitation of Language Acquisition Viewed through an Interpretative Lens: The Role of Authenticity
ERIC Educational Resources Information Center
Harper, Melanie Ann
2013-01-01
A paradigm is the conceptual framework or lens one uses to view reality. The field of speech-language pathology is traditionally rooted in the empirical paradigm, which believes that language can be fragmented into isolated skills and taught in a hierarchical fashion. This belief has resulted in service delivery models that remove students from…
ERIC Educational Resources Information Center
Liu, Yan; Bellibas, Mehmet Sukru; Printy, Susan
2018-01-01
Distributed leadership is a dynamic process and reciprocal interaction of the leader, the subordinates and the situation. This research was inspired by the theoretical framework of Spillane in order to contextualize distributed leadership and compare the variations using the Teaching and Learning International Survey 2013 data. The two-level…
The Challenge of Separating Effects of Simultaneous Education Projects on Student Achievement
ERIC Educational Resources Information Center
Ma, Xin; Ma, Lingling
2009-01-01
When multiple education projects operate in an overlapping or rear-ended manner, it is always a challenge to separate unique project effects on schooling outcomes. Our analysis represents a first attempt to address this challenge. A three-level hierarchical linear model (HLM) was presented as a general analytical framework to separate program…
ERIC Educational Resources Information Center
Jones, Sandra; Lefoe, Geraldine; Harvey, Marina; Ryland, Kevin
2012-01-01
New models of leadership are needed for the higher education sector to continue to graduate students with leading edge capabilities. While multiple theories of leadership exist, the higher education sector requires a less hierarchical approach that takes account of its specialised and professional context. Over the last decade the sector has…
Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe
2013-01-01
Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm² to 30 cm², whatever were the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered. PMID:23418485
Architecture for reactive planning of robot actions
NASA Astrophysics Data System (ADS)
Riekki, Jukka P.; Roening, Juha
1995-01-01
In this article, a reactive system for planning robot actions is described. The described hierarchical control system architecture consists of planning-executing-monitoring-modelling elements (PEMM elements). A PEMM element is a goal-oriented, combined processing and data element. It includes a planner, an executor, a monitor, a modeler, and a local model. The elements form a tree-like structure. An element receives tasks from its ancestor and sends subtasks to its descendants. The model knowledge is distributed into the local models, which are connected to each other. The elements can be synchronized. The PEMM architecture is strictly hierarchical. It integrates planning, sensing, and modelling into a single framework. A PEMM-based control system is reactive, as it can cope with asynchronous events and operate under time constraints. The control system is intended primarily to control mobile robots and robot manipulators in dynamic and partially unknown environments. It is especially suitable for applications consisting of physically separated devices and computing resources.
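The tree-structured task delegation among PEMM elements can be sketched as a minimal class. The names and the task-splitting rule below are hypothetical simplifications for illustration; the actual elements also carry planners, monitors, and local models that are omitted here:

```python
class PEMMElement:
    """Sketch of one PEMM element in the control tree (structure only)."""

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def handle(self, task, log):
        """Record the received task, then delegate numbered subtasks to
        descendants (a stand-in for the element's planning step)."""
        log.append((self.name, task))
        for idx, child in enumerate(self.children):
            child.handle(f"{task}.{idx}", log)
        return log
```

Tasks flow down the tree from ancestor to descendants, mirroring the strictly hierarchical structure described in the abstract.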
Multilevel Hierarchical Kernel Spectral Clustering for Real-Life Large Scale Complex Networks
Mall, Raghvendra; Langone, Rocco; Suykens, Johan A. K.
2014-01-01
Kernel spectral clustering corresponds to a weighted kernel principal component analysis problem in a constrained optimization framework. The primal formulation leads to an eigen-decomposition of a centered Laplacian matrix at the dual level. The dual formulation allows building a model on a representative subgraph of the large-scale network in the training phase, and the model parameters are estimated in the validation stage. The KSC model has a powerful out-of-sample extension property which allows cluster affiliation for the unseen nodes of the big data network. In this paper we exploit the structure of the projections in the eigenspace during the validation stage to automatically determine a set of increasing distance thresholds. We use these distance thresholds in the test phase to obtain multiple levels of hierarchy for the large-scale network. The hierarchical structure in the network is determined in a bottom-up fashion. We empirically showcase that real-world networks have a multilevel hierarchical organization which cannot be detected efficiently by several state-of-the-art large-scale hierarchical community detection techniques such as the Louvain, OSLOM and Infomap methods. We show that a major advantage of our proposed approach is the ability to locate good quality clusters at both the finer and coarser levels of hierarchy using internal cluster quality metrics on 7 real-life networks. PMID:24949877
Brough, David B; Wheeler, Daniel; Kalidindi, Surya R
2017-03-01
There is a critical need for customized analytics that take into account the stochastic nature of the internal structure of materials at multiple length scales in order to extract relevant and transferable knowledge. Data-driven Process-Structure-Property (PSP) linkages provide a systemic, modular and hierarchical framework for community-driven curation of materials knowledge and its transference to design and manufacturing experts. The Materials Knowledge Systems in Python project (PyMKS) is the first open-source materials data science framework that can be used to create high-value PSP linkages for hierarchical materials, and it can be leveraged by experts in the materials science and engineering, manufacturing, machine learning and data science communities. This paper describes the main functions available from this repository, along with illustrations of how these can be accessed, utilized, and potentially further refined by the broader community of researchers.
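A basic building block of such PSP frameworks is the 2-point spatial correlation of a microstructure. The sketch below computes it from scratch with an FFT under periodic boundary assumptions; it illustrates the statistic itself and does not use or reproduce the PyMKS API.

```python
import numpy as np

def two_point_stats(m):
    """Periodic 2-point autocorrelation of a binary microstructure via FFT:
    s2[r] is the probability that both ends of vector r land in phase 1."""
    F = np.fft.fftn(m)
    return np.real(np.fft.ifftn(F * np.conj(F))) / m.size

m = np.zeros((8, 8))
m[2:5, 2:5] = 1.0          # a single square "inclusion" of phase 1
s2 = two_point_stats(m)
# The zero-vector entry recovers the volume fraction of phase 1:
print(s2[0, 0])            # 9/64 = 0.140625
```

Feeding such statistics into a regression against a property of interest is the essence of a data-driven structure-property linkage.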
Butun, Ismail; Ra, In-Ho; Sankar, Ravi
2015-01-01
In this work, an intrusion detection system (IDS) framework based on multi-level clustering for hierarchical wireless sensor networks is proposed. The framework employs two types of intrusion detection approaches: (1) “downward-IDS (D-IDS)” to detect the abnormal behavior (intrusion) of the subordinate (member) nodes; and (2) “upward-IDS (U-IDS)” to detect the abnormal behavior of the cluster heads. By using analytical calculations, the optimum parameters for the D-IDS (number of maximum hops) and U-IDS (monitoring group size) of the framework are evaluated and presented. PMID:26593915
Control, responses and modularity of cellular regulatory networks: a control analysis perspective.
Bruggeman, F J; Snoep, J L; Westerhoff, H V
2008-11-01
Cells adapt to changes in environmental conditions through the concerted action of signalling, gene expression and metabolic subsystems. The authors discuss a theoretical framework addressing such integrated systems. This 'hierarchical analysis' was first developed as an extension of metabolic control analysis. It builds on the observation that communication between the signalling, gene expression and metabolic subsystems is often almost exclusively via regulatory interactions rather than via mass-flow interactions. This allows the subsystems to be treated as 'levels' in a hierarchical view of the organisation of the molecular reaction network of cells. A major advantage of such a hierarchical approach is that levels can be analysed conceptually in isolation from each other (from a local, intra-level perspective) and integrated at a later stage via their interactions (from a global, inter-level perspective); it thereby allows a modular approach with variable scope. A number of approaches have been developed for the analysis of hierarchical systems, for example hierarchical control analysis and modular response analysis. The authors review these methods here and illustrate the strength of these types of analyses using a core model of a system with gene expression, metabolic and signal transduction levels.
Sasaki, Hatoko; Yonemoto, Naohiro; Mori, Rintaro; Nishida, Toshihiko; Kusuda, Satoshi; Nakayama, Takeo
2017-01-01
Objective: To assess organizational culture in neonatal intensive care units (NICUs) in Japan. Design: Cross-sectional survey of organizational culture. Setting: Forty NICUs across Japan. Participants: Physicians and nurses who worked in NICUs (n = 2006). Main Outcome Measures: The Competing Values Framework (CVF) was used to assess the organizational culture of the study population. The 20-item CVF was divided into four culture archetypes: Group, Developmental, Hierarchical and Rational. We calculated geometric means (gmean) and 95% bootstrap confidence intervals of the individual dimensions by unit and occupation. The median number of staff, beds, physicians' work hours and work engagement were also calculated to examine the differences by culture archetype. Results: The Group (gmean = 34.6) and Hierarchical (gmean = 31.7) culture archetypes scored higher than Developmental (gmean = 16.3) and Rational (gmean = 17.4) among physicians as a whole. Among nurses as a whole, Hierarchical (gmean = 36.3) was highest, followed by Group (gmean = 25.8), Rational (gmean = 21.7) and Developmental (gmean = 16.3). Units with a dominant Hierarchical culture had a slightly higher number of physicians (median = 7) than those with a dominant Group culture (median = 6). Units with a dominant Group culture had a higher number of beds (median = 12) than those with a dominant Hierarchical culture (median = 9) among physicians. Nurses from units with a dominant Group culture (median = 2.8) had slightly higher work engagement than those in units with a dominant Hierarchical culture (median = 2.6). Conclusions: Our findings revealed that organizational culture in NICUs varies depending on occupation and group size. Group and Hierarchical cultures predominated in Japanese NICUs. Assessing organizational culture will provide insights into the perceptions of unit values to improve quality of care. PMID:28371865
NASA Astrophysics Data System (ADS)
Wu, Qiusheng; Lane, Charles R.
2017-07-01
In traditional watershed delineation and topographic modeling, surface depressions are generally treated as spurious features and simply removed from a digital elevation model (DEM) to enforce flow continuity of water across the topographic surface to the watershed outlets. In reality, however, many depressions in the DEM are actual wetland landscape features with seasonal to permanent inundation patterning, characterized by nested hierarchical structures and dynamic filling-spilling-merging surface-water hydrological processes. Differentiating and appropriately processing such ecohydrologically meaningful features remains a major technical terrain-processing challenge, particularly as high-resolution spatial data are increasingly used to support modeling and geographic analysis needs. The objectives of this study were to delineate hierarchical wetland catchments and model their hydrologic connectivity using high-resolution lidar data and aerial imagery. The graph-theory-based contour tree method was used to delineate the hierarchical wetland catchments and characterize their geometric and topological properties. Potential hydrologic connectivity between wetlands and streams was simulated using the least-cost-path algorithm. The resulting flow network delineated potential flow paths connecting wetland depressions to each other or to the river network on scales finer than those available through the National Hydrography Dataset. The results demonstrated that our proposed framework is promising for improving overland flow simulation and hydrologic connectivity analysis.
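The least-cost-path step can be illustrated with a generic Dijkstra search over a small cost raster. This is a stand-in sketch (4-connectivity, path cost as the sum of entered cell costs); the study's actual cost surface and connectivity rules may differ.

```python
import heapq

def least_cost(grid, start, goal):
    """Dijkstra least-cost path on a cost raster (4-connected). The cost of
    a path is the sum of the costs of the cells it enters."""
    rows, cols = len(grid), len(grid[0])
    best = {start: 0}
    pq = [(0, start)]
    while pq:
        c, (r, q) = heapq.heappop(pq)
        if (r, q) == goal:
            return c
        if c > best.get((r, q), float("inf")):
            continue                      # stale queue entry
        for dr, dq in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nq = r + dr, q + dq
            if 0 <= nr < rows and 0 <= nq < cols:
                nc = c + grid[nr][nq]
                if nc < best.get((nr, nq), float("inf")):
                    best[(nr, nq)] = nc
                    heapq.heappush(pq, (nc, (nr, nq)))
    return None

grid = [[1, 1, 1],
        [1, 9, 1],
        [1, 1, 1]]
print(least_cost(grid, (0, 0), (2, 2)))   # 4: the path detours around the 9
```

In the wetland setting the "expensive" cells would be high ground or barriers, so the recovered paths trace plausible spill routes between depressions and streams.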
Shankle, William R.; Pooley, James P.; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D.
2012-01-01
Determining how cognition affects functional abilities is important in Alzheimer’s disease and related disorders (ADRD). A total of 280 patients (normal or ADRD) received 1,514 assessments using the Functional Assessment Staging Test (FAST) procedure and the MCI Screen (MCIS). A hierarchical Bayesian cognitive processing (HBCP) model was created by embedding a signal detection theory (SDT) model of the MCIS delayed recognition memory task into a hierarchical Bayesian framework. The SDT model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the six FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. HBCP models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition to a continuous measure of functional severity for both individuals and FAST groups. Such a translation links two levels of brain information processing, and may enable more accurate correlations with other levels, such as those characterized by biomarkers. PMID:22407225
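The SDT core that the model embeds can be sketched with the standard equal-variance formulas: discriminability d' and response bias (criterion) c are computed from z-transformed hit and false-alarm rates. This is the textbook computation, not the paper's full hierarchical sampler.

```python
from statistics import NormalDist

def sdt_params(hit_rate, fa_rate):
    """Equal-variance signal detection theory: discriminability d' and
    response bias c from hit and false-alarm rates."""
    z = NormalDist().inv_cdf                 # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)       # memory-process parameter
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # executive-function parameter
    return d_prime, criterion

# Hit and false-alarm rates whose z-scores are about +1 and -1:
d, c = sdt_params(0.8413, 0.1587)            # d' near 2, unbiased responding
```

The hierarchical extension pools such per-patient parameters toward group-level (FAST-stage) distributions, which is what lets the latent parameters separate stages that the raw accuracies cannot.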
Bao, Wei; Yue, Jun; Rao, Yulei
2017-01-01
The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long short-term memory (LSTM) networks are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoising features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance.
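The first stage, wavelet denoising, can be sketched with a one-level Haar transform: decompose, soft-threshold the noise-dominated detail coefficients, and invert. The paper's exact wavelet family, decomposition depth and threshold rule may differ; this only shows the mechanism.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising of an even-length series:
    transform, soft-threshold the detail coefficients, invert."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)      # approximation (trend)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail (noise-dominated)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft threshold
    y = np.empty_like(x)                       # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

clean = haar_denoise([5.0, 5.0, 5.0, 5.0], 0.1)   # constant series is untouched
```

The denoised series (or its SAE-compressed features) would then be fed to the LSTM as the forecasting input.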
Unsupervised active learning based on hierarchical graph-theoretic clustering.
Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve
2009-10-01
Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, lack of ability in selecting new samples that belong to new categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely, dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms the support-vector-machine-based supervised active learning, particularly in terms of dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.
Learning to learn causal models.
Kemp, Charles; Goodman, Noah D; Tenenbaum, Joshua B
2010-09-01
Learning to understand a single causal system can be an achievement, but humans must learn about multiple causal systems over the course of a lifetime. We present a hierarchical Bayesian framework that helps to explain how learning about several causal systems can accelerate learning about systems that are subsequently encountered. Given experience with a set of objects, our framework learns a causal model for each object and a causal schema that captures commonalities among these causal models. The schema organizes the objects into categories and specifies the causal powers and characteristic features of these categories and the characteristic causal interactions between categories. A schema of this kind allows causal models for subsequent objects to be rapidly learned, and we explore this accelerated learning in four experiments. Our results confirm that humans learn rapidly about the causal powers of novel objects, and we show that our framework accounts better for our data than alternative models of causal learning. Copyright © 2010 Cognitive Science Society, Inc.
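A toy version of this "learning to learn" effect: fit a Beta schema (by the method of moments) to several familiar objects' causal-activation rates, and inference about a brand-new object becomes fast because the schema acts as an informative prior. All numbers and the moment-matching shortcut are illustrative; the paper's framework learns far richer schemata.

```python
# Toy "causal schema": a Beta prior fitted to familiar objects' activation
# rates accelerates learning about a new object (illustrative numbers only).
def beta_from_moments(rates):
    """Method-of-moments Beta(alpha, beta) fit to observed rates."""
    m = sum(rates) / len(rates)
    v = sum((r - m) ** 2 for r in rates) / len(rates)
    k = m * (1 - m) / v - 1
    return m * k, (1 - m) * k                # (alpha, beta)

rates = [0.88, 0.92, 0.90, 0.86]             # familiar objects: strong powers
a, b = beta_from_moments(rates)

# New object, one successful trial: posterior mean under the learned schema
# vs. under a flat Beta(1, 1) prior.
schema_mean = (a + 1) / (a + b + 1)          # already confident it is strong
flat_mean = (1 + 1) / (1 + 1 + 1)            # a flat prior says only 2/3
```

One observation suffices under the schema because the population of previous objects has already pinned down what "objects like this" tend to do, which is the accelerated learning the experiments probe.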
Hierarchical competitions subserving multi-attribute choice
Hunt, Laurence T; Dolan, Raymond J; Behrens, Timothy EJ
2015-01-01
Valuation is a key tenet of decision neuroscience, where it is generally assumed that different attributes of competing options are assimilated into unitary values. Such values are central to current neural models of choice. By contrast, psychological studies emphasize complex interactions between choice and valuation. Principles of neuronal selection also suggest competitive inhibition may occur in early valuation stages, before option selection. Here, we show behavior in multi-attribute choice is best explained by a model involving competition at multiple levels of representation. This hierarchical model also explains neural signals in human brain regions previously linked to valuation, including striatum, parietal and prefrontal cortex, where activity represents competition within-attribute, competition between attributes, and option selection. This multi-layered inhibition framework challenges the assumption that option values are computed before choice. Instead our results indicate a canonical competition mechanism throughout all stages of a processing hierarchy, not simply at a final choice stage. PMID:25306549
Towards a multi-level approach to the emergence of meaning processes in living systems.
Queiroz, João; El-Hani, Charbel Niño
2006-09-01
Any description of the emergence and evolution of different types of meaning processes (semiosis, sensu C. S. Peirce) in living systems must be supported by a theoretical framework which makes it possible to understand the nature and dynamics of such processes. Here we propose that the emergence of semiosis of different kinds can be understood as resulting from fundamental interactions in a triadically-organized hierarchical process. To grasp these interactions, we develop a model grounded on Stanley Salthe's hierarchical structuralism. This model can be applied to establish, in a general sense, a set of theoretical constraints for explaining the instantiation of different kinds of meaning processes (iconic, indexical, symbolic) in semiotic systems. We use it to model a semiotic process in the immune system, namely, B-cell activation, in order to offer insights into the heuristic role it can play in the development of explanations for specific semiotic processes.
NASA Technical Reports Server (NTRS)
Buntine, Wray L.
1995-01-01
Intelligent systems require software incorporating probabilistic reasoning and, often, learning. Networks provide a framework and methodology for creating this kind of software. This paper introduces network models based on chain graphs with deterministic nodes. Chain graphs are defined as a hierarchical combination of Bayesian and Markov networks. To model learning, plates on chain graphs are introduced to represent independent samples. The paper concludes by discussing various operations that can be performed on chain graphs with plates, either as a simplification process or to generate learning algorithms.
Amundson, Courtney L.; Royle, J. Andrew; Handel, Colleen M.
2014-01-01
Imperfect detection during animal surveys biases estimates of abundance and can lead to improper conclusions regarding distribution and population trends. Farnsworth et al. (2005) developed a combined distance-sampling and time-removal model for point-transect surveys that addresses both availability (the probability that an animal is available for detection; e.g., that a bird sings) and perceptibility (the probability that an observer detects an animal, given that it is available for detection). We developed a hierarchical extension of the combined model that provides an integrated analysis framework for a collection of survey points at which both distance from the observer and time of initial detection are recorded. Implemented in a Bayesian framework, this extension facilitates evaluating covariates on abundance and detection probability, incorporating excess zero counts (i.e. zero-inflation), accounting for spatial autocorrelation, and estimating population density. Species-specific characteristics, such as behavioral displays and territorial dispersion, may lead to different patterns of availability and perceptibility, which may, in turn, influence the performance of such hierarchical models. Therefore, we first test our proposed model using simulated data under different scenarios of availability and perceptibility. We then illustrate its performance with empirical point-transect data for a songbird that consistently produces loud, frequent, primarily auditory signals, the Golden-crowned Sparrow (Zonotrichia atricapilla); and for 2 ptarmigan species (Lagopus spp.) that produce more intermittent, subtle, and primarily visual cues. 
Data were collected by multiple observers along point transects across a broad landscape in southwest Alaska, so we evaluated point-level covariates on perceptibility (observer and habitat), availability (date within season and time of day), and abundance (habitat, elevation, and slope), and included a nested point-within-transect and park-level effect. Our results suggest that this model can provide insight into the detection process during avian surveys and reduce bias in estimates of relative abundance but is best applied to surveys of species with greater availability (e.g., breeding songbirds).
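The two detection components combined in such models can be sketched numerically: perceptibility from a half-normal distance-detection function averaged over a point transect, and availability from the time-removal formulation. The parameter values below are arbitrary illustrations, not estimates from the study.

```python
import numpy as np

def point_transect_p(sigma, w, n=100000):
    """Average perceptibility on a point transect with half-normal detection
    g(r) = exp(-r^2 / (2 sigma^2)), integrating over the triangular density
    of distances within radius w (midpoint-rule quadrature)."""
    r = (np.arange(n) + 0.5) * w / n
    g = np.exp(-r**2 / (2 * sigma**2))
    return np.sum(g * 2 * r / w**2) * (w / n)

def availability(p_interval, n_intervals):
    """Time-removal availability: probability a bird signals in at least one
    of n equal intervals, given per-interval probability p_interval."""
    return 1 - (1 - p_interval) ** n_intervals

p_perc = point_transect_p(sigma=50.0, w=50.0)   # about 0.787
p_avail = availability(0.3, 3)                  # 1 - 0.7**3 = 0.657
p_detect = p_perc * p_avail                     # overall detection probability
```

The hierarchical model estimates these two probabilities jointly (with covariates on each), which is why species with very different availability patterns, songbirds versus ptarmigan, test the model in different ways.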
Huang, Xiaobi; Elliott, Michael R.; Harlow, Siobán D.
2013-01-01
As women approach menopause, the patterns of their menstrual cycle lengths change. To study these changes, we need to jointly model both the mean and the variability of cycle length. Our proposed model incorporates separate mean and variance change points for each woman and a hierarchical model to link them together, along with regression components to include predictors of menopausal onset such as age at menarche and parity. Additional complexity arises from the fact that the calendar data have substantial missingness due to hormone use, surgery, and failure to report. We integrate multiple imputation and time-to-event modeling in a Bayesian estimation framework to deal with the different forms of missingness. Posterior predictive model checks are applied to evaluate model fit. Our method successfully models patterns of women’s menstrual cycle trajectories throughout their late reproductive life and identifies change points for the mean and variability of segment length, providing insight into the menopausal process. More generally, our model points the way toward increasing use of joint mean-variance models to predict health outcomes and better understand disease processes. PMID:24729638
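The change-point idea can be illustrated with a bare-bones frequentist sketch: choose the split that maximizes a two-segment Gaussian log-likelihood, with each segment's variance at its maximum-likelihood value (so a shift in either mean or variability moves the optimum). The cycle-length numbers are invented for illustration; the paper's model is Bayesian, hierarchical across women, and handles missing data.

```python
import numpy as np

def changepoint(y):
    """Single change point by maximizing the two-segment Gaussian
    log-likelihood, profiling out each segment's mean and variance."""
    y = np.asarray(y, dtype=float)
    best_cp, best_ll = None, -np.inf
    for cp in range(2, len(y) - 2):            # keep >= 2 points per segment
        ll = 0.0
        for seg in (y[:cp], y[cp:]):
            # -n/2 * log(sigma_hat^2); additive constants cancel in comparison
            ll += -0.5 * len(seg) * np.log(seg.var() + 1e-12)
        if ll > best_ll:
            best_cp, best_ll = cp, ll
    return best_cp

cycles = [26, 28, 27, 29, 26, 28, 27, 29, 26, 28,   # stable segment
          35, 45, 33, 50, 38, 47, 34, 52, 36, 49]   # later, erratic segment
print(changepoint(cycles))   # 10
```

Separate mean and variance change points, as in the paper, would repeat this kind of search with the two shifts decoupled and then shrink the per-woman estimates toward population-level distributions.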
Ma, Songyun; Scheider, Ingo; Bargmann, Swantje
2016-09-01
An anisotropic constitutive model is proposed in the framework of finite deformation to capture several damage mechanisms occurring in the microstructure of dental enamel, a hierarchical bio-composite. It provides the basis for a homogenization approach for an efficient multiscale (in this case: multiple hierarchy levels) investigation of the deformation and damage behavior. The influence of tension-compression asymmetry and fiber-matrix interaction on the nonlinear deformation behavior of dental enamel is studied by 3D micromechanical simulations under different loading conditions and fiber lengths. The complex deformation behavior and the characteristics and interaction of three damage mechanisms in the damage process of enamel are well captured. The proposed constitutive model incorporating anisotropic damage is applied to the first hierarchical level of dental enamel and validated by experimental results. The effect of the fiber orientation on the damage behavior and compressive strength is studied by comparing micro-pillar experiments of dental enamel at the first hierarchical level in multiple directions of fiber orientation. A very good agreement between computational and experimental results is found for the damage evolution process of dental enamel. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Retrieval Capabilities of Hierarchical Networks: From Dyson to Hopfield
NASA Astrophysics Data System (ADS)
Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Guerra, Francesco; Tantari, Daniele; Tavani, Flavia
2015-01-01
We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of a non-mean-field framework. We show that the decay of the coupling with spin distance can give rise to peculiar features and phase diagrams much richer than their mean-field counterparts. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of metastabilities, beyond the ordered state, which become stable in the thermodynamic limit. This feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform single-pattern retrieval as well as multiple-pattern retrieval, depending crucially on the external stimuli and on the rate at which the interaction decays with distance; however, these emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, Markov chain theory, the signal-to-noise ratio technique, and numerical simulations, all in full agreement. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.
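The Hebbian storage-and-retrieval baseline that the paper generalizes to hierarchical couplings can be shown on a tiny flat (mean-field-style) Hopfield network: store one pattern with the Hebb rule, corrupt a spin, and a single synchronous update recovers it.

```python
import numpy as np

# Single-pattern Hebbian storage and retrieval on a flat Hopfield network,
# the non-hierarchical baseline for the associative networks in the paper.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1])        # stored +/-1 pattern
W = np.outer(p, p) / len(p)                        # Hebb rule
np.fill_diagonal(W, 0)                             # no self-coupling

x = p.copy()
x[0] = -x[0]                                       # corrupt one spin (stimulus)
x = np.where(W @ x >= 0, 1, -1)                    # one synchronous update
print((x == p).all())                              # True: pattern retrieved
```

In the hierarchical version, W would instead decay with ultrametric distance between spins, which is what produces the metastable states and the multiple-pattern (multitasking) retrieval regimes described above.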
Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.
Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O; Gelfand, Alan E
2016-01-01
Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating-point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online.
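The mechanism behind the sparse precision matrix can be shown in one dimension: write the joint Gaussian density as a chain of conditionals, but let each ordered observation condition on at most its m nearest previous neighbors, so only m-by-m systems are ever solved. This is a small didactic sketch with an exponential covariance, not the paper's hierarchical sampler; conditioning on all previous points (m = n-1) recovers the exact GP density.

```python
import numpy as np

def nngp_loglik(coords, y, m, variance=1.0, scale=1.0):
    """NNGP-style log-likelihood: each ordered observation conditions on at
    most its m nearest previous neighbors (exponential covariance)."""
    def cov(a, b):
        return variance * np.exp(-np.abs(a[:, None] - b[None, :]) / scale)
    ll = 0.0
    for i in range(len(y)):
        prev = np.argsort(np.abs(coords[:i] - coords[i]))[:m]  # neighbor set
        if len(prev) == 0:
            mu, var = 0.0, variance
        else:
            C = cov(coords[prev], coords[prev])      # small m-by-m system only
            c = cov(coords[prev], coords[i:i + 1])[:, 0]
            w = np.linalg.solve(C, c)                # kriging weights
            mu, var = w @ y[prev], variance - w @ c
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return ll

coords = np.array([0.0, 0.3, 0.9, 1.4, 2.0])
y = np.array([0.5, 0.4, -0.2, -0.1, 0.3])
exact = nngp_loglik(coords, y, m=4)    # m = n-1: the full GP density
approx = nngp_loglik(coords, y, m=2)   # sparse approximation
```

Because each conditional touches only m neighbors, the cost per likelihood evaluation grows linearly in the number of locations, which is the scalability the article exploits inside MCMC.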
Forbes, Miriam K.; Tackett, Jennifer L.; Markon, Kristian E.; Krueger, Robert F.
2016-01-01
In this review, we propose a novel developmentally informed framework to push research beyond a focus on comorbidity between discrete diagnostic categories, and to move towards research based on the well-validated dimensional and hierarchical structure of psychopathology. For example, a large body of research speaks to the validity and utility of the Internalizing and Externalizing (IE) spectra as organizing constructs for research on common forms of psychopathology. The IE spectra act as powerful explanatory variables that channel the psychopathological effects of genetic and environmental risk factors, predict adaptive functioning, and account for the likelihood of disorder-level manifestations of psychopathology. As such, our proposed theoretical framework uses the IE spectra as central constructs to guide future psychopathology research across the lifespan. The framework is particularly flexible, as any of the facets or factors from the dimensional and hierarchical structure of psychopathology can form the focus of research. We describe the utility and strengths of this framework for developmental psychopathology in particular, and explore avenues for future research. PMID:27739384
Fujii, Keisuke; Isaka, Tadao; Kouzaki, Motoki; Yamamoto, Yuji
2015-01-01
Humans interact by changing their actions, perceiving others’ actions and executing solutions in conflicting situations. Using oscillator models, nonlinear dynamics has been used to describe these complex human movements as emergent self-organisation. However, these frameworks cannot explain the hierarchical structure of complex behaviours spanning conflicting inter-agent and adapting intra-agent systems, especially in sports competitions, wherein mutually quick decision making and execution are required. Here we adopt a hybrid multiscale approach to model an attack-and-defend game during which both players predict the opponent’s movement and move with a delay. From both simulated and measured data, one synchronous outcome between the two agents (i.e. successful defence) can be described as one attractor. In contrast, the other, coordination-breaking outcome (i.e. successful attack) cannot be explained using gradient dynamics because the asymmetric interaction cannot always assume a conserved physical quantity. Instead, we provide asymmetric and asynchronous hierarchical dynamical models to discuss two-agent competition. Our framework suggests that possessing information about an opponent and oneself at the local-coordinative and global-competitive scales enables a deeper understanding of sports competitions. We anticipate developments in the scientific fields of complex movement adapting to such uncontrolled environments. PMID:26538452
NASA Astrophysics Data System (ADS)
Mandel, Kaisey; Kirshner, R. P.; Narayan, G.; Wood-Vasey, W. M.; Friedman, A. S.; Hicken, M.
2010-01-01
I have constructed a comprehensive statistical model for Type Ia supernova light curves spanning optical through near infrared data simultaneously. The near infrared light curves are found to be excellent standard candles (sigma(MH) = 0.11 +/- 0.03 mag) that are less vulnerable to systematic error from dust extinction, a major confounding factor for cosmological studies. A hierarchical statistical framework incorporates coherently multiple sources of randomness and uncertainty, including photometric error, intrinsic supernova light curve variations and correlations, dust extinction and reddening, peculiar velocity dispersion and distances, for probabilistic inference with Type Ia SN light curves. Inferences are drawn from the full probability density over individual supernovae and the SN Ia and dust populations, conditioned on a dataset of SN Ia light curves and redshifts. To compute probabilistic inferences with hierarchical models, I have developed BayeSN, a Markov Chain Monte Carlo algorithm based on Gibbs sampling. This code explores and samples the global probability density of parameters describing individual supernovae and the population. I have applied this hierarchical model to optical and near infrared data of over 100 nearby Type Ia SN from PAIRITEL, the CfA3 sample, and the literature. Using this statistical model, I find that SN with optical and NIR data have a smaller residual scatter in the Hubble diagram than SN with only optical data. The continued study of Type Ia SN in the near infrared will be important for improving their utility as precise and accurate cosmological distance indicators.
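The Gibbs-sampling machinery described above can be illustrated on a toy version of the hierarchy: a population of standard-candle absolute magnitudes observed with photometric error. This is a minimal sketch of the conjugate Gibbs updates only, not the BayeSN code; all parameter values (population mean -19.3 mag, scatter 0.11 mag, errors 0.05 mag) are illustrative.

```python
import numpy as np

# Toy hierarchical model:
#   M_i ~ Normal(M0, tau^2)        population of SN absolute magnitudes
#   m_i ~ Normal(M_i, sigma_i^2)   noisy observation of each SN
# Both conditionals are conjugate, so each Gibbs step draws from a normal.

rng = np.random.default_rng(0)
n = 100
true_M0, true_tau = -19.3, 0.11
sigma = np.full(n, 0.05)                     # photometric errors
M = rng.normal(true_M0, true_tau, n)         # latent magnitudes
m = rng.normal(M, sigma)                     # observed magnitudes

tau = 0.1                                    # held fixed for simplicity
M0 = 0.0                                     # deliberately poor start
samples = []
for it in range(2000):
    # Draw each latent M_i | M0, data (normal-normal conjugacy)
    prec = 1.0 / sigma**2 + 1.0 / tau**2
    mean = (m / sigma**2 + M0 / tau**2) / prec
    M_latent = rng.normal(mean, 1.0 / np.sqrt(prec))
    # Draw population mean M0 | {M_i} (flat prior)
    M0 = rng.normal(M_latent.mean(), tau / np.sqrt(n))
    if it >= 500:                            # discard burn-in
        samples.append(M0)

M0_hat = float(np.mean(samples))
```

The posterior mean recovers the true population magnitude to well within the population scatter divided by the square root of the sample size.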
Margaret Carreiro; Wayne Zipperer
2011-01-01
The responses of urban park woodlands to large disturbances provide the opportunity to identify and examine linkages in social-ecological systems in urban landscapes. We propose that the Panarchy model consisting of hierarchically nested adaptive cycles provides a useful framework to evaluate those linkages. We use two case studies as examples, Cherokee Park in...
Effects of a 12-Week Resistance Exercise Program on Physical Self-Perceptions in College Students
ERIC Educational Resources Information Center
Moore, Justin B.; Mitchell, Nathanael G.; Bibeau, Wendy S.; Bartholomew, John B.
2011-01-01
A growing body of literature suggests that exercise can promote positive changes in physical self-perceptions that can manifest as an increase in global self-esteem. In the present study, we assessed self-esteem using the hierarchical framework of the Exercise and Self-Esteem Model (EXSEM) along with cognitive facets at the subdomain level…
Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall
2016-01-01
Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States' NGHGI do not specifically incorporate methods to address error in tree-scale biomass...
NASA Astrophysics Data System (ADS)
Kavanaugh, M.; Muller-Karger, F. E.; Montes, E.; Santora, J. A.; Chavez, F.; Messié, M.; Doney, S. C.
2016-02-01
The pelagic ocean is a complex system in which physical, chemical and biological processes interact to shape patterns on multiple spatial and temporal scales and levels of ecological organization. Monitoring and management of marine seascapes must consider a hierarchical and dynamic mosaic, where the boundaries, extent, and location of features change with time. As part of a Marine Biodiversity Observing Network demonstration project, we conducted a multiscale classification of dynamic coastal seascapes in the northeastern Pacific and Gulf of Mexico using multivariate satellite and modeled data. Synoptic patterns were validated using mooring and ship-based observations that spanned multiple trophic levels and were collected as part of several long-term monitoring programs, including the Monterey Bay and Florida Keys National Marine Sanctuaries. Seascape extent and habitat diversity varied as a function of both seasonal and interannual forcing. We discuss the patterns of in situ observations in the context of seascape dynamics and the effect on rarefaction, spatial patchiness, and tracking and comparing ecosystems through time. A seascape framework presents an effective means to translate local biodiversity measurements to broader spatiotemporal scales, scales relevant for modeling the effects of global change and enabling whole-ecosystem management in the dynamic ocean.
Social determinants of childhood asthma symptoms: an ecological study in urban Latin America.
Fattore, Gisel L; Santos, Carlos A T; Barreto, Mauricio L
2014-04-01
Asthma is an important public health problem in urban Latin America. This study aimed to analyze the role of socioeconomic and environmental factors as potential determinants of asthma symptoms prevalence in children from Latin American (LA) urban centers. We selected 31 LA urban centers with complete data, and an ecological analysis was performed. According to our theoretical framework, the explanatory variables were classified in three levels: distal, intermediate, and proximate. The association between variables in the three levels and prevalence of asthma symptoms was examined by bivariate and multivariate linear regression analysis weighed by sample size. In a second stage, we fitted several linear regression models introducing sequentially the variables according to the predefined hierarchy. In the final hierarchical model Gini Index, crowding, sanitation, variation in infant mortality rates and homicide rates, explained great part of the variance in asthma prevalence between centers (R(2) = 75.0 %). We found a strong association between socioeconomic and environmental variables and prevalence of asthma symptoms in LA urban children, and according to our hierarchical framework and the results found we suggest that social inequalities (measured by the Gini Index) is a central determinant to explain high prevalence of asthma in LA.
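The sequential, hierarchical-entry regression described above can be sketched as follows: variable blocks enter in the predefined order distal, then intermediate, then proximate, each fitted by sample-size-weighted least squares, and the gain in weighted R² at each step shows what each level adds. All data, variable names, and effect sizes here are synthetic, with one variable per level for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 31                                          # one row per urban center
w = rng.integers(800, 4000, n).astype(float)    # sample-size weights
gini = rng.normal(0.5, 0.05, n)                 # distal level
crowding = rng.normal(1.2, 0.3, n)              # intermediate level
sanitation = rng.normal(0.8, 0.1, n)            # proximate level
asthma = 10 + 30 * gini + 5 * crowding - 8 * sanitation + rng.normal(0, 1, n)

def weighted_r2(cols, y, w):
    """Fit weighted least squares and return the weighted R^2."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    resid = y - X @ beta
    ybar = np.average(y, weights=w)
    return 1 - np.sum(w * resid**2) / np.sum(w * (y - ybar)**2)

# Sequential entry: each step keeps the previous levels' variables
r2_distal = weighted_r2([gini], asthma, w)
r2_interm = weighted_r2([gini, crowding], asthma, w)
r2_proxim = weighted_r2([gini, crowding, sanitation], asthma, w)
```

Because the models are nested and share the same weights, the weighted R² can only increase as each level enters.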
Hierarchical control and performance evaluation of multi-vehicle autonomous systems
NASA Astrophysics Data System (ADS)
Balakirsky, Stephen; Scrapper, Chris; Messina, Elena
2005-05-01
This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.
2003-01-01
Object-Relational database management systems take an integrated hybrid cooperative approach, combining the best practices of both the relational model utilizing SQL queries and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)
2002-01-01
Object-Relational database management systems take an integrated hybrid cooperative approach, combining the best practices of both the relational model utilizing SQL queries and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
Decentralized cooperative TOA/AOA target tracking for hierarchical wireless sensor networks.
Chen, Ying-Chih; Wen, Chih-Yu
2012-11-08
This paper proposes a distributed method for cooperative target tracking in hierarchical wireless sensor networks. The concept of leader-based information processing is conducted to achieve object positioning, considering a cluster-based network topology. Random timers and local information are applied to adaptively select a sub-cluster for the localization task. The proposed energy-efficient tracking algorithm allows each sub-cluster member to locally estimate the target position with a Bayesian filtering framework and a neural networking model, and further performs estimation fusion in the leader node with the covariance intersection algorithm. This paper evaluates the merits and trade-offs of the protocol design towards developing more efficient and practical algorithms for object position estimation.
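The estimation-fusion step at the leader node uses the covariance intersection algorithm, which consistently combines estimates whose cross-correlations are unknown. A minimal sketch, with invented measurements and a simple grid search over the mixing weight:

```python
import numpy as np

# Covariance intersection (CI) of two estimates (x1, P1) and (x2, P2):
#   P^-1 = w*P1^-1 + (1-w)*P2^-1
#   x    = P (w*P1^-1 x1 + (1-w)*P2^-1 x2)
# with the weight w in [0, 1] chosen here to minimize trace(P).

def covariance_intersection(x1, P1, x2, P2, grid=101):
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, grid):
        P = np.linalg.inv(w * I1 + (1 - w) * I2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * I1 @ x1 + (1 - w) * I2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Two member estimates of a 2-D target position (invented numbers):
x1 = np.array([1.0, 2.0]); P1 = np.diag([1.0, 4.0])
x2 = np.array([1.2, 1.8]); P2 = np.diag([4.0, 1.0])
x_fused, P_fused = covariance_intersection(x1, P1, x2, P2)
```

Because the two members are confident about complementary axes, the fused covariance has a smaller trace than either input while the fused mean stays between the member estimates.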
Hierarchical Metal–Organic Framework Hybrids: Perturbation-Assisted Nanofusion Synthesis
Yue, Yanfeng; Fulvio, Pasquale F.; Dai, Sheng
2015-12-04
Metal–organic frameworks (MOFs) represent a new family of microporous materials; however, microporous–mesoporous hierarchical MOF materials have been less investigated because of the lack of simple, reliable methods to introduce mesopores to the crystalline microporous particles. State-of-the-art MOF hierarchical materials have been prepared by ligand extension methods or by using a template, resulting in intrinsic mesopores of longer ligands or replicated pores from template agents, respectively. However, mesoporous MOF materials obtained through ligand extension often collapse in the absence of guest molecules, which dramatically reduces the size of the pore aperture. Although the template-directed strategy allows for the preparation of hierarchical materials with larger mesopores, the latter requires a template removal step, which may result in the collapse of the implemented mesopores. Recently, a general template-free synthesis of hierarchical microporous crystalline frameworks, such as MOFs and Prussian blue analogues (PBAs), has been reported. Our new method is based on kinetically controlled precipitation (perturbation), with simultaneous condensation and redissolution of polymorphic nanocrystallites in the mother liquor. This method further eliminates the use of extended organic ligands, and the micropores do not collapse upon removal of trapped guest solvent molecules, thus yielding hierarchical MOF materials with intriguing porosity on the gram scale. The hierarchical MOF materials prepared in this way exhibited exceptional properties when tested for the adsorption of large organic dyes, compared with their corresponding microporous frameworks, due to the enhanced pore accessibility and electrolyte diffusion within the mesopores. As for PBAs, the pore size distribution of these materials can be tailored by changing the metals substituting Fe cations in the PB lattice.
For these, the textural mesopores increased from approximately 10 nm for the Cu analogue (mesoCuHCF), to 16 nm in the Co-substituted compound (mesoCoHCF), and to as large as 30 nm for the Ni derivative (mesoNiHCF). While bulk PB and its analogues have a higher capacitance than hierarchical analogues for Na-batteries, the increased accessibility of the microporous channels of PBAs allows for faster intercalated ion exchange and diffusion than in bulk PBA crystals. Therefore, hierarchical PBAs are promising candidates for electrodes in future electrochemical energy storage devices with faster charge–discharge rates than batteries, namely pseudocapacitors. Finally, this new synthetic method opens the possibility of preparing hierarchical materials having a bimodal distribution of mesopores, and of tailoring the structural properties of MOFs for different applications, including contrast agents for MRI and drug delivery.
Michael Burke; Klaus Jorde; John M. Buffington
2009-01-01
River systems have been altered worldwide by dams and diversions, resulting in a broad array of environmental impacts. The use of a process-based, hierarchical framework for assessing environmental impacts of dams is explored here in terms of a case study of the Kootenai River, western North America. The goal of the case study is to isolate and quantify the relative...
Xi, Kai; Cao, Shuai; Peng, Xiaoyu; Ducati, Caterina; Kumar, R Vasant; Cheetham, Anthony K
2013-03-18
This paper presents a novel method and rationale for utilizing carbonized MOFs for sulphur loading to fabricate cathode structures for lithium-sulphur batteries. Unique carbon materials with differing hierarchical pore structures were synthesized from four types of zinc-containing metal-organic frameworks (MOFs). It is found that cathode materials made from MOFs-derived carbons with higher mesopore (2-50 nm) volumes exhibit increased initial discharge capacities, whereas carbons with higher micropore (<2 nm) volumes lead to cathode materials with better cycle stability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banerjee, Debasis; Elsaidi, Sameh K.; Aguila, Briana
2016-10-20
Efficient and cost-effective removal of radioactive pertechnetate anions from nuclear waste is a key challenge to mitigate long-term nuclear waste storage issues. Traditional materials such as resins and layered double hydroxides (LDHs) were evaluated for their pertechnetate or perrhenate (the non-radioactive surrogate) removal capacity, but there is room for improvement in terms of capacity, selectivity and kinetics. A series of functionalized hierarchical porous frameworks were evaluated for their perrhenate removal capacity in the presence of other competing anions.
Buckland, Stephen T.; King, Ruth; Toms, Mike P.
2015-01-01
The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero‐inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean‐variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. PMID:25737026
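Why a Tweedie model suits continuous data with a spike at zero can be seen from its compound Poisson-gamma representation (for power parameter 1 < p < 2): the variable is exactly zero with positive probability and continuous otherwise. A small simulation sketch, with illustrative parameter values unrelated to the bird-count data:

```python
import numpy as np

def tweedie_sample(mu, phi, p, size, rng):
    """Draw Tweedie(mu, phi, p) samples, 1 < p < 2, via the
    compound Poisson-gamma representation: a Poisson number of
    gamma summands, zero when the Poisson count is zero."""
    lam = mu**(2 - p) / (phi * (2 - p))      # Poisson rate
    alpha = (2 - p) / (p - 1)                # gamma shape per summand
    scale = phi * (p - 1) * mu**(p - 1)      # gamma scale
    counts = rng.poisson(lam, size)
    # Sum of k iid Gamma(alpha, scale) is Gamma(k*alpha, scale)
    return np.array([rng.gamma(alpha * k, scale) if k > 0 else 0.0
                     for k in counts])

rng = np.random.default_rng(4)
y = tweedie_sample(mu=5.0, phi=2.0, p=1.5, size=5000, rng=rng)
zero_frac = (y == 0).mean()                  # exact zeros: exp(-lam)
```

With these parameters roughly a tenth of the draws are exactly zero, while the sample mean matches mu, which is the behaviour the regression model exploits.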
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
Summary An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
Bayesian multivariate hierarchical transformation models for ROC analysis.
O'Malley, A James; Zou, Kelly H
2006-02-15
A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box-Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial.
Bayesian multivariate hierarchical transformation models for ROC analysis
O'Malley, A. James; Zou, Kelly H.
2006-01-01
SUMMARY A Bayesian multivariate hierarchical transformation model (BMHTM) is developed for receiver operating characteristic (ROC) curve analysis based on clustered continuous diagnostic outcome data with covariates. Two special features of this model are that it incorporates non-linear monotone transformations of the outcomes and that multiple correlated outcomes may be analysed. The mean, variance, and transformation components are all modelled parametrically, enabling a wide range of inferences. The general framework is illustrated by focusing on two problems: (1) analysis of the diagnostic accuracy of a covariate-dependent univariate test outcome requiring a Box–Cox transformation within each cluster to map the test outcomes to a common family of distributions; (2) development of an optimal composite diagnostic test using multivariate clustered outcome data. In the second problem, the composite test is estimated using discriminant function analysis and compared to the test derived from logistic regression analysis where the gold standard is a binary outcome. The proposed methodology is illustrated on prostate cancer biopsy data from a multi-centre clinical trial. PMID:16217836
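The role of the Box-Cox transformation in problem (1) can be sketched with synthetic data (not the trial's biopsy data): a monotone transform maps skewed test outcomes toward normality for the parametric ROC model, while the empirical AUC, which depends only on ranks, is unchanged by it.

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform: log for lam = 0, else (y^lam - 1)/lam."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def empirical_auc(pos, neg):
    """P(pos > neg) + 0.5 * P(tie), by pairwise comparison."""
    diff = np.asarray(pos)[:, None] - np.asarray(neg)[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

rng = np.random.default_rng(2)
neg = rng.lognormal(mean=0.0, sigma=0.5, size=200)   # non-diseased
pos = rng.lognormal(mean=0.8, sigma=0.5, size=200)   # diseased

auc_raw = empirical_auc(pos, neg)
# lam = 0 (log) makes these lognormal outcomes exactly normal
auc_bc = empirical_auc(box_cox(pos, 0.0), box_cox(neg, 0.0))
```

The two AUC values agree exactly, illustrating that the transformation matters for the parametric mean-variance modelling rather than for the nonparametric discrimination measure.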
The rational choice model in family decision making at the end of life.
Karasz, Alison; Sacajiu, Galit; Kogan, Misha; Watkins, Liza
2010-01-01
Most end-of-life decisions are made by family members. Current ethical guidelines for family decision making are based on a hierarchical model that emphasizes the patient's wishes over his or her best interests. Evidence suggests that the model poorly reflects the strategies and priorities of many families. Researchers observed and recorded 26 decision-making meetings between hospital staff and family members. Semi-structured follow-up interviews were conducted. Transcriptions were analyzed using qualitative techniques. For both staff and families, consideration of a patient's best interests generally took priority over the patient's wishes. Staff generally introduced discussion of the patient's wishes for rhetorical purposes, such as persuasion. Competing moral frameworks, which de-emphasized the salience of patients' autonomy and "right to choose," played a role in family decision making. The priority given to the patients' wishes in the hierarchical model does not reflect the priorities of staff and families in making decisions about end-of-life care.
Hierarchical Marginal Land Assessment for Land Use Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Shujiang; Post, Wilfred M; Wang, Dali
2013-01-01
Marginal land provides an alternative potential for food and bioenergy production in the face of limited land resources; however, effective assessment of marginal lands is not well addressed. Concerns over environmental risks, ecosystem services and sustainability for marginal land have been widely raised. The objective of this study was to develop a hierarchical marginal land assessment framework for land use planning and management. We first identified major land functions linking production, environment, ecosystem services and economics, and then classified land resources into four categories of marginal land using suitability and limitations associated with major management goals: physically marginal land, biologically marginal land, environmental-ecological marginal land, and economically marginal land. We tested this assessment framework in south-western Michigan, USA. Our results indicated that this marginal land assessment framework is potentially feasible for land use planning for food and bioenergy production and for balancing multiple goals of land use management. We also compared our results with marginal land assessments from the Conservation Reserve Program (CRP) and land capability classes (LCC) that are used in the US. The hierarchical assessment framework has the advantage of quantitatively reflecting land functions and multiple concerns, providing a foundation upon which focused studies can be identified; quantifying high-resolution land functions associated with environment and ecosystem services, as well as their criteria, will be needed to further improve the assessment framework.
Vassena, Eliana; Deraeve, James; Alexander, William H
2017-10-01
Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. 
Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO model based on hierarchical error prediction, developed to explain MPFC-DLPFC interactions. We derive behavioral predictions that describe how effort and reward information is coded in PFC and how changing the configuration of such environmental information might affect decision-making and task performance involving motivation.
Hierarchical Decentralized Control Strategy for Demand-Side Primary Frequency Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jianming; Hansen, Jacob; Marinovici, Laurentiu D.
The Grid Friendly™ Appliance (GFA) controller, developed at Pacific Northwest National Laboratory, was designed for the purpose of autonomously switching off appliances by detecting under-frequency events. In this paper, a new frequency responsive load (FRL) controller is first proposed by extending the functionality of the original GFA controller. The proposed FRL controller can autonomously switch on (or off) end-use loads by detecting over-frequency (or under-frequency) events through local frequency measurement. Then, a hierarchical decentralized control framework is developed for engaging the end-use loads to provide primary frequency response with the proposed FRL controller. The developed framework has several important features that are desirable in terms of providing primary frequency control. It not only exclusively maintains the autonomous operation of the end-use loads, but also effectively overcomes the stability issue associated with high penetration of FRLs. The simulation results illustrate the effectiveness of the developed hierarchical control framework for providing primary frequency response with the proposed FRL controller.
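The local switching behaviour of a frequency-responsive load can be sketched as follows. The thresholds, dead band, and class interface here are illustrative assumptions for a 60 Hz system, not the controller's actual design parameters.

```python
# Each load watches only its local frequency measurement: it sheds
# itself on an under-frequency event, reconnects on an over-frequency
# event, and holds its state inside the dead band.

NOMINAL_HZ = 60.0

class FrequencyResponsiveLoad:
    def __init__(self, under_hz=59.95, over_hz=60.05, on=True):
        self.under_hz = under_hz     # shed at or below this frequency
        self.over_hz = over_hz       # switch on at or above this
        self.on = on

    def step(self, measured_hz):
        """Update the switch state from one local frequency sample."""
        if measured_hz <= self.under_hz:
            self.on = False          # under-frequency: shed load
        elif measured_hz >= self.over_hz:
            self.on = True           # over-frequency: absorb power
        return self.on               # in the dead band: hold state

load = FrequencyResponsiveLoad()
states = [load.step(f) for f in (60.0, 59.9, 59.99, 60.06, 60.0)]
```

Note the hysteresis: at 59.99 Hz the load stays off because reconnection requires crossing the upper threshold, which avoids rapid on/off chatter near the shed frequency.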
Liu, Xiao; Qi, Wei; Wang, Yuefei; Su, Rongxin; He, Zhimin
2017-11-16
Metal-organic frameworks (MOFs) have drawn extensive research interest as candidates for enzyme immobilization owing to their tunable porosity, high surface area, and excellent chemical/thermal stability. Herein, we report a facile and universal strategy for enzyme immobilization using highly stable hierarchically porous metal-organic frameworks (HP-MOFs). The HP-MOFs were stable over a wide pH range (pH = 2-11 for HP-DUT-5) and met the catalysis conditions of most enzymes. The as-prepared hierarchical micro/mesoporous MOFs with mesoporous defects showed a superior adsorption capacity towards enzymes. The maximum adsorption capacity of HP-DUT-5 for glucose oxidase (GOx) and uricase was 208 mg g⁻¹ and 225 mg g⁻¹, respectively. Furthermore, we constructed two multi-enzyme biosensors for glucose and uric acid (UA) by immobilizing GOx and uricase with horseradish peroxidase (HRP) on HP-DUT-5, respectively. These sensors were efficiently applied in the colorimetric detection of glucose and UA and showed good sensitivity, selectivity, and recyclability.
Individual differences in attention influence perceptual decision making.
Nunez, Michael D; Srinivasan, Ramesh; Vandekerckhove, Joachim
2015-01-01
Sequential sampling decision-making models have been successful in accounting for reaction time (RT) and accuracy data in two-alternative forced choice tasks. These models have been used to describe the behavior of populations of participants, and explanatory structures have been proposed to account for between individual variability in model parameters. In this study we show that individual differences in behavior from a novel perceptual decision making task can be attributed to (1) differences in evidence accumulation rates, (2) differences in variability of evidence accumulation within trials, and (3) differences in non-decision times across individuals. Using electroencephalography (EEG), we demonstrate that these differences in cognitive variables, in turn, can be explained by attentional differences as measured by phase-locking of steady-state visual evoked potential (SSVEP) responses to the signal and noise components of the visual stimulus. Parameters of a cognitive model (a diffusion model) were obtained from accuracy and RT distributions and related to phase-locking indices (PLIs) of SSVEPs with a single step in a hierarchical Bayesian framework. Participants who were able to suppress the SSVEP response to visual noise in high frequency bands were able to accumulate correct evidence faster and had shorter non-decision times (preprocessing or motor response times), leading to more accurate responses and faster response times. We show that the combination of cognitive modeling and neural data in a hierarchical Bayesian framework relates physiological processes to the cognitive processes of participants, and that a model with a new (out-of-sample) participant's neural data can predict that participant's behavior more accurately than models without physiological data.
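A diffusion model of the kind fitted here can be simulated directly: evidence accumulates with drift toward one of two boundaries, and RT is the accumulation time plus a non-decision time. The parameter values below are illustrative, and the simulation uses simple Euler steps rather than the hierarchical Bayesian fitting machinery of the study; it shows the core prediction that a higher evidence accumulation rate yields both higher accuracy and shorter RTs.

```python
import numpy as np

def simulate_diffusion(drift, boundary=1.0, ndt=0.3, dt=0.002,
                       noise=1.0, n_trials=500, seed=3):
    """Simulate two-choice decisions: a random walk with the given
    drift between boundaries at +/- boundary. Returns mean RT and
    the fraction of trials ending at the correct (upper) boundary."""
    rng = np.random.default_rng(seed)
    step_sd = noise * np.sqrt(dt)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + step_sd * rng.standard_normal()
            t += dt
        rts.append(ndt + t)               # add non-decision time
        correct.append(x >= boundary)
    return float(np.mean(rts)), float(np.mean(correct))

rt_slow, acc_slow = simulate_diffusion(drift=0.5)
rt_fast, acc_fast = simulate_diffusion(drift=2.0)
```

Doubling or quadrupling the drift rate, which the study links to suppression of SSVEP responses to visual noise, moves the simulated participant toward faster and more accurate responding.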
Cannon, Robert C; Gleeson, Padraig; Crook, Sharon; Ganapathy, Gautham; Marin, Boris; Piasini, Eugenio; Silver, R Angus
2014-01-01
Computational models are increasingly important for studying complex neurophysiological systems. As scientific tools, it is essential that such models can be reproduced and critically evaluated by a range of scientists. However, published models are currently implemented using a diverse set of modeling approaches, simulation tools, and computer languages making them inaccessible and difficult to reproduce. Models also typically contain concepts that are tightly linked to domain-specific simulators, or depend on knowledge that is described exclusively in text-based documentation. To address these issues we have developed a compact, hierarchical, XML-based language called LEMS (Low Entropy Model Specification), that can define the structure and dynamics of a wide range of biological models in a fully machine readable format. We describe how LEMS underpins the latest version of NeuroML and show that this framework can define models of ion channels, synapses, neurons and networks. Unit handling, often a source of error when reusing models, is built into the core of the language by specifying physical quantities in models in terms of the base dimensions. We show how LEMS, together with the open source Java and Python based libraries we have developed, facilitates the generation of scripts for multiple neuronal simulators and provides a route for simulator free code generation. We establish that LEMS can be used to define models from systems biology and map them to neuroscience-domain specific simulators, enabling models to be shared between these traditionally separate disciplines. LEMS and NeuroML 2 provide a new, comprehensive framework for defining computational models of neuronal and other biological systems in a machine readable format, making them more reproducible and increasing the transparency and accessibility of their underlying structure and properties.
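The base-dimension bookkeeping that LEMS builds into its core can be illustrated with a tiny sketch (an illustration of the concept only, not the LEMS implementation): each quantity carries exponents over the base dimensions, products and quotients derive new dimensions, and addition checks that dimensions match.

```python
from dataclasses import dataclass

# Dimension exponents over the base tuple (mass, length, time, current).
VOLT = (1, 2, -3, -1)
AMP = (0, 0, 0, 1)
OHM = (1, 2, -3, -2)

@dataclass(frozen=True)
class Quantity:
    value: float
    dims: tuple

    def __mul__(self, other):
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __truediv__(self, other):
        return Quantity(self.value / other.value,
                        tuple(a - b for a, b in zip(self.dims, other.dims)))

    def __add__(self, other):
        if self.dims != other.dims:
            raise ValueError("dimension mismatch")
        return Quantity(self.value + other.value, self.dims)

v = Quantity(0.07, VOLT)      # a membrane potential, 70 mV
i = Quantity(1e-9, AMP)       # a channel current, 1 nA
r = v / i                     # division derives the resistance dimension
```

Errors such as adding a voltage to a current are caught structurally, which is the unit-handling guarantee the language bakes in by expressing quantities in terms of base dimensions.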
Cannon, Robert C.; Gleeson, Padraig; Crook, Sharon; Ganapathy, Gautham; Marin, Boris; Piasini, Eugenio; Silver, R. Angus
2014-01-01
Computational models are increasingly important for studying complex neurophysiological systems. As scientific tools, it is essential that such models can be reproduced and critically evaluated by a range of scientists. However, published models are currently implemented using a diverse set of modeling approaches, simulation tools, and computer languages making them inaccessible and difficult to reproduce. Models also typically contain concepts that are tightly linked to domain-specific simulators, or depend on knowledge that is described exclusively in text-based documentation. To address these issues we have developed a compact, hierarchical, XML-based language called LEMS (Low Entropy Model Specification), that can define the structure and dynamics of a wide range of biological models in a fully machine readable format. We describe how LEMS underpins the latest version of NeuroML and show that this framework can define models of ion channels, synapses, neurons and networks. Unit handling, often a source of error when reusing models, is built into the core of the language by specifying physical quantities in models in terms of the base dimensions. We show how LEMS, together with the open source Java and Python based libraries we have developed, facilitates the generation of scripts for multiple neuronal simulators and provides a route for simulator free code generation. We establish that LEMS can be used to define models from systems biology and map them to neuroscience-domain specific simulators, enabling models to be shared between these traditionally separate disciplines. LEMS and NeuroML 2 provide a new, comprehensive framework for defining computational models of neuronal and other biological systems in a machine readable format, making them more reproducible and increasing the transparency and accessibility of their underlying structure and properties. PMID:25309419
NASA Astrophysics Data System (ADS)
Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.
2017-05-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2017-12-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.
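The grouped variance-based index described here can be illustrated with a pick-freeze Monte Carlo estimator. This is a minimal sketch under toy assumptions (a two-input linear model; all function names are hypothetical), not the authors' implementation:

```python
import random
import statistics

def grouped_sobol_index(model, sample_group, sample_rest, n=50000, seed=7):
    """Pick-freeze estimate of the first-order Sobol index for a GROUP of
    inputs G: S_G = Cov(Y, Y') / Var(Y), where Y = f(G, R) and Y' = f(G, R')
    share the same group draw but redraw the remaining inputs R."""
    rng = random.Random(seed)
    y, y_frozen = [], []
    for _ in range(n):
        g = sample_group(rng)                        # one draw for the whole group
        y.append(model(g, sample_rest(rng)))
        y_frozen.append(model(g, sample_rest(rng)))  # same g, fresh rest
    my, myf = statistics.fmean(y), statistics.fmean(y_frozen)
    cov = statistics.fmean(a * b for a, b in zip(y, y_frozen)) - my * myf
    return cov / statistics.variance(y)

# Toy model: Y = G + R with Var(G) = 4 and Var(R) = 1, so the exact
# grouped index is S_G = 4 / (4 + 1) = 0.8.
s_g = grouped_sobol_index(lambda g, r: g + r,
                          lambda rng: rng.gauss(0.0, 2.0),
                          lambda rng: rng.gauss(0.0, 1.0))
```

Grouping inputs this way is what keeps the dimensionality manageable: one index is estimated per group rather than per parameter.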
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Elshall, A. S.; Hanor, J. S.
2012-12-01
Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental principles, such as mathematical expressions, on one hand, and empirical observations, such as observation data, on the other, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. For geological structure uncertainty, the study considered two alternative structures of the Denham Springs-Scotlandville fault. For data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models.
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
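The level-by-level segregation of uncertainty that HBMA provides can be sketched with the law of total variance applied over a tree of weighted models. The tree, weights and numbers below are invented for illustration; this is not the calibrated 24-model study:

```python
def hbma(node):
    """Recursively combine a hierarchy of models (a sketch of the HBMA idea):
    each branch carries a model weight, leaves carry a prediction mean and a
    within-model variance. Returns (mean, within-model variance, list of
    between-model variance contributions, top level first)."""
    if "mean" in node:                               # leaf model
        return node["mean"], node.get("var", 0.0), []
    stats = [hbma(child) for child in node["children"]]
    weights = [child["weight"] for child in node["children"]]
    mean = sum(w * m for w, (m, _, _) in zip(weights, stats))
    between = sum(w * (m - mean) ** 2 for w, (m, _, _) in zip(weights, stats))
    within = sum(w * v for w, (_, v, _) in zip(weights, stats))
    depth = max(len(levels) for _, _, levels in stats)
    deeper = [sum(w * (lv[i] if i < len(lv) else 0.0)
                  for w, (_, _, lv) in zip(weights, stats))
              for i in range(depth)]
    return mean, within, [between] + deeper

# Two uncertainty sources -> a two-level tree of four hypothetical models.
tree = {"children": [
    {"weight": 0.6, "children": [
        {"weight": 0.7, "mean": 1.0, "var": 0.2},
        {"weight": 0.3, "mean": 2.0, "var": 0.1}]},
    {"weight": 0.4, "children": [
        {"weight": 0.5, "mean": 3.0, "var": 0.3},
        {"weight": 0.5, "mean": 5.0, "var": 0.3}]}]}
mean, within, levels = hbma(tree)
```

The per-level `levels` entries are what allow uncertainty sources to be prioritized: the level contributing the most between-model variance dominates the prediction uncertainty.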
Hierarchical Bayesian inference of the initial mass function in composite stellar populations
NASA Astrophysics Data System (ADS)
Dries, M.; Trager, S. C.; Koopmans, L. V. E.; Popping, G.; Somerville, R. S.
2018-03-01
The initial mass function (IMF) is a key ingredient in many studies of galaxy formation and evolution. Although the IMF is often assumed to be universal, there is continuing evidence that it is not. Spectroscopic studies that derive the IMF of the unresolved stellar populations of a galaxy often assume that the galaxy's spectrum can be described by a single stellar population (SSP). To alleviate these limitations, in this paper we have developed a hierarchical Bayesian framework for modelling composite stellar populations (CSPs). Within this framework, we use a parametrized IMF prior to regulate a direct inference of the IMF. We use this new framework to determine the number of SSPs that is required to fit a set of realistic CSP mock spectra. The CSP mock spectra that we use are based on semi-analytic models and have an IMF that varies as a function of stellar velocity dispersion of the galaxy. Our results suggest that using a single SSP biases the determination of the IMF slope to a higher value than the true slope, although the trend with stellar velocity dispersion is overall recovered. If we include more SSPs in the fit, the Bayesian evidence increases significantly and the inferred IMF slopes of our mock spectra converge, within the errors, to their true values. Most of the bias is already removed by using two SSPs instead of one. We show that we can reconstruct the variable IMF of our mock spectra for signal-to-noise ratios exceeding ~75.
Knebel, Alexander; Wulfert-Holzmann, Paul; Friebe, Sebastian; Pavel, Janet; Strauß, Ina; Mundstock, Alexander; Steinbach, Frank; Caro, Jürgen
2018-04-17
Membranes from metal-organic frameworks (MOFs) are highly interesting for industrial gas separation applications. Strongly improved performance for carbon capture and H2 purification tasks in MOF membranes is obtained by using highly reproducible and very accurately, hierarchically grown ZIF-8-on-ZIF-67 (ZIF-8@ZIF-67) nanostructures. To forgo hardly controllable solvothermal synthesis, particles and layers are prepared by self-assembly methods. It was possible for the first time to confirm ZIF-8-on-ZIF-67 membrane growth on rough and porous ceramic supports using layer-by-layer deposition. Additionally, hierarchical particles are made in a fast room-temperature synthesis with high monodispersity. Characterization of the hierarchically and epitaxially grown layers and particles is performed by SEM, TEM, EDXM and gas permeation. The ZIF-8@ZIF-67 system shows a nearly doubled H2/CO2 separation factor, regardless of whether it is used as a neat membrane or a mixed-matrix membrane, in comparison to other MOF materials. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Morphew, Daniel; Shaw, James; Avins, Christopher; Chakrabarti, Dwaipayan
2018-03-27
Colloidal self-assembly is a promising bottom-up route to a wide variety of three-dimensional structures, from clusters to crystals. Programming hierarchical self-assembly of colloidal building blocks, which can give rise to structures ordered at multiple levels to rival biological complexity, poses a multiscale design problem. Here we explore a generic design principle that exploits a hierarchy of interaction strengths and employ this design principle in computer simulations to demonstrate the hierarchical self-assembly of triblock patchy colloidal particles into two distinct colloidal crystals. We obtain cubic diamond and body-centered cubic crystals via distinct clusters of uniform size and shape, namely, tetrahedra and octahedra, respectively. Such a conceptual design framework has the potential to reliably encode hierarchical self-assembly of colloidal particles into a high level of sophistication. Moreover, the design framework underpins a bottom-up route to cubic diamond colloidal crystals, which have remained elusive despite being much sought after for their attractive photonic applications.
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Peng, Zhikun; Liu, Xu; Meng, Huan; Li, Zhongjun; Li, Baojun; Liu, Zhongyi; Liu, Shouchang
2017-02-08
In this work, RuO2 honeycomb networks (RHCs) and hollow spherical structures (RHSs) were rationally designed and synthesized with modified SiO2 as a sacrificial template via two hydrothermal approaches. At a high current density of 20 A g-1, the two hierarchical porous RuO2·xH2O frameworks showed specific capacitances as high as 628 and 597 F g-1, corresponding to about 80% and 75% retention of the capacitance at 0.5 A g-1 for RHCs and RHSs, respectively. Even after 4000 cycles at 5 A g-1, the RHCs and RHSs still retain 86% and 91% of their initial specific capacitances, respectively. These two hierarchical frameworks have well-defined pathways that benefit the transmission/diffusion of electrolyte and surface redox reactions. As a result, they exhibit good supercapacitor performance in both acid (H2SO4) and alkaline (KOH) electrolytes. Compared to the bulk RuO2 structure and similar RuO2 counterparts reported for pseudocapacitors, the two hierarchical porous RuO2·xH2O frameworks have better energy storage capability, high-rate performance, and excellent cycling stability.
NASA Astrophysics Data System (ADS)
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
NASA Astrophysics Data System (ADS)
Western, A. W.; Lintern, A.; Liu, S.; Ryu, D.; Webb, J. A.; Leahy, P.; Wilson, P.; Waters, D.; Bende-Michl, U.; Watson, M.
2016-12-01
Many streams, lakes and estuaries are experiencing increasing concentrations and loads of nutrient and sediments. Models that can predict the spatial and temporal variability in water quality of aquatic systems are required to help guide the management and restoration of polluted aquatic systems. We propose that a Bayesian hierarchical modelling framework could be used to predict water quality responses over varying spatial and temporal scales. Stream water quality data and spatial data of catchment characteristics collected throughout Victoria and Queensland (in Australia) over two decades will be used to develop this Bayesian hierarchical model. In this paper, we present the preliminary exploratory data analysis required for the development of the Bayesian hierarchical model. Specifically, we present the results of exploratory data analysis of Total Nitrogen (TN) concentrations in rivers in Victoria (in South-East Australia) to illustrate the catchment characteristics that appear to be influencing spatial variability in (1) mean concentrations of TN; and (2) the relationship between discharge and TN throughout the state. These important catchment characteristics were identified using: (1) monthly TN concentrations measured at 28 water quality gauging stations and (2) climate, land use, topographic and geologic characteristics of the catchments of these 28 sites. Spatial variability in TN concentrations had a positive correlation to fertiliser use in the catchment and average temperature. There were negative correlations between TN concentrations and catchment forest cover, annual runoff, runoff perenniality, soil erosivity and catchment slope. The relationship between discharge and TN concentrations showed spatial variability, possibly resulting from climatic and topographic differences between the sites. The results of this study will feed into the hierarchical Bayesian model of river water quality.
[The system-oriented model of psychosocial rehabilitation].
Iastrebov V S; Mitikhin, V G; Solokhina, T A; Mikhaĭlova, I I
2008-01-01
A model of psychosocial rehabilitation based on the system approach has been developed; it takes into account the patient-centered approach of the rehabilitation service, the development of its resource base, and the effectiveness and patterns of the care system as a whole. Within this model, the authors distinguish three basic stages of the psychosocial rehabilitation process: evaluation and planning, the rehabilitation interventions themselves, and achievement of the result. In the authors' opinion, the most successful way to construct a modern model of psychosocial rehabilitation is hierarchical modeling, which can reveal the complex chain of interactions between all participants of the rehabilitation process and the factors involved, while specifying the multi-level hierarchical character of these interactions and factors. An important advantage of this method is the possibility of obtaining both static and dynamic evaluations of the rehabilitation service's activity, which may be used at the following levels: 1) the patient; 2) his/her close environment; 3) the macrosocial level. The obvious merits of the system-oriented model include the possibility of applying its principles to the organization of specialized care for psychiatric patients at the local, regional and federal levels. The authors emphasize that hierarchical models are universal in character and can be implemented in information-analytical systems aimed at monitoring and analyzing the activity of social-medical services in order to increase their effectiveness.
Du, Pengcheng; Dong, Yuman; Liu, Chang; Wei, Wenli; Liu, Dong; Liu, Peng
2018-05-15
A hierarchical porous nickel-based metal-organic framework (Ni-MOF) constructed from nanosheets is fabricated by a facile hydrothermal process in the presence of trimesic acid and nickel ions. Various Ni-MOF structures can be obtained by adjusting the molar ratio of trimesic acid to nickel ions; the resulting hierarchical porous Ni-MOF exhibits an optimal porous structure and the largest specific surface area. The hierarchical porous structure constructed from nanosheets supplies more active sites for electrochemical reactions, realizing excellent electrochemical properties: the hierarchical porous Ni-MOF delivers an outstanding specific capacitance of 1057 F/g at a current density of 1 A/g and a high specific capacitance of 649 F/g at 30 A/g, corresponding to a good rate capability of 63.4% even up to 30 A/g. The hierarchical porous Ni-MOF retains 70% of its original capacitance after 2500 charge-discharge cycles at a current density of 10 A/g. Furthermore, asymmetric supercapacitors (ASCs) were assembled from the hierarchical porous Ni-MOF and activated carbon (AC); the ASCs show a specific capacitance of 87 F/g at a current density of 0.5 A/g, a high energy density of 21.05 Wh/kg, and a power density of 6.03 kW/kg. Additionally, the tandem ASCs can light up a red LED. The hierarchical porous Ni-MOF shows promise for high-performance supercapacitors. Copyright © 2018 Elsevier Inc. All rights reserved.
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
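The data-augmentation idea above — treating elicited cause-of-death probabilities as prior predictive values for a latent cause — can be sketched with a small Gibbs sampler. The two-cause setup and all data below are hypothetical toy values; the actual model includes covariates and survival components not reproduced here:

```python
import random

def gibbs_cause_specific(elicited, n_iter=2000, seed=3):
    """Data augmentation for uncertain cause-of-death. `elicited[i][k]` is the
    observer's probability that death i was due to cause k. Latent causes and
    cause proportions theta are alternately sampled; returns the posterior
    mean of theta under a flat Dirichlet(1,...,1) prior."""
    rng = random.Random(seed)
    K = len(elicited[0])
    theta = [1.0 / K] * K
    totals = [0.0] * K
    for it in range(n_iter):
        counts = [1.0] * K                   # Dirichlet(1) prior pseudo-counts
        for probs in elicited:
            w = [p * t for p, t in zip(probs, theta)]
            u, c = rng.random() * sum(w), 0  # categorical draw for latent cause
            while u > w[c]:
                u -= w[c]
                c += 1
            counts[c] += 1
        draws = [rng.gammavariate(a, 1.0) for a in counts]   # Dirichlet sample
        s = sum(draws)
        theta = [d / s for d in draws]
        if it >= n_iter // 2:                # discard first half as burn-in
            totals = [t + th for t, th in zip(totals, theta)]
    m = n_iter - n_iter // 2
    return [t / m for t in totals]

# 200 simulated deaths, true cause proportions (0.7, 0.3); observers assign
# 90% belief to the true cause of each death.
rng = random.Random(1)
elicited = [([0.9, 0.1] if rng.random() < 0.7 else [0.1, 0.9])
            for _ in range(200)]
theta_hat = gibbs_cause_specific(elicited)
```

Because the elicited probabilities enter the latent-cause update rather than being treated as hard labels, observer uncertainty propagates into the posterior for theta, widening it relative to the assign-the-most-likely-cause approach.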
Jefferson, Therese; Klass, Des; Lord, Linley; Nowak, Margaret; Thomas, Gail
2014-01-01
Leadership studies which focus on categorising leadership styles have been critiqued for failure to consider the lived experience of leadership. The purpose of this paper is to use the framework of Jepson's model of contextual dynamics to explore whether this framework assists understanding of the "how and why" of lived leadership experience within the nursing profession. Themes for a purposeful literature search and review, having regard to the Jepson model, are drawn from the contemporary and dynamic context of nursing. Government reports, coupled with preliminary interviews with a nurse leadership team, guided selection of contextual issues. The contextual interactions arising from managerialism, existing hierarchical models of leadership and increasing knowledge work provided insights into leadership experience in nursing, in the contexts of professional identity and changing educational and generational profiles of nurses. The authors conclude that employing a contextual frame provides insights in studying leadership experience. The authors propose additions to the cultural and institutional dimensions of Jepson's model. The findings have implications for structuring and communicating key roles and policies relevant to nursing leadership. These include the need to: address perceptions around the legitimacy of current nursing leaders to provide clinical leadership; modify hierarchical models of nursing leadership; and address the implications of the role of knowledge workers. Observing nursing leadership through the lens of Jepson's model of contextual dynamics confirms that this is an important way of exploring how leadership is enacted. The authors found, however, that the model also provided a useful frame for considering the experience and understanding of leadership by those to be led.
Spatial occupancy models for large data sets
Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.
2013-01-01
Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108 000 km2) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.
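For readers unfamiliar with the base model being extended, the standard (non-spatial) site-occupancy likelihood can be sketched and maximized on simulated data. This is a toy illustration with invented parameter values, not the probit spatial specification of the paper:

```python
import math
import random

def occupancy_loglik(histories, J, psi, p):
    """Log-likelihood of the basic occupancy model: each site is occupied with
    probability psi; an occupied site is detected on each of J visits with
    probability p. `histories[i]` is the number of detections at site i.
    Binomial coefficients are constant in (psi, p) and omitted."""
    ll = 0.0
    for y in histories:
        lik = psi * (p ** y) * ((1 - p) ** (J - y))
        if y == 0:
            lik += 1.0 - psi      # never detected: the site may be unoccupied
        ll += math.log(lik)
    return ll

# Simulate 500 sites with psi = 0.6, p = 0.4 and J = 5 visits, then profile
# the likelihood on a coarse grid to recover the parameters.
rng = random.Random(11)
J = 5
hist = [sum(rng.random() < 0.4 for _ in range(J)) if rng.random() < 0.6 else 0
        for _ in range(500)]
grid = [i / 50 for i in range(1, 50)]
psi_hat, p_hat = max(((a, b) for a in grid for b in grid),
                     key=lambda ab: occupancy_loglik(hist, J, *ab))
```

The mixture term for all-zero histories is the crux: it is exactly this occupied-but-undetected ambiguity that the hierarchical probit formulation carries over, with psi further modeled through a spatial process.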
NASA Astrophysics Data System (ADS)
Petkov, C. I.
2014-09-01
Fitch proposes an appealing hypothesis that humans are dendrophiles, who constantly build mental trees supported by analogous hierarchical brain processes [1]. Moreover, it is argued that, by comparison, nonhuman animals build flat or more compact behaviorally-relevant structures. Should we thus expect less impressive hierarchical brain processes in other animals? Not necessarily.
Bao, Wei; Rao, Yulei
2017-01-01
The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long short-term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracted deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoising features are fed into LSTM to forecast the next day’s closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance. PMID:28708865
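The first, wavelet-denoising stage of this pipeline can be illustrated with a one-level Haar transform and soft thresholding of the detail coefficients. This is a self-contained sketch of that stage only; the SAE and LSTM stages are not reproduced here:

```python
def haar_denoise(x, threshold=0.0):
    """One-level Haar wavelet denoising of an even-length series: transform,
    soft-threshold the detail (high-frequency) coefficients, invert."""
    r2 = 2 ** 0.5
    approx = [(a + b) / r2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / r2 for a, b in zip(x[::2], x[1::2])]
    # Soft thresholding: shrink each detail coefficient toward zero.
    soft = [max(abs(d) - threshold, 0.0) * (1.0 if d >= 0 else -1.0)
            for d in detail]
    out = []
    for s, d in zip(approx, soft):
        out += [(s + d) / r2, (s - d) / r2]   # inverse Haar step
    return out

# With a large threshold, every detail coefficient is zeroed, so each pair
# of samples collapses to its mean (maximal smoothing).
smoothed = haar_denoise([1.0, 3.0, 2.0, 8.0], threshold=10.0)
```

With `threshold=0.0` the transform is perfectly inverted, so the function degrades gracefully to the identity; practical pipelines pick the threshold from the noise level of the detail coefficients.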
NASA Astrophysics Data System (ADS)
Wang, Guoxu; Liu, Meng; Du, Juan; Liu, Lei; Yu, Yifeng; Sha, Jitong; Chen, Aibing
2018-03-01
Membrane carbon materials with a hierarchical porous architecture are attractive because they provide more channels for ion transport and shorten the ion transport path. Herein, we develop a facile way, based on "confined nanospace deposition", to fabricate an N-doped three-dimensional hierarchical porous membrane carbon material (N-THPMC) via coating nickel nitrate, silicate oligomers and the triblock copolymer P123 on the branches of a commercial polyamide membrane (PAM). During high-temperature treatment, the mesoporous silica layer and Ni species serve as a "confined nanospace" and catalyst, respectively, both indispensable for formation of the carbon framework; the gas-phase carbon precursors derived from the decomposition of PAM are deposited into the "confined nanospace", forming the carbon framework. The N-THPMC, with its hierarchical macro/meso/microporous structure, N-doping (2.9%) and large specific surface area (994 m2 g-1), well inherits the membrane morphology and hierarchical porous structure of PAM. As a binder-free electrode, the N-THPMC exhibits a specific capacitance of 252 F g-1 at a current density of 1 A g-1 in 6 M KOH electrolyte and excellent cycling stability, retaining 92.7% even after 5000 cycles.
Wildhaber, Mark L.; Wikle, Christopher K.; Anderson, Christopher J.; Franz, Kristie J.; Moran, Edward H.; Dey, Rima; Mader, Helmut; Kraml, Julia
2012-01-01
Climate change operates over a broad range of spatial and temporal scales. Understanding its effects on ecosystems requires multi-scale models. For understanding effects on fish populations of riverine ecosystems, climate predicted by coarse-resolution Global Climate Models must be downscaled to Regional Climate Models, to watersheds, to river hydrology, to population response. An additional challenge is quantifying sources of uncertainty given the highly nonlinear nature of interactions between climate variables and community-level processes. We present a modeling approach for understanding and accommodating uncertainty by applying multi-scale climate models and a hierarchical Bayesian modeling framework to Midwest fish population dynamics and by linking models for system components together by formal rules of probability. The proposed hierarchical modeling approach will account for sources of uncertainty in forecasts of community or population response. The goal is to evaluate the potential distributional changes in an ecological system, given distributional changes implied by a series of linked climate and system models under various emissions/use scenarios. This understanding will aid evaluation of management options for coping with global climate change. In our initial analyses, we found that predicted pallid sturgeon population responses were dependent on the climate scenario considered.
Rafii-Tari, Hedyeh; Liu, Jindong; Payne, Christopher J; Bicknell, Colin; Yang, Guang-Zhong
2014-01-01
Despite increased use of remote-controlled steerable catheter navigation systems for endovascular intervention, most current designs are based on master configurations which tend to alter natural operator tool interactions. This introduces problems to both ergonomics and shared human-robot control. This paper proposes a novel cooperative robotic catheterization system based on learning-from-demonstration. By encoding the higher-level structure of a catheterization task as a sequence of primitive motions, we demonstrate how to achieve prospective learning for complex tasks whilst incorporating subject-specific variations. A hierarchical Hidden Markov Model is used to model each movement primitive as well as their sequential relationship. This model is applied to generation of motion sequences, recognition of operator input, and prediction of future movements for the robot. The framework is validated by comparing catheter tip motions against the manual approach, showing significant improvements in the quality of catheterization. The results motivate the design of collaborative robotic systems that are intuitive to use, while reducing the cognitive workload of the operator.
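The primitive-motion idea above can be sketched as a first-order Markov chain over catheter motions. The primitive names and transition probabilities below are illustrative assumptions only; the paper learns a full hierarchical Hidden Markov Model from operator demonstrations.

```python
import random

# Hypothetical catheterization motion primitives (names are illustrative,
# not taken from the paper).
PRIMITIVES = ["advance", "retract", "rotate_cw", "rotate_ccw", "dwell"]

# Toy transition probabilities: row = current primitive, entries = next
# primitive. In the paper these are learned from demonstrations; here
# the numbers are made up.
TRANSITIONS = {
    "advance":    {"advance": 0.5, "rotate_cw": 0.2, "rotate_ccw": 0.2, "dwell": 0.1},
    "retract":    {"retract": 0.4, "advance": 0.4, "dwell": 0.2},
    "rotate_cw":  {"advance": 0.6, "rotate_cw": 0.3, "dwell": 0.1},
    "rotate_ccw": {"advance": 0.6, "rotate_ccw": 0.3, "dwell": 0.1},
    "dwell":      {"advance": 0.5, "retract": 0.5},
}

def generate_sequence(start="advance", steps=10, rng=None):
    """Sample a primitive sequence from the first-order transition model."""
    rng = rng or random.Random(0)
    seq = [start]
    for _ in range(steps - 1):
        nxt = TRANSITIONS[seq[-1]]
        states, probs = zip(*nxt.items())
        seq.append(rng.choices(states, weights=probs)[0])
    return seq

print(generate_sequence())
```

A learned model of this shape supports exactly the three uses the abstract lists: sampling (as here), scoring an observed operator sequence, and predicting the most probable next primitive.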
Testing the sensitivity of terrestrial carbon models using remotely sensed biomass estimates
NASA Astrophysics Data System (ADS)
Hashimoto, H.; Saatchi, S. S.; Meyer, V.; Milesi, C.; Wang, W.; Ganguly, S.; Zhang, G.; Nemani, R. R.
2010-12-01
There is a large uncertainty in carbon allocation and biomass accumulation in forest ecosystems. With the recent availability of remotely sensed biomass estimates, we now can test some of the hypotheses commonly implemented in various ecosystem models. We used biomass estimates derived by integrating MODIS, GLAS and PALSAR data to verify above-ground biomass estimates simulated by a number of ecosystem models (CASA, BIOME-BGC, BEAMS, LPJ). This study extends the hierarchical framework (Wang et al., 2010) for diagnosing ecosystem models by incorporating independent estimates of biomass for testing and calibrating respiration, carbon allocation, turn-over algorithms or parameters.
A hierarchical two-phase framework for selecting genes in cancer datasets with a neuro-fuzzy system.
Lim, Jongwoo; Wang, Bohyun; Lim, Joon S
2016-04-29
Finding the minimum number of appropriate biomarkers for specific targets such as lung cancer has been a challenging issue in bioinformatics. We propose a hierarchical two-phase framework for selecting appropriate biomarkers that extracts candidate biomarkers from cancer microarray datasets and then selects the minimum number of appropriate biomarkers from the extracted candidate biomarker datasets with a specific neuro-fuzzy algorithm called a neural network with weighted fuzzy membership function (NEWFM). In the first phase, the proposed framework extracts candidate biomarkers using a Bhattacharyya distance method that measures the similarity of two discrete probability distributions. Finally, the proposed framework is able to reduce the cost of finding biomarkers by not receiving medical supplements and improve the accuracy of the biomarkers in specific cancer target datasets.
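The Bhattacharyya distance used in the first phase has a simple closed form for discrete distributions, D_B = -ln Σ_i √(p_i q_i). A minimal sketch of ranking genes by it follows; the per-gene class-conditional histograms are invented for illustration and NEWFM itself is not reproduced here.

```python
import math

def bhattacharyya_distance(p, q):
    """D_B = -ln( sum_i sqrt(p_i * q_i) ) for two discrete distributions."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -math.log(bc)

# Toy class-conditional expression histograms (illustrative values): a gene
# whose two class distributions differ strongly gets a larger distance and
# is a better candidate biomarker.
genes = {
    "gene_A": ([0.7, 0.2, 0.1], [0.1, 0.2, 0.7]),    # well separated
    "gene_B": ([0.4, 0.3, 0.3], [0.35, 0.3, 0.35]),  # nearly identical
}
ranked = sorted(genes, key=lambda g: bhattacharyya_distance(*genes[g]), reverse=True)
print(ranked)  # ['gene_A', 'gene_B']
```

Identical distributions give a distance of exactly zero, so the measure behaves as a separability score for candidate filtering.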
Qian, Wuyong; Wang, Zhou-Jing; Li, Kevin W.
2016-01-01
Although medical waste usually accounts for a small fraction of urban municipal waste, its proper disposal has been a challenging issue as it often contains infectious, radioactive, or hazardous waste. This article proposes a two-level hierarchical multicriteria decision model to address medical waste disposal method selection (MWDMS), where disposal methods are assessed against different criteria as intuitionistic fuzzy preference relations and criteria weights are furnished as real values. This paper first introduces new operations for a special class of intuitionistic fuzzy values, whose membership and non-membership information consists of cross-ratio-based ]0, 1[ values. New score and accuracy functions are defined in order to develop a comparison approach for ]0, 1[-valued intuitionistic fuzzy numbers. A weighted geometric operator is then put forward to aggregate a collection of ]0, 1[-valued intuitionistic fuzzy values. Similar to Saaty’s 1–9 scale, this paper proposes a cross-ratio-based bipolar 0.1–0.9 scale to characterize pairwise comparison results. Subsequently, a two-level hierarchical structure is formulated to handle multicriteria decision problems with intuitionistic preference relations. Finally, the proposed decision framework is applied to MWDMS to illustrate its feasibility and effectiveness. PMID:27618082
Qian, Wuyong; Wang, Zhou-Jing; Li, Kevin W
2016-09-09
Although medical waste usually accounts for a small fraction of urban municipal waste, its proper disposal has been a challenging issue as it often contains infectious, radioactive, or hazardous waste. This article proposes a two-level hierarchical multicriteria decision model to address medical waste disposal method selection (MWDMS), where disposal methods are assessed against different criteria as intuitionistic fuzzy preference relations and criteria weights are furnished as real values. This paper first introduces new operations for a special class of intuitionistic fuzzy values, whose membership and non-membership information consists of cross-ratio-based ]0, 1[ values. New score and accuracy functions are defined in order to develop a comparison approach for ]0, 1[-valued intuitionistic fuzzy numbers. A weighted geometric operator is then put forward to aggregate a collection of ]0, 1[-valued intuitionistic fuzzy values. Similar to Saaty's 1-9 scale, this paper proposes a cross-ratio-based bipolar 0.1-0.9 scale to characterize pairwise comparison results. Subsequently, a two-level hierarchical structure is formulated to handle multicriteria decision problems with intuitionistic preference relations. Finally, the proposed decision framework is applied to MWDMS to illustrate its feasibility and effectiveness.
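The aggregation pattern behind a weighted geometric operator over intuitionistic fuzzy values can be sketched with the classical IFWG operator and score function. Note this is a simplification: the paper defines modified, cross-ratio-based operations for ]0, 1[ values, which are not reproduced here; the criteria weights and ratings below are invented for illustration.

```python
import math

def ifwg(values, weights):
    """Classical intuitionistic fuzzy weighted geometric (IFWG) operator.

    values  : list of (mu, nu) membership/non-membership pairs, mu + nu <= 1
    weights : nonnegative weights summing to 1
    The paper's cross-ratio-based operator differs; this shows the pattern.
    """
    mu = math.prod(m ** w for (m, _), w in zip(values, weights))
    nu = 1.0 - math.prod((1.0 - n) ** w for (_, n), w in zip(values, weights))
    return mu, nu

def score(mu, nu):
    """Classical score function s = mu - nu used to rank IF values."""
    return mu - nu

# Two disposal methods rated against three criteria (illustrative data).
weights = [0.5, 0.3, 0.2]
incineration = [(0.7, 0.2), (0.6, 0.3), (0.8, 0.1)]
landfill     = [(0.4, 0.5), (0.5, 0.4), (0.3, 0.6)]

a = ifwg(incineration, weights)
b = ifwg(landfill, weights)
print(score(*a) > score(*b))  # True: incineration ranks higher on this data
```

The aggregated pair stays a valid intuitionistic fuzzy value (mu + nu <= 1), which is what lets the score function compare alternatives at the upper level of the hierarchy.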
NASA Astrophysics Data System (ADS)
Shiklomanov, A. N.; Cowdery, E.; Dietze, M.
2016-12-01
Recent syntheses of global trait databases have revealed that although the functional diversity among plant species is immense, this diversity is constrained by trade-offs between plant strategies. However, the use of among-trait and trait-environment correlations at the global scale for both qualitative ecological inference and land surface modeling has several important caveats. An alternative approach is to preserve the existing PFT-based model structure while using statistical analyses to account for uncertainty and variability in model parameters. In this study, we used a hierarchical Bayesian model of foliar traits in the TRY database to test the following hypotheses: (1) Leveraging the covariance between foliar traits will significantly constrain our uncertainty in their distributions; and (2) Among-trait covariance patterns are significantly different among and within PFTs, reflecting differences in trade-offs associated with biome-level evolution, site-level community assembly, and individual-level ecophysiological acclimation. We found that among-trait covariance significantly constrained estimates of trait means, and the additional information provided by across-PFT covariance led to more constraint still, especially for traits and PFTs with low sample sizes. We also found that among-trait correlations were highly variable among PFTs, and were generally inconsistent with correlations within PFTs. The hierarchical multivariate framework developed in our study can readily be enhanced with additional levels of hierarchy to account for geographic, species, and individual-level variability.
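How among-trait covariance constrains trait estimates can be illustrated with the standard multivariate-normal conditional update, the building block of such hierarchical multivariate models. The trait interpretation and covariance numbers below are invented for illustration, not TRY estimates.

```python
import numpy as np

# Illustrative prior over two correlated foliar traits on a log scale
# (values are made up, not fitted to TRY data).
mu = np.array([2.0, 0.5])          # prior means of [trait1, trait2]
Sigma = np.array([[0.40, 0.25],
                  [0.25, 0.30]])   # prior covariance with strong correlation

# Conditioning trait1 on an observed trait2 (standard MVN conditional):
#   mu_{1|2}  = mu_1 + S12 / S22 * (x2 - mu_2)
#   var_{1|2} = S11 - S12^2 / S22
x2 = 0.9
mu_cond = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (x2 - mu[1])
var_cond = Sigma[0, 0] - Sigma[0, 1] ** 2 / Sigma[1, 1]

print(var_cond < Sigma[0, 0])  # True: covariance shrinks the uncertainty
```

The conditional variance is strictly smaller than the marginal variance whenever the traits covary, which is the mechanism behind the constraint gains reported for traits and PFTs with low sample sizes.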
da Silva, Natal Santos; Undurraga, Eduardo A; da Silva Ferreira, Elis Regina; Estofolete, Cássia Fernanda; Nogueira, Maurício Lacerda
2018-01-01
In Brazil, the incidence of hospitalization due to dengue, as an indicator of severity, has drastically increased since 1998. The objective of our study was to identify risk factors associated with subsequent hospitalization related to dengue. We analyzed 7613 dengue cases confirmed via serology (ELISA), non-structural protein 1, or polymerase chain reaction amplification. We used a hierarchical framework to generate a multivariate logistic regression based on a variety of risk variables. This was followed by multiple statistical analyses to assess hierarchical model accuracy, variance, goodness of fit, and whether or not this model reliably represented the population. The final model, which included age, sex, ethnicity, previous dengue infection, hemorrhagic manifestations, plasma leakage, and organ failure, showed that all measured parameters, with the exception of previous dengue, were statistically significant. The presence of organ failure was associated with the highest risk of subsequent dengue hospitalization (OR=5.75; CI=3.53-9.37). Therefore, plasma leakage and organ failure were the main indicators of hospitalization due to dengue, although other variables of minor importance should also be considered to refer dengue patients to hospital treatment, which may lead to a reduction in avoidable deaths as well as costs related to dengue. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chao; Xu, Zhijie; Lai, Canhai
This report is prepared for the demonstration of hierarchical prediction of carbon capture efficiency of a solvent-based absorption column. A computational fluid dynamics (CFD) model is first developed to simulate the core phenomena of solvent-based carbon capture, i.e., the CO2 physical absorption and chemical reaction, on a simplified geometry of wetted wall column (WWC) at bench scale. Aqueous solutions of ethanolamine (MEA) are commonly selected as a CO2 stream scrubbing liquid. CO2 is captured by both physical and chemical absorption using a highly CO2 soluble and reactive solvent, MEA, during the scrubbing process. In order to provide a confidence bound on the computational predictions of this complex engineering system, a hierarchical calibration and validation framework is proposed. The overall goal of this effort is to provide a mechanism-based predictive framework with a confidence bound for the overall mass transfer coefficient of the wetted wall column (WWC), with statistical analyses of the corresponding WWC experiments with increasing physical complexity.
Flexible and Hierarchical Metal-Organic Framework Composites for High-Performance Catalysis.
Huang, Ning; Drake, Hannah; Li, Jialuo; Pang, Jiangdong; Wang, Ying; Yuan, Shuai; Wang, Qi; Cai, Peiyu; Qin, Junsheng; Zhou, Hong-Cai
2018-05-18
The development of new types of porous composite materials is of great significance owing to their potentially improved performance over those of individual components and extensive applications in separation, energy storage, and heterogeneous catalysis. In this work, we integrated mesoporous metal-organic frameworks (MOFs) with macroporous melamine foam (MF) using a one-pot process, generating a series of MOF/MF composite materials with preserved crystallinity, hierarchical porosity, and increased stability over that of melamine foam. The MOF nanocrystals were threaded by the melamine foam networks, resembling a ball-and-stick model overall. As a proof-of-concept study, the resulting MOF/MF composite materials were employed as an effective heterogeneous catalyst for the epoxidation of cholesteryl esters. Combining the advantages of interpenetrative mesoporous and macroporous structures, the MOF/melamine foam composite provided higher dispersibility and greater accessibility of catalytic sites, exhibiting excellent catalytic performance. This strategy constitutes an important step toward the development of other MOF composites and the exploration of their high-performance catalytic applications. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.
Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J
2010-12-01
Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. 
By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km(2) hexagons), can increase the relevance of habitat models to multispecies conservation planning. Journal compilation © 2010 Society for Conservation Biology. No claim to original US government works.
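The conditional autoregressive machinery behind these spatial models is compact: the intrinsic CAR (ICAR) prior on spatial random effects has precision matrix Q = D - W built from the neighbourhood structure. A minimal sketch on an invented 4-site lattice follows (the paper's lattice is the FIA grid of 24 km² hexagons):

```python
import numpy as np

# Symmetric 0/1 adjacency for a toy 4-site lattice (illustrative only).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# ICAR prior on spatial random effects u:
#   p(u) ∝ exp(-0.5 * tau * u' Q u),  with precision Q = D - W,
# where D is the diagonal matrix of neighbour counts.
D = np.diag(W.sum(axis=1))
Q = D - W

print(Q @ np.ones(4))  # zero vector: Q is singular, hence "intrinsic"
```

Each site's effect is conditionally centred on the average of its neighbours, which is what lets residual spatial structure absorb unmeasured, spatially smooth covariates.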
2010-02-27
investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and ... design stage is used to further refine the analysis, narrowing the design to a handful of options. Figure 1. Integrated Hierarchical Framework. In ... computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics' finite element code CNEVAL
Swallow, Ben; Buckland, Stephen T; King, Ruth; Toms, Mike P
2016-03-01
The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero-inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean-variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. © 2015 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
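The "continuous with a spike at zero" structure of a Tweedie variable with power parameter 1 < p < 2 comes from its representation as a compound Poisson-gamma sum. A small simulation of that identity (not the authors' reversible-jump MCMC regression model) shows the mechanism:

```python
import numpy as np

def tweedie_sample(mu_pois, shape, scale, size, seed=0):
    """Draw from a compound Poisson-gamma distribution.

    A draw is exactly zero when the Poisson count N is zero, with
    probability exp(-mu_pois); otherwise it is the sum of N gamma
    variates (equivalently one gamma with shape N * shape). The result
    is continuous, nonnegative data with a point mass at zero, matching
    the averaged weekly bird counts described above.
    """
    rng = np.random.default_rng(seed)
    n = rng.poisson(mu_pois, size)
    return np.array([rng.gamma(shape * k, scale) if k > 0 else 0.0 for k in n])

y = tweedie_sample(mu_pois=1.2, shape=2.0, scale=1.5, size=1000)
print((y == 0).mean())  # close to exp(-1.2) ≈ 0.30
```

Because the zero mass falls out of the distribution itself, no separate zero-inflation component is needed, which is the flexibility the abstract highlights.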
UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, N. E.; Soderberg, A. M.; Betancourt, M., E-mail: nsanders@cfa.harvard.edu
2015-02-10
Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.
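The gain from fitting all objects at once comes from partial pooling: per-object parameter estimates borrow strength from the population. A two-level normal-normal empirical-Bayes sketch of that effect follows (the numbers are invented, and the authors' actual model is a full hierarchical light-curve model fit by Hamiltonian Monte Carlo, not this shortcut):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy per-supernova parameter estimates with heteroskedastic errors
# (illustrative numbers, not Pan-STARRS1 values).
theta_hat = rng.normal(loc=-17.0, scale=0.8, size=20)  # e.g. plateau magnitudes
se = rng.uniform(0.1, 0.6, size=20)                    # per-object std errors

# Empirical-Bayes normal-normal partial pooling: each object's estimate is
# shrunk toward the population mean, with noisier objects shrunk more.
tau2 = max(theta_hat.var() - (se ** 2).mean(), 1e-6)   # crude population variance
w = tau2 / (tau2 + se ** 2)                            # shrinkage weights in (0, 1)
pooled_mean = theta_hat.mean()
theta_shrunk = w * theta_hat + (1 - w) * pooled_mean

# Shrunk estimates always lie between the raw estimate and the pooled mean.
print(np.all(np.abs(theta_shrunk - pooled_mean) <= np.abs(theta_hat - pooled_mean)))
```

The same hyperparameters (here the population mean and variance) are what allow rates of unseen light-curve characteristics to be evaluated directly, as the abstract notes.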
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
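The qualitative place/transition nets at the base of Snoopy's family of net classes follow the standard firing rule, which a few lines of code can sketch. This is a generic illustration, not Snoopy's implementation; Snoopy additionally supports stochastic and continuous classes and conversion between them.

```python
# Minimal place/transition Petri net with the standard firing rule.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        """A transition is enabled if every input place holds enough tokens."""
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        """Firing consumes tokens from input places, produces on outputs."""
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy biomolecular example: enzyme-substrate binding E + S -> ES.
net = PetriNet({"E": 1, "S": 2, "ES": 0})
net.add_transition("bind", {"E": 1, "S": 1}, {"ES": 1})
net.fire("bind")
print(net.marking)  # {'E': 0, 'S': 1, 'ES': 1}
```

In the stochastic and continuous classes the same net structure is kept and only the kinetic interpretation of transitions changes, which is what makes the conversion between paradigms that the abstract describes possible.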
Modeling trends from North American Breeding Bird Survey data: a spatially explicit approach
Bled, Florent; Sauer, John R.; Pardieck, Keith L.; Doherty, Paul; Royle, J. Andy
2013-01-01
Population trends, defined as interval-specific proportional changes in population size, are often used to help identify species of conservation interest. Efficient modeling of such trends depends on the consideration of the correlation of population changes with key spatial and environmental covariates. This can provide insights into causal mechanisms and allow spatially explicit summaries at scales that are of interest to management agencies. We expand the hierarchical modeling framework used in the North American Breeding Bird Survey (BBS) by developing a spatially explicit model of temporal trend using a conditional autoregressive (CAR) model. By adopting a formal spatial model for abundance, we produce spatially explicit abundance and trend estimates. Analyses based on large-scale geographic strata such as Bird Conservation Regions (BCR) can suffer from basic imbalances in spatial sampling. Our approach addresses this issue by providing an explicit weighting based on the fundamental sample allocation unit of the BBS. We applied the spatial model to three species from the BBS. Species have been chosen based upon their well-known population change patterns, which allows us to evaluate the quality of our model and the biological meaning of our estimates. We also compare our results with the ones obtained for BCRs using a nonspatial hierarchical model (Sauer and Link 2011). Globally, estimates for mean trends are consistent between the two approaches but spatial estimates provide much more precise trend estimates in regions on the edges of species ranges that were poorly estimated in non-spatial analyses. Incorporating a spatial component in the analysis not only allows us to obtain relevant and biologically meaningful estimates for population trends, but also enables us to provide a flexible framework in order to obtain trend estimates for any area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolgopolova, Ekaterina A.; Ejegbavwo, Otega A.; Martin, Corey R.
Growing necessity for efficient nuclear waste management is a driving force for development of alternative architectures towards fundamental understanding of mechanisms involved in actinide integration inside extended structures. In this manuscript, metal-organic frameworks (MOFs) were investigated as a model system for engineering radionuclide containing materials through utilization of unprecedented MOF modularity, which cannot be replicated in any other type of materials. Through the implementation of recent synthetic advances in the MOF field, hierarchical complexity of An-materials were built stepwise, which was only feasible due to preparation of the first examples of actinide-based frameworks with “unsaturated” metal nodes. The first successful attempts of solid-state metathesis and metal node extension in An-MOFs are reported, and the results of the former approach revealed drastic differences in chemical behavior of extended structures versus molecular species. Successful utilization of MOF modularity also allowed us to structurally characterize the first example of bimetallic An-An nodes. To the best of our knowledge, through combination of solid-state metathesis, guest incorporation, and capping linker installation, we were able to achieve the highest Th wt% in mono- and bi-actinide frameworks with minimal structural density. Overall, combination of a multistep synthetic approach with homogeneous actinide distribution and moderate solvothermal conditions could make MOFs an exceptionally powerful tool to address fundamental questions responsible for chemical behavior of An-based extended structures, and therefore, shed light on possible optimization of nuclear waste administration.
Dolgopolova, Ekaterina A; Ejegbavwo, Otega A; Martin, Corey R; Smith, Mark D; Setyawan, Wahyu; Karakalos, Stavros G; Henager, Charles H; Zur Loye, Hans-Conrad; Shustova, Natalia B
2017-11-22
Growing necessity for efficient nuclear waste management is a driving force for development of alternative architectures toward fundamental understanding of mechanisms involved in actinide (An) integration inside extended structures. In this manuscript, metal-organic frameworks (MOFs) were investigated as a model system for engineering radionuclide containing materials through utilization of unprecedented MOF modularity, which cannot be replicated in any other type of materials. Through the implementation of recent synthetic advances in the MOF field, hierarchical complexity of An-materials was built stepwise, which was only feasible due to preparation of the first examples of actinide-based frameworks with "unsaturated" metal nodes. The first successful attempts of solid-state metathesis and metal node extension in An-MOFs are reported, and the results of the former approach revealed drastic differences in chemical behavior of extended structures versus molecular species. Successful utilization of MOF modularity also allowed us to structurally characterize the first example of bimetallic An-An nodes. To the best of our knowledge, through combination of solid-state metathesis, guest incorporation, and capping linker installation, we were able to achieve the highest Th wt % in mono- and biactinide frameworks with minimal structural density. Overall, the combination of a multistep synthetic approach with homogeneous actinide distribution and moderate solvothermal conditions could make MOFs an exceptionally powerful tool to address fundamental questions responsible for chemical behavior of An-based extended structures and, therefore, shed light on possible optimization of nuclear waste administration.
Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs
NASA Astrophysics Data System (ADS)
Chitsazan, N.; Tsai, F. T.
2012-12-01
Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets one source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models in each level. To account for uncertainty, we employ chance constrained (CC) programming for stochastic remediation design. Chance constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, considering only the single best model, variances that stem from uncertainty in the model structure will be ignored.
Second, considering the best model when it has a non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desired reliability. However, considering only the single best model, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels in the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate was sensitive to the prediction variance. Also, the prediction variance changed with different extraction rates: a very high extraction rate drives the prediction variances of chloride concentration at the production wells toward zero regardless of which HBMA models are used.
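The error from picking the single best model can be made concrete with the variance decomposition that model averaging is built on: total predictive variance splits into within-model and between-model parts, and the single-best-model analysis drops the latter. The model means, variances, and weights below are invented for illustration.

```python
import numpy as np

# Three rival model structures predicting chloride concentration at a
# production well (illustrative means, variances, and posterior weights).
means = np.array([120.0, 150.0, 135.0])
variances = np.array([25.0, 30.0, 20.0])
weights = np.array([0.5, 0.3, 0.2])   # posterior model probabilities

# BMA prediction and the law-of-total-variance decomposition used in HBMA:
#   total variance = within-model variance + between-model variance
bma_mean = weights @ means
within = weights @ variances
between = weights @ (means - bma_mean) ** 2
total = within + between

print(total > within)  # True: a single-model analysis drops the 'between' term
```

In HBMA the same decomposition is applied recursively, level by level, which is how the framework segregates and prioritizes the uncertainty sources.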
A hierarchical nest survival model integrating incomplete temporally varying covariates
Converse, Sarah J; Royle, J Andrew; Adler, Peter H; Urbanek, Richard P; Barzen, Jeb A
2013-01-01
Nest success is a critical determinant of the dynamics of avian populations, and nest survival modeling has played a key role in advancing avian ecology and management. Beginning with the development of daily nest survival models, and proceeding through subsequent extensions, the capacity for modeling the effects of hypothesized factors on nest survival has expanded greatly. We extend nest survival models further by introducing an approach to deal with incompletely observed, temporally varying covariates using a hierarchical model. Hierarchical modeling offers a way to separate process and observational components of demographic models to obtain estimates of the parameters of primary interest, and to evaluate structural effects of ecological and management interest. We built a hierarchical model for daily nest survival to analyze nest data from reintroduced whooping cranes (Grus americana) in the Eastern Migratory Population. This reintroduction effort has been beset by poor reproduction, apparently due primarily to nest abandonment by breeding birds. We used the model to assess support for the hypothesis that nest abandonment is caused by harassment from biting insects. We obtained indices of blood-feeding insect populations based on the spatially interpolated counts of insects captured in carbon dioxide traps. However, insect trapping was not conducted daily, and so we had incomplete information on a temporally variable covariate of interest. We therefore supplemented our nest survival model with a parallel model for estimating the values of the missing insect covariates. We used Bayesian model selection to identify the best predictors of daily nest survival. Our results suggest that the black fly Simulium annulus may be negatively affecting nest survival of reintroduced whooping cranes, with decreasing nest survival as abundance of S. annulus increases. 
The modeling framework we have developed will be applied in the future to a larger data set to evaluate the biting-insect hypothesis and other hypotheses for nesting failure in this reintroduced population; resulting inferences will support ongoing efforts to manage this population via an adaptive management approach. Wider application of our approach offers promise for modeling the effects of other temporally varying, but imperfectly observed covariates on nest survival, including the possibility of modeling temporally varying covariates collected from incubating adults. PMID:24340185
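The core of a daily nest survival model is a logistic regression on daily survival, with interval survival obtained as the product of daily probabilities. A minimal sketch follows; the coefficients and insect index values are invented, not the fitted whooping-crane estimates, and the parallel imputation model for missing covariates is omitted.

```python
import math

def daily_survival(beta0, beta1, insect_index):
    """Logistic daily nest survival with one covariate (illustrative
    coefficients, not the paper's fitted values)."""
    eta = beta0 + beta1 * insect_index
    return 1.0 / (1.0 + math.exp(-eta))

def interval_survival(beta0, beta1, insect_series):
    """Probability a nest survives the whole interval: product of the
    daily survival probabilities s_t."""
    p = 1.0
    for x in insect_series:
        p *= daily_survival(beta0, beta1, x)
    return p

# A negative coefficient encodes the paper's suggested effect: higher
# Simulium annulus abundance lowers daily (and hence interval) survival.
low  = interval_survival(beta0=3.0, beta1=-0.8, insect_series=[0.5] * 30)
high = interval_survival(beta0=3.0, beta1=-0.8, insect_series=[2.5] * 30)
print(low > high)  # True
```

In the hierarchical version, days with no insect-trap observation get their covariate value drawn from the parallel imputation model rather than plugged in directly, which propagates that measurement gap into the survival inference.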
A hierarchical nest survival model integrating incomplete temporally varying covariates
Converse, Sarah J.; Royle, J. Andrew; Adler, Peter H.; Urbanek, Richard P.; Barzen, Jeb A.
2013-01-01
Nest success is a critical determinant of the dynamics of avian populations, and nest survival modeling has played a key role in advancing avian ecology and management. Beginning with the development of daily nest survival models, and proceeding through subsequent extensions, the capacity for modeling the effects of hypothesized factors on nest survival has expanded greatly. We extend nest survival models further by introducing an approach to deal with incompletely observed, temporally varying covariates using a hierarchical model. Hierarchical modeling offers a way to separate process and observational components of demographic models to obtain estimates of the parameters of primary interest, and to evaluate structural effects of ecological and management interest. We built a hierarchical model for daily nest survival to analyze nest data from reintroduced whooping cranes (Grus americana) in the Eastern Migratory Population. This reintroduction effort has been beset by poor reproduction, apparently due primarily to nest abandonment by breeding birds. We used the model to assess support for the hypothesis that nest abandonment is caused by harassment from biting insects. We obtained indices of blood-feeding insect populations based on the spatially interpolated counts of insects captured in carbon dioxide traps. However, insect trapping was not conducted daily, and so we had incomplete information on a temporally variable covariate of interest. We therefore supplemented our nest survival model with a parallel model for estimating the values of the missing insect covariates. We used Bayesian model selection to identify the best predictors of daily nest survival. Our results suggest that the black fly Simulium annulus may be negatively affecting nest survival of reintroduced whooping cranes, with decreasing nest survival as abundance of S. annulus increases. 
The modeling framework we have developed will be applied in the future to a larger data set to evaluate the biting-insect hypothesis and other hypotheses for nesting failure in this reintroduced population; resulting inferences will support ongoing efforts to manage this population via an adaptive management approach. Wider application of our approach offers promise for modeling the effects of other temporally varying, but imperfectly observed covariates on nest survival, including the possibility of modeling temporally varying covariates collected from incubating adults.
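The hierarchical structure described above, a daily survival model coupled with a parallel model for the incompletely observed insect covariate, can be sketched minimally as follows. This is an illustration only: the logistic link, the linear-interpolation imputation (a crude stand-in for the authors' parallel covariate model), and all numbers are assumptions, not the paper's actual model or data.

```python
import math

def daily_survival(b0, b1, x):
    """Daily nest survival probability with a logistic link to the insect index x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def impute_linear(xs):
    """Fill None gaps by linear interpolation -- a deliberately simple
    stand-in for the paper's model of the missing covariate values."""
    xs = list(xs)
    known = [i for i, v in enumerate(xs) if v is not None]
    for i, v in enumerate(xs):
        if v is None:
            lo = max((k for k in known if k < i), default=None)
            hi = min((k for k in known if k > i), default=None)
            if lo is None:
                xs[i] = xs[hi]
            elif hi is None:
                xs[i] = xs[lo]
            else:
                w = (i - lo) / (hi - lo)
                xs[i] = (1 - w) * xs[lo] + w * xs[hi]
    return xs

# insect index observed only every third day (illustrative numbers)
raw = [0.2, None, None, 0.8, None, None, 1.4]
x = impute_linear(raw)

# the nest survives the week only if it survives every day;
# b1 < 0 encodes "more black flies, lower daily survival"
p_nest = 1.0
for xi in x:
    p_nest *= daily_survival(2.5, -1.0, xi)
```

In the paper's fully Bayesian treatment the missing covariate values are estimated jointly with the survival parameters rather than filled in once, but the two-layer separation of observation and process is the same.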
Lv, Kai; Yang, Chu-Ting; Liu, Yi; Hu, Sheng; Wang, Xiao-Lin
2018-01-01
To aid the design of hierarchically porous unconventional metal-phosphonate frameworks (HP-UMPFs) for practical radioanalytical separation, a systematic investigation of the hydrolytic stability of the bulk phase against acidic corrosion has been carried out for an archetypical HP-UMPF. Bulk dissolution results suggest that aqueous acidity has a more pronounced effect on incongruent leaching than temperature, and that kinetic stability reaches equilibrium by way of an accumulation of partially leached species on the corrosion conduits. The variation of particle morphology, hierarchical porosity and backbone composition upon corrosion reveals that these materials are hydrolytically resilient, without suffering any great degradation of the porous texture, although large aggregates crack into sporadic fractures while nucleophilic attack on the inorganic layers causes the leaching of tin and phosphorus. The remaining selectivity of these HP-UMPFs is dictated by a balance between the elimination of free phosphonates and the exposure of confined phosphonates, thus allowing real-time tailoring of radionuclide sequestration. Moreover, a plausible degradation mechanism has been proposed for the triple progressive dissolution of the three-level hierarchical porous structure, to elucidate the resultant reactivity. These HP-UMPFs are compared with benchmark metal-organic frameworks (MOFs) to obtain a rough grading of hydrolytic stability, and two feasible approaches are suggested for enhancing their hydrolytic stability, intended for real-life separation protocols. PMID:29538348
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Di; Lian, Jianming; Sun, Yannan
Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For the purpose of practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.
Generalized Aggregation and Coordination of Residential Loads in a Smart Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, He; Somani, Abhishek; Lian, Jianming
2015-11-02
Flexibility from residential loads presents an enormous potential to provide various services to the smart grid. In this paper, we propose a unified hierarchical framework for aggregation and coordination of various residential loads in a smart community, such as Thermostatically Controlled Loads (TCLs), Distributed Energy Storages (DESs), residential Pool Pumps (PPs), and Electric Vehicles (EVs). A central idea of this framework is a virtual battery model, which provides a simple and intuitive tool to aggregate the flexibility of distributed loads. Moreover, a multi-stage Nash-bargaining-based coordination strategy is proposed to coordinate different aggregations of residential loads for demand response. Case studies are provided to demonstrate the efficacy of our proposed framework and coordination strategy in managing peak power demand in a smart residential community.
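The virtual battery idea above can be sketched in a few lines: the fleet's aggregate flexibility is summarized by summed power and energy limits, as if the diverse loads were one battery. The sum-of-ranges aggregation and all load parameters below are illustrative assumptions, not the paper's exact model (which also tracks dissipation and time-varying limits).

```python
def aggregate_virtual_battery(loads):
    """Aggregate individual load flexibilities into one virtual battery:
    the fleet's feasible power and energy ranges are taken as the
    element-wise sums of the individual ranges.
    Each load: (p_min, p_max, e_min, e_max) in kW / kWh."""
    return {
        "p_min": sum(l[0] for l in loads),
        "p_max": sum(l[1] for l in loads),
        "e_min": sum(l[2] for l in loads),
        "e_max": sum(l[3] for l in loads),
    }

# hypothetical smart-community fleet
fleet = [
    (0.0, 4.5, 0.0, 8.0),   # electric water heater
    (0.0, 3.3, 0.0, 10.0),  # EV charger
    (0.0, 1.1, 0.0, 2.0),   # pool pump
]
vb = aggregate_virtual_battery(fleet)
```

A coordinator can then dispatch against `vb` as a single resource and disaggregate the setpoint back to individual loads.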
Reconstruction of late Holocene climate based on tree growth and mechanistic hierarchical models
Tipton, John; Hooten, Mevin B.; Pederson, Neil; Tingley, Martin; Bishop, Daniel
2016-01-01
Reconstruction of pre-instrumental, late Holocene climate is important for understanding how climate has changed in the past and how climate might change in the future. Statistical prediction of paleoclimate from tree ring widths is challenging because tree ring widths are a one-dimensional summary of annual growth that represents a multi-dimensional set of climatic and biotic influences. We develop a Bayesian hierarchical framework using a nonlinear, biologically motivated tree ring growth model to jointly reconstruct temperature and precipitation in the Hudson Valley, New York. Using a common growth function to describe the response of a tree to climate, we allow for species-specific parameterizations of the growth response. To enable predictive backcasts, we model the climate variables with a vector autoregressive process on an annual timescale coupled with a multivariate conditional autoregressive process that accounts for temporal correlation and cross-correlation between temperature and precipitation on a monthly scale. Our multi-scale temporal model allows for flexibility in the climate response through time at different temporal scales and predicts reasonable climate scenarios given tree ring width data.
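The two model layers described above, an autoregressive climate process and a nonlinear growth response, can be sketched as a forward simulation. Everything here is a toy stand-in: the VAR(1) coefficients, the Gaussian-shaped growth function, and the optima are invented for illustration, and the paper's monthly conditional autoregressive layer is omitted.

```python
import math
import random

random.seed(1)

# VAR(1) coefficient matrix for (temperature, precipitation) anomalies
A = [[0.5, 0.1],
     [0.0, 0.6]]

def var1_step(z, sigma=0.3):
    """One annual step of the vector-autoregressive climate process."""
    e = [random.gauss(0.0, sigma) for _ in z]
    return [sum(A[i][j] * z[j] for j in range(2)) + e[i] for i in range(2)]

def growth(temp, precip, t_opt=0.0, p_opt=0.0, w=1.0):
    """Nonlinear, biologically motivated ring-width response: growth is
    maximal at the species' climatic optima and falls off in either
    direction (a Gaussian stand-in for the paper's growth function)."""
    return math.exp(-((temp - t_opt) ** 2 + (precip - p_opt) ** 2) / (2 * w ** 2))

z = [0.0, 0.0]
rings = []
for year in range(50):
    z = var1_step(z)
    rings.append(growth(z[0], z[1]))
```

Inverting this forward map, inferring the climate series from observed ring widths, is what the hierarchical Bayesian machinery in the paper does.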
Cruz-Marcelo, Alejandro; Ensor, Katherine B; Rosner, Gary L
2011-06-01
The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material.
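The "borrowing strength" mechanism above can be illustrated with the simplest possible hierarchical estimator: normal-normal partial pooling, in which a company's level is shrunk toward the cross-company mean, with more shrinkage when the company has few bonds. This is a deliberate simplification of the paper's Dirichlet-process-mixture prior; the yields and variance parameters are invented.

```python
def pooled_estimates(company_yields, tau2=1.0, sigma2=4.0):
    """Normal-normal partial pooling of company-level mean yields.
    tau2: between-company variance; sigma2: within-company bond variance
    (both assumed known here for simplicity)."""
    grand = sum(y for ys in company_yields.values() for y in ys) / \
            sum(len(ys) for ys in company_yields.values())
    out = {}
    for c, ys in company_yields.items():
        n = len(ys)
        ybar = sum(ys) / n
        w = (n / sigma2) / (n / sigma2 + 1.0 / tau2)   # weight on the data
        out[c] = w * ybar + (1.0 - w) * grand          # shrink toward the pool
    return out

# company B has a single bond, so its estimate leans on the pool
yields = {"A": [5.0, 5.2, 4.9, 5.1], "B": [7.5]}
est = pooled_estimates(yields)
```

With one bond, company B's estimate moves most of the way toward the pooled mean, which is exactly the stabilizing behavior the paper seeks for thinly traded issuers.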
Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.
2011-01-01
The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566
Combining information from multiple flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure an adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
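The role of shared multimodel discrepancy can be made concrete with a minimal pooling sketch: independent model errors average away under precision weighting, but a discrepancy common to all models does not. The numbers and the additive-variance treatment below are illustrative assumptions, not the paper's full hierarchical model.

```python
def combine_models(estimates, variances, shared_var):
    """Precision-weighted pooling of per-model flood estimates, with the
    shared multimodel discrepancy added as a variance floor that does
    not shrink as more models are included."""
    w = [1.0 / v for v in variances]
    wt = sum(w)
    mean = sum(wi * e for wi, e in zip(w, estimates)) / wt
    var = 1.0 / wt + shared_var   # independent part shrinks; shared part stays
    return mean, var

# three hydrological projections of a 100-year flood (m^3/s), toy values
est = [120.0, 150.0, 135.0]
mean, var = combine_models(est, [100.0, 400.0, 200.0], shared_var=50.0)
```

Adding more models drives the `1.0 / wt` term toward zero while `shared_var` remains, which is why the study identifies reducing shared discrepancy as the key to further uncertainty reduction.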
Hierarchical Production Planning: A Two Stage System.
1980-05-01
production planning issues (see [10], [12], [13]). However, some elements of the hierarchical framework can also be constructively used to enhance an... inventories are nonzero, but the product families' initial inventories are well balanced, i.e., if I_{j0} = d_j for i = 1, 2, ..., I and j ∈ J(i)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chao; Xu, Zhijie; Lai, Kevin
Part 1 of this paper presents a numerical model for non-reactive physical mass transfer across a wetted wall column (WWC). In Part 2, we improved the existing computational fluid dynamics (CFD) model to simulate chemical absorption occurring in a WWC as a bench-scale study of solvent-based carbon dioxide (CO2) capture. To generate data for WWC model validation, CO2 mass transfer across a monoethanolamine (MEA) solvent was first measured on a WWC experimental apparatus. The numerical model developed in this work can account for both chemical absorption and desorption of CO2 in MEA. In addition, the overall mass transfer coefficient predicted using traditional/empirical correlations is compared with CFD prediction results for both steady and wavy falling films. A Bayesian statistical calibration algorithm is adopted to calibrate the reaction rate constants in chemical absorption/desorption of CO2 across a falling film of MEA. The posterior distributions of the two transport properties, i.e., Henry's constant and gas diffusivity in the non-reacting nitrous oxide (N2O)/MEA system obtained from Part 1 of this study, serve as priors for the calibration of the CO2 reaction rate constants after using the N2O/CO2 analogy method. The calibrated model can be used to predict the CO2 mass transfer in a WWC for a wider range of operating conditions.
Wang, Chao; Xu, Zhijie; Lai, Kevin; ...
2017-10-24
Part 1 of this paper presents a numerical model for non-reactive physical mass transfer across a wetted wall column (WWC). In Part 2, we improved the existing computational fluid dynamics (CFD) model to simulate chemical absorption occurring in a WWC as a bench-scale study of solvent-based carbon dioxide (CO2) capture. To generate data for WWC model validation, CO2 mass transfer across a monoethanolamine (MEA) solvent was first measured on a WWC experimental apparatus. The numerical model developed in this work can account for both chemical absorption and desorption of CO2 in MEA. In addition, the overall mass transfer coefficient predicted using traditional/empirical correlations is compared with CFD prediction results for both steady and wavy falling films. A Bayesian statistical calibration algorithm is adopted to calibrate the reaction rate constants in chemical absorption/desorption of CO2 across a falling film of MEA. The posterior distributions of the two transport properties, i.e., Henry's constant and gas diffusivity in the non-reacting nitrous oxide (N2O)/MEA system obtained from Part 1 of this study, serve as priors for the calibration of the CO2 reaction rate constants after using the N2O/CO2 analogy method. The calibrated model can be used to predict the CO2 mass transfer in a WWC for a wider range of operating conditions.
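The Bayesian calibration step described above can be sketched with a random-walk Metropolis sampler against a toy forward model. The forward model, prior, observation, and noise level below are all invented stand-ins for the CFD model and WWC data; only the calibration pattern (posterior over a rate constant given a forward model) matches the paper.

```python
import math
import random

random.seed(0)

def model_flux(k):
    """Toy forward model standing in for the CFD wetted-wall-column
    simulation: predicted CO2 flux for reaction rate constant k."""
    return 2.0 * math.sqrt(k)

OBS, NOISE = 6.0, 0.5          # observed flux and measurement std dev (assumed)

def log_post(k):
    """Log posterior: flat prior on k > 0, Gaussian likelihood on the flux."""
    if k <= 0.0:
        return -math.inf
    r = (OBS - model_flux(k)) / NOISE
    return -0.5 * r * r

# random-walk Metropolis calibration of k
k = 5.0
lp = log_post(k)
samples = []
for _ in range(5000):
    kp = k + random.gauss(0.0, 0.5)            # symmetric proposal
    lpp = log_post(kp)
    if random.random() < math.exp(min(0.0, lpp - lp)):
        k, lp = kp, lpp                        # accept
    samples.append(k)
post_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

With `model_flux(k) = 2*sqrt(k)` and an observed flux of 6, the posterior concentrates near k = 9; in the paper the same role is played by rate constants with N2O/MEA transport posteriors as priors.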
Hierarchy of Gambling Choices: A Framework for Examining EGM Gambling Environment Preferences.
Thorne, Hannah Briony; Rockloff, Matthew Justus; Langham, Erika; Li, En
2016-12-01
This paper presents the Hierarchy of Gambling Choices (HGC), which is a consumer-oriented framework for understanding the key environmental and contextual features that influence people's selections of online and venue-based electronic gaming machines (EGMs). The HGC framework proposes that EGM gamblers make choices in selection of EGM gambling experiences utilising Tversky's (Psychol Rev 79(4):281-299, 1972) Elimination-by-Aspects model, and organise their choice in a hierarchical manner by virtue of EGMs being an "experience good" (Nelson in J Polit Econ 78(2):311-329, 1970). EGM features are divided into three levels: the platform, including online, mobile or land-based; the provider or specific venue in which the gambling occurs; and the game or machine characteristics, such as graphical themes and bonus features. This framework will contribute to the gambling field by providing a manner in which to systematically explore the environment surrounding EGM gambling and how it affects behaviour.
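Tversky's Elimination-by-Aspects model, which the HGC builds on, has a compact recursive form: an aspect is selected with probability proportional to its weight, options lacking that aspect are eliminated, and the process repeats. The sketch below implements that recursion; the EGM options and aspect weights are hypothetical examples, not data from the paper.

```python
def eba_prob(options, u, choice_set=None):
    """Tversky's Elimination-by-Aspects choice probabilities.
    options: dict name -> set of aspects; u: dict aspect -> positive weight."""
    if choice_set is None:
        choice_set = list(options)
    if len(choice_set) == 1:
        return {choice_set[0]: 1.0}
    common = set.intersection(*(options[o] for o in choice_set))
    # aspects that discriminate among the remaining options
    disc = {a for o in choice_set for a in options[o]} - common
    if not disc:                       # identical options: choose uniformly
        return {o: 1.0 / len(choice_set) for o in choice_set}
    denom = sum(u[a] for a in disc)
    probs = {o: 0.0 for o in choice_set}
    for a in disc:
        keep = [o for o in choice_set if a in options[o]]   # eliminate the rest
        for o, p in eba_prob(options, u, keep).items():
            probs[o] += (u[a] / denom) * p
    return probs

# a gambler choosing among EGM offerings (hypothetical aspects and weights)
opts = {
    "online_slots": {"online", "bonus"},
    "venue_slots":  {"venue", "bonus"},
    "venue_poker":  {"venue"},
}
u = {"online": 1.0, "venue": 2.0, "bonus": 1.0}
p = eba_prob(opts, u)
```

Note how `venue_poker`, which offers no aspect that some other option lacks, is never chosen: a standard (and sometimes counterintuitive) property of EBA.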
A Hierarchical Approach to Fracture Mechanics
NASA Technical Reports Server (NTRS)
Saether, Erik; Taasan, Shlomo
2004-01-01
Recent research conducted under NASA LaRC's Creativity and Innovation Program has led to the development of an initial approach for a hierarchical fracture mechanics. This methodology unites failure mechanisms occurring at different length scales and provides a framework for a physics-based theory of fracture. At the nanoscale, parametric molecular dynamic simulations are used to compute the energy associated with atomic level failure mechanisms. This information is used in a mesoscale percolation model of defect coalescence to obtain statistics of fracture paths and energies through Monte Carlo simulations. The mathematical structure of predicted crack paths is described using concepts of fractal geometry. The non-integer fractal dimension relates geometric and energy measures between meso- and macroscales. For illustration, a fractal-based continuum strain energy release rate is derived for inter- and transgranular fracture in polycrystalline metals.
NASA Astrophysics Data System (ADS)
He, Juan; Lu, Xingping; Yu, Jie; Wang, Li; Song, Yonghai
2016-07-01
A novel Co(OH)2/glassy carbon electrode (GCE) has been fabricated via a metal-organic framework (MOF)-directed method. In this strategy, the Co(BTC) (BTC = 1,3,5-benzenetricarboxylic acid) MOFs/GCE was first prepared by alternately immersing the GCE in Co2+ and BTC solutions, following a layer-by-layer method. Then, the Co(OH)2 with hierarchical flake nanostructure/GCE was constructed by immersing the Co(BTC) MOFs/GCE in 0.1 M NaOH solution at room temperature. This strategy greatly improves the distribution of the hierarchical Co(OH)2 nanostructures on the electrode surface, enhances the stability of the nanomaterials on the electrode, and increases the use efficiency of the Co(OH)2 nanostructures. Scanning electron microscopy, energy dispersive X-ray spectroscopy, X-ray powder diffraction, Fourier transform infrared spectroscopy, and Raman spectra were used to characterize the Co(BTC) MOFs/GCE and Co(OH)2/GCE. Based on the hierarchical Co(OH)2 nanostructures/GCE, a novel and sensitive nonenzymatic glucose sensor was developed. The good performance of the resulting sensor toward the detection of glucose was ascribed to the hierarchical flake nanostructures, good mechanical stability, excellent distribution, and large specific surface area of the Co(OH)2 nanostructures. The proposed preparation method is simple, efficient, and cheap.
NASA Astrophysics Data System (ADS)
Berliner, M.
2017-12-01
Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable for quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.
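The decision-theoretic template referred to above is: choose the action minimizing posterior expected loss. A minimal sea-level-rise flavored sketch follows; the posterior probabilities, costs, and loss function are entirely invented for illustration.

```python
# hypothetical posterior over sea-level rise (metres) by some horizon
posterior = {0.3: 0.5, 0.6: 0.3, 1.0: 0.2}

def expected_loss(height):
    """Posterior expected loss of building protection to `height`:
    a linear construction cost plus a fixed damage incurred whenever
    the realized rise exceeds the protection (toy loss function)."""
    build = 10.0 * height
    damage = sum(p * (100.0 if rise > height else 0.0)
                 for rise, p in posterior.items())
    return build + damage

heights = [0.3, 0.6, 1.0]
best = min(heights, key=expected_loss)
```

The hierarchical-modeling point of the abstract is that the `posterior` input is the hard part; the decision rule itself is one line.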
Availability-Based Importance Framework for Supplier Selection
2015-04-30
IMA Journal of Management Math, 15(2), 161–174. Chen, C.-T., Lin, C.-T., & Huang, S.-F. (2006). A fuzzy approach for supplier evaluation and... reliability modeling: Principles and applications. Hoboken, NJ: Wiley. Liao, C.-N., & Kao, H.-P. (2011). An integrated fuzzy TOPSIS and MCGP approach to... 5307–5326. Wang, J.-W., Cheng, C.-H., & Huang, K.-C. (2009). Fuzzy hierarchical TOPSIS for supplier selection. Applied Soft Computing, 9(1), 377
Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Liu, Rentao; Wang, Peng
2016-06-05
In emergency management for pollution accidents, the efficiency of emergency rescues is deeply influenced by a reasonable assignment of the available emergency materials to the related risk sources. In this study, a two-stage optimization framework is developed for emergency material reserve layout planning under uncertainty, to identify material warehouse locations and emergency material reserve schemes in the pre-accident phase for coping with potential environmental accidents. This framework is based on an integration of a hierarchical clustering analysis - improved center of gravity (HCA-ICG) model and a material warehouse location - emergency material allocation (MWL-EMA) model. First, decision alternatives are generated using HCA-ICG to identify newly built emergency material warehouses for risk sources that cannot be served by existing ones in a time-effective manner. Second, an emergency material reserve plan is obtained using MWL-EMA so that emergency materials are prepared in advance in a cost-effective manner. The optimization framework is then applied to emergency management system planning in Jiangsu province, China. The results demonstrate that the developed framework not only facilitates material warehouse selection but also effectively provides emergency materials for quick-response emergency operations. Copyright © 2016. Published by Elsevier B.V.
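The center-of-gravity component of the HCA-ICG stage has a classical closed form: the candidate warehouse location is the demand-weighted centroid of the risk sources it serves. A minimal sketch follows; the coordinates and demand weights are invented, and the paper's clustering step and improvements to the basic formula are omitted.

```python
def center_of_gravity(sites):
    """Demand-weighted centroid: the classical center-of-gravity
    candidate location for a new material warehouse.
    sites: list of (x, y, demand) for the risk sources in one cluster."""
    w = sum(d for _, _, d in sites)
    x = sum(xi * d for xi, _, d in sites) / w
    y = sum(yi * d for _, yi, d in sites) / w
    return x, y

# one cluster of risk sources (hypothetical grid coordinates and demands)
risk_sources = [(0.0, 0.0, 10.0), (4.0, 0.0, 30.0), (0.0, 8.0, 10.0)]
cx, cy = center_of_gravity(risk_sources)
```

The centroid is pulled toward the heavily weighted source at (4, 0), which is the intended behavior: high-demand risk sources get shorter response distances.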
Integrated urban water cycle management: the UrbanCycle model.
Hardy, M J; Kuczera, G; Coombes, P J
2005-01-01
Integrated urban water cycle management presents a new framework in which solutions to the provision of urban water services can be sought. It enables new and innovative solutions currently constrained by the existing urban water paradigm to be implemented. This paper introduces the UrbanCycle model. The model is being developed in response to the growing and changing needs of the water management sector and in light of the need for tools to evaluate integrated water cycle management approaches. The key concepts underpinning the UrbanCycle model are the adoption of continuous simulation, hierarchical network modelling, and the careful management of computational complexity. The paper reports on the integration of modelling capabilities across the allotment and subdivision scales, enabling the interactions between these scales to be explored. A case study illustrates the impacts of various mitigation measures possible under an integrated water management framework. The temporal distribution of runoff into ephemeral streams from a residential allotment in Western Sydney is evaluated and linked to the geomorphic and ecological regimes in receiving waters.
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.
NASA Astrophysics Data System (ADS)
Rudzinski, Joseph F.
Atomically-detailed molecular dynamics simulations have emerged as one of the most powerful theoretical tools for studying complex, condensed-phase systems. Despite their ability to provide incredible molecular insight, these simulations are insufficient for investigating complex biological processes, e.g., protein folding or molecular aggregation, on relevant length and time scales. The increasing scope and sophistication of atomically-detailed models have motivated the development of "hierarchical" approaches, which parameterize a low resolution, coarse-grained (CG) model based on simulations of an atomically-detailed model. The utility of hierarchical CG models depends on their ability to accurately incorporate the correct physics of the underlying model. One approach for ensuring this "consistency" between the models is to parameterize the CG model to reproduce the structural ensemble generated by the high resolution model. The many-body potential of mean force is the proper CG energy function for reproducing all structural distributions of the atomically-detailed model, at the CG level of resolution. However, this CG potential is a configuration-dependent free energy function that is generally too complicated to represent or simulate. The multiscale coarse-graining (MS-CG) method employs a generalized Yvon-Born-Green (g-YBG) relation to directly determine a variationally optimal approximation to the many-body potential of mean force. The MS-CG/g-YBG method provides a convenient and transparent framework for investigating the equilibrium structure of the system, at the CG level of resolution. In this work, we investigate the fundamental limitations and approximations of the MS-CG/g-YBG method. Throughout the work, we propose several theoretic constructs to directly relate the MS-CG/g-YBG method to other popular structure-based CG approaches.
We investigate the physical interpretation of the MS-CG/g-YBG correlation matrix, the quantity responsible for disentangling the various contributions to the average force on a CG site. We then employ an iterative extension of the MS-CG/g-YBG method that improves the accuracy of a particular set of low order correlation functions relative to the original MS-CG/g-YBG model. We demonstrate that this method provides a powerful framework for identifying the precise source of error in an MS-CG/g-YBG model. We then propose a method for identifying an optimal CG representation, prior to the development of the CG model. We employ these techniques together to demonstrate that in the cases where the MS-CG/g-YBG method fails to determine an accurate model, a fundamental problem likely exists with the chosen CG representation or interaction set. Additionally, we explicitly demonstrate that while the iterative model successfully improves the accuracy of the low order structure, it does so by distorting the higher order structural correlations relative to the underlying model. Finally, we apply these methods to investigate the utility of the MS-CG/g-YBG method for developing models for systems with complex intramolecular structure. Overall, our results demonstrate the power of the g-YBG framework for developing accurate CG models and for investigating the driving forces of equilibrium structures for complex condensed-phase systems. This work also explicitly motivates future development of bottom-up CG methods and highlights some outstanding problems in the field.
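At its core, the MS-CG/g-YBG variational step is a linear least-squares problem: project the atomistic forces onto a CG basis by solving normal equations G c = b, where G is the basis force-correlation matrix. The sketch below solves a tiny two-basis instance on synthetic data; the basis features, true coefficients, and noise are invented, and real MS-CG calculations involve many basis functions and configurational averaging.

```python
import random

random.seed(2)

def fit_forces(features, forces):
    """Least-squares force matching for a two-function basis:
    accumulate the normal equations G c = b (G: basis correlation
    matrix, b: basis-force projections) and solve by Cramer's rule."""
    G = [[0.0, 0.0], [0.0, 0.0]]
    b = [0.0, 0.0]
    for phi, f in zip(features, forces):
        for i in range(2):
            b[i] += phi[i] * f
            for j in range(2):
                G[i][j] += phi[i] * phi[j]
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    c0 = (b[0] * G[1][1] - b[1] * G[0][1]) / det
    c1 = (b[1] * G[0][0] - b[0] * G[1][0]) / det
    return c0, c1

# synthetic "atomistic" forces: true coefficients (3, -2) plus noise
feats = [(random.random(), random.random()) for _ in range(200)]
obs = [3.0 * a - 2.0 * b + random.gauss(0.0, 0.05) for a, b in feats]
c = fit_forces(feats, obs)
```

The correlation matrix G here plays the role the abstract ascribes to the MS-CG/g-YBG correlation matrix: it disentangles the overlapping basis contributions to the observed forces.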
Construction of hierarchically porous metal–organic frameworks through linker labilization
Yuan, Shuai; Zou, Lanfang; Qin, Jun-Sheng; ...
2017-05-25
One major goal of metal–organic framework (MOF) research is the expansion of pore size and volume. Although many approaches have been attempted to increase the pore size of MOF materials, it is still a challenge to construct MOFs with precisely customized pore apertures for specific applications. We present a new method, namely linker labilization, to increase the MOF porosity and pore size, giving rise to hierarchical-pore architectures. Microporous MOFs with robust metal nodes and pro-labile linkers were initially synthesized. The mesopores were subsequently created as crystal defects through the splitting of a pro-labile linker and the removal of the linker fragments by acid treatment. We also demonstrate that the linker labilization method can create controllable hierarchical porous structures in stable MOFs, which facilitates the diffusion and adsorption of guest molecules and improves the performance of MOFs in adsorption and catalysis.
Construction of hierarchically porous metal-organic frameworks through linker labilization
NASA Astrophysics Data System (ADS)
Yuan, Shuai; Zou, Lanfang; Qin, Jun-Sheng; Li, Jialuo; Huang, Lan; Feng, Liang; Wang, Xuan; Bosch, Mathieu; Alsalme, Ali; Cagin, Tahir; Zhou, Hong-Cai
2017-05-01
A major goal of metal-organic framework (MOF) research is the expansion of pore size and volume. Although many approaches have been attempted to increase the pore size of MOF materials, it is still a challenge to construct MOFs with precisely customized pore apertures for specific applications. Herein, we present a new method, namely linker labilization, to increase the MOF porosity and pore size, giving rise to hierarchical-pore architectures. Microporous MOFs with robust metal nodes and pro-labile linkers were initially synthesized. The mesopores were subsequently created as crystal defects through the splitting of a pro-labile linker and the removal of the linker fragments by acid treatment. We demonstrate that the linker labilization method can create controllable hierarchical porous structures in stable MOFs, which facilitates the diffusion and adsorption of guest molecules and improves the performance of MOFs in adsorption and catalysis.
2010-01-01
Changes to the glycosylation profile on HIV gp120 can influence viral pathogenesis and alter AIDS disease progression. The characterization of glycosylation differences at the sequence level is inadequate as the placement of carbohydrates is structurally complex. However, no structural framework is available to date for the study of HIV disease progression. In this study, we propose a novel machine-learning based framework for the prediction of AIDS disease progression in three stages (RP, SP, and LTNP) using the HIV structural gp120 profile. This new intelligent framework proves to be accurate and provides an important benchmark for predicting AIDS disease progression computationally. The model is trained using a novel HIV gp120 glycosylation structural profile to detect possible stages of AIDS disease progression for the target sequences of HIV+ individuals. The performance of the proposed model was compared to seven different existing machine-learning models on the newly proposed gp120-Benchmark_1 dataset in terms of error-rate (MSE), accuracy (CCI), stability (STD), and complexity (TBM). The novel framework showed better predictive performance with 67.82% CCI, 30.21 MSE, 0.8 STD, and 2.62 TBM on the three stages of AIDS disease progression of 50 HIV+ individuals. This framework is an invaluable bioinformatics tool that will be useful to the clinical assessment of viral pathogenesis. PMID:21143806
Liu, Dan; Liu, Xuejun; Wu, Yiguang
2018-04-24
This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local details in the image, the relative depth trends of local regions are incorporated into the network. Combined with semantic information of the image, a continuous pairwise CRF is then established and is used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and that it obtains satisfactory results.
Flexibility and rigidity of cross-linked Straight Fibrils under axial motion constraints.
Nagy Kem, Gyula
2016-09-01
Straight fibrils are stiff rod-like filaments that play a significant role in cellular processes such as structural stability and intracellular transport. Introducing a 3D mechanical model for the motion of braced cylindrical fibrils under axial motion constraints, we provide a mechanism and a graph-theoretical model for fibril structures and characterize the flexibility and rigidity of this bar-and-joint spatial framework. The connectedness and the circuits of the bracing graph characterize the flexibility of these structures. In this paper, we focus on the kinematical properties of hierarchical levels of fibrils and evaluate the number of bracing elements required for rigidity and its computational complexity. The presented model is a good characterization of the frameworks of bio-fibrils such as microtubules and cellulose, which inspired this work. Copyright © 2016 Elsevier Ltd. All rights reserved.
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to compare existing usability data to ideal goals or to that of other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments during the evaluation process. This paper presents a universal method of usability evaluation combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, with the use of AHP, the weights of usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
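The core aggregation step this abstract describes combines AHP-derived criterion weights with a fuzzy membership matrix into a single appraisal vector. A minimal sketch, with invented weights and panel judgments for illustration:

```python
# Minimal sketch of the fuzzy comprehensive evaluation step: B = W . R.
# The weights and membership values below are invented for illustration.
def fuzzy_evaluate(weights, membership):
    """Combine AHP criterion weights with a fuzzy membership matrix.

    weights    -- AHP-derived weight per usability criterion, summing to 1
    membership -- rows: criteria; columns: degree of membership in each
                  appraisal grade (e.g. poor / fair / good / excellent)
    Returns the aggregated membership vector over the grades.
    """
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

if __name__ == "__main__":
    # Hypothetical judgments for effectiveness, efficiency, satisfaction.
    weights = [0.5, 0.3, 0.2]
    membership = [
        [0.1, 0.2, 0.5, 0.2],   # effectiveness
        [0.0, 0.3, 0.4, 0.3],   # efficiency
        [0.2, 0.2, 0.4, 0.2],   # satisfaction
    ]
    b = fuzzy_evaluate(weights, membership)
    grade = max(range(len(b)), key=b.__getitem__)  # maximum-membership rule
    print([round(x, 2) for x in b], grade)
```

The final appraisal is then read off by the maximum-membership rule (or a weighted-grade score), which is where the method keeps the full fuzzy vector rather than collapsing judgments prematurely.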
NASA Astrophysics Data System (ADS)
Bao, Cheng; Cai, Ningsheng; Croiset, Eric
2011-10-01
Following our integrated hierarchical modeling framework for natural gas internal reforming solid oxide fuel cells (IRSOFC), this paper first introduces the model libraries of the main balancing units, including some state-of-the-art achievements and our specific work. Based on gPROMS programming code, flexible configuration and modular design are fully realized by specifying graphically all unit models at each level. Via comparison with steady-state experimental data from the Siemens-Westinghouse demonstration system, the in-house multi-level SOFC-gas turbine (GT) simulation platform is shown to be more accurate than the advanced power system analysis tool (APSAT). Moreover, some units of the demonstration system are designed in reverse for analysis of a typical part-load transient process. The framework of distributed and dynamic modeling in most units is significant for the future development of control strategies.
Fingelkurts, Andrew A; Fingelkurts, Alexander A; Neves, Carlos F H
2012-01-05
Instead of using low-level neurophysiology mimicking and exploratory programming methods commonly used in the machine consciousness field, the hierarchical operational architectonics (OA) framework of brain and mind functioning proposes an alternative conceptual-theoretical framework as a new direction in the area of model-driven machine (robot) consciousness engineering. The unified brain-mind theoretical OA model explicitly captures (though in an informal way) the basic essence of brain functional architecture, which indeed constitutes a theory of consciousness. The OA describes the neurophysiological basis of the phenomenal level of brain organization. In this context the problem of producing man-made "machine" consciousness and "artificial" thought is a matter of duplicating all levels of the operational architectonics hierarchy (with its inherent rules and mechanisms) found in the brain electromagnetic field. We hope that the conceptual-theoretical framework described in this paper will stimulate the interest of mathematicians and/or computer scientists to abstract and formalize principles of hierarchy of brain operations which are the building blocks for phenomenal consciousness and thought. Copyright © 2010 Elsevier B.V. All rights reserved.
Hydrothermal preparation of hierarchical ZIF-L nanostructures for enhanced CO2 capture.
Ding, Bing; Wang, Xianbiao; Xu, Yongfei; Feng, Shaojie; Ding, Yi; Pan, Yang; Xu, Weifan; Wang, Huanting
2018-06-01
A zeolitic imidazolate framework (ZIF-L) with hierarchical morphology was synthesized through a hydrothermal method. The hierarchical product consists of ZIF-L leaves several micrometers in length, 1–2 μm in width and ∼300 nm in thickness, cross-connected symmetrically. It was found that the hydrothermal temperature is crucial for the formation of such a hierarchical nanostructure. The formation mechanism was found to be a secondary crystal growth process. The hierarchical ZIF-L has a larger surface area compared with the two-dimensional (2D) ZIF-L leaves. Subsequently, the hierarchical ZIF-L exhibited enhanced CO2 adsorption capacity (1.56 mmol·g−1) as compared with that of the reported two-dimensional ZIF-L leaves (0.94 mmol·g−1). This work not only reveals a new strategy for the formation of hierarchical ZIF-L nanostructures, but also supplies a potential material for CO2 capture. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine them with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.
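The two steps described in this abstract, pooling ex-situ estimates from similar sites into an informative prior and then assimilating in-situ data, can be sketched with a simple normal-normal conjugate model. The normal assumption and all values below are illustrative, not the paper's exact hierarchical model:

```python
# Sketch of the two-step prior construction: pool parameter estimates
# from hydrogeologically similar sites, then update with in-situ data.
# The normal-normal model and the numbers are illustrative assumptions.
def informative_prior(similar_site_means):
    """Prior mean/variance from ex-situ site-level estimates."""
    n = len(similar_site_means)
    mu = sum(similar_site_means) / n
    var = sum((x - mu) ** 2 for x in similar_site_means) / (n - 1)
    return mu, var

def posterior(prior_mu, prior_var, data, noise_var):
    """Conjugate normal update with in-situ measurements."""
    n = len(data)
    precision = 1.0 / prior_var + n / noise_var
    mean = (prior_mu / prior_var + sum(data) / noise_var) / precision
    return mean, 1.0 / precision

if __name__ == "__main__":
    # Hypothetical log10 hydraulic conductivities at four similar sites.
    prior_mu, prior_var = informative_prior([-4.1, -3.8, -4.5, -4.0])
    post_mu, post_var = posterior(prior_mu, prior_var,
                                  [-3.9, -4.0], noise_var=0.25)
    print(round(post_mu, 3), round(post_var, 3))
```

Even two in-situ measurements shrink the posterior variance below the ex-situ prior variance, which is the practical payoff of transferring information from similar sites in data-scarce settings.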
Kaewkamnerdpong, Issarapong; Krisdapong, Sudaduang
2018-06-01
To assess the hierarchical associations between children's school performance and condition-specific (CS) oral health-related quality of life (OHRQoL), school absence, oral status, sociodemographic and economic status (SDES) and social capital; and to investigate the associations between CS OHRQoL and related oral status, adjusting for SDES and social capital. Data on 925 sixth grade children in Sakaeo province, Thailand, were collected through oral examinations for dental caries and oral hygiene, social capital questionnaires, OHRQoL interviews using the Child-Oral Impacts on Daily Performances index, parental self-administered questionnaires and school documents. A hierarchical conceptual framework was developed, and independent variables were hierarchically entered into multiple logistic models for CS OHRQoL and linear regression models for school performance. After adjusting for SDES and social capital, children with high DMFT or DT scores were significantly threefold more likely to have CS impacts attributed to dental caries. However, poor oral hygiene was not significantly associated with CS impacts attributed to gingival disease. High DMFT scores were significantly associated with lower school performance, whereas high Simplified Oral Hygiene Index scores were not. The final model showed that CS impacts attributed to dental caries and school absence accounted for the association between DMFT score and school performance. Dental caries was associated with CS impacts on OHRQoL, and exerted its effect on school performance through the CS impacts and school absence. There was no association between oral hygiene and CS impacts on OHRQoL or school performance. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
N-flation with hierarchically light axions in string compactifications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cicoli, Michele; Dutta, Koushik; Maharana, Anshuman, E-mail: mcicoli@ictp.it, E-mail: koushik.dutta@saha.ac.in, E-mail: anshumanmaharana@hri.res.in
2014-08-01
We propose a possible embedding of axionic N-flation in type IIB string compactifications where most of the Kähler moduli are stabilised by perturbative effects, and so are hierarchically heavier than the corresponding N >> 1 axions whose collective dynamics drives inflation. This is achieved in the framework of the LARGE Volume Scenario for moduli stabilisation. Our set-up can be used to realise a model of either large field inflation or quintessence, just by varying the volume of the internal space which controls the scale of the axionic potential. Both cases predict a very high scale of supersymmetry breaking. A fully explicit stringy embedding of N-flation would require control over dangerous back-reaction effects due to a large number of species. A viable reheating of the Standard Model degrees of freedom can be achieved after the end of inflation due to the perturbative decay of the N light axions which drive inflation.
Hierarchical spatiotemporal matrix models for characterizing invasions
Hooten, M.B.; Wikle, C.K.; Dorazio, R.M.; Royle, J. Andrew
2007-01-01
The growth and dispersal of biotic organisms is an important subject in ecology. Ecologists are able to accurately describe survival and fecundity in plant and animal populations and have developed quantitative approaches to study the dynamics of dispersal and population size. Of particular interest are the dynamics of invasive species. Such nonindigenous animals and plants can levy significant impacts on native biotic communities. Effective models for relative abundance have been developed; however, a better understanding of the dynamics of actual population size (as opposed to relative abundance) in an invasion would be beneficial to all branches of ecology. In this article, we adopt a hierarchical Bayesian framework for modeling the invasion of such species while addressing the discrete nature of the data and uncertainty associated with the probability of detection. The nonlinear dynamics between discrete time points are intuitively modeled through an embedded deterministic population model with density-dependent growth and dispersal components. Additionally, we illustrate the importance of accommodating spatially varying dispersal rates. The method is applied to the specific case of the Eurasian Collared-Dove, an invasive species at mid-invasion in the United States at the time of this writing.
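The embedded deterministic process this abstract describes alternates density-dependent growth with redistribution through a dispersal matrix. A toy sketch of one such update (the logistic growth form, dispersal matrix, and parameters are invented for illustration, not the paper's fitted model):

```python
# Illustrative sketch of the embedded deterministic invasion process:
# density-dependent growth at each site, then redistribution through a
# dispersal matrix. All parameter values are invented.
def step(abundance, r, K, dispersal):
    """One time step: logistic-style growth followed by dispersal.

    abundance -- population at each site
    r, K      -- growth rate and carrying capacity
    dispersal -- row-stochastic matrix; dispersal[i][j] is the fraction
                 of site i's population moving to site j
    """
    grown = [n + r * n * (1.0 - n / K) for n in abundance]
    return [sum(grown[i] * dispersal[i][j] for i in range(len(grown)))
            for j in range(len(grown))]

if __name__ == "__main__":
    n = [100.0, 0.0, 0.0]          # invasion starts at site 0
    D = [[0.8, 0.2, 0.0],          # most individuals stay; some spread
         [0.1, 0.8, 0.1],
         [0.0, 0.2, 0.8]]
    for _ in range(10):
        n = step(n, r=0.5, K=500.0, dispersal=D)
    print([round(x, 1) for x in n])
```

In the paper this deterministic kernel sits inside a hierarchical Bayesian model, with the observed counts linked to latent abundance through a detection-probability layer and the dispersal rates allowed to vary spatially.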
A biologically consistent hierarchical framework for self-referencing survivalist computation
NASA Astrophysics Data System (ADS)
Cottam, Ron; Ranson, Willy; Vounckx, Roger
2000-05-01
Extensively scaled formally rational hardware and software are indirectly fallible, at the very least through temporal restrictions on the evaluation of their correctness. In addition, the apparent inability of formal rationality to successfully describe living systems as anything other than inanimate structures suggests that the development of self-referencing computational machines will require a different approach. There is currently a strong movement towards the adoption of semiotics as a descriptive medium in theoretical biology. We present a related computational semiosic construction (1, 2) consistent with evolutionary hierarchical emergence (3), which may serve as a framework for implementing anticipatory-oriented survivalist processing in real environments.
Search algorithm complexity modeling with application to image alignment and matching
NASA Astrophysics Data System (ADS)
DelMarco, Stephen
2014-05-01
Search algorithm complexity modeling, in the form of penetration rate estimation, provides a useful way to estimate search efficiency in application domains which involve searching over a hypothesis space of reference templates or models, as in model-based object recognition, automatic target recognition, and biometric recognition. The penetration rate quantifies the expected portion of the database that must be searched, and is useful for estimating search algorithm computational requirements. In this paper we perform mathematical modeling to derive general equations for penetration rate estimates that are applicable to a wide range of recognition problems. We extend previous penetration rate analyses to use more general probabilistic modeling assumptions. In particular we provide penetration rate equations within the framework of a model-based image alignment application domain in which a prioritized hierarchical grid search is used to rank subspace bins based on matching probability. We derive general equations, and provide special cases based on simplifying assumptions. We show how previously-derived penetration rate equations are special cases of the general formulation. We apply the analysis to model-based logo image alignment in which a hierarchical grid search is used over a geometric misalignment transform hypothesis space. We present numerical results validating the modeling assumptions and derived formulation.
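The penetration-rate idea this abstract builds on can be computed directly for a binned hypothesis space searched in priority order. A minimal sketch in that spirit (the prioritisation rule and numbers are illustrative, not the paper's derived equations):

```python
# Minimal sketch of a penetration-rate estimate: hypothesis-space bins
# are searched in decreasing probability-per-hypothesis order, and the
# expected fraction of the database examined is accumulated.
def penetration_rate(bin_sizes, bin_probs):
    """Expected fraction of hypotheses examined before the search stops.

    bin_sizes -- number of hypotheses in each bin
    bin_probs -- probability the true hypothesis lies in each bin
    """
    total = sum(bin_sizes)
    order = sorted(range(len(bin_sizes)),
                   key=lambda k: bin_probs[k] / bin_sizes[k], reverse=True)
    examined, rate = 0, 0.0
    for k in order:
        examined += bin_sizes[k]
        # Search terminates in bin k with probability bin_probs[k],
        # having examined everything up to and including that bin.
        rate += bin_probs[k] * examined / total
    return rate

if __name__ == "__main__":
    # A sharply prioritised search touches far less of the database
    # than one whose probabilities merely track bin size.
    print(penetration_rate([10, 30, 60], [0.7, 0.2, 0.1]))
```

When the match probability concentrates in small, early-ranked bins the rate drops well below 1, which is exactly the computational saving the hierarchical grid-search prioritisation is meant to deliver.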
Koo, Won-Tae; Jang, Ji-Soo; Qiao, Shaopeng; Hwang, Wontae; Jha, Gaurav; Penner, Reginald M; Kim, Il-Doo
2018-06-13
Here, we propose heterogeneous nucleation-assisted hierarchical growth of metal-organic frameworks (MOFs) for efficient particulate matter (PM) removal. The assembly of two-dimensional (2D) Zn-based zeolite imidazole frameworks (2D-ZIF-L) in deionized water over a period of time produced hierarchical ZIF-L (H-ZIF-L) on hydrophilic substrates. During the assembly, the second nucleation and growth of ZIF-L occurred on the surface of the first ZIF-L, leading to the formation of flowerlike H-ZIF-L on the substrate. The flowerlike H-ZIF-L was easily synthesized on various substrates, namely, glass, polyurethane three-dimensional foam, nylon microfibers, and nonwoven fabrics. We demonstrated H-ZIF-L-assembled polypropylene microfibers as a washable membrane filter with highly efficient PM removal property (92.5 ± 0.8% for PM2.5 and 99.5 ± 0.2% for PM10), low pressure drop (10.5 Pa at 25 L min−1), long-term stability, and superior recyclability. These outstanding particle filtering properties are mainly attributed to the unique structure of the 2D-shaped H-ZIF-L, which is tightly anchored on individual fibers comprising the membrane.
Biological hierarchies and the nature of extinction.
Congreve, Curtis R; Falk, Amanda R; Lamsdell, James C
2018-05-01
Hierarchy theory recognises that ecological and evolutionary units occur in a nested and interconnected hierarchical system, with cascading effects occurring between hierarchical levels. Different biological disciplines have routinely come into conflict over the primacy of different forcing mechanisms behind evolutionary and ecological change. These disconnects arise partly from differences in perspective (with some researchers favouring ecological forcing mechanisms while others favour developmental/historical mechanisms), as well as differences in the temporal framework in which workers operate. In particular, long-term palaeontological data often show that large-scale (macro) patterns of evolution are predominantly dictated by shifts in the abiotic environment, while short-term (micro) modern biological studies stress the importance of biotic interactions. We propose that thinking about ecological and evolutionary interactions in a hierarchical framework is a fruitful way to resolve these conflicts. Hierarchy theory suggests that changes occurring at lower hierarchical levels can have unexpected, complex effects at higher scales due to emergent interactions between simple systems. In this way, patterns occurring on short- and long-term time scales are equally valid, as changes that are driven from lower levels will manifest in different forms at higher levels. We propose that the dual hierarchy framework fits well with our current understanding of evolutionary and ecological theory. Furthermore, we describe how this framework can be used to understand major extinction events better. Multi-generational attritional loss of reproductive fitness (MALF) has recently been proposed as the primary mechanism behind extinction events, whereby extinction is explainable solely through processes that result in extirpation of populations through a shutdown of reproduction. 
While not necessarily explicit, the push to explain extinction through solely population-level dynamics could be used to suggest that environmentally mediated patterns of extinction or slowed speciation across geological time are largely artefacts of poor preservation or a coarse temporal scale. We demonstrate how MALF fits into a hierarchical framework, showing that MALF can be a primary forcing mechanism at lower scales that still results in differential survivorship patterns at the species and clade level which vary depending upon the initial environmental forcing mechanism. Thus, even if MALF is the primary mechanism of extinction across all mass extinction events, the primary environmental cause of these events will still affect the system and result in differential responses. Therefore, patterns at both temporal scales are relevant. © 2017 Cambridge Philosophical Society.
SBML Level 3 package: Hierarchical Model Composition, Version 1 Release 3
Smith, Lucian P.; Hucka, Michael; Hoops, Stefan; Finney, Andrew; Ginkel, Martin; Myers, Chris J.; Moraru, Ion; Liebermeister, Wolfram
2017-01-01
Constructing a model in a hierarchical fashion is a natural approach to managing model complexity, and offers additional opportunities such as the potential to re-use model components. The SBML Level 3 Version 1 Core specification does not directly provide a mechanism for defining hierarchical models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Hierarchical Model Composition package for SBML Level 3 adds the necessary features to SBML to support hierarchical modeling. The package enables a modeler to include submodels within an enclosing SBML model, delete unneeded or redundant elements of that submodel, replace elements of that submodel with elements of the containing model, and replace elements of the containing model with elements of the submodel. In addition, the package defines an optional “port” construct, allowing a model to be defined with suggested interfaces between hierarchical components; modelers can choose to use these interfaces, but they are not required to do so and can still interact directly with model elements if they so choose. Finally, the SBML Hierarchical Model Composition package is defined in such a way that a hierarchical model can be “flattened” to an equivalent, non-hierarchical version that uses only plain SBML constructs, thus enabling software tools that do not yet support hierarchy to nevertheless work with SBML hierarchical models. PMID:26528566
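The flattening concept this abstract describes, folding submodels into the enclosing model while honouring replacements, can be illustrated with a toy recursive sketch. This mimics the idea only; real SBML comp flattening (e.g. in libSBML) handles ports, deletions, and far more machinery:

```python
# Toy sketch of hierarchical-model flattening: submodel elements are
# folded into the enclosing model under prefixed ids, and elements
# replaced by the parent are dropped. Illustrative only; not SBML comp.
def flatten(model):
    """Return a flat {id: value} element map for a nested model dict.

    model -- {"elements": {...}, "submodels": {name: model},
              "replacements": {qualified_submodel_id: parent_id}}
    """
    flat = dict(model.get("elements", {}))
    replaced = set(model.get("replacements", {}))
    for name, sub in model.get("submodels", {}).items():
        for eid, value in flatten(sub).items():
            qualified = f"{name}__{eid}"
            if qualified not in replaced:   # the parent's element wins
                flat[qualified] = value
    return flat

if __name__ == "__main__":
    # One submodel definition re-used twice, with one instance's element
    # replaced by a shared element of the enclosing model.
    cell = {"elements": {"glucose": 5.0}}
    root = {
        "elements": {"shared_glucose": 2.0},
        "submodels": {"cell1": cell, "cell2": cell},
        "replacements": {"cell1__glucose": "shared_glucose"},
    }
    print(sorted(flatten(root)))
```

The key property, also guaranteed by the real package, is that the flattened result is a plain, self-contained model that hierarchy-unaware tools can consume.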
Ball-scale based hierarchical multi-object recognition in 3D medical images
NASA Astrophysics Data System (ADS)
Bağci, Ulas; Udupa, Jayaram K.; Chen, Xinjian
2010-03-01
This paper investigates, using prior shape models and the concept of ball scale (b-scale), ways of automatically recognizing objects in 3D images without performing elaborate searches or optimization. That is, the goal is to place the model in a single shot close to the right pose (position, orientation, and scale) in a given image so that the model boundaries fall in the close vicinity of object boundaries in the image. This is achieved via the following set of key ideas: (a) A semi-automatic way of constructing a multi-object shape model assembly. (b) A novel strategy of encoding, via b-scale, the pose relationship between objects in the training images and their intensity patterns captured in b-scale images. (c) A hierarchical mechanism of positioning the model, in a one-shot way, in a given image from a knowledge of the learnt pose relationship and the b-scale image of the given image to be segmented. The evaluation results on a set of 20 routine clinical abdominal female and male CT data sets indicate the following: (1) Incorporating a large number of objects improves the recognition accuracy dramatically. (2) The recognition algorithm can be thought of as a hierarchical framework in which quick placement of the model assembly constitutes coarse recognition and delineation itself constitutes the finest recognition. (3) Scale yields useful information about the relationship between the model assembly and any given image, such that recognition results in a placement of the model close to the actual pose without any elaborate searches or optimization. (4) Effective object recognition can make delineation most accurate.
Hierarchical State-Space Estimation of Leatherback Turtle Navigation Ability
Mills Flemming, Joanna; Jonsen, Ian D.; Field, Christopher A.
2010-01-01
Remotely sensed tracking technology has revealed remarkable migration patterns that were previously unknown; however, models to optimally use such data have developed more slowly. Here, we present a hierarchical Bayes state-space framework that allows us to combine tracking data from a collection of animals and make inferences at both individual and broader levels. We formulate models that allow the navigation ability of animals to be estimated and demonstrate how information can be combined over many animals to allow improved estimation. We also show how formal hypothesis testing regarding navigation ability can easily be accomplished in this framework. Using Argos satellite tracking data from 14 leatherback turtles, 7 males and 7 females, during their southward migration from Nova Scotia, Canada, we find that the circle of confusion (the radius around an animal's location within which it is unable to determine its location precisely) is approximately 96 km. This estimate suggests that the turtles' navigation does not need to be highly accurate, especially if they are able to use more reliable cues as they near their destination. Moreover, for the 14 turtles examined, there is little evidence to suggest that male and female navigation abilities differ. Because of the minimal assumptions made about the movement process, our approach can be used to estimate and compare navigation ability for many migratory species that are able to carry electronic tracking devices. PMID:21203382
Demirkus, Meltem; Precup, Doina; Clark, James J; Arbel, Tal
2016-06-01
Recent literature shows that facial attributes, i.e., contextual facial information, can be beneficial for improving the performance of real-world applications, such as face verification, face recognition, and image search. Examples of face attributes include gender, skin color, facial hair, etc. How to robustly obtain these facial attributes (traits) is still an open problem, especially in the presence of the challenges of real-world environments: non-uniform illumination conditions, arbitrary occlusions, motion blur and background clutter. What makes this problem even more difficult is the enormous variability presented by the same subject, due to arbitrary face scales, head poses, and facial expressions. In this paper, we focus on the problem of facial trait classification in real-world face videos. We have developed a fully automatic hierarchical and probabilistic framework that models the collective set of frame class distributions and feature spatial information over a video sequence. The experiments are conducted on a large real-world face video database that we have collected, labelled and made publicly available. The proposed method is flexible enough to be applied to any facial classification problem. Experiments on the large, real-world video database McGillFaces [1] of 18,000 video frames reveal that the proposed framework outperforms alternative approaches by up to 16.96% and 10.13% for the facial attributes of gender and facial hair, respectively.
Hierarchical SAPO‐34 Architectures with Tailored Acid Sites using Sustainable Sugar Templates
Miletto, Ivana; Ivaldi, Chiara; Paul, Geo; Chapman, Stephanie; Marchese, Leonardo; Raja, Robert
2018-01-01
In a distinct, bottom-up synthetic methodology, monosaccharides (fructose and glucose) and disaccharides (sucrose) have been used as mesoporogens to template hierarchical SAPO-34 catalysts. Detailed materials characterization, which includes solid-state magic angle spinning NMR and probe-based FTIR, reveals that, although the mesopore dimensions are modified by the identity of the sugar template, the desirable acid characteristics of the microporous framework are retained. When the activity of the hierarchical SAPO-34 catalysts was evaluated in the industrially relevant Beckmann rearrangement, under liquid-phase conditions, the enhanced mass-transport properties of sucrose-templated hierarchical SAPO-34 were found to deliver a superior yield of ε-caprolactam. PMID:29686961
Mohammed, Abdul-Wahid; Xu, Yang; Hu, Haixiao; Agyemang, Brighter
2016-09-21
In novel collaborative systems, cooperative entities combine services to achieve local and global objectives. With the growing pervasiveness of cyber-physical systems, however, such collaboration is hampered by differences in the operations of the cyber and physical objects, and the dynamic formation of collaborative functionality from high-level system goals has become a practical necessity. In this paper, we propose a cross-layer automation and management model for cyber-physical systems. This model captures the dynamic formation of collaborative services pursuing laid-down system goals as an ontology-oriented hierarchical task network. Ontological intelligence provides the semantic technology of this model, and through semantic reasoning, primitive tasks can be dynamically composed from high-level system goals. In dealing with uncertainty, we further propose a novel bridge between hierarchical task networks and Markov logic networks, called the Markov task network. This leverages the efficient inference algorithms of Markov logic networks to reduce both computational and inferential loads in task decomposition. From the results of our experiments, high-precision service composition under uncertainty can be achieved using this approach.
NASA Technical Reports Server (NTRS)
Jung, Jinha; Pasolli, Edoardo; Prasad, Saurabh; Tilton, James C.; Crawford, Melba M.
2014-01-01
Acquiring current, accurate land-use information is critical for monitoring and understanding the impact of anthropogenic activities on natural environments. Remote sensing technologies are of increasing importance because of their capability to acquire information for large areas in a timely manner, enabling decision makers to be more effective in complex environments. Although optical imagery has been shown to be successful for land cover classification, active sensors, such as light detection and ranging (LiDAR), have distinct capabilities that can be exploited to improve classification results. However, utilization of LiDAR data for land cover classification has not been fully exploited. Moreover, spatial-spectral classification has recently gained significant attention, since classification accuracy can be improved by extracting additional information from neighboring pixels. Although spatial information has been widely used for spectral data, less attention has been given to LiDAR data. In this work, a new framework for land cover classification using discrete return LiDAR data is proposed. Pseudo-waveforms are generated from the LiDAR data and processed by hierarchical segmentation. Spatial features are extracted in a region-based way using a new unsupervised strategy for multiple pruning of the segmentation hierarchy. The proposed framework is validated experimentally on a real dataset acquired in an urban area. Better classification results are exhibited by the proposed framework compared to the cases in which basic LiDAR products such as the digital surface model and intensity image are used. Moreover, the proposed region-based feature extraction strategy results in improved classification accuracies in comparison with a more traditional window-based approach.
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.
2003-01-01
An object-relational database management system is an integrated, hybrid, cooperative approach that combines the best practices of both the relational model, utilizing SQL queries, and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
A Hierarchical Biology Concept Framework: A Tool for Course Design
Khodor, Julia; Halme, Dina Gould; Walker, Graham C.
2004-01-01
A typical undergraduate biology curriculum covers a very large number of concepts and details. We describe the development of a Biology Concept Framework (BCF) as a possible way to organize this material to enhance teaching and learning. Our BCF is hierarchical, places details in context, nests related concepts, and articulates concepts that are inherently obvious to experts but often difficult for novices to grasp. Our BCF is also cross-referenced, highlighting interconnections between concepts. We have found our BCF to be a versatile tool for design, evaluation, and revision of course goals and materials. There has been a call for creating Biology Concept Inventories, multiple-choice exams that test important biology concepts, analogous to those in physics, astronomy, and chemistry. We argue that the community of researchers and educators must first reach consensus about not only what concepts are important to test, but also how the concepts should be organized and how that organization might influence teaching and learning. We think that our BCF can serve as a catalyst for community-wide discussion on organizing the vast number of concepts in biology, as a model for others to formulate their own BCFs and as a contribution toward the creation of a comprehensive BCF. PMID:15257339
NASA Astrophysics Data System (ADS)
Schinabeck, C.; Erpenbeck, A.; Härtle, R.; Thoss, M.
2016-11-01
Within the hierarchical quantum master equation (HQME) framework, an approach is presented, which allows a numerically exact description of nonequilibrium charge transport in nanosystems with strong electronic-vibrational coupling. The method is applied to a generic model of vibrationally coupled transport considering a broad spectrum of parameters ranging from the nonadiabatic to the adiabatic regime and including both resonant and off-resonant transport. We show that nonequilibrium effects are important in all these regimes. In particular, in the off-resonant transport regime, the inelastic cotunneling signal is analyzed for a vibrational mode in full nonequilibrium, revealing a complex interplay of different transport processes and deviations from the commonly used G0/2 rule of thumb. In addition, the HQME approach is used to benchmark approximate master equation and nonequilibrium Green's function methods.
Harmonising Nursing Terminologies Using a Conceptual Framework.
Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas
2016-01-01
The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and inter-operability of nursing diagnoses in electronic health records. Findings resulted in all 521 ICNP diagnoses being assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and develop recommendations for improvement of both terminologies.
Sharon E. Clarke; Sandra A. Bryce
1997-01-01
This document presents two spatial scales of a hierarchical, ecoregional framework and provides a connection to both larger and smaller scale ecological classifications. The two spatial scales are subregions (1:250,000) and landscape-level ecoregions (1:100,000), or Level IV and Level V ecoregions. Level IV ecoregions were developed by the Environmental Protection...
NASA Astrophysics Data System (ADS)
Hyo Park, Jung; Min Choi, Kyung; Joon Jeon, Hyung; Jung Choi, Yoon; Ku Kang, Jeung
2015-07-01
Although structures with single functional constructions and micropores have been demonstrated to capture many different molecules, such as carbon dioxide, methane, and hydrogen, with high capacities at low temperatures, their weak interactions still limit practical applications at room temperature. Herein, we report in-situ observation of the growth of hierarchical pores in pomegranate metal-organic frameworks (pmg-MOFs) and their self-sequestering storage mechanism, not observed for pristine MOFs. Hierarchical pores inside the pmg-MOF were directly observed by in-situ growth X-ray measurements, while the self-sequestering storage mechanism was revealed by in-situ gas-sorption X-ray analysis and molecular dynamics simulations. The results show that meso/macropores are created at the early stage of crystal growth and then enclosed by microporous crystalline shells, where the hierarchical pores network under the self-sequestering mechanism to give enhanced gas storage. The pmg-MOF exhibits higher CO2 (39%) and CH4 (14%) storage capacity than the pristine MOF at room temperature, in addition to fast kinetics with robust capacity retention over gas-sorption cycles, providing a route to controlling the dynamic behavior of gas adsorption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chao; Xu, Zhijie; Lai, Kevin
The first part of this paper (Part 1) presents a numerical model for non-reactive physical mass transfer across a wetted wall column (WWC). In Part 2, we improve the existing computational fluid dynamics (CFD) model to simulate chemical absorption occurring in a WWC as a bench-scale study of solvent-based carbon dioxide (CO2) capture. To generate data for WWC model validation, CO2 mass transfer across a monoethanolamine (MEA) solvent was first measured on a WWC experimental apparatus. The numerical model developed in this work can account for both chemical absorption and desorption of CO2 in MEA. In addition, the overall mass transfer coefficient predicted using traditional/empirical correlations is compared with CFD predictions for both steady and wavy falling films. A Bayesian statistical calibration algorithm is adopted to calibrate the reaction rate constants in chemical absorption/desorption of CO2 across a falling film of MEA. The posterior distributions of the two transport properties, i.e., Henry's constant and gas diffusivity in the non-reacting nitrous oxide (N2O)/MEA system obtained from Part 1 of this study, serve as priors for the calibration of CO2 reaction rate constants after using the N2O/CO2 analogy method. The calibrated model can be used to predict CO2 mass transfer in a WWC for a wider range of operating conditions.
A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context
NASA Astrophysics Data System (ADS)
Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul
Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives, or small changes in the extended enterprise configuration require a corresponding adjustment of its decision system. A DSS in this context must be flexible and agile, allowing easy and rapid adaptation to the new context. This paper proposes extending the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSS in the Extended Enterprise context is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework helps overcome some interoperability barriers, identifies and organizes components for a DSS in an Extended Enterprise context, and contributes to the definition of an architecture to be used in the design process of a flexible DSS in the Extended Enterprise context that can reuse components for future Extended Enterprise configurations.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
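The abstract does not reproduce the full hierarchical model. As a toy illustration of why Bayesian pooling of several model estimates reduces uncertainty, a conjugate normal-normal update (treating each model's flood estimate as a noisy observation of a common magnitude; all names and numbers below are assumptions, not from the paper):

```python
def pooled_estimate(estimates, variances, prior_mean, prior_var):
    """Precision-weighted posterior for a common underlying magnitude given
    several noisy model estimates. A toy stand-in for the paper's full
    hierarchy, which also models shared multi-model discrepancy."""
    precision = 1.0 / prior_var + sum(1.0 / v for v in variances)
    mean = (prior_mean / prior_var
            + sum(e / v for e, v in zip(estimates, variances))) / precision
    return mean, 1.0 / precision  # posterior mean and variance
```

Note that the posterior variance is always smaller than the smallest individual model variance, which is the uncertainty-reduction effect the abstract describes.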
Creating the environment for driver distraction: A thematic framework of sociotechnical factors.
Parnell, Katie J; Stanton, Neville A; Plant, Katherine L
2018-04-01
As modern society becomes more reliant on technology, its use within the vehicle is becoming a concern for road safety due to both portable and built-in devices offering sources of distraction. While the effects of distracting technologies are well documented, little is known about the causal factors that lead to drivers' engagement with technological devices. The relevance of the sociotechnical system within which the behaviour occurs requires further research. This paper presents two experiments; the first aims to assess drivers' self-reported decisions to engage with technological tasks while driving and their reasoning for doing so with respect to the wider sociotechnical system. This utilised a semi-structured interview method, conducted with 30 drivers to initiate a discussion on their likelihood of engaging with 22 different tasks across 7 different road types. Inductive thematic analysis provided a hierarchical thematic framework that detailed the self-reported causal factors that influence drivers' use of technology whilst driving. The second experiment assessed the relevance of the hierarchical framework to a model of distraction that was established from the literature on drivers' use of distracting technologies while driving. The findings provide validation for some relationships studied in the literature, as well as providing insights into relationships that require further study. The role of the sociotechnical system in driver engagement with distractions is highlighted, with the causal factors reported by drivers suggesting the importance of considering the wider system within which the behaviour is occurring and how it may be creating the conditions for distraction to occur. This supports previous claims made within the literature-based model. Recommendations are proposed that encourage a movement away from individually focused countermeasures towards systemic actors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn; Lin, Guang, E-mail: guanglin@purdue.edu
2016-07-15
In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.
A hierarchical model for estimating density in camera-trap studies
Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.
2009-01-01
Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km² during 2004. Synthesis and applications. Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential 'holes' in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based 'captures' of individual animals.
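Spatial capture–recapture models of this kind typically link detection to the distance between an animal's latent activity centre and each trap. A hedged sketch of the standard half-normal encounter model (the abstract does not state which encounter function the paper uses; parameter values here are illustrative):

```python
import math

def detection_prob(center, traps, p0=0.3, sigma=1.5):
    """Half-normal encounter model common in spatial capture-recapture:
    detection probability at a trap decays with squared distance from the
    animal's activity centre. p0 is baseline detection, sigma the spatial
    scale of movement. Illustrative values only."""
    probs = []
    for (tx, ty) in traps:
        d2 = (tx - center[0]) ** 2 + (ty - center[1]) ** 2
        probs.append(p0 * math.exp(-d2 / (2.0 * sigma ** 2)))
    return probs
```

Because detection depends on where the animal lives relative to the array, this construction directly addresses the exposure heterogeneity and 'holes in the array' problems the abstract mentions.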
Lopez, Michael J; Schuckers, Michael
2017-05-01
Roughly 14% of regular season National Hockey League games since the 2005-06 season have been decided by a shoot-out, and the resulting allocation of points has impacted play-off races each season. But despite interest from fans, players and league officials, there is little in the way of published research on team or individual shoot-out performance. This manuscript attempts to fill that void. We present both generalised linear mixed model and Bayesian hierarchical model frameworks to model shoot-out outcomes, with results suggesting that there are (i) small but statistically significant talent gaps between shooters, (ii) marginal differences in performance among netminders and (iii) few, if any, predictors of player success after accounting for individual talent. We also provide a resampling strategy to highlight a selection bias with respect to shooter assignment, in which coaches choose their most skilled offensive players early in shoot-out rounds and are less likely to select players with poor past performances. Finally, given that per-shot data for shoot-outs do not currently exist in a single location for public use, we provide both our data and source code for other researchers interested in studying shoot-out outcomes.
A sustainable system of systems approach: a new HFE paradigm.
Thatcher, Andrew; Yeow, Paul H P
2016-01-01
Sustainability issues such as natural resource depletion, pollution and poor working conditions have no geographical boundaries in our interconnected world. To address these issues requires a paradigm shift within human factors and ergonomics (HFE), to think beyond a bounded, linear model understanding towards a broader systems framework. For this reason, we introduce a sustainable system of systems model that integrates the current hierarchical conceptualisation of possible interventions (i.e., micro-, meso- and macro-ergonomics) with important concepts from the sustainability literature, including the triple bottom line approach and the notion of time frames. Two practical examples from the HFE literature are presented to illustrate the model. The implications of this paradigm shift for HFE researchers and practitioners are discussed and include the long-term sustainability of the HFE community and comprehensive solutions to problems that consider the emergent issues that arise from this interconnected world. A sustainable world requires a broader systems thinking than that which currently exists in ergonomics. This study proposes a sustainable system of systems model that incorporates ideas from the ecological sciences, notably a nested hierarchy of systems and a hierarchical time dimension. The implications for sustainable design and the sustainability of the HFE community are considered.
Hierarchical Probabilistic Inference of Cosmic Shear
NASA Astrophysics Data System (ADS)
Schneider, Michael D.; Hogg, David W.; Marshall, Philip J.; Dawson, William A.; Meyers, Joshua; Bard, Deborah J.; Lang, Dustin
2015-07-01
Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.
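The importance-sampling step the abstract describes reuses samples drawn under a local (per-area) distribution to compute expectations under the global model. A minimal, generic sketch of self-normalised importance weighting (the paper's actual densities and variable names are not given here):

```python
import math

def importance_reweight(samples, logp_target, logp_proposal):
    """Self-normalised importance weights: reuse samples drawn from a
    proposal distribution to estimate expectations under a target
    distribution, as in separating small-area modeling from global
    inference. Returns weights summing to one."""
    logw = [logp_target(x) - logp_proposal(x) for x in samples]
    m = max(logw)                       # subtract max for numerical stability
    w = [math.exp(lw - m) for lw in logw]
    s = sum(w)
    return [wi / s for wi in w]
```

A weighted average of any quantity over the samples then approximates its expectation under the target, without re-running the expensive per-area modeling.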
Beyond Recall in Reading Comprehension: Five Key Planning Decisions.
ERIC Educational Resources Information Center
Sinatra, Richard; Annacone, Dominic
Over the years, teacher questions have consistently aimed at literal comprehension, indicating that teachers lack understanding of the reading-thinking-questioning hierarchy. Benjamin Bloom's "Cognitive Taxonomy" can serve as a hierarchical framework for the design of questions. Within this framework, a teacher can confront decision…
Sun, Mengshu; Xue, Yuankun; Bogdan, Paul; Tang, Jian; Wang, Yanzhi; Lin, Xue
2018-01-01
Recently, a new approach has been introduced that leverages and over-provisions energy storage devices (ESDs) in data centers for performing power capping and facilitating capex/opex reductions, without performance overhead. To fully realize the potential benefits of the hierarchical ESD structure, we propose a comprehensive design, control, and provisioning framework including (i) designing power delivery architecture supporting hierarchical ESD structure and hybrid ESDs for some levels, as well as (ii) control and provisioning of the hierarchical ESD structure including run-time ESD charging/discharging control and design-time determination of ESD types, homogeneous/hybrid options, ESD provisioning at each level. Experiments have been conducted using real Google data center workloads based on realistic data center specifications.
NASA Astrophysics Data System (ADS)
Chen, Jing; Hong, Min; Chen, Jiafu; Hu, Tianzhao; Xu, Qun
2018-06-01
Porous amorphous carbons, with a large number of defects and dangling bonds, show great potential for energy-storage applications due to their high specific surface area and strong adsorption properties, but poor conductivity and pore connectivity limit their practical application. Here, a few-layer graphene framework with high electrical conductivity is embedded, and a hierarchical porous structure simultaneously constructed, in amorphous hollow carbon spheres (HCSs) by catalysis of angstrom-scale Fe clusters, which are loaded into the interior of crosslinked polystyrene via a novel method. These unique HCSs effectively integrate the inherent properties of two-dimensional sp2-hybridized carbon, porous amorphous carbon, a hierarchical pore structure, and a thin shell, leading to a high specific capacitance of up to 561 F g-1 at a current density of 0.5 A g-1 as a supercapacitor electrode with excellent recyclability, much higher than the capacitances of other porous carbon materials reported to date.
Databases for multilevel biophysiology research available at Physiome.jp.
Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki
2015-01-01
Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.
Wang, Xiao; Gu, Jinghua; Hilakivi-Clarke, Leena; Clarke, Robert; Xuan, Jianhua
2017-01-15
The advent of high-throughput DNA methylation profiling techniques has enabled the possibility of accurate identification of differentially methylated genes for cancer research. The large number of measured loci facilitates whole genome methylation study, yet poses great challenges for differential methylation detection due to the high variability in tumor samples. We have developed a novel probabilistic approach, Differential Methylation detection using a hierarchical Bayesian model exploiting Local Dependency (DM-BLD), to detect differentially methylated genes based on a Bayesian framework. The DM-BLD approach features a joint model to capture both the local dependency of measured loci and the dependency of methylation change in samples. Specifically, the local dependency is modeled by Leroux conditional autoregressive structure; the dependency of methylation changes is modeled by a discrete Markov random field. A hierarchical Bayesian model is developed to fully take into account the local dependency for differential analysis, in which differential states are embedded as hidden variables. Simulation studies demonstrate that DM-BLD outperforms existing methods for differential methylation detection, particularly when the methylation change is moderate and the variability of methylation in samples is high. DM-BLD has been applied to breast cancer data to identify important methylated genes (such as polycomb target genes and genes involved in transcription factor activity) associated with breast cancer recurrence. A Matlab package of DM-BLD is available at http://www.cbil.ece.vt.edu/software.htm CONTACT: Xuan@vt.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
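The Leroux conditional autoregressive (CAR) structure the abstract names has a standard precision-matrix form, Q = tau * (lam * (D - W) + (1 - lam) * I), where W is a binary neighbour adjacency matrix, D its diagonal degree matrix, and lam balances spatial dependence against independence. A minimal sketch (parameter names are ours, and this shows only the prior's precision matrix, not the full DM-BLD model):

```python
import numpy as np

def leroux_precision(W, lam=0.9, tau=1.0):
    """Precision matrix of the Leroux CAR prior used to model local
    dependency among neighbouring loci. lam in [0, 1) guarantees a
    positive-definite (proper) precision matrix."""
    W = np.asarray(W, dtype=float)
    D = np.diag(W.sum(axis=1))            # degree matrix
    n = W.shape[0]
    return tau * (lam * (D - W) + (1.0 - lam) * np.eye(n))
```

The (1 - lam) * I term is what keeps the prior proper even for disconnected neighbourhood graphs, unlike the intrinsic CAR.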
Carle, Adam C; Riley, William; Hays, Ron D; Cella, David
2015-10-01
To guide measure development, National Institutes of Health-supported Patient reported Outcomes Measurement Information System (PROMIS) investigators developed a hierarchical domain framework. The framework specifies health domains at multiple levels. The initial PROMIS domain framework specified that physical function and symptoms such as Pain and Fatigue indicate Physical Health (PH); Depression, Anxiety, and Anger indicate Mental Health (MH); and Social Role Performance and Social Satisfaction indicate Social Health (SH). We used confirmatory factor analyses to evaluate the fit of the hypothesized framework to data collected from a large sample. We used data (n=14,098) from PROMIS's wave 1 field test and estimated domain scores using the PROMIS item response theory parameters. We then used confirmatory factor analyses to test whether the domains corresponded to the PROMIS domain framework as expected. A model corresponding to the domain framework did not provide ideal fit [root mean square error of approximation (RMSEA)=0.13; comparative fit index (CFI)=0.92; Tucker Lewis Index (TLI)=0.88; standardized root mean square residual (SRMR)=0.09]. On the basis of modification indices and exploratory factor analyses, we allowed Fatigue to load on both PH and MH. This model fit the data acceptably (RMSEA=0.08; CFI=0.97; TLI=0.96; SRMR=0.03). Our findings generally support the PROMIS domain framework. Allowing Fatigue to load on both PH and MH improved fit considerably.
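The fit statistics reported above (RMSEA, CFI) follow standard formulas from confirmatory factor analysis; they are not specific to PROMIS. As a quick reference sketch (TLI and SRMR are omitted for brevity):

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Root mean square error of approximation from the model chi-square,
    its degrees of freedom, and sample size n; 0 when chi2 <= df."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_0, df_0):
    """Comparative fit index: improvement of the fitted model over the
    baseline (null) model, bounded to [0, 1]."""
    d_m = max(chi2_m - df_m, 0.0)
    d_0 = max(chi2_0 - df_0, d_m)
    return 1.0 - d_m / d_0 if d_0 > 0 else 1.0
```

With a sample as large as n = 14,098, even modest chi-square excesses translate into small RMSEA values, which is why cutoffs like RMSEA < 0.08 are interpreted jointly with CFI/TLI.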
Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie
2016-03-01
In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. Copyright © 2015 Elsevier Ltd. All rights reserved.
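The Negative Binomial baseline used above is the standard NB2 crash-frequency model, whose variance exceeds the Poisson variance by a quadratic overdispersion term, Var(Y) = mu + mu^2 / r. A self-contained sketch of its log-probability (generic formula; the paper's covariate structure and hierarchy are not reproduced here):

```python
from math import lgamma, log

def nb_logpmf(y, mu, r):
    """Log-probability of the Negative Binomial (NB2) distribution with
    mean mu and dispersion r, so Var(Y) = mu + mu**2 / r. In crash models
    mu is typically exp(beta . x) for site covariates."""
    return (lgamma(y + r) - lgamma(r) - lgamma(y + 1)
            + r * log(r / (r + mu)) + y * log(mu / (r + mu)))
```

The multi-level and random-parameters models compared in the study extend this likelihood by letting coefficients vary across the hierarchical data structure.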
A Framework for Monitoring Progress Using Summary Measures of Health.
Madans, Jennifer H; Weeks, Julie D
2016-10-01
Initiatives designed to monitor health typically incorporate numerous specific measures of health and the health system to assess improvements, or lack thereof, for policy and program purposes. The addition of summary measures provides overarching information which is essential for determining whether the goals of such initiatives are met. Summary measures are identified that relate to the individual indicators but that also reflect movement in the various parts of the system. A hierarchical framework that is conceptually consistent and which utilizes a succinct number of summary measures incorporating indicators of functioning and participation is proposed. While a large set of individual indicators can be useful for monitoring progress, these individual indicators do not provide an overall evaluation of health, defined broadly, at the population level. A hierarchical framework consisting of summary measures is important for monitoring the success of health improvement initiatives. © The Author(s) 2016.
Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei
2017-01-01
Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of coding. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of and interactions between multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. 
In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
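The hBayesDM package itself is implemented in R/Stan; a rough Python sketch of the simplest model family it covers (a delta-rule learner with softmax choice on a two-armed bandit, fit here by non-hierarchical grid-search MLE rather than hierarchical Bayes) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(alpha, beta, trials=500, p_reward=(0.8, 0.2)):
    """Rescorla-Wagner learner with softmax choice on a two-armed bandit."""
    Q = np.zeros(2)
    choices, rewards = [], []
    for _ in range(trials):
        p1 = 1.0 / (1.0 + np.exp(-beta * (Q[1] - Q[0])))  # softmax for 2 arms
        c = int(rng.random() < p1)
        r = float(rng.random() < p_reward[c])
        Q[c] += alpha * (r - Q[c])          # prediction-error (delta) update
        choices.append(c)
        rewards.append(r)
    return np.array(choices), np.array(rewards)

def neg_loglik(alpha, beta, choices, rewards):
    """Negative log-likelihood of the choice sequence under (alpha, beta)."""
    Q = np.zeros(2)
    ll = 0.0
    for c, r in zip(choices, rewards):
        p1 = 1.0 / (1.0 + np.exp(-beta * (Q[1] - Q[0])))
        ll += np.log(p1 if c == 1 else 1.0 - p1)
        Q[c] += alpha * (r - Q[c])
    return -ll

choices, rewards = simulate(alpha=0.3, beta=3.0)
grid = np.linspace(0.05, 0.95, 19)
alpha_hat = grid[np.argmin([neg_loglik(a, 3.0, choices, rewards) for a in grid])]
```

What hBayesDM adds on top of this per-subject likelihood is the hierarchical layer: subject-level `alpha` and `beta` are drawn from group distributions, and all posteriors are estimated jointly so that subjects mutually constrain one another.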
Wildhaber, Mark L.; Wikle, Christopher K.; Moran, Edward H.; Anderson, Christopher J.; Franz, Kristie J.; Dey, Rima
2017-01-01
We present a hierarchical series of spatially decreasing and temporally increasing models to evaluate the uncertainty in the atmosphere-ocean global climate model (AOGCM) and the regional climate model (RCM) relative to the uncertainty in the somatic growth of the endangered pallid sturgeon (Scaphirhynchus albus). To assess effects on fish populations of riverine ecosystems, climate output simulated by coarse-resolution AOGCMs and RCMs must be downscaled through basin and river hydrology to the population response. One needs to transfer the information from these climate simulations down to the individual scale in a way that minimizes extrapolation and can account for spatio-temporal variability in the intervening stages. The goal is a framework to determine whether, given uncertainties in the climate models and the biological response, meaningful inference can still be made. The non-linear downscaling of climate information to the river scale requires that one realistically account for spatial and temporal variability across scales. Our downscaling procedure includes the use of fixed/calibrated hydrological flow and temperature models coupled with a stochastically parameterized sturgeon bioenergetics model. We show that, although there is a large amount of uncertainty associated with both the climate model output and the fish growth process, one can establish significant differences in fish growth distributions between models, and between future and current climates for a given model.
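The downscaling chain can be caricatured as Monte Carlo uncertainty propagation through successive stages; everything below (transfer coefficients, growth curve, noise levels) is a hypothetical stand-in for the calibrated hydrological and bioenergetics models in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo samples carried through every stage

# Stage 1: climate-model uncertainty in a summer air-temperature anomaly (deg C)
dT_air = rng.normal(2.0, 0.8, n)

# Stage 2: downscale to river water temperature; a hypothetical linear transfer
# with its own error term stands in for the calibrated hydrological model
T_water = 20.0 + 0.7 * dT_air + rng.normal(0.0, 0.5, n)

# Stage 3: stochastic bioenergetics response; growth peaks at an optimum
# temperature and falls off on either side (illustrative parameters only)
T_opt, width = 21.5, 4.0
growth = np.exp(-((T_water - T_opt) / width) ** 2) + rng.normal(0.0, 0.05, n)

# the end product is a growth *distribution*, not a point value
lo, hi = np.percentile(growth, [5, 95])
```

Because samples, not summary statistics, flow between stages, the spread in `growth` reflects climate, hydrological, and biological uncertainty jointly, which is what makes between-model and between-climate comparisons of whole distributions possible.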
A Framework for Teaching Social and Environmental Sustainability to Undergraduate Business Majors
ERIC Educational Resources Information Center
Brumagim, Alan L.; Cann, Cynthia W.
2012-01-01
The authors outline an undergraduate exercise to help students more fully understand the environmental and social justice aspects of business sustainability activities. A simple hierarchical framework, based on Maslow's (1943) work, was utilized to help the students understand, analyze, and judge the vast amount of corporate sustainability…
How Misapplication of the Hydrologic Unit Framework Diminishes the Meaning of Watersheds
Hydrologic units provide a convenient nationwide set of geographic polygons based on an arbitrary subdivision of the drainage of land surface areas at several hierarchical levels. Half or more of these units, however, are not true watersheds as the official name of the framework,...
NASA Astrophysics Data System (ADS)
Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.
2013-10-01
The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
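The total-system-entropy criterion can be sketched with the standard differential entropy of a multivariate normal plus a greedy station-removal loop; the toy covariance below is illustrative, not the Neuse estuary data:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a multivariate normal with covariance `cov`."""
    k = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def greedy_reduce(cov, keep):
    """Greedily drop stations, each time removing the one whose loss of
    joint entropy is smallest, until `keep` stations remain."""
    idx = list(range(cov.shape[0]))
    while len(idx) > keep:
        losses = []
        for j in idx:
            rest = [i for i in idx if i != j]
            losses.append((gaussian_entropy(cov[np.ix_(idx, idx)])
                           - gaussian_entropy(cov[np.ix_(rest, rest)]), j))
        idx.remove(min(losses)[1])   # the station contributing least information
    return idx

# toy network: stations 0 and 1 are highly correlated (redundant), 2 independent
cov = np.array([[1.0, 0.95, 0.0],
                [0.95, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
kept = greedy_reduce(cov, keep=2)
```

One member of the redundant pair is discarded while the independent station survives, which is the "information richness rather than data richness" behavior the abstract describes; the violation-entropy criteria replace the joint normal entropy with the entropy of standard-exceedance indicators.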
Rod-like hierarchical Sn/SnOx@C nanostructures with enhanced lithium storage properties
NASA Astrophysics Data System (ADS)
Yang, Juan; Chen, Sanmei; Tang, Jingjing; Tian, Hangyu; Bai, Tao; Zhou, Xiangyang
2018-03-01
Rod-like hierarchical Sn/SnOx@C nanostructures have been designed and synthesized via calcining resorcinol-formaldehyde (RF) resin coated Sn-based metal-organic frameworks. The rod-like hierarchical Sn/SnOx@C nanostructures are made of a great number of carbon-wrapped primary Sn/SnOx nanospheres of 100-200 nm in diameter. The as-prepared hierarchical Sn/SnOx@C nanocomposite manifests a high initial reversible capacity of 1177 mAh g-1 and retains 1001 mAh g-1 after 240 cycles at a current density of 200 mA g-1. It delivers outstanding high-rate performance with a reversible capacity of 823 mAh g-1 even at a high current density of 1000 mA g-1. The enhanced electrochemical performance of the Sn/SnOx@C electrode is mainly attributed to the synergistic effect of the unique hierarchical micro/nanostructures and the protective carbon layer.
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute for Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and present the results in a visual way. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined, along with their association with terms related to uncertainty. The results were incorporated into visual representations of hierarchical factors. We identified 125 decision factors and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed that all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision criterion. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
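A hedged sketch of the kind of text analysis described, counting decision-factor mentions and their sentence-level co-occurrence with uncertainty vocabulary; the factor terms and uncertainty word list below are invented, not the study's actual dictionaries:

```python
import re
from collections import Counter

# hypothetical term dictionaries mapping domains to surface forms
FACTORS = {
    "clinical effectiveness": ["overall survival", "response rate"],
    "cost effectiveness": ["icer", "qaly"],
}
UNCERTAINTY = {"uncertain", "uncertainty", "unclear", "immature"}

def factor_counts(documents):
    """Count how often each decision-factor domain appears, and how often
    it co-occurs with an uncertainty term in the same sentence."""
    freq, with_unc = Counter(), Counter()
    for doc in documents:
        for sent in re.split(r"[.!?]", doc.lower()):
            has_unc = any(u in sent.split() for u in UNCERTAINTY)
            for domain, terms in FACTORS.items():
                # substring match is crude but adequate for a sketch
                if any(t in sent for t in terms):
                    freq[domain] += 1
                    if has_unc:
                        with_unc[domain] += 1
    return freq, with_unc

docs = ["The ICER per QALY was uncertain. Overall survival data were mature."]
freq, with_unc = factor_counts(docs)
```

Dividing `with_unc` by `freq` per domain gives the uncertainty-association rates that feed the visual representations.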
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to compare existing usability data to ideal goals or to that of other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
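A compact sketch of the AHP-plus-fuzzy-evaluation pipeline, with hypothetical expert judgments throughout (the comparison matrix and membership values are invented, not the paper's data):

```python
import numpy as np

# Pairwise comparison of usability components (hypothetical expert judgments):
# effectiveness vs efficiency vs satisfaction
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])

# AHP weights: the principal eigenvector of the comparison matrix, normalized
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Fuzzy evaluation matrix R: membership of each component in the appraisal
# grades (poor, fair, good), as elicited from an expert panel
R = np.array([[0.1, 0.3, 0.6],
              [0.2, 0.5, 0.3],
              [0.3, 0.4, 0.3]])

B = w @ R            # fuzzy comprehensive appraisal vector over the grades
grade = ["poor", "fair", "good"][int(np.argmax(B))]
```

The weighted aggregation `w @ R` is the simplest fuzzy composition operator; because `w` and each row of `R` sum to one, `B` is itself a distribution over grades rather than a single crisp score, which is how the method retains the uncertainty in the expert judgments.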
Nature of motor control: perspectives and issues.
Turvey, Michael T; Fonseca, Sergio
2009-01-01
Four perspectives on motor control provide the framework for developing a comprehensive theory of motor control in biological systems. The four perspectives, of decreasing orthodoxy, are distinguished by their sources of inspiration: neuroanatomy, robotics, self-organization, and ecological realities. Twelve major issues that commonly constrain (either explicitly or implicitly) the understanding of the control and coordination of movement are identified and evaluated within the framework of the four perspectives. The issues are as follows: (1) Is control strictly neural? (2) Is there a divide between planning and execution? (3) Does control entail a frequently involved knowledgeable executive? (4) Do analytical internal models mediate control? (5) Is anticipation necessarily model dependent? (6) Are movements preassembled? (7) Are the participating components context independent? (8) Is force transmission strictly myotendinous? (9) Is afference a matter of local linear signaling? (10) Is neural noise an impediment? (11) Do standard variables (of mechanics and physiology) suffice? (12) Is the organization of control hierarchical?
Bayesian Group Bridge for Bi-level Variable Selection.
Mallick, Himel; Yi, Nengjun
2017-06-01
A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.
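The group bridge penalty underlying BAGB's frequentist counterpart is easy to state; a sketch follows, where the group-size weights c_g = |g|^(1-gamma) are one common choice rather than necessarily the paper's:

```python
import numpy as np

def group_bridge_penalty(beta, groups, lam=1.0, gamma=0.5):
    """Group bridge penalty: lam * sum_g c_g * (sum_{j in g} |beta_j|)^gamma.

    The outer bridge power (gamma < 1) shrinks entire groups to zero
    (group-level selection), while the inner L1 sum still permits sparsity
    within a surviving group -- hence "bi-level" selection.
    """
    total = 0.0
    for g in groups:
        c_g = len(g) ** (1 - gamma)          # group-size weight (one convention)
        total += c_g * np.sum(np.abs(beta[g])) ** gamma
    return lam * total

beta = np.array([0.0, 0.0, 1.0, -2.0])
groups = [[0, 1], [2, 3]]
pen = group_bridge_penalty(beta, groups)
```

In the Bayesian formulation this penalty is induced by a group-wise shrinkage prior, so the MCMC posterior delivers both the selection behavior and valid standard errors.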
The Researching on Evaluation of Automatic Voltage Control Based on Improved Zoning Methodology
NASA Astrophysics Data System (ADS)
Xiao-jun, ZHU; Ang, FU; Guang-de, DONG; Rui-miao, WANG; De-fen, ZHU
2018-03-01
As power systems continue to grow in size and structural complexity, hierarchically structured automatic voltage control (AVC) has become an active research topic. In this paper, a reduced control model is built, and an adaptive reduced control model is investigated to improve the voltage control effect. The theories of HCSD, HCVS, SKC and FCM are introduced, and the effect of different zoning methodologies on coordinated voltage regulation is examined. A generic framework for evaluating the performance of coordinated voltage regulation is developed. Finally, the IEEE-96 system is used to partition the network, and the 2383-bus Polish system is used to verify that the choice of zoning methodology affects not only the operation of coordinated voltage regulation but also its robustness to erroneous data. The New England 39-bus network is used to verify the performance of the adaptive reduced control models.
NASA Astrophysics Data System (ADS)
Gavish, Yoni; O'Connell, Jerome; Marsh, Charles J.; Tarantino, Cristina; Blonda, Palma; Tomaselli, Valeria; Kunin, William E.
2018-02-01
The increasing need for high quality Habitat/Land-Cover (H/LC) maps has triggered considerable research into novel machine-learning based classification models. In many cases, H/LC classes follow pre-defined hierarchical classification schemes (e.g., CORINE), in which fine H/LC categories are thematically nested within more general categories. However, none of the existing machine-learning algorithms account for this pre-defined hierarchical structure. Here we introduce a novel Random Forest (RF) based application of hierarchical classification, which fits a separate local classification model at every branching point of the thematic tree, and then integrates all the different local models into a single global prediction. We applied the hierarchical RF approach in a NATURA 2000 site in Italy, using two land-cover (CORINE, FAO-LCCS) and one habitat classification scheme (EUNIS) that differ from one another in the shape of the class hierarchy. For all 3 classification schemes, both the hierarchical model and a flat model alternative provided accurate predictions, with kappa values mostly above 0.9 (despite using only 2.2-3.2% of the study area as training cells). The flat approach slightly outperformed the hierarchical models when the hierarchy was relatively simple, while the hierarchical model worked better under more complex thematic hierarchies. Most misclassifications came from habitat pairs that are thematically distant yet spectrally similar. In 2 out of 3 classification schemes, the additional constraints of the hierarchical model resulted in fewer such serious misclassifications relative to the flat model. The hierarchical model also provided valuable information on variable importance, which can shed light on "black-box" machine learning algorithms like RF. We suggest various ways by which hierarchical classification models can increase the accuracy and interpretability of H/LC classification maps.
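A toy version of the local-model-per-branching-point idea, using a two-level hierarchy and synthetic data; the class names, features, and separability below are invented:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# synthetic "spectra": fine classes A1 and A2 nest within coarse class A; B is alone
X = np.vstack([rng.normal(m, 0.3, (60, 2)) for m in ([0, 0], [1, 0], [4, 4])])
fine = np.repeat(["A1", "A2", "B"], 60)
coarse = np.where(np.char.startswith(fine, "A"), "A", "B")

# one local RF per branching point of the thematic tree
root = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, coarse)
withinA = RandomForestClassifier(n_estimators=50, random_state=0).fit(
    X[coarse == "A"], fine[coarse == "A"])

def predict_fine(x):
    """Multiply class probabilities down the tree: P(A1) = P(A) * P(A1 | A)."""
    x = x.reshape(1, -1)
    p_coarse = dict(zip(root.classes_, root.predict_proba(x)[0]))
    p_fine = dict(zip(withinA.classes_, withinA.predict_proba(x)[0]))
    scores = {"B": p_coarse["B"],
              "A1": p_coarse["A"] * p_fine["A1"],
              "A2": p_coarse["A"] * p_fine["A2"]}
    return max(scores, key=scores.get)

pred = predict_fine(np.array([1.0, 0.0]))
```

Because every fine-class score is the product of probabilities along one root-to-leaf path, the global prediction is hierarchically consistent by construction, which is the constraint the flat model lacks.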
Mapping Informative Clusters in a Hierarchical Framework of fMRI Multivariate Analysis
Xu, Rui; Zhen, Zonglei; Liu, Jia
2010-01-01
Pattern recognition methods have become increasingly popular in fMRI data analysis, as they are powerful in discriminating between multi-voxel patterns of brain activities associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate analysis that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate analysis is suitable for both pattern classification and brain mapping in fMRI studies. PMID:21152081
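The three-step pipeline (cluster voxels, classify per cluster, rank clusters) can be sketched on synthetic data; the cluster count, classifier choice, and signal strength below are arbitrary stand-ins for the paper's implementation:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# synthetic data: 100 trials x 30 voxels, two mental states in y
y = np.repeat([0, 1], 50)
X = rng.normal(0.0, 1.0, (100, 30))
X[:, :10] += y[:, None] * 1.0        # voxels 0-9 carry the discriminative signal

# Step 1: local homogeneous clusters -- group voxels by response-profile
# similarity (each voxel's profile is its column across trials)
voxel_clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X.T)

# Step 2: one multi-voxel classifier per cluster, scored by cross-validation
scores = {c: cross_val_score(LogisticRegression(max_iter=1000),
                             X[:, voxel_clusters == c], y, cv=5).mean()
          for c in np.unique(voxel_clusters)}

# Step 3: rank clusters -- informative clusters score above chance (0.5)
best = max(scores, key=scores.get)
```

Mapping at the cluster level makes the reported locus a contiguous set of similar voxels rather than whichever individual voxels a classifier happened to weight, which is the robustness gain the abstract describes.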
Segurado, Pedro; Almeida, Carina; Neves, Ramiro; Ferreira, Maria Teresa; Branco, Paulo
2018-05-15
River basins are extremely complex hierarchical and directional systems that are affected by a multitude of interacting stressors. This complexity hampers effective management and conservation planning, especially under climate change. The objective of this work is to provide a wide-scale approach to basin management by interpreting the effects of isolated and interacting factors on several biotic elements (fish, macroinvertebrates, phytobenthos and macrophytes). For that, a case study in the Sorraia basin (Central Portugal), a Mediterranean system mainly facing water scarcity and diffuse pollution problems, was chosen. To develop the proposed framework, a combination of process-based modelling to simulate hydrological and nutrient enrichment stressors and empirical modelling to relate these stressors, along with land use and natural background, with biotic indicators was applied. Biotic indicators based on ecological quality ratios from WFD biomonitoring data were used as response variables. Temperature, river slope, % of agriculture in the upstream catchment and total N were the variables most frequently ranked as the most relevant. Both of the significant interactions found between single hydrological and nutrient enrichment stressors were antagonistic. This study demonstrates the potential of coupling process-based modelling with empirical modelling within a single framework, allowing relationships among different ecosystem states to be hierarchically ranked, interpreted and predicted at multiple spatial and temporal scales. It also demonstrates how isolated and interacting stressors can have different impacts on biotic quality. When performing conservation or management plans, the stressor hierarchy should be considered as a way of prioritizing actions in a cost-effective perspective. Copyright © 2017 Elsevier B.V. All rights reserved.
Hammer, Monica; Balfors, Berit; Mörtberg, Ulla; Petersson, Mona; Quin, Andrew
2011-03-01
In this article, focusing on the ongoing implementation of the EU Water Framework Directive, we analyze some of the opportunities and challenges for a sustainable governance of water resources from an ecosystem management perspective. In the face of uncertainty and change, the ecosystem approach as a holistic and integrated management framework is increasingly recognized. The ongoing implementation of the Water Framework Directive (WFD) could be viewed as a reorganization phase in the process of change in institutional arrangements and ecosystems. In this case study from the Northern Baltic Sea River Basin District, Sweden, we focus in particular on data and information management from a multi-level governance perspective from the local stakeholder to the River Basin level. We apply a document analysis, hydrological mapping, and GIS models to analyze some of the institutional framework created for the implementation of the WFD. The study underlines the importance of institutional arrangements that can handle variability of local situations and trade-offs between solutions and priorities on different hierarchical levels.
Working toward integrated models of alpine plant distribution.
Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2013-10-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.
Jo, Young-Moo; Kim, Tae-Hyung; Lee, Chul-Soon; Lim, Kyeorei; Na, Chan Woong; Abdel-Hady, Faissal; Wazzan, Abdulaziz A; Lee, Jong-Heun
2018-03-14
Nearly monodisperse hollow hierarchical Co3O4 nanocages of four different sizes (∼0.3, 1.0, 2.0, and 4.0 μm) consisting of nanosheets were prepared by controlled precipitation of zeolitic imidazolate framework-67 (ZIF-67) rhombic dodecahedra, followed by solvothermal synthesis of Co3O4 nanocages using ZIF-67 self-sacrificial templates, and subsequent heat treatment, for the development of high-performance methylbenzene sensors. The sensor based on hollow hierarchical Co3O4 nanocages with a size of ∼1.0 μm exhibited not only ultrahigh responses (resistance ratios) to 5 ppm p-xylene (78.6) and toluene (43.8) but also remarkably high selectivity to methylbenzenes over the interference of ubiquitous ethanol at 225 °C. The unprecedentedly high response and selectivity to methylbenzenes are attributed to the highly gas-accessible hollow hierarchical morphology with thin shells, abundant mesopores, and high surface area per unit volume, as well as the high catalytic activity of Co3O4. Moreover, the size, shell thickness, mesopores, and hollow/hierarchical morphology of the nanocages, the key parameters determining the gas response and selectivity, could be well controlled by tuning the precipitation of ZIF-67 rhombic dodecahedra and the solvothermal reaction. This method can pave a new pathway for the design of high-performance methylbenzene sensors for monitoring indoor air quality.
Testolin, Alberto; Zorzi, Marco
2016-01-01
Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262
A continuous-time neural model for sequential action.
Kachergis, George; Wyatte, Dean; O'Reilly, Randall C; de Kleijn, Roy; Hommel, Bernhard
2014-11-05
Action selection, planning and execution are continuous processes that evolve over time, responding to perceptual feedback as well as evolving top-down constraints. Existing models of routine sequential action (e.g. coffee- or pancake-making) generally fall into one of two classes: hierarchical models that include hand-built task representations, or heterarchical models that must learn to represent hierarchy via temporal context, but thus far lack goal-orientedness. We present a biologically motivated model of the latter class that, because it is situated in the Leabra neural architecture, affords an opportunity to include both unsupervised and goal-directed learning mechanisms. Moreover, we embed this neurocomputational model in the theoretical framework of the theory of event coding (TEC), which posits that actions and perceptions share a common representation with bidirectional associations between the two. Thus, in this view, not only does perception select actions (along with task context), but actions are also used to generate perceptions (i.e. intended effects). We propose a neural model that implements TEC to carry out sequential action control in hierarchically structured tasks such as coffee-making. Unlike traditional feedforward discrete-time neural network models, which use static percepts to generate static outputs, our biological model accepts continuous-time inputs and likewise generates non-stationary outputs, making short-timescale dynamic predictions. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Conesa, D; Martínez-Beneito, M A; Amorós, R; López-Quílez, A
2015-04-01
Considerable effort has been devoted to the development of statistical algorithms for the automated monitoring of influenza surveillance data. In this article, we introduce a framework of models for the early detection of the onset of an influenza epidemic which is applicable to different kinds of surveillance data. In particular, the process of observed cases is modelled via a Bayesian hierarchical Poisson model in which the intensity parameter is a function of the incidence rate. The key point is to model this incidence rate with a normal distribution in which both parameters (mean and variance) are modelled differently, depending on whether the system is in an epidemic or non-epidemic phase. To do so, we propose a hidden Markov model in which the transition between the two phases is modelled as a function of the epidemic state of the previous week. Different options for modelling the rates are described, including modelling the mean at each phase as an autoregressive process of order 0, 1, or 2. Bayesian inference is carried out to provide the probability of being in an epidemic state at any given moment. The methodology is applied to various influenza data sets. The results indicate that our methods outperform previous approaches in terms of sensitivity, specificity, and timeliness. © The Author(s) 2011 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
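The epidemic/non-epidemic switching described above can be sketched as a forward filter for a two-state hidden Markov model with Poisson emissions. This is a minimal illustration, not the paper's full Bayesian hierarchical model: the rates, transition probabilities, and weekly counts below are invented placeholders.

```python
import math

def forward_epidemic_prob(counts, rate_non=5.0, rate_epi=20.0,
                          p_stay_non=0.95, p_stay_epi=0.90):
    """Two-state (non-epidemic/epidemic) HMM forward filter over weekly
    case counts with Poisson emissions. Returns P(epidemic | data so far)
    for each week. All parameter values are illustrative."""
    def pois(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    # state 0 = non-epidemic, state 1 = epidemic
    trans = [[p_stay_non, 1 - p_stay_non],
             [1 - p_stay_epi, p_stay_epi]]
    belief = [0.99, 0.01]          # prior: almost surely non-epidemic
    probs = []
    for k in counts:
        # predict step: propagate the belief through the transition matrix
        pred = [sum(belief[i] * trans[i][j] for i in range(2))
                for j in range(2)]
        # update step: weight by the Poisson likelihood of this week's count
        post = [pred[0] * pois(k, rate_non), pred[1] * pois(k, rate_epi)]
        z = post[0] + post[1]
        belief = [p / z for p in post]
        probs.append(belief[1])
    return probs

weekly = [4, 6, 5, 7, 15, 22, 25, 18, 6, 4]
p_epi = forward_epidemic_prob(weekly)
```

The filtered probability jumps when the counts rise mid-series and falls back once they return to baseline, which is the behaviour an onset detector needs.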
NASA Astrophysics Data System (ADS)
Ichii, K.; Kondo, M.; Wang, W.; Hashimoto, H.; Nemani, R. R.
2012-12-01
Various satellite-based spatial products, such as evapotranspiration (ET) and gross primary productivity (GPP), are now produced by integration of ground and satellite observations. Effective use of these multiple satellite-based products in terrestrial biosphere models is an important step toward better understanding of terrestrial carbon and water cycles. However, due to the complexity of terrestrial biosphere models with a large number of model parameters, the application of these spatial data sets in terrestrial biosphere models is difficult. In this study, we established an effective but simple framework to refine a terrestrial biosphere model, Biome-BGC, using multiple satellite-based products as constraints. We tested the framework in the monsoon Asia region covered by AsiaFlux observations. The framework is based on hierarchical analysis (Wang et al. 2009) with model parameter optimization constrained by satellite-based spatial data. The Biome-BGC model is separated into several tiers to minimize the freedom of model parameter selection and maximize independence from the whole model. For example, the snow sub-model is optimized first using the MODIS snow cover product, followed by the soil water sub-model optimized against satellite-based ET (estimated by an empirical upscaling method based on Support Vector Regression (SVR); Yang et al. 2007), the photosynthesis sub-model optimized against satellite-based GPP (also based on the SVR method), and the respiration and residual carbon cycle sub-models optimized against biomass data. In an initial assessment, we found that most of the default sub-models (e.g., snow, water cycle, and carbon cycle) showed large deviations from remote sensing observations. However, these biases were removed by applying the proposed framework. For example, gross primary productivity was initially underestimated in boreal and temperate forests and overestimated in tropical forests, but the parameter optimization scheme successfully reduced these biases. Our analysis shows that terrestrial carbon and water cycle simulations in monsoon Asia were greatly improved, and that the use of multiple satellite observations within this framework is an effective way to improve terrestrial biosphere models.
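The tiered calibration idea, fitting one sub-model against its own observational constraint and freezing its parameter before moving to the next tier, can be sketched with toy sub-models. Nothing below is Biome-BGC code; the "snow" and "soil water" models and their "observations" are synthetic stand-ins.

```python
# A toy illustration of tiered calibration: each sub-model is calibrated
# against its own observational constraint, and its fitted parameter is
# frozen before the next tier is calibrated.

def grid_fit(model, obs, grid):
    """Return the parameter on `grid` minimizing squared error vs obs."""
    def sse(p):
        return sum((model(p, t) - y) ** 2 for t, y in enumerate(obs))
    return min(grid, key=sse)

# Tier 1: "snow" sub-model (a melt rate) calibrated against snow-cover-like
# observations, standing in for the MODIS snow product.
snow_obs = [10 - 0.8 * t for t in range(10)]
melt = grid_fit(lambda p, t: 10 - p * t, snow_obs,
                [i / 10 for i in range(1, 21)])

# Tier 2: "soil water" sub-model reuses the frozen melt rate and calibrates
# an ET coefficient against ET-like observations (the satellite ET stand-in).
et_obs = [0.3 * (10 - 0.8 * t) for t in range(10)]
et_coef = grid_fit(lambda p, t: p * (10 - melt * t), et_obs,
                   [i / 10 for i in range(1, 11)])
```

Freezing each tier's parameter before fitting the next keeps the search low-dimensional, which is the point of the hierarchical decomposition described in the abstract.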
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Andrei, Victor; Arandjelović, Ognjen
2016-12-01
The rapidly expanding corpus of medical research literature presents major challenges in the understanding of previous work, the extraction of maximum information from collected data, and the identification of promising research directions. We present a case for the use of advanced machine learning techniques as an aid in this task and introduce a novel methodology that is shown to be capable of extracting meaningful information from large longitudinal corpora and of tracking complex temporal changes within them. Our framework is based on (i) the discretization of time into epochs, (ii) epoch-wise topic discovery using a hierarchical Dirichlet process-based model, and (iii) a temporal similarity graph which allows for the modelling of complex topic changes. More specifically, this is the first work that discusses and distinguishes between two groups of particularly challenging topic evolution phenomena: topic splitting and speciation, and topic convergence and merging, in addition to the more widely recognized emergence and disappearance, and gradual evolution. The proposed framework is evaluated on a public medical literature corpus.
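The temporal similarity graph of step (iii) can be sketched by linking topics in consecutive epochs whose word distributions are sufficiently similar; a topic with several outgoing edges is then a candidate split, and one with several incoming edges a candidate merge. The vocabulary, distributions, and threshold below are invented for illustration, and the HDP topic-discovery step is omitted.

```python
import math

def cosine(u, v):
    """Cosine similarity between two topic-word distributions."""
    num = sum(a * b for a, b in zip(u, v))
    den = (math.sqrt(sum(a * a for a in u))
           * math.sqrt(sum(b * b for b in v)))
    return num / den

def epoch_edges(topics_t, topics_t1, threshold=0.7):
    """Link each topic at epoch t to sufficiently similar topics at t+1.
    Multiple outgoing edges suggest splitting; multiple incoming edges
    suggest merging."""
    edges = []
    for i, u in enumerate(topics_t):
        for j, v in enumerate(topics_t1):
            if cosine(u, v) >= threshold:
                edges.append((i, j))
    return edges

# Toy topic-word distributions over a 4-word vocabulary: one broad topic
# at epoch 0 that splits into two sharper topics at epoch 1.
epoch0 = [[0.5, 0.5, 0.0, 0.0]]
epoch1 = [[0.9, 0.1, 0.0, 0.0],
          [0.1, 0.9, 0.0, 0.0]]
edges = epoch_edges(epoch0, epoch1)
```

Here the single epoch-0 topic links to both epoch-1 topics, which is exactly the "splitting" signature the framework is designed to detect.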
Generalized multiple kernel learning with data-dependent priors.
Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li
2015-06-01
Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.
Zhang, Rui; Zhou, Tingting; Wang, Lili; Zhang, Tong
2018-03-21
Highly sensitive and stable gas sensors have attracted much attention because they are the key to innovations in the fields of environment, health, energy savings, and security. Sensing materials, which determine practical sensing performance, are the crucial components of gas sensors. Metal-organic frameworks (MOFs) are considered alluring sensing materials for gas sensors because of their high specific surface area, unique morphology, abundant metal sites, and functional linkers. Herein, four kinds of porous hierarchical Co3O4 structures have been selectively controlled by optimizing the thermal decomposition (temperature, rate, and atmosphere) of ZIF-67, a precursor obtained by a coprecipitation method with the co-assistance of a cobalt salt and 2-methylimidazole in methanol. These hierarchical Co3O4 structures, with controllable cross-linked channels, meso-/micropores, and adjustable surface area, are efficient catalytic materials for gas sensing. Benefiting from their structural advantages, core-shell and porous core-shell Co3O4 exhibit enhanced sensing performance toward acetone gas compared to porous popcorn and nanoparticle Co3O4. These novel MOF-templated Co3O4 hierarchical structures are promising candidates as efficient sensing materials for the development of low-temperature operating gas sensors.
Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary
2014-11-01
Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, with many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
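The time-series-as-document analogy can be sketched as follows: local segments are extracted as "words" and quantized against a codebook to form a word-count "document" per channel. The codebook here is hand-written for illustration (in the paper it would be learned), and the two pLSA layers are omitted.

```python
def extract_words(signal, win=4, step=2):
    """Slide a window over one channel and return the local segments
    ('words') after mean-removal, so shape rather than offset matters."""
    words = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        mu = sum(seg) / win
        words.append([x - mu for x in seg])
    return words

def quantize(words, codebook):
    """Map each segment to its nearest codeword (a toy stand-in for the
    learned dictionary) and count occurrences, giving the 'document'."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hist = [0] * len(codebook)
    for w in words:
        hist[min(range(len(codebook)),
                 key=lambda k: d2(w, codebook[k]))] += 1
    return hist

# Toy single-channel signal: a rising ramp followed by a falling ramp.
sig = [0, 1, 2, 3, 4, 3, 2, 1, 0]
codebook = [[-1.5, -0.5, 0.5, 1.5],   # "rising" shape
            [1.5, 0.5, -0.5, -1.5]]   # "falling" shape
hist = quantize(extract_words(sig), codebook)
```

The resulting histogram (two "rising" words, one "falling") is the kind of per-channel bag-of-words representation a local pLSA would then model.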
Causal assessment of surrogacy in a meta-analysis of colorectal cancer trials
Li, Yun; Taylor, Jeremy M.G.; Elliott, Michael R.; Sargent, Daniel J.
2011-01-01
When the true end points (T) are difficult or costly to measure, surrogate markers (S) are often collected in clinical trials to help predict the effect of the treatment (Z). There is great interest in understanding the relationship among S, T, and Z. A principal stratification (PS) framework has been proposed by Frangakis and Rubin (2002) to study their causal associations. In this paper, we extend the framework to a multiple trial setting and propose a Bayesian hierarchical PS model to assess surrogacy. We apply the method to data from a large collection of colon cancer trials in which S and T are binary. We obtain the trial-specific causal measures among S, T, and Z, as well as their overall population-level counterparts that are invariant across trials. The method allows for information sharing across trials and reduces the nonidentifiability problem. We examine the frequentist properties of our model estimates and the impact of the monotonicity assumption using simulations. We also illustrate the challenges in evaluating surrogacy in the counterfactual framework that result from nonidentifiability. PMID:21252079
Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework
Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique
2016-01-01
Aim Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location Eastern North America (as an example). Methods Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software.
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698
An analytics approach to designing patient centered medical homes.
Ajorlou, Saeede; Shams, Issac; Yang, Kai
2015-03-01
Recently, the patient centered medical home (PCMH) model has become a popular team-based approach focused on delivering more streamlined care to patients. In current practices of medical homes, a clinically based prediction framework is recommended because it can help match the portfolio capacity of PCMH teams with the actual load generated by a set of patients. Without such balance in clinical supply and demand, issues such as excessive under- and over-utilization of physicians, long waiting times for receiving the appropriate treatment, and non-continuity of care will eliminate many advantages of the medical home strategy. In this paper, by using the hierarchical generalized linear model with multivariate responses, we develop a clinical workload prediction model for care portfolio demands in a Bayesian framework. The model allows for heterogeneous variances and unstructured covariance matrices for nested random effects that arise through complex hierarchical care systems. We show that using a multivariate approach substantially enhances the precision of workload predictions at both primary and non-primary care levels. We also demonstrate that care demands depend not only on patient demographics but also on other utilization factors, such as length of stay. Our analyses of recent data from the Veterans Health Administration further indicate that risk adjustment for patient health conditions can considerably improve the predictive power of the model.
Recent global methane trends: an investigation using hierarchical Bayesian methods
NASA Astrophysics Data System (ADS)
Rigby, M. L.; Stavert, A.; Ganesan, A.; Lunt, M. F.
2014-12-01
Following a decade with little growth, methane concentrations began to increase across the globe in 2007, and have continued to rise ever since. The reasons for this renewed growth are currently the subject of much debate. Here, we discuss the recent observed trends, and highlight some of the strengths and weaknesses in current "inverse" methods for quantifying fluxes using observations. In particular, we focus on the outstanding problems of accurately quantifying uncertainties in inverse frameworks. We examine to what extent the recent methane changes can be explained by the current generation of flux models and inventories. We examine the major modes of variability in wetland models along with the Global Fire Emissions Database (GFED) and the Emissions Database for Global Atmospheric Research (EDGAR). Using the Model for Ozone and Related Tracers (MOZART), we determine whether the spatial and temporal atmospheric trends predicted using these emissions can be brought into consistency with in situ atmospheric observations. We use a novel hierarchical Bayesian methodology in which scaling factors applied to the principal components of the flux fields are estimated simultaneously with the uncertainties associated with the a priori fluxes and with model representations of the observations. Using this method, we examine the predictive power of methane flux models for explaining recent fluctuations.
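The idea of estimating scaling factors simultaneously with their uncertainties can be sketched with a two-step Gibbs sampler on a toy one-component problem: observations y = c·x + noise, where c stands in for the scaling factor on one flux principal component and the observation-error variance is estimated jointly rather than fixed a priori. All priors and data below are synthetic, not from the study.

```python
import random

random.seed(0)

# Synthetic "prior flux" field x and pseudo-observations y = c*x + noise.
true_c, true_sigma = 1.5, 0.5
x = [i / 10 for i in range(1, 51)]
y = [true_c * xi + random.gauss(0.0, true_sigma) for xi in x]

# Gibbs sampler: alternately draw the scaling factor c (Gaussian
# conditional) and the error variance s2 (inverse-gamma conditional),
# so the data-mismatch uncertainty is estimated jointly with c.
a0, b0, prior_mean, prior_var = 2.0, 1.0, 1.0, 1.0
c, s2 = 1.0, 1.0
cs, s2s = [], []
for it in range(3000):
    prec = sum(xi * xi for xi in x) / s2 + 1.0 / prior_var
    mean = (sum(xi * yi for xi, yi in zip(x, y)) / s2
            + prior_mean / prior_var) / prec
    c = random.gauss(mean, (1.0 / prec) ** 0.5)
    resid = sum((yi - c * xi) ** 2 for xi, yi in zip(x, y))
    # s2 ~ InvGamma(a0 + n/2, b0 + resid/2), via a Gamma draw on 1/s2
    s2 = 1.0 / random.gammavariate(a0 + len(x) / 2,
                                   1.0 / (b0 + resid / 2))
    if it >= 500:                      # discard burn-in
        cs.append(c)
        s2s.append(s2)

post_c = sum(cs) / len(cs)
post_s2 = sum(s2s) / len(s2s)
```

The posterior for c concentrates near the value used to generate the data while the variance draws quantify how well the "model" fits the "observations", which is the hierarchical element the abstract emphasizes.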
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.
2012-03-01
Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally, to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex, but very practical, problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystem, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatoric uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Lai, Canhai; Marcy, Peter William
2017-05-01
A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system and then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design's predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
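The closing confidence statement can be illustrated with a Monte Carlo sketch: draw the uncertain rate parameter from a hypothetical calibrated posterior, and scan flow rates for the threshold at which 90% capture is achieved with 95% probability. The contact model and parameter distribution below are toy stand-ins for the paper's multiphase reactive flow model; note that in this toy model efficiency falls as flow rate rises, so the scan keeps the largest admissible rate, whereas the paper reports a minimum rate for its system.

```python
import math
import random

random.seed(42)

def capture_efficiency(k, q):
    """Toy first-order contact model: efficiency falls as the gas flow
    rate q rises (shorter residence time). Not the paper's model."""
    return 1.0 - math.exp(-k / q)

def prob_meets_target(q, n=5000, target=0.90):
    """Propagate parameter uncertainty by Monte Carlo: draw the rate
    parameter k from a hypothetical calibrated posterior and count how
    often the target capture efficiency is met at flow rate q."""
    hits = sum(capture_efficiency(random.lognormvariate(1.0, 0.2), q)
               >= target for _ in range(n))
    return hits / n

# Scan flow rates and keep the largest one that meets 90% capture with
# 95% confidence.
admissible = [q / 10 for q in range(5, 13)
              if prob_meets_target(q / 10) >= 0.95]
q_max = max(admissible)
```

The same draw-then-count logic applies regardless of the direction of the flow-rate effect; only the direction of the scan changes.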
An Active Learning Framework for Hyperspectral Image Classification Using Hierarchical Segmentation
NASA Technical Reports Server (NTRS)
Zhang, Zhou; Pasolli, Edoardo; Crawford, Melba M.; Tilton, James C.
2015-01-01
Augmenting spectral data with spatial information for image classification has recently gained significant attention, as classification accuracy can often be improved by extracting spatial information from neighboring pixels. In this paper, we propose a new framework in which active learning (AL) and hierarchical segmentation (HSeg) are combined for spectral-spatial classification of hyperspectral images. The spatial information is extracted from a best segmentation obtained by pruning the HSeg tree using a new supervised strategy. The best segmentation is updated at each iteration of the AL process, thus taking advantage of informative labeled samples provided by the user. The proposed strategy incorporates spatial information in two ways: 1) concatenating the extracted spatial features and the original spectral features into a stacked vector and 2) extending the training set using a self-learning-based semi-supervised learning (SSL) approach. Finally, the two strategies are combined within an AL framework. The proposed framework is validated with two benchmark hyperspectral datasets. Higher classification accuracies are obtained by the proposed framework with respect to five other state-of-the-art spectral-spatial classification approaches. Moreover, the effectiveness of the proposed pruning strategy is also demonstrated relative to the approaches based on a fixed segmentation.
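The AL loop itself (query the pool sample the current model is least certain about, obtain its label from the user, and refit) can be sketched with a toy 1-D nearest-centroid classifier standing in for the spectral-spatial classifier. The pool, oracle, and margin-based uncertainty below are illustrative assumptions, and the HSeg pruning and SSL steps are omitted.

```python
def centroid_classifier(labeled):
    """Fit class centroids from (x, y) pairs; a stand-in for the
    spectral-spatial classifier in the paper."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def uncertainty(x, cents):
    """Margin-based uncertainty: a small gap between the distances to the
    two nearest centroids means an ambiguous sample."""
    d = sorted(abs(x - c) for c in cents.values())
    return -(d[1] - d[0])        # higher = more uncertain

def active_learning(pool, oracle, labeled, rounds=3):
    """Each round: query the most uncertain pool sample, ask the oracle
    for its label, and refit the classifier."""
    queried = []
    for _ in range(rounds):
        cents = centroid_classifier(labeled)
        x = max(pool, key=lambda s: uncertainty(s, cents))
        pool.remove(x)
        labeled.append((x, oracle(x)))
        queried.append(x)
    return queried, centroid_classifier(labeled)

# Toy 1-D "pixels": class 0 near 0.0, class 1 near 1.0, boundary at 0.5.
pool = [0.1, 0.45, 0.55, 0.9, 0.5]
oracle = lambda x: int(x >= 0.5)
queried, cents = active_learning(pool, oracle, [(0.0, 0), (1.0, 1)])
```

The loop queries the boundary samples first (0.5, then 0.45 and 0.55) and never wastes labels on the easy samples 0.1 and 0.9, which is the economy AL is meant to deliver.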
The Advantages of Hierarchical Linear Modeling. ERIC/AE Digest.
ERIC Educational Resources Information Center
Osborne, Jason W.
This digest introduces hierarchical data structure, describes how hierarchical models work, and presents three approaches to analyzing hierarchical data. Hierarchical, or nested data, present several problems for analysis. People or creatures that exist within hierarchies tend to be more similar to each other than people randomly sampled from the…
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Claunch, Charles; Mathis, Frank
2017-01-01
The Modular Autonomous Systems Technology (MAST) framework is a tool for building distributed, hierarchical autonomous systems. Originally intended for the autonomous monitoring and control of spacecraft, this framework concept provides support for variable autonomy, assume-guarantee contracts, and efficient communication between subsystems and a centralized systems manager. MAST was developed at NASA's Johnson Space Center (JSC) and has been applied to an integrated spacecraft example scenario.
NASA Technical Reports Server (NTRS)
Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
This paper describes the core framework used to implement a Goal-Function Tree (GFT) based systems engineering process using the Systems Modeling Language (SysML). It defines a set of principles that build upon the theoretical approach described in the InfoTech 2013 ISHM paper titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management" presented by Dr. Stephen B. Johnson. The principles in this paper describe the expansion of the SysML language, used as a baseline, in order to: hierarchically describe a system, describe that system functionally within success space, and allocate detection mechanisms to success functions for system protection.
Evaluating multi-level models to test occupancy state responses of Plethodontid salamanders
Kroll, Andrew J.; Garcia, Tiffany S.; Jones, Jay E.; Dugger, Catherine; Murden, Blake; Johnson, Josh; Peerman, Summer; Brintz, Ben; Rochelle, Michael
2015-01-01
Plethodontid salamanders are diverse and widely distributed taxa and play critical roles in ecosystem processes. Due to salamander use of structurally complex habitats, and because only a portion of a population is available for sampling, evaluation of sampling designs and estimators is critical to provide strong inference about Plethodontid ecology and responses to conservation and management activities. We conducted a simulation study to evaluate the effectiveness of multi-scale and hierarchical single-scale occupancy models in the context of a Before-After Control-Impact (BACI) experimental design with multiple levels of sampling. Also, we fit the hierarchical single-scale model to empirical data collected for Oregon slender and Ensatina salamanders across two years on 66 forest stands in the Cascade Range, Oregon, USA. All models were fit within a Bayesian framework. Estimator precision in both models improved with increasing numbers of primary and secondary sampling units, underscoring the potential gains accrued when adding secondary sampling units. Both models showed evidence of estimator bias at low detection probabilities and low sample sizes; this problem was particularly acute for the multi-scale model. Our results suggested that sufficient sample sizes at both the primary and secondary sampling levels could ameliorate this issue. Empirical data indicated Oregon slender salamander occupancy was associated strongly with the amount of coarse woody debris (posterior mean = 0.74; SD = 0.24); Ensatina occupancy was not associated with amount of coarse woody debris (posterior mean = -0.01; SD = 0.29). Our simulation results indicate that either model is suitable for use in an experimental study of Plethodontid salamanders provided that sample sizes are sufficiently large. However, hierarchical single-scale and multi-scale models describe different processes and estimate different parameters. 
As a result, we recommend careful consideration of study questions and objectives prior to sampling data and fitting models.
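The detection-probability issue the simulation study addresses can be illustrated with a small simulation: under a single-scale occupancy model, the naive occupancy estimate (the fraction of sites with at least one detection) is biased low, and the bias grows as detection probability falls. The parameter values below are invented, not those of the study, and no hierarchical estimator is fit here.

```python
import random

random.seed(1)

def simulate_naive_occupancy(psi, p, n_sites=66, n_visits=3, reps=500):
    """Simulate detection histories under a single-scale occupancy model
    (site occupied with probability psi; an occupied site is detected on
    each visit with probability p) and return the mean naive occupancy
    estimate: the fraction of sites with at least one detection."""
    ests = []
    for _ in range(reps):
        detected = 0
        for _ in range(n_sites):
            occupied = random.random() < psi
            if occupied and any(random.random() < p
                                for _ in range(n_visits)):
                detected += 1
        ests.append(detected / n_sites)
    return sum(ests) / reps

# The naive estimator misses occupied-but-undetected sites, so it
# converges to psi * (1 - (1-p)**n_visits) rather than psi.
high_p = simulate_naive_occupancy(psi=0.6, p=0.8)
low_p = simulate_naive_occupancy(psi=0.6, p=0.2)
```

With high per-visit detection the naive estimate sits near the true 0.6, but with low detection it collapses toward 0.3, which is why occupancy models that separate detection from occupancy are needed at all.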
NASA Astrophysics Data System (ADS)
Anderson, C. J.; Wildhaber, M. L.; Wikle, C. K.; Moran, E. H.; Franz, K. J.; Dey, R.
2012-12-01
Climate change operates over a broad range of spatial and temporal scales. Understanding the effects of change on ecosystems requires accounting for the propagation of information and uncertainty across these scales. For example, to understand potential climate change effects on fish populations in riverine ecosystems, climate conditions predicted by course-resolution atmosphere-ocean global climate models must first be translated to the regional climate scale. In turn, this regional information is used to force watershed models, which are used to force river condition models, which impact the population response. A critical challenge in such a multiscale modeling environment is to quantify sources of uncertainty given the highly nonlinear nature of interactions between climate variables and the individual organism. We use a hierarchical modeling approach for accommodating uncertainty in multiscale ecological impact studies. This framework allows for uncertainty due to system models, model parameter settings, and stochastic parameterizations. This approach is a hybrid between physical (deterministic) downscaling and statistical downscaling, recognizing that there is uncertainty in both. We use NARCCAP data to determine confidence the capability of climate models to simulate relevant processes and to quantify regional climate variability within the context of the hierarchical model of uncertainty quantification. By confidence, we mean the ability of the regional climate model to replicate observed mechanisms. We use the NCEP-driven simulations for this analysis. This provides a base from which regional change can be categorized as either a modification of previously observed mechanisms or emergence of new processes. 
The management implications for these categories of change are significantly different: procedures to address impacts from existing processes may already be known and need only adjustment, whereas emergent processes may require new management strategies. The results from the hierarchical analysis of uncertainty are used to study the relative change in weights of the endangered Missouri River pallid sturgeon (Scaphirhynchus albus) under a 21st century climate scenario.
Mechanisms of Soil Aggregation: a biophysical modeling framework
NASA Astrophysics Data System (ADS)
Ghezzehei, T. A.; Or, D.
2016-12-01
Soil aggregation is one of the main crosscutting concepts in all sub-disciplines and applications of soil science from agriculture to climate regulation. The concept generally refers to adhesion of primary soil particles into distinct units that remain stable when subjected to disruptive forces. It is one of the most sensitive soil qualities that readily respond to disturbances such as cultivation, fire, drought, flooding, and changes in vegetation. These changes are commonly quantified and incorporated in soil models indirectly as alterations in carbon content and type, bulk density, aeration, permeability, as well as water retention characteristics. Soil aggregation that is primarily controlled by organic matter generally exhibits hierarchical organization of soil constituents into stable units that range in size from a few microns to centimeters. However, this conceptual model of soil aggregation as the key unifying mechanism remains poorly quantified and is rarely included in predictive soil models. Here we provide a biophysical framework for quantitative and predictive modeling of soil aggregation and its attendant soil characteristics. The framework treats aggregates as hotspots of biological, chemical and physical processes centered around roots and root residue. We keep track of the life cycle of an individual aggregate from its genesis in the rhizosphere, fueled by rhizodeposition and mediated by vigorous microbial activity, until its disappearance when the root-derived resources are depleted. The framework synthesizes current understanding of microbial life in porous media; water holding and soil binding capacity of biopolymers; and environmental controls on soil organic matter dynamics.
The framework paves a way for integration of processes that are presently modeled as disparate or poorly coupled processes, including storage and protection of carbon, microbial activity, greenhouse gas fluxes, movement and storage of water, resistance of soils against erosion.
Adding a landscape ecology perspective to conservation and management planning
Kathryn E. Freemark; John R. Probst; John B. Dunning; Sallie J. Hejl
1993-01-01
We briefly review concepts in landscape ecology and discuss their relevance to the conservation and management of neotropical migrant landbirds. We then integrate a landscape perspective into a spatially-hierarchical framework for conservation and management planning for neotropical migrant landbirds (and other biota). The framework outlines a comprehensive approach by...
Ecological subregion codes by county, coterminous United States
Victor A. Rudis
1999-01-01
This publication presents the National Hierarchical Framework of Ecological Units (ECOMAP 1993) by county for the coterminous United States. Assignment of the framework to individual counties is based on the predominant area by province and section to facilitate integration of county-referenced information with areas of uniform ecological potential. Included are maps...
Multiple Object Retrieval in Image Databases Using Hierarchical Segmentation Tree
ERIC Educational Resources Information Center
Chen, Wei-Bang
2012-01-01
The purpose of this research is to develop a new visual information analysis, representation, and retrieval framework for automatic discovery of salient objects of user's interest in large-scale image databases. In particular, this dissertation describes a content-based image retrieval framework which supports multiple-object retrieval. The…
Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Xiujuan; Chen, Jiapei
2017-03-01
Due to the existence of complexities of heterogeneities, hierarchy, discreteness, and interactions in municipal solid waste management (MSWM) systems such as Beijing, China, a series of socio-economic and eco-environmental problems may emerge or worsen and result in irredeemable damages in the following decades. Meanwhile, existing studies, especially ones focusing on MSWM in Beijing, could hardly reflect these complexities in system simulations and provide reliable decision support for management practices. Thus, a framework of distributed mixed-integer fuzzy hierarchical programming (DMIFHP) is developed in this study for MSWM under these complexities. Beijing is selected as a representative case. The Beijing MSWM system is comprehensively analyzed in many aspects such as socio-economic conditions, natural conditions, spatial heterogeneities, treatment facilities, and system complexities, building a solid foundation for system simulation and optimization. Correspondingly, the MSWM system in Beijing is discretized as 235 grids to reflect spatial heterogeneity. A DMIFHP model, which is a nonlinear programming problem, is constructed to parameterize the Beijing MSWM system. To solve it efficiently, a solution algorithm is proposed based on coupling fuzzy programming with mixed-integer linear programming. Innovations and advantages of the DMIFHP framework are discussed. The optimal MSWM schemes and mechanism revelations are discussed in a companion paper due to length limitations.
A hierarchical spatial model for well yield in complex aquifers
NASA Astrophysics Data System (ADS)
Montgomery, J.; O'Sullivan, F.
2017-12-01
Efficiently siting and managing groundwater wells requires reliable estimates of the amount of water that can be produced, or the well yield. This can be challenging to predict in highly complex, heterogeneous fractured aquifers due to the uncertainty around local hydraulic properties. Promising statistical approaches have been advanced in recent years. For instance, kriging and multivariate regression analysis have been applied to well test data with limited but encouraging levels of prediction accuracy. Additionally, some analytical solutions to diffusion in homogeneous porous media have been used to infer "effective" properties consistent with observed flow rates or drawdown. However, this is an under-specified inverse problem with substantial and irreducible uncertainty. We describe a flexible machine learning approach capable of combining diverse datasets with constraining physical and geostatistical models for improved well yield prediction accuracy and uncertainty quantification. Our approach can be implemented within a hierarchical Bayesian framework using Markov Chain Monte Carlo, which allows for additional sources of information to be incorporated in priors to further constrain and improve predictions and reduce the model order. We demonstrate the usefulness of this approach using data from over 7,000 wells in a fractured bedrock aquifer.
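The partial pooling at the heart of such a hierarchical Bayesian model can be illustrated with a minimal sketch (not the authors' implementation): well-level mean log-yields are drawn toward a shared population mean by a Gibbs sampler, with observation and group variances assumed known for simplicity and all data synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log-yields for 5 wells, 8 tests each (hypothetical data).
true_site_means = rng.normal(2.0, 0.5, size=5)
y = true_site_means[:, None] + rng.normal(0.0, 0.3, size=(5, 8))

sigma2, tau2, mu0 = 0.3**2, 0.5**2, 2.0   # assumed-known variances, prior mean
n_wells, n_obs = y.shape
ybar = y.mean(axis=1)

# Gibbs sampler: alternate between well-level means and the population mean.
mu_pop = mu0
draws = []
for it in range(2000):
    # Conditional posterior of each well mean (normal-normal conjugacy).
    prec = n_obs / sigma2 + 1.0 / tau2
    mean = (n_obs * ybar / sigma2 + mu_pop / tau2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))
    # Conditional posterior of the population mean (flat hyperprior).
    mu_pop = rng.normal(theta.mean(), np.sqrt(tau2 / n_wells))
    if it >= 500:                        # discard burn-in
        draws.append(theta)

theta_hat = np.mean(draws, axis=0)       # posterior-mean yield per well
print(np.round(theta_hat, 2))
```

In practice the variances would get their own priors and the sampler would be run with a general-purpose MCMC tool, but the shrinkage of sparsely tested wells toward the population mean works the same way.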
Oh, Sunghee; Song, Seongho
2017-01-01
In gene expression profiling, the data analysis pipeline is categorized into four levels; the major downstream tasks, i.e., (1) identification of differential expression, (2) clustering of co-expression patterns, (3) classification of sample subtypes, and (4) detection of genetic regulatory networks, are performed after preprocessing procedures such as normalization. More specifically, temporal gene expression data have an inherent feature: two neighboring time points (previous and current states) are highly correlated with each other, in contrast to static expression data, in which samples are assumed to be independent. In this chapter, we demonstrate how HMMs and hierarchical Bayesian modeling methods capture the horizontal time-dependency structures in time-series expression profiles, focusing on the identification of differential expression. In addition, the differentially expressed genes and transcript isoforms detected over time in these core prerequisite steps can be further applied to the detection of genetic regulatory networks, to comprehensively uncover dynamic repertoires from a systems biology perspective as a coupled framework.
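The time-dependency idea can be made concrete with the forward algorithm of a two-state HMM (a generic illustration, not code from the chapter): it filters the probability that a gene is differentially expressed at each time point, with all transition and emission values below being toy assumptions.

```python
import numpy as np

# Two hidden states: 0 = not differentially expressed, 1 = differentially expressed.
pi = np.array([0.8, 0.2])                 # initial state distribution (assumed)
A = np.array([[0.9, 0.1],                 # transition matrix: neighboring time
              [0.2, 0.8]])                # points are highly correlated
# Emission likelihoods p(observation_t | state) at 4 time points (toy values).
B = np.array([[0.9, 0.1],
              [0.7, 0.3],
              [0.2, 0.8],
              [0.1, 0.9]])

def forward(pi, A, B):
    """Forward algorithm: filtered state probabilities at each time point."""
    alpha = pi * B[0]
    alpha /= alpha.sum()
    out = [alpha]
    for t in range(1, len(B)):
        alpha = (alpha @ A) * B[t]        # propagate, then weight by evidence
        alpha /= alpha.sum()              # normalize to avoid underflow
        out.append(alpha)
    return np.array(out)

probs = forward(pi, A, B)
print(np.round(probs[:, 1], 3))           # filtered P(differential) over time
```

Because the transition matrix couples neighboring time points, evidence accumulates smoothly instead of flipping the call at every noisy observation, which is exactly the advantage over treating samples as independent.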
Automated plant, production management system
NASA Astrophysics Data System (ADS)
Aksenova, V. I.; Belov, V. I.
1984-12-01
The development of a complex of tasks for the operational management of production (OUP) within the framework of an automated system for production management (ASUP) shows that it is impossible to have effective computations without reliable initial information. The influence of many factors involving the production and economic activity of the entire enterprise upon the plan and course of production is considered. It is suggested that an adequate model should be available which covers all levels of the hierarchical system: workplace, section (brigade), shop, and enterprise; the model should be incorporated into the technological sequence of performance, and there should be provisions for an adequate man-machine system.
Clusters of poverty and disease emerge from feedbacks on an epidemiological network.
Pluciński, Mateusz M; Ngonghala, Calistus N; Getz, Wayne M; Bonds, Matthew H
2013-03-06
The distribution of health conditions is characterized by extreme inequality. These disparities have been alternately attributed to disease ecology and the economics of poverty. Here, we provide a novel framework that integrates epidemiological and economic growth theory on an individual-based hierarchically structured network. Our model indicates that, under certain parameter regimes, feedbacks between disease ecology and economics create clusters of low income and high disease that can stably persist in populations that become otherwise predominantly rich and free of disease. Surprisingly, unlike traditional poverty trap models, these localized disease-driven poverty traps can arise despite homogeneity of parameters and evenly distributed initial economic conditions.
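The disease-driven poverty trap described above can be caricatured with a toy coupled map (a deliberately simplified, non-network stand-in for the authors' individual-based model, with all parameter values invented): transmission falls as income rises, disease drags down income growth, and identical parameters yield two stable outcomes depending only on initial conditions.

```python
import numpy as np

def simulate(M0, I0, steps=500):
    """Coupled income-disease map: infection pressure falls with income M,
    income growth is dragged down by disease prevalence I."""
    M, I = M0, I0
    for _ in range(steps):
        beta = 0.4 / (1.0 + M)                         # transmission falls with income
        I = I + beta * I * (1 - I) - 0.1 * I           # SIS-type prevalence update
        M = M + M * (0.2 * (1 - M / 10.0) - 0.5 * I)   # logistic growth minus disease drag
        I = min(max(I, 0.0), 1.0)
        M = max(M, 1e-9)
    return M, I

M_rich, I_rich = simulate(M0=5.0, I0=0.01)   # starts rich and nearly healthy
M_poor, I_poor = simulate(M0=0.1, I0=0.30)   # starts poor and sick
print(round(M_rich, 2), round(I_rich, 3), round(M_poor, 2), round(I_poor, 3))
```

The first trajectory converges to high income with the disease eliminated; the second locks into low income with endemic disease, mimicking the clustered traps the network model produces despite homogeneous parameters.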
Contemporary cybernetics and its facets of cognitive informatics and computational intelligence.
Wang, Yingxu; Kinsner, Witold; Zhang, Du
2009-08-01
This paper explores the architecture, theoretical foundations, and paradigms of contemporary cybernetics from perspectives of cognitive informatics (CI) and computational intelligence. The modern domain and the hierarchical behavioral model of cybernetics are elaborated at the imperative, autonomic, and cognitive layers. The CI facet of cybernetics is presented, which explains how the brain may be mimicked in cybernetics via CI and neural informatics. The computational intelligence facet is described with a generic intelligence model of cybernetics. The compatibility between natural and cybernetic intelligence is analyzed. A coherent framework of contemporary cybernetics is presented toward the development of transdisciplinary theories and applications in cybernetics, CI, and computational intelligence.
Vehicle logo recognition using multi-level fusion model
NASA Astrophysics Data System (ADS)
Ming, Wei; Xiao, Jianli
2018-04-01
Vehicle logo recognition plays an important role in manufacturer identification and vehicle recognition. This paper proposes a new vehicle logo recognition algorithm. It has a hierarchical framework, which consists of two fusion levels. At the first level, a feature fusion model is employed to map the original features to a higher-dimensional feature space, in which the vehicle logos become more recognizable. At the second level, a weighted voting strategy is proposed to improve the accuracy and robustness of the recognition results. To evaluate the performance of the proposed algorithm, extensive experiments are performed, which demonstrate that the proposed algorithm can achieve high recognition accuracy and work robustly.
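The second-level idea, weighted voting over multiple classifiers, can be sketched as follows (an illustrative scheme with made-up scores and weights, not the paper's exact strategy):

```python
import numpy as np

def weighted_vote(scores, weights):
    """Fuse per-classifier class scores with reliability weights.

    scores:  (n_classifiers, n_classes) array of normalized class scores.
    weights: (n_classifiers,) reliability weights, e.g. validation accuracy.
    """
    w = np.asarray(weights, float)
    w = w / w.sum()                          # normalize weights
    fused = w @ np.asarray(scores, float)    # weighted average of score vectors
    return int(np.argmax(fused)), fused

# Three hypothetical logo classifiers scoring four candidate manufacturers.
scores = [[0.6, 0.2, 0.1, 0.1],
          [0.1, 0.7, 0.1, 0.1],
          [0.5, 0.3, 0.1, 0.1]]
weights = [0.9, 0.4, 0.8]    # the middle classifier is least reliable
label, fused = weighted_vote(scores, weights)
print(label)                 # class favored by the two reliable classifiers
```

Down-weighting the unreliable classifier lets the two agreeing classifiers carry the vote, which is what makes the fused decision more robust than a plain majority.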
Advanced hierarchical distance sampling
Royle, Andy
2016-01-01
In this chapter, we cover a number of important extensions of the basic hierarchical distance-sampling (HDS) framework from Chapter 8. First, we discuss the inclusion of “individual covariates,” such as group size, in the HDS model. This is important in many surveys where animals form natural groups that are the primary observation unit, with the size of the group expected to have some influence on detectability. We also discuss HDS integrated with time-removal and double-observer or capture-recapture sampling. These “combined protocols” can be formulated as HDS models with individual covariates, and thus they have a commonality with HDS models involving group structure (group size being just another individual covariate). We cover several varieties of open-population HDS models that accommodate population dynamics. On one end of the spectrum, we cover models that allow replicate distance sampling surveys within a year, which estimate abundance relative to availability and temporary emigration through time. We consider a robust design version of that model. We then consider models with explicit dynamics based on the Dail and Madsen (2011) model and the work of Sollmann et al. (2015). The final major theme of this chapter is relatively newly developed spatial distance sampling models that accommodate explicit models describing the spatial distribution of individuals known as Point Process models. We provide novel formulations of spatial DS and HDS models in this chapter, including implementations of those models in the unmarked package using a hack of the pcount function for N-mixture models.
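The core ingredients above, a half-normal detection function and an individual covariate such as group size acting on its scale, can be sketched numerically (the covariate model and all numbers are hypothetical; HDS fitting itself would be done in, e.g., the unmarked package):

```python
import numpy as np

def avg_detection_prob(sigma, B, n=10_000):
    """Average detection probability within truncation distance B under a
    half-normal detection function g(x) = exp(-x^2 / (2 sigma^2))."""
    x = np.linspace(0.0, B, n)
    g = np.exp(-x**2 / (2.0 * sigma**2))
    return g.mean()   # ≈ (1/B) * integral of g, for uniform distances on [0, B]

# Larger groups are typically easier to detect; model sigma as increasing
# with group size (hypothetical log-linear individual-covariate effect).
beta0, beta1 = np.log(50.0), 0.3
for group_size in (1, 5, 10):
    sigma = np.exp(beta0 + beta1 * np.log(group_size))
    print(group_size, round(avg_detection_prob(sigma, B=200.0), 3))
```

The printed probabilities increase with group size, which is precisely why ignoring the covariate would bias abundance estimates: poorly detected singletons would be under-counted relative to conspicuous groups.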
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
Toward Model Building for Visual Aesthetic Perception
Lughofer, Edwin; Zeng, Xianyi
2017-01-01
Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to simulate or quantify using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, and quartet models, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and brain activity involved in aesthetic appreciation. PMID:29270194
Haptics-based dynamic implicit solid modeling.
Hua, Jing; Qin, Hong
2004-01-01
This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.
Managing changes in the enterprise architecture modelling context
NASA Astrophysics Data System (ADS)
Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya
2016-07-01
Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has been however limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.
Bayesian Hierarchical Classes Analysis
ERIC Educational Resources Information Center
Leenen, Iwin; Van Mechelen, Iven; Gelman, Andrew; De Knop, Stijn
2008-01-01
Hierarchical classes models are models for "N"-way "N"-mode data that represent the association among the "N" modes and simultaneously yield, for each mode, a hierarchical classification of its elements. In this paper we present a stochastic extension of the hierarchical classes model for two-way two-mode binary data. In line with the original…
NASA Technical Reports Server (NTRS)
Schmidt, Phillip; Garg, Sanjay
1991-01-01
A framework for a decentralized hierarchical controller partitioning structure is developed. This structure allows for the design of separate airframe and propulsion controllers which, when assembled, will meet the overall design criterion for the integrated airframe/propulsion system. An algorithm based on parameter optimization of the state-space representation for the subsystem controllers is described. The algorithm is currently being applied to an integrated flight propulsion control design example.
2016-08-31
crack initiation and SCG mechanisms (initiation and growth versus resistance).
2. Final summary: Here, we present a hierarchical form of multiscale...
1. Prismatic faults in α-Ti: a combined quantum mechanics/molecular mechanics study
2. Nano-indentation and slip transfer (critical in understanding crack... initiation)
3. An extended finite element framework (XFEM) to study SCG mechanisms
4. Atomistic methods to develop a grain and twin boundaries database
Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas
2013-01-01
The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing some sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. This independence eliminates the bias of having the produced approach cover only certain characteristics of the domain and produce samples skewed towards one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates a Hierarchical Markov Model (HIMM), and the third model employs a genetic algorithm. Each model learns as many characteristics of the analysed domain as possible and tries to incorporate the learned characteristics in generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.
The MIL-88A-Derived Fe3O4-Carbon Hierarchical Nanocomposites for Electrochemical Sensing
Wang, Li; Zhang, Yayun; Li, Xia; Xie, Yingzhen; He, Juan; Yu, Jie; Song, Yonghai
2015-01-01
Metal or metal oxides/carbon nanocomposites with hierarchical superstructures have become some of the most promising functional materials in sensing, catalysis, energy conversion, etc. In this work, novel hierarchical Fe3O4/carbon superstructures have been fabricated based on a metal-organic framework (MOF)-derived method. Three kinds of Fe-MOFs (MIL-88A) with different morphologies were prepared beforehand as templates, and then pyrolyzed to fabricate the corresponding novel hierarchical Fe3O4/carbon superstructures. Systematic studies on the thermal decomposition process of the three kinds of MIL-88A and the effect of template morphology on the products were carried out in detail. Scanning electron microscopy, transmission electron microscopy, X-ray powder diffraction, X-ray photoelectron spectroscopy and thermal analysis were employed to investigate the hierarchical Fe3O4/carbon superstructures. Based on these resulting hierarchical Fe3O4/carbon superstructures, a novel and sensitive nonenzymatic N-acetyl cysteine sensor was developed. The porous, hierarchical superstructures and large surface area of the as-formed Fe3O4/carbon superstructures contributed to the good electrocatalytic activity of the prepared sensor towards the oxidation of N-acetyl cysteine. The proposed preparation method for the hierarchical Fe3O4/carbon superstructures is simple, efficient, cheap and amenable to mass production. It may open up a new way for the preparation of hierarchical superstructures. PMID:26387535
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moges, Edom; Demissie, Yonas; Li, Hong-Yi
2016-04-01
In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied for two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach has a better performance over the single model for the Guadalupe catchment, where multiple dominant processes are witnessed through diagnostic measures. In contrast, the diagnostics and aggregated performance measures show that the French Broad has a homogeneous catchment response, making the single model adequate to capture the response.
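The mechanics of gating expert models by an indicator variable can be sketched in a few lines (a generic mixture-of-experts illustration with invented numbers, not the study's HBV-based experts):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy indicator variable (e.g. antecedent wetness) over 6 time steps.
wetness = np.array([0.1, 0.2, 0.8, 0.9, 0.5, 0.3])

# Two hypothetical expert models: one tuned to dry conditions, one to wet.
q_dry = np.array([1.0, 1.1, 0.5, 0.4, 0.9, 1.0])   # expert 1 streamflow (mm/d)
q_wet = np.array([2.0, 2.2, 3.0, 3.3, 2.4, 2.1])   # expert 2 streamflow (mm/d)

# Gating network: logistic weights driven by the indicator variable.
gate_logits = np.stack([5.0 * (0.5 - wetness),      # favors the dry expert
                        5.0 * (wetness - 0.5)], axis=1)
g = softmax(gate_logits)                            # (T, 2) mixture weights

q_hme = g[:, 0] * q_dry + g[:, 1] * q_wet           # gated combination
print(np.round(q_hme, 2))
```

In the full HME framework the gate parameters and each expert's parameters are calibrated jointly, so the data decide which process dominates when, rather than a hard regime switch imposed a priori.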
Gel-based morphological design of zirconium metal-organic frameworks.
Bueken, Bart; Van Velthoven, Niels; Willhammar, Tom; Stassin, Timothée; Stassen, Ivo; Keen, David A; Baron, Gino V; Denayer, Joeri F M; Ameloot, Rob; Bals, Sara; De Vos, Dirk; Bennett, Thomas D
2017-05-01
The ability of metal-organic frameworks (MOFs) to gelate under specific synthetic conditions opens up new opportunities in the preparation and shaping of hierarchically porous MOF monoliths, which could be directly implemented for catalytic and adsorptive applications. In this work, we present the first examples of xero- or aerogel monoliths consisting solely of nanoparticles of several prototypical Zr4+-based MOFs: UiO-66-X (X = H, NH2, NO2, (OH)2), UiO-67, MOF-801, MOF-808 and NU-1000. High reactant and water concentrations during synthesis were observed to induce the formation of gels, which were converted to monolithic materials by drying in air or supercritical CO2. Electron microscopy, combined with N2 physisorption experiments, was used to show that irregular nanoparticle packing leads to pure MOF monoliths with hierarchical pore systems, featuring both intraparticle micropores and interparticle mesopores. Finally, UiO-66 gels were shaped into monolithic spheres of 600 μm diameter using an oil-drop method, creating promising candidates for packed-bed catalytic or adsorptive applications, where hierarchical pore systems can greatly mitigate mass transfer limitations.
A framework for linking cybersecurity metrics to the modeling of macroeconomic interdependencies.
Santos, Joost R; Haimes, Yacov Y; Lian, Chenyang
2007-10-01
Hierarchical decision making is a multidimensional process involving management of multiple objectives (with associated metrics and tradeoffs in terms of costs, benefits, and risks), which span various levels of a large-scale system. The nation is a hierarchical system, as it consists of multiple classes of decisionmakers and stakeholders ranging from national policymakers to operators of specific critical infrastructure subsystems. Critical infrastructures (e.g., transportation, telecommunications, power, banking, etc.) are highly complex and interconnected. These interconnections take the form of flows of information, shared security, and physical flows of commodities, among others. In recent years, economic and infrastructure sectors have become increasingly dependent on networked information systems for efficient operations and timely delivery of products and services. In order to ensure the stability, sustainability, and operability of our critical economic and infrastructure sectors, it is imperative to understand their inherent physical and economic linkages, in addition to their cyber interdependencies. An interdependency model based on a transformation of the Leontief input-output (I-O) model can be used for modeling: (1) the steady-state economic effects triggered by a consumption shift in a given sector (or set of sectors); and (2) the resulting ripple effects to other sectors. The inoperability metric is calculated for each sector; this is achieved by converting the economic impact (typically in monetary units) into a percentage value relative to the size of the sector. Disruptive events such as terrorist attacks, natural disasters, and large-scale accidents have historically shown cascading effects on both consumption and production. Hence, a dynamic model extension is necessary to demonstrate the interplay between combined demand and supply effects. The result is a foundational framework for modeling cybersecurity scenarios for the oil and gas sector.
A hypothetical case study examines a cyber attack that causes a 5-week shortfall in the crude oil supply in the Gulf Coast area.
Cruz, Bruna B.; Miranda, Leandro E.; Cetra, Mauricio
2013-01-01
We hypothesised and tested a hierarchical organisation model where riparian landcover would influence bank composition and light availability, which in turn would influence instream environments and control fish assemblages. The study was conducted during the dry season in 11 headwater tributaries of the Sorocaba River in the upper Paraná River Basin, south-eastern Brazil. We focused on seven environmental factors each represented by one or multiple environmental variables and seven fish functional traits each represented by two or more classes. Multivariate direct gradient analyses suggested that riparian zone landcover can be considered a higher level causal factor in a network of relations that control instream characteristics and fish assemblages. Our results provide a framework for a hierarchical conceptual model that identifies singular and collective influences of variables from different scales on each other and ultimately on different aspects related to stream fish functional composition. This conceptual model is focused on the relationships between riparian landcover and instream variables as causal factors on the organisation of stream fish assemblages. Our results can also be viewed as a model for headwater stream management in that landcover can be manipulated to influence factors such as bank composition, substrates and water quality, whereas fish assemblage composition can be used as indicators to monitor the success of such efforts.
HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, Michael D.; Dawson, William A.; Hogg, David W.
2015-07-01
Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.
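Importance sampling, the device the authors use to decouple local modeling from the global shear inference, can be illustrated in miniature: draws from a convenient proposal density are reweighted to estimate an expectation under a target density. The Gaussian densities and seed below are arbitrary choices for illustration, not part of the paper's framework:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Target density p(x) = N(1, 0.5); proposal q(x) = N(0, 2), deliberately
# broader so it covers the target's support well.
xs = rng.normal(0.0, 2.0, size=200_000)                   # proposal draws
w = normal_pdf(xs, 1.0, 0.5) / normal_pdf(xs, 0.0, 2.0)   # importance weights
w /= w.sum()                                              # self-normalise

# The weighted average of the draws estimates the mean under the target.
posterior_mean = np.sum(w * xs)
```

The estimate converges to the target mean (1.0 here) even though no sample was drawn from the target itself, which is what makes the separation of local and global inference tractable.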
Decoding the Semantic Content of Natural Movies from Human Brain Activity
Huth, Alexander G.; Lee, Tyler; Nishimoto, Shinji; Bilenko, Natalia Y.; Vu, An T.; Gallant, Jack L.
2016-01-01
One crucial test for any quantitative model of the brain is to show that the model can be used to accurately decode information from evoked brain activity. Several recent neuroimaging studies have decoded the structure or semantic content of static visual images from human brain activity. Here we present a decoding algorithm that makes it possible to decode detailed information about the object and action categories present in natural movies from human brain activity signals measured by functional MRI. Decoding is accomplished using a hierarchical logistic regression (HLR) model that is based on labels that were manually assigned from the WordNet semantic taxonomy. This model makes it possible to simultaneously decode information about both specific and general categories, while respecting the relationships between them. Our results show that we can decode the presence of many object and action categories from averaged blood-oxygen level-dependent (BOLD) responses with a high degree of accuracy (area under the ROC curve > 0.9). Furthermore, we used this framework to test whether semantic relationships defined in the WordNet taxonomy are represented the same way in the human brain. This analysis showed that hierarchical relationships between general categories and atypical examples, such as organism and plant, did not seem to be reflected in representations measured by BOLD fMRI. PMID:27781035
Stewart, David R.; Long, James M.
2015-01-01
Species distribution models are useful tools to evaluate habitat relationships of fishes. We used hierarchical Bayesian multispecies mixture models to evaluate the relationships of both detection and abundance with habitat of reservoir fishes caught using tandem hoop nets. A total of 7,212 fish from 12 species were captured, and the majority of the catch was composed of Channel Catfish Ictalurus punctatus (46%), Bluegill Lepomis macrochirus (25%), and White Crappie Pomoxis annularis (14%). Detection estimates ranged from 8% to 69%, and modeling results suggested that fishes were primarily influenced by reservoir size and context, water clarity and temperature, and land-use types. Species were differentially abundant within and among habitat types, and some fishes were found to be more abundant in turbid, less impacted (e.g., by urbanization and agriculture) reservoirs with longer shoreline lengths, whereas other species were found more often in clear, nutrient-rich impoundments that had generally shorter shoreline length and were surrounded by a higher percentage of agricultural land. Our results demonstrated that habitat and reservoir characteristics may differentially benefit species and assemblage structure. This study provides a useful framework for evaluating capture efficiency for not only hoop nets but other gear types used to sample fishes in reservoirs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komiya, Yutaka; Suda, Takuma; Yamada, Shimako
2014-03-10
We investigate the chemical enrichment of r-process elements in the early evolutionary stages of the Milky Way halo within the framework of hierarchical galaxy formation using a semi-analytic merger tree. In this paper, we focus on heavy r-process elements, Ba and Eu, of extremely metal-poor (EMP) stars and give constraints on their astronomical sites. Our models take into account changes of the surface abundances of EMP stars by the accretion of interstellar medium (ISM). We also consider metal-enrichment of intergalactic medium by galactic winds and the resultant pre-enrichment of proto-galaxies. The trend and scatter of the observed r-process abundances are well reproduced by our hierarchical model with ∼10% of core-collapse supernovae at the low-mass end (∼10 M☉) as a dominant r-process source and a star formation efficiency of ∼10⁻¹⁰ yr⁻¹. For neutron star mergers as an r-process source, their coalescence timescale has to be ∼10⁷ yr, and the event rates ∼100 times larger than currently observed in the Galaxy. We find that the accretion of ISM is a dominant source of r-process elements for stars with [Ba/H] < –3.5. In this model, a majority of stars at [Fe/H] < –3 are formed without r-process elements, but their surfaces are polluted by the ISM accretion. The pre-enrichment affects ∼4% of proto-galaxies, and yet, is surpassed by the ISM accretion in the surface of EMP stars.
Zhang, Gen; Tsujimoto, Masahiko; Packwood, Daniel; Duong, Nghia Tuan; Nishiyama, Yusuke; Kadota, Kentaro; Kitagawa, Susumu; Horike, Satoshi
2018-02-21
Covalent organic frameworks (COFs) represent an emerging class of crystalline porous materials that are constructed by the assembly of organic building blocks linked via covalent bonds. Several strategies have been developed for the construction of new COF structures; however, a facile approach to fabricate hierarchical COF architectures with controlled domain structures remains a significant challenge, and has not yet been achieved. In this study, a dynamic covalent chemistry (DCC)-based postsynthetic approach was employed at the solid-liquid interface to construct such structures. Two-dimensional imine-bonded COFs having different aromatic groups were prepared, and a homogeneously mixed-linker structure and a heterogeneously core-shell hollow structure were fabricated by controlling the reactivity of the postsynthetic reactions. Solid-state nuclear magnetic resonance (NMR) spectroscopy and transmission electron microscopy (TEM) confirmed the structures. COFs prepared by a postsynthetic approach exhibit several functional advantages compared with their parent phases. Their Brunauer-Emmett-Teller (BET) surface areas are 2-fold greater than those of their parent phases because of the higher crystallinity. In addition, the hydrophilicity of the material and the stepwise adsorption isotherms of H 2 O vapor in the hierarchical frameworks were precisely controlled, which was feasible because of the distribution of various domains of the two COFs by controlling the postsynthetic reaction. The approach opens new routes for constructing COF architectures with functionalities that are not possible in a single phase.
Yee, Susan H; Bradley, Patricia; Fisher, William S; Perreault, Sally D; Quackenboss, James; Johnson, Eric D; Bousquin, Justin; Murphy, Patricia A
2012-12-01
The U.S. Environmental Protection Agency has recently realigned its research enterprise around the concept of sustainability. Scientists from across multiple disciplines have a role to play in contributing the information, methods, and tools needed to more fully understand the long-term impacts of decisions on the social and economic sustainability of communities. Success will depend on a shift in thinking to integrate, organize, and prioritize research within a systems context. We used the Driving forces-Pressures-State-Impact-Response (DPSIR) framework as a basis for integrating social, cultural, and economic aspects of environmental and human health into a single framework. To make the framework broadly applicable to sustainability research planning, we provide a hierarchical system of DPSIR keywords and guidelines for use as a communication tool. The applicability of the integrated framework was first tested on a public health issue (asthma disparities) for purposes of discussion. We then applied the framework at a science planning meeting to identify opportunities for sustainable and healthy communities research. We conclude that an integrated systems framework has many potential roles in science planning, including identifying key issues, visualizing interactions within the system, identifying research gaps, organizing information, developing computational models, and identifying indicators.
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
The Frontal Lobes and Theory of Mind: Developmental Concepts from Adult Focal Lesion Research
ERIC Educational Resources Information Center
Stuss, Donald T.; Anderson, Vicki
2004-01-01
The primary objective in this paper is to present a framework to understand the structure of consciousness. We argue that consciousness has been difficult to define because there are different kinds of consciousness, hierarchically organized, which need to be differentiated. Our framework is based on evidence from adult focal lesion research. The…
Ryan A. McManamay; Donald J. Orth; Charles A. Dolloff; Emmaneul A. Firmpong
2012-01-01
River regulation has resulted in substantial losses in habitat connectivity, biodiversity and ecosystem services. River managers are faced with a growing need to protect the key aspects of the natural flow regime. A practical approach to providing environmental flow standards is to create a regional framework by classifying unregulated streams into groups of similar...
Cohen, Mark E; Dimick, Justin B; Bilimoria, Karl Y; Ko, Clifford Y; Richards, Karen; Hall, Bruce Lee
2009-12-01
Although logistic regression has commonly been used to adjust for risk differences in patient and case mix to permit quality comparisons across hospitals, hierarchical modeling has been advocated as the preferred methodology, because it accounts for clustering of patients within hospitals. It is unclear whether hierarchical models would yield important differences in quality assessments compared with logistic models when applied to American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) data. Our objective was to evaluate differences in logistic versus hierarchical modeling for identifying hospitals with outlying outcomes in the ACS-NSQIP. Data from ACS-NSQIP patients who underwent colorectal operations in 2008 at hospitals that reported at least 100 operations were used to generate logistic and hierarchical prediction models for 30-day morbidity and mortality. Differences in risk-adjusted performance (ratio of observed-to-expected events) and outlier detections from the two models were compared. Logistic and hierarchical models identified the same 25 hospitals as morbidity outliers (14 low and 11 high outliers), but the hierarchical model identified 2 additional high outliers. Both models identified the same eight hospitals as mortality outliers (five low and three high outliers). The values of observed-to-expected events ratios and p values from the two models were highly correlated. Results were similar when data were permitted from hospitals providing < 100 patients. When applied to ACS-NSQIP data, logistic and hierarchical models provided nearly identical results with respect to identification of hospitals' observed-to-expected events ratio outliers. As hierarchical models are prone to implementation problems, logistic regression will remain an accurate and efficient method for performing risk adjustment of hospital quality comparisons.
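The observed-to-expected (O/E) comparison at the heart of both modeling approaches can be sketched directly. The predicted risks and outcomes below are made-up numbers for one hypothetical hospital; the logistic model that would produce the risks is not shown:

```python
import numpy as np

# Hypothetical per-patient morbidity risks predicted by a logistic model
# fitted on pooled data, and the observed outcomes at one hospital.
predicted_risk = np.array([0.10, 0.25, 0.05, 0.40, 0.15, 0.30])
observed = np.array([0, 1, 0, 1, 1, 0])   # 1 = morbidity event

expected_events = predicted_risk.sum()    # risk-adjusted expectation
observed_events = observed.sum()
oe_ratio = observed_events / expected_events   # > 1: worse than expected
```

An O/E ratio above 1 flags worse-than-expected outcomes; outlier status is then judged against the ratio's confidence interval, which is where the logistic and hierarchical treatments can diverge.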
Manca, Andrea; Lambert, Paul C; Sculpher, Mark; Rice, Nigel
2008-01-01
Healthcare cost-effectiveness analysis (CEA) often uses individual patient data (IPD) from multinational randomised controlled trials. Although designed to account for between-patient sampling variability in the clinical and economic data, standard analytical approaches to CEA ignore the presence of between-location variability in the study results. This is a restrictive limitation given that countries often differ in factors that could affect the results of CEAs, such as the availability of healthcare resources, their unit costs, clinical practice, and patient case-mix. We advocate the use of Bayesian bivariate hierarchical modelling to analyse multinational cost-effectiveness data. This analytical framework explicitly recognises that patient-level costs and outcomes are nested within countries. Using real life data, we illustrate how the proposed methods can be applied to obtain (a) more appropriate estimates of overall cost-effectiveness and associated measure of sampling uncertainty compared to standard CEA; and (b) country-specific cost-effectiveness estimates which can be used to assess the between-location variability of the study results, while controlling for differences in country-specific and patient-specific characteristics. It is demonstrated that results from standard CEA using IPD from multinational trials display a large degree of variability across the 17 countries included in the analysis, producing potentially misleading results. In contrast, ‘shrinkage estimates’ obtained from the modelling approach proposed here facilitate the appropriate quantification of country-specific cost-effectiveness estimates, while weighting the results based on the level of information available within each country. We suggest that the methods presented here represent a general framework for the analysis of economic data collected from different locations. PMID:17641141
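The "shrinkage" behaviour described above can be illustrated with the classic normal-normal partial-pooling formula. The country means, sample sizes, and variance components below are invented for the sketch and are much simpler than the bivariate Bayesian model the paper develops:

```python
import numpy as np

# Hypothetical country-level mean incremental net-benefit estimates,
# their sample sizes, and assumed within-/between-country variances.
country_mean = np.array([120.0, 40.0, 95.0])
n = np.array([400, 30, 150])          # patients per country
sigma2 = 80.0 ** 2                    # within-country variance (assumed)
tau2 = 25.0 ** 2                      # between-country variance (assumed)

grand_mean = np.average(country_mean, weights=n)

# Normal-normal partial pooling: countries with little data borrow more
# strength from (are shrunk further toward) the overall mean.
w = tau2 / (tau2 + sigma2 / n)
shrunk = w * country_mean + (1 - w) * grand_mean
```

The country with only 30 patients is pulled much further toward the pooled mean than the country with 400, which is exactly the information-weighting behaviour the abstract attributes to shrinkage estimates.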
Robust, Efficient Depth Reconstruction With Hierarchical Confidence-Based Matching.
Sun, Li; Chen, Ke; Song, Mingli; Tao, Dacheng; Chen, Gang; Chen, Chun
2017-07-01
In recent years, taking photos and capturing videos with mobile devices have become increasingly popular. Emerging applications based on the depth reconstruction technique have been developed, such as Google Lens Blur. However, depth reconstruction is difficult due to occlusions, non-diffuse surfaces, repetitive patterns, and textureless surfaces, and it has become more difficult due to the unstable image quality and uncontrolled scene condition in the mobile setting. In this paper, we present a novel hierarchical framework with multi-view confidence-based matching for robust, efficient depth reconstruction in uncontrolled scenes. Particularly, the proposed framework combines local cost aggregation with global cost optimization in a complementary manner that increases efficiency and accuracy. A depth map is efficiently obtained in a coarse-to-fine manner by using an image pyramid. Moreover, confidence maps are computed to robustly fuse multi-view matching cues, and to constrain the stereo matching on a finer scale. The proposed framework has been evaluated with challenging indoor and outdoor scenes, and has achieved robust and efficient depth reconstruction.
What are hierarchical models and how do we analyze them?
Royle, Andy
2016-01-01
In this chapter we provide a basic definition of hierarchical models and introduce the two canonical hierarchical models in this book: site occupancy and N-mixture models. The former is a hierarchical extension of logistic regression and the latter is a hierarchical extension of Poisson regression. We introduce basic concepts of probability modeling and statistical inference including likelihood and Bayesian perspectives. We go through the mechanics of maximizing the likelihood and characterizing the posterior distribution by Markov chain Monte Carlo (MCMC) methods. We give a general perspective on topics such as model selection and assessment of model fit, although we demonstrate these topics in practice in later chapters (especially Chapters 5, 6, 7, and 10).
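A minimal sketch of the site-occupancy model named above, using simulated data and a crude grid search in place of the likelihood-maximization and MCMC machinery the chapter develops:

```python
from math import comb

import numpy as np

rng = np.random.default_rng(1)

# Simulate the canonical site-occupancy model: z_i ~ Bernoulli(psi) is the
# latent occupancy state, and y_ij | z_i ~ Bernoulli(z_i * p) for J visits.
psi_true, p_true, n_sites, J = 0.6, 0.4, 2000, 5
z = rng.random(n_sites) < psi_true
y = (rng.random((n_sites, J)) < p_true) & z[:, None]
counts = np.bincount(y.sum(axis=1), minlength=J + 1)  # sites per detection count

def neg_log_lik(psi, p):
    # Marginal probability of k detections at a site: occupied sites give
    # Binomial(J, p) detections; k = 0 can also mean the site is unoccupied.
    probs = np.array([
        psi * comb(J, k) * p**k * (1 - p)**(J - k) + (1 - psi) * (k == 0)
        for k in range(J + 1)
    ])
    return -(counts * np.log(probs)).sum()

# Crude grid-search MLE (a sketch; real analyses use a proper optimiser
# or MCMC, as the chapter describes).
grid = np.linspace(0.05, 0.95, 91)
psi_hat, p_hat = min(((a, b) for a in grid for b in grid),
                     key=lambda ab: neg_log_lik(*ab))
```

Because repeat visits separate detection from occupancy, the two parameters are individually identifiable, and the estimates land close to the simulated truth.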
Working toward integrated models of alpine plant distribution
Carlson, Bradley Z.; Randin, Christophe F.; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2014-01-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial–temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution. PMID:24790594
Epidemic spreading on hierarchical geographical networks with mobile agents
NASA Astrophysics Data System (ADS)
Han, Xiao-Pu; Zhao, Zhi-Dan; Hadzibeganovic, Tarik; Wang, Bing-Hong
2014-05-01
Hierarchical geographical traffic networks are critical for our understanding of scaling laws in human trajectories. Here, we investigate the susceptible-infected epidemic process evolving on hierarchical networks in which agents randomly walk along the edges and establish contacts in network nodes. We employ a metapopulation modeling framework that allows us to explore the contagion spread patterns in relation to multi-scale mobility behaviors. A series of computer simulations revealed that a shifted power-law-like negative relationship between the peak timing of epidemics τ0 and population density, and a logarithmic positive relationship between τ0 and the network size, can both be explained by the gradual enlargement of fluctuations in the spreading process. We employ a semi-analytical method to better understand the nature of these relationships and the role of pertinent demographic factors. Additionally, we provide a quantitative discussion of the efficiency of a border screening procedure in delaying epidemic outbreaks on hierarchical networks, yielding a rather limited feasibility of this mitigation strategy but also its non-trivial dependence on population density, infector detectability, and the diversity of the susceptible region. Our results suggest that the interplay between the human spatial dynamics, network topology, and demographic factors can have important consequences for the global spreading and control of infectious diseases. These findings provide novel insights into the combined effects of human mobility and the organization of geographical networks on spreading processes, with important implications for both epidemiological research and health policy.
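The qualitative link between contact intensity and peak timing τ0 can be sketched with a deterministic, well-mixed SI model. This strips out the hierarchical network and mobility structure the study actually analyses, and the rates below are arbitrary; it only illustrates why denser contact brings the epidemic peak forward:

```python
def peak_incidence_time(beta, i0=1e-4, dt=0.01, t_max=200.0):
    # Deterministic SI dynamics di/dt = beta * i * (1 - i): prevalence
    # saturates at 1, so we track when the *rate of new infections* peaks.
    i = i0
    best_t, best_rate = 0.0, 0.0
    for s in range(int(t_max / dt)):
        rate = beta * i * (1 - i)
        if rate > best_rate:
            best_rate, best_t = rate, s * dt
        i += rate * dt
    return best_t

# A higher contact rate (a crude stand-in for higher population density)
# moves the incidence peak earlier.
t_low, t_high = peak_incidence_time(0.5), peak_incidence_time(2.0)
```

In the full metapopulation setting, stochastic fluctuations and network topology modulate this baseline behaviour, which is what produces the shifted power-law and logarithmic relationships reported above.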
Array-based Hierarchical Mesh Generation in Parallel
Ray, Navamita; Grindeanu, Iulian; Zhao, Xinglin; ...
2015-11-03
In this paper, we describe an array-based hierarchical mesh generation capability through uniform refinement of unstructured meshes for efficient solution of PDEs using finite element methods and multigrid solvers. A multi-degree, multi-dimensional and multi-level framework is designed to generate the nested hierarchies from an initial mesh that can be used for a number of purposes, from multi-level methods to generating large meshes. The capability is developed under the parallel mesh framework “Mesh Oriented dAtaBase”, a.k.a. MOAB. We describe the underlying data structures and algorithms to generate such hierarchies and present numerical results for computational efficiency and mesh quality. In conclusion, we also present results to demonstrate the applicability of the developed capability to a multigrid finite-element solver.
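Uniform refinement of an unstructured triangle mesh, the operation that generates the nested hierarchy, can be sketched as the standard 1-to-4 split with shared edge midpoints. This is a generic illustration, not MOAB's array-based implementation:

```python
def refine(vertices, triangles):
    """One level of uniform refinement: each triangle is split into four
    congruent children by inserting edge midpoints (shared via a cache)."""
    vertices = list(vertices)
    midpoint = {}

    def mid(a, b):
        key = (min(a, b), max(a, b))
        if key not in midpoint:
            (xa, ya), (xb, yb) = vertices[a], vertices[b]
            vertices.append(((xa + xb) / 2, (ya + yb) / 2))
            midpoint[key] = len(vertices) - 1
        return midpoint[key]

    out = []
    for a, b, c in triangles:
        ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return vertices, out

# Two levels of refinement of a single triangle: 1 -> 4 -> 16 elements.
# The nested coarse/fine meshes are exactly what multigrid solvers need.
v, t = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)], [(0, 1, 2)]
v1, t1 = refine(v, t)
v2, t2 = refine(v1, t1)
```

The midpoint cache guarantees that neighbouring triangles share refined vertices, keeping the hierarchy conforming at every level.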
MOF-derived hierarchical double-shelled NiO/ZnO hollow spheres for high-performance supercapacitors.
Li, Guo-Chang; Liu, Peng-Fei; Liu, Rui; Liu, Minmin; Tao, Kai; Zhu, Shuai-Ru; Wu, Meng-Ke; Yi, Fei-Yan; Han, Lei
2016-09-14
Yolk-shell bimetallic organic framework microspheres composed of nanorods are successfully synthesized by a one-step solvothermal method in the absence of any template or surfactant. Furthermore, hierarchical double-shelled NiO/ZnO hollow spheres are obtained by calcination of the bimetallic organic frameworks in air. The NiO/ZnO hollow spheres, as supercapacitor electrodes, exhibit a high capacitance of 497 F g(-1) at a current density of 1.3 A g(-1) and present superior cycling stability. The superior electrochemical performance is believed to come from the unique double-shelled NiO/ZnO hollow structures, which offer free space to accommodate the volume change during ion insertion and extraction, as well as provide rich electroactive sites for the electrochemical reactions.
Crimmins, Shawn M.; Walleser, Liza R.; Hertel, Dan R.; McKann, Patrick C.; Rohweder, Jason J.; Thogmartin, Wayne E.
2016-01-01
There is growing need to develop models of spatial patterns in animal abundance, yet comparatively few examples of such models exist. This is especially true in situations where the abundance of one species may inhibit that of another, such as the intensively-farmed landscape of the Prairie Pothole Region (PPR) of the central United States, where waterfowl production is largely constrained by mesocarnivore nest predation. We used a hierarchical Bayesian approach to relate the distribution of various land-cover types to the relative abundances of four mesocarnivores in the PPR: coyote Canis latrans, raccoon Procyon lotor, red fox Vulpes vulpes, and striped skunk Mephitis mephitis. We developed models for each species at multiple spatial resolutions (41.4 km2, 10.4 km2, and 2.6 km2) to address different ecological and management-related questions. Model results for each species were similar irrespective of resolution. We found that the amount of row-crop agriculture was nearly ubiquitous in our best models, exhibiting a positive relationship with relative abundance for each species. The amount of native grassland land-cover was positively associated with coyote and raccoon relative abundance, but generally absent from models for red fox and skunk. Red fox and skunk were positively associated with each other, suggesting potential niche overlap. We found no evidence that coyote abundance limited that of other mesocarnivore species, as might be expected under a hypothesis of mesopredator release. The relationships between relative abundance and land-cover types were similar across spatial resolutions. Our results indicated that mesocarnivores in the PPR are most likely to occur in portions of the landscape with large amounts of agricultural land-cover. Further, our results indicated that track-survey data can be used in a hierarchical framework to gain inferences regarding spatial patterns in animal relative abundance.
Nilsen, Erlend B; Strand, Olav
2018-01-01
We developed a model for estimating demographic rates and population abundance based on multiple data sets revealing information about population age- and sex structure. Such models have previously been described in the literature as change-in-ratio models, but we extend the applicability of the models by i) using time series data allowing the full temporal dynamics to be modelled, by ii) casting the model in an explicit hierarchical modelling framework, and by iii) estimating parameters based on Bayesian inference. Based on sensitivity analyses we conclude that the approach developed here is able to obtain estimates of demographic rates with high precision whenever unbiased data of population structure are available. Our simulations revealed that this was true also when data on population abundance are not available or not included in the modelling framework. Nevertheless, when data on population structure are biased due to different observability of different age- and sex categories, this will affect estimates of all demographic rates. Estimates of population size are particularly sensitive to such biases, whereas demographic rates can be relatively precisely estimated even with biased observation data as long as the bias is not severe. We then use the models to estimate demographic rates and population abundance for two Norwegian reindeer (Rangifer tarandus) populations where age-sex data were available for all harvested animals, where population structure surveys were carried out in early summer (after calving) and late fall (after hunting season), and where population size was counted in winter. We found that demographic rates were similar regardless of whether we included population count data in the modelling, but that the estimated population size is affected by this decision.
This suggests that monitoring programs that focus on population age- and sex structure will benefit from collecting additional data that allow estimation of observability for different age- and sex classes. In addition, our sensitivity analysis suggests that focusing monitoring towards changes in demographic rates might be more feasible than monitoring abundance in many situations where data on population age- and sex structure can be collected.
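The classical two-class change-in-ratio estimator that the hierarchical model above generalises can be written down directly. The survey ratios and harvest numbers below are hypothetical, chosen only to make the arithmetic visible:

```python
def change_in_ratio_abundance(p1, p2, removals_x, removals_total):
    """Classic two-class change-in-ratio estimator of pre-removal abundance.

    p1, p2 -- proportion of class x (e.g. males) before and after removals
    removals_x, removals_total -- class-x removals and total removals
    """
    if p1 == p2:
        raise ValueError("the class ratio must change for the estimator to exist")
    return (removals_x - removals_total * p2) / (p1 - p2)

# Hypothetical survey: 50% males before the hunt, 40% after, following a
# harvest of 200 animals of which 150 were males.
n_hat = change_in_ratio_abundance(0.5, 0.4, 150, 200)
```

Sanity check: a population of 700 with 350 males, minus 150 males and 50 females, leaves 200 males among 500 animals (40%), so the estimator recovers 700 exactly; the paper's contribution is embedding this logic in a time-series, hierarchical Bayesian setting.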
Funabashi, Hiroto; Takeuchi, Satoshi; Tsujimura, Seiya
2017-03-23
We designed a three-dimensional (3D) hierarchical pore structure to improve the current production efficiency and stability of direct electron transfer-type biocathodes. The 3D hierarchical electrode structure was fabricated using a MgO-templated porous carbon framework produced from two MgO templates with sizes of 40 and 150 nm. The results revealed that the optimal pore composition for a bilirubin oxidase-catalysed oxygen reduction cathode was a mixture of 33% macropores and 67% mesopores (MgOC33). The macropores improve mass transfer inside the carbon material, and the mesopores improve the electron transfer efficiency of the enzyme by surrounding the enzyme with carbon.
A framework for the management of intellectual capital in the health care industry.
Grantham, C E; Nichols, L D; Schonberner, M
1997-01-01
This article proposes a new theoretical model for the effective management of intellectual capital in the health care industry. The evolution of knowledge-based resources as a value-adding characteristic of service industries coupled with mounting environmental pressures on health care necessitates the extension of current models of intellectual capital. Our theoretical model contains an expanded context linking its development to organizational learning theory and extends current theory by proposing a six-term archetype of organizational functioning built on flows of information. Further, our proposal offers a hierarchical dimension to intellectual capital and a method of scientific visualization for the measurement of intellectual capital. In conclusion, we offer some practical suggestions for future development, both for researchers and managers.
Bayesian analysis of non-homogeneous Markov chains: application to mental health data.
Sung, Minje; Soyer, Refik; Nhan, Nguyen
2007-07-10
In this paper we present a formal treatment of non-homogeneous Markov chains by introducing a hierarchical Bayesian framework. Our work is motivated by the analysis of correlated categorical data which arise in assessment of psychiatric treatment programs. In our development, we introduce a Markovian structure to describe the non-homogeneity of transition patterns. In doing so, we introduce a logistic regression set-up for Markov chains and incorporate covariates in our model. We present a Bayesian model using Markov chain Monte Carlo methods and develop inference procedures to address issues encountered in the analyses of data from psychiatric treatment programs. Our model and inference procedures are applied to real data from a psychiatric treatment study. Copyright 2006 John Wiley & Sons, Ltd.
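The logistic regression set-up for a non-homogeneous Markov chain can be sketched for a two-state case: one transition probability is a logistic function of time and a covariate, so the transition matrix changes from step to step. The coefficients and the fixed relapse probability below are assumed values for illustration, not the paper's estimates:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def transition_matrix(t, covariate, beta):
    # Non-homogeneous two-state chain: the probability of moving from
    # state 0 ("symptomatic") to state 1 ("improved") follows a logistic
    # regression in time and a treatment covariate.
    p01 = sigmoid(beta[0] + beta[1] * t + beta[2] * covariate)
    p10 = 0.1  # fixed relapse probability for the sketch
    return np.array([[1 - p01, p01],
                     [p10, 1 - p10]])

beta = np.array([-2.0, 0.3, 1.0])  # hypothetical coefficients
dist = np.array([1.0, 0.0])        # everyone starts symptomatic
for t in range(12):
    dist = dist @ transition_matrix(t, covariate=1.0, beta=beta)
```

In the paper's hierarchical Bayesian treatment, the coefficients would be given priors and estimated by MCMC rather than fixed as here.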
Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.
2017-12-01
Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients.
Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
Ilott, Irene; Gerrish, Kate; Eltringham, Sabrina A; Taylor, Carolyn; Pownall, Sue
2016-08-18
Swallowing difficulties challenge patient safety due to the increased risk of malnutrition, dehydration and aspiration pneumonia. A theoretically driven study was undertaken to examine the spread and sustainability of a locally developed innovation that involved using the Inter-Professional Dysphagia Framework to structure education for the workforce. A conceptual framework with 3 spread strategies (hierarchical control, participatory adaptation and facilitated evolution) was blended with a processual approach to sustaining organisational change. The aim was to understand the processes, mechanism and outcomes associated with the spread and sustainability of this safety initiative. An instrumental case study, prospectively tracked a dysphagia innovation for 34 months (April 2011 to January 2014) in a large health care organisation in England. A train-the-trainer intervention (as participatory adaptation) was deployed on care pathways for stroke and fractured neck of femur. Data were collected at the organisational and clinical level through interviews (n = 30) and document review. The coding frame combined the processual approach with the spread mechanisms. Pre-determined outcomes included the number of staff trained about dysphagia and impact related to changes in practice. The features and processes associated with hierarchical control and participatory adaptation were identified. Leadership, critical junctures, temporality and making the innovation routine were aspects of hierarchical control. Participatory adaptation was evident on the care pathways through stakeholder responses, workload and resource pressures. Six of the 25 ward based trainers cascaded the dysphagia training. The expected outcomes were achieved when the top-down mandate (hierarchical control) was supplemented by local engagement and support (participatory adaptation). 
Frameworks for spread and sustainability were combined to create a 'small theory' that described the interventions, the processes and desired outcomes a priori. This novel methodological approach confirmed what is known about spread and sustainability, highlighted the particularity of change and offered new insights into the factors associated with hierarchical control and participatory adaptation. The findings illustrate the dualities of organisational change as universal and context specific; as particular and amenable to theoretical generalisation. Appreciating these dualities may contribute to understanding why many innovations fail to become routine.
Computer-based analysis of microvascular alterations in a mouse model for Alzheimer's disease
NASA Astrophysics Data System (ADS)
Heinzer, Stefan; Müller, Ralph; Stampanoni, Marco; Abela, Rafael; Meyer, Eric P.; Ulmann-Schuler, Alexandra; Krucker, Thomas
2007-03-01
Vascular factors associated with Alzheimer's disease (AD) have recently gained increased attention. To investigate changes in vascular, particularly microvascular architecture, we developed a hierarchical imaging framework to obtain large-volume, high-resolution 3D images from brains of transgenic mice modeling AD. In this paper, we present imaging and data analysis methods which allow compiling unique characteristics from several hundred gigabytes of image data. Image acquisition is based on desktop micro-computed tomography (µCT) and local synchrotron-radiation µCT (SRµCT) scanning with a nominal voxel size of 16 µm and 1.4 µm, respectively. Two visualization approaches were implemented: stacks of Z-buffer projections for fast data browsing, and progressive-mesh based surface rendering for detailed 3D visualization of the large datasets. In a first step, image data was assessed visually via a Java client connected to a central database. Identified characteristics of interest were subsequently quantified using global morphometry software. To obtain even deeper insight into microvascular alterations, tree analysis software was developed providing local morphometric parameters such as number of vessel segments or vessel tortuosity. In the context of ever increasing image resolution and large datasets, computer-aided analysis has proven both powerful and indispensable. The hierarchical approach maintains the context of local phenomena, while proper visualization and morphometry provide the basis for detailed analysis of the pathology related to structure. Beyond analysis of microvascular changes in AD this framework will have significant impact considering that vascular changes are involved in other neurodegenerative diseases as well as in cancer, cardiovascular disease, asthma, and arthritis.
Semantic Context Detection Using Audio Event Fusion
NASA Astrophysics Data System (ADS)
Chu, Wei-Ta; Cheng, Wen-Huang; Wu, Ja-Ling
2006-12-01
Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine (SVM)) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.
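At the event level, scoring an observation sequence against an HMM uses the standard forward recursion. The sketch below is a generic illustration of that recursion; the two-state, two-symbol parameters are toy values, not the paper's trained audio-event models:

```python
def forward(obs, pi, A, B):
    """HMM forward algorithm: returns P(obs | model) by propagating the
    joint probability of the observed prefix and the current hidden state."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [B[t][o] * sum(alpha[s] * A[s][t] for s in range(n))
                 for t in range(n)]
    return sum(alpha)

# Toy model: 2 hidden states, 2 observable symbols.
pi = [0.6, 0.4]                       # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]          # state transition probabilities
B = [[0.9, 0.1], [0.2, 0.8]]          # emission probabilities
likelihood = forward([0, 1, 1], pi, A, B)
```

An event detector of this kind would train one such model per audio event and assign a window to the event whose model yields the highest likelihood.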
On the importance of avoiding shortcuts in applying cognitive models to hierarchical data.
Boehm, Udo; Marsman, Maarten; Matzke, Dora; Wagenmakers, Eric-Jan
2018-06-12
Psychological experiments often yield data that are hierarchically structured. A number of popular shortcut strategies in cognitive modeling do not properly accommodate this structure and can result in biased conclusions. To gauge the severity of these biases, we conducted a simulation study for a two-group experiment. We first considered a modeling strategy that ignores the hierarchical data structure. In line with theoretical results, our simulations showed that Bayesian and frequentist methods that rely on this strategy are biased towards the null hypothesis. Secondly, we considered a modeling strategy that takes a two-step approach by first obtaining participant-level estimates from a hierarchical cognitive model and subsequently using these estimates in a follow-up statistical test. Methods that rely on this strategy are biased towards the alternative hypothesis. Only hierarchical models of the multilevel data lead to correct conclusions. Our results are particularly relevant for the use of hierarchical Bayesian parameter estimates in cognitive modeling.
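The data structure at issue can be made concrete with a small simulation of one group. The group mean, variance components, and sample sizes below are arbitrary choices for illustration, not the study's simulation settings:

```python
import random
import statistics

def simulate_group(mu, sd_between, sd_within, n_subj, n_trials, rng):
    """Hierarchical data: each subject's mean is drawn around the group
    mean, and each trial is drawn around that subject's mean."""
    data = []
    for _ in range(n_subj):
        theta = rng.gauss(mu, sd_between)       # subject-level mean
        data.append([rng.gauss(theta, sd_within) for _ in range(n_trials)])
    return data

rng = random.Random(42)
group_a = simulate_group(0.0, 1.0, 2.0, n_subj=20, n_trials=5, rng=rng)
group_b = simulate_group(0.5, 1.0, 2.0, n_subj=20, n_trials=5, rng=rng)

# The two-step shortcut collapses each subject to a point estimate before
# testing, discarding within-subject uncertainty -- the strategy the paper
# shows is biased towards the alternative hypothesis.
estimates_a = [statistics.mean(subj) for subj in group_a]
```

A hierarchical model would instead estimate the subject- and group-level parameters jointly from the trial-level data.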
Geiser, Christian; Bishop, Jacob; Lockhart, Ginger; Shiffman, Saul; Grenard, Jerry L.
2013-01-01
Latent state-trait (LST) and latent growth curve (LGC) models are frequently used in the analysis of longitudinal data. Although it is well-known that standard single-indicator LGC models can be analyzed within either the structural equation modeling (SEM) or multilevel (ML; hierarchical linear modeling) frameworks, few researchers realize that LST and multivariate LGC models, which use multiple indicators at each time point, can also be specified as ML models. In the present paper, we demonstrate that using the ML-SEM rather than the SL-SEM framework to estimate the parameters of these models can be practical when the study involves (1) a large number of time points, (2) individually-varying times of observation, (3) unequally spaced time intervals, and/or (4) incomplete data. Despite the practical advantages of the ML-SEM approach under these circumstances, there are also some limitations that researchers should consider. We present an application to an ecological momentary assessment study (N = 158 youths with an average of 23.49 observations of positive mood per person) using the software Mplus (Muthén and Muthén, 1998–2012) and discuss advantages and disadvantages of using the ML-SEM approach to estimate the parameters of LST and multiple-indicator LGC models. PMID:24416023
An HDF5-based framework for the distribution and analysis of ultrasonic concrete data
NASA Astrophysics Data System (ADS)
Prince, Luke; Clayton, Dwight; Santos-Villalobos, Hector
2017-02-01
There are many commercial ultrasonic tomography devices (UTDs) available for use in nondestructive evaluation (NDE) of reinforced concrete structures. These devices emit, measure, and store ultrasonic signals typically in the 25 kHz to 5 MHz frequency range. UTDs are characterized by a composition of multiple transducers, also known as a transducer array or phased array. Often, UTD data are in a proprietary format. Consequently, NDE research data is limited to those who have prior non-disclosure agreements or the appropriate licenses. Thus, there is a need for a universal data framework so that proprietary datasets for different concrete specimens can be converted, organized, and stored with relevant metadata for individual or collaborative NDE research. Building upon the Hierarchical Data Format (HDF5) model, we have developed a UTD data management framework and Graphical User Interface (GUI) to promote the algorithmic reconstruction of ultrasonic data in a controlled environment for easily reproducible and publishable results.
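A minimal sketch of such an HDF5 layout, assuming the h5py library. The group names, attribute keys, and placeholder waveforms are invented for illustration; they are not the framework's actual schema:

```python
import h5py

# In-memory HDF5 file standing in for a shared archive.
f = h5py.File("utd_demo.h5", "w", driver="core", backing_store=False)

# One group per specimen/scan, with device metadata stored as attributes
# alongside the raw A-scan waveforms (placeholder zeros here).
scan = f.create_group("specimen_A/scan_001")
scan.attrs["device"] = "phased-array UTD (hypothetical)"
scan.attrs["center_frequency_hz"] = 50_000
scan.create_dataset("a_scans", data=[[0.0] * 8 for _ in range(4)])

shape = f["specimen_A/scan_001/a_scans"].shape
```

The hierarchical group structure is what lets converted proprietary datasets from different devices live side by side in one self-describing file.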
Regional forest resource assessment in an ecological framework: the Southern United States
Victor A. Rudis
1998-01-01
Information about forest resources grouped by ecologically homogeneous area can be used to discern relationships between those resources and ecological processes. The author used forest resource data from 0.4-ha plots, and data on population and land area (by county), together with a global-to-local hierarchical framework of land areas with similar ecological potential...
Multi-Level Anomaly Detection on Time-Varying Graph Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A; Collins, John P; Ferragut, Erik M
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
NASA Astrophysics Data System (ADS)
Zarekarizi, M.; Moradkhani, H.
2015-12-01
Extreme events have been shown to be affected by climate change, influencing hydrologic simulations for which stationarity is usually a main assumption. Studies have shown that this assumption can lead to large bias in model estimates and, consequently, higher flood hazard. Motivated by the importance of non-stationarity, we determined how the exceedance probabilities have changed over time in Johnson Creek, Oregon. This could help estimate the probability of failure of a structure that was primarily designed to resist less likely floods according to common practice. Therefore, we built a climate-informed Bayesian hierarchical model in which non-stationarity was incorporated into the modeling framework. Principal component analysis shows that the North Atlantic Oscillation (NAO), Western Pacific Index (WPI), and East Atlantic (EA) patterns most affect streamflow in this river. We modeled flood extremes using the peaks-over-threshold (POT) method rather than the conventional annual maximum flood (AMF) approach, mainly because the model can then be based on more information. We used available threshold selection methods to select a suitable threshold for the study area. Accounting for non-stationarity, model parameters vary through time with the climate indices. We developed several model scenarios and chose the one that best explained the variation in the data based on performance measures. We also estimated return periods under non-stationary conditions. Results show that ignoring non-stationarity could understate the flood hazard by up to a factor of four, increasing the probability of an in-stream structure being overtopped.
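The non-stationary POT idea can be sketched as a generalized Pareto exceedance model whose scale parameter varies with a climate index. The threshold, exceedance rate, and coefficients below are placeholders, not fitted Johnson Creek values:

```python
import math

def gpd_quantile(p, xi, sigma):
    """Quantile of the generalized Pareto distribution (xi != 0)."""
    return (sigma / xi) * ((1.0 - p) ** (-xi) - 1.0)

def return_level(threshold, rate, xi, sigma0, b, index, T):
    """T-year return level under a POT model whose log-scale is linear
    in a climate index, so the flood quantiles shift with the index."""
    sigma = sigma0 * math.exp(b * index)   # index-dependent GPD scale
    p = 1.0 - 1.0 / (rate * T)             # exceedance quantile for a T-year event
    return threshold + gpd_quantile(p, xi, sigma)

q100 = return_level(threshold=50.0, rate=3.0, xi=0.1,
                    sigma0=10.0, b=0.4, index=1.0, T=100)
```

Under stationarity `b` is zero and the return level is fixed; with `b > 0`, a rising index inflates the scale and hence the estimated 100-year flood.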
Winners and losers in the competition for space in tropical forest canopies.
Kellner, James R; Asner, Gregory P
2014-05-01
Trees compete for space in the canopy, but where and how individuals or their component parts win or lose is poorly understood. We developed a stochastic model of three-dimensional dynamics in canopies using a hierarchical Bayesian framework, and analysed 267,533 positive height changes from 1.25 m pixels using data from airborne LiDAR within 43 ha on the windward flank of Mauna Kea. Model selection indicates a strong resident's advantage, with 97.9% of positions in the canopy retained by their occupants over 2 years. The remaining 2.1% were lost to a neighbouring contender. Absolute height was a poor predictor of success, but short stature greatly raised the risk of being overtopped. Growth in the canopy was exponentially distributed with a scaling parameter of 0.518. These findings show how size and spatial proximity influence the outcome of competition for space, and provide a general framework for the analysis of canopy dynamics. © 2014 John Wiley & Sons Ltd/CNRS.
Hierarchical multistage MCMC follow-up of continuous gravitational wave candidates
NASA Astrophysics Data System (ADS)
Ashton, G.; Prix, R.
2018-05-01
Leveraging Markov chain Monte Carlo optimization of the F statistic, we introduce a method for the hierarchical follow-up of continuous gravitational wave candidates identified by wide-parameter space semicoherent searches. We demonstrate parameter estimation for continuous wave sources and develop a framework and tools to understand and control the effective size of the parameter space, critical to the success of the method. Monte Carlo tests of simulated signals in noise demonstrate that this method is close to the theoretical optimal performance.
Cernicchiaro, N; Renter, D G; Xiang, S; White, B J; Bello, N M
2013-06-01
Variability in ADG of feedlot cattle can affect profits, thus making overall returns more unstable. Hence, knowledge of the factors that contribute to heterogeneity of variances in animal performance can help feedlot managers evaluate risks and minimize profit volatility when making managerial and economic decisions in commercial feedlots. The objectives of the present study were to evaluate heteroskedasticity, defined as heterogeneity of variances, in ADG of cohorts of commercial feedlot cattle, and to identify cattle demographic factors at feedlot arrival as potential sources of variance heterogeneity, accounting for cohort- and feedlot-level information in the data structure. An operational dataset compiled from 24,050 cohorts from 25 U. S. commercial feedlots in 2005 and 2006 was used for this study. Inference was based on a hierarchical Bayesian model implemented with Markov chain Monte Carlo, whereby cohorts were modeled at the residual level and feedlot-year clusters were modeled as random effects. Forward model selection based on deviance information criteria was used to screen potentially important explanatory variables for heteroskedasticity at cohort- and feedlot-year levels. The Bayesian modeling framework was preferred as it naturally accommodates the inherently hierarchical structure of feedlot data whereby cohorts are nested within feedlot-year clusters. Evidence for heterogeneity of variance components of ADG was substantial and primarily concentrated at the cohort level. Feedlot-year specific effects were, by far, the greatest contributors to ADG heteroskedasticity among cohorts, with an estimated ∼12-fold change in dispersion between most and least extreme feedlot-year clusters. In addition, identifiable demographic factors associated with greater heterogeneity of cohort-level variance included smaller cohort sizes, fewer days on feed, and greater arrival BW, as well as feedlot arrival during summer months. 
These results support that heterogeneity of variances in ADG is prevalent in feedlot performance and indicate potential sources of heteroskedasticity. Further investigation of factors associated with heteroskedasticity in feedlot performance is warranted to increase consistency and uniformity in commercial beef cattle production and subsequent profitability.
Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh
2014-01-01
Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective balanced set of performance measures and key performance indicators (KPIs) is a main challenge to accomplish this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchy process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an analytic hierarchy process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards that are designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
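The prioritization step can be sketched with the standard analytic hierarchy process eigenvector computation. The three criteria and the pairwise-comparison values below are hypothetical, not the study's survey data:

```python
def ahp_weights(M, iters=200):
    """Priority weights of an AHP pairwise-comparison matrix via power
    iteration: the principal eigenvector, normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical comparisons: internal processes judged more important
# than financial, which in turn outranks learning and growth.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights = ahp_weights(M)
```

The resulting weights rank the perspectives and can be combined down the hierarchy to score candidate KPIs.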
Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai
2012-01-01
In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as a feedforward, hierarchical simulation of the ventral stream of visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering works on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of visual cortex and developments on artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and the computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortical-like feedforward hierarchy framework has the merit of dealing with complicated pattern recognition problems, suggesting that, by combining cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanisms has the potential to extend our knowledge of brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of visual cortex-like mechanisms. PMID:23193391
Generalized estimators of avian abundance from count survey data
Royle, J. Andrew
2004-01-01
I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture–recapture, multiple observer, removal sampling, and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data, and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts, and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
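The hierarchical structure described above, a Poisson metapopulation model combined with a binomial observation model, is often called an N-mixture model, and its likelihood can be sketched directly. The counts and parameter values below are illustrative, not from the paper's examples:

```python
import math

def nmix_loglik(counts, lam, p, K=60):
    """Log-likelihood of an N-mixture model: local abundance
    N_i ~ Poisson(lam) and replicate counts y_it ~ Binomial(N_i, p).
    The latent N_i is summed out up to a truncation bound K."""
    ll = 0.0
    for site in counts:
        site_lik = 0.0
        for N in range(max(site), K + 1):
            pois = math.exp(-lam) * lam ** N / math.factorial(N)
            lik_obs = 1.0
            for y in site:
                lik_obs *= math.comb(N, y) * p ** y * (1.0 - p) ** (N - y)
            site_lik += pois * lik_obs
        ll += math.log(site_lik)
    return ll

counts = [[1, 2], [0, 1], [3, 2]]   # 3 sites, 2 replicate counts each
ll = nmix_loglik(counts, lam=3.0, p=0.5)
```

Maximizing this likelihood over `lam` and `p` separates mean local abundance from detection probability using only the replicated counts.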
Hierarchy-associated semantic-rule inference framework for classifying indoor scenes
NASA Astrophysics Data System (ADS)
Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei
2016-03-01
Classifying indoor scenes is challenging because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible. Furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed using empirically based annotation (EBA) and a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. The problem of scene classification is divided into low-level annotation and high-level inference from a macro perspective. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification. The categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve the classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.
ERIC Educational Resources Information Center
Nimon, Kim
2012-01-01
Using state achievement data that are openly accessible, this paper demonstrates the application of hierarchical linear modeling within the context of career technical education research. Three prominent approaches to analyzing clustered data (i.e., modeling aggregated data, modeling disaggregated data, modeling hierarchical data) are discussed…
Tao, Kai; Han, Xue; Ma, Qingxiang; Han, Lei
2018-03-06
Metal-organic frameworks (MOFs) have emerged as a new platform for the construction of various functional materials for energy related applications. Here, a facile MOF templating method is developed to fabricate a hierarchical nickel-cobalt sulfide nanosheet array on conductive Ni foam (Ni-Co-S/NF) as a binder-free electrode for supercapacitors. A uniform 2D Co-MOF nanowall array is first grown in situ on Ni foam in aqueous solution at room temperature, and then the Co-MOF nanowalls are converted into hierarchical Ni-Co-S nanoarchitectures via an etching and ion-exchange reaction with Ni(NO₃)₂, and a subsequent solvothermal sulfurization. Taking advantage of the compositional and structural merits of the hierarchical Ni-Co-S nanosheet array and conductive Ni foam, such as fast electron transportation, short ion diffusion path, abundant active sites and rich redox reactions, the obtained Ni-Co-S/NF electrode exhibits excellent electrochemical capacitive performance (1406.9 F g⁻¹ at 0.5 A g⁻¹, 53.9% retention at 10 A g⁻¹ and 88.6% retention over 1000 cycles), which is superior to control CoS/NF. An asymmetric supercapacitor (ASC) assembled by using the as-fabricated Ni-Co-S/NF as the positive electrode and activated carbon (AC) as the negative electrode delivers a high energy density of 24.8 W h kg⁻¹ at a high power density of 849.5 W kg⁻¹. Even when the power density is as high as 8.5 kW kg⁻¹, the ASC still exhibits a high energy density of 12.5 W h kg⁻¹. This facile synthetic strategy can also be extended to fabricate other hierarchical integrated electrodes for high-efficiency electrochemical energy conversion and storage devices.
Mechanosensitive Channels: Insights from Continuum-Based Simulations
Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun; Cui, Qiang; Chen, Xi
2009-01-01
Mechanotransduction plays an important role in regulating cell functions and it is an active topic of research in biophysics. Despite recent advances in experimental and numerical techniques, the intrinsic multiscale nature imposes tremendous challenges for revealing the working mechanisms of mechanosensitive channels. Recently, a continuum-mechanics based hierarchical modeling and simulation framework has been established and applied to study the mechanical responses and gating behaviors of a prototypical mechanosensitive channel, the mechanosensitive channel of large conductance (MscL) in the bacterium Escherichia coli (E. coli), from which several putative gating mechanisms have been tested and new insights deduced. This article reviews these latest findings using the continuum mechanics framework and suggests possible improvements for future simulation studies. This computationally efficient and versatile continuum-mechanics based protocol is poised to make contributions to the study of a variety of mechanobiology problems. PMID:18787764
McCord, David M; Achee, Margaret C; Cannon, Elissa M; Harrop, Tiffany M; Poynter, William D
2017-01-01
The National Institute of Mental Health has proposed a paradigm shift in the conceptualization of psychopathology, abandoning the traditional categorical model in favor of one based on hierarchically organized dimensional constructs (Insel et al., 2010 ). One explicit goal of this initiative, the Research Domain Criteria (RDoC) project, is to facilitate the incorporation of newly available neurobiologic variables into research on psychopathology. The Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008/2011 ) represents a similar paradigm shift, also adopting a hierarchical arrangement of dimensional constructs. This study examined associations between MMPI-2-RF measures of psychopathology and eye-movement metrics. Participants were college students (n = 270) who completed the MMPI-2-RF and then viewed a sequence of 30-s video clips. Results show a pattern of positive correlations between pupil size and emotional/internalizing dysfunction scales when viewing video eliciting negative emotional reactions, reflecting greater arousability in individuals with higher scores on these measures. In contrast, when viewing stimuli depicting angry, threatening material, a clear pattern of negative correlations was found between pupil size and behavioral/externalizing trait measures. These data add to the construct validity of the MMPI-2-RF and support the use of the RDoC matrix as a framework for research on psychopathology.
A Multi-modal, Discriminative and Spatially Invariant CNN for RGB-D Object Labeling.
Asif, Umar; Bennamoun, Mohammed; Sohel, Ferdous
2017-08-30
While deep convolutional neural networks have shown remarkable success in image classification, the problems of inter-class similarities, intra-class variances, the effective combination of multimodal data, and the spatial variability in images of objects remain major challenges. To address these problems, this paper proposes a novel framework to learn a discriminative and spatially invariant classification model for object and indoor scene recognition using multimodal RGB-D imagery. This is achieved through three postulates: 1) spatial invariance - this is achieved by combining a spatial transformer network with a deep convolutional neural network to learn features which are invariant to spatial translations, rotations, and scale changes; 2) high discriminative capability - this is achieved by introducing Fisher encoding within the CNN architecture to learn features which have small inter-class similarities and large intra-class compactness; and 3) multimodal hierarchical fusion - this is achieved through the regularization of semantic segmentation to a multi-modal CNN architecture, where class probabilities are estimated at different hierarchical levels (i.e., image- and pixel-levels), and fused into a Conditional Random Field (CRF)-based inference hypothesis, the optimization of which produces consistent class labels in RGB-D images. Extensive experimental evaluations on RGB-D object and scene datasets, and live video streams (acquired from Kinect) show that our framework produces superior object and scene classification results compared to the state-of-the-art methods.
Hierarchical Multinomial Processing Tree Models: A Latent-Trait Approach
ERIC Educational Resources Information Center
Klauer, Karl Christoph
2010-01-01
Multinomial processing tree models are widely used in many areas of psychology. A hierarchical extension of the model class is proposed, using a multivariate normal distribution of person-level parameters with the mean and covariance matrix to be estimated from the data. The hierarchical model allows one to take variability between persons into…
NASA Astrophysics Data System (ADS)
Wen, Di; Ding, Xiaoqing
2003-12-01
In this paper we propose a general framework for character segmentation in complex multilingual documents, which is an endeavor to combine the traditionally separated segmentation and recognition processes into a cooperative system. The framework contains three basic steps: Dissection, Local Optimization and Global Optimization, which are designed to fuse various properties of the segmentation hypotheses hierarchically into a composite evaluation to decide the final recognition results. Experimental results show that this framework is general enough to be applied to a variety of documents. Finally, a sample system based on this framework for recognizing Chinese, Japanese and Korean documents is described, and its experimental performance is reported.
Gray, B.R.; Haro, R.J.; Rogala, J.T.; Sauer, J.S.
2005-01-01
1. Macroinvertebrate count data often exhibit nested or hierarchical structure. Examples include multiple measurements along each of a set of streams, and multiple synoptic measurements from each of a set of ponds. With data exhibiting hierarchical structure, outcomes at both sampling (e.g. within-stream) and aggregated (e.g. stream) scales are often of interest. Unfortunately, methods for modelling hierarchical count data have received little attention in the ecological literature. 2. We demonstrate the use of hierarchical count models using fingernail clam (Family: Sphaeriidae) count data and habitat predictors derived from sampling and aggregated spatial scales. The sampling scale corresponded to that of a standard Ponar grab (0.052 m²) and the aggregated scale to impounded and backwater regions within 38-197 km reaches of the Upper Mississippi River. Impounded and backwater regions were resampled annually for 10 years. Consequently, measurements on clams were nested within years. Counts were treated as negative binomial random variates, and means from each resampling event as random departures from the impounded and backwater region grand means. 3. Clam models were improved by the addition of covariates that varied at both the sampling and regional scales. Substrate composition varied at the sampling scale and was associated with model improvements, and reductions (for a given mean) in variance at the sampling scale. Inorganic suspended solids (ISS) levels, measured in the summer preceding sampling, also yielded model improvements and were associated with reductions in variances at the regional rather than sampling scales. ISS levels were negatively associated with mean clam counts. 4. Hierarchical models allow hierarchically structured data to be modelled without ignoring information specific to levels of the hierarchy. In addition, information at each hierarchical level may be modelled as functions of covariates that themselves vary by and within levels.
As a result, hierarchical models provide researchers and resource managers with a method for modelling hierarchical data that explicitly recognises both the sampling design and the information contained in the corresponding data.
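The nested negative-binomial structure described above (grabs within yearly resampling events within regions) can be sketched with a short simulation. All counts, effect sizes and the dispersion value below are invented for illustration, not estimates from the clam data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical hierarchy: regions -> yearly resampling events -> Ponar grabs.
n_regions, n_years, n_grabs = 4, 10, 25
region_log_mean = rng.normal(np.log(5.0), 0.5, size=n_regions)  # regional grand means
year_effect = rng.normal(0.0, 0.3, size=(n_regions, n_years))   # random yearly departures
dispersion = 1.5                                                # NB shape parameter k

mu = np.exp(region_log_mean[:, None, None] + year_effect[:, :, None])
mu = np.broadcast_to(mu, (n_regions, n_years, n_grabs))
# Negative binomial counts via the Gamma-Poisson mixture (mean mu, shape k)
lam = rng.gamma(shape=dispersion, scale=mu / dispersion)
counts = rng.poisson(lam)

print(counts.shape)                        # (4, 10, 25)
print(bool(counts.var() > counts.mean()))  # overdispersed relative to Poisson
```

Fitting such a model reverses this simulation: the region means, yearly departures and dispersion are estimated from the observed counts rather than fixed in advance.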
Ubiquitous Robotic Technology for Smart Manufacturing System.
Wang, Wenshan; Zhu, Xiaoxiao; Wang, Liyu; Qiu, Qiang; Cao, Qixin
2016-01-01
As manufacturing tasks become more individualized and more flexible, the machines in a smart factory are required to perform variable tasks collaboratively without reprogramming. This paper is the first to discuss the similarity between smart manufacturing systems and ubiquitous robotic systems, and makes an effort to deploy ubiquitous robotic technology in the smart factory. Specifically, a component-based framework is proposed to enable the communication and cooperation of heterogeneous robotic devices. Further, compared to the service robotics domain, smart manufacturing systems are often larger in scale, so a hierarchical planning method was implemented to improve planning efficiency. A smart factory test bed is developed. It demonstrates that the proposed framework is suitable for the industrial domain, and that the hierarchical planning method is able to solve large problems intractable with flat methods.
Ultralight mesoporous magnetic frameworks by interfacial assembly of Prussian blue nanocubes.
Kong, Biao; Tang, Jing; Wu, Zhangxiong; Wei, Jing; Wu, Hao; Wang, Yongcheng; Zheng, Gengfeng; Zhao, Dongyuan
2014-03-10
A facile approach for the synthesis of ultralight iron oxide hierarchical structures with tailorable macro- and mesoporosity is reported. This method entails the growth of porous Prussian blue (PB) single crystals on the surface of a polyurethane sponge, followed by in situ thermal conversion of PB crystals into three-dimensional mesoporous iron oxide (3DMI) architectures. Compared to previously reported ultralight materials, the 3DMI architectures possess hierarchical macro- and mesoporous frameworks with multiple advantageous features, including high surface area (ca. 117 m² g⁻¹) and ultralow density (6-11 mg cm⁻³). Furthermore, they can be synthesized on a kilogram scale. More importantly, these 3DMI structures exhibit superparamagnetism and tunable hydrophilicity/hydrophobicity, thus allowing for efficient multiphase interfacial adsorption and fast multiphase catalysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Werner, Benjamin; Scott, Jacob G; Sottoriva, Andrea; Anderson, Alexander R A; Traulsen, Arne; Altrock, Philipp M
2016-04-01
Many tumors are hierarchically organized and driven by a subpopulation of tumor-initiating cells (TIC), or cancer stem cells. TICs are uniquely capable of recapitulating the tumor and are thought to be highly resistant to radio- and chemotherapy. Macroscopic patterns of tumor expansion before treatment and tumor regression during treatment are tied to the dynamics of TICs. Until now, quantitative information about the fraction of TICs could not be inferred from macroscopic tumor burden trajectories. In this study, we generated a quantitative method based on a mathematical model that describes hierarchically organized tumor dynamics and patient-derived tumor burden information. The method identifies two characteristic equilibrium TIC regimes during expansion and regression. We show that tumor expansion and regression curves can be leveraged to infer estimates of the TIC fraction in individual patients at detection and after continued therapy. Furthermore, our method is parameter-free; it solely requires the knowledge of a patient's tumor burden over multiple time points to reveal microscopic properties of the malignancy. We demonstrate proof of concept in the case of chronic myeloid leukemia (CML), wherein our model recapitulated the clinical history of the disease in two independent patient cohorts. On the basis of patient-specific treatment responses in CML, we predict that after one year of targeted treatment, the fraction of TICs increases 100-fold and continues to increase up to 1,000-fold after 5 years of treatment. Our novel framework may significantly influence the implementation of personalized treatment strategies and has the potential for rapid translation into the clinic. Cancer Res; 76(7); 1705-13. ©2016 American Association for Cancer Research.
Heudtlass, Peter; Guha-Sapir, Debarati; Speybroeck, Niko
2018-05-31
The crude death rate (CDR) is one of the defining indicators of humanitarian emergencies. When data from vital registration systems are not available, it is common practice to estimate the CDR from household surveys with cluster-sampling design. However, sample sizes are often too small to compare mortality estimates to emergency thresholds, at least in a frequentist framework. Several authors have proposed Bayesian methods for health surveys in humanitarian crises. Here, we develop an approach specifically for mortality data and cluster-sampling surveys. We describe a Bayesian hierarchical Poisson-Gamma mixture model with generic (weakly informative) priors that could be used as a default in the absence of any specific prior knowledge, and compare Bayesian and frequentist CDR estimates using five different mortality datasets. We provide an interpretation of the Bayesian estimates in the context of an emergency threshold and demonstrate how to interpret parameters at the cluster level and ways in which informative priors can be introduced. With the same set of weakly informative priors, Bayesian CDR estimates are equivalent to frequentist estimates for all practical purposes. The probability that the CDR surpasses the emergency threshold can be derived directly from the posterior of the mean of the mixing distribution. All observations in the datasets contribute to the cluster-level estimates through the hierarchical structure of the model. In a context of sparse data, Bayesian mortality assessments have advantages over frequentist ones even when using only weakly informative priors. More informative priors offer a formal and transparent way of combining new data with existing data and expert knowledge, and can help to improve decision-making in humanitarian crises by complementing frequentist estimates.
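The Poisson-Gamma conjugacy at the heart of such a model gives a closed-form posterior, from which the probability of exceeding an emergency threshold follows directly. A minimal sketch, with invented survey totals and an assumed weakly informative prior (not the paper's exact choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented survey totals (illustrative only, not from the paper):
deaths = 14                   # deaths recalled across all clusters
exposure = 120_000.0          # person-days of observation
threshold = 1.0 / 10_000.0    # emergency threshold: 1 death / 10,000 / day

# Assumed weakly informative Gamma prior on the CDR (rate parameterization)
a0, b0 = 0.5, 100.0

# Poisson likelihood + Gamma prior -> Gamma posterior by conjugacy
a_post, b_post = a0 + deaths, b0 + exposure

# P(CDR > threshold) read off the posterior via Monte Carlo
draws = rng.gamma(shape=a_post, scale=1.0 / b_post, size=200_000)
p_exceed = float((draws > threshold).mean())
print(round(float(draws.mean()) * 10_000, 2))  # posterior mean CDR per 10,000 person-days
print(0.0 < p_exceed < 1.0)
```

The exceedance probability here plays the role the abstract describes: a direct posterior statement about the threshold, rather than a frequentist test against it.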
Chemical-mechanical stability of the hierarchical structure of shell nacre
NASA Astrophysics Data System (ADS)
Sun, Jinmei; Guo, Wanlin
2010-02-01
The hierarchical structure and mechanical properties of shell nacre are experimentally investigated from the new aspects of chemical stability and chemistry-mechanics coupling. Through chemical deproteinization or demineralization methods together with characterization techniques at micro/nano scales, it is found that the nacre of the abalone Haliotis discus hannai contains a hierarchical structure stacked with irregular aragonite platelets and interplatelet organic matrix thin layers. Yet the aragonite platelet itself is a nanocomposite consisting of nanoparticles and an intraplatelet organic matrix framework. The mean diameter of the nanoparticles and the distribution of the framework are quite different for different platelets. Though the interplatelet and intraplatelet organic matrix can both be decomposed by sodium hydroxide solution, the chemical stability of individual aragonite platelets is much higher than that of the microstructure stacked with them. Further, macroscopic bending tests and nanoindentation experiments are performed on the micro/nanostructure of nacre after sodium hydroxide treatment. It is found that the Young's modulus of both the stacked microstructure and the nanocomposite platelet is reduced, and that the reduction in the microstructure is more remarkable than that in the platelet. Therefore the chemical-mechanical stability of the nanocomposite platelet itself is much higher than that of the stacked microstructure of nacre.
Hierarchical Higher Order Crf for the Classification of Airborne LIDAR Point Clouds in Urban Areas
NASA Astrophysics Data System (ADS)
Niemeyer, J.; Rottensteiner, F.; Soergel, U.; Heipke, C.
2016-06-01
We propose a novel hierarchical approach for the classification of airborne 3D lidar points. Spatial and semantic context is incorporated via a two-layer Conditional Random Field (CRF). The first layer operates on a point level and utilises higher order cliques. Segments are generated from the labelling obtained in this way. They are the entities of the second layer, which incorporates larger scale context. The classification result of the segments is introduced as an energy term for the next iteration of the point-based layer. This framework iterates and mutually propagates context to improve the classification results. Potentially wrong decisions can be revised at later stages. The output is a labelled point cloud as well as segments roughly corresponding to object instances. Moreover, we present two new contextual features for the segment classification: the distance and the orientation of a segment with respect to the closest road. It is shown that the classification benefits from these features. In our experiments the hierarchical framework improves the overall accuracy by 2.3% at the point level and by 3.0% at the segment level, compared to a purely point-based classification.
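The mutual propagation of context between a point layer and a segment layer can be illustrated with a deliberately simplified toy: plain score bonuses stand in for the CRF energy terms, fixed halves stand in for segments, and all numbers are invented. The point is only that segment-level majorities, fed back into point scores, correct isolated noisy labels over iterations:

```python
import numpy as np

# Noisy two-class point scores (layer 1); one misclassified point per half.
scores = np.array([
    [0.9, 0.1], [0.8, 0.2], [0.4, 0.6], [0.7, 0.3],
    [0.2, 0.8], [0.1, 0.9], [0.6, 0.4], [0.3, 0.7],
], dtype=float)

labels = scores.argmax(axis=1)
for _ in range(3):                            # iterate point <-> segment layers
    # "Segment" layer: majority label over each fixed coarse segment (halves)
    for seg in (np.arange(4), np.arange(4, 8)):
        maj = np.bincount(labels[seg], minlength=2).argmax()
        scores[seg, maj] += 0.3               # feed segment context back as a bonus
    labels = scores.argmax(axis=1)            # point layer relabels with context

print(labels.tolist())                        # -> [0, 0, 0, 0, 1, 1, 1, 1]
```

In the paper's framework the feedback is an energy term in a proper CRF and the segments come from the point labelling itself; this toy only shows the direction of information flow.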
Progressive Dictionary Learning with Hierarchical Predictive Structure for Scalable Video Coding.
Dai, Wenrui; Shen, Yangmei; Xiong, Hongkai; Jiang, Xiaoqian; Zou, Junni; Taubman, David
2017-04-12
Dictionary learning has emerged as a promising alternative to the conventional hybrid coding framework. However, the rigid structure of sequential training and prediction degrades its performance in scalable video coding. This paper proposes a progressive dictionary learning framework with a hierarchical predictive structure for scalable video coding, especially in the low-bitrate region. For pyramidal layers, sparse representation based on a spatio-temporal dictionary is adopted to improve the coding efficiency of enhancement layers (ELs) with a guarantee of reconstruction performance. The overcomplete dictionary is trained to adaptively capture local structures along motion trajectories as well as exploit the correlations between neighboring layers of resolutions. Furthermore, progressive dictionary learning is developed to enable scalability in the temporal domain and restrict error propagation in a closed-loop predictor. Under the hierarchical predictive structure, online learning is leveraged to guarantee the training and prediction performance with an improved convergence rate. To accommodate the state-of-the-art scalable extension of H.264/AVC and the latest HEVC, standardized codec cores are utilized to encode the base and enhancement layers. Experimental results show that the proposed method outperforms the latest SHVC and HEVC simulcast over extensive test sequences with various resolutions.
Self-Supervised Video Hashing With Hierarchical Binary Auto-Encoder.
Song, Jingkuan; Zhang, Hanwang; Li, Xiangpeng; Gao, Lianli; Wang, Meng; Hong, Richang
2018-07-01
Existing video hash functions are built on three isolated stages: frame pooling, relaxed learning, and binarization, which have not adequately explored the temporal order of video frames in a joint binary optimization model, resulting in severe information loss. In this paper, we propose a novel unsupervised video hashing framework dubbed self-supervised video hashing (SSVH), which is able to capture the temporal nature of videos in an end-to-end learning to hash fashion. We specifically address two central problems: 1) how to design an encoder-decoder architecture to generate binary codes for videos and 2) how to equip the binary codes with the ability of accurate video retrieval. We design a hierarchical binary auto-encoder to model the temporal dependencies in videos with multiple granularities, and embed the videos into binary codes with less computations than the stacked architecture. Then, we encourage the binary codes to simultaneously reconstruct the visual content and neighborhood structure of the videos. Experiments on two real-world data sets show that our SSVH method can significantly outperform the state-of-the-art methods and achieve the current best performance on the task of unsupervised video retrieval.
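The retrieval side of such binary codes can be sketched without any learning: project pooled video features to bits and rank by Hamming distance. SSVH learns the encoder end-to-end; the random projection below is only an illustrative stand-in, and all features are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

B, D = 32, 128                   # bits per code, feature dimension (assumed)
W = rng.normal(size=(D, B))      # untrained stand-in for the learned encoder

def hash_video(feat):
    # One B-bit binary code per pooled video feature vector.
    return (feat @ W > 0).astype(np.uint8)

database = rng.normal(size=(100, D))          # synthetic video features
codes = np.array([hash_video(f) for f in database])

query = database[7] + 0.01 * rng.normal(size=D)  # near-duplicate of video 7
qcode = hash_video(query)
hamming = (codes != qcode).sum(axis=1)        # retrieval by Hamming distance
print(int(hamming.argmin()))                  # index of the nearest code
```

A near-duplicate lands a few bit flips from its source while unrelated videos differ in roughly half the bits, which is what makes Hamming ranking usable for retrieval.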
NASA Astrophysics Data System (ADS)
Benaskeur, Abder R.; Roy, Jean
2001-08-01
Sensor Management (SM) has to do with how to best manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on the contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the real-time improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on control theory and within the process refinement paradigm of the JDL data fusion model, taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.
Liu, Kai; Ren, Xiaokang; Sun, Jianxuan; Zou, Qianli; Yan, Xuehai
2018-06-01
The emergence of light-energy-utilizing metabolism is likely to be a critical milestone in prebiotic chemistry and the origin of life. However, how the primitive pigment is spontaneously generated still remains unknown. Herein, a primitive pigment model based on adaptive self-organization of amino acids (cystine, Cys) and metal ions (zinc ion, Zn²⁺) followed by chemical evolution under hydrothermal conditions is developed. The resulting hybrid microspheres are composed of a radially aligned cystine/zinc (Cys/Zn) assembly decorated with carbonate-doped zinc sulfide (C-ZnS) nanocrystals. The C-ZnS component can work as a light-harvesting antenna to capture ultraviolet and visible light, and use it in various photochemical reactions, including hydrogen (H₂) evolution, carbon dioxide (CO₂) photoreduction, and reduction of nicotinamide adenine dinucleotide (NAD⁺) to nicotinamide adenine dinucleotide hydride (NADH). Additionally, guest molecules (e.g., glutamate dehydrogenase, GDH) can be encapsulated within the hierarchical Cys/Zn framework, which facilitates sustainable photoenzymatic synthesis of glutamate. This study helps deepen insight into the emergent functionality (conversion of light energy) and complexity (hierarchical architecture) arising from the interaction and reaction of prebiotic molecules. The primitive pigment model also shows promise as an artificial photosynthetic microreactor.
Hierarchical statistical modeling of xylem vulnerability to cavitation.
Ogle, Kiona; Barber, Jarrett J; Willson, Cynthia; Thompson, Brenda
2009-01-01
Cavitation of xylem elements diminishes the water transport capacity of plants, and quantifying xylem vulnerability to cavitation is important to understanding plant function. Current approaches to analyzing hydraulic conductivity (K) data to infer vulnerability to cavitation suffer from problems such as the use of potentially unrealistic vulnerability curves, difficulty interpreting parameters in these curves, a statistical framework that ignores sampling design, and an overly simplistic view of uncertainty. This study illustrates how two common curves (exponential-sigmoid and Weibull) can be reparameterized in terms of meaningful parameters: maximum conductivity (k(sat)), water potential (-P) at which percentage loss of conductivity (PLC) = X% (P(X)), and the slope of the PLC curve at P(X) (S(X)), a 'sensitivity' index. We provide a hierarchical Bayesian method for fitting the reparameterized curves to K(H) data. We illustrate the method using data for roots and stems of two populations of Juniperus scopulorum and test for differences in k(sat), P(X), and S(X) between different groups. Two important results emerge from this study. First, the Weibull model is preferred because it produces biologically realistic estimates of PLC near P = 0 MPa. Second, stochastic embolisms contribute an important source of uncertainty that should be included in such analyses.
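The reparameterization idea can be sketched for the Weibull curve PLC(P) = 100(1 - exp(-(P/b)^c)): solving PLC(P_X) = X for the scale b lets the curve be written directly in terms of the interpretable P_X. The parameter values below are hypothetical, and the slope index is computed numerically rather than from the paper's closed form:

```python
import numpy as np

def weibull_plc(P, P_X, c, X=50.0):
    # Solve PLC(P_X) = X for the Weibull scale b, then evaluate the curve.
    b = P_X / (-np.log(1.0 - X / 100.0)) ** (1.0 / c)
    return 100.0 * (1.0 - np.exp(-(P / b) ** c))

P50, shape = 2.5, 3.0               # hypothetical: 50% loss at -2.5 MPa
P = np.linspace(0.0, 6.0, 601)      # water potential magnitude grid (MPa)
plc = weibull_plc(P, P50, shape)

plc_at_p50 = float(weibull_plc(2.5, P50, shape))
print(round(plc_at_p50, 6))         # 50.0 by construction

# Numerical slope at P_50 (% PLC per MPa): the 'sensitivity' index S_50
i = np.searchsorted(P, P50)
S50 = (plc[i + 1] - plc[i - 1]) / (P[i + 1] - P[i - 1])
print(S50 > 0)
```

Because PLC(P_X) = X holds by construction, the fitted parameters carry their meaning directly, which is the interpretability benefit the abstract describes.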
Robust, Adaptive Functional Regression in Functional Mixed Model Framework.
Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S
2011-09-01
Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.
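The down-weighting behavior that heavy-tailed scale mixtures induce can be illustrated with a generic robust-statistics toy. This is not the R-FMM code: the data, degrees of freedom and residual scale below are invented, and the t-type weight w_i = (nu + 1)/(nu + r_i^2) is the standard IRLS weight implied by a t (i.e., Gamma scale-mixture-of-normals) error model:

```python
import numpy as np

# Invented data with one gross outlier; nu and scale are assumed values.
y = np.array([1.0, 1.2, 0.9, 1.1, 1.05, 12.0])
nu, scale = 4.0, 0.15

mu = float(y.mean())                       # start from the non-robust mean
for _ in range(50):
    r = (y - mu) / scale                   # standardized residuals
    w = (nu + 1.0) / (nu + r ** 2)         # heavy-tailed (t-type) weights
    mu = float((w * y).sum() / w.sum())    # re-estimate the location

print(round(float(y.mean()), 3))           # 2.875: ordinary mean, dragged by the outlier
print(bool(abs(mu - 1.05) < abs(y.mean() - 1.05)))
```

The outlier's weight collapses toward zero, so the robust estimate stays near the bulk of the data; in the functional setting the same mechanism down-weights outlying curves and outlying regions of curves.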
Reasons for Hierarchical Linear Modeling: A Reminder.
ERIC Educational Resources Information Center
Wang, Jianjun
1999-01-01
Uses examples of hierarchical linear modeling (HLM) at local and national levels to illustrate proper applications of HLM and dummy variable regression. Raises cautions about the circumstances under which hierarchical data do not need HLM. (SLD)
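One common diagnostic behind this caution is the intraclass correlation (ICC): when it is near zero, the grouping carries little information and ordinary (dummy-variable) regression may suffice, while a large ICC signals that the hierarchy matters. A sketch with invented school data, using the one-way ANOVA estimator ICC(1):

```python
import numpy as np

# Invented test scores: 4 schools, 5 students each.
groups = np.array([
    [72.0, 75.0, 71.0, 74.0, 73.0],
    [60.0, 63.0, 61.0, 62.0, 64.0],
    [80.0, 82.0, 81.0, 79.0, 83.0],
    [55.0, 54.0, 56.0, 57.0, 53.0],
])

k, n = groups.shape
grand = groups.mean()
ms_between = n * ((groups.mean(axis=1) - grand) ** 2).sum() / (k - 1)
ms_within = groups.var(axis=1, ddof=1).mean()
icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)  # ICC(1)
print(round(float(icc), 2))   # far from zero: the school level matters here
```

With balanced groups this is a few lines; an ICC this large would argue for HLM, whereas a value near zero would illustrate the circumstances the abstract cautions about.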
Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism
NASA Astrophysics Data System (ADS)
Zender, C. S.; Wang, W.; Vicente, P.
2013-12-01
Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How can we weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, researchers must often organize multiple related datasets into a coherent framework prior to analysis. Hierarchical organization permits entire datasets to be stored in nested groups that reflect their intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze by coding operators to automatically parallelize processes over isomorphic storage units, i.e., groups. The newest generation of netCDF Operators (NCO) embodies this hierarchical approach, while still supporting traditional analysis approaches. We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.
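The "parallelize over isomorphic groups" idea can be sketched in a few lines: apply one and the same reduction to every group concurrently. The group names and values below are invented stand-ins for netCDF groups of CMIP5 output; NCO performs the analogous operation natively on group hierarchies:

```python
from concurrent.futures import ThreadPoolExecutor
import statistics

# Invented group hierarchy standing in for nested netCDF groups.
groups = {
    "/cmip5/modelA/tas": [14.2, 14.5, 14.9, 15.1],
    "/cmip5/modelB/tas": [13.8, 14.0, 14.3, 14.6],
    "/cmip5/modelC/tas": [14.9, 15.2, 15.4, 15.8],
}

def reduce_group(item):
    # Because the groups are isomorphic, the same reduction applies to each.
    name, series = item
    return name, statistics.mean(series)

with ThreadPoolExecutor() as pool:
    means = dict(pool.map(reduce_group, groups.items()))

for name in sorted(means):
    print(name, round(means[name], 3))
```

The intrinsic parallelism comes entirely from the storage layout: no per-group special-casing is needed, so adding a model adds a group, not code.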
Bayesian spatio-temporal modeling of particulate matter concentrations in Peninsular Malaysia
NASA Astrophysics Data System (ADS)
Manga, Edna; Awang, Norhashidah
2016-06-01
This article presents an application of a Bayesian spatio-temporal Gaussian process (GP) model on particulate matter concentrations from Peninsular Malaysia. We analyze daily PM10 concentration levels from 35 monitoring sites in June and July 2011. The spatio-temporal model, set in a Bayesian hierarchical framework, allows for inclusion of informative covariates, meteorological variables, and spatio-temporal interactions. Posterior density estimates of the model parameters are obtained by Markov chain Monte Carlo methods. Preliminary data analysis indicates that information on PM10 levels at sites classified as industrial locations could explain part of the space-time variations. We include the site-type indicator in our modeling efforts. Results of the parameter estimates for the fitted GP model show significant spatio-temporal structure and a positive effect of the location-type explanatory variable. We also compute some validation criteria for the out-of-sample sites that show the adequacy of the model for predicting PM10 at unmonitored sites.
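The posterior-sampling machinery this abstract relies on can be illustrated with a minimal random-walk Metropolis sketch for a toy one-parameter model; the data, prior, and step size below are hypothetical and unrelated to the paper's PM10 model:

```python
import math
import random

random.seed(0)

# Hypothetical daily PM10-like readings (not the paper's data).
y = [52.1, 48.3, 55.7, 50.2, 47.9, 53.4]
sigma = 4.0                        # assumed known observation noise
prior_mu, prior_sd = 50.0, 20.0    # weakly informative prior on the mean

def log_post(mu):
    """Log posterior: Normal likelihood + Normal prior (up to a constant)."""
    lp = -0.5 * ((mu - prior_mu) / prior_sd) ** 2
    lp += sum(-0.5 * ((yi - mu) / sigma) ** 2 for yi in y)
    return lp

def metropolis(n_iter=5000, step=1.0):
    mu = prior_mu
    samples = []
    for _ in range(n_iter):
        prop = mu + random.gauss(0.0, step)   # random-walk proposal
        if math.log(random.random()) < log_post(prop) - log_post(mu):
            mu = prop                         # accept
        samples.append(mu)
    return samples[1000:]                     # drop burn-in

draws = metropolis()
post_mean = sum(draws) / len(draws)
print(round(post_mean, 1))  # close to the sample mean of y
```

Real spatio-temporal GP models sample many parameters jointly, but each update follows this same accept/reject logic.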
Robust Real-Time Music Transcription with a Compositional Hierarchical Model.
Pesek, Matevž; Leonardis, Aleš; Marolt, Matija
2017-01-01
The paper presents a new compositional hierarchical model for robust music transcription. Its main features are unsupervised learning of a hierarchical representation of input data; transparency, which enables insights into the learned representation; and robustness and speed, which make it suitable for real-world and real-time use. The model consists of multiple layers, each composed of a number of parts. The hierarchical nature of the model corresponds well to hierarchical structures in music. The parts in lower layers correspond to low-level concepts (e.g. tone partials), while the parts in higher layers combine lower-level representations into more complex concepts (tones, chords). The layers are learned in an unsupervised manner from music signals. Parts in each layer are compositions of parts from previous layers, with statistical co-occurrence as the driving force of the learning process. In the paper, we present the model's structure and compare it to other hierarchical approaches in the field of music information retrieval. We evaluate the model's performance for multiple fundamental frequency estimation. Finally, we elaborate on extensions of the model towards other music information retrieval tasks.
ERIC Educational Resources Information Center
Rocconi, Louis M.
2011-01-01
Hierarchical linear models (HLM) solve the problems associated with the unit of analysis problem such as misestimated standard errors, heterogeneity of regression and aggregation bias by modeling all levels of interest simultaneously. Hierarchical linear modeling resolves the problem of misestimated standard errors by incorporating a unique random…
Life cycle of soil aggregates: from root residue to microbial and physical hotspots
NASA Astrophysics Data System (ADS)
Ghezzehei, T. A.; Or, D.
2017-12-01
Soil aggregation is a physical state of soil in which clumps of primary soil particles are held together by biological and/or chemical cementing agents. Aggregation plays an important role in the storage and movement of water and essential gases, nutrient cycling, and ultimately supporting microbial and plant life. It is also one of the most dynamic and sensitive soil qualities, which readily responds to disturbances such as cultivation, fire, drought, flooding, and changes in vegetation. Soil aggregation that is primarily controlled by organic matter generally exhibits hierarchical organization of soil constituents into stable units that range in size from a few microns to centimeters. However, this conceptual model of soil aggregation as the key unifying mechanism remains poorly quantified and is rarely included in predictive soil models. Here we provide a biophysical framework for quantitative and predictive modeling of soil aggregation and its attendant soil characteristics. The framework treats aggregates as hotspots of biological, chemical, and physical processes centered around roots and root residue. We keep track of the life cycle of an individual aggregate from its genesis in the rhizosphere, fueled by rhizodeposition and mediated by vigorous microbial activity, until its disappearance when the root-derived resources are depleted. The framework synthesizes current understanding of microbial life in porous media; the water-holding and soil-binding capacity of biopolymers; and environmental controls on soil organic matter dynamics. The framework paves the way for integration of processes that are presently modeled as disparate or poorly coupled: storage and protection of carbon, microbial activity, greenhouse gas fluxes, movement and storage of water, and the resistance of soils to erosion.
ERIC Educational Resources Information Center
Wholeben, Brent Edward
A number of key issues facing elementary, secondary, and postsecondary educational administrators during retrenchment require a hierarchical decision-modeling approach. This paper identifies and discusses the use of a hierarchical multiple-alternatives modeling formulation (computer-based) that compares and evaluates a group of solution…
NASA Astrophysics Data System (ADS)
Hong, Liang
2013-10-01
The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution imagery, and ground objects display rich texture, structure, shape, and hierarchical semantic characteristics, with more landscape elements represented by small groups of pixels. In recent years, the object-based remote sensing analysis methodology has become widely accepted and applied in high resolution remote sensing image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method is made up of four blocks: (1) a hierarchical ground-object semantic framework is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmentation regions; (3) the relations between the hierarchical ground-object semantics and the over-segmentation regions are defined within the conditional random fields framework; (4) hierarchical classification results are obtained based on geo-ontology and conditional random fields. Finally, high-resolution remote sensing data (GeoEye) is used to test the performance of the presented method. The experimental results show the superiority of this method to the eCognition method in both effectiveness and accuracy, which implies it is suitable for the classification of high resolution remote sensing imagery.
A hierarchical instrumental decision theory of nicotine dependence.
Hogarth, Lee; Troisi, Joseph R
2015-01-01
It is important to characterize the learning processes governing tobacco-seeking in order to understand how best to treat this behavior. Most drug learning theories have adopted a Pavlovian framework wherein the conditioned response is the main motivational process. We favor instead a hierarchical instrumental decision account, wherein expectations about the instrumental contingency between voluntary tobacco-seeking and the receipt of nicotine reward determine the probability of executing this behavior. To support this view, we review titration and nicotine discrimination research showing that internal signals for deprivation/satiation modulate expectations about the current incentive value of smoking, thereby modulating the propensity of this behavior. We also review research on cue-reactivity, which has shown that external smoking cues modulate expectations about the probability of the tobacco-seeking response being effective, thereby modulating the propensity of this behavior. Economic decision theory is then considered to elucidate how expectations about the value and probability of the response-nicotine contingency are integrated to form an overall utility estimate for that option, for comparison with qualitatively different, nonsubstitute reinforcers, to determine response selection. As an applied test for this hierarchical instrumental decision framework, we consider how well it accounts for individual liability to smoking uptake and perseveration, pharmacotherapy, cue-extinction therapies, and plain packaging. We conclude that the hierarchical instrumental account is successful in reconciling this broad range of phenomena precisely because it accepts that multiple diverse sources of internal and external information must be integrated to shape the decision to smoke.
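The utility integration described above — expected reward value weighted by the expected probability that the response yields the reward, with the higher-utility option selected — can be sketched with purely illustrative numbers (not estimates from this literature):

```python
def utility(value, probability):
    # Expected utility: the current incentive value of the outcome
    # weighted by the expected probability that the response works.
    return value * probability

# Hypothetical options for a sated smoker: satiation lowers the
# incentive value of nicotine relative to a nonsubstitute reinforcer.
options = {
    "tobacco-seeking": utility(value=0.4, probability=0.9),
    "alternative":     utility(value=0.6, probability=0.8),
}
choice = max(options, key=options.get)
print(choice)  # → alternative
```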
NASA Astrophysics Data System (ADS)
Sun, Kaioqiong; Udupa, Jayaram K.; Odhner, Dewey; Tong, Yubing; Torigian, Drew A.
2014-03-01
This paper proposes a thoracic anatomy segmentation method based on hierarchical recognition and delineation guided by a built fuzzy model. Labeled binary samples for each organ are registered and aligned into a 3D fuzzy set representing the fuzzy shape model for the organ. The gray intensity distributions of the corresponding regions of the organ in the original image are recorded in the model. The hierarchical relation and mean location relation between different organs are also captured in the model. Following the hierarchical structure and location relation, the fuzzy shape model of different organs is registered to the given target image to achieve object recognition. A fuzzy connected delineation method is then used to obtain the final segmentation result of organs with seed points provided by recognition. The hierarchical structure and location relation integrated in the model provide the initial parameters for registration and make the recognition efficient and robust. The 3D fuzzy model combined with hierarchical affine registration ensures that accurate recognition can be obtained for both non-sparse and sparse organs. The results on real images are presented and shown to be better than a recently reported fuzzy model-based anatomy recognition strategy.
Brely, Lucas; Bosia, Federico; Pugno, Nicola M
2018-06-20
Contact unit size reduction is a widely studied mechanism as a means to improve adhesion in natural fibrillar systems, such as those observed in beetles or geckos. However, these animals also display complex structural features in the way the contact is subdivided in a hierarchical manner. Here, we study the influence of hierarchical fibrillar architectures on the load distribution over the contact elements of the adhesive system, and the corresponding delamination behaviour. We present an analytical model to derive the load distribution in a fibrillar system loaded in shear, including hierarchical splitting of contacts, i.e. a "hierarchical shear-lag" model that generalizes the well-known shear-lag model used in mechanics. The influence on the detachment process is investigated introducing a numerical procedure that allows the derivation of the maximum delamination force as a function of the considered geometry, including statistical variability of local adhesive energy. Our study suggests that contact splitting generates improved adhesion only in the ideal case of extremely compliant contacts. In real cases, to produce efficient adhesive performance, contact splitting needs to be coupled with hierarchical architectures to counterbalance high load concentrations resulting from contact unit size reduction, generating multiple delamination fronts and helping to avoid detrimental non-uniform load distributions. We show that these results can be summarized in a generalized adhesion scaling scheme for hierarchical structures, proving the beneficial effect of multiple hierarchical levels. The model can thus be used to predict the adhesive performance of hierarchical adhesive structures, as well as the mechanical behaviour of composite materials with hierarchical reinforcements.
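The role of statistical variability in setting the maximum delamination force can be illustrated with a far simpler model than the authors' hierarchical shear-lag formulation: an equal-load-sharing fiber bundle with random detachment thresholds (all parameters below are hypothetical):

```python
import random

random.seed(1)

def max_detachment_force(n_contacts, spread=0.3):
    """Equal-load-sharing bundle: after the k weakest contacts fail,
    the survivors each carry the load, so the bundle sustains
    (n - k) * threshold_k; the total force is the peak over k."""
    thresholds = sorted(random.uniform(1.0 - spread, 1.0 + spread)
                        for _ in range(n_contacts))
    return max((n_contacts - k) * t for k, t in enumerate(thresholds))

# Average strength per contact over many realizations: variability in
# the local thresholds pulls it below the mean threshold of 1.0.
n = 200
forces = [max_detachment_force(n) for _ in range(50)]
mean_force_per_contact = sum(forces) / len(forces) / n
print(round(mean_force_per_contact, 2))  # below the mean threshold 1.0
```

In the paper's setting, hierarchical architectures redistribute load to counter exactly this kind of loss; the sketch only shows why unmitigated variability is detrimental.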
Parameter estimation and prediction for the course of a single epidemic outbreak of a plant disease.
Kleczkowski, A; Gilligan, C A
2007-10-22
Many epidemics of plant diseases are characterized by large variability among individual outbreaks. However, individual epidemics often follow a well-defined trajectory which is much more predictable in the short term than the ensemble (collection) of potential epidemics. In this paper, we introduce a modelling framework that allows us to deal with individual replicated outbreaks, based upon a Bayesian hierarchical analysis. Information about 'similar' replicate epidemics can be incorporated into a hierarchical model, allowing both ensemble and individual parameters to be estimated. The model is used to analyse the data from a replicated experiment involving spread of Rhizoctonia solani on radish in the presence or absence of a biocontrol agent, Trichoderma viride. The rate of primary (soil-to-plant) infection is found to be the most variable factor determining the final size of epidemics. Breakdown of biological control in some replicates results in high levels of primary infection and increased variability. The model can be used to predict new outbreaks of disease based upon knowledge from a 'library' of previous epidemics and partial information about the current outbreak. We show that forecasting improves significantly with knowledge about the history of a particular epidemic, whereas the precision of hindcasting to identify the past course of the epidemic is largely independent of detailed knowledge of the epidemic trajectory. The results have important consequences for parameter estimation, inference and prediction for emerging epidemic outbreaks.
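The "library of previous epidemics" idea rests on partial pooling: replicate-level estimates are shrunk toward the ensemble mean in proportion to the between-replicate versus within-replicate variance. A minimal empirical-Bayes sketch with hypothetical infection rates (not the Rhizoctonia data):

```python
# Hypothetical primary-infection rates estimated from 5 replicate
# epidemics; one replicate mimics a biocontrol breakdown.
rates = [0.12, 0.15, 0.45, 0.10, 0.13]
n = len(rates)
grand = sum(rates) / n

# Crude variance components: between-replicate, and an assumed
# within-replicate sampling variance for each estimate.
between = sum((r - grand) ** 2 for r in rates) / (n - 1)
within = 0.002

# Shrinkage factor: how much to trust the replicate vs the ensemble.
shrink = between / (between + within)

pooled = [grand + shrink * (r - grand) for r in rates]
print([round(p, 3) for p in pooled])
```

Each pooled estimate lies between the replicate's own estimate and the ensemble mean; a full hierarchical analysis does the same borrowing of strength while propagating uncertainty.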
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang
Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies type, and petrophysical parameters from the well logging curves of thousands of closely spaced wells over a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed. This study aimed at the geometry and internal architecture of sandbodies, in accordance with their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing reservoir heterogeneity in light of river sedimentary rules. Results and methods obtained in outcrop and modern sedimentation studies successfully supported the study. Taking advantage of this method, the major producing layers (PI1-2), which have been considered heterogeneous, thick fluvial reservoirs extending widely laterally, are examined in detail. These layers are subdivided vertically into single sedimentary units, and the microfacies are identified horizontally. Furthermore, a complex system is recognized according to its hierarchical levels from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies of point bar. The achieved results improve the description of the areal distribution of point bar sandbodies and provide an accurate and detailed framework for establishing a high-resolution predictive model. Combined with geostatistical techniques, the method also plays an important role in searching for enriched zones of residual oil.
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
Hierarchical model analysis of the Atlantic Flyway Breeding Waterfowl Survey
Sauer, John R.; Zimmerman, Guthrie S.; Klimstra, Jon D.; Link, William A.
2014-01-01
We used log-linear hierarchical models to analyze data from the Atlantic Flyway Breeding Waterfowl Survey. The survey has been conducted by state biologists each year since 1989 in the northeastern United States from Virginia north to New Hampshire and Vermont. Although yearly population estimates from the survey are used by the United States Fish and Wildlife Service for estimating regional waterfowl population status for mallards (Anas platyrhynchos), black ducks (Anas rubripes), wood ducks (Aix sponsa), and Canada geese (Branta canadensis), they are not routinely adjusted to control for time of day effects and other survey design issues. The hierarchical model analysis permits estimation of year effects and population change while accommodating the repeated sampling of plots and controlling for time of day effects in counting. We compared population estimates from the current stratified random sample analysis to population estimates from hierarchical models with alternative model structures that describe year to year changes as random year effects, a trend with random year effects, or year effects modeled as 1-year differences. Patterns of population change from the hierarchical model results generally were similar to the patterns described by stratified random sample estimates, but significant visibility differences occurred between twilight and midday counts in all species. Controlling for the effects of time of day resulted in larger population estimates for all species in the hierarchical model analysis relative to the stratified random sample analysis. The hierarchical models also provided a convenient means of estimating population trend as derived statistics from the analysis. We detected significant declines in mallards and American black ducks and significant increases in wood ducks and Canada geese, a trend that had not been significant for 3 of these 4 species in the prior analysis.
We recommend using hierarchical models for analysis of the Atlantic Flyway Breeding Waterfowl Survey.
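The adjustment for time-of-day effects can be sketched in miniature: simulate counts from a log-linear model with a twilight visibility term, then recover that term as the mean log-ratio of paired counts. This is a simplified stand-in for the full hierarchical analysis (which estimates all effects jointly by MCMC); all numbers are hypothetical:

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's method; adequate for the modest means used here."""
    l, k, p = math.exp(-lam), 0, 1.0
    while p > l:
        k += 1
        p *= random.random()
    return k - 1

# Simulated survey counts following the log-linear form
#   count ~ Poisson(exp(year_effect + visibility * twilight)),
# with a true twilight visibility boost of 0.4 on the log scale.
visibility = 0.4
data = [(t, tw, poisson(math.exp(4.0 + 0.05 * t + visibility * tw)))
        for t in range(10) for tw in (0, 1)]

# Recover the visibility effect from paired twilight/midday counts.
midday = [c for _, tw, c in data if tw == 0]
twilight = [c for _, tw, c in data if tw == 1]
est_vis = sum(math.log(b / a) for a, b in zip(midday, twilight)) / len(midday)
print(round(est_vis, 2))  # close to the true visibility effect
```

Dividing twilight counts by exp(est_vis) puts all plots on a common midday-equivalent scale, which is why the adjusted population estimates differ from the unadjusted ones.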
MacNab, Ying C
2016-08-01
This paper is concerned with multivariate conditional autoregressive models defined by linear combinations of independent or correlated underlying spatial processes. Known as linear models of coregionalization, the method offers a systematic and unified approach for formulating multivariate extensions to a broad range of univariate conditional autoregressive models. The resulting multivariate spatial models represent classes of coregionalized multivariate conditional autoregressive models that enable flexible modelling of multivariate spatial interactions, yielding coregionalization models with symmetric or asymmetric cross-covariances of different spatial variation and smoothness. In the context of multivariate disease mapping, for example, they facilitate borrowing strength both over space and across variables, allowing for more flexible multivariate spatial smoothing. Specifically, we present a broadened coregionalization framework that includes order-dependent, order-free, and order-robust multivariate models; a new class of order-free coregionalized multivariate conditional autoregressive models is introduced. We tackle computational challenges and present solutions that are integral to Bayesian analysis of these models. We also discuss two ways of computing the deviance information criterion for comparison among competing hierarchical models with or without unidentifiable prior parameters. The models and related methodology are developed in the broad context of modelling multivariate data on a spatial lattice and illustrated in the context of multivariate disease mapping. The coregionalization framework and related methods also present a general approach for building spatially structured cross-covariance functions for multivariate geostatistics. © The Author(s) 2016.
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
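The positive semidefinite requirement stated here can be checked directly: a Cholesky-type factorization runs to completion exactly when the matrix is PSD. A stdlib-only sketch with made-up genotype-like vectors and a linear kernel (one of many valid kernel choices):

```python
def linear_kernel(x, z):
    """Similarity of two subjects as the inner product of their features."""
    return sum(a * b for a, b in zip(x, z))

def is_psd(m, tol=1e-9):
    """Column-wise Cholesky test: a symmetric matrix is PSD iff the
    factorization completes without hitting a negative pivot."""
    n = len(m)
    a = [row[:] for row in m]
    for i in range(n):
        for j in range(i):
            a[i][i] -= a[i][j] ** 2
        if a[i][i] < -tol:
            return False
        a[i][i] = max(a[i][i], 0.0) ** 0.5
        for k in range(i + 1, n):
            for j in range(i):
                a[k][i] -= a[k][j] * a[i][j]
            a[k][i] = a[k][i] / a[i][i] if a[i][i] > tol else 0.0
    return True

# Made-up 0/1/2 genotype vectors for four subjects (illustrative only).
subjects = [[0, 1, 2, 1], [1, 1, 0, 2], [2, 0, 1, 1], [0, 2, 2, 0]]
gram = [[linear_kernel(x, z) for z in subjects] for x in subjects]
print(is_psd(gram))  # → True: inner-product Gram matrices are PSD
```

An arbitrary symmetric "similarity" table, by contrast, can fail the test (e.g. [[1, 2], [2, 1]] is not PSD), which is why not every heuristic similarity measure is a valid kernel.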
Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich
2009-02-10
Multivariate analysis of interval-censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models that additionally include random effects is not available at all. Existing algorithms pose problems for practical users: matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval-censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval-censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available at CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.
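The MCMC-with-imputation strategy can be sketched for the simplest case: latent event times are imputed within their censoring intervals, then a conjugate update refreshes the model parameter. This toy exponential model (with uniform imputation as a crude stand-in for the truncated-exponential conditional) is an illustration, not the survBayes implementation:

```python
import random

random.seed(0)

# Interval-censored data: each event is only known to lie in (lo, hi).
intervals = [(0.5, 1.5), (1.0, 2.5), (0.2, 0.8), (2.0, 4.0), (0.7, 1.2)]

# Data augmentation for an exponential event-time model with a
# conjugate Gamma(a0, b0) prior on the rate.
a0, b0 = 1.0, 1.0
draws = []
for _ in range(2000):
    # 1) Impute latent times. We draw uniformly within each interval,
    #    a simplification of the model-based conditional draw.
    times = [random.uniform(lo, hi) for lo, hi in intervals]
    # 2) Update the rate from its Gamma full conditional
    #    (gammavariate takes shape and *scale*).
    rate = random.gammavariate(a0 + len(times), 1.0 / (b0 + sum(times)))
    draws.append(rate)

post_rate = sum(draws[500:]) / len(draws[500:])
print(round(post_rate, 2))  # posterior mean of the event rate
```

Hierarchical extensions add random-effect updates to the same loop, which is what makes the Bayesian route tractable where classical likelihood methods stall.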
Variational Integrators for Interconnected Lagrange-Dirac Systems
NASA Astrophysics Data System (ADS)
Parks, Helen; Leok, Melvin
2017-10-01
Interconnected systems are an important class of mathematical models, as they allow for the construction of complex, hierarchical, multiphysics, and multiscale models by the interconnection of simpler subsystems. Lagrange-Dirac mechanical systems provide a broad category of mathematical models that are closed under interconnection, and in this paper, we develop a framework for the interconnection of discrete Lagrange-Dirac mechanical systems, with a view toward constructing geometric structure-preserving discretizations of interconnected systems. This work builds on previous work on the interconnection of continuous Lagrange-Dirac systems (Jacobs and Yoshimura in J Geom Mech 6(1):67-98, 2014) and discrete Dirac variational integrators (Leok and Ohsawa in Found Comput Math 11(5):529-562, 2011). We test our results by simulating some of the continuous examples given in Jacobs and Yoshimura (2014).
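The flavor of a geometric structure-preserving discretization can be shown with the simplest variational integrator, Störmer-Verlet, applied to a pendulum; this sketches the general idea of discretizing the variational principle, not the paper's Dirac-structure construction:

```python
import math

def verlet(q0, p0, h, n, dVdq):
    """Stormer-Verlet: the variational integrator arising from a
    simple discretization of the Lagrangian L = p^2/2 - V(q)."""
    q, p = q0, p0
    traj = [(q, p)]
    for _ in range(n):
        p_half = p - 0.5 * h * dVdq(q)   # half kick
        q = q + h * p_half               # drift
        p = p_half - 0.5 * h * dVdq(q)   # half kick
        traj.append((q, p))
    return traj

# Pendulum: V(q) = 1 - cos(q), so dV/dq = sin(q).
traj = verlet(q0=0.5, p0=0.0, h=0.05, n=4000, dVdq=math.sin)

def energy(q, p):
    return 0.5 * p * p + (1.0 - math.cos(q))

drift = abs(energy(*traj[-1]) - energy(*traj[0]))
print(drift)  # stays small: symplectic methods bound the energy error
```

A non-geometric scheme (e.g. explicit Euler) would show secular energy growth over the same horizon; preserving the geometric structure is what keeps long interconnected simulations physically sensible.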
A hierarchical model for estimating change in American Woodcock populations
Sauer, J.R.; Link, W.A.; Kendall, W.L.; Kelley, J.R.; Niven, D.K.
2008-01-01
The Singing-Ground Survey (SGS) is a primary source of information on population change for American woodcock (Scolopax minor). We analyzed the SGS using a hierarchical log-linear model and compared the estimates of change and annual indices of abundance to a route regression analysis of SGS data. We also grouped SGS routes into Bird Conservation Regions (BCRs) and estimated population change and annual indices using BCRs within states and provinces as strata. Based on the hierarchical model-based estimates, we concluded that woodcock populations were declining in North America between 1968 and 2006 (trend = -0.9%/yr, 95% credible interval: -1.2, -0.5). Singing-Ground Survey results are generally similar between analytical approaches, but the hierarchical model has several important advantages over the route regression. Hierarchical models better accommodate changes in survey efficiency over time and space by treating strata, years, and observers as random effects in the context of a log-linear model, providing trend estimates that are derived directly from the annual indices. We also conducted a hierarchical model analysis of woodcock data from the Christmas Bird Count and the North American Breeding Bird Survey. All surveys showed general consistency in patterns of population change, but the SGS had the shortest credible intervals. We suggest that population management and conservation planning for woodcock involving interpretation of the SGS use estimates provided by the hierarchical model.
CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.
2015-10-20
We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
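The covariance construction described here — a global stationary kernel plus local kernels that inflate variance around pathological lines — can be sketched with a squared-exponential global term and one boosted local patch (illustrative pixel indices and amplitudes only, not Starfish's actual kernels):

```python
import math

def global_kernel(i, j, amp=1.0, ell=2.0):
    """Squared-exponential covariance between pixels i and j."""
    return amp ** 2 * math.exp(-0.5 * ((i - j) / ell) ** 2)

def local_kernel(i, j, center, width, amp):
    """Extra covariance confined to a patch around one flagged line."""
    if abs(i - center) > width or abs(j - center) > width:
        return 0.0
    return amp ** 2 * math.exp(-0.5 * ((i - j) / width) ** 2)

n = 20
noise = 0.1
cov = [[global_kernel(i, j)
        + local_kernel(i, j, center=12, width=2, amp=3.0)
        + (noise ** 2 if i == j else 0.0)
        for j in range(n)] for i in range(n)]

# Pixels under the flagged line carry inflated variance, so a mismatched
# spectral feature there contributes little to the likelihood.
print(cov[12][12] > 5 * cov[0][0])  # → True
```

Evaluating the multivariate-normal likelihood with this full covariance (rather than a diagonal one) is what restores honest parameter uncertainties.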
2017-09-01
efficacy of statistical post-processing methods downstream of these dynamical model components with a hierarchical multivariate Bayesian approach to...Bayesian hierarchical modeling, Markov chain Monte Carlo methods , Metropolis algorithm, machine learning, atmospheric prediction 15. NUMBER OF PAGES...scale processes. However, this dissertation explores the efficacy of statistical post-processing methods downstream of these dynamical model components