Sample records for functional modelling approach

  1. Accurate analytical modeling of junctionless DG-MOSFET by Green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of the junctionless double-gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using a Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to the 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations, which are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of the Ids-Vgs curves between long-channel and short-channel devices. The Green's function approach of solving the 2-D Poisson's equation in both the oxide and silicon regions accurately predicts the channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off and subthreshold slope (SS) of both long- and short-channel devices designed with different doping concentrations and with higher as well as lower tsi/tox ratios. All analytical model results are verified through comparison with TCAD Sentaurus device simulations, with which the model agrees quite well.
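
    The core analytical step, a series solution of a 2-D boundary-value problem, can be illustrated with a toy example. The sketch below solves Laplace's equation on a rectangle with one biased edge; the geometry, bias, and series depth are illustrative placeholders, not the paper's device structure or its mixed-boundary multi-zone formulation.

```python
import numpy as np

def phi(x, y, L=1.0, H=1.0, V0=1.0, nmax=2001):
    """Separation-of-variables series for Laplace's equation on [0,L]x[0,H]
    with phi = V0 on the y = 0 edge and phi = 0 on the other three edges.
    The sinh ratio is evaluated in a numerically stable exponential form."""
    total = 0.0
    for n in range(1, nmax + 1, 2):                    # odd harmonics only
        k = n * np.pi / L
        # sinh(k*(H - y)) / sinh(k*H), rewritten to avoid overflow at large k
        ratio = np.exp(-k * y) * (1 - np.exp(-2 * k * (H - y))) \
                / (1 - np.exp(-2 * k * H))
        total += (4 * V0 / (n * np.pi)) * np.sin(k * x) * ratio
    return total
```

    A convenient sanity check: by symmetry, the potential at the center of the unit square with one edge at V0 is V0/4.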

  2. An alternative approach for modeling strength differential effect in sheet metals with symmetric yield functions

    NASA Astrophysics Data System (ADS)

    Kurukuri, Srihari; Worswick, Michael J.

    2013-12-01

    An alternative approach is proposed to utilize symmetric yield functions for modeling the tension-compression asymmetry commonly observed in hcp materials. In this work, the strength differential (SD) effect is modeled by choosing separate symmetric plane stress yield functions (for example, Barlat Yld 2000-2d) for tension, i.e., in the first quadrant of principal stress space, and compression, i.e., in the third quadrant of principal stress space. In the second and fourth quadrants, the yield locus is constructed by adopting interpolating functions between uniaxial tensile and compressive stress states. Different interpolating functions are chosen and the predictive capability of each approach is discussed. The main advantage of the proposed approach is that the yield locus parameters are deterministic and relatively easy to identify when compared to the Cazacu family of yield functions commonly used for modeling the SD effect observed in hcp materials.

  3. The Use of Modeling Approach for Teaching Exponential Functions

    NASA Astrophysics Data System (ADS)

    Nunes, L. F.; Prates, D. B.; da Silva, J. M.

    2017-12-01

    This work presents a discussion of the teaching and learning of mathematical content related to exponential functions in a group of freshman students enrolled in the first semester of the Science and Technology Bachelor's program (STB) of the Federal University of Jequitinhonha and Mucuri Valleys (UFVJM). As a contextualization tool strongly advocated in the literature, the modelling approach was used as an educational tool to contextualize the teaching-learning process of exponential functions for these students. To this end, simple models were elaborated with the GeoGebra software, and Didactic Engineering was used as the research methodology to obtain a qualitative evaluation of the investigation and its results. As a consequence of this detailed research, some interesting aspects of the teaching and learning process were observed, discussed and described.

  4. Numerical approach to model-independently reconstruct f(R) functions through cosmographic data

    NASA Astrophysics Data System (ADS)

    Pizza, Liberato

    2015-06-01

    The challenging issue of determining the correct f(R) among several possibilities is revisited here by means of numerical reconstructions of the modified Friedmann equations over the redshift interval z ∈ [0, 1]. Frequently, a severe degeneracy between f(R) approaches occurs, since different paradigms correctly explain present-time dynamics. To set the initial conditions on the f(R) functions, we employ the so-called cosmography of the Universe, i.e., the technique of fixing constraints on the observable Universe by comparing expanded observables with current data. This powerful approach is essentially model independent, and correspondingly we obtain a model-independent reconstruction of f(R(z)) classes within the interval z ∈ [0, 1]. To allow the Hubble rate to evolve around z ≤ 1, we consider three relevant frameworks of effective cosmological dynamics: the ΛCDM model, the Chevallier-Polarski-Linder parametrization, and a polynomial approach to dark energy. Finally, cumbersome algebra permits passing from f(z) to f(R), and the general outcome of our work is the determination of a viable f(R) function that effectively describes the observed dynamics of the Universe.
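
    The expansion histories named above are standard templates. As a rough illustration, the Hubble rate for a flat Universe with Chevallier-Polarski-Linder (CPL) dark energy, w(z) = w0 + wa·z/(1+z), reduces to ΛCDM for w0 = -1, wa = 0. The parameter values below are generic placeholders, not the paper's cosmographic fits.

```python
import numpy as np

def hubble_rate(z, omega_m=0.3, w0=-1.0, wa=0.0, h0=70.0):
    """H(z) in km/s/Mpc for a flat universe with CPL dark energy;
    w0 = -1, wa = 0 recovers the LambdaCDM expansion history.
    omega_m and h0 are illustrative values, not fitted ones."""
    de = (1 - omega_m) * (1 + z) ** (3 * (1 + w0 + wa)) \
         * np.exp(-3 * wa * z / (1 + z))
    return h0 * np.sqrt(omega_m * (1 + z) ** 3 + de)
```

    Comparing such H(z) curves against cosmographic constraints at z ≤ 1 is the step that sets the initial conditions for the f(R(z)) reconstruction.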

  5. A function space approach to smoothing with applications to model error estimation for flexible spacecraft control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1981-01-01

    A function space approach to smoothing is used to obtain a set of model error estimates inherent in a reduced-order model. By establishing knowledge of inevitable deficiencies in the truncated model, the error estimates provide a foundation for updating the model and thereby improving system performance. The function space smoothing solution leads to a specification of a method for computation of the model error estimates and development of model error analysis techniques for comparison between actual and estimated errors. The paper summarizes the model error estimation approach as well as an application arising in the area of modeling for spacecraft attitude control.

  6. Statistical Methods for Proteomic Biomarker Discovery based on Feature Extraction or Functional Modeling Approaches.

    PubMed

    Morris, Jeffrey S

    2012-01-01

    In recent years, developments in molecular biotechnology have led to the increased promise of detecting and validating biomarkers, or molecular markers that relate to various biological or medical outcomes. Proteomics, the direct study of proteins in biological samples, plays an important role in the biomarker discovery process. These technologies produce complex, high-dimensional functional and image data that present many analytical challenges that must be addressed properly for effective comparative proteomics studies that can yield potential biomarkers. Specific challenges include experimental design, preprocessing, feature extraction, and statistical analysis accounting for the inherent multiple testing issues. This paper reviews various computational aspects of comparative proteomic studies and summarizes contributions that I, along with numerous collaborators, have made. First, there is an overview of comparative proteomics technologies, followed by a discussion of important experimental design and preprocessing issues that must be considered before statistical analysis can be done. Next, the two key approaches to analyzing proteomics data, feature extraction and functional modeling, are described. Feature extraction involves detection and quantification of discrete features like peaks or spots that theoretically correspond to different proteins in the sample. After an overview of the feature extraction approach, specific methods for mass spectrometry (Cromwell) and 2D gel electrophoresis (Pinnacle) are described. The functional modeling approach involves modeling the proteomic data in their entirety as functions or images. A general discussion of the approach is followed by the presentation of a specific method that can be applied, wavelet-based functional mixed models, and its extensions. All methods are illustrated by application to two example proteomic data sets, one from mass spectrometry and one from 2D gel electrophoresis. While the specific methods

  7. Modelling short time series in metabolomics: a functional data analysis approach.

    PubMed

    Montana, Giovanni; Berk, Maurice; Ebbels, Tim

    2011-01-01

    Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
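
    The smooth-curve idea can be sketched with a toy fit: pool the replicate measurements, fit one smooth mean curve per condition, and compare the fitted curves. The low-order polynomial basis and the max-gap statistic below are deliberate simplifications of the paper's spline-based random-curve model and its formal test statistic.

```python
import numpy as np

def mean_curve(times, replicates, degree=3):
    """Fit one smooth mean curve (a low-order polynomial stand-in for the
    paper's smooth random curves) to repeated short time series."""
    t = np.tile(times, len(replicates))       # stack replicate time axes
    y = np.concatenate(replicates)            # stack replicate measurements
    return np.poly1d(np.polyfit(t, y, degree))

def max_profile_difference(times, group_a, group_b):
    """Toy statistic: maximum gap between the two fitted mean curves,
    analogous in spirit to comparing temporal profiles between conditions."""
    grid = np.linspace(times.min(), times.max(), 200)
    fa, fb = mean_curve(times, group_a), mean_curve(times, group_b)
    return np.max(np.abs(fa(grid) - fb(grid)))
```

    With only a handful of time points per series, pooling replicates into one smooth fit is what makes the comparison feasible where classical time series models fail.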

  8. Functional enzyme-based modeling approach for dynamic simulation of denitrification process in hyporheic zone sediments: Genetically structured microbial community model

    NASA Astrophysics Data System (ADS)

    Song, H. S.; Li, M.; Qian, W.; Song, X.; Chen, X.; Scheibe, T. D.; Fredrickson, J.; Zachara, J. M.; Liu, C.

    2016-12-01

    Modeling environmental microbial communities at individual organism level is currently intractable due to overwhelming structural complexity. Functional guild-based approaches alleviate this problem by lumping microorganisms into fewer groups based on their functional similarities. This reduction may become ineffective, however, when individual species perform multiple functions as environmental conditions vary. In contrast, the functional enzyme-based modeling approach we present here describes microbial community dynamics based on identified functional enzymes (rather than individual species or their groups). Previous studies in the literature along this line used biomass or functional genes as surrogate measures of enzymes due to the lack of analytical methods for quantifying enzymes in environmental samples. Leveraging our recent development of a signature peptide-based technique enabling sensitive quantification of functional enzymes in environmental samples, we developed a genetically structured microbial community model (GSMCM) to incorporate enzyme concentrations and various other omics measurements (if available) as key modeling input. We formulated the GSMCM based on the cybernetic metabolic modeling framework to rationally account for cellular regulation without relying on empirical inhibition kinetics. In the case study of modeling denitrification process in Columbia River hyporheic zone sediments collected from the Hanford Reach, our GSMCM provided a quantitative fit to complex experimental data in denitrification, including the delayed response of enzyme activation to the change in substrate concentration. Our future goal is to extend the modeling scope to the prediction of carbon and nitrogen cycles and contaminant fate. Integration of a simpler version of the GSMCM with PFLOTRAN for multi-scale field simulations is in progress.

  9. Cylindrically symmetric Green's function approach for modeling the crystal growth morphology of ice.

    PubMed

    Libbrecht, K G

    1999-08-01

    We describe a front-tracking Green's function approach to modeling cylindrically symmetric crystal growth. This method is simple to implement, and with little computer power can adequately model a wide range of physical situations. We apply the method to modeling the hexagonal prism growth of ice crystals, which is governed primarily by diffusion along with anisotropic surface kinetic processes. From ice crystal growth observations in air, we derive measurements of the kinetic growth coefficients for the basal and prism faces as a function of temperature, for supersaturations near the water saturation level. These measurements are interpreted in the context of a model for the nucleation and growth of ice, in which the growth dynamics are dominated by the structure of a disordered layer on the ice surfaces.

  10. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Bose, Benjamin; Koyama, Kazuya

    2017-08-01

    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with ≤ 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h ≤ s ≤ 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.
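
    The Fourier-transform step (power spectrum to correlation function monopole) amounts to a one-dimensional quadrature, xi(r) = (1/2π²) ∫ dk k² P(k) j0(kr). The Gaussian P(k) in the check below is a toy spectrum chosen only because its transform is known in closed form; it is not the RegPT spectrum used in the paper.

```python
import numpy as np

def xi_from_pk(r, k, pk):
    """Correlation function monopole from a tabulated power spectrum via
    trapezoidal quadrature, with j0(x) = sin(x)/x expressed through np.sinc
    (np.sinc(x) = sin(pi*x)/(pi*x), hence the division by pi)."""
    j0 = np.sinc(k * r / np.pi)
    integrand = k ** 2 * pk * j0 / (2 * np.pi ** 2)
    return 0.5 * np.sum((integrand[1:] + integrand[:-1]) * np.diff(k))
```

    For P(k) = exp(-k²/2) the exact answer is xi(r) = exp(-r²/2)/(2π)^{3/2}, which makes a convenient accuracy check before feeding in a realistic spectrum.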

  11. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bose, Benjamin; Koyama, Kazuya, E-mail: benjamin.bose@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk

    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with ≤ 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h ≤ s ≤ 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.

  12. Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2013-01-01

    A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.
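
    The flavor of fitting a transfer function to frequency-domain data can be sketched with an equation-error least-squares fit of a first-order model G(s) = b/(s + a). This linearization is a common textbook device used here for illustration; it is not the paper's orthogonal modeling-function algorithm.

```python
import numpy as np

def fit_first_order(omega, resp):
    """Fit G(s) = b / (s + a) to frequency response data resp = G(j*omega).
    Rearranging G*(j*omega + a) = b gives G*j*omega = b - a*G, which is
    linear in the unknowns x = [b, a]; solve by stacking real/imag parts."""
    A = np.column_stack([np.ones_like(resp), -resp])
    y = resp * 1j * omega
    A_ri = np.vstack([A.real, A.imag])
    y_ri = np.concatenate([y.real, y.imag])
    x, *_ = np.linalg.lstsq(A_ri, y_ri, rcond=None)
    b, a = x
    return b, a
```

    On noise-free data the fit recovers the true coefficients exactly; with noisy data, model structure determination (choosing the orders of numerator and denominator) becomes the harder part, which is what the orthogonal-function method addresses.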

  13. Estimating Function Approaches for Spatial Point Processes

    NASA Astrophysics Data System (ADS)

    Deng, Chong

    Spatial point pattern data consist of locations of events that are often of interest in biological and ecological studies. Such data are commonly viewed as a realization from a stochastic process called a spatial point process. To fit a parametric spatial point process model to such data, likelihood-based methods have been widely studied. However, while maximum likelihood estimation is often too computationally intensive for Cox and cluster processes, pairwise likelihood methods such as composite likelihood and Palm likelihood usually suffer from a loss of information because they ignore the correlation among pairs. For many types of correlated data other than spatial point processes, when likelihood-based approaches are not desirable, estimating functions have been widely used for model fitting. In this dissertation, we explore estimating function approaches for fitting spatial point process models. These approaches, which are based on asymptotically optimal estimating function theories, can be used to incorporate the correlation among data and yield more efficient estimators. We conducted a series of studies to demonstrate that these estimating function approaches are good alternatives that balance the trade-off between computational complexity and estimation efficiency. First, we propose a new estimating procedure that improves the efficiency of the pairwise composite likelihood method in estimating clustering parameters. Our approach combines estimating functions derived from pairwise composite likelihood estimation with estimating functions that account for correlations among the pairwise contributions. Our method can be used to fit a variety of parametric spatial point process models and can yield more efficient estimators for the clustering parameters than pairwise composite likelihood estimation. We demonstrate its efficacy through a simulation study and an application to the longleaf pine data. Second, we further explore the quasi-likelihood approach on fitting

  14. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.

  15. Point-to-point migration functions and gravity model renormalization: approaches to aggregation in spatial interaction modeling.

    PubMed

    Slater, P B

    1985-08-01

    Two distinct approaches to assessing the effect of geographic scale on spatial interactions are modeled. In the first, the question of whether a distance deterrence function, which explains interactions for one system of zones, can also succeed on a more aggregate scale is examined. Only the two-parameter function, for which distances between macrozones are found to be weighted averages of distances between component zones, is satisfactory in this regard. Estimation of continuous (point-to-point) functions--in the form of quadrivariate cubic polynomials--for US interstate migration streams is then undertaken. Upon numerical integration, these higher-order surfaces yield predictions of interzonal and intrazonal movements at any scale of interest. Tests of spatial stationarity, isotropy, and symmetry of interstate migration are conducted in this framework.
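
    A distance deterrence function of the classic power-law type can be recovered from flow data by log-linear regression. The sketch below, with synthetic populations, distances, and a hypothetical deterrence exponent, illustrates that estimation step only; it is not the paper's quadrivariate cubic polynomial fit.

```python
import numpy as np

def fit_gravity(pop_o, pop_d, dist, flows):
    """Log-linear fit of the gravity model T = k * Po * Pd * d**(-beta).
    Taking logs gives log T - log Po - log Pd = log k - beta * log d,
    which is linear in (log k, beta)."""
    y = np.log(flows) - np.log(pop_o) - np.log(pop_d)
    A = np.column_stack([np.ones_like(dist), -np.log(dist)])
    (log_k, beta), *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.exp(log_k), beta
```

    The aggregation question studied in the paper then asks whether a beta estimated at one zonal scale remains valid when zones are merged into macrozones.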

  16. Model-based functional neuroimaging using dynamic neural fields: An integrative cognitive neuroscience approach

    PubMed Central

    Wijeakumar, Sobanawartiny; Ambrose, Joseph P.; Spencer, John P.; Curtu, Rodica

    2017-01-01

    A fundamental challenge in cognitive neuroscience is to develop theoretical frameworks that effectively span the gap between brain and behavior, between neuroscience and psychology. Here, we attempt to bridge this divide by formalizing an integrative cognitive neuroscience approach using dynamic field theory (DFT). We begin by providing an overview of how DFT seeks to understand the neural population dynamics that underlie cognitive processes through previous applications and comparisons to other modeling approaches. We then use previously published behavioral and neural data from a response selection Go/Nogo task as a case study for model simulations. Results from this study served as the ‘standard’ for comparisons with a model-based fMRI approach using dynamic neural fields (DNF). The tutorial explains the rationale and hypotheses involved in the process of creating the DNF architecture and fitting model parameters. Two DNF models, with similar structure and parameter sets, are then compared. Both models effectively simulated reaction times from the task as we varied the number of stimulus-response mappings and the proportion of Go trials. Next, we directly simulated hemodynamic predictions from the neural activation patterns of each model. These predictions were tested using general linear models (GLMs). Results showed that the DNF model that was created by tuning parameters to simultaneously capture trends in neural activation and behavioral data quantitatively outperformed a standard GLM analysis of the same dataset. Further, by using the GLM results to assign functional roles to particular clusters in the brain, we illustrate how DNF models shed new light on the neural populations’ dynamics within particular brain regions. Thus, the present study illustrates how an interactive cognitive neuroscience model can be used in practice to bridge the gap between brain and behavior. PMID:29118459

  17. Optogenetic approaches to evaluate striatal function in animal models of Parkinson disease.

    PubMed

    Parker, Krystal L; Kim, Youngcho; Alberico, Stephanie L; Emmons, Eric B; Narayanan, Nandakumar S

    2016-03-01

    Optogenetics refers to the ability to control cells that have been genetically modified to express light-sensitive ion channels. The introduction of optogenetic approaches has facilitated the dissection of neural circuits. Optogenetics allows for the precise stimulation and inhibition of specific sets of neurons and their projections with fine temporal specificity. These techniques are ideally suited to investigating neural circuitry underlying motor and cognitive dysfunction in animal models of human disease. Here, we focus on how optogenetics has been used over the last decade to probe striatal circuits that are involved in Parkinson disease, a neurodegenerative condition involving motor and cognitive abnormalities resulting from degeneration of midbrain dopaminergic neurons. The precise mechanisms underlying the striatal contribution to both cognitive and motor dysfunction in Parkinson disease are unknown. Although optogenetic approaches are somewhat removed from clinical use, insight from these studies can help identify novel therapeutic targets and may inspire new treatments for Parkinson disease. Elucidating how neuronal and behavioral functions are influenced and potentially rescued by optogenetic manipulation in animal models could prove to be translatable to humans. These insights can be used to guide future brain-stimulation approaches for motor and cognitive abnormalities in Parkinson disease and other neuropsychiatric diseases.

  18. A stochastic approach for model reduction and memory function design in hydrogeophysical inversion

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Kellogg, A.; Terry, N.

    2009-12-01

    Geophysical (e.g., seismic, electromagnetic, radar) techniques and statistical methods are essential for research related to subsurface characterization, including monitoring subsurface flow and transport processes, oil/gas reservoir identification, etc. For deep subsurface characterization such as reservoir petroleum exploration, seismic methods have been widely used. Recently, electromagnetic (EM) methods have drawn great attention in the area of reservoir characterization. However, considering the enormous computational demand of seismic and EM forward modeling, the number of unknown parameters in the modeling domain quickly becomes prohibitive. For shallow subsurface applications, the characterization can be very complicated given the complexity and nonlinearity of flow and transport processes in the unsaturated zone. It is therefore warranted to reduce the dimension of the parameter space to a reasonable level. Another common concern is how to make the best use of time-lapse data with spatial-temporal correlations. This is even more critical when we try to monitor subsurface processes using geophysical data collected at different times. The normal practice is to obtain the inverse images individually. These images are not necessarily continuous or even reasonably related, because of the non-uniqueness of hydrogeophysical inversion. We propose to use a stochastic framework integrating the minimum-relative-entropy concept, quasi-Monte Carlo sampling techniques, and statistical tests. The approach allows efficient and sufficient exploration of all possibilities of model parameters and evaluation of their significance to geophysical responses. The analyses enable us to reduce the parameter space significantly. The approach can be combined with Bayesian updating, allowing us to treat the updated ‘posterior’ pdf as a memory function, which stores all the information up to date about the distributions of soil/field attributes/properties, then consider the

  19. The basis function approach for modeling autocorrelation in ecological data

    USGS Publications Warehouse

    Hefley, Trevor J.; Broms, Kristin M.; Brost, Brian M.; Buderman, Frances E.; Kay, Shannon L.; Scharf, Henry; Tipton, John; Williams, Perry J.; Hooten, Mevin B.

    2017-01-01

    Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data.
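
    The basis-function idea can be sketched in a few lines: augment a regression with smooth functions of time so that the fitted mean absorbs the temporal structure. Gaussian bases with the settings below are one common choice among the many families the paper surveys (splines, wavelets, eigenvectors of spatial covariance, etc.); the basis count and width are illustrative.

```python
import numpy as np

def gaussian_basis(t, centers, width):
    """Evaluate Gaussian radial basis functions of time at points t."""
    return np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)

def fit_basis_regression(t, y, n_basis=8, width=None):
    """Regress y on an intercept plus Gaussian basis functions of time,
    one simple way to fold temporal autocorrelation into the mean model."""
    centers = np.linspace(t.min(), t.max(), n_basis)
    if width is None:
        width = centers[1] - centers[0]       # width tied to basis spacing
    X = np.column_stack([np.ones_like(t), gaussian_basis(t, centers, width)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef                            # fitted smooth trend
```

    Because the basis columns enter as ordinary regressors, the same construction drops into GLMs, occupancy models, or other ecological models with little change, which is the portability the paper emphasizes.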

  20. Transferability of species distribution models: a functional habitat approach for two regionally threatened butterflies.

    PubMed

    Vanreusel, Wouter; Maes, Dirk; Van Dyck, Hans

    2007-02-01

    Numerous models for predicting species distribution have been developed for conservation purposes. Most of them make use of environmental data (e.g., climate, topography, land use) at a coarse grid resolution (often kilometres). Such approaches are useful for conservation policy issues including reserve-network selection. The efficiency of predictive models for species distribution is usually tested on the area for which they were developed. Although highly interesting from the point of view of conservation efficiency, transferability of such models to independent areas is still under debate. We tested the transferability of habitat-based predictive distribution models for two regionally threatened butterflies, the green hairstreak (Callophrys rubi) and the grayling (Hipparchia semele), within and among three nature reserves in northeastern Belgium. We built predictive models based on spatially detailed maps of area-wide distribution and density of ecological resources. We used resources directly related to ecological functions (host plants, nectar sources, shelter, microclimate) rather than environmental surrogate variables. We obtained models that performed well with few resource variables. All models were transferable--although to different degrees--among the independent areas within the same broad geographical region. We argue that habitat models based on essential functional resources could transfer better in space than models that use indirect environmental variables. Because functional variables can easily be interpreted and even be directly affected by terrain managers, these models can be useful tools to guide species-adapted reserve management.

  1. The basis function approach for modeling autocorrelation in ecological data.

    PubMed

    Hefley, Trevor J; Broms, Kristin M; Brost, Brian M; Buderman, Frances E; Kay, Shannon L; Scharf, Henry R; Tipton, John R; Williams, Perry J; Hooten, Mevin B

    2017-03-01

    Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data. © 2016 by the Ecological Society of America.

  2. Optimizing the general linear model for functional near-infrared spectroscopy: an adaptive hemodynamic response function approach

    PubMed Central

    Uga, Minako; Dan, Ippeita; Sano, Toshifumi; Dan, Haruka; Watanabe, Eiju

    2014-01-01

    An increasing number of functional near-infrared spectroscopy (fNIRS) studies utilize a general linear model (GLM) approach, which serves as a standard statistical method for functional magnetic resonance imaging (fMRI) data analysis. While fMRI solely measures the blood oxygen level dependent (BOLD) signal, fNIRS measures the changes of oxy-hemoglobin (oxy-Hb) and deoxy-hemoglobin (deoxy-Hb) signals at a temporal resolution severalfold higher. This suggests the necessity of adjusting the temporal parameters of a GLM for fNIRS signals. Thus, we devised a GLM-based method utilizing an adaptive hemodynamic response function (HRF). We sought the optimum temporal parameters to best explain the observed time series data during verbal fluency and naming tasks. The peak delay of the HRF was systematically changed to achieve the best-fit model for the observed oxy- and deoxy-Hb time series data. The optimized peak delay showed different values for each Hb signal and task. When the optimized peak delays were adopted, the deoxy-Hb data yielded comparable activations with similar statistical power and spatial patterns to oxy-Hb data. The adaptive HRF method could suitably explain the behaviors of both Hb parameters during tasks with the different cognitive loads during a time course, and thus would serve as an objective method to fully utilize the temporal structures of all fNIRS data. PMID:26157973
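
    The adaptive-HRF procedure can be sketched as a scan over candidate peak delays: build a GLM regressor for each delay by convolving the stimulus with an HRF, fit, and keep the delay with the smallest residual. The single-gamma HRF shape, delay grid, and boxcar stimulus below are simplified stand-ins for the paper's actual model and tasks.

```python
import numpy as np

def hrf(t, peak):
    """Single-gamma hemodynamic response with adjustable peak delay
    (a simplified stand-in for the adaptive HRF); peaks at t = peak."""
    shape = 6.0
    h = (t / peak) ** shape * np.exp(shape * (1 - t / peak))
    h[t < 0] = 0.0
    return h

def best_peak_delay(t, stimulus, signal, candidates):
    """Scan candidate peak delays; return the one whose GLM regressor
    (stimulus convolved with the HRF) best fits the measured signal."""
    best, best_err = None, np.inf
    dt = t[1] - t[0]
    for peak in candidates:
        reg = np.convolve(stimulus, hrf(t, peak))[: len(t)] * dt
        X = np.column_stack([np.ones_like(t), reg])  # intercept + regressor
        beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
        err = np.sum((signal - X @ beta) ** 2)
        if err < best_err:
            best, best_err = peak, err
    return best
```

    Running the scan separately on oxy-Hb and deoxy-Hb time series is what lets each chromophore settle on its own optimal delay, as the study reports.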

  3. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of the availability of the operational modes provided to astronauts by the ensemble of surface elements in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
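The effect of crediting functional diversity can be illustrated with a two-element example; the availability numbers below are purely hypothetical:

```python
# Two independent elements (say, a rover and a habitat) both provide a
# "pressurized volume" function; availabilities are purely illustrative.
p_rover = 0.90   # probability the rover-provided function is up
p_hab = 0.95     # probability the habitat-provided function is up

# System-centric accounting credits only one provider.
avail_single = p_hab

# Function-centric accounting: the function is lost only if every
# independent provider fails at once.
avail_diverse = 1.0 - (1.0 - p_rover) * (1.0 - p_hab)

print(round(avail_single, 3), round(avail_diverse, 3))
```

Even with modest per-element availabilities, tracking the function rather than the system turns a 5% outage probability into 0.5%, which is why ignoring diverse backup understates operational-mode availability.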

  4. A systemic approach for modeling soil functions

    NASA Astrophysics Data System (ADS)

    Vogel, Hans-Jörg; Bartke, Stephan; Daedlow, Katrin; Helming, Katharina; Kögel-Knabner, Ingrid; Lang, Birgit; Rabot, Eva; Russell, David; Stößel, Bastian; Weller, Ulrich; Wiesmeier, Martin; Wollschläger, Ute

    2018-03-01

The central importance of soil for the functioning of terrestrial systems is increasingly recognized. Critically relevant for water quality, climate control, nutrient cycling and biodiversity, soil provides more functions than just the basis for agricultural production. Nowadays, soil is increasingly under pressure as a limited resource for the production of food, energy and raw materials. This has led to an increasing demand for concepts assessing soil functions so that they can be adequately considered in decision-making aimed at sustainable soil management. The various soil science disciplines have progressively developed highly sophisticated methods to explore the multitude of physical, chemical and biological processes in soil. It is not obvious, however, how the steadily improving insight into soil processes may contribute to the evaluation of soil functions. Here, we present a new systemic modeling framework that allows for a consistent coupling of reductionist yet observable indicators for soil functions with detailed process understanding. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. The non-linear character of these interactions produces stability and resilience of soil with respect to functional characteristics. We anticipate that this new conceptual framework will integrate the various soil science disciplines and help identify important future research questions at the interface between disciplines. It allows the overwhelming complexity of soil systems to be adequately coped with and paves the way for steadily improving our capability to assess soil functions based on scientific understanding.

  5. A conditional Granger causality model approach for group analysis in functional MRI

    PubMed Central

    Zhou, Zhenyu; Wang, Xunheng; Klahr, Nelson J.; Liu, Wei; Arias, Diana; Liu, Hongzhi; von Deneen, Karen M.; Wen, Ying; Lu, Zuhong; Xu, Dongrong; Liu, Yijun

    2011-01-01

The Granger causality model (GCM), derived from multivariate vector autoregressive models, has been employed to identify effective connectivity in the human brain with functional MR imaging (fMRI) and to reveal complex temporal and spatial dynamics underlying a variety of cognitive processes. In recent fMRI effective connectivity studies, pairwise GCM has commonly been applied to single-voxel values or average values from specific brain areas at the group level. Although a few conditional GCM methods have been proposed to quantify the connections between brain areas, our study is the first to propose a viable standardized approach for group analysis of fMRI data with GCM. To compare the effectiveness of our approach with traditional pairwise GCM models, we applied a well-established conditional GCM to pre-selected time series of brain regions resulting from a general linear model (GLM) and group spatial kernel independent component analysis (ICA) of an fMRI dataset in the temporal domain. Datasets consisting of one task-related and one resting-state fMRI experiment were used to investigate connections among brain areas with the conditional GCM method. With the GLM-detected brain activation regions in the emotion-related cortex during the block-design paradigm, the conditional GCM method was used to study the causality of habituation between the left amygdala and pregenual cingulate cortex during emotion processing. For the resting-state dataset, it is possible to calculate not only the effective connectivity between networks but also the heterogeneity within a single network. Our results further show a particular interacting pattern of the default mode network (DMN) that can be characterized as both afferent and efferent influences on the medial prefrontal cortex (mPFC) and posterior cingulate cortex (PCC).
These results suggest that the conditional GCM approach based on a linear multivariate vector autoregressive (MVAR) model can achieve
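For orientation, a Granger test can be sketched as a residual-variance comparison between restricted and full autoregressive models. The sketch below is the simple bivariate (pairwise) case with one lag on synthetic data; the study's conditional multivariate model extends this by conditioning on further regions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for i in range(1, n):
    x[i] = 0.6 * x[i - 1] + rng.standard_normal()
    y[i] = 0.5 * y[i - 1] + 0.4 * x[i - 1] + rng.standard_normal()
# By construction x Granger-causes y, but not the other way around.

def gc_gain(target, source):
    # Relative drop in residual sum of squares when the source's lag is
    # added to the target's own-lag autoregression (lag order 1).
    T = target[1:]
    own = target[:-1][:, None]
    both = np.column_stack([own, source[:-1]])
    def fit_rss(Z):
        Z1 = np.column_stack([np.ones(len(Z)), Z])
        beta, *_ = np.linalg.lstsq(Z1, T, rcond=None)
        return float(np.sum((T - Z1 @ beta) ** 2))
    return (fit_rss(own) - fit_rss(both)) / fit_rss(own)

gain_x_to_y = gc_gain(y, x)
gain_y_to_x = gc_gain(x, y)
print(gain_x_to_y > gain_y_to_x)
```

The asymmetry in the two gains is what a Granger analysis reports as directed (effective) connectivity; a formal version would turn the RSS drop into an F statistic.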

  6. Evaluating the Functionality of Conceptual Models

    NASA Astrophysics Data System (ADS)

    Mehmood, Kashif; Cherfi, Samira Si-Said

Conceptual models serve as the blueprints of information systems, and their quality plays a decisive role in the success of the end system. It has been observed that the majority of IS change requests result from deficient functionalities in the information systems. Therefore, a good analysis and design method should ensure that conceptual models are functionally correct and complete, as they are the communicating mediator between the users and the development team. A conceptual model is said to be functionally complete if it represents all the relevant features of the application domain and covers all the specified requirements. Our approach evaluates the functional aspects on multiple levels of granularity, in addition to providing corrective actions or transformations for improvement. The approach has been empirically validated by practitioners through a survey.

  7. Functional Enzyme-Based Approach for Linking Microbial Community Functions with Biogeochemical Process Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Minjing; Qian, Wei-jun; Gao, Yuqian

The kinetics of biogeochemical processes in natural and engineered environmental systems are typically described using Monod-type or modified Monod-type models. These models rely on biomass as a surrogate for the functional enzymes in a microbial community that catalyze biogeochemical reactions. A major challenge in applying such models is the difficulty of quantitatively measuring functional biomass to constrain and validate them. On the other hand, omics-based approaches have been increasingly used to characterize microbial community structure, functions, and metabolites. Here we propose an enzyme-based model that can incorporate omics data to link microbial community functions with biogeochemical process kinetics. The model treats enzymes as time-variable catalysts for biogeochemical reactions and applies a biogeochemical reaction network to incorporate intermediate metabolites. The sequences of genes and proteins from metagenomes, as well as those from the UniProt database, were used for targeted enzyme quantification and to provide insights into the dynamic linkage among functional genes, enzymes, and metabolites that need to be incorporated in the model. The application of the model is demonstrated using denitrification as an example, by comparing model-simulated with measured functional enzymes, genes, and denitrification substrates and intermediates.
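A toy version of treating enzymes, rather than bulk biomass, as the time-variable catalyst might look like the following; the rate constants and the enzyme ramp-up curve are invented for illustration and are not the study's calibrated values.

```python
import math

# Substrate consumption catalysed by a time-variable enzyme pool E(t)
# instead of bulk biomass; all parameter values are invented.
k_cat = 2.0   # maximum specific rate per unit enzyme (1/h)
K_s = 5.0     # half-saturation constant (mmol/L)

S = 20.0      # initial substrate (mmol/L)
dt = 0.01     # time step (h)
for i in range(int(48.0 / dt)):            # 48 h of simulation
    t = i * dt
    E = 1.0 - math.exp(-t / 6.0)           # enzyme pool ramps up over ~6 h
    S = max(S - k_cat * E * S / (K_s + S) * dt, 0.0)

print(round(S, 3))
```

Replacing the biomass term of a Monod model with a measured (e.g. proteomics-derived) enzyme trajectory E(t) is the key substitution the abstract describes; a reaction network would chain several such rate laws through shared intermediates.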

  8. Assessment of Safety and Functional Efficacy of Stem Cell-Based Therapeutic Approaches Using Retinal Degenerative Animal Models

    PubMed Central

    Lin, Tai-Chi; Zhu, Danhong; Hinton, David R.; Clegg, Dennis O.; Humayun, Mark S.

    2017-01-01

Dysfunction and death of retinal pigment epithelium (RPE) and/or photoreceptors can lead to irreversible vision loss. The eye represents an ideal microenvironment for stem cell-based therapy. It is considered an “immune privileged” site, and the number of cells needed for therapy is relatively low for the area of focused vision (macula). Further, surgical placement of stem cell-derived grafts (RPE, retinal progenitors, and photoreceptor precursors) into the vitreous cavity or subretinal space has been well established. For preclinical tests, assessments of stem cell-derived graft survival and functionality are conducted in animal models by various noninvasive approaches and imaging modalities. In vivo experiments conducted in animal models based on replacing photoreceptors and/or RPE cells have shown survival and functionality of the transplanted cells, rescue of the host retina, and improvement of visual function. Based on the positive results obtained from these animal experiments, human clinical trials are being initiated. Despite such progress in stem cell research, ethical, regulatory, safety, and technical difficulties still remain a challenge for the transformation of this technique into a standard clinical approach. In this review, the current status of preclinical safety and efficacy studies for retinal cell replacement therapies conducted in animal models will be discussed. PMID:28928775

  9. Modelling of Sub-daily Hydrological Processes Using Daily Time-Step Models: A Distribution Function Approach to Temporal Scaling

    NASA Astrophysics Data System (ADS)

    Kandel, D. D.; Western, A. W.; Grayson, R. B.

    2004-12-01

    Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and
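The core nonlinearity argument can be seen with a back-of-the-envelope calculation, assuming (purely for illustration) an exponential within-day intensity distribution and a fixed infiltration capacity; the paper's actual cdf and process descriptions are richer than this.

```python
import math

# One hypothetical day with 24 mm of rain and an (assumed) exponential
# within-day intensity distribution; infiltration capacity k is fixed.
daily_total = 24.0                      # mm of rain in the day
mean_intensity = daily_total / 24.0     # 1 mm/h if spread uniformly
k = 2.0                                 # infiltration capacity (mm/h)

# Daily time-step model with uniform intensity: the mean never exceeds
# capacity, so it predicts zero infiltration-excess runoff.
runoff_lumped = max(mean_intensity - k, 0.0) * 24.0

# Distribution-function approach: expected excess over capacity for an
# exponential intensity with mean m is  E[max(i - k, 0)] = m * exp(-k/m).
runoff_cdf = mean_intensity * math.exp(-k / mean_intensity) * 24.0

print(runoff_lumped, round(runoff_cdf, 2))
```

Because runoff generation is a thresholded, nonlinear function of intensity, averaging intensity over the day (the lumped model) predicts no runoff at all, while integrating the same daily total over an intensity distribution recovers several millimetres; this is exactly the effect the distribution-function scaling is designed to capture.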

  10. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  12. From data to function: functional modeling of poultry genomics data.

    PubMed

    McCarthy, F M; Lyons, E

    2013-09-01

    One of the challenges of functional genomics is to create a better understanding of the biological system being studied so that the data produced are leveraged to provide gains for agriculture, human health, and the environment. Functional modeling enables researchers to make sense of these data as it reframes a long list of genes or gene products (mRNA, ncRNA, and proteins) by grouping based upon function, be it individual molecular functions or interactions between these molecules or broader biological processes, including metabolic and signaling pathways. However, poultry researchers have been hampered by a lack of functional annotation data, tools, and training to use these data and tools. Moreover, this lack is becoming more critical as new sequencing technologies enable us to generate data not only for an increasingly diverse range of species but also individual genomes and populations of individuals. We discuss the impact of these new sequencing technologies on poultry research, with a specific focus on what functional modeling resources are available for poultry researchers. We also describe key strategies for researchers who wish to functionally model their own data, providing background information about functional modeling approaches, the data and tools to support these approaches, and the strengths and limitations of each. Specifically, we describe methods for functional analysis using Gene Ontology (GO) functional summaries, functional enrichment analysis, and pathways and network modeling. As annotation efforts begin to provide the fundamental data that underpin poultry functional modeling (such as improved gene identification, standardized gene nomenclature, temporal and spatial expression data and gene product function), tool developers are incorporating these data into new and existing tools that are used for functional modeling, and cyberinfrastructure is being developed to provide the necessary extendibility and scalability for storing and

  13. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·, ·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus can be viewed as the natural functional extension of generalized additive models. We estimate F(·, ·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example, the response is disease status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.

  14. Functional renormalization group approach to SU(N) Heisenberg models: Real-space renormalization group at arbitrary N

    NASA Astrophysics Data System (ADS)

    Buessen, Finn Lasse; Roscher, Dietrich; Diehl, Sebastian; Trebst, Simon

    2018-02-01

The pseudofermion functional renormalization group (pf-FRG) is one of the few numerical approaches that has been demonstrated to quantitatively determine the ordering tendencies of frustrated quantum magnets in two and three spatial dimensions. The approach, however, relies on a number of presumptions and approximations, in particular the choice of pseudofermion decomposition and the truncation of an infinite number of flow equations to a finite set. Here we generalize the pf-FRG approach to SU(N) spin systems with arbitrary N and demonstrate that the scheme becomes exact in the large-N limit. Numerically solving the generalized real-space renormalization group equations for arbitrary N, we can make a stringent connection between the physically most significant case of SU(2) spins and more accessible SU(N) models. In a case study of the square-lattice SU(N) Heisenberg antiferromagnet, we explicitly demonstrate that the generalized pf-FRG approach is capable of identifying the instability indicating the transition into a staggered flux spin liquid ground state in these models for large, but finite, values of N. In a companion paper [Roscher et al., Phys. Rev. B 97, 064416 (2018), 10.1103/PhysRevB.97.064416] we formulate a momentum-space pf-FRG approach for SU(N) spin models that allows us to explicitly study the large-N limit and access the low-temperature spin liquid phase.

  15. Electromagnetic scaling functions within the Green's function Monte Carlo approach

    DOE PAGES

    Rocco, N.; Alvarez-Ruso, L.; Lovato, A.; ...

    2017-07-24

We have studied the scaling properties of the electromagnetic response functions of 4He and 12C nuclei computed by the Green's function Monte Carlo approach, retaining only the one-body current contribution. Longitudinal and transverse scaling functions have been obtained in the relativistic and nonrelativistic cases and compared to experiment for various kinematics. The characteristic asymmetric shape of the scaling function exhibited by data emerges in the calculations in spite of the nonrelativistic nature of the model. The results are mostly consistent with scaling of zeroth, first, and second kinds. Our analysis reveals a direct correspondence between the scaling and the nucleon-density response functions. In conclusion, the scaling function obtained from the proton-density response displays scaling of the first kind, even more evidently than the longitudinal and transverse scaling functions.

  17. A multi-frequency receiver function inversion approach for crustal velocity structure

    NASA Astrophysics Data System (ADS)

    Li, Xuelei; Li, Zhiwei; Hao, Tianyao; Wang, Sheng; Xing, Jian

    2017-05-01

In order to constrain crustal velocity structures better, we developed a new nonlinear inversion approach based on multi-frequency receiver function waveforms. With the global optimization algorithm of Differential Evolution (DE), low-frequency receiver function waveforms primarily constrain large-scale velocity structures, while high-frequency waveforms have the advantage of recovering small-scale velocity structures. Synthetic tests with multi-frequency receiver function waveforms show that the proposed approach can constrain both the long- and short-wavelength characteristics of crustal velocity structures simultaneously. Inversions with real data were also conducted for seismic station KMNB in southeast China and station HYB on the Indian continent, where crustal structures have been well studied by previous researchers. Comparisons of the velocity models inverted in previous studies and in ours show good consistency, but our approach achieves better waveform fits with fewer model parameters. Comprehensive tests with synthetic and real data suggest that the proposed multi-frequency receiver function inversion approach is effective and robust for inverting crustal velocity structures.
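To give a flavor of the DE machinery, here is a minimal rand/1/bin differential evolution loop applied to a toy two-parameter misfit; the real method inverts multi-frequency receiver function waveforms, which this quadratic stand-in does not attempt to model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy two-parameter "model" (say, velocities of two crustal layers); the
# quadratic misfit stands in for a real receiver-function waveform misfit.
true_model = np.array([3.2, 4.0])
def misfit(m):
    return float(np.sum((m - true_model) ** 2))

bounds = np.array([[2.0, 5.0], [2.0, 5.0]])   # search range per parameter
NP, F, CR = 20, 0.7, 0.9                      # population, weight, crossover

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(NP, 2))
cost = np.array([misfit(m) for m in pop])

for _ in range(100):                          # generations
    for i in range(NP):
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), bounds[:, 0], bounds[:, 1])
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True         # guarantee one mutated gene
        trial = np.where(cross, mutant, pop[i])
        if misfit(trial) <= cost[i]:
            pop[i], cost[i] = trial, misfit(trial)

best = pop[int(np.argmin(cost))]
print(np.round(best, 2))
```

In the multi-frequency setting the misfit would sum waveform residuals over several filter bands, so the low-frequency terms steer the large-scale structure while the high-frequency terms refine the small-scale detail.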

  18. Introducing linear functions: an alternative statistical approach

    NASA Astrophysics Data System (ADS)

    Nolan, Caroline; Herbert, Sandra

    2015-12-01

    The introduction of linear functions is the turning point where many students decide if mathematics is useful or not. This means the role of parameters and variables in linear functions could be considered to be `threshold concepts'. There is recognition that linear functions can be taught in context through the exploration of linear modelling examples, but this has its limitations. Currently, statistical data is easily attainable, and graphics or computer algebra system (CAS) calculators are common in many classrooms. The use of this technology provides ease of access to different representations of linear functions as well as the ability to fit a least-squares line for real-life data. This means these calculators could support a possible alternative approach to the introduction of linear functions. This study compares the results of an end-of-topic test for two classes of Australian middle secondary students at a regional school to determine if such an alternative approach is feasible. In this study, test questions were grouped by concept and subjected to concept by concept analysis of the means of test results of the two classes. This analysis revealed that the students following the alternative approach demonstrated greater competence with non-standard questions.

  19. An overview of the recent approaches for terroir functional modelling, footprinting and zoning

    NASA Astrophysics Data System (ADS)

    Vaudour, E.; Costantini, E.; Jones, G. V.; Mocali, S.

    2014-11-01

Notions of terroir and their conceptualization through agri-environmental sciences have become popular in many parts of the world. Originally developed for wine, terroir now encompasses many other products including fruits, vegetables, cheese, olive oil, coffee and cacao, linking the uniqueness and quality of both beverages and foods to the environment where they are produced, giving the consumer a sense of place. Climate, geology, geomorphology, and soil are the main environmental factors which compose the terroir effect at different scales. Often considered immutable at the cultural scale, the natural components of terroir are actually a set of processes, which together create a delicate equilibrium and regulation of its effect on products in both space and time. Due to both a greater need to better understand regional to site variations in crop production and the growth in spatial analytic technologies, the study of terroir has shifted from a largely descriptive regional science to a more applied, technical research field. Furthermore, the explosion of spatial data availability and sensing technologies has made the within-field scale of study more valuable to the individual grower. The result has been greater adoption but also issues associated with both the spatial and temporal scales required for practical applications, as well as the relevant approaches for data synthesis. Moreover, as soil microbial communities are known to be of vital importance for terrestrial processes, driving the major soil geochemical cycles and supporting healthy plant growth, an intensive investigation of their organization and function is also required. Our objective is to present an overview of existing data and modelling approaches for terroir functional modelling, footprinting and zoning at local and regional scales. This review will focus on three main areas of recent terroir research: (1) quantifying the influences of terroir components on plant growth

  20. Model-free estimation of the psychometric function

    PubMed Central

    Żychaluk, Kamila; Foster, David H.

    2009-01-01

    A subject's response to the strength of a stimulus is described by the psychometric function, from which summary measures, such as a threshold or slope, may be derived. Traditionally, this function is estimated by fitting a parametric model to the experimental data, usually the proportion of successful trials at each stimulus level. Common models include the Gaussian and Weibull cumulative distribution functions. This approach works well if the model is correct, but it can mislead if not. In practice, the correct model is rarely known. Here, a nonparametric approach based on local linear fitting is advocated. No assumption is made about the true model underlying the data, except that the function is smooth. The critical role of the bandwidth is identified, and its optimum value estimated by a cross-validation procedure. As a demonstration, seven vision and hearing data sets were fitted by the local linear method and by several parametric models. The local linear method frequently performed better and never worse than the parametric ones. Supplemental materials for this article can be downloaded from app.psychonomic-journals.org/content/supplemental. PMID:19633355
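A bare-bones version of the local linear idea, with leave-one-out cross-validation over a small bandwidth grid; the data are synthetic and the paper's actual estimator and bandwidth selector are more refined than this sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic psychophysics data: true psychometric function is a logistic.
levels = np.linspace(-2.0, 2.0, 9)          # stimulus levels
n_trials = 40
p_true = 1.0 / (1.0 + np.exp(-2.0 * levels))
prop = rng.binomial(n_trials, p_true) / n_trials  # observed proportions

def local_linear(x0, x, y, h):
    # Gaussian-kernel weighted least-squares line around x0,
    # evaluated at x0 (its intercept).
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

def cv_score(h):
    # Leave-one-out cross-validation over stimulus levels.
    err = 0.0
    for i in range(levels.size):
        keep = np.arange(levels.size) != i
        err += (prop[i] - local_linear(levels[i], levels[keep], prop[keep], h)) ** 2
    return err

bandwidths = [0.3, 0.5, 0.8, 1.2, 2.0]
h_best = min(bandwidths, key=cv_score)
fit50 = local_linear(0.0, levels, prop, h_best)  # estimate near threshold
print(h_best, round(fit50, 2))
```

No parametric form (Gaussian, Weibull, logistic) is assumed anywhere; only smoothness enters, through the bandwidth that cross-validation selects.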

  1. Three-dimensional ionospheric tomography reconstruction using the model function approach in Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Wang, Sicheng; Huang, Sixun; Xiang, Jie; Fang, Hanxian; Feng, Jian; Wang, Yu

    2016-12-01

Ionospheric tomography is based on observed slant total electron content (sTEC) along different satellite-receiver rays to reconstruct three-dimensional electron density distributions. Because the satellite-receiver geometry provides incomplete measurements, it is a typical ill-posed problem, and overcoming this ill-posedness remains a central research problem. In this paper, the Tikhonov regularization method is used and the model function approach is applied to determine the optimal regularization parameter. This algorithm not only balances the weights between sTEC observations and the background electron density field but also converges globally and rapidly. The background error covariance is given by multiplying the background model variance and a location-dependent spatial correlation, and the correlation model is developed using sample statistics from an ensemble of International Reference Ionosphere 2012 (IRI2012) model outputs. Global Navigation Satellite System (GNSS) observations in China are used to present the reconstruction results, and measurements from two ionosondes are used for independent validation. Both the test cases using artificial sTEC observations and actual GNSS sTEC measurements show that the regularization method can effectively improve the background model outputs.

  2. Towards aspect-oriented functional--structural plant modelling.

    PubMed

    Cieslak, Mikolaj; Seleznyova, Alla N; Prusinkiewicz, Przemyslaw; Hanan, Jim

    2011-10-01

    Functional-structural plant models (FSPMs) are used to integrate knowledge and test hypotheses of plant behaviour, and to aid in the development of decision support systems. A significant amount of effort is being put into providing a sound methodology for building them. Standard techniques, such as procedural or object-oriented programming, are not suited for clearly separating aspects of plant function that criss-cross between different components of plant structure, which makes it difficult to reuse and share their implementations. The aim of this paper is to present an aspect-oriented programming approach that helps to overcome this difficulty. The L-system-based plant modelling language L+C was used to develop an aspect-oriented approach to plant modelling based on multi-modules. Each element of the plant structure was represented by a sequence of L-system modules (rather than a single module), with each module representing an aspect of the element's function. Separate sets of productions were used for modelling each aspect, with context-sensitive rules facilitated by local lists of modules to consider/ignore. Aspect weaving or communication between aspects was made possible through the use of pseudo-L-systems, where the strict-predecessor of a production rule was specified as a multi-module. The new approach was used to integrate previously modelled aspects of carbon dynamics, apical dominance and biomechanics with a model of a developing kiwifruit shoot. These aspects were specified independently and their implementation was based on source code provided by the original authors without major changes. This new aspect-oriented approach to plant modelling is well suited for studying complex phenomena in plant science, because it can be used to integrate separate models of individual aspects of plant development and function, both previously constructed and new, into clearly organized, comprehensive FSPMs. In a future work, this approach could be further

  3. Modeling solvation effects in real-space and real-time within density functional approaches

    NASA Astrophysics Data System (ADS)

    Delgado, Alain; Corni, Stefano; Pittalis, Stefano; Rozzi, Carlo Andrea

    2015-10-01

    The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the Octopus code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.
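As a rough illustration of the regularization idea (a minimal sketch, not the authors' implementation; the charge value and Gaussian width below are arbitrary, in atomic units), spreading a point charge into a spherical Gaussian replaces the 1/r singularity with a potential that is finite at the origin:

```python
import math

def point_potential(q, r):
    """Bare Coulomb potential of a point charge (atomic units); singular at r = 0."""
    return q / r

def gaussian_potential(q, r, sigma):
    """Potential of charge q distributed as a spherical Gaussian of width sigma:
    V(r) = q * erf(r / (sqrt(2) * sigma)) / r, which is finite at the origin,
    where V(0) = q * sqrt(2 / pi) / sigma."""
    if r == 0.0:
        return q * math.sqrt(2.0 / math.pi) / sigma
    return q * math.erf(r / (math.sqrt(2.0) * sigma)) / r
```

Far from the charge the two potentials coincide (erf saturates to 1), so the regularization only alters grid points near the cavity surface.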

  4. Modeling solvation effects in real-space and real-time within density functional approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgado, Alain; Centro de Aplicaciones Tecnológicas y Desarrollo Nuclear, Calle 30 # 502, 11300 La Habana; Corni, Stefano

    2015-10-14

    The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the OCTOPUS code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.

  5. Connectotyping: Model Based Fingerprinting of the Functional Connectome

    PubMed Central

    Miranda-Dominguez, Oscar; Mills, Brian D.; Carpenter, Samuel D.; Grant, Kathleen A.; Kroenke, Christopher D.; Nigg, Joel T.; Fair, Damien A.

    2014-01-01

    A better characterization of how an individual’s brain is functionally organized will likely bring dramatic advances to many fields of study. Here we show a model-based approach toward characterizing resting state functional connectivity MRI (rs-fcMRI) that is capable of identifying a so-called “connectotype”, or functional fingerprint in individual participants. The approach rests on a simple linear model that proposes that the activity of a given brain region can be described by the weighted sum of its functional neighboring regions. The resulting coefficients correspond to a personalized model-based connectivity matrix that is capable of predicting the timeseries of each subject. Importantly, the model itself is subject specific and has the ability to predict an individual at a later date using a limited number of non-sequential frames. While we show that there is a significant amount of shared variance between models across subjects, the model’s ability to discriminate an individual is driven by unique connections in higher order control regions in frontal and parietal cortices. Furthermore, we show that the connectotype is present in non-human primates as well, highlighting the translational potential of the approach. PMID:25386919
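In spirit (a hypothetical sketch on synthetic data, not the published pipeline; region count, timepoints, and the random timeseries are placeholders), the per-region linear model can be fit by ordinary least squares, one region at a time:

```python
import numpy as np

rng = np.random.default_rng(0)
T, R = 200, 5                      # timepoints, regions (arbitrary sizes)
X = rng.standard_normal((T, R))    # synthetic stand-in for rs-fcMRI timeseries

# For each region i, fit the weights that predict its timeseries from all
# other regions (ordinary least squares); the diagonal is fixed at zero so
# a region never predicts itself.
B = np.zeros((R, R))
for i in range(R):
    others = [j for j in range(R) if j != i]
    w, *_ = np.linalg.lstsq(X[:, others], X[:, i], rcond=None)
    B[i, others] = w

X_hat = X @ B.T                    # model-predicted timeseries
```

The rows of `B` form the personalized connectivity matrix; comparing such matrices across sessions is what permits the fingerprint-style identification described above.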

  6. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    Functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge regression shrinkage type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for a fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection in a mixed model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different smoothness for different functional coefficients, enabled by assigning different penalties λ accordingly. We demonstrate the proposed approach by both simulation examples and a real data application.
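A minimal illustration of the ridge-shrinkage character of penalized splines (using a truncated power basis for brevity; the paper's B-spline basis, difference penalties, and λ-selection procedures are not reproduced here):

```python
import numpy as np

def pspline_fit(x, y, knots, lam, degree=3):
    """Penalized-spline smoother with a truncated power basis: polynomial
    terms plus (x - k)_+^degree per knot; only the knot coefficients are
    ridge-penalized, giving the explicit solution
    beta = (X'X + lam * P)^{-1} X'y."""
    X = np.column_stack(
        [x ** d for d in range(degree + 1)] +
        [np.clip(x - k, 0.0, None) ** degree for k in knots])
    P = np.zeros(X.shape[1])
    P[degree + 1:] = 1.0                      # penalize knot terms only
    beta = np.linalg.solve(X.T @ X + lam * np.diag(P), X.T @ y)
    return X @ beta, beta

x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)                     # noiseless test signal
knots = np.linspace(0.1, 0.9, 9)
smooth, _ = pspline_fit(x, y, knots, lam=1e-6)   # near-interpolating fit
rough, _ = pspline_fit(x, y, knots, lam=1e6)     # shrunk toward a cubic fit
```

The closed-form solution is what the abstract means by an "explicit model expression": for a fixed λ the estimator is linear in y, so exact finite-sample inference follows directly.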

  7. Detection of Differential Item Functioning Using the Lasso Approach

    ERIC Educational Resources Information Center

    Magis, David; Tuerlinckx, Francis; De Boeck, Paul

    2015-01-01

    This article proposes a novel approach to detect differential item functioning (DIF) among dichotomously scored items. Unlike standard DIF methods that perform an item-by-item analysis, we propose the "LR lasso DIF method": a logistic regression (LR) model is formulated for all item responses. The model contains item-specific intercepts,…
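The flavor of the method can be sketched (hypothetically, on simulated data for a single item; the actual LR lasso DIF model covers all items jointly with item-specific intercepts) as logistic regression with an L1 penalty restricted to the group-interaction term, whose nonzero estimate would flag DIF:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lasso_logistic(X, y, penalized, lam, lr=0.1, iters=3000):
    """Logistic regression fit by proximal gradient descent (ISTA), with the
    L1 (lasso) penalty applied only to the coefficients flagged in
    `penalized` -- here, the group effect whose nonzero estimate signals DIF."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        g = X.T @ (sigmoid(X @ w) - y) / len(y)
        w = w - lr * g
        m = penalized
        w[m] = np.sign(w[m]) * np.maximum(np.abs(w[m]) - lr * lam, 0.0)
    return w

# Hypothetical single-item demo: logit = b0 + b1*ability + d*group, where a
# nonzero d means the item functions differently across groups.
rng = np.random.default_rng(1)
n = 4000
ability = rng.standard_normal(n)
group = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), ability, group])
mask = np.array([False, False, True])

y_dif = (rng.random(n) < sigmoid(0.2 + 1.0 * ability + 1.5 * group)).astype(float)
y_null = (rng.random(n) < sigmoid(0.2 + 1.0 * ability)).astype(float)

w_dif = lasso_logistic(X, y_dif, mask, lam=0.05)
w_null = lasso_logistic(X, y_null, mask, lam=0.05)
```

The soft-thresholding step drives the penalized coefficient exactly to zero when the data carry no DIF signal, which is what makes the lasso attractive for DIF screening.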

  8. An overview of the recent approaches for terroir functional modelling, footprinting and zoning

    NASA Astrophysics Data System (ADS)

    Costantini, Edoardo; Emmanuelle, Vaudour; Jones, Gregory; Mocali, Stefano

    2014-05-01

    Notions of terroir and their conceptualization through agri-environmental sciences have become popular in many parts of the world. Originally developed for wine, terroir is now investigated for fruits, vegetables, cheese, olive oil, coffee, cacao and other crops, linking the uniqueness and quality of both beverages and foods to the environment where they are produced, giving the consumer a sense of place. Climate, geology, geomorphology, and soil are the main environmental factors which compose the terroir effect at different scales. Often considered immutable at the cultural scale, the natural components of terroir are actually a set of processes, which together create a delicate equilibrium and regulation of its effect on products in both space and time. Due to both a greater need to better understand regional to site variations in crop production and the growth in spatial analytic technologies, the study of terroir has shifted from a largely descriptive regional science to a more applied, technical research field. Furthermore, the explosion of spatial data availability and elaboration technologies has made the scale of study more valuable to the individual grower, resulting in greater adoption and application. Moreover, as soil microbial communities are known to be of vital importance for terrestrial processes by driving the major soil geochemical cycles and supporting healthy plant growth, an intensive investigation of the microbial organization and its function is also required. Our objective is to present an overview of existing data and modeling approaches for terroir functional modeling, footprinting and zoning at local and regional scales. This review will focus on four main areas of recent terroir research: 1) quantifying the influences of terroir components on plant growth, fruit composition and quality, mostly examining climate-soil-water relationships; 2) the metagenomic approach as a new tool to unravel the biogeochemical cycles of both macro- and

  9. Thermal form-factor approach to dynamical correlation functions of integrable lattice models

    NASA Astrophysics Data System (ADS)

    Göhmann, Frank; Karbach, Michael; Klümper, Andreas; Kozlowski, Karol K.; Suzuki, Junji

    2017-11-01

    We propose a method for calculating dynamical correlation functions at finite temperature in integrable lattice models of Yang-Baxter type. The method is based on an expansion of the correlation functions as a series over matrix elements of a time-dependent quantum transfer matrix rather than the Hamiltonian. In the infinite Trotter-number limit the matrix elements become time independent and turn into the thermal form factors studied previously in the context of static correlation functions. We make this explicit with the example of the XXZ model. We show how the form factors can be summed utilizing certain auxiliary functions solving finite sets of nonlinear integral equations. The case of the XX model is worked out in more detail leading to a novel form-factor series representation of the dynamical transverse two-point function.

  10. Modeling and multi-response optimization of pervaporation of organic aqueous solutions using desirability function approach.

    PubMed

    Cojocaru, C; Khayet, M; Zakrzewska-Trznadel, G; Jaworska, A

    2009-08-15

    The factorial design of experiments and the desirability function approach have been applied for multi-response optimization of the pervaporation separation process. Two organic aqueous solutions were considered as model mixtures, water/acetonitrile and water/ethanol mixtures. Two responses have been employed in the multi-response optimization of pervaporation, total permeate flux and organic selectivity. The effects of three experimental factors (feed temperature, initial concentration of organic compound in feed solution, and downstream pressure) on the pervaporation responses have been investigated. The experiments were performed according to a 2^3 full factorial experimental design. The factorial models have been obtained from the experimental design and validated statistically by analysis of variance (ANOVA). The spatial representations of the response functions were drawn together with the corresponding contour line plots. Factorial models have been used to develop the overall desirability function. In addition, the overlap contour plots were presented to identify the desirability zone and to determine the optimum point. The optimal operating conditions were found to be, in the case of the water/acetonitrile mixture, a feed temperature of 55 degrees C, an initial concentration of 6.58% and a downstream pressure of 13.99 kPa, while for the water/ethanol mixture a feed temperature of 55 degrees C, an initial concentration of 4.53% and a downstream pressure of 9.57 kPa. Under such optimum conditions, an improvement of both the total permeate flux and the selectivity was observed experimentally.
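A minimal sketch of the desirability machinery (Derringer-Suich style; the bounds and shape parameter below are placeholders, not the paper's fitted values): each response is rescaled to a [0, 1] desirability, and the overall desirability is their geometric mean, so any fully undesirable response vetoes the operating point.

```python
def desirability_max(y, low, high, s=1.0):
    """Larger-the-better desirability: 0 at or below `low`, 1 at or above
    `high`, and a power ramp of exponent s in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

def overall_desirability(ds):
    """Geometric mean of individual desirabilities; any d = 0 vetoes the point."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))
```

In the study above, the two responses fed into the overall function would be the factorial-model predictions of total permeate flux and organic selectivity, and the optimum is the factor setting maximizing the overall desirability.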

  11. Mathematical Modeling Approaches in Plant Metabolomics.

    PubMed

    Fürtauer, Lisa; Weiszmann, Jakob; Weckwerth, Wolfram; Nägele, Thomas

    2018-01-01

    The experimental analysis of a plant metabolome typically results in a comprehensive and multidimensional data set. To interpret metabolomics data in the context of biochemical regulation and environmental fluctuation, various approaches of mathematical modeling have been developed and have proven useful. In this chapter, a general introduction to mathematical modeling is presented and discussed in context of plant metabolism. A particular focus is laid on the suitability of mathematical approaches to functionally integrate plant metabolomics data in a metabolic network and combine it with other biochemical or physiological parameters.

  12. Cost function approach for estimating derived demand for composite wood products

    Treesearch

    T. C. Marcin

    1991-01-01

    A cost function approach was examined, using the concept of duality between production and input factor demands. A translog cost function was used to represent residential construction costs and derived conditional factor demand equations. Alternative models were derived from the translog cost function by imposing parameter restrictions.
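A generic translog specification (a textbook sketch; the paper's exact variables and restrictions are not reproduced here) writes log cost as a quadratic in log input prices, with the conditional factor demands following from Shephard's lemma as cost shares:

```latex
\ln C = \alpha_0 + \sum_i \alpha_i \ln p_i
      + \tfrac{1}{2}\sum_i \sum_j \gamma_{ij} \ln p_i \ln p_j ,
\qquad
S_i = \frac{\partial \ln C}{\partial \ln p_i}
    = \alpha_i + \sum_j \gamma_{ij} \ln p_j .
```

Alternative nested models of the kind the abstract mentions arise from parameter restrictions such as symmetry ($\gamma_{ij} = \gamma_{ji}$) and linear homogeneity in prices ($\sum_i \alpha_i = 1$, $\sum_j \gamma_{ij} = 0$).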

  13. Learning predictive models that use pattern discovery--a bootstrap evaluative approach applied in organ functioning sequences.

    PubMed

    Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen

    2010-08-01

    An important problem in the Intensive Care Unit is how to predict, on a given day of stay, the eventual hospital mortality of a specific patient. A recent approach to solve this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods of prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care Unit. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition, we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
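The .632 estimator itself is simple to sketch (a generic illustration with placeholder `fit`/`err` callables, not the authors' FTS-based models): it blends the optimistic apparent error on the full sample with the pessimistic average error on observations left out of each bootstrap resample.

```python
import random

def bootstrap_632(data, labels, fit, err, B=100, seed=0):
    """The .632 bootstrap estimate of prediction error:
    err632 = 0.368 * err_apparent + 0.632 * err_out_of_bag."""
    rng = random.Random(seed)
    n = len(data)
    model = fit(data, labels)
    err_app = sum(err(model, x, y) for x, y in zip(data, labels)) / n
    oob_errs = []
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]            # resample with replacement
        m = fit([data[i] for i in idx], [labels[i] for i in idx])
        oob = [i for i in range(n) if i not in set(idx)]      # left-out observations
        if oob:
            oob_errs.append(sum(err(m, data[i], labels[i]) for i in oob) / len(oob))
    return 0.368 * err_app + 0.632 * (sum(oob_errs) / len(oob_errs))

# Demo with a trivial majority-class "model" and 0/1 loss (placeholders).
fit_const = lambda X, y: max(set(y), key=y.count)
zero_one = lambda model, x, y: 0.0 if model == y else 1.0
est = bootstrap_632(list(range(12)), [1] * 12, fit_const, zero_one, B=50)
```

Because the whole induction (here, `fit`) is rerun inside every resample, the procedure evaluates the inductive method rather than any single induced model, which is the methodological point of the paper.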

  14. Exploring Mouse Protein Function via Multiple Approaches.

    PubMed

    Huang, Guohua; Chu, Chen; Huang, Tao; Kong, Xiangyin; Zhang, Yunhua; Zhang, Ning; Cai, Yu-Dong

    2016-01-01

    Although the number of available protein sequences is growing exponentially, functional protein annotations lag far behind. Therefore, accurate identification of protein functions remains one of the major challenges in molecular biology. In this study, we presented a novel approach to predict mouse protein functions. The approach was a sequential combination of a similarity-based approach, an interaction-based approach and a pseudo amino acid composition-based approach. The method achieved an accuracy of about 0.8450 for the 1st-order predictions in the leave-one-out and ten-fold cross-validations. For the results yielded by the leave-one-out cross-validation, although the similarity-based approach alone achieved an accuracy of 0.8756, it was unable to predict the functions of proteins with no homologues. Comparatively, the pseudo amino acid composition-based approach alone reached an accuracy of 0.6786. Although the accuracy was lower than that of the previous approach, it could predict the functions of almost all proteins, even proteins with no homologues. Therefore, the combined method balanced the advantages and disadvantages of both approaches to achieve efficient performance. Furthermore, the results yielded by the ten-fold cross-validation indicate that the combined method is still effective and stable when no close homologues are available. However, the accuracy of the predicted functions can only be determined according to known protein functions based on current knowledge. Many protein functions remain unknown. By exploring the functions of proteins for which the 1st-order predicted functions are wrong but the 2nd-order predicted functions are correct, the 1st-order wrongly predicted functions were shown to be closely associated with the genes encoding the proteins. The so-called wrongly predicted functions could also potentially be correct upon future experimental verification. Therefore, the accuracy of the presented method may be much higher in

  15. Exploring Mouse Protein Function via Multiple Approaches

    PubMed Central

    Huang, Tao; Kong, Xiangyin; Zhang, Yunhua; Zhang, Ning

    2016-01-01

    Although the number of available protein sequences is growing exponentially, functional protein annotations lag far behind. Therefore, accurate identification of protein functions remains one of the major challenges in molecular biology. In this study, we presented a novel approach to predict mouse protein functions. The approach was a sequential combination of a similarity-based approach, an interaction-based approach and a pseudo amino acid composition-based approach. The method achieved an accuracy of about 0.8450 for the 1st-order predictions in the leave-one-out and ten-fold cross-validations. For the results yielded by the leave-one-out cross-validation, although the similarity-based approach alone achieved an accuracy of 0.8756, it was unable to predict the functions of proteins with no homologues. Comparatively, the pseudo amino acid composition-based approach alone reached an accuracy of 0.6786. Although the accuracy was lower than that of the previous approach, it could predict the functions of almost all proteins, even proteins with no homologues. Therefore, the combined method balanced the advantages and disadvantages of both approaches to achieve efficient performance. Furthermore, the results yielded by the ten-fold cross-validation indicate that the combined method is still effective and stable when no close homologues are available. However, the accuracy of the predicted functions can only be determined according to known protein functions based on current knowledge. Many protein functions remain unknown. By exploring the functions of proteins for which the 1st-order predicted functions are wrong but the 2nd-order predicted functions are correct, the 1st-order wrongly predicted functions were shown to be closely associated with the genes encoding the proteins. The so-called wrongly predicted functions could also potentially be correct upon future experimental verification. Therefore, the accuracy of the presented method may be much higher in

  16. A deterministic width function model

    NASA Astrophysics Data System (ADS)

    Puente, C. E.; Sivakumar, B.

    Use of a deterministic fractal-multifractal (FM) geometric method to model width functions of natural river networks, as derived distributions of simple multifractal measures via fractal interpolating functions, is reported. It is first demonstrated that the FM procedure may be used to simulate natural width functions, preserving their most relevant features like their overall shape and texture and their observed power-law scaling on their power spectra. It is then shown, via two natural river networks (Racoon and Brushy creeks in the United States), that the FM approach may also be used to closely approximate existing width functions.

  17. A density functional approach to ferrogels

    NASA Astrophysics Data System (ADS)

    Cremer, P.; Heinen, M.; Menzel, A. M.; Löwen, H.

    2017-07-01

    Ferrogels consist of magnetic colloidal particles embedded in an elastic polymer matrix. As a consequence, their structural and rheological properties are governed by a competition between magnetic particle-particle interactions and mechanical matrix elasticity. Typically, the particles are permanently fixed within the matrix, which makes them distinguishable by their positions. Over time, particle neighbors do not change due to the fixation by the matrix. Here we present a classical density functional approach for such ferrogels. We map the elastic matrix-induced interactions between neighboring colloidal particles distinguishable by their positions onto effective pairwise interactions between indistinguishable particles similar to a ‘pairwise pseudopotential’. Using Monte-Carlo computer simulations, we demonstrate for one-dimensional dipole-spring models of ferrogels that this mapping is justified. We then use the pseudopotential as an input into classical density functional theory of inhomogeneous fluids and predict the bulk elastic modulus of the ferrogel under various conditions. In addition, we propose the use of an ‘external pseudopotential’ when one switches from the viewpoint of a one-dimensional dipole-spring object to a one-dimensional chain embedded in an infinitely extended bulk matrix. Our mapping approach paves the way to describe various inhomogeneous situations of ferrogels using classical density functional concepts of inhomogeneous fluids.

  18. A functional-dynamic reflection on participatory processes in modeling projects.

    PubMed

    Seidl, Roman

    2015-12-01

    The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it has not been well investigated whether, and how explicitly, these functions and the dynamics of a participatory process are reflected by modeling projects in particular. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation; most often, more than one function per project can be identified, along with the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling, covering diverse approaches, and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and perfectly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.

  19. Fault Diagnosis approach based on a model-based reasoner and a functional designer for a wind turbine. An approach towards self-maintenance

    NASA Astrophysics Data System (ADS)

    Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.

    2007-07-01

    The objective of this ongoing research is to develop a design methodology to increase the availability of offshore wind farms, by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and sub-systems to keep the wind turbine operational, even at a reduced capacity if necessary. Re-configuration is intended to be a built-in capability to be used as a repair strategy, based on these existing functionalities provided by the components. The possible solutions can range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters or control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR), and on a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the re-configuration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce the stoppage rate, failure events, and maintenance visits, and to maintain energy output possibly at a reduced rate until the next scheduled maintenance.
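The reconfiguration idea can be caricatured (purely hypothetical component and function names; the paper's FBS-based reasoner and redundancy designer are far richer) as a lookup from required functions to redundant providers:

```python
# Map each required function to the components able to provide it; on a
# component fault, look up an alternative provider. Illustrative only --
# component/function names are invented, not from the paper.
PROVIDERS = {
    "measure_wind_speed": ["own_anemometer", "adjacent_turbine_data"],
    "yaw_control": ["yaw_motor", "passive_wind_alignment"],
}

def reconfigure(function, failed_component):
    """Return a working alternative provider for `function`, or None if the
    function has no remaining redundancy (turbine must degrade or stop)."""
    for component in PROVIDERS.get(function, []):
        if component != failed_component:
            return component
    return None
```

The point of the FBS model is precisely to derive such function-to-component mappings systematically from the design, rather than enumerating them by hand as done here.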

  20. Integrative approaches for modeling regulation and function of the respiratory system.

    PubMed

    Ben-Tal, Alona; Tawhai, Merryn H

    2013-01-01

    Mathematical models have been central to understanding the interaction between neural control and breathing. Models of the entire respiratory system, which comprises the lungs and the neural circuitry that controls their ventilation, have been derived using simplifying assumptions to compartmentalize each component of the system and to define the interactions between components. These full system models often rely, through necessity, on empirically derived relationships or parameters, in addition to physiological values. In parallel with the development of whole respiratory system models are mathematical models that focus on furthering a detailed understanding of the neural control network, or of the several functions that contribute to gas exchange within the lung. These models are biophysically based, and rely on physiological parameters. They include single-unit models for a breathing lung or neural circuit, through to spatially distributed models of ventilation and perfusion, or multicircuit models for neural control. The challenge is to bring together these more recent advances in models of neural control with models of lung function, into a full simulation for the respiratory system that builds upon the more detailed models but remains computationally tractable. This requires first understanding the mathematical models that have been developed for the respiratory system at different levels, and which could be used to study how physiological levels of O2 and CO2 in the blood are maintained. Copyright © 2013 Wiley Periodicals, Inc.

  1. Plant functional diversity increases grassland productivity-related water vapor fluxes: an Ecotron and modeling approach.

    PubMed

    Milcu, Alexandru; Eugster, Werner; Bachmann, Dörte; Guderle, Marcus; Roscher, Christiane; Gockele, Annette; Landais, Damien; Ravel, Olivier; Gessler, Arthur; Lange, Markus; Ebeling, Anne; Weisser, Wolfgang W; Roy, Jacques; Hildebrandt, Anke; Buchmann, Nina

    2016-08-01

    The impact of species richness and functional diversity of plants on ecosystem water vapor fluxes has been little investigated. To address this knowledge gap, we combined a lysimeter setup in a controlled environment facility (Ecotron) with large ecosystem samples/monoliths originating from a long-term biodiversity experiment (The Jena Experiment) and a modeling approach. Our goals were (1) quantifying the impact of plant species richness (four vs. 16 species) on day- and nighttime ecosystem water vapor fluxes; (2) partitioning ecosystem evapotranspiration into evaporation and plant transpiration using the Shuttleworth and Wallace (SW) energy partitioning model; and (3) identifying the most parsimonious predictors of water vapor fluxes using plant functional-trait-based metrics such as functional diversity and community weighted means. Daytime measured and modeled evapotranspiration were significantly higher in the higher plant diversity treatment, suggesting increased water acquisition. The SW model suggests that, at low plant species richness, a higher proportion of the available energy was diverted to evaporation (a non-productive flux), while, at higher species richness, the proportion of ecosystem transpiration (a productivity-related water flux) increased. While it is well established that LAI controls ecosystem transpiration, here we also identified that the diversity of leaf nitrogen concentration among species in a community is a consistent predictor of ecosystem water vapor fluxes during daytime. The results provide evidence that, at the peak of the growing season, higher leaf area index (LAI) and lower percentage of bare ground at high plant diversity diverts more of the available water to transpiration, a flux closely coupled with photosynthesis and productivity. Higher rates of transpiration presumably contribute to the positive effect of diversity on productivity. © 2016 by the Ecological Society of America.

  2. A three-way approach for protein function classification

    PubMed Central

    2017-01-01

    The knowledge of protein functions plays an essential role in understanding biological cells and has a significant impact on human life in areas such as personalized medicine, better crops and improved therapeutic interventions. Due to the expense and inherent difficulty of biological experiments, intelligent methods are generally relied upon for automatic assignment of functions to proteins. The technological advancements in the field of biology are improving our understanding of biological processes and are regularly resulting in new features and characteristics that better describe the role of proteins. These anticipated features should not be neglected or overlooked in designing more effective classification techniques. A key issue in this context, which is not being sufficiently addressed, is how to build effective classification models and approaches for protein function prediction by incorporating and taking advantage of the ever evolving biological information. In this article, we propose a three-way decision making approach which provides provisions for seeking and incorporating future information. We considered probabilistic rough sets based models such as Game-Theoretic Rough Sets (GTRS) and Information-Theoretic Rough Sets (ITRS) for inducing three-way decisions. An architecture of protein function classification with probabilistic rough sets based three-way decisions is proposed and explained. Experiments are carried out on a Saccharomyces cerevisiae species dataset obtained from the Uniprot database, with the corresponding functional classes extracted from the Gene Ontology (GO) database. The results indicate that as the level of biological information increases, the number of deferred cases is reduced while a similar level of accuracy is maintained. PMID:28234929

  3. A three-way approach for protein function classification.

    PubMed

    Ur Rehman, Hafeez; Azam, Nouman; Yao, JingTao; Benso, Alfredo

    2017-01-01

    The knowledge of protein functions plays an essential role in understanding biological cells and has a significant impact on human life in areas such as personalized medicine, better crops and improved therapeutic interventions. Due to the expense and inherent difficulty of biological experiments, intelligent methods are generally relied upon for automatic assignment of functions to proteins. The technological advancements in the field of biology are improving our understanding of biological processes and are regularly resulting in new features and characteristics that better describe the role of proteins. These anticipated features should not be neglected or overlooked in designing more effective classification techniques. A key issue in this context, which is not being sufficiently addressed, is how to build effective classification models and approaches for protein function prediction by incorporating and taking advantage of the ever evolving biological information. In this article, we propose a three-way decision making approach which provides provisions for seeking and incorporating future information. We considered probabilistic rough sets based models such as Game-Theoretic Rough Sets (GTRS) and Information-Theoretic Rough Sets (ITRS) for inducing three-way decisions. An architecture of protein function classification with probabilistic rough sets based three-way decisions is proposed and explained. Experiments are carried out on a Saccharomyces cerevisiae species dataset obtained from the Uniprot database, with the corresponding functional classes extracted from the Gene Ontology (GO) database. The results indicate that as the level of biological information increases, the number of deferred cases is reduced while a similar level of accuracy is maintained.
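The three-way rule itself is compact (illustrative thresholds only; in GTRS/ITRS the thresholds α and β are derived from game-theoretic or information-theoretic criteria rather than fixed by hand):

```python
def three_way_decision(prob, alpha=0.75, beta=0.25):
    """Three-way rule over an estimated membership probability: accept when
    evidence is strong, reject when it is weak, and defer otherwise; the
    deferred cases await further biological information."""
    if prob >= alpha:
        return "accept"
    if prob <= beta:
        return "reject"
    return "defer"
```

Narrowing the gap between α and β as more biological information arrives is what shrinks the deferred region in the experiments described above.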

  4. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    PubMed

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints, validation of each model, and a comparison of model performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation-oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water to a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent each species' growth requirements within the group. Each model's performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for seasonal dynamics of the phytoplankton community and the main biogeochemical variables over a one-year time horizon are presented and compared for both models, showing the functional group model's enhanced performance. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. A data-model integration approach toward improved understanding on wetland functions and hydrological benefits at the catchment scale

    NASA Astrophysics Data System (ADS)

    Yeo, I. Y.; Lang, M.; Lee, S.; Huang, C.; Jin, H.; McCarty, G.; Sadeghi, A.

    2017-12-01

    The wetland ecosystem plays a crucial role in improving hydrological function and ecological integrity for downstream waters and the surrounding landscape. However, the changing behaviour and functioning of wetland ecosystems are poorly understood and extremely difficult to characterize. An improved understanding of the hydrological behaviour of wetlands, considering their interaction with surrounding landscapes and their impacts on downstream waters, is an essential first step toward closing this knowledge gap. We present an integrated wetland-catchment modelling study that capitalizes on recently developed inundation maps and other geospatial data. The aim of the data-model integration is to improve spatial prediction of wetland inundation and to evaluate cumulative hydrological benefits at the catchment scale. In this paper, we highlight problems arising from data preparation, parameterization, and process representation in simulating wetlands within a distributed catchment model, and report recent progress on mapping wetland dynamics (i.e., inundation) using multiple sources of remotely sensed data. We demonstrate the value of spatially explicit inundation information for developing site-specific wetland parameters and for evaluating model predictions at multiple spatial and temporal scales. This spatially integrated data-model framework is tested using the Soil and Water Assessment Tool (SWAT) with an improved wetland extension, and applied to an agricultural watershed in the Mid-Atlantic Coastal Plain, USA. This study illustrates the necessity of spatially distributed information and a data-integrated modelling approach for predicting wetland inundation and hydrologic function at the local landscape scale, where monitoring and conservation decision making take place.

  6. A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits

    PubMed Central

    Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling

    2007-01-01

    Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping was constructed within the context of simple interval mapping, without separate consideration of multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and testing of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and evaluate its advantage in separating multiple linked QTL relative to original functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431
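
    The nonparametric marker-effect component described above can be illustrated with a small sketch. The basis construction below is standard Legendre-polynomial evaluation on a rescaled time axis; the time range, polynomial order, and coefficient values are hypothetical, not taken from the study.

```python
import numpy as np

def legendre_basis(t, t_min, t_max, order):
    """Evaluate Legendre polynomials P_0..P_order at time points t,
    rescaled to [-1, 1]; returns a design matrix of shape (len(t), order+1)."""
    x = 2.0 * (np.asarray(t, float) - t_min) / (t_max - t_min) - 1.0
    # legvander builds the pseudo-Vandermonde matrix of the Legendre basis
    return np.polynomial.legendre.legvander(x, order)

def marker_effect(t, coeffs, t_min, t_max):
    """Time-dependent marker effect as a weighted sum of Legendre terms."""
    B = legendre_basis(t, t_min, t_max, len(coeffs) - 1)
    return B @ np.asarray(coeffs, float)

# Hypothetical example: an effect evaluated over 10 time points
t = np.arange(1, 11)
effect = marker_effect(t, coeffs=[0.5, 0.3, -0.1], t_min=1, t_max=10)
```

    Fitting then amounts to estimating the basis coefficients, in the article's framework jointly with the parametric QTL curve inside an EM-based likelihood.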

  7. Predicting cognitive function of the Malaysian elderly: a structural equation modelling approach.

    PubMed

    Foong, Hui Foh; Hamid, Tengku Aizan; Ibrahim, Rahimah; Haron, Sharifah Azizah; Shahar, Suzana

    2018-01-01

    The aim of this study was to identify the predictors of the elderly's cognitive function based on biopsychosocial and cognitive reserve perspectives. The study included 2322 community-dwelling elderly in Malaysia, selected through multi-stage proportional cluster random sampling from Peninsular Malaysia. The elderly were surveyed on socio-demographic information, biomarkers, psychosocial status, disability, and cognitive function. A biopsychosocial model of cognitive function was developed to test the variables' predictive power on cognitive function. Statistical analyses were performed using SPSS (version 15.0) in conjunction with Analysis of Moment Structures Graphics (AMOS 7.0). The estimated theoretical model fitted the data well. Psychosocial stress and metabolic syndrome (MetS) negatively predicted cognitive function, and psychosocial stress appeared as the main predictor. Socio-demographic characteristics, except gender, also had significant effects on cognitive function. However, disability failed to predict cognitive function. Several factors together may predict cognitive function in the Malaysian elderly population, and the variance accounted for is large enough to be considered substantial. The key factor associated with the elderly's cognitive function appears to be psychosocial well-being. Thus, psychosocial well-being should be included in elderly assessments, apart from medical conditions, in both clinical and community settings.

  8. Nonrelativistic approaches derived from point-coupling relativistic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lourenco, O.; Dutra, M.; Delfino, A.

    2010-03-15

    We construct nonrelativistic versions of relativistic nonlinear hadronic point-coupling models, based on new normalized spinor wave functions after small-component reduction. These expansions give us energy density functionals that can be compared to their relativistic counterparts. We show that the agreement between the nonrelativistic limit approach and the Skyrme parametrizations becomes strongly dependent on the incompressibility of each model. We also show that the particular case A = B = 0 (Walecka model) leads to the same energy density functional as the Skyrme parametrizations SV and ZR2, while the truncation scheme, up to order ρ³, leads to parametrizations for which σ = 1.

  9. Dynamic physiological modeling for functional diffuse optical tomography

    PubMed Central

    Diamond, Solomon Gilbert; Huppert, Theodore J.; Kolehmainen, Ville; Franceschini, Maria Angela; Kaipio, Jari P.; Arridge, Simon R.; Boas, David A.

    2009-01-01

    Diffuse optical tomography (DOT) is a noninvasive imaging technology that is sensitive to local concentration changes in oxy- and deoxyhemoglobin. When applied to functional neuroimaging, DOT measures hemodynamics in the scalp and brain that reflect competing metabolic demands and cardiovascular dynamics. The diffuse nature of near-infrared photon migration in tissue and the multitude of physiological systems that affect hemodynamics motivate the use of anatomical and physiological models to improve estimates of the functional hemodynamic response. In this paper, we present a linear state-space model for DOT analysis that models the physiological fluctuations present in the data with either static or dynamic estimation. We demonstrate the approach by using auxiliary measurements of blood pressure variability and heart rate variability as inputs to model the background physiology in DOT data. We evaluate the improvements accorded by modeling this physiology on ten human subjects with simulated functional hemodynamic responses added to the baseline physiology. Adding physiological modeling with a static estimator significantly improved estimates of the simulated functional response, and further significant improvements were achieved with a dynamic Kalman filter estimator (paired t tests, n = 10, P < 0.05). These results suggest that physiological modeling can improve DOT analysis. The further improvement with the Kalman filter encourages continued research into dynamic linear modeling of the physiology present in DOT. Cardiovascular dynamics also affect the blood-oxygen-level-dependent (BOLD) signal in functional magnetic resonance imaging (fMRI). This state-space approach to DOT analysis could be extended to BOLD fMRI analysis, multimodal studies and real-time analysis. PMID:16242967
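
    The static-versus-dynamic estimation contrast above can be illustrated with a minimal sketch: a one-dimensional Kalman filter tracking a slowly drifting coefficient that couples an auxiliary physiological signal into the measurement. All signals, noise levels and parameter values below are synthetic assumptions, not the DOT model of the paper.

```python
import math, random

def kalman_track(y, u, q=1e-3, r=0.05):
    """Track a time-varying coefficient x_k in y_k = x_k * u_k + noise,
    with a random-walk state x_k = x_{k-1} + w_k (scalar Kalman filter).
    q: process-noise variance, r: assumed measurement-noise variance."""
    x, p = 0.0, 1.0            # initial state estimate and its variance
    estimates = []
    for yk, uk in zip(y, u):
        p += q                 # predict: random-walk state inflates variance
        s = uk * p * uk + r    # innovation variance
        k = p * uk / s         # Kalman gain
        x += k * (yk - uk * x) # update state with the innovation
        p *= (1.0 - k * uk)    # update variance
        estimates.append(x)
    return estimates

# Synthetic demo: the true coefficient drifts from 0.5 to 1.0
random.seed(0)
n = 200
u = [math.sin(0.3 * k) + 1.5 for k in range(n)]        # auxiliary input
true_x = [0.5 + 0.5 * k / n for k in range(n)]
y = [true_x[k] * u[k] + random.gauss(0, 0.05) for k in range(n)]
est = kalman_track(y, u)
```

    A static estimator would fit a single constant coefficient by least squares; the filter instead follows the drift, which is the gain the paper reports for dynamic estimation.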

  10. Casimir force in brane worlds: Coinciding results from Green's and zeta function approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linares, Roman; Morales-Tecotl, Hugo A.; Pedraza, Omar

    2010-06-15

    Casimir force encodes the structure of the field modes as vacuum fluctuations and so it is sensitive to the extra dimensions of brane worlds. Now, in flat spacetimes of arbitrary dimension the two standard approaches to the Casimir force, Green's function and zeta function, yield the same result, but for brane world models this was only assumed. In this work we show that both approaches yield the same Casimir force in the case of universal extra dimensions and Randall-Sundrum scenarios with one and two branes added by p compact dimensions. Essentially, the details of the mode eigenfunctions that enter the Casimir force in the Green's function approach get removed due to their orthogonality relations with a measure involving the right hypervolume of the plates, and this leaves just the contribution coming from the zeta function approach. The present analysis corrects previous results showing a difference between the two approaches for the single-brane Randall-Sundrum; this was due to an erroneous hypervolume of the plates introduced by the authors when using the Green's function. For all the models we discuss here, the resulting Casimir force can be neatly expressed in terms of two four-dimensional Casimir force contributions: one for the massless mode and the other for a tower of massive modes associated with the extra dimensions.

  11. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing by using statistical methods based on conditional probability density functions (pdfs). However, current methods typically require assuming a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of precipitation forecasts needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginal distributions, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast at 0.25° x 0.25° resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing the observed climatology during a ten-year verification period (2000-2010).
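
    As a concrete illustration of the conditioning step, the sketch below fits a bivariate Gaussian copula to synthetic forecast-observation pairs and reads off conditional quantiles. The Gaussian copula is only one member of the copula family (the record does not specify the family used), and the gamma-distributed data and all parameter values are invented for the example.

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_scores(x):
    """Map a sample to standard-normal scores via its empirical CDF."""
    u = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(u)

def conditional_quantiles(obs, fcst, fcst_new, probs=(0.25, 0.5, 0.75)):
    """Gaussian-copula sketch of P(obs | forecast = fcst_new): estimate rho
    in normal-score space, condition the bivariate normal, then back-transform
    through the empirical quantiles of the observations."""
    z_o, z_f = normal_scores(obs), normal_scores(fcst)
    rho = np.corrcoef(z_o, z_f)[0, 1]
    # normal score of the new forecast relative to the forecast sample
    u_new = (np.sum(fcst <= fcst_new) + 0.5) / (len(fcst) + 1.0)
    z_new = norm.ppf(u_new)
    cond_mean = rho * z_new
    cond_sd = np.sqrt(1.0 - rho ** 2)
    z_q = cond_mean + cond_sd * norm.ppf(probs)
    # back-transform via the empirical quantile function of the observations
    return np.quantile(obs, norm.cdf(z_q))

rng = np.random.default_rng(42)
fcst = rng.gamma(2.0, 30.0, size=500)               # synthetic monthly precip
obs = 0.8 * fcst + rng.gamma(2.0, 8.0, size=500)    # correlated "truth"
q25, q50, q75 = conditional_quantiles(obs, fcst, fcst_new=80.0)
```

    Working through normal scores sidesteps the Gaussian-marginal assumption that the record criticizes: only the dependence structure, not the marginals, is modeled as Gaussian.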

  12. From head to tail: new models and approaches in primate functional anatomy and biomechanics.

    PubMed

    Organ, Jason M; Deleon, Valerie B; Wang, Qian; Smith, Timothy D

    2010-04-01

    This special issue of The Anatomical Record (AR) is based on interest generated by a symposium at the 2008 annual meeting of the American Association of Anatomists (AAA) at Experimental Biology, entitled "An Evolutionary Perspective on Human Anatomy." The development of this volume in turn provided impetus for a Biological Anthropology Mini-Meeting, organized by members of the AAA for the 2010 Experimental Biology meeting in Anaheim, California. The research presented in these pages reflects the themes of these symposia and provides a snapshot of the current state of primate functional anatomy and biomechanics research. The 17 articles in this special issue utilize new models and/or approaches to study long-standing questions about the evolution of our closest relatives, including soft-tissue dissection and microanatomical techniques, experimental approaches to morphology, kinematic and kinetic biomechanics, high-resolution computed tomography, and Finite Element Analysis (FEA). This volume continues a close historical association between the disciplines of anatomy and biological anthropology: anatomists benefit from an understanding of the evolutionary history of our modern form, and biological anthropologists rely on anatomical principles to make informed evolutionary inferences about our closest relatives. (c) 2010 Wiley-Liss, Inc.

  13. A real-space stochastic density matrix approach for density functional electronic structure.

    PubMed

    Beck, Thomas L

    2015-12-21

    The recent development of real-space grid methods has led to more efficient, accurate, and adaptable approaches for large-scale electrostatics and density functional electronic structure modeling. With the incorporation of multiscale techniques, linear-scaling real-space solvers are possible for density functional problems if localized orbitals are used to represent the Kohn-Sham energy functional. These methods still suffer from high computational and storage overheads, however, due to extensive matrix operations related to the underlying wave function grid representation. In this paper, an alternative stochastic method is outlined that aims to solve directly for the one-electron density matrix in real space. In order to illustrate aspects of the method, model calculations are performed for simple one-dimensional problems that display some features of the more general problem, such as spatial nodes in the density matrix. This orbital-free approach may prove helpful considering a future involving increasingly parallel computing architectures. Its primary advantage is the near-locality of the random walks, allowing for simultaneous updates of the density matrix in different regions of space partitioned across the processors. In addition, it allows for testing and enforcement of the particle number and idempotency constraints through stabilization of a Feynman-Kac functional integral as opposed to the extensive matrix operations in traditional approaches.

  14. Functional Foods and Lifestyle Approaches for Diabetes Prevention and Management.

    PubMed

    Alkhatib, Ahmad; Tsang, Catherine; Tiss, Ali; Bahorun, Theeshan; Arefanian, Hossein; Barake, Roula; Khadir, Abdelkrim; Tuomilehto, Jaakko

    2017-12-01

    Functional foods contain biologically active ingredients associated with physiological health benefits for preventing and managing chronic diseases, such as type 2 diabetes mellitus (T2DM). Regular consumption of functional foods may be associated with enhanced anti-oxidant, anti-inflammatory, insulin-sensitivity, and anti-cholesterol functions, which are considered integral to preventing and managing T2DM. Components of the Mediterranean diet (MD)-such as fruits, vegetables, oily fish, olive oil, and tree nuts-serve as a model for functional foods based on their natural contents of nutraceuticals, including polyphenols, terpenoids, flavonoids, alkaloids, sterols, pigments, and unsaturated fatty acids. Polyphenols within the MD and polyphenol-rich herbs-such as coffee, green tea, black tea, and yerba maté-have shown clinically meaningful benefits on metabolic and microvascular activities, cholesterol and fasting glucose lowering, and anti-inflammation and anti-oxidation in high-risk and T2DM patients. Combining exercise with functional food consumption can trigger and augment several metabolic and cardiovascular protective benefits, but this is under-investigated in people with T2DM and bariatric surgery patients. Detecting functional food benefits can now rely on "omics" biological profiling of individuals' molecular genetics, transcriptomics, proteomics, and metabolomics, but this is under-investigated in multi-component interventions. A personalized approach for preventing and managing T2DM should consider biological and behavioral models, and embed nutrition education as part of lifestyle diabetes prevention studies. Functional foods may provide additional benefits in such an approach.

  15. Exploiting the functional and taxonomic structure of genomic data by probabilistic topic modeling.

    PubMed

    Chen, Xin; Hu, Xiaohua; Lim, Tze Y; Shen, Xiajiong; Park, E K; Rosen, Gail L

    2012-01-01

    In this paper, we present a method that enables both a homology-based approach and a composition-based approach to further study the functional core (i.e., the microbial core and the gene core, correspondingly). In the proposed method, the identification of major functionality groups is achieved by generative topic modeling, which is able to extract useful information from unlabeled data. We first show that a generative topic model can be used to model the taxon abundance information obtained by the homology-based approach and study the microbial core. The model considers each sample as a “document,” which has a mixture of functional groups, while each functional group (also known as a “latent topic”) is a weighted mixture of species. Therefore, estimating the generative topic model for taxon abundance data will uncover the distribution over latent functions (latent topics) in each sample. Second, we show that a generative topic model can also be used to study the genome-level composition of “N-mer” features (DNA subreads obtained by composition-based approaches). The model considers each genome as a mixture of latent genetic patterns (latent topics), while each functional pattern is a weighted mixture of the “N-mer” features; thus the existence of core genomes can be indicated by a set of common N-mer features. After studying the mutual information between latent topics and gene regions, we provide an explanation of the functional roles of the uncovered latent genetic patterns. The experimental results demonstrate the effectiveness of the proposed method.

  16. An overview of the recent approaches to terroir functional modelling, footprinting and zoning

    NASA Astrophysics Data System (ADS)

    Vaudour, E.; Costantini, E.; Jones, G. V.; Mocali, S.

    2015-03-01

    Notions of terroir and their conceptualization through agro-environmental sciences have become popular in many parts of world. Originally developed for wine, terroir now encompasses many other crops including fruits, vegetables, cheese, olive oil, coffee, cacao and other crops, linking the uniqueness and quality of both beverages and foods to the environment where they are produced, giving the consumer a sense of place. Climate, geology, geomorphology and soil are the main environmental factors which make up the terroir effect on different scales. Often considered immutable culturally, the natural components of terroir are actually a set of processes, which together create a delicate equilibrium and regulation of its effect on products in both space and time. Due to both a greater need to better understand regional-to-site variations in crop production and the growth in spatial analytic technologies, the study of terroir has shifted from a largely descriptive regional science to a more applied, technical research field. Furthermore, the explosion of spatial data availability and sensing technologies has made the within-field scale of study more valuable to the individual grower. The result has been greater adoption of these technologies but also issues associated with both the spatial and temporal scales required for practical applications, as well as the relevant approaches for data synthesis. Moreover, as soil microbial communities are known to be of vital importance for terrestrial processes by driving the major soil geochemical cycles and supporting healthy plant growth, an intensive investigation of the microbial organization and their function is also required. Our objective is to present an overview of existing data and modelling approaches for terroir functional modelling, footprinting and zoning on local and regional scales. 
This review will focus on two main areas of recent terroir research: (1) using new tools to unravel the biogeochemical cycles of both

  17. Modeling gene expression measurement error: a quasi-likelihood approach

    PubMed Central

    Strimmer, Korbinian

    2003-01-01

    Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression, this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also improved the power of

  18. Two-particle correlation function and dihadron correlation approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vechernin, V. V., E-mail: v.vechernin@spbu.ru; Ivanov, K. O.; Neverov, D. I.

    It is shown that, in the case of asymmetric nuclear interactions, the application of the traditional dihadron correlation approach to determining a two-particle correlation function C may lead to a form distorted in relation to the canonical pair correlation function C₂. This result was obtained both by means of exact analytic calculations of correlation functions within a simple string model for proton–nucleus and deuteron–nucleus collisions and by means of Monte Carlo simulations based on employing the HIJING event generator. It is also shown that the method based on studying multiplicity correlations in two narrow observation windows separated in rapidity makes it possible to determine correctly the canonical pair correlation function C₂ for all cases, including the case where the rapidity distribution of product particles is not uniform.
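
    For reference, the canonical pair correlation function that the record compares against is conventionally written (in standard notation, not reproduced in the abstract) as

```latex
C_2(y_1, y_2) = \frac{\rho_2(y_1, y_2)}{\rho_1(y_1)\,\rho_1(y_2)} - 1,
```

    where ρ₁ and ρ₂ are the single- and two-particle rapidity densities; the distortion discussed in the record arises when this quantity is instead estimated from trigger-associated (dihadron) pair yields in asymmetric collisions.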

  19. Functional Foods and Lifestyle Approaches for Diabetes Prevention and Management

    PubMed Central

    Alkhatib, Ahmad; Tsang, Catherine; Tiss, Ali; Bahorun, Theeshan; Arefanian, Hossein; Barake, Roula; Khadir, Abdelkrim; Tuomilehto, Jaakko

    2017-01-01

    Functional foods contain biologically active ingredients associated with physiological health benefits for preventing and managing chronic diseases, such as type 2 diabetes mellitus (T2DM). Regular consumption of functional foods may be associated with enhanced anti-oxidant, anti-inflammatory, insulin-sensitivity, and anti-cholesterol functions, which are considered integral to preventing and managing T2DM. Components of the Mediterranean diet (MD)—such as fruits, vegetables, oily fish, olive oil, and tree nuts—serve as a model for functional foods based on their natural contents of nutraceuticals, including polyphenols, terpenoids, flavonoids, alkaloids, sterols, pigments, and unsaturated fatty acids. Polyphenols within the MD and polyphenol-rich herbs—such as coffee, green tea, black tea, and yerba maté—have shown clinically meaningful benefits on metabolic and microvascular activities, cholesterol and fasting glucose lowering, and anti-inflammation and anti-oxidation in high-risk and T2DM patients. Combining exercise with functional food consumption can trigger and augment several metabolic and cardiovascular protective benefits, but this is under-investigated in people with T2DM and bariatric surgery patients. Detecting functional food benefits can now rely on “omics” biological profiling of individuals’ molecular genetics, transcriptomics, proteomics, and metabolomics, but this is under-investigated in multi-component interventions. A personalized approach for preventing and managing T2DM should consider biological and behavioral models, and embed nutrition education as part of lifestyle diabetes prevention studies. Functional foods may provide additional benefits in such an approach. PMID:29194424

  20. Modelling approaches for evaluating multiscale tendon mechanics

    PubMed Central

    Fang, Fei; Lake, Spencer P.

    2016-01-01

    Tendon exhibits anisotropic, inhomogeneous and viscoelastic mechanical properties that are determined by its complicated hierarchical structure and varying amounts/organization of different tissue constituents. Although extensive research has been conducted to use modelling approaches to interpret tendon structure–function relationships in combination with experimental data, many issues remain unclear (i.e. the role of minor components such as decorin, aggrecan and elastin), and the integration of mechanical analysis across different length scales has not been well applied to explore stress or strain transfer from macro- to microscale. This review outlines mathematical and computational models that have been used to understand tendon mechanics at different scales of the hierarchical organization. Model representations at the molecular, fibril and tissue levels are discussed, including formulations that follow phenomenological and microstructural approaches (which include evaluations of crimp, helical structure and the interaction between collagen fibrils and proteoglycans). Multiscale modelling approaches incorporating tendon features are suggested to be an advantageous methodology to understand further the physiological mechanical response of tendon and corresponding adaptation of properties owing to unique in vivo loading environments. PMID:26855747

  1. Engelmann Spruce Site Index Models: A Comparison of Model Functions and Parameterizations

    PubMed Central

    Nigh, Gordon

    2015-01-01

    Engelmann spruce (Picea engelmannii Parry ex Engelm.) is a high-elevation species found in western Canada and the western USA. As this species becomes increasingly targeted for harvesting, better height growth information is required for good management of this species. This project was initiated to fill this need. The objective of the project was threefold: develop a site index model for Engelmann spruce; compare the fits and the modelling and application issues between three model formulations and four parameterizations; and more closely examine the grounded-Generalized Algebraic Difference Approach (g-GADA) model parameterization. The model fitting data consisted of 84 stem-analyzed Engelmann spruce site trees sampled across the Engelmann Spruce – Subalpine Fir biogeoclimatic zone. The fitted models were based on the Chapman-Richards function, a modified Hossfeld IV function, and the Schumacher function. The model parameterizations that were tested are indicator variables, mixed-effects, GADA, and g-GADA. Model evaluation was based on the finite-sample corrected version of Akaike’s Information Criterion and the estimated variance. Model parameterization had more of an influence on the fit than did model formulation, with the indicator variable method providing the best fit, followed by mixed-effects modelling (9% increase in the variance for the Chapman-Richards and Schumacher formulations over the indicator variable parameterization), g-GADA (optimal approach) (335% increase in the variance), and the GADA/g-GADA (with the GADA parameterization) (346% increase in the variance). Factors related to the application of the model must be considered when selecting the model for use, as the best-fitting methods have the most barriers to their application in terms of data and software requirements. PMID:25853472
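
    Of the three model functions compared, the Chapman-Richards form is the most widely used and easy to sketch. The parameter values below are illustrative placeholders, not the fitted Engelmann spruce estimates reported in the article.

```python
import math

def chapman_richards(age, a, b, c):
    """Chapman-Richards height-growth form: H(t) = a * (1 - exp(-b*t))^c.
    a: asymptotic height, b: rate, c: shape. Values here are illustrative
    only, not fitted to the 84 stem-analyzed site trees."""
    return a * (1.0 - math.exp(-b * age)) ** c

# Illustrative curve with an assumed 35 m asymptote
heights = [chapman_richards(t, a=35.0, b=0.02, c=1.3) for t in (20, 50, 100)]
```

    In GADA-type parameterizations, one or more of a, b, c is re-expressed as a function of an unobserved site variable, so the same base form yields a family of site-index curves indexed by site productivity.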

  2. Thresholding functional connectomes by means of mixture modeling.

    PubMed

    Bielczyk, Natalia Z; Walocha, Fabian; Ebel, Patrick W; Haak, Koen V; Llera, Alberto; Buitelaar, Jan K; Glennon, Jeffrey C; Beckmann, Christian F

    2018-05-01

    Functional connectivity has been shown to be a very promising tool for studying the large-scale functional architecture of the human brain. In network research in fMRI, functional connectivity is considered as a set of pair-wise interactions between the nodes of the network. These interactions are typically operationalized through the full or partial correlation between all pairs of regional time series. Estimating the structure of the latent underlying functional connectome from the set of pair-wise partial correlations remains an open research problem, though. Typically, this thresholding problem is approached by proportional thresholding, or by means of parametric or non-parametric permutation testing across a cohort of subjects at each possible connection. As an alternative, we propose a data-driven thresholding approach for network matrices on the basis of mixture modeling. This approach allows for creating subject-specific sparse connectomes by modeling the full set of partial correlations as a mixture of low correlation values associated with weak or unreliable edges in the connectome and a sparse set of reliable connections. Consequently, we propose to use an alternative thresholding strategy based on the model fit, using pseudo-False Discovery Rates derived on the basis of the empirical null estimated as part of the mixture distribution. We evaluate the method on synthetic benchmark fMRI datasets where the underlying network structure is known, and demonstrate that it gives improved performance with respect to the alternative methods for thresholding connectomes, given the canonical thresholding levels. We also demonstrate that mixture modeling gives highly reproducible results when applied to the functional connectomes of the visual system derived from the n-back Working Memory task in the Human Connectome Project.
The sparse connectomes obtained from mixture modeling are further discussed in the light of the previous knowledge of the functional architecture
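    The mixture-modeling idea above can be sketched in a few lines: fit a two-component mixture to the edge weights, treat the component centered near zero as the empirical null, and retain edges unlikely to belong to it. This is a minimal illustration using scikit-learn on synthetic data; the proportions, component count, and 0.05 posterior cutoff are assumptions standing in for the pseudo-FDR criterion, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic edge weights: many weak/null partial correlations plus a
# sparse set of reliable connections (proportions are assumptions).
r = np.concatenate([rng.normal(0.0, 0.05, 900),   # null edges
                    rng.normal(0.4, 0.05, 100)])  # reliable edges

# Two-component mixture: one component plays the role of the empirical
# null, the other models the reliable connections.
gmm = GaussianMixture(n_components=2, random_state=0).fit(r.reshape(-1, 1))
null_comp = int(np.argmin(np.abs(gmm.means_.ravel())))  # component nearest zero

# Keep edges with a low posterior probability of coming from the null;
# the 0.05 cutoff is an illustrative stand-in for the pseudo-FDR rule.
post_null = gmm.predict_proba(r.reshape(-1, 1))[:, null_comp]
keep = post_null < 0.05
```

The result is a subject-specific sparse connectome without any group-level permutation testing.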

  3. Online model checking approach based parameter estimation to a neuronal fate decision simulation model in Caenorhabditis elegans with hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Koh, Chuan Hock; Miyano, Satoru

    2011-05-01

    Mathematical modeling and simulation studies are playing an increasingly important role in helping researchers elucidate how living organisms function in cells. In systems biology, researchers typically tune many parameters manually to achieve simulation results that are consistent with biological knowledge. This severely limits the size and complexity of the simulation models built. In order to break this limitation, we propose a computational framework to automatically estimate kinetic parameters for a given network structure. We utilized an online (on-the-fly) model checking technique (which saves resources compared to the offline approach), with a quantitative modeling and simulation architecture named hybrid functional Petri net with extension (HFPNe). We demonstrate the applicability of this framework by analyzing the neuronal cell fate decision model (ASE fate model) in Caenorhabditis elegans. First, we built a quantitative ASE fate model containing 3327 components emulating nine genetic conditions. Then, using our efficient online model checker, MIRACH 1.0, together with parameter estimation, we ran 20 million simulation runs, and were able to locate 57 parameter sets for 23 parameters in the model that are consistent with 45 biological rules extracted from published biological articles without much manual intervention. To evaluate the robustness of these 57 parameter sets, we ran another 20 million simulation runs using different magnitudes of noise. These simulations showed that one model is the most reasonable and robust, owing to its high stability against stochastic noise. Our simulation results provide interesting biological findings that could be used for future wet-lab experiments.

  4. A New Approach to Predict Microbial Community Assembly and Function Using a Stochastic, Genome-Enabled Modeling Framework

    NASA Astrophysics Data System (ADS)

    King, E.; Brodie, E.; Anantharaman, K.; Karaoz, U.; Bouskill, N.; Banfield, J. F.; Steefel, C. I.; Molins, S.

    2016-12-01

    Characterizing and predicting the microbial and chemical compositions of subsurface aquatic systems necessitates an understanding of the metabolism and physiology of organisms that are often uncultured or studied under conditions not relevant for one's environment of interest. Cultivation-independent approaches are therefore important and have greatly enhanced our ability to characterize functional microbial diversity. The capability to reconstruct genomes representing thousands of populations from microbial communities using metagenomic techniques provides a foundation for development of predictive models for community structure and function. Here, we discuss a genome-informed stochastic trait-based model incorporated into a reactive transport framework to represent the activities of coupled guilds of hypothetical microorganisms. Metabolic pathways for each microbe within a functional guild are parameterized from metagenomic data with a unique combination of traits governing organism fitness under dynamic environmental conditions. We simulate the thermodynamics of coupled electron donor and acceptor reactions to predict the energy available for cellular maintenance, respiration, biomass development, and enzyme production. While `omics analyses can now characterize the metabolic potential of microbial communities, it is functionally redundant as well as computationally prohibitive to explicitly include the thousands of recovered organisms into biogeochemical models. However, one can derive potential metabolic pathways from genomes along with trait-linkages to build probability distributions of traits. These distributions are used to assemble groups of microbes that couple one or more of these pathways. From the initial ensemble of microbes, only a subset will persist based on the interaction of their physiological and metabolic traits with environmental conditions, competing organisms, etc. Here, we analyze the predicted niches of these hypothetical microbes and

  5. Challenges and opportunities for integrating lake ecosystem modelling approaches

    USGS Publications Warehouse

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  6. Hierarchical approaches for systems modeling in cardiac development.

    PubMed

    Gould, Russell A; Aboulmouna, Lina M; Varner, Jeffrey D; Butcher, Jonathan T

    2013-01-01

    Ordered cardiac morphogenesis and function are essential for all vertebrate life. The heart begins as a simple contractile tube, but quickly grows and morphs into a multichambered pumping organ complete with valves, while maintaining regulation of blood flow and nutrient distribution. Though not identical, cardiac morphogenesis shares many molecular and morphological processes across vertebrate species. Quantitative data across multiple time and length scales have been gathered through decades of reductionist single variable analyses. These range from detailed molecular signaling pathways at the cellular levels to cardiac function at the tissue/organ levels. However, none of these components act in true isolation from others, and each, in turn, exhibits short- and long-range effects in both time and space. When a gene is absent, entire signaling cascades and genetic profiles may shift, resulting in complex feedback mechanisms. Also taking into account local microenvironmental changes throughout development, it is apparent that a systems level approach is an essential resource to accelerate information generation concerning the functional relationships across multiple length scales (molecular data vs physiological function) and structural development. In this review, we discuss relevant in vivo and in vitro experimental approaches, compare different computational frameworks for systems modeling, and review the latest information about systems modeling of cardiac development. Finally, we conclude with some important future directions for cardiac systems modeling. Copyright © 2013 Wiley Periodicals, Inc.

  7. 14D. The Epidemic of Diabesity: Functional Medicine Approaches to Treatment, Recovery, and Prevention

    PubMed Central

    2013-01-01

    Focus Areas: Integrative Approaches to Care, Supporting Behavioral Change This brief presentation will offer a view of obesity and diabetes from a functional medicine approach. It will include the use of the Functional Medicine Matrix Model, the GO.TO.IT operating system, and the clinical timeline. Assessing pre-diabetes, diabetes, and obesity must include an evaluation of digestive function and nutritional status, the role of inflammation and insulin resistance, and the consideration of persistent organic pollutants (POPs) as a driver, as well as the relationship to mitochondrial energy production. This presentation will use a case-based format to highlight the unique Functional Medicine approach to assessing the root cause(s) of obesity and diabetes, as well as demonstrating appropriate treatment modalities.

  8. Medical Spanish: A Functional Approach.

    ERIC Educational Resources Information Center

    Hendrickson, James M.

    A functional approach to language teaching begins with knowing how students intend to use the foreign language for specific purposes and in specific situations. Instructors of medical Spanish can begin by determining the specific language functions that their students must be able to express when communicating with Hispanic patients, by means of a…

  9. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. A novel method of modeling and simulating biological systems with the use of model checking is proposed based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret with a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that: Rule I

  10. Stress and Resilience in Functional Somatic Syndromes – A Structural Equation Modeling Approach

    PubMed Central

    Fischer, Susanne; Lemmer, Gunnar; Gollwitzer, Mario; Nater, Urs M.

    2014-01-01

    Background Stress has been suggested to play a role in the development and perpetuation of functional somatic syndromes. The mechanisms of how this might occur are not clear. Purpose We propose a multi-dimensional stress model which posits that childhood trauma increases adult stress reactivity (i.e., an individual's tendency to respond strongly to stressors) and reduces resilience (e.g., the belief in one's competence). This in turn facilitates the manifestation of functional somatic syndromes via chronic stress. We tested this model cross-sectionally and prospectively. Methods Young adults participated in a web survey at two time points. Structural equation modeling was used to test our model. The final sample consisted of 3,054 participants, and 429 of these participated in the follow-up survey. Results Our proposed model fit the data in the cross-sectional (χ2(21) = 48.808, p<.001, CFI = .995, TLI = .992, RMSEA = .021, 90% CI [.013, .029]) and prospective analyses (χ2(21) = 32.675, p<.05, CFI = .982, TLI = .969, RMSEA = .036, 90% CI [.001, .059]). Discussion Our findings have several clinical implications, suggesting a role for stress management training in the prevention and treatment of functional somatic syndromes. PMID:25396736

  11. Response Surface Modeling Using Multivariate Orthogonal Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; DeLoach, Richard

    2001-01-01

    A nonlinear modeling technique was used to characterize response surfaces for non-dimensional longitudinal aerodynamic force and moment coefficients, based on wind tunnel data from a commercial jet transport model. Data were collected using two experimental procedures - one based on modern design of experiments (MDOE), and one using a classical one-factor-at-a-time (OFAT) approach. The nonlinear modeling technique used multivariate orthogonal functions generated from the independent variable data as modeling functions in a least squares context to characterize the response surfaces. Model terms were selected automatically using a prediction error metric. Prediction error bounds computed from the modeling data alone were found to be a good measure of actual prediction error for prediction points within the inference space. Root-mean-square model fit error and prediction error were less than 4 percent of the mean response value in all cases. Efficacy and prediction performance of the response surface models identified from both MDOE and OFAT experiments were investigated.
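    The core of the technique is that orthogonalizing the candidate modeling functions decouples their coefficients, so each term's contribution to the fit can be judged independently. A minimal sketch with assumed synthetic data, using a simple variance-explained ranking in place of the authors' prediction-error metric:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two independent variables, e.g. angle of attack and control deflection.
x1, x2 = rng.uniform(-1, 1, (2, 200))
# Noisy response with a known sparse structure (assumed for illustration).
y = 1.0 + 2.0 * x1 - 1.0 * x1 * x2 + rng.normal(0, 0.02, 200)

# Candidate polynomial modeling functions of the independent variables.
X = np.column_stack([np.ones(200), x1, x2, x1 * x2, x1**2, x2**2])

# QR factorization yields mutually orthogonal modeling functions, so
# each coefficient can be estimated independently of the others.
Q, _ = np.linalg.qr(X)
c = Q.T @ y  # orthogonal-function coefficients

# Rank functions by variance explained and keep the dominant three,
# mimicking automatic model-term selection.
keep = np.argsort(c**2)[::-1][:3]
y_hat = Q[:, keep] @ c[keep]
rms = np.sqrt(np.mean((y - y_hat) ** 2))
```

Here the retained orthogonal functions reproduce the response to within a few percent of its mean value, in the spirit of the fit-error figures quoted above.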

  12. Interactively Open Autonomy Unifies Two Approaches to Function

    NASA Astrophysics Data System (ADS)

    Collier, John

    2004-08-01

    Functionality is essential to any form of anticipation beyond simple directedness at an end. In the literature on function in biology, there are two distinct approaches. One, the etiological view, places the origin of function in selection, while the other, the organizational view, individuates function by organizational role. Both approaches have well-known advantages and disadvantages. I propose a reconciliation of the two approaches, based on an interactivist approach to the individuation and stability of organisms. The approach was suggested by Kant in the Critique of Judgment, but since it requires, on his account, the identification of a new form of causation, it has not been accessible to analytical techniques. I proceed by construction of the required concept to fit certain design requirements. This construction builds on concepts introduced in my previous four talks to these meetings.

  13. Generalized neurofuzzy network modeling algorithms using Bézier-Bernstein polynomial functions and additive decomposition.

    PubMed

    Hong, X; Harris, C J

    2000-01-01

    This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bézier-Bernstein polynomial functions. The approach is general in that it copes with n-dimensional inputs, utilising an additive decomposition construction to overcome the curse of dimensionality associated with large n. This new construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like the B-spline expansion based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of basis functions as fuzzy membership functions, together with the additional advantages of structural parsimony and Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. This new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for both univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modeling approach.

  14. A Bayesian spatial model for neuroimaging data based on biologically informed basis functions.

    PubMed

    Huertas, Ismael; Oldehinkel, Marianne; van Oort, Erik S B; Garcia-Solis, David; Mir, Pablo; Beckmann, Christian F; Marquand, Andre F

    2017-11-01

    The dominant approach to neuroimaging data analysis employs the voxel as the unit of computation. While convenient, voxels lack biological meaning and their size is arbitrarily determined by the resolution of the image. Here, we propose a multivariate spatial model in which neuroimaging data are characterised as a linearly weighted combination of multiscale basis functions which map onto underlying brain nuclei or networks. In this model, the elementary building blocks are derived to reflect the functional anatomy of the brain during the resting state. This model is estimated using a Bayesian framework which accurately quantifies uncertainty and automatically finds the most accurate and parsimonious combination of basis functions describing the data. We demonstrate the utility of this framework by predicting quantitative SPECT images of striatal dopamine function and we compare a variety of basis sets including generic isotropic functions, anatomical representations of the striatum derived from structural MRI, and two different soft functional parcellations of the striatum derived from resting-state fMRI (rfMRI). We found that a combination of ∼50 multiscale functional basis functions accurately represented the striatal dopamine activity, and that functional basis functions derived from an advanced parcellation technique known as Instantaneous Connectivity Parcellation (ICP) provided the most parsimonious models of dopamine function. Importantly, functional basis functions derived from resting fMRI were more accurate than both structural and generic basis sets in representing dopamine function in the striatum for a fixed model order. We demonstrate the translational validity of our framework by constructing classification models for discriminating parkinsonian disorders and their subtypes. Here, we show that the ICP approach is the only basis set that performs well across all comparisons and performs better overall than the classical voxel-based approach.

  15. A rational model of function learning.

    PubMed

    Lucas, Christopher G; Griffiths, Thomas L; Williams, Joseph J; Kalish, Michael L

    2015-10-01

    Theories of how people learn relationships between continuous variables have tended to focus on two possibilities: one, that people are estimating explicit functions, or, two, that they are performing associative learning supported by similarity. We provide a rational analysis of function learning, drawing on work on regression in machine learning and statistics. Using the equivalence of Bayesian linear regression and Gaussian processes, which provide a probabilistic basis for similarity-based function learning, we show that learning explicit rules and using similarity can be seen as two views of one solution to this problem. We use this insight to define a rational model of human function learning that combines the strengths of both approaches and accounts for a wide variety of experimental results.
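    The rule-similarity equivalence invoked above can be made concrete with Gaussian-process regression: the posterior predictive mean is a similarity-weighted combination of observed outputs, yet it is mathematically identical to Bayesian linear regression in an implicit feature space. A minimal sketch with an assumed RBF kernel and toy data (not the authors' model):

```python
import numpy as np

def rbf(a, b, length=1.0):
    # Squared-exponential kernel: similarity decays with distance.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

# Toy training data sampled from an unknown target function.
x = np.linspace(-3, 3, 20)
y = np.sin(x)

# GP posterior predictive mean at a query point: a kernel-weighted
# (similarity-based) combination of the training outputs.
K = rbf(x, x) + 1e-6 * np.eye(x.size)  # small jitter for stability
x_star = np.array([0.5])
mean = rbf(x_star, x) @ np.linalg.solve(K, y)
```

The prediction at x_star closely tracks the underlying function even though no explicit rule was fit, illustrating the two-views-of-one-solution point.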

  16. From databases to modelling of functional pathways.

    PubMed

    Nasi, Sergio

    2004-01-01

    This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell.

  17. Work Functions for Models of Scandate Surfaces

    NASA Technical Reports Server (NTRS)

    Mueller, Wolfgang

    1997-01-01

    The electronic structure, surface dipole properties, and work functions of scandate surfaces have been investigated using the fully relativistic scattered-wave cluster approach. Three different types of model surfaces are considered: (1) a monolayer of Ba-Sc-O on W(100), (2) Ba or BaO adsorbed on Sc2O3 + W, and (3) BaO on Sc2O3 + WO3. Changes in the work function due to Ba or BaO adsorption on the different surfaces are calculated by employing the depolarization model of interacting surface dipoles. The largest work function change and the lowest work function of 1.54 eV are obtained for Ba adsorbed on the Sc-O monolayer on W(100). The adsorption of Ba on Sc2O3 + W does not lead to a low work function, but the adsorption of BaO results in a work function of about 1.6-1.9 eV. BaO adsorbed on Sc2O3 + WO3, or scandium tungstates, may also lead to low work functions.

  18. "Green's function" approach & low-mode asymmetries

    NASA Astrophysics Data System (ADS)

    Masse, Laurent; Clark, Dan; Salmonson, Jay; MacLaren, Steve; Ma, Tammy; Khan, Shahab; Pino, Jesse; Ralph, Jo; Czajka, C.; Tipton, Robert; Landen, Otto; Kyrala, Georges

    2017-10-01

    Long wavelength, low mode asymmetries are believed to play a leading role in limiting the performance of current ICF implosions on NIF. These long wavelength modes are initiated and driven by asymmetries in the x-ray flux from the hohlraum; however, the underlying hydrodynamics of the implosion also act to amplify these asymmetries. The work presented here aims to deepen our understanding of the interplay of the drive asymmetries and the underlying implosion hydrodynamics in determining the final imploded configuration. This is accomplished through a synthesis of numerical modeling, analytic theory, and experimental data. In detail, we use a Green's function approach to connect the drive asymmetry seen by the capsule to the measured inflight and hot spot symmetries. The approach has been validated against a suite of numerical simulations. Ultimately, we hope this work will identify additional measurements to further constrain the asymmetries and increase hohlraum illumination design flexibility on the NIF. The technique and derivation of associated error bars will be presented. LLNS Contract No. DE-AC52-07NA27344.

  19. A functional approach to movement analysis and error identification in sports and physical education

    PubMed Central

    Hossner, Ernst-Joachim; Schiebl, Frank; Göhner, Ulrich

    2015-01-01

    In a hypothesis-and-theory paper, a functional approach to movement analysis in sports is introduced. In this approach, contrary to classical concepts, it is no longer the “ideal” movement of elite athletes that is taken as a template for the movements produced by learners. Instead, movements are understood as the means to solve given tasks that, in turn, are defined by to-be-achieved task goals. A functional analysis comprises the steps of (1) recognizing constraints that define the functional structure, (2) identifying sub-actions that subserve the achievement of structure-dependent goals, (3) explicating modalities as specifics of the movement execution, and (4) assigning functions to actions, sub-actions and modalities. Regarding motor-control theory, a functional approach can be linked to a dynamical-system framework of behavioral shaping, to cognitive models of modular effect-related motor control as well as to explicit concepts of goal setting and goal achievement. Finally, it is shown that a functional approach is of particular help for sports practice in the context of structuring part practice, recognizing functionally equivalent task solutions, finding innovative technique alternatives, distinguishing errors from style, and identifying root causes of movement errors. PMID:26441717

  20. Functional approach to high-throughput plant growth analysis

    PubMed Central

    2013-01-01

    Method Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module, GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA on the cfq mutant plants under fluctuating light reveals the correlation between low photosynthetic rates and small plant area (compared to wild type), which raises the hypothesis that knocking out cfq changes the sensitivity of the energy distribution under fluctuating light conditions to repress leaf growth. Availability HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
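    The growth-modeling step (GMA) fits a nonlinear growth model to the per-plant area measurements to generate growth curves. As a hedged sketch, assuming a logistic growth form and synthetic measurements (the actual HPGA model and data may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, k, t0):
    # A: asymptotic plant area, k: relative growth rate, t0: inflection time.
    return A / (1.0 + np.exp(-k * (t - t0)))

t = np.linspace(0, 30, 31)  # days after sowing (synthetic schedule)
rng = np.random.default_rng(2)
area = logistic(t, 50.0, 0.3, 15.0) + rng.normal(0, 0.5, t.size)

# Fit the growth curve; derived traits (e.g. the maximum absolute growth
# rate A*k/4, reached at t0) follow from the fitted parameters and can
# feed the subsequent functional data analysis.
(A_hat, k_hat, t0_hat), _ = curve_fit(logistic, t, area, p0=[40.0, 0.1, 10.0])
```

Fitting one such curve per plant turns noisy daily area estimates into a small set of comparable growth parameters.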

  1. Dynamic modeling approaches to characterize the functioning of health systems: A systematic review of the literature.

    PubMed

    Chang, Angela Y; Ogbuoji, Osondu; Atun, Rifat; Verguet, Stéphane

    2017-12-01

    Universal Health Coverage (UHC) is one of the targets for the United Nations Sustainable Development Goal 3. The impetus for UHC has led to an increased demand for time-sensitive tools to enhance our knowledge of how health systems function and to evaluate impact of system interventions. We define the field of "health system modeling" (HSM) as an area of research where dynamic mathematical models can be designed in order to describe, predict, and quantitatively capture the functioning of health systems. HSM can be used to explore the dynamic relationships among different system components, including organizational design, financing and other resources (such as investments in resources and supply chain management systems) - what we call "inputs" - and access, coverage, and quality of care - what we call "outputs" - toward improved health system "outcomes", namely increased levels and fairer distributions of population health and financial risk protection. We undertook a systematic review to identify the existing approaches used in HSM. We identified "systems thinking" - a conceptual and qualitative description of the critical interactions within a health system - as an important underlying precursor to HSM, and collated a critical collection of such articles. We then reviewed and categorized articles from two schools of thought: "system dynamics" (SD) and "susceptible-infected-recovered-plus" (SIR+). SD emphasizes the notion of accumulations of stocks in the system, inflows and outflows, and causal feedback structure to predict intended and unintended consequences of policy interventions. The SIR+ models link a typical disease transmission model with another that captures certain aspects of the system that impact the outcomes of the main model. These existing methods provide critical insights in informing the design of HSM, and provide a departure point to extend this research agenda. We highlight the opportunity to advance modeling methods to further understand

  2. From Databases to Modelling of Functional Pathways

    PubMed Central

    2004-01-01

    This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell. PMID:18629070

  3. Gravitational radiation during plunge - a Green's function approach

    NASA Astrophysics Data System (ADS)

    Nampalliwar, Sourabh; Price, Richard; Khanna, Gaurav

    2015-04-01

    During the merger of binary compact objects, an important stage is the plunge. A short part of the gravitational waveform, it marks the end of the early inspiral and determines the quasinormal ringing (QNR) of the final product of the merger. In this talk, we describe the approach of using the Fourier-domain Green's function in the particle perturbation approximation to understand the excitation of QNR. We show that the resulting understanding is successful in explaining QNR in toy models and in the Schwarzschild background.

  4. A Prototype Symbolic Model of Canonical Functional Neuroanatomy of the Motor System

    PubMed Central

    Rubin, Daniel L.; Halle, Michael; Musen, Mark; Kikinis, Ron

    2008-01-01

    Recent advances in bioinformatics have opened entire new avenues for organizing, integrating and retrieving neuroscientific data, in a digital, machine-processable format, which can be at the same time understood by humans, using ontological, symbolic data representations. Declarative information stored in ontological format can be perused and maintained by domain experts, interpreted by machines, and serve as basis for a multitude of decision-support, computerized simulation, data mining, and teaching applications. We have developed a prototype symbolic model of canonical neuroanatomy of the motor system. Our symbolic model is intended to support symbolic lookup, logical inference and mathematical modeling by integrating descriptive, qualitative and quantitative functional neuroanatomical knowledge. Furthermore, we show how our approach can be extended to modeling impaired brain connectivity in disease states, such as common movement disorders. In developing our ontology, we adopted a disciplined modeling approach, relying on a set of declared principles, a high-level schema, Aristotelian definitions, and a frame-based authoring system. These features, along with the use of the Unified Medical Language System (UMLS) vocabulary, enable the alignment of our functional ontology with an existing comprehensive ontology of human anatomy, and thus allow for combining the structural and functional views of neuroanatomy for clinical decision support and neuroanatomy teaching applications. Although the scope of our current prototype ontology is limited to a particular functional system in the brain, it may be possible to adapt this approach for modeling other brain functional systems as well. PMID:18164666

  5. A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping

    NASA Astrophysics Data System (ADS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-11-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
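
    The ingredients of this approach lend themselves to a short forward-modeling illustration. The sketch below simulates a damped-random-walk continuum and blurs it with a transfer function built as a sum of relatively displaced Gaussian response functions; all parameter values are hypothetical, and the paper's actual method runs in the opposite direction, inferring the Gaussian amplitudes from observed light curves.

```python
import numpy as np

rng = np.random.default_rng(0)

# Continuum light curve as a damped random walk (Ornstein-Uhlenbeck process),
# using the exact discrete-time update x[i] = x[i-1]*exp(-dt/tau) + noise.
n, dt, tau, sigma = 2000, 1.0, 200.0, 0.1   # hypothetical values
x = np.empty(n)
x[0] = 0.0
a = np.exp(-dt / tau)
for i in range(1, n):
    x[i] = x[i - 1] * a + sigma * np.sqrt(1 - a**2) * rng.standard_normal()

# Transfer function as a sum of relatively displaced Gaussian response functions.
lags = np.arange(0, 100, dt)
centers, widths, amps = [20.0, 45.0], [5.0, 8.0], [1.0, 0.5]   # hypothetical
psi = sum(A * np.exp(-0.5 * ((lags - c) / w) ** 2)
          for A, c, w in zip(amps, centers, widths))
psi /= psi.sum() * dt                      # normalize the response to unit area

# Emission-line light curve = blurred echo of the continuum variations.
line = np.convolve(x, psi, mode="full")[:n] * dt
```

    In the inverse problem the amplitudes of the displaced Gaussians become free parameters constrained by the RM data.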

  6. Fast computation of the electrolyte-concentration transfer function of a lithium-ion cell model

    NASA Astrophysics Data System (ADS)

    Rodríguez, Albert; Plett, Gregory L.; Trimboli, M. Scott

    2017-08-01

    One approach to creating physics-based reduced-order models (ROMs) of battery-cell dynamics requires first generating linearized Laplace-domain transfer functions of all cell internal electrochemical variables of interest. Then, the resulting infinite-dimensional transfer functions can be reduced by various means in order to find an approximate low-dimensional model. Such methods include Padé approximation and the Discrete-Time Realization algorithm. In a previous article, Lee and colleagues developed a transfer function of the electrolyte concentration for a porous-electrode pseudo-two-dimensional lithium-ion cell model. Their approach used separation of variables and Sturm-Liouville theory to compute an infinite-series solution to the transfer function, which they then truncated to a finite number of terms for reasons of practicality. Here, we instead use a variation-of-parameters approach to arrive at a different representation of the identical solution that does not require a series expansion. The primary benefits of the new approach are speed of computation of the transfer function and the removal of the requirement to approximate the transfer function by truncating the number of terms evaluated. Results show that the speedup of the new method can exceed a factor of 3800.
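
    As a generic illustration of the reduction step mentioned above (not the cell model itself), the sketch below fits a Padé approximant to a stand-in transcendental transfer function G(s) = exp(-s); the actual electrolyte-concentration transfer function is far more involved, and `scipy.interpolate.pade` is only one convenient route to a low-order rational approximation.

```python
import math
from scipy.interpolate import pade

# Taylor coefficients of the stand-in transfer function G(s) = exp(-s).
taylor = [(-1.0) ** k / math.factorial(k) for k in range(5)]

# [2/2] rational (Pade) approximant: two polynomials p, q with G(s) ~ p(s)/q(s).
p, q = pade(taylor, 2)

s = 0.5
approx, exact = p(s) / q(s), math.exp(-s)   # low-order model vs. exact value
```

    A low-order rational approximant like this is exactly the kind of object that yields a small state-space ROM.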

  7. Toward quantitative understanding on microbial community structure and functioning: a modeling-centered approach using degradation of marine oil spills as example

    PubMed Central

    Röling, Wilfred F. M.; van Bodegom, Peter M.

    2014-01-01

    Molecular ecology approaches are rapidly advancing our insights into the microorganisms involved in the degradation of marine oil spills and their metabolic potentials. Yet, many questions remain open: how do oil-degrading microbial communities assemble in terms of functional diversity, species abundances and organization and what are the drivers? How do the functional properties of microorganisms scale to processes at the ecosystem level? How does mass flow among species, and which factors and species control and regulate fluxes, stability and other ecosystem functions? Can generic rules on oil-degradation be derived, and what drivers underlie these rules? How can we engineer oil-degrading microbial communities such that toxic polycyclic aromatic hydrocarbons are degraded faster? These types of questions apply to the field of microbial ecology in general. We outline how recent advances in single-species systems biology might be extended to help answer these questions. We argue that bottom-up mechanistic modeling allows deciphering the respective roles and interactions among microorganisms. In particular constraint-based, metagenome-derived community-scale flux balance analysis appears suited for this goal as it allows calculating degradation-related fluxes based on physiological constraints and growth strategies, without needing detailed kinetic information. We subsequently discuss what is required to make these approaches successful, and identify a need to better understand microbial physiology in order to advance microbial ecology. We advocate the development of databases containing microbial physiological data. Answering the posed questions is far from trivial. Oil-degrading communities are, however, an attractive setting to start testing systems biology-derived models and hypotheses as they are relatively simple in diversity and key activities, with several key players being isolated and a high availability of experimental data and approaches. PMID:24723922

  8. Toward quantitative understanding on microbial community structure and functioning: a modeling-centered approach using degradation of marine oil spills as example.

    PubMed

    Röling, Wilfred F M; van Bodegom, Peter M

    2014-01-01

    Molecular ecology approaches are rapidly advancing our insights into the microorganisms involved in the degradation of marine oil spills and their metabolic potentials. Yet, many questions remain open: how do oil-degrading microbial communities assemble in terms of functional diversity, species abundances and organization and what are the drivers? How do the functional properties of microorganisms scale to processes at the ecosystem level? How does mass flow among species, and which factors and species control and regulate fluxes, stability and other ecosystem functions? Can generic rules on oil-degradation be derived, and what drivers underlie these rules? How can we engineer oil-degrading microbial communities such that toxic polycyclic aromatic hydrocarbons are degraded faster? These types of questions apply to the field of microbial ecology in general. We outline how recent advances in single-species systems biology might be extended to help answer these questions. We argue that bottom-up mechanistic modeling allows deciphering the respective roles and interactions among microorganisms. In particular constraint-based, metagenome-derived community-scale flux balance analysis appears suited for this goal as it allows calculating degradation-related fluxes based on physiological constraints and growth strategies, without needing detailed kinetic information. We subsequently discuss what is required to make these approaches successful, and identify a need to better understand microbial physiology in order to advance microbial ecology. We advocate the development of databases containing microbial physiological data. Answering the posed questions is far from trivial. Oil-degrading communities are, however, an attractive setting to start testing systems biology-derived models and hypotheses as they are relatively simple in diversity and key activities, with several key players being isolated and a high availability of experimental data and approaches.
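
    At its core, the constraint-based flux balance analysis advocated above reduces to linear programming over a stoichiometric matrix. A minimal sketch on a hypothetical three-reaction toy network (not a real metagenome-derived community model) shows the shape of the computation:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric network:  -> A (uptake),  A -> B,  B -> (biomass drain).
# Rows are metabolites A, B; columns are reaction fluxes v1, v2, v3.
S = np.array([[1, -1,  0],    # A produced by v1, consumed by v2
              [0,  1, -1]])   # B produced by v2, consumed by v3

# Maximize the biomass flux v3 subject to steady state S v = 0 and capacity
# bounds (linprog minimizes, so we negate the objective on v3).
res = linprog(c=[0, 0, -1],
              A_eq=S, b_eq=[0, 0],
              bounds=[(0, 10), (0, None), (0, None)])  # uptake capped at 10
fluxes = res.x
```

    With the uptake capacity as the only physiological constraint, the optimal degradation-related flux simply saturates that bound, which is the qualitative behavior flux balance analysis exploits without any kinetic information.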

  9. A Density Functional Approach to Polarizable Models: A Kim-Gordon-Response Density Interaction Potential for Molecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabacchi, G; Hutter, J; Mundy, C

    2005-04-07

    A combined linear response--frozen electron density model has been implemented in a molecular dynamics scheme derived from an extended Lagrangian formalism. This approach is based on a partition of the electronic charge distribution into a frozen region described by Kim-Gordon theory, and a response contribution determined by the instantaneous ionic configuration of the system. The method is free from empirical pair-potentials, and the parameterization protocol involves only calculations on properly chosen subsystems. The authors apply this method to a series of alkali halides in different physical phases and are able to reproduce experimental structural and thermodynamic properties with an accuracy comparable to Kohn-Sham density functional calculations.

  10. Different Approaches to Covariate Inclusion in the Mixture Rasch Model

    ERIC Educational Resources Information Center

    Li, Tongyun; Jiao, Hong; Macready, George B.

    2016-01-01

    The present study investigates different approaches to adding covariates and the impact in fitting mixture item response theory models. Mixture item response theory models serve as an important methodology for tackling several psychometric issues in test development, including the detection of latent differential item functioning. A Monte Carlo…
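
    For readers unfamiliar with the underlying model, a minimal sketch of a two-class mixture Rasch item response function with a covariate-dependent class weight may help; the difficulties and covariate coefficients below are hypothetical illustrations, not values from the study.

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response for ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mixture_prob(theta, z, b_class=(-0.5, 0.8), gamma=(0.0, 1.2)):
    """Two-class mixture Rasch response probability.

    Each latent class g has its own item difficulty b_class[g]; a covariate z
    shifts class membership through a logistic model (all values hypothetical).
    """
    w1 = 1.0 / (1.0 + math.exp(-(gamma[0] + gamma[1] * z)))  # P(class 1 | z)
    return w1 * rasch_prob(theta, b_class[0]) + (1 - w1) * rasch_prob(theta, b_class[1])
```

    Different ways of adding the covariate (to class membership, to ability, or to both) are exactly the modeling choices such a study compares.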

  11. Bridging the gap between measurements and modelling: a cardiovascular functional avatar.

    PubMed

    Casas, Belén; Lantz, Jonas; Viola, Federica; Cedersund, Gunnar; Bolger, Ann F; Carlhäll, Carl-Johan; Karlsson, Matts; Ebbers, Tino

    2017-07-24

    Lumped parameter models of the cardiovascular system have the potential to assist researchers and clinicians to better understand cardiovascular function. The value of such models increases when they are subject specific. However, most approaches to personalize lumped parameter models have thus far required invasive measurements or fall short of being subject specific due to a lack of the necessary clinical data. Here, we propose an approach to personalize parameters in a model of the heart and the systemic circulation using exclusively non-invasive measurements. The personalized model is created using flow data from four-dimensional magnetic resonance imaging and cuff pressure measurements in the brachial artery. We term this personalized model the cardiovascular avatar. In our proof-of-concept study, we evaluated the capability of the avatar to reproduce pressures and flows in a group of eight healthy subjects. Both quantitatively and qualitatively, the model-based results agreed well with the pressure and flow measurements obtained in vivo for each subject. This non-invasive and personalized approach can synthesize medical data into clinically relevant indicators of cardiovascular function, and estimate hemodynamic variables that cannot be assessed directly from clinical measurements.
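
    The abstract does not give the avatar's equations. As a minimal stand-in for lumped-parameter modeling of the systemic circulation, the sketch below integrates a two-element Windkessel model; the resistance, compliance, and inflow waveform are hypothetical, not the personalized values estimated in the study.

```python
import numpy as np

# Two-element Windkessel: C dP/dt = Q(t) - P/R
R, C = 1.0, 1.5            # peripheral resistance, arterial compliance (hypothetical units)
dt, T = 1e-3, 30.0
t = np.arange(0.0, T, dt)
Q = 5.0 * np.sin(2 * np.pi * t) ** 2    # crude pulsatile inflow, mean 2.5

P = np.empty_like(t)
P[0] = 0.0
for i in range(1, len(t)):
    # Forward Euler step of the pressure ODE
    P[i] = P[i - 1] + dt * (Q[i - 1] - P[i - 1] / R) / C
```

    After the initial transient, the cycle-averaged pressure settles at mean inflow times resistance, the kind of hemodynamic relation a personalized lumped model exploits.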

  12. [Mathematic concept model of accumulation of functional disorders associated with environmental factors].

    PubMed

    Zaĭtseva, N V; Trusov, P V; Kir'ianov, D A

    2012-01-01

    The mathematical concept model presented describes the accumulation of functional disorders associated with environmental factors, plays a predictive role, and is designed for assessment of possible effects caused by heterogeneous factors with variable exposures. Considering exposure changes together with the self-restoration process opens prospects of using the model to evaluate, analyse and manage occupational risks. To develop current theoretic approaches, the authors suggest a model considering age-related body peculiarities, systemic interactions of organs (including neuro-humoral regulation), accumulation of functional disorders due to external factors, and rehabilitation of functions during treatment. The general problem setting involves defining over a hundred unknown coefficients that characterize the speed of various processes within the body. To solve this problem, the authors use an iterative approach of successive identification, which starts from a primary approximation of the model parameters and performs subsequent updating on the basis of new theoretic and empirical knowledge.
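
    The accumulation-with-self-restoration idea can be made concrete with a one-compartment toy equation; the actual model couples many organs and over a hundred coefficients, and the rates below are hypothetical.

```python
# Damage burden D accumulates in proportion to exposure E and self-restores
# at rate beta:  dD/dt = alpha*E - beta*D  (all values hypothetical).
alpha, beta = 0.02, 0.1      # accumulation / restoration rates
E = 1.0                      # constant environmental exposure
dt, D = 0.1, 0.0
for _ in range(2000):        # forward Euler over 200 time units
    D += dt * (alpha * E - beta * D)
# D approaches the steady-state burden alpha*E/beta
```

    Letting alpha or beta depend on age, and E on time, gives exactly the kind of exposure-scenario analysis the model is meant to support.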

  13. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These take a different approach to model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
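
    The combinatorial complexity argument is easy to make concrete: for a hypothetical receptor with four independent modification sites plus one binding slot, an explicit ("traditional") model must enumerate every molecular species, while a rule-based specification grows only linearly in the number of sites. The numbers below are purely illustrative.

```python
from itertools import product

sites = 4
# Explicit model: one species per combination of site states; as a crude
# stand-in, the ligand-binding slot is treated as a fifth binary site.
species = list(product(("u", "p"), repeat=sites + 1))
print(len(species))   # 2**5 = 32 distinct species

# Rule-based specification: one rule per site suffices, e.g.
# "R(s) -> R(s~p)" for each phosphorylation site and "R + L <-> R.L".
rules = sites + 1
```

    Doubling the number of sites doubles the rule count but squares the species count, which is why rule-based tools scale where explicit enumeration does not.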

  14. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These take a different approach to model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.

  15. The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    PubMed Central

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554

  16. The layer-oriented approach to declarative languages for biological modeling.

    PubMed

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language.
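
    The layered mapping from domain concepts to a computational representation can be sketched as follows; the spec format and names below are hypothetical illustrations of the idea, not the paper's actual language or its semantic transformations.

```python
# Declarative "concept layer": an ionic current described with domain terms only.
spec = {"name": "K_slow", "gbar": 36.0, "erev": -77.0, "gates": [("n", 4)]}

def compile_current(spec):
    """'Math layer': map the declarative spec onto the standard conductance form
    I = gbar * prod(gate^power) * (V - erev), returning an executable function."""
    gates = spec["gates"]
    def current(V, gate_values):
        g = spec["gbar"]
        for name, power in gates:
            g *= gate_values[name] ** power
        return g * (V - spec["erev"])
    return current

i_k = compile_current(spec)
print(i_k(-65.0, {"n": 0.3}))   # current at rest-like values
```

    A further layer could instead emit source code for a specific simulator, which is the role the paper's semantic transformations play.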

  17. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions, which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to a general statistics that generalizes Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is a symmetric function, i.e., a member of the class of functions invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
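
    The generating-function construction described above is straightforward to evaluate by polynomial multiplication. The sketch below counts partitions in which each part appears at most q times, mirroring the Gentile-statistics picture: q = 1 recovers distinct-part (fermion-like) partitions, and large q recovers the unrestricted (boson-like) case.

```python
def restricted_partitions(n, q):
    """Number of partitions of n in which each part appears at most q times.

    Computed as the coefficient of x^n in prod_{k=1}^{n} (1 + x^k + ... + x^(q*k)),
    i.e. the generating function built from single-mode 'partition functions'.
    """
    coeffs = [1] + [0] * n          # start from the constant polynomial 1
    for k in range(1, n + 1):       # multiply in the factor for part size k
        new = [0] * (n + 1)
        for i, c in enumerate(coeffs):
            if c:
                for m in range(q + 1):
                    j = i + m * k
                    if j > n:
                        break
                    new[j] += c
        coeffs = new
    return coeffs[n]

print(restricted_partitions(5, 1))   # distinct parts of 5: 5, 4+1, 3+2 -> 3
print(restricted_partitions(5, 5))   # unrestricted p(5) = 7
```

    Intermediate values of q interpolate between the two classical cases, exactly as Gentile statistics interpolates between fermions and bosons.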

  18. Gain-of-function mutagenesis approaches in rice for functional genomics and improvement of crop productivity.

    PubMed

    Moin, Mazahar; Bakshi, Achala; Saha, Anusree; Dutta, Mouboni; Kirti, P B

    2017-07-01

    The ultimate aim of any genome research is to identify all the existing genes in a genome and investigate their roles. Various techniques have been applied to unveil gene functions either by silencing or over-expressing genes through targeted expression or random mutagenesis. Rice is the most appropriate model crop for generating a mutant resource for functional genomic studies because of the availability of a high-quality genome sequence and a relatively small genome size. Rice has syntenic relationships with members of other cereals. Hence, characterization of functionally unknown genes in rice will possibly provide key genetic insights and can lead to comparative genomics involving other cereals. The current review attempts to discuss the available gain-of-function mutagenesis techniques for functional genomics, emphasizing the contemporary approach of activation tagging and alterations to this method for the enhancement of yield and productivity of rice. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  19. Spinal cord injuries functional rehabilitation - Traditional approaches and new strategies in physiotherapy.

    PubMed

    de Almeida, Patrícia Maria Duarte

    2006-02-01

    Considering the loss of function of body structures and systems after a spinal cord injury, with its respective activity limitations and restrictions on social participation, the goals of the rehabilitation process are to achieve the maximal functional independence and quality of life allowed by the clinical lesion. This requires a period of rehabilitation with a rehabilitation team, including the physiotherapist, whose interventions will depend on factors such as the degree of completeness or incompleteness of the lesion and the patient's clinical stage. The physiotherapy approach includes several procedures and techniques related either to a traditional model or to the recent perspective of neuronal regeneration. Following the traditional model, intervention in complete (A) and incomplete (B) lesions is based on a compensatory method of functional rehabilitation using the non-affected muscles. In incomplete (C and D) lesions, motor re-education below the lesion, using key points to facilitate normal and selective patterns of movement, is preferable. Alternatively, if neuronal regeneration with a corresponding improvement in function is possible, the goals of the physiotherapy approach are to maintain muscular trophism and improve the recruitment of motor units using intensive techniques. In both cases there is little scientific evidence to support the procedures: investigation is lacking, and most of the research is methodologically poor. © 2006 Sociedade Portuguesa de Pneumologia/SPP.

  20. Comparison and Contrast of Two General Functional Regression Modeling Frameworks.

    PubMed

    Morris, Jeffrey S

    2017-02-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable.

  1. An exploratory data analysis of electroencephalograms using the functional boxplots approach

    PubMed Central

    Ngo, Duy; Sun, Ying; Genton, Marc G.; Wu, Jennifer; Srinivasan, Ramesh; Cramer, Steven C.; Ombao, Hernando

    2015-01-01

    Many model-based methods have been developed over the last several decades for analysis of electroencephalograms (EEGs) in order to understand electrical neural data. In this work, we propose to use the functional boxplot (FBP) to analyze log periodograms of EEG time series data in the spectral domain. The functional boxplot approach produces a median curve—which is not equivalent to connecting medians obtained from frequency-specific boxplots. In addition, this approach identifies a functional median, summarizes variability, and detects potential outliers. By extending FBP analysis from one-dimensional curves to surfaces, surface boxplots are also used to explore the variation of the spectral power for the alpha (8–12 Hz) and beta (16–32 Hz) frequency bands across the brain cortical surface. Using rank-based nonparametric tests, we also investigate the stationarity of EEG traces by comparing the spectrum during the early vs. late phases of a single resting-state EEG exam. PMID:26347598
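
    The modified band depth underlying functional boxplots is easy to sketch directly. The toy curves below are synthetic, and this is a bare-bones O(n²T) implementation for illustration, not the authors' tooling.

```python
import numpy as np

def modified_band_depth(Y):
    """Modified band depth of each curve in Y (n_curves x n_timepoints):
    the average, over all pairs of curves, of the fraction of time points at
    which a curve lies inside the band spanned by the pair."""
    n, T = Y.shape
    depth = np.zeros(n)
    for j in range(n):
        count = 0.0
        for i in range(n):
            for k in range(i + 1, n):
                lo = np.minimum(Y[i], Y[k])
                hi = np.maximum(Y[i], Y[k])
                count += np.mean((Y[j] >= lo) & (Y[j] <= hi))
        depth[j] = count / (n * (n - 1) / 2)
    return depth

# Synthetic log-periodogram-like curves: vertical shifts of a common shape.
t = np.linspace(0, 2 * np.pi, 100)
curves = np.array([np.sin(t) + c for c in (-2.0, -1.0, 0.0, 1.0, 2.0)])
depth = modified_band_depth(curves)
median_curve = curves[np.argmax(depth)]   # the functional median
```

    The deepest curve is the central one, which is exactly the functional median a functional boxplot is built around; outliers are flagged as curves escaping the band around the deepest 50%.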

  2. A very efficient approach to compute the first-passage probability density function in a time-changed Brownian model: Applications in finance

    NASA Astrophysics Data System (ADS)

    Ballestra, Luca Vincenzo; Pacelli, Graziella; Radi, Davide

    2016-12-01

    We propose a numerical method to compute the first-passage probability density function in a time-changed Brownian model. In particular, we derive an integral representation of such a density function in which the integrand functions must be obtained by solving a system of Volterra equations of the first kind. In addition, we develop an ad hoc numerical procedure to regularize and solve this system of integral equations. The proposed method is tested on three application problems of interest in mathematical finance, namely the calculation of the survival probability of an indebted firm, the pricing of a single-knock-out put option and the pricing of a double-knock-out put option. The results obtained reveal that the novel approach is extremely accurate and fast, and performs significantly better than the finite difference method.
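
    As a sanity check on the quantity being computed (not the paper's Volterra-equation method), the first-passage probability of standard Brownian motion over a fixed barrier has a closed form by the reflection principle, which a crude Euler Monte Carlo reproduces up to a small discrete-monitoring bias.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Probability that standard Brownian motion crosses barrier a by time t_max:
# closed form P(tau <= t) = 2 * (1 - Phi(a / sqrt(t))) by the reflection principle.
a, t_max, dt, n_paths = 1.0, 1.0, 1e-3, 5000
steps = rng.standard_normal((n_paths, int(t_max / dt))) * np.sqrt(dt)
paths_max = np.cumsum(steps, axis=1).max(axis=1)
mc_prob = np.mean(paths_max >= a)   # discrete monitoring slightly under-counts

Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
exact = 2.0 * (1.0 - Phi(a / sqrt(t_max)))
```

    A time change of the Brownian clock destroys this closed form, which is where the integral-equation approach of the paper comes in.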

  3. Interprofessional approach for teaching functional knee joint anatomy.

    PubMed

    Meyer, Jakob J; Obmann, Markus M; Gießler, Marianne; Schuldis, Dominik; Brückner, Ann-Kathrin; Strohm, Peter C; Sandeck, Florian; Spittau, Björn

    2017-03-01

    Profound knowledge in functional and clinical anatomy is a prerequisite for efficient diagnosis in medical practice. However, anatomy teaching does not always consider functional and clinical aspects. Here we introduce a new interprofessional approach to effectively teach the anatomy of the knee joint. The presented teaching approach brings together anatomists, orthopaedists and physical therapists to teach the anatomy of the knee joint in small groups under functional and clinical aspects. The knee joint courses were implemented during early stages of the medical curriculum, and medical students were grouped with students of physical therapy to sensitize students to the importance of interprofessional work. Evaluation results clearly demonstrate that medical students and physical therapy students appreciated this teaching approach. First evaluations of subsequent curricular anatomy exams suggest that course participants benefited on knee-related multiple-choice questions. Together, the interprofessional approach presented here proves suitable for teaching functional and clinical anatomy of the knee joint and further trains interprofessional work between prospective physicians and physical therapists as a basis for successful healthcare management. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  4. Measuring Work Functioning: Validity of a Weighted Composite Work Functioning Approach.

    PubMed

    Boezeman, Edwin J; Sluiter, Judith K; Nieuwenhuijsen, Karen

    2015-09-01

    To examine the construct validity of a weighted composite work functioning measurement approach. Workers (health-impaired/healthy) (n = 117) completed a composite measure survey that recorded four central work functioning aspects with existing scales: capacity to work, quality of work performance, quantity of work, and recovery from work. Previously derived weights reflecting the relative importance of these aspects of work functioning were used to calculate the composite weighted work functioning score of the workers. Work role functioning, productivity, and quality of life were used for validation. Correlations were calculated and norms applied to examine convergent and divergent construct validity. A t test was conducted and a norm applied to examine discriminative construct validity. Overall the weighted composite work functioning measure demonstrated construct validity. As predicted, the weighted composite score correlated (p < .001) strongly (r > .60) with work role functioning and productivity (convergent construct validity), and moderately (.30 < r < .60) with physical quality of life and less strongly than work role functioning and productivity with mental quality of life (divergent validity). Further, the weighted composite measure detected, with a large effect size (Cohen's d > .80), that health-impaired workers show significantly worse work functioning than healthy workers (discriminative validity). The weighted composite work functioning measurement approach takes into account the relative importance of the different work functioning aspects and demonstrated good convergent, fair divergent, and good discriminative construct validity.
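
    The composite score itself is a simple importance-weighted sum over the four aspects. The aspect scores and weights below are made-up illustrations, not the study's derived weights.

```python
# Hypothetical aspect scores (0-100 scales) for one worker.
aspects = {"capacity": 70.0, "quality": 80.0, "quantity": 60.0, "recovery": 50.0}
# Hypothetical relative-importance weights, normalized to sum to 1.
weights = {"capacity": 0.35, "quality": 0.25, "quantity": 0.25, "recovery": 0.15}

composite = sum(weights[k] * aspects[k] for k in aspects)
print(composite)
```

    Unequal weights are the whole point of the approach: an unweighted mean would treat a drop in recovery from work as seriously as an equal drop in capacity to work.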

  5. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. An implemented example illustrates how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.

  6. A Unified Approach to Functional Principal Component Analysis and Functional Multiple-Set Canonical Correlation.

    PubMed

    Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S

    2017-06-01

    Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
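    The variance-explaining component extraction at the heart of FPCA can be sketched on synthetic curves. This toy version reduces curves observed on a common grid via an SVD of the centred data matrix; the paper's basis-function smoothing and the FMCCA cross-set association terms are omitted, and all data below are synthetic.

    ```python
    # Toy FPCA sketch: extract components explaining most variance in a
    # set of smooth curves (synthetic data; basis smoothing omitted).
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 50)

    # 30 synthetic curves: random mixtures of two smooth "true" components
    # plus a little observation noise.
    scores = rng.normal(size=(30, 2))
    basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    curves = scores @ basis + 0.05 * rng.normal(size=(30, 50))

    centred = curves - curves.mean(axis=0)
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)

    explained = s**2 / np.sum(s**2)   # variance explained per component
    print(np.round(explained[:3], 3)) # first two components dominate
    ```

    The rows of `Vt` play the role of the functional principal components; the unified criterion in the paper additionally rewards components whose subject-level scores correlate across datasets.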

  7. A Latent Class Approach to Fitting the Weighted Euclidean Model, CLASCAL.

    ERIC Educational Resources Information Center

    Winsberg, Suzanne; De Soete, Geert

    1993-01-01

    A weighted Euclidean distance model is proposed that incorporates a latent class approach (CLASCAL). The contribution of each dimension to the distance function between two stimuli is weighted identically for all subjects in the same latent class. A model selection strategy is proposed and illustrated. (SLD)
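    The class-specific distance just described can be sketched directly: all subjects in latent class c share one weight vector over the stimulus dimensions. The coordinates and weights below are hypothetical.

    ```python
    # Minimal sketch of the CLASCAL weighted Euclidean distance.
    import numpy as np

    def clascal_distance(x_i, x_j, w_c):
        """Weighted Euclidean distance between two stimuli for latent class c."""
        diff = np.asarray(x_i) - np.asarray(x_j)
        return float(np.sqrt(np.sum(w_c * diff**2)))

    # Two stimuli in a 2-D space; two latent classes weighting the
    # dimensions differently (hypothetical values).
    x1, x2 = np.array([0.0, 0.0]), np.array([3.0, 4.0])
    w_class1 = np.array([1.0, 1.0])   # class 1: plain Euclidean
    w_class2 = np.array([4.0, 0.25])  # class 2: stresses dimension 1

    print(clascal_distance(x1, x2, w_class1))  # 5.0 (3-4-5 triangle)
    print(clascal_distance(x1, x2, w_class2))  # sqrt(4*9 + 0.25*16)
    ```

    Fitting the full model additionally estimates the class memberships and weights from dissimilarity judgments, which is what the latent class machinery in the paper handles.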

  8. Metalloproteomics: Forward and Reverse Approaches in Metalloprotein Structural and Functional Characterization

    PubMed Central

    Shi, Wuxian; Chance, Mark R.

    2010-01-01

    About one-third of all proteins are associated with a metal. Metalloproteomics is defined as the structural and functional characterization of metalloproteins on a genome-wide scale. The methodologies utilized in metalloproteomics, including both forward (bottom-up) and reverse (top-down) technologies, to provide information on the identity, quantity and function of metalloproteins are discussed. Important techniques frequently employed in metalloproteomics include classical proteomics tools such as mass spectrometry and 2-D gels, immobilized-metal affinity chromatography, bioinformatics sequence analysis and homology modeling, X-ray absorption spectroscopy and other synchrotron radiation based tools. Combinative applications of these techniques provide a powerful approach to understand the function of metalloproteins. PMID:21130021

  9. Identification of functional differences in metabolic networks using comparative genomics and constraint-based models.

    PubMed

    Hamilton, Joshua J; Reed, Jennifer L

    2012-01-01

    Genome-scale network reconstructions are useful tools for understanding cellular metabolism, and comparisons of such reconstructions can provide insight into metabolic differences between organisms. Recent efforts toward comparing genome-scale models have focused primarily on aligning metabolic networks at the reaction level and then looking at differences and similarities in reaction and gene content. However, these reaction comparison approaches are time-consuming and do not identify the effect network differences have on the functional states of the network. We have developed a bilevel mixed-integer programming approach, CONGA, to identify functional differences between metabolic networks by comparing network reconstructions aligned at the gene level. We first identify orthologous genes across two reconstructions and then use CONGA to identify conditions under which differences in gene content give rise to differences in metabolic capabilities. By seeking genes whose deletion in one or both models disproportionately changes flux through a selected reaction (e.g., growth or by-product secretion) in one model over another, we are able to identify structural metabolic network differences enabling unique metabolic capabilities. Using CONGA, we explore functional differences between two metabolic reconstructions of Escherichia coli and identify a set of reactions responsible for chemical production differences between the two models. We also use this approach to aid in the development of a genome-scale model of Synechococcus sp. PCC 7002. Finally, we propose potential antimicrobial targets in Mycobacterium tuberculosis and Staphylococcus aureus based on differences in their metabolic capabilities. Through these examples, we demonstrate that a gene-centric approach to comparing metabolic networks allows for a rapid comparison of metabolic models at a functional level. Using CONGA, we can identify differences in reaction and gene content which give rise to different

  10. Identification of Functional Differences in Metabolic Networks Using Comparative Genomics and Constraint-Based Models

    PubMed Central

    Hamilton, Joshua J.; Reed, Jennifer L.

    2012-01-01

    Genome-scale network reconstructions are useful tools for understanding cellular metabolism, and comparisons of such reconstructions can provide insight into metabolic differences between organisms. Recent efforts toward comparing genome-scale models have focused primarily on aligning metabolic networks at the reaction level and then looking at differences and similarities in reaction and gene content. However, these reaction comparison approaches are time-consuming and do not identify the effect network differences have on the functional states of the network. We have developed a bilevel mixed-integer programming approach, CONGA, to identify functional differences between metabolic networks by comparing network reconstructions aligned at the gene level. We first identify orthologous genes across two reconstructions and then use CONGA to identify conditions under which differences in gene content give rise to differences in metabolic capabilities. By seeking genes whose deletion in one or both models disproportionately changes flux through a selected reaction (e.g., growth or by-product secretion) in one model over another, we are able to identify structural metabolic network differences enabling unique metabolic capabilities. Using CONGA, we explore functional differences between two metabolic reconstructions of Escherichia coli and identify a set of reactions responsible for chemical production differences between the two models. We also use this approach to aid in the development of a genome-scale model of Synechococcus sp. PCC 7002. Finally, we propose potential antimicrobial targets in Mycobacterium tuberculosis and Staphylococcus aureus based on differences in their metabolic capabilities. Through these examples, we demonstrate that a gene-centric approach to comparing metabolic networks allows for a rapid comparison of metabolic models at a functional level. Using CONGA, we can identify differences in reaction and gene content which give rise to different

  11. Integrating Environmental Genomics and Biogeochemical Models: a Gene-centric Approach

    NASA Astrophysics Data System (ADS)

    Reed, D. C.; Algar, C. K.; Huber, J. A.; Dick, G.

    2013-12-01

    Rapid advances in molecular microbial ecology have yielded an unprecedented amount of data about the evolutionary relationships and functional traits of microbial communities that regulate global geochemical cycles. Biogeochemical models, however, are trailing in the wake of the environmental genomics revolution and such models rarely incorporate explicit representations of bacteria and archaea, nor are they compatible with nucleic acid or protein sequence data. Here, we present a functional gene-based framework for describing microbial communities in biogeochemical models that uses genomics data and provides predictions that are readily testable using cutting-edge molecular tools. To demonstrate the approach in practice, nitrogen cycling in the Arabian Sea oxygen minimum zone (OMZ) was modelled to examine key questions about cryptic sulphur cycling and dinitrogen production pathways in OMZs. By directly linking geochemical dynamics to the genetic composition of microbial communities, the method provides mechanistic insights into patterns and biogeochemical consequences of marine microbes. Such an approach is critical for informing our understanding of the key role microbes play in modulating Earth's biogeochemistry.

  12. Functional linear models for zero-inflated count data with application to modeling hospitalizations in patients on dialysis.

    PubMed

    Sentürk, Damla; Dalrymple, Lorien S; Nguyen, Danh V

    2014-11-30

    We propose functional linear models for zero-inflated count data with a focus on the functional hurdle and functional zero-inflated Poisson (ZIP) models. Whereas the hurdle model assumes the counts come from a mixture of a degenerate distribution at zero and a zero-truncated Poisson distribution, the ZIP model considers a mixture of a degenerate distribution at zero and a standard Poisson distribution. We extend the generalized functional linear model framework with a functional predictor and multiple cross-sectional predictors to model counts generated by a mixture distribution. We propose an estimation procedure for functional hurdle and ZIP models, called penalized reconstruction, geared towards error-prone and sparsely observed longitudinal functional predictors. The approach relies on dimension reduction and pooling of information across subjects involving basis expansions and penalized maximum likelihood techniques. The developed functional hurdle model is applied to modeling hospitalizations within the first 2 years from initiation of dialysis, with a high percentage of zeros, in the Comprehensive Dialysis Study participants. Hospitalization counts are modeled as a function of sparse longitudinal measurements of serum albumin concentrations, patient demographics, and comorbidities. Simulation studies are used to study finite sample properties of the proposed method and include comparisons with an adaptation of standard principal components regression. Copyright © 2014 John Wiley & Sons, Ltd.
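    The two mixtures contrasted above differ only in how the zero mass interacts with the Poisson part, which a small sketch makes concrete (the mixture parameters below are hypothetical):

    ```python
    # ZIP:    P(0) = pi + (1-pi) e^{-lam};  P(k) = (1-pi) Pois(k; lam), k >= 1
    # Hurdle: P(0) = p0;  P(k) = (1-p0) Pois(k; lam) / (1 - e^{-lam}), k >= 1
    import math

    def zip_pmf(k, pi, lam):
        pois = math.exp(-lam) * lam**k / math.factorial(k)
        return pi + (1 - pi) * pois if k == 0 else (1 - pi) * pois

    def hurdle_pmf(k, p0, lam):
        if k == 0:
            return p0
        pois = math.exp(-lam) * lam**k / math.factorial(k)
        return (1 - p0) * pois / (1 - math.exp(-lam))  # zero-truncated Poisson

    # Both place extra mass at zero relative to a plain Poisson(lam).
    pi, p0, lam = 0.3, 0.4, 2.0
    print(zip_pmf(0, pi, lam), hurdle_pmf(0, p0, lam))
    ```

    In the functional versions, `pi` (or `p0`) and `lam` are linked to the functional and cross-sectional predictors through regression equations estimated by penalized maximum likelihood.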

  13. Bayesian modelling of lung function data from multiple-breath washout tests.

    PubMed

    Mahar, Robert K; Carlin, John B; Ranganathan, Sarath; Ponsonby, Anne-Louise; Vuillermin, Peter; Vukcevic, Damjan

    2018-05-30

    Paediatric respiratory researchers have widely adopted the multiple-breath washout (MBW) test because it allows assessment of lung function in unsedated infants and is well suited to longitudinal studies of lung development and disease. However, a substantial proportion of MBW tests in infants fail current acceptability criteria. We hypothesised that a model-based approach to analysing the data, in place of traditional simple empirical summaries, would enable more efficient use of these tests. We therefore developed a novel statistical model for infant MBW data and applied it to 1197 tests from 432 individuals from a large birth cohort study. We focus on Bayesian estimation of the lung clearance index, the most commonly used summary of lung function from MBW tests. Our results show that the model provides an excellent fit to the data and shed further light on statistical properties of the standard empirical approach. Furthermore, the modelling approach enables the lung clearance index to be estimated by using tests with different degrees of completeness, something not possible with the standard approach. Our model therefore allows previously unused data to be used rather than discarded, as well as routine use of shorter tests without significant loss of precision. Beyond our specific application, our work illustrates a number of important aspects of Bayesian modelling in practice, such as the importance of hierarchical specifications to account for repeated measurements and the value of model checking via posterior predictive distributions. Copyright © 2018 John Wiley & Sons, Ltd.
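    The lung clearance index that the model estimates can be illustrated on an idealised single-compartment washout: each breath dilutes the tracer by FRC/(FRC + VT), and the LCI is the number of lung turnovers needed to reach the standard 2.5% end-point. The volumes below are hypothetical; the paper models real, noisy multiple-breath traces hierarchically rather than this deterministic caricature.

    ```python
    # Illustrative LCI computation from a simulated one-compartment washout.
    FRC = 0.20   # functional residual capacity, litres (hypothetical)
    VT = 0.05    # tidal volume, litres (hypothetical)

    conc = 1.0                 # normalised starting tracer concentration
    cumulative_expired = 0.0
    breaths = 0
    while conc > 1.0 / 40.0:            # standard 1/40th (2.5%) end-point
        conc *= FRC / (FRC + VT)        # dilution per breath
        cumulative_expired += VT
        breaths += 1

    lci = cumulative_expired / FRC      # lung turnovers to reach the end-point
    print(breaths, round(lci, 2))       # 17 breaths, LCI = 4.25
    ```

    A test failing the completeness criteria stops before the end-point is reached; the Bayesian model can still estimate the LCI from such a truncated trace, which is the efficiency gain the abstract refers to.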

  14. Exploring a microbial ecosystem approach to modeling deep ocean biogeochemical cycles

    NASA Astrophysics Data System (ADS)

    Zakem, E.; Follows, M. J.

    2014-12-01

    Though microbial respiration of organic matter in the deep ocean governs ocean and atmosphere biogeochemistry, it is not represented mechanistically in current global biogeochemical models. We seek approaches that are feasible at global resolution, yet still reflect the enormous biodiversity of the deep microbial community and its associated metabolic pathways. We present a modeling framework grounded in thermodynamics and redox reaction stoichiometry that represents diverse microbial metabolisms explicitly. We describe a bacterial/archaeal functional type with two parameters: a growth efficiency representing the chemistry underlying a bacterial metabolism, and a rate limitation given by the rate of uptake of each of the necessary substrates for that metabolism. We then apply this approach to answer questions about microbial ecology. As a start, we resolve two dominant heterotrophic respiratory pathways, the reduction of oxygen and nitrate, and associated microbial functional types. We combine these into an ecological model and a two-dimensional ocean circulation model to explore the organization, biogeochemistry, and ecology of oxygen minimum zones. Intensified upwelling and lateral transport conspire to produce an oxygen minimum at mid-depth, populated by anaerobic denitrifiers. This modeling approach should ultimately allow for the emergence of bacterial biogeography from competition of metabolisms and for the incorporation of microbial feedbacks to the climate system.
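    The two-parameter functional type described above can be sketched as growth efficiency times the most limiting substrate uptake rate, with each uptake following Michaelis-Menten kinetics. The parameter values and the specific functional form below are illustrative assumptions, not the paper's calibrated model.

    ```python
    # Sketch of a microbial functional type: yield (growth efficiency)
    # times a Liebig-style minimum over required substrate uptake rates.
    def growth_rate(substrates, vmax, k_half, efficiency):
        """Growth rate limited by the scarcest required substrate."""
        uptake = min(v * s / (k + s)
                     for s, v, k in zip(substrates, vmax, k_half))
        return efficiency * uptake

    # Aerobic heterotroph needing organic matter and O2
    # (hypothetical concentrations and kinetic constants).
    rate = growth_rate(substrates=[10.0, 1.0],   # organic matter, O2
                       vmax=[1.0, 1.0],
                       k_half=[5.0, 0.5],
                       efficiency=0.2)
    print(round(rate, 4))
    ```

    A nitrate-reducing type would be represented the same way with nitrate replacing O2 and a lower efficiency reflecting its redox chemistry, so competition between metabolisms emerges from the substrate fields alone.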

  15. Comparison and Contrast of Two General Functional Regression Modeling Frameworks

    PubMed Central

    Morris, Jeffrey S.

    2017-01-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework but has many differences as well. In this discussion, I compare and contrast these two frameworks to illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502

  16. Linking density functional and mode coupling models for supercooled liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Premkumar, Leishangthem; Bidhoodi, Neeta; Das, Shankar P.

    2016-03-28

    We compare predictions from two familiar models of the metastable supercooled liquid, constructed with thermodynamic and dynamic approaches, respectively. In the so-called density functional theory, the free energy F[ρ] of the liquid is a functional of the inhomogeneous density ρ(r). The metastable state is identified as a local minimum of F[ρ]. The sharp density profile characterizing ρ(r) is identified with a single particle oscillator, whose frequency is obtained from the parameters of the optimum density function. On the other hand, a dynamic approach to supercooled liquids is taken in the mode coupling theory (MCT), which predicts a sharp ergodicity-non-ergodicity transition at a critical density. The single particle dynamics in the non-ergodic state, treated approximately, represents a propagating mode whose characteristic frequency is computed from the corresponding memory function of the MCT. The mass localization parameters in the above two models (treated in their simplest forms) are obtained in terms of the corresponding natural frequencies and are shown to have comparable magnitudes.

  17. A validated approach for modeling collapse of steel structures

    NASA Astrophysics Data System (ADS)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  18. Network diffusion accurately models the relationship between structural and functional brain connectivity networks

    PubMed Central

    Abdelnour, Farras; Voss, Henning U.; Raj, Ashish

    2014-01-01

    The relationship between anatomic connectivity of large-scale brain networks and their functional connectivity is of immense importance and an area of active research. Previous attempts have required complex simulations which model the dynamics of each cortical region, and explore the coupling between regions as derived by anatomic connections. While much insight is gained from these non-linear simulations, they can be computationally taxing tools for predicting functional from anatomic connectivities. Little attention has been paid to linear models. Here we show that a properly designed linear model appears to be superior to previous non-linear approaches in capturing the brain’s long-range second order correlation structure that governs the relationship between anatomic and functional connectivities. We derive a linear network of brain dynamics based on graph diffusion, whereby the diffusing quantity undergoes a random walk on a graph. We test our model using subjects who underwent diffusion MRI and resting state fMRI. The network diffusion model applied to the structural networks largely predicts the correlation structures derived from their fMRI data, to a greater extent than other approaches. The utility of the proposed approach is that it can routinely be used to infer functional correlation from anatomic connectivity. And since it is linear, anatomic connectivity can also be inferred from functional data. The success of our model confirms the linearity of ensemble average signals in the brain, and implies that their long-range correlation structure may percolate within the brain via purely mechanistic processes enacted on its structural connectivity pathways. PMID:24384152
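    The graph diffusion model above predicts functional connectivity as a matrix exponential of the structural graph's Laplacian. The sketch below uses a toy 4-node structural matrix and a hypothetical diffusion depth `beta`; the paper applies the same idea (with appropriate Laplacian normalisation and parameter fitting) to whole-brain networks from diffusion MRI.

    ```python
    # Network diffusion sketch: predicted functional connectivity
    # F = exp(-beta * L), computed here via the eigendecomposition of the
    # symmetric Laplacian (scipy.linalg.expm would work equally well).
    import numpy as np

    # Toy symmetric structural connectivity (e.g., fibre counts).
    C = np.array([[0., 2., 1., 0.],
                  [2., 0., 0., 1.],
                  [1., 0., 0., 3.],
                  [0., 1., 3., 0.]])

    L = np.diag(C.sum(axis=1)) - C   # graph Laplacian
    beta = 0.5                       # diffusion depth (hypothetical)

    w, V = np.linalg.eigh(-beta * L)
    F = V @ np.diag(np.exp(w)) @ V.T  # matrix exponential exp(-beta * L)
    print(np.round(F, 3))
    ```

    Because `L` annihilates the constant vector, each row of `F` sums to one: the diffusing quantity is conserved as it spreads over the structural pathways, which is the random-walk interpretation in the abstract.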

  19. Functional Linear Model with Zero-value Coefficient Function at Sub-regions.

    PubMed

    Zhou, Jianhui; Wang, Nae-Yuh; Wang, Naisyin

    2013-01-01

    We propose a shrinkage method to estimate the coefficient function in a functional linear regression model when the value of the coefficient function is zero within certain sub-regions. Besides identifying the null region in which the coefficient function is zero, we also aim to perform estimation and inferences for the nonparametrically estimated coefficient function without over-shrinking the values. Our proposal consists of two stages. In stage one, the Dantzig selector is employed to provide initial location of the null region. In stage two, we propose a group SCAD approach to refine the estimated location of the null region and to provide the estimation and inference procedures for the coefficient function. Our considerations have certain advantages in this functional setup. One goal is to reduce the number of parameters employed in the model. With a one-stage procedure, it is needed to use a large number of knots in order to precisely identify the zero-coefficient region; however, the variation and estimation difficulties increase with the number of parameters. Owing to the additional refinement stage, we avoid this necessity and our estimator achieves superior numerical performance in practice. We show that our estimator enjoys the Oracle property; it identifies the null region with probability tending to 1, and it achieves the same asymptotic normality for the estimated coefficient function on the non-null region as the functional linear model estimator when the non-null region is known. Numerically, our refined estimator overcomes the shortcomings of the initial Dantzig estimator which tends to under-estimate the absolute scale of non-zero coefficients. The performance of the proposed method is illustrated in simulation studies. We apply the method in an analysis of data collected by the Johns Hopkins Precursors Study, where the primary interests are in estimating the strength of association between body mass index in midlife and the quality of life in

  20. Toward a multiscale modeling framework for understanding serotonergic function

    PubMed Central

    Wong-Lin, KongFatt; Wang, Da-Hui; Moustafa, Ahmed A; Cohen, Jeremiah Y; Nakamura, Kae

    2017-01-01

    Despite its importance in regulating emotion and mental wellbeing, the complex structure and function of the serotonergic system present formidable challenges toward understanding its mechanisms. In this paper, we review studies investigating the interactions between serotonergic and related brain systems and their behavior at multiple scales, with a focus on biologically-based computational modeling. We first discuss serotonergic intracellular signaling and neuronal excitability, followed by neuronal circuit and systems levels. At each level of organization, we will discuss the experimental work accompanied by related computational modeling work. We then suggest that a multiscale modeling approach that integrates the various levels of neurobiological organization could potentially transform the way we understand the complex functions associated with serotonin. PMID:28417684

  1. Spectral functions with the density matrix renormalization group: Krylov-space approach for correction vectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    Frequency-dependent correlations, such as the spectral function and the dynamical structure factor, help illustrate condensed matter experiments. Within the density matrix renormalization group (DMRG) framework, an accurate method for calculating spectral functions directly in frequency is the correction-vector method. The correction vector can be computed by solving a linear equation or by minimizing a functional. Our paper proposes an alternative to calculate the correction vector: to use the Krylov-space approach. This paper also studies the accuracy and performance of the Krylov-space approach when applied to the Heisenberg, the t-J, and the Hubbard models. The cases we studied indicate that the Krylov-space approach can be more accurate and efficient than the conjugate gradient, and that the error of the former integrates best when a Krylov-space decomposition is also used for ground state DMRG.

  2. Spectral functions with the density matrix renormalization group: Krylov-space approach for correction vectors

    DOE PAGES

    None, None

    2016-11-21

    Frequency-dependent correlations, such as the spectral function and the dynamical structure factor, help illustrate condensed matter experiments. Within the density matrix renormalization group (DMRG) framework, an accurate method for calculating spectral functions directly in frequency is the correction-vector method. The correction vector can be computed by solving a linear equation or by minimizing a functional. Our paper proposes an alternative to calculate the correction vector: to use the Krylov-space approach. This paper also studies the accuracy and performance of the Krylov-space approach when applied to the Heisenberg, the t-J, and the Hubbard models. The cases we studied indicate that the Krylov-space approach can be more accurate and efficient than the conjugate gradient, and that the error of the former integrates best when a Krylov-space decomposition is also used for ground state DMRG.

  3. Towards refactoring the Molecular Function Ontology with a UML profile for function modeling.

    PubMed

    Burek, Patryk; Loebe, Frank; Herre, Heinrich

    2017-10-04

    Gene Ontology (GO) is the largest resource for cataloging gene products. This resource grows steadily and, naturally, this growth raises issues regarding the structure of the ontology. Moreover, modeling and refactoring large ontologies such as GO is generally far from being simple, as a whole as well as when focusing on certain aspects or fragments. It seems that human-friendly graphical modeling languages such as the Unified Modeling Language (UML) could be helpful in connection with these tasks. We investigate the use of UML for making the structural organization of the Molecular Function Ontology (MFO), a sub-ontology of GO, more explicit. More precisely, we present a UML dialect, called the Function Modeling Language (FueL), which is suited for capturing functions in an ontologically founded way. FueL is equipped, among other features, with language elements that arise from studying patterns of subsumption between functions. We show how to use this UML dialect for capturing the structure of molecular functions. Furthermore, we propose and discuss some refactoring options concerning fragments of MFO. FueL enables the systematic, graphical representation of functions and their interrelations, including making information explicit that is currently either implicit in MFO or is mainly captured in textual descriptions. Moreover, the considered subsumption patterns lend themselves to the methodical analysis of refactoring options with respect to MFO. On this basis we argue that the approach can increase the comprehensibility of the structure of MFO for humans and can support communication, for example, during revision and further development.

  4. Detection of Differential Item Functioning with Nonlinear Regression: A Non-IRT Approach Accounting for Guessing

    ERIC Educational Resources Information Center

    Drabinová, Adéla; Martinková, Patrícia

    2017-01-01

    In this article we present a general approach not relying on item response theory models (non-IRT) to detect differential item functioning (DIF) in dichotomous items with presence of guessing. The proposed nonlinear regression (NLR) procedure for DIF detection is an extension of method based on logistic regression. As a non-IRT approach, NLR can…
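    The item model underlying the NLR procedure is a logistic curve with a lower asymptote for guessing, whose location and slope may shift by group; DIF is detected by testing whether those group shifts are nonzero. The parameterisation and values below are a hypothetical illustration of that idea, not the authors' exact estimation code.

    ```python
    # Sketch of a guessing-aware nonlinear item model for DIF detection.
    import numpy as np

    def nlr_item(x, g, a, b, a_dif=0.0, b_dif=0.0, c=0.2):
        """P(correct) for ability x and group g (0 = reference, 1 = focal).

        c is the lower asymptote (guessing); a_dif, b_dif are the
        group shifts whose significance signals DIF.
        """
        z = (a + a_dif * g) * (x - (b + b_dif * g))
        return c + (1 - c) / (1 + np.exp(-z))

    x = np.linspace(-3, 3, 7)
    p_ref = nlr_item(x, g=0, a=1.5, b=0.0)
    p_foc = nlr_item(x, g=1, a=1.5, b=0.0, b_dif=0.8)  # uniform DIF:
                                                       # item harder for focal group
    print(np.round(p_ref - p_foc, 3))
    ```

    In practice the curve is fitted to observed responses (e.g., by nonlinear least squares) with total test score standing in for ability, and a model comparison tests `a_dif = b_dif = 0`.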

  5. A biopsychosocial approach to women's sexual function and dysfunction at midlife: A narrative review.

    PubMed

    Thomas, Holly N; Thurston, Rebecca C

    2016-05-01

    A satisfying sex life is an important component of overall well-being, but sexual dysfunction is common, especially in midlife women. The aim of this review is (a) to define sexual function and dysfunction, (b) to present theoretical models of female sexual response, (c) to examine longitudinal studies of how sexual function changes during midlife, and (d) to review treatment options. Four types of female sexual dysfunction are currently recognized: Female Orgasmic Disorder, Female Sexual Interest/Arousal Disorder, Genito-Pelvic Pain/Penetration Disorder, and Substance/Medication-Induced Sexual Dysfunction. However, optimal sexual function transcends the simple absence of dysfunction. A biopsychosocial approach that simultaneously considers physical, psychological, sociocultural, and interpersonal factors is necessary to guide research and clinical care regarding women's sexual function. Most longitudinal studies reveal an association between advancing menopause status and worsening sexual function. Psychosocial variables, such as availability of a partner, relationship quality, and psychological functioning, also play an integral role. Future directions for research should include deepening our understanding of how sexual function changes with aging and developing safe and effective approaches to optimizing women's sexual function with aging. Overall, holistic, biopsychosocial approaches to women's sexual function are necessary to fully understand and treat this key component of midlife women's well-being. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Developmental Programming of Renal Function and Re-Programming Approaches.

    PubMed

    Nüsken, Eva; Dötsch, Jörg; Weber, Lutz T; Nüsken, Kai-Dietrich

    2018-01-01

    Chronic kidney disease affects more than 10% of the population. Programming studies have examined the interrelationship between environmental factors in early life and differences in morbidity and mortality between individuals. A number of important principles has been identified, namely permanent structural modifications of organs and cells, long-lasting adjustments of endocrine regulatory circuits, as well as altered gene transcription. Risk factors include intrauterine deficiencies by disturbed placental function or maternal malnutrition, prematurity, intrauterine and postnatal stress, intrauterine and postnatal overnutrition, as well as dietary dysbalances in postnatal life. This mini-review discusses critical developmental periods and long-term sequelae of renal programming in humans and presents studies examining the underlying mechanisms as well as interventional approaches to "re-program" renal susceptibility toward disease. Clinical manifestations of programmed kidney disease include arterial hypertension, proteinuria, aggravation of inflammatory glomerular disease, and loss of kidney function. Nephron number, regulation of the renin-angiotensin-aldosterone system, renal sodium transport, vasomotor and endothelial function, myogenic response, and tubuloglomerular feedback have been identified as being vulnerable to environmental factors. Oxidative stress levels, metabolic pathways, including insulin, leptin, steroids, and arachidonic acid, DNA methylation, and histone configuration may be significantly altered by adverse environmental conditions. Studies on re-programming interventions focused on dietary or anti-oxidative approaches so far. Further studies that broaden our understanding of renal programming mechanisms are needed to ultimately develop preventive strategies. Targeted re-programming interventions in animal models focusing on known mechanisms will contribute to new concepts which finally will have to be translated to human application. 

  7. Developmental Programming of Renal Function and Re-Programming Approaches

    PubMed Central

    Nüsken, Eva; Dötsch, Jörg; Weber, Lutz T.; Nüsken, Kai-Dietrich

    2018-01-01

    Chronic kidney disease affects more than 10% of the population. Programming studies have examined the interrelationship between environmental factors in early life and differences in morbidity and mortality between individuals. A number of important principles have been identified, namely permanent structural modifications of organs and cells, long-lasting adjustments of endocrine regulatory circuits, as well as altered gene transcription. Risk factors include intrauterine deficiencies caused by disturbed placental function or maternal malnutrition, prematurity, intrauterine and postnatal stress, intrauterine and postnatal overnutrition, as well as dietary imbalances in postnatal life. This mini-review discusses critical developmental periods and long-term sequelae of renal programming in humans and presents studies examining the underlying mechanisms as well as interventional approaches to “re-program” renal susceptibility toward disease. Clinical manifestations of programmed kidney disease include arterial hypertension, proteinuria, aggravation of inflammatory glomerular disease, and loss of kidney function. Nephron number, regulation of the renin–angiotensin–aldosterone system, renal sodium transport, vasomotor and endothelial function, myogenic response, and tubuloglomerular feedback have been identified as being vulnerable to environmental factors. Oxidative stress levels, metabolic pathways, including insulin, leptin, steroids, and arachidonic acid, DNA methylation, and histone configuration may be significantly altered by adverse environmental conditions. Studies on re-programming interventions have so far focused on dietary or anti-oxidative approaches. Further studies that broaden our understanding of renal programming mechanisms are needed to ultimately develop preventive strategies. Targeted re-programming interventions in animal models focusing on known mechanisms will contribute to new concepts which will finally have to be translated to human application.

  8. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    PubMed

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, just as importantly, correlations are not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components: fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors. Copyright © 2014 Elsevier Ltd. All rights reserved.
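    The contrast between fixed multiplicative CMFs and a continuous CM-function embedded in the SPF can be illustrated with a toy sketch in which a lane-width effect varies with traffic volume. Every coefficient, functional form, and variable name below is invented for illustration and is not taken from the paper:

```python
import math

def spf_base(aadt):
    """Hypothetical base safety performance function (crashes/year)."""
    return math.exp(-4.0) * aadt ** 0.8

def fixed_cmfs(aadt, lane_narrow, no_shoulder):
    """Conventional approach: constant CMFs applied multiplicatively,
    regardless of traffic volume or of each other."""
    mu = spf_base(aadt)
    if lane_narrow:
        mu *= 1.25
    if no_shoulder:
        mu *= 1.15
    return mu

def cm_function(aadt, lane_width_m):
    """CM-function approach: the lane-width effect varies continuously
    with AADT, capturing a cross-factor correlation a constant CMF cannot."""
    effect = math.exp((3.6 - lane_width_m) * (0.35 - 0.03 * math.log(aadt)))
    return spf_base(aadt) * effect

low = cm_function(1_000, 3.0) / spf_base(1_000)     # lane-width effect at low volume
high = cm_function(20_000, 3.0) / spf_base(20_000)  # same treatment at high volume
print(round(low, 3), round(high, 3))
```

With these invented coefficients the same physical treatment implies a larger multiplier on low-volume roads than on high-volume ones, which a single fixed CMF cannot express.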

  9. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  10. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  11. Functional CAR models for large spatially correlated functional datasets.

    PubMed

    Zhang, Lin; Baladandayuthapani, Veerabhadran; Zhu, Hongxiao; Baggerly, Keith A; Majewski, Tadeusz; Czerniak, Bogdan A; Morris, Jeffrey S

    2016-01-01

    We develop a functional conditional autoregressive (CAR) model for spatially correlated data for which functions are collected on areal units of a lattice. Our model performs functional response regression while accounting for spatial correlations with potentially nonseparable and nonstationary covariance structure, in both the space and functional domains. We show theoretically that our construction leads to a CAR model at each functional location, with spatial covariance parameters varying and borrowing strength across the functional domain. Using basis transformation strategies, the nonseparable spatial-functional model is computationally scalable to enormous functional datasets, generalizable to different basis functions, and can be used on functions defined on higher dimensional domains such as images. Through simulation studies, we demonstrate that accounting for the spatial correlation in our modeling leads to improved functional regression performance. Applied to a high-throughput spatially correlated copy number dataset, the model identifies genetic markers not identified by comparable methods that ignore spatial correlations.

  12. Regulator Loss Functions and Hierarchical Modeling for Safety Decision Making.

    PubMed

    Hatfield, Laura A; Baugh, Christine M; Azzone, Vanessa; Normand, Sharon-Lise T

    2017-07-01

    Regulators must act to protect the public when evidence indicates safety problems with medical devices. This requires complex tradeoffs among risks and benefits, which conventional safety surveillance methods do not incorporate. To combine explicit regulator loss functions with statistical evidence on medical device safety signals to improve decision making. In the Hospital Cost and Utilization Project National Inpatient Sample, we select pediatric inpatient admissions and identify adverse medical device events (AMDEs). We fit hierarchical Bayesian models to the annual hospital-level AMDE rates, accounting for patient and hospital characteristics. These models produce expected AMDE rates (a safety target), against which we compare the observed rates in a test year to compute a safety signal. We specify a set of loss functions that quantify the costs and benefits of each action as a function of the safety signal. We integrate the loss functions over the posterior distribution of the safety signal to obtain the posterior (Bayes) risk; the preferred action has the smallest Bayes risk. Using simulation and an analysis of AMDE data, we compare our minimum-risk decisions to a conventional Z score approach for classifying safety signals. The 2 rules produced different actions for nearly half of hospitals (45%). In the simulation, decisions that minimize Bayes risk outperform Z score-based decisions, even when the loss functions or hierarchical models are misspecified. Our method is sensitive to the choice of loss functions; eliciting quantitative inputs to the loss functions from regulators is challenging. A decision-theoretic approach to acting on safety signals is potentially promising but requires careful specification of loss functions in consultation with subject matter experts.
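    The decision rule described above (integrate each action's loss over the posterior of the safety signal and choose the smallest Bayes risk) can be sketched as follows. The action names, loss functions, and posterior are stylized stand-ins, not the ones elicited in the study:

```python
import numpy as np

def bayes_risk_decision(posterior_samples, loss_fns):
    """Pick the action whose posterior-averaged loss (Bayes risk) is smallest."""
    risks = {action: float(np.mean([loss(s) for s in posterior_samples]))
             for action, loss in loss_fns.items()}
    return min(risks, key=risks.get), risks

# Invented example: posterior draws of an excess adverse-event signal and
# two piecewise losses; doing nothing is costly when the true signal is
# high, while withdrawal carries a fixed regulatory/replacement cost.
rng = np.random.default_rng(0)
signal = rng.normal(1.5, 0.5, size=10_000)     # posterior draws of the signal

loss_fns = {
    "no_action": lambda s: 2.0 * max(s, 0.0),  # harm grows with the true signal
    "withdraw":  lambda s: 1.0,                # fixed cost of acting
}
action, risks = bayes_risk_decision(signal, loss_fns)
print(action, round(risks["no_action"], 2))
```

Because the loss is averaged over the whole posterior rather than evaluated at a point estimate, uncertainty in the signal directly shapes the decision, which is the behavior the Z-score comparison in the abstract lacks.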

  13. Combining computer modelling and cardiac imaging to understand right ventricular pump function.

    PubMed

    Walmsley, John; van Everdingen, Wouter; Cramer, Maarten J; Prinzen, Frits W; Delhaas, Tammo; Lumens, Joost

    2017-10-01

    Right ventricular (RV) dysfunction is a strong predictor of outcome in heart failure and is a key determinant of exercise capacity. Despite these crucial findings, the RV remains understudied in the clinical, experimental, and computer modelling literature. This review outlines how recent advances in using computer modelling and cardiac imaging synergistically help to understand RV function in health and disease. We begin by highlighting the complexity of interactions that make modelling the RV both challenging and necessary, and then summarize the multiscale modelling approaches used to date to simulate RV pump function in the context of these interactions. We go on to demonstrate how these modelling approaches in combination with cardiac imaging have improved understanding of RV pump function in pulmonary arterial hypertension, arrhythmogenic right ventricular cardiomyopathy, dyssynchronous heart failure and cardiac resynchronization therapy, hypoplastic left heart syndrome, and repaired tetralogy of Fallot. We conclude with a perspective on key issues to be addressed by computational models of the RV in the near future. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  14. A Multi-Level Model of Moral Functioning Revisited

    ERIC Educational Resources Information Center

    Reed, Don Collins

    2009-01-01

    The model of moral functioning scaffolded in the 2008 "JME" Special Issue is here revisited in response to three papers criticising that volume. As guest editor of that Special Issue I have formulated the main body of this response, concerning the dynamic systems approach to moral development, the problem of moral relativism and the role of…

  15. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties, where the properties of sediment sources remain constant, or at the very least, any variation in these properties should occur in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to select conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter-selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling. Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple
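    The statistical selection step described above can be sketched with a simple between-source versus within-source variance ratio standing in for the discrimination tests the abstract names (Kruskal-Wallis, Mann-Whitney). The sources, properties, and threshold below are hypothetical:

```python
import numpy as np

def discrimination_ratio(groups):
    """Between-source vs. within-source variance ratio (one-way ANOVA F),
    a simple stand-in for the rank-based discrimination tests in the text."""
    pooled = np.concatenate(groups)
    grand, k, n = pooled.mean(), len(groups), len(pooled)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(1)
# Two hypothetical sediment sources measured for two properties:
# property "A" separates the sources cleanly, property "B" does not.
surface    = {"A": rng.normal(10.0, 1.0, 30), "B": rng.normal(5.0, 2.0, 30)}
subsurface = {"A": rng.normal(16.0, 1.0, 30), "B": rng.normal(5.0, 2.0, 30)}

threshold = 20.0   # arbitrary cut-off for this illustration
selected = [p for p in ("A", "B")
            if discrimination_ratio([surface[p], subsurface[p]]) > threshold]
print(selected)
```

Only the discriminating property survives the screen; in a real fingerprinting study the surviving suite would then feed a mixing model, and conservativeness would be checked separately.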

  16. Operator function modeling: An approach to cognitive task analysis in supervisory control systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1987-01-01

    In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).

  17. Influence of xc functional on thermal-elastic properties of Ceria: A DFT-based Debye-Grüneisen model approach

    NASA Astrophysics Data System (ADS)

    Lee, Ji-Hwan; Tak, Youngjoo; Lee, Taehun; Soon, Aloysius

    Ceria (CeO2-x) is widely studied as an electrolyte material of choice for intermediate-temperature (~ 800 K) solid oxide fuel cells. At this temperature, maintaining the chemical stability and thermal-mechanical integrity of this oxide is of utmost importance. To understand its thermal-elastic properties, we first test the influence of various approximations to the density-functional theory (DFT) xc functionals on specific thermal-elastic properties of both CeO2 and Ce2O3. Namely, we consider the local-density approximation (LDA), the generalized gradient approximation (GGA-PBE) with and without an additional Hubbard U applied to the 4f electrons of Ce, as well as the recently popularized hybrid functional due to Heyd-Scuseria-Ernzerhof (HSE06). Next, we couple this to a volume-dependent Debye-Grüneisen model to determine the thermodynamic quantities of ceria at arbitrary temperatures. We find that an explicit description of the strong correlation (e.g. via the DFT + U and hybrid functional approaches) is necessary to achieve good agreement with experimental values, in contrast to the mean-field treatment in standard xc approximations (such as LDA or GGA-PBE). We acknowledge support from the Samsung Research Funding Center of Samsung Electronics (SRFC-MA1501-03).

  18. Asymptotic behaviour of two-point functions in multi-species models

    NASA Astrophysics Data System (ADS)

    Kozlowski, Karol K.; Ragoucy, Eric

    2016-05-01

    We extract the long-distance asymptotic behaviour of two-point correlation functions in massless quantum integrable models containing multi-species excitations. For such a purpose, we extend to these models the method of a large-distance regime re-summation of the form factor expansion of correlation functions. The key feature of our analysis is a technical hypothesis on the large-volume behaviour of the form factors of local operators in such models. We check the validity of this hypothesis on the example of the SU(3)-invariant XXX magnet by means of the determinant representations for the form factors of local operators in this model. Our approach confirms the structure of the critical exponents obtained previously for numerous models solvable by the nested Bethe Ansatz.

  19. Confidence Intervals for a Semiparametric Approach to Modeling Nonlinear Relations among Latent Variables

    ERIC Educational Resources Information Center

    Pek, Jolynn; Losardo, Diane; Bauer, Daniel J.

    2011-01-01

    Compared to parametric models, nonparametric and semiparametric approaches to modeling nonlinearity between latent variables have the advantage of recovering global relationships of unknown functional form. Bauer (2005) proposed an indirect application of finite mixtures of structural equation models where latent components are estimated in the…

  20. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lifetimes of many kinds of units and has a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull family of distributions. In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian analysis approach and present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior function and the point, interval, hazard-function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
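    A minimal sketch of the exponential model with a Bayesian treatment: assuming complete (uncensored) failure data and a near non-informative prior, the posterior of the constant hazard rate is a Gamma distribution, and reliability estimates follow by pushing posterior draws through R(t) = exp(-lam*t). The failure times are invented:

```python
import numpy as np

# Exponential lifetime model: constant hazard lam, reliability R(t) = exp(-lam*t).
# With a (near) non-informative prior, the posterior for lam after observing n
# complete failures with total time-on-test T is Gamma(shape=n, rate=T).
# Censoring and competing causes are ignored in this simplified sketch.
rng = np.random.default_rng(2)
failure_times = np.array([120.0, 85.0, 210.0, 160.0, 95.0])   # hypothetical hours
n, T = len(failure_times), failure_times.sum()

lam = rng.gamma(shape=n, scale=1.0 / T, size=50_000)   # posterior draws of hazard
t = 100.0
R = np.exp(-lam * t)                                   # posterior of R(100 h)
lo, hi = np.percentile(R, [2.5, 97.5])                 # credible interval
print(round(float(R.mean()), 2), round(float(lo), 2), round(float(hi), 2))
```

The same posterior draws of the hazard also give the point and interval estimates the abstract mentions, since for the exponential model the hazard function is just the constant lam.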

  1. A biopsychosocial approach to women’s sexual function and dysfunction at midlife: A narrative review

    PubMed Central

    Thomas, Holly N.; Thurston, Rebecca C.

    2016-01-01

    A satisfying sex life is an important component of overall well-being, but sexual dysfunction is common, especially in midlife women. The aim of this review is (a) to define sexual function and dysfunction, (b) to present theoretical models of female sexual response, (c) to examine longitudinal studies of how sexual function changes during midlife, and (d) to review treatment options. Four types of female sexual dysfunction are currently recognized: Female Orgasmic Disorder, Female Sexual Interest/Arousal Disorder, Genito-Pelvic Pain/Penetration Disorder, and Substance/Medication-Induced Sexual Dysfunction. However, optimal sexual function transcends the simple absence of dysfunction. A biopsychosocial approach that simultaneously considers physical, psychological, sociocultural, and interpersonal factors is necessary to guide research and clinical care regarding women’s sexual function. Most longitudinal studies reveal an association between advancing menopause status and worsening sexual function. Psychosocial variables, such as availability of a partner, relationship quality, and psychological functioning, also play an integral role. Future directions for research should include deepening our understanding of how sexual function changes with aging and developing safe and effective approaches to optimizing women’s sexual function with aging. Overall, holistic, biopsychosocial approaches to women’s sexual function are necessary to fully understand and treat this key component of midlife women’s well-being. PMID:27013288

  2. Scaling within the spectral function approach

    NASA Astrophysics Data System (ADS)

    Sobczyk, J. E.; Rocco, N.; Lovato, A.; Nieves, J.

    2018-03-01

    Scaling features of the nuclear electromagnetic response functions unveil aspects of nuclear dynamics that are crucial for interpreting neutrino- and electron-scattering data. In the large momentum-transfer regime, the nucleon-density response function defines a universal scaling function, which is independent of the nature of the probe. In this work, we analyze the nucleon-density response function of ¹²C, neglecting collective excitations. We employ particle and hole spectral functions obtained within two distinct many-body methods, both widely used to describe electroweak reactions in nuclei. We show that the two approaches provide compatible nucleon-density scaling functions that for large momentum transfers satisfy first-kind scaling. Both methods yield scaling functions characterized by an asymmetric shape, although less pronounced than that of experimental scaling functions. This asymmetry, only mildly affected by final state interactions, is mostly due to nucleon-nucleon correlations, encoded in the continuum component of the hole spectral function.

  3. Functional renormalization-group approaches, one-particle (ir)reducible with respect to local Green’s functions, with dynamical mean-field theory as a starting point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katanin, A. A., E-mail: katanin@mail.ru

    We consider formulations of the functional renormalization-group (fRG) flow for correlated electronic systems with the dynamical mean-field theory as a starting point. We classify the corresponding renormalization-group schemes into those neglecting one-particle irreducible six-point vertices (with respect to the local Green’s functions) and neglecting one-particle reducible six-point vertices. The former class is represented by the recently introduced DMF²RG approach [31], but also by the scale-dependent generalization of the one-particle irreducible representation (with respect to local Green’s functions, 1PI-LGF) of the generating functional [20]. The second class is represented by the fRG flow within the dual fermion approach [16, 32]. We compare formulations of the fRG approach in each of these cases and suggest their further application to study 2D systems within the Hubbard model.

  4. Integrated reclamation: Approaching ecological function?

    Treesearch

    Ann L. Hild; Nancy L. Shaw; Ginger B. Paige

    2009-01-01

    Attempts to reclaim arid and semiarid lands have traditionally targeted plant species composition. Much research attention has been directed to seeding rates, species mixes and timing of seeding. However, in order to attain functioning systems, attention to structure and process must complement existing efforts. We ask how to use a systems approach to enhance...

  5. Thermal noise model of antiferromagnetic dynamics: A macroscopic approach

    NASA Astrophysics Data System (ADS)

    Li, Xilai; Semenov, Yuriy; Kim, Ki Wook

    In the search for post-silicon technologies, antiferromagnetic (AFM) spintronics is receiving widespread attention. Due to faster dynamics when compared with its ferromagnetic counterpart, AFM enables ultra-fast magnetization switching and THz oscillations. A crucial factor that affects the stability of antiferromagnetic dynamics is the thermal fluctuation, rarely considered in AFM research. Here, we derive from theory both stochastic dynamic equations for the macroscopic AFM Neel vector (L-vector) and the corresponding Fokker-Planck equation for the L-vector distribution function. For the dynamic equation approach, thermal noise is modeled by a stochastic fluctuating magnetic field that affects the AFM dynamics. The field is correlated within the correlation time and the amplitude is derived from the energy dissipation theory. For the distribution function approach, the inertial behavior of AFM dynamics forces consideration of the generalized space, including both coordinates and velocities. Finally, applying the proposed thermal noise model, we analyze a particular case of L-vector reversal of AFM nanoparticles by voltage controlled perpendicular magnetic anisotropy (PMA) with a tailored pulse width. This work was supported, in part, by SRC/NRI SWAN.

  6. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This

  7. Optimization of global model composed of radial basis functions using the term-ranking approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Peng; Tao, Chao, E-mail: taochao@nju.edu.cn; Liu, Xiao-Jun

    2014-03-15

    A term-ranking method is put forward to optimize the global model composed of radial basis functions to improve the predictability of the model. The effectiveness of the proposed method is examined by numerical simulation and experimental data. Numerical simulations indicate that this method can significantly lengthen the prediction time and decrease the Bayesian information criterion of the model. The application to real voice signal shows that the optimized global model can capture more predictable component in chaos-like voice data and simultaneously reduce the predictable component (periodic pitch) in the residual signal.

  8. A toxicity cost function approach to optimal CPA equilibration in tissues.

    PubMed

    Benson, James D; Higgins, Adam Z; Desai, Kunjan; Eroglu, Ali

    2018-02-01

    There is growing need for cryopreserved tissue samples that can be used in transplantation and regenerative medicine. While a number of specific tissue types have been successfully cryopreserved, this success is not general, and there is not a uniform approach to cryopreservation of arbitrary tissues. Additionally, while there are a number of long-established approaches towards optimizing cryoprotocols in single cell suspensions, and even plated cell monolayers, computational approaches in tissue cryopreservation have classically been limited to explanatory models. Here we develop a numerical approach to adapt cell-based CPA equilibration damage models for use in a classical tissue mass transport model. To implement this with real-world parameters, we measured CPA diffusivity in three human-sourced tissue types, skin, fibroid and myometrium, yielding propylene glycol diffusivities of 0.6 × 10⁻⁶ cm²/s, 1.2 × 10⁻⁶ cm²/s and 1.3 × 10⁻⁶ cm²/s, respectively. Based on these results, we numerically predict and compare optimal multistep equilibration protocols that minimize the cell-based cumulative toxicity cost function and the damage due to excessive osmotic gradients at the tissue boundary. Our numerical results show that there are fundamental differences between protocols designed to minimize total CPA exposure time in tissues and protocols designed to minimize accumulated CPA toxicity, and that "one size fits all" stepwise approaches are predicted to be more toxic and take considerably longer than needed. Copyright © 2017 Elsevier Inc. All rights reserved.
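    The mass-transport side of such a model can be sketched as 1-D explicit finite-difference diffusion of CPA into a tissue slab, using the fibroid diffusivity reported above. The slab geometry, exposure time, and boundary handling are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

# 1-D explicit finite-difference diffusion of CPA into a tissue slab.
D = 1.2e-6                 # cm^2/s (propylene glycol in fibroid, from the abstract)
L = 0.1                    # surface-to-midplane distance, cm (assumed)
nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D     # explicit stability requires dt <= dx^2 / (2*D)

c = np.zeros(nx)           # normalized CPA concentration, 0 = CPA-free tissue
c[0] = 1.0                 # surface held at the bath concentration
t = 0.0
while t < 600.0:           # 10 minutes of exposure (assumed)
    c[1:-1] += (D * dt / dx**2) * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[-1] = c[-2]          # no-flux (symmetry) condition at the midplane
    c[0] = 1.0
    t += dt

print(round(float(c[nx // 2]), 2), round(float(c[-1]), 3))
```

Even this crude sketch shows why stepwise protocols matter: after 10 minutes the midplane has seen only a small fraction of the bath concentration, so per-step equilibration times, and hence accumulated toxicity, are governed by the slow diffusive tail rather than the surface.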

  9. A Data-Driven Approach to Reverse Engineering Customer Engagement Models: Towards Functional Constructs

    PubMed Central

    de Vries, Natalie Jane; Carlson, Jamie; Moscato, Pablo

    2014-01-01

    Online consumer behavior in general, and online customer engagement with brands in particular, has become a major focus of research activity fuelled by the exponential increase of interactive functions of the internet and social media platforms and applications. Current research in this area is mostly hypothesis-driven and much debate about the concept of Customer Engagement and its related constructs persists in the literature. In this paper, we aim to propose a novel methodology for reverse engineering a consumer behavior model for online customer engagement, based on a computational and data-driven perspective. This methodology could be generalized and prove useful for future research in the fields of consumer behaviors using questionnaire data or studies investigating other types of human behaviors. The method we propose contains five main stages: symbolic regression analysis, graph building, community detection, evaluation of results and finally, investigation of directed cycles and common feedback loops. The ‘communities’ of questionnaire items that emerge from our community detection method form possible ‘functional constructs’ inferred from data rather than assumed from literature and theory. Our results show consistent partitioning of questionnaire items into such ‘functional constructs’ suggesting the method proposed here could be adopted as a new data-driven way of human behavior modeling. PMID:25036766

  11. Modeling transitions in body composition: the approach to steady state for anthropometric measures and physiological functions in the Minnesota human starvation study

    PubMed Central

    Hargrove, James L; Heinz, Grete; Heinz, Otto

    2008-01-01

    Background This study evaluated whether the changes in several anthropometric and functional measures during caloric restriction combined with walking and treadmill exercise would fit a simple model of approach to steady state (a plateau) that can be solved using spreadsheet software (Microsoft Excel®). We hypothesized that transitions in waist girth and several body compartments would fit a simple exponential model that approaches a stable steady state. Methods The model (an equation) was applied to outcomes reported in the Minnesota starvation experiment using Microsoft Excel's Solver® function to derive rate parameters (k) and projected steady-state values. However, data for most end-points were available only at t = 0, 12 and 24 weeks of caloric restriction. Therefore, we derived 2 new equations that enable model solutions to be calculated from 3 equally spaced data points. Results For the group of male subjects in the Minnesota study, body mass declined with a first-order rate constant of about 0.079 wk⁻¹. The fractional rate of loss of fat-free mass, which includes components that remained almost constant during starvation, was 0.064 wk⁻¹, compared to a rate of loss of fat mass of 0.103 wk⁻¹. The rate of loss of abdominal fat, as exemplified by the change in the waist girth, was 0.213 wk⁻¹. On average, 0.77 kg was lost per cm of waist girth. Other girths showed rates of loss between 0.085 and 0.131 wk⁻¹. Resting energy expenditure (REE) declined at 0.131 wk⁻¹. Changes in heart volume, hand strength, work capacity and N excretion showed rates of loss in the same range. The group of 32 subjects was close to steady state or had already reached steady state for the variables under consideration at the end of semi-starvation. 
Conclusion When energy intake is changed to new, relatively constant levels, while physical activity is maintained, changes in several anthropometric and physiological measures can be modeled as an exponential approach to steady state using
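    The three-point solution described in the Methods can be reconstructed in a few lines of algebra: for y(t) = y_ss + (y0 − y_ss)·e^(−kt) sampled at t = 0, T and 2T, the ratio (y2 − y1)/(y1 − y0) equals e^(−kT), and the steady state follows from the same three observations. A minimal Python sketch of this reconstruction (not the authors' Excel implementation), with illustrative numbers:

```python
import math

def fit_three_point(y0, y1, y2, T):
    """Rate constant and steady state for y(t) = yss + (y0 - yss)*exp(-k*t),
    given observations at t = 0, T and 2T (equally spaced)."""
    r = (y2 - y1) / (y1 - y0)                     # equals exp(-k*T)
    k = -math.log(r) / T
    yss = (y0 * y2 - y1**2) / (y0 + y2 - 2.0 * y1)
    return k, yss

# Body-mass-like synthetic example: decline from 70 toward 50 at k = 0.079/wk,
# sampled at 0, 12 and 24 weeks as in the Minnesota data layout.
k_true, yss_true, y0 = 0.079, 50.0, 70.0
y = [yss_true + (y0 - yss_true) * math.exp(-k_true * t) for t in (0, 12, 24)]
k_est, yss_est = fit_three_point(*y, T=12)
print(round(k_est, 3), round(yss_est, 1))   # prints: 0.079 50.0
```

    Both parameters are recovered exactly because three points determine the three unknowns (k, y_ss, y0) of the exponential.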

  12. Defining Function in the Functional Medicine Model.

    PubMed

    Bland, Jeffrey

    2017-02-01

    In the functional medicine model, the word function is aligned with the evolving understanding that disease is an endpoint and function is a process. Function can move both forward and backward. The vector of change in function through time is, in part, determined by the unique interaction of an individual's genome with their environment, diet, and lifestyle. The functional medicine model for health care is concerned less with what we call the dysfunction or disease, and more with the dynamic processes that resulted in the person's dysfunction. The previous concept of functional somatic syndromes as psychosomatic in origin has now been replaced with a new concept of function that is rooted in the emerging 21st-century understanding of systems network-enabled biology.

  14. A genetic algorithms approach for altering the membership functions in fuzzy logic controllers

    NASA Technical Reports Server (NTRS)

    Shehadeh, Hana; Lea, Robert N.

    1992-01-01

    Through previous work, a fuzzy control system was developed to perform translational and rotational control of a space vehicle. This problem was then re-examined to determine the effectiveness of genetic algorithms on fine tuning the controller. This paper explains the problems associated with the design of this fuzzy controller and offers a technique for tuning fuzzy logic controllers. A fuzzy logic controller is a rule-based system that uses fuzzy linguistic variables to model human rule-of-thumb approaches to control actions within a given system. This 'fuzzy expert system' features rules that direct the decision process and membership functions that convert the linguistic variables into the precise numeric values used for system control. Defining the fuzzy membership functions is the most time consuming aspect of the controller design. One single change in the membership functions could significantly alter the performance of the controller. This membership function definition can be accomplished by using a trial and error technique to alter the membership functions creating a highly tuned controller. This approach can be time consuming and requires a great deal of knowledge from human experts. In order to shorten development time, an iterative procedure for altering the membership functions to create a tuned set that used a minimal amount of fuel for velocity vector approach and station-keep maneuvers was developed. Genetic algorithms, search techniques used for optimization, were utilized to solve this problem.
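    A toy version of this tuning loop is sketched below: a plain generational genetic algorithm adjusts the center and output gain of a three-set triangular fuzzy controller against a reference control law. The tanh target and all parameters are illustrative assumptions, not the translational/rotational control problem described above:

```python
import math
import random

random.seed(1)

def controller(e, c, g):
    """Tiny fuzzy controller: triangular sets NEG/ZERO/POS centred at -c, 0, c
    with singleton outputs -g, 0, g; the partition-of-unity memberships make
    weighted-average defuzzification reduce to g * (mu_pos - mu_neg)."""
    mu_pos = max(0.0, min(e / c, 1.0))
    mu_neg = max(0.0, min(-e / c, 1.0))
    return g * (mu_pos - mu_neg)

# Desired control action on a grid of error values (illustrative target).
target = [(e / 10.0, math.tanh(2.0 * e / 10.0)) for e in range(-10, 11)]

def cost(params):
    c, g = params
    return sum((controller(e, c, g) - u) ** 2 for e, u in target)

# Plain generational GA: elitism, blend crossover, Gaussian mutation.
pop = [[random.uniform(0.1, 2.0), random.uniform(0.1, 2.0)] for _ in range(30)]
for _ in range(60):
    elite = sorted(pop, key=cost)[:6]
    children = []
    while len(children) < len(pop) - len(elite):
        a, b = random.sample(elite, 2)
        w = random.random()
        child = [w * x + (1 - w) * y for x, y in zip(a, b)]
        if random.random() < 0.3:                       # mutate one gene
            i = random.randrange(2)
            child[i] = max(0.05, child[i] + random.gauss(0.0, 0.1))
        children.append(child)
    pop = elite + children

best = min(pop, key=cost)
```

    The GA replaces the trial-and-error membership adjustment described above with an automated search; for the real controller the chromosome would encode all membership vertices and the cost would be fuel use rather than a fit to a reference curve.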

  15. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) modeling continuous explanatory variables (covariates) in the presence of a within-subject (repeated-measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of the hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or unfeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity
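    As a small illustration of the ICC point above, the simplest case, a one-way random-effects design, can be computed directly from ANOVA mean squares; this hand-rolled special case of the LME framework uses synthetic data with illustrative variance settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic repeated measures: n_sub subjects measured n_rep times each, with
# a true subject-level random effect (sd = 2) on top of residual noise (sd = 1),
# so the true ICC is 4 / (4 + 1) = 0.8.
n_sub, n_rep = 20, 4
subject_effect = rng.normal(0.0, 2.0, n_sub)
y = subject_effect[:, None] + rng.normal(0.0, 1.0, (n_sub, n_rep))

# One-way random-effects ANOVA estimates:
grand = y.mean()
msb = n_rep * ((y.mean(axis=1) - grand) ** 2).sum() / (n_sub - 1)      # between
msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (n_sub * (n_rep - 1))
icc = (msb - msw) / (msb + (n_rep - 1) * msw)                          # ICC(1)
```

    The estimate should land near the true value of 0.8 up to sampling error; the LME formulation generalizes this to crossed random effects and confounding fixed effects, which the simple ANOVA decomposition cannot handle.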

  17. Hierarchical organization of functional connectivity in the mouse brain: a complex network approach.

    PubMed

    Bardella, Giampiero; Bifone, Angelo; Gabrielli, Andrea; Gozzi, Alessandro; Squartini, Tiziano

    2016-08-18

    This paper contributes to the study of brain functional connectivity from the perspective of complex network theory. More specifically, we apply graph-theoretical analyses to provide evidence of the modular structure of the mouse brain and to shed light on its hierarchical organization. We propose a novel percolation analysis and apply our approach to a resting-state functional MRI data set from 41 mice. This approach reveals a robust hierarchical structure of modules persistent across different subjects. Importantly, we test this approach against a statistical benchmark (or null model) which constrains only the distributions of empirical correlations. Our results unambiguously show that the hierarchical character of the mouse brain modular structure is not trivially encoded into this lower-order constraint. Finally, we investigate the modular structure of the mouse brain by computing the Minimal Spanning Forest, a technique that identifies subnetworks characterized by the strongest internal correlations. This approach represents a faster alternative to other community detection methods and provides a means to rank modules on the basis of the strength of their internal edges.
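    The spanning-tree idea in the final step can be sketched with Kruskal's algorithm run on edges sorted by descending correlation, which yields the maximum spanning tree of a toy two-module "brain"; the signals and module structure below are synthetic, not the mouse data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "brain": two modules of 5 regions; regions in a module share a common
# driver, so within-module correlations dominate between-module ones.
drivers = rng.normal(size=(2, 200))
x = np.vstack([drivers[i // 5] + 0.8 * rng.normal(size=200) for i in range(10)])
corr = np.corrcoef(x)

# Kruskal's algorithm on edges sorted by descending correlation gives the
# maximum spanning tree (the "strongest-edges" skeleton used to rank modules).
parent = list(range(10))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

edges = sorted(((corr[i, j], i, j) for i in range(10) for j in range(i + 1, 10)),
               reverse=True)
tree = []
for w, i, j in edges:
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        tree.append((i, j, w))
```

    A spanning tree over 10 nodes has exactly 9 edges; because within-module correlations are added first, 8 of them fall inside the two modules and a single weak edge bridges them, which is how the tree exposes module boundaries.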

  19. A continuous optimization approach for inferring parameters in mathematical models of regulatory networks.

    PubMed

    Deng, Zhimin; Tian, Tianhai

    2014-07-29

    The advances of systems biology have raised a large number of sophisticated mathematical models for describing the dynamic property of complex biological systems. One of the major steps in developing mathematical models is to estimate unknown parameters of the model based on experimentally measured quantities. However, experimental conditions limit the amount of data that is available for mathematical modelling. The number of unknown parameters in mathematical models may be larger than the number of observation data. The imbalance between the number of experimental data and number of unknown parameters makes reverse-engineering problems particularly challenging. To address the issue of inadequate experimental data, we propose a continuous optimization approach for making reliable inference of model parameters. This approach first uses a spline interpolation to generate continuous functions of system dynamics as well as the first and second order derivatives of continuous functions. The expanded dataset is the basis to infer unknown model parameters using various continuous optimization criteria, including the error of simulation only, error of both simulation and the first derivative, or error of simulation as well as the first and second derivatives. We use three case studies to demonstrate the accuracy and reliability of the proposed new approach. Compared with the corresponding discrete criteria using experimental data at the measurement time points only, numerical results of the ERK kinase activation module show that the continuous absolute-error criteria using both function and high order derivatives generate estimates with better accuracy. This result is also supported by the second and third case studies for the G1/S transition network and the MAP kinase pathway, respectively. This suggests that the continuous absolute-error criteria lead to more accurate estimates than the corresponding discrete criteria. 
We also study the robustness property of these three
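    The core of the continuous-criterion idea can be sketched as follows, with linear interpolation standing in for the spline used in the paper and a one-parameter decay model in place of the ERK module; all settings are illustrative:

```python
import numpy as np

# Sparse "measurements" of exponential decay dy/dt = -k*y with k_true = 0.5,
# observed at only five time points.
k_true = 0.5
t_obs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_obs = np.exp(-k_true * t_obs)

# Step 1: turn the sparse data into a continuous function on a dense grid
# (linear interpolation here; the paper uses spline interpolation).
t = np.linspace(0.0, 4.0, 401)
y = np.interp(t, t_obs, y_obs)

# Step 2: numerically differentiate the interpolant.
dydt = np.gradient(y, t)

# Step 3: least-squares fit of k against the model derivative -k*y over the
# whole expanded dataset rather than the five raw points alone.
k_est = -np.sum(dydt * y) / np.sum(y * y)
```

    Even with the cruder linear interpolant, the derivative-matching criterion recovers k close to 0.5; a spline interpolant reduces the residual bias further, which is the paper's motivation for using higher-order continuous functions.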

  20. Atom and Bond Fukui Functions and Matrices: A Hirshfeld-I Atoms-in-Molecule Approach.

    PubMed

    Oña, Ofelia B; De Clercq, Olivier; Alcoba, Diego R; Torre, Alicia; Lain, Luis; Van Neck, Dimitri; Bultinck, Patrick

    2016-09-19

    The Fukui function is often used in its atom-condensed form by isolating it from the molecular Fukui function using a chosen weight function for the atom in the molecule. Recently, Fukui functions and matrices for both atoms and bonds separately were introduced for semiempirical and ab initio levels of theory using Hückel and Mulliken atoms-in-molecule models. In this work, a double partitioning method of the Fukui matrix is proposed within the Hirshfeld-I atoms-in-molecule framework. Diagonalizing the resulting atomic and bond matrices gives eigenvalues and eigenvectors (Fukui orbitals) describing the reactivity of atoms and bonds. The Fukui function is the diagonal element of the Fukui matrix and may be resolved in atom and bond contributions. The extra information contained in the atom and bond resolution of the Fukui matrices and functions is highlighted. The effect of the choice of weight function arising from the Hirshfeld-I approach to obtain atom- and bond-condensed Fukui functions is studied. A comparison of the results with those generated by using the Mulliken atoms-in-molecule approach shows low correlation between the two partitioning schemes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach has been presented that considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach is two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time consumption while preserving the quality of its results at a high level of agreement with the results produced by the other two approaches. PMID:24982987
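    For reference, the AHP baseline mentioned above derives priorities from a pairwise comparison matrix; a common geometric-mean (logarithmic least-squares) approximation of the principal eigenvector can be sketched as follows, with an illustrative comparison matrix:

```python
import numpy as np

# Pairwise comparison matrix for four requirements (Saaty 1-9 scale): entry
# [i, j] states how much more important requirement i is than j. The matrix is
# reciprocal (A[j, i] = 1 / A[i, j]); values here are illustrative.
A = np.array([
    [1.0,   3.0,   5.0, 7.0],
    [1/3.0, 1.0,   3.0, 5.0],
    [1/5.0, 1/3.0, 1.0, 3.0],
    [1/7.0, 1/5.0, 1/3.0, 1.0],
])

# Geometric-mean approximation of the AHP principal-eigenvector priorities:
# take the geometric mean of each row, then normalize to sum to one.
g = np.prod(A, axis=1) ** (1.0 / A.shape[0])
priorities = g / g.sum()
order = np.argsort(priorities)[::-1]   # requirements ranked most to least important
```

    The resulting priority vector sums to one and preserves the ordering implied by the comparisons; the cost that motivates the paper is that AHP needs O(n²) pairwise judgments, which is what drives its higher time consumption.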

  3. Enhancements to the SSME transfer function modeling code

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Mitchell, Jerrel R.; Bartholomew, David L.; Glenn, Russell D.

    1995-01-01

    This report details the results of a one year effort by Ohio University to apply the transfer function modeling and analysis tools developed under NASA Grant NAG8-167 (Irwin, 1992), (Bartholomew, 1992) to attempt the generation of Space Shuttle Main Engine High Pressure Turbopump transfer functions from time domain data. In addition, new enhancements to the transfer function modeling codes which enhance the code functionality are presented, along with some ideas for improved modeling methods and future work. Section 2 contains a review of the analytical background used to generate transfer functions with the SSME transfer function modeling software. Section 2.1 presents the 'ratio method' developed for obtaining models of systems that are subject to single unmeasured excitation sources and have two or more measured output signals. Since most of the models developed during the investigation use the Eigensystem Realization Algorithm (ERA) for model generation, Section 2.2 presents an introduction of ERA, and Section 2.3 describes how it can be used to model spectral quantities. Section 2.4 details the Residue Identification Algorithm (RID) including the use of Constrained Least Squares (CLS) and Total Least Squares (TLS). Most of this information can be found in the report (and is repeated for convenience). Section 3 chronicles the effort of applying the SSME transfer function modeling codes to the a51p394.dat and a51p1294.dat time data files to generate transfer functions from the unmeasured input to the 129.4 degree sensor output. Included are transfer function modeling attempts using five methods. The first method is a direct application of the SSME codes to the data files and the second method uses the underlying trends in the spectral density estimates to form transfer function models with less clustering of poles and zeros than the models obtained by the direct method. 
In the third approach, the time data is low pass filtered prior to the modeling process in an
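    The ERA step at the heart of this modeling effort can be sketched on synthetic data: build shifted block Hankel matrices from impulse-response (Markov) parameters, truncate the SVD at the numerical rank, and recover a realized system matrix. The second-order system below is illustrative, not SSME data:

```python
import numpy as np

# Known 2nd-order discrete-time system used to generate Markov parameters.
A = np.array([[0.9, 0.2], [0.0, 0.7]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Impulse response (Markov parameters) h_k = C A^k B.
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(1, 21)]

# Shifted block Hankel matrices built from the Markov parameters.
r, c = 8, 8
H0 = np.array([[h[i + j] for j in range(c)] for i in range(r)])
H1 = np.array([[h[i + j + 1] for j in range(c)] for i in range(r)])

# SVD of H0, truncated at the numerical rank, which equals the system order.
U, s, Vt = np.linalg.svd(H0)
n = int(np.sum(s > 1e-8 * s[0]))
Sr = np.diag(np.sqrt(s[:n]))

# Realized state matrix; its eigenvalues are the identified system poles.
Ar = np.linalg.inv(Sr) @ U[:, :n].T @ H1 @ Vt[:n].T @ np.linalg.inv(Sr)
eig_est = np.sort(np.linalg.eigvals(Ar).real)
```

    With noise-free data the singular values drop sharply after the true order and the realized poles match the eigenvalues 0.7 and 0.9 of A; with measured SSME data, choosing the truncation order against noisy singular values is the hard part.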

  4. NEMA, a functional-structural model of nitrogen economy within wheat culms after flowering. I. Model description.

    PubMed

    Bertheloot, Jessica; Cournède, Paul-Henry; Andrieu, Bruno

    2011-10-01

    Models simulating nitrogen use by plants are potentially efficient tools to optimize the use of fertilizers in agriculture. Most crop models assume that a target nitrogen concentration can be defined for plant tissues and formalize a demand for nitrogen, depending on the difference between the target and actual nitrogen concentrations. However, the teleonomic nature of the approach has been criticized. This paper proposes a mechanistic model of nitrogen economy, NEMA (Nitrogen Economy Model within plant Architecture), which links nitrogen fluxes to nitrogen concentration and physiological processes. A functional-structural approach is used: plant aerial parts are described in a botanically realistic way and physiological processes are expressed at the scale of each aerial organ or root compartment as a function of local conditions (light and resources). NEMA was developed for winter wheat (Triticum aestivum) after flowering. The model simulates the nitrogen (N) content of each photosynthetic organ as regulated by Rubisco turnover, which depends on intercepted light and a mobile N pool shared by all organs. This pool is enriched by N acquisition from the soil and N release from vegetative organs, and is depleted by grain uptake and protein synthesis in vegetative organs; NEMA accounts for the negative feedback from circulating N on N acquisition from the soil, which is supposed to follow the activities of nitrate transport systems. Organ N content and intercepted light determine dry matter production via photosynthesis, which is distributed between organs according to a demand-driven approach. NEMA integrates the main feedbacks known to regulate plant N economy. Other novel features are the simulation of N for all photosynthetic tissues and the use of an explicit description of the plant that allows how the local environment of tissues regulates their N content to be taken into account. We believe this represents an appropriate frame for modelling nitrogen in

  5. Nonparametric Transfer Function Models

    PubMed Central

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584
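    A minimal sketch of a nonparametric transfer function estimate is given below, using a Nadaraya-Watson kernel smoother on input-output pairs with AR(1) noise; unlike the proposed method, this sketch does not model the noise correlation, and all settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Input series x_t and output y_t = f(x_t) + AR(1) noise, with smooth f unknown
# to the estimator.
n = 500
x = rng.uniform(-2.0, 2.0, n)
eps = np.zeros(n)
for t in range(1, n):                       # AR(1) noise, phi = 0.5
    eps[t] = 0.5 * eps[t - 1] + rng.normal(0.0, 0.2)
f = lambda u: np.sin(u)
y = f(x) + eps

def nw(u, h=0.25):
    """Nadaraya-Watson Gaussian-kernel estimate of the transfer function at u."""
    w = np.exp(-0.5 * ((x - u) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

grid = np.linspace(-1.5, 1.5, 7)
est = np.array([nw(u) for u in grid])
err = np.max(np.abs(est - f(grid)))
```

    The smoother recovers the unknown sin-shaped transfer function despite the serially correlated noise; jointly estimating the ARMA noise structure, as the paper does, further improves efficiency by whitening the residuals before smoothing.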

  6. Fuzzy parametric uncertainty analysis of linear dynamical systems: A surrogate modeling approach

    NASA Astrophysics Data System (ADS)

    Chowdhury, R.; Adhikari, S.

    2012-10-01

    Uncertainty propagation in engineering systems poses significant computational challenges. This paper explores the possibility of using a correlated-function-expansion-based metamodelling approach when uncertain system parameters are modeled using fuzzy variables. In particular, the application of High-Dimensional Model Representation (HDMR) is proposed for fuzzy finite element analysis of dynamical systems. The HDMR expansion is a set of quantitative model assessment and analysis tools for capturing high-dimensional input-output system behavior based on a hierarchy of functions of increasing dimensions. The input variables may be either finite-dimensional (i.e., a vector of parameters chosen from the Euclidean space R^M) or infinite-dimensional, as in the function space C^M[0,1]. The computational effort to determine the expansion functions using the alpha-cut method scales polynomially with the number of variables rather than exponentially. This logic is based on the fundamental assumption underlying the HDMR representation that only low-order correlations among the input variables are likely to have significant impacts upon the outputs for most high-dimensional complex systems. The proposed method is integrated with a commercial finite element software package. Modal analysis of a simplified aircraft wing with fuzzy parameters has been used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions have been used and the results have been validated against direct Monte Carlo simulations.
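    The hierarchy of component functions can be illustrated with a first-order cut-HDMR expansion about a reference point, here on a two-variable toy function whose small product term is exactly the residual that the first-order expansion leaves behind:

```python
import numpy as np

# First-order cut-HDMR of f about a reference ("cut") point x0:
#   f(x) ~= f0 + sum_i [ f(x0 with component i set to x_i) - f0 ]
def f(x1, x2):
    return x1 ** 2 + 3.0 * x2 + 0.1 * x1 * x2   # small interaction term

x0 = np.array([0.5, 0.5])
f0 = f(*x0)

def hdmr1(x):
    """First-order cut-HDMR approximation of f at x."""
    total = f0
    for i in range(2):
        xi = x0.copy()
        xi[i] = x[i]                 # vary one input, hold the rest at the cut
        total += f(*xi) - f0
    return total

x = np.array([1.2, -0.8])
exact = f(*x)
approx = hdmr1(x)
# The error equals the pure interaction 0.1*(x1 - 0.5)*(x2 - 0.5): the additive
# part is captured exactly by the first-order component functions.
```

    This is why the alpha-cut evaluation cost grows polynomially: only one-dimensional (and, if needed, two-dimensional) sweeps through the cut point are required, rather than a full tensor grid over all inputs.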

  7. A whole-brain computational modeling approach to explain the alterations in resting-state functional connectivity during progression of Alzheimer's disease.

    PubMed

    Demirtaş, Murat; Falcon, Carles; Tucholka, Alan; Gispert, Juan Domingo; Molinuevo, José Luis; Deco, Gustavo

    2017-01-01

    Alzheimer's disease (AD) is the most common dementia, with dramatic consequences. Research in structural and functional neuroimaging has shown altered brain connectivity in AD. In this study, we investigated the whole-brain resting-state functional connectivity (FC) of subjects with preclinical Alzheimer's disease (PAD), mild cognitive impairment due to AD (MCI) and mild dementia due to AD, the impact of APOE4 carriership, and the relation to variations in core AD CSF biomarkers. Whole-brain synchronization decreased monotonically over the course of disease progression. Furthermore, in AD patients we found widespread significant decreases in functional connectivity (FC) strengths, particularly in brain regions with high global connectivity. We employed a whole-brain computational modeling approach to study the mechanisms underlying these alterations. To characterize the causal interactions between brain regions, we estimated the effective connectivity (EC) in the model. We found that the significant EC differences in AD were primarily located in the left temporal lobe. Then, we systematically manipulated the underlying dynamics of the model to investigate simulated changes in FC based on the healthy control subjects. Furthermore, we found distinct patterns involving the CSF biomarkers amyloid-beta (Aβ1-42), total tau (t-tau) and phosphorylated tau (p-tau). CSF Aβ1-42 was associated with the contrast between healthy control subjects and clinical groups. Nevertheless, the tau CSF biomarkers were associated with the variability in whole-brain synchronization and sensory integration regions. These associations were robust across clinical groups, unlike the associations found for CSF Aβ1-42. APOE4 carriership showed no significant correlations with the connectivity measures.

  8. Expanding the Range of Plant Functional Diversity Represented in Global Vegetation Models: Towards Lineage-based Plant Functional Types

    NASA Astrophysics Data System (ADS)

    Still, C. J.; Griffith, D.; Edwards, E.; Forrestel, E.; Lehmann, C.; Anderson, M.; Craine, J.; Pau, S.; Osborne, C.

    2014-12-01

    Variation in plant species traits, such as photosynthetic and hydraulic properties, can indicate vulnerability or resilience to climate change, and can feed back to broad-scale spatial and temporal patterns in biogeochemistry, demographics, and biogeography. Yet predicting how vegetation will respond to future environmental changes is severely limited by the inability of our models to represent species-level trait variation in processes and properties, as current-generation process-based models are mostly based on the generalized and abstracted concept of plant functional types (PFTs), which were originally developed for hydrological modeling. For example, there are close to 11,000 grass species, but most vegetation models have only a single C4 grass and one or two C3 grass PFTs. However, while species trait databases are expanding rapidly, they have been produced mostly from unstructured research, with a focus on easily researched traits that are not necessarily the most important for determining plant function. Additionally, implementing realistic species-level trait variation in models is challenging. Combining related and ecologically similar species in these models might ameliorate this limitation. Here we argue for an intermediate, lineage-based approach to PFTs, which draws upon recent advances in gene sequencing and phylogenetic modeling, and in which variation in trait complexes and anatomical features is constrained by a shared evolutionary history. We provide an example of this approach with grass lineages that vary in photosynthetic pathway (C3 or C4) and other functional and structural traits. We use machine learning approaches and geospatial databases to infer the most important environmental controls and climate niche variation for the distribution of grass lineages, and utilize a rapidly expanding grass trait database to demonstrate examples of lineage-based grass PFTs. 
For example, grasses in the Andropogoneae are typically tall species that dominate wet and

  9. Simple model dielectric functions for insulators

    NASA Astrophysics Data System (ADS)

    Vos, Maarten; Grande, Pedro L.

    2017-05-01

The Drude dielectric function is a simple way of describing the dielectric function of free electron materials, which have a uniform electron density, in a classical way. The Mermin dielectric function describes a free electron gas, but is based on quantum physics. More complex metals have varying electron densities and are often described by a sum of Drude dielectric functions, the weight of each function being taken proportional to the volume with the corresponding density. Here we describe a slight variation on the Drude dielectric function that describes insulators in a semi-classical way, and a form of the Levine-Louie dielectric function including a relaxation time that does the same within the framework of quantum physics. In the optical limit the semi-classical description of an insulator and the quantum physics description coincide, in the same way as the Drude and Mermin dielectric functions coincide in the optical limit for metals. There is a simple relation between the coefficients used in the classical and quantum approaches, a relation that ensures that the obtained dielectric function corresponds to the right static refractive index. For water we give a comparison of the model dielectric function at non-zero momentum with inelastic X-ray measurements, both at relatively small momenta and in the Compton limit. The Levine-Louie dielectric function including a relaxation time describes the spectra at small momentum quite well, but in the Compton limit there are significant deviations.
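As a hedged illustration of the model family the abstract describes (not the authors' exact parametrization; the parameter values below are invented), the classical Drude form and a Lorentz-oscillator-like insulator variant can be sketched in a few lines. The static limit of the insulator form fixes the refractive index, as the abstract notes:

```python
import cmath

def eps_drude(w, wp, gamma):
    # Classical free-electron (Drude) dielectric function:
    # eps(w) = 1 - wp^2 / (w^2 + i*gamma*w)
    return 1.0 - wp**2 / (w * (w + 1j * gamma))

def eps_insulator(w, wp, w0, gamma):
    # Semi-classical insulator variant (Lorentz-oscillator form):
    # a finite resonance frequency w0 plays the role of the band gap.
    return 1.0 + wp**2 / (w0**2 - w**2 - 1j * gamma * w)

# Static limit fixes the refractive index: n = sqrt(eps(0))
wp, w0, gamma = 15.0, 10.0, 1.0   # illustrative values only
n_static = cmath.sqrt(eps_insulator(0.0, wp, w0, gamma)).real
print(n_static)   # sqrt(1 + wp^2/w0^2) ≈ 1.80
```

The relaxation time 1/gamma broadens the loss peak without changing the static index, which is the constraint the abstract refers to.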

  10. Modeling multivariate time series on manifolds with skew radial basis functions.

    PubMed

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodology on several example problems, including modeling data on manifolds and the prediction of chaotic time series.
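A minimal 1-D sketch of the iterative refinement idea, under simplifying assumptions: the paper places skew RBFs using a statistical hypothesis test, whereas this toy substitutes ordinary Gaussians and a simple max-residual placement rule, fitting each new amplitude by one-dimensional least squares:

```python
import math

def gauss(x, c, s):
    # Gaussian radial basis function centered at c with scale s.
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def greedy_rbf_fit(xs, ys, n_terms=5, scale=0.5):
    """Add one Gaussian RBF per step, centered where the current
    residual is largest, with amplitude from 1-D least squares."""
    terms = []                      # list of (amplitude, center)
    resid = list(ys)
    for _ in range(n_terms):
        i = max(range(len(xs)), key=lambda k: abs(resid[k]))
        c = xs[i]
        phi = [gauss(x, c, scale) for x in xs]
        a = sum(r * p for r, p in zip(resid, phi)) / sum(p * p for p in phi)
        terms.append((a, c))
        resid = [r - a * p for r, p in zip(resid, phi)]
    return terms, resid

xs = [i / 10 for i in range(-20, 21)]
ys = [math.sin(2 * x) for x in xs]
terms, resid = greedy_rbf_fit(xs, ys)
print(max(abs(r) for r in resid))   # residual shrinks as terms are added
```

In the actual algorithm the scale comes from zero crossings of the residual autocorrelation and placement stops when the hypothesis test passes; the fixed `scale` and term count here are placeholders.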

  11. A new approach for developing adjoint models

    NASA Astrophysics Data System (ADS)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it adopts the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, doing so requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternative approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves.
The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and
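The core identity behind the "model as a sequence of linear solves" abstraction can be shown with a toy example (the 2x2 system below is invented for illustration): for a single solve A x = b and functional J = c . x, one adjoint solve with the transposed operator yields the entire gradient dJ/db, which a finite-difference check confirms:

```python
def solve2(A, b):
    # Direct 2x2 linear solve by Cramer's rule.
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def transpose(A):
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

# Forward model: one linear solve A x = b; functional J(b) = c . x
A = [[4.0, 1.0], [2.0, 3.0]]
b = [1.0, 2.0]
c = [1.0, -1.0]
x = solve2(A, b)
J = sum(ci * xi for ci, xi in zip(c, x))

# Adjoint model: one solve with the transposed operator, A^T lam = c,
# yields the full gradient dJ/db = lam in a single extra solve.
lam = solve2(transpose(A), c)

# Check each component against finite differences.
h = 1e-7
for i in range(2):
    bp = list(b)
    bp[i] += h
    Jp = sum(ci * xi for ci, xi in zip(c, solve2(A, bp)))
    print(lam[i], (Jp - J) / h)   # pairs agree to ~1e-7
```

A library built on this abstraction records ("tapes") each solve and its dependencies, then replays the sequence in reverse with transposed operators; the cost is roughly one extra solve per forward solve, independent of the number of gradient components.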

  12. Approaches to surface complexation modeling of Uranium(VI) adsorption on aquifer sediments

    NASA Astrophysics Data System (ADS)

    Davis, James A.; Meece, David E.; Kohler, Matthias; Curtis, Gary P.

    2004-09-01

Uranium(VI) adsorption onto aquifer sediments was studied in batch experiments as a function of pH and U(VI) and dissolved carbonate concentrations in artificial groundwater solutions. The sediments were collected from an alluvial aquifer at a location upgradient of contamination from a former uranium mill operation at Naturita, Colorado (USA). The ranges of aqueous chemical conditions used in the U(VI) adsorption experiments (pH 6.9 to 7.9; U(VI) concentration 2.5 × 10⁻⁸ to 1 × 10⁻⁵ M; partial pressure of carbon dioxide gas 0.05 to 6.8%) were based on the spatial variation in chemical conditions observed in 1999-2000 in the Naturita alluvial aquifer. The major minerals in the sediments were quartz, feldspars, and calcite, with minor amounts of magnetite and clay minerals. Quartz grains commonly exhibited coatings that were greater than 10 nm in thickness and composed of an illite-smectite clay with occluded ferrihydrite and goethite nanoparticles. Chemical extractions of quartz grains removed from the sediments were used to estimate the masses of iron and aluminum present in the coatings. Various surface complexation modeling approaches were compared in terms of their ability to describe the U(VI) experimental data and their data requirements for model application to the sediments. Published models for U(VI) adsorption on reference minerals were applied to predict U(VI) adsorption based on assumptions about the sediment surface composition and physical properties (e.g., surface area and electrical double layer). Predictions from these models were highly variable, overpredicting or underpredicting the experimental data depending on the assumptions used to apply the model. Although the models for reference minerals are supported by detailed experimental studies (and in ideal cases, surface spectroscopy), the results suggest that errors arise when applying the models directly to the sediments because of uncertain knowledge of: 1) the proportion and types of

  13. Approaches to surface complexation modeling of Uranium(VI) adsorption on aquifer sediments

    USGS Publications Warehouse

    Davis, J.A.; Meece, D.E.; Kohler, M.; Curtis, G.P.

    2004-01-01

Uranium(VI) adsorption onto aquifer sediments was studied in batch experiments as a function of pH and U(VI) and dissolved carbonate concentrations in artificial groundwater solutions. The sediments were collected from an alluvial aquifer at a location upgradient of contamination from a former uranium mill operation at Naturita, Colorado (USA). The ranges of aqueous chemical conditions used in the U(VI) adsorption experiments (pH 6.9 to 7.9; U(VI) concentration 2.5 × 10⁻⁸ to 1 × 10⁻⁵ M; partial pressure of carbon dioxide gas 0.05 to 6.8%) were based on the spatial variation in chemical conditions observed in 1999-2000 in the Naturita alluvial aquifer. The major minerals in the sediments were quartz, feldspars, and calcite, with minor amounts of magnetite and clay minerals. Quartz grains commonly exhibited coatings that were greater than 10 nm in thickness and composed of an illite-smectite clay with occluded ferrihydrite and goethite nanoparticles. Chemical extractions of quartz grains removed from the sediments were used to estimate the masses of iron and aluminum present in the coatings. Various surface complexation modeling approaches were compared in terms of their ability to describe the U(VI) experimental data and their data requirements for model application to the sediments. Published models for U(VI) adsorption on reference minerals were applied to predict U(VI) adsorption based on assumptions about the sediment surface composition and physical properties (e.g., surface area and electrical double layer). Predictions from these models were highly variable, overpredicting or underpredicting the experimental data depending on the assumptions used to apply the model. Although the models for reference minerals are supported by detailed experimental studies (and in ideal cases, surface spectroscopy), the results suggest that errors arise when applying the models directly to the sediments because of uncertain knowledge of: 1) the proportion and types of

  14. Symbolic Regression for the Estimation of Transfer Functions of Hydrological Models

    NASA Astrophysics Data System (ADS)

    Klotz, D.; Herrnegger, M.; Schulz, K.

    2017-11-01

Current concepts for parameter regionalization of spatially distributed rainfall-runoff models rely on the a priori definition of transfer functions that globally map land surface characteristics (such as soil texture, land use, and digital elevation) into the model parameter space. However, these transfer functions are often chosen ad hoc or derived from small-scale experiments. This study proposes and tests an approach for inferring the structure and parametrization of possible transfer functions from runoff data to potentially circumvent these difficulties. The concept uses context-free grammars to generate candidate transfer functions. The resulting structures can then be parametrized with classical optimization techniques. Several virtual experiments are performed to examine the potential for an appropriate estimation of transfer functions, all of them using a very simple conceptual rainfall-runoff model with data from the Austrian Mur catchment. The results suggest that a priori defined transfer functions are in general well identifiable by the method. However, the deduction process might be inhibited, e.g., by noise in the runoff observation data, often leading to transfer function estimates of lower structural complexity.
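The grammar-based generation step can be sketched as follows. This is a toy, not the study's grammar: the attribute names `sand` and `slope` and the constant `c0` are hypothetical stand-ins for land-surface characteristics and tunable coefficients:

```python
import random

# Toy context-free grammar proposing candidate transfer functions that
# map land-surface attributes to a model parameter.
# Nonterminals: E (expression), F (factor); everything else is terminal.
GRAMMAR = {
    "E": [["(", "E", "+", "E", ")"], ["(", "E", "*", "E", ")"], ["F"]],
    "F": [["sand"], ["slope"], ["c0"]],
}

def derive(symbol="E", depth=0, max_depth=4, rng=random):
    if symbol not in GRAMMAR:          # terminal token
        return [symbol]
    rules = GRAMMAR[symbol]
    if depth >= max_depth and symbol == "E":
        rules = [["F"]]                # force termination at the depth limit
    out = []
    for s in rng.choice(rules):
        out.extend(derive(s, depth + 1, max_depth, rng))
    return out

random.seed(1)
expr = " ".join(derive())
print(expr)   # a random candidate, e.g. something like "( sand * c0 )"
```

In the full approach, each generated string is parametrized (here by optimizing the value substituted for `c0`) and scored against runoff observations, so both structure and coefficients are inferred from data.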

  15. The Independent Gradient Model: A New Approach for Probing Strong and Weak Interactions in Molecules from Wave Function Calculations.

    PubMed

    Lefebvre, Corentin; Khartabil, Hassan; Boisson, Jean-Charles; Contreras-García, Julia; Piquemal, Jean-Philip; Hénon, Eric

    2018-03-19

Extraction of the chemical interaction signature from local descriptors based on electron density (ED) is still a fruitful field of development in chemical interpretation. In a previous work that used promolecular ED (frozen ED), a new descriptor, δg, was defined. It represents the difference between a virtual upper limit of the ED gradient (∇ρIGM, where IGM = independent gradient model), which represents a noninteracting system, and the true ED gradient (∇ρ). It can be seen as a measure of electron sharing brought about by ED contragradience. A compelling feature of this model is that it provides an automatic workflow that extracts the signature of interactions between selected groups of atoms. As with the noncovalent interaction (NCI) approach, it provides chemists with a visual understanding of the interactions present in chemical systems. ∇ρIGM is obtained simply by using absolute values when summing the individual gradient contributions that make up the total ED gradient. Here, we extend this model to relaxed ED calculated from a wave function. To this end, we formulated gradient-based partitioning (GBP) to assess the contribution of each orbital to the total ED gradient. We highlight these new possibilities with two prototypical examples from organic chemistry: the unconventional hexamethylbenzene dication, with a hexa-coordinated carbon atom, and β-thioaminoacrolein. It is shown how a bond-by-bond picture can be obtained from a wave function, which opens the way to monitoring specific interactions along reaction paths.
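The promolecular form of the descriptor can be illustrated on a 1-D grid. The Gaussian "atomic" densities below are invented stand-ins for real atomic EDs: δg = |∇ρ|_IGM − |∇ρ|, where the IGM bound sums absolute values of the atomic gradient contributions, so δg is nonzero only where contributions oppose (contragradience):

```python
import math

def atom_grad(x, center, alpha=1.0):
    # Gradient of a Gaussian 'atomic' density exp(-alpha*(x-c)^2) in 1-D.
    return -2.0 * alpha * (x - center) * math.exp(-alpha * (x - center) ** 2)

centers = [-1.0, 1.0]                   # two 'atoms'
xs = [i / 50 for i in range(-150, 151)]

delta_g = []
for x in xs:
    grads = [atom_grad(x, c) for c in centers]
    g_true = abs(sum(grads))            # |grad rho| of the promolecule
    g_igm = sum(abs(g) for g in grads)  # IGM upper bound (no cancellation)
    delta_g.append(g_igm - g_true)

# delta_g is non-negative and peaks midway between the atoms, where the
# two gradient contributions oppose each other (electron-sharing signature).
print(max(delta_g), delta_g[len(delta_g) // 2])
```

The GBP extension described in the abstract replaces the frozen atomic contributions with orbital contributions from a wave function, but the contrast between the summed-magnitude bound and the true gradient is the same.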

  16. Identifying Similarities in Cognitive Subtest Functional Requirements: An Empirical Approach

    ERIC Educational Resources Information Center

    Frisby, Craig L.; Parkin, Jason R.

    2007-01-01

    In the cognitive test interpretation literature, a Rational/Intuitive, Indirect Empirical, or Combined approach is typically used to construct conceptual taxonomies of the functional (behavioral) similarities between subtests. To address shortcomings of these approaches, the functional requirements for 49 subtests from six individually…

  17. On continuous and discontinuous approaches for modeling groundwater flow in heterogeneous media using the Numerical Manifold Method: Model development and comparison

    NASA Astrophysics Data System (ADS)

    Hu, Mengsu; Wang, Yuan; Rutqvist, Jonny

    2015-06-01

One major challenge in modeling groundwater flow within heterogeneous geological media is that of modeling arbitrarily oriented or intersected boundaries and inner material interfaces. The Numerical Manifold Method (NMM) has recently emerged as a promising method for such modeling, given its ability to handle boundaries, its flexibility in constructing physical cover functions (continuous or with gradient jump), its meshing efficiency with a fixed mathematical mesh (covers), its convenience for enhancing approximation precision, and its integration precision, achieved by simplex integration. In this paper, we report on developing and comparing two new approaches for boundary constraints using the NMM, namely a continuous approach with jump functions and a discontinuous approach with Lagrange multipliers. In the discontinuous Lagrange multiplier method (LMM), the material interfaces are regarded as discontinuities which divide mathematical covers into different physical covers. We define and derive stringent forms of Lagrange multipliers to link the divided physical covers, thus satisfying the continuity requirement of the refraction law. In the continuous Jump Function Method (JFM), the material interfaces are regarded as inner interfaces contained within physical covers. We define jump terms to represent the discontinuity of the head gradient across an interface and thereby satisfy the refraction law. We then make a theoretical comparison between the two approaches in terms of global degrees of freedom, treatment of multiple material interfaces, treatment of small areas, treatment of moving interfaces, the feasibility of coupling with mechanical analysis, and applicability to other numerical methods. The newly derived boundary-constraint approaches are coded into an NMM model for groundwater flow analysis, and tested for precision and efficiency on different simulation examples. We first test the LMM for a Dirichlet boundary and then test both LMM and JFM for an

  18. Predicting seasonal influenza transmission using functional regression models with temporal dependence.

    PubMed

    Oviedo de la Fuente, Manuel; Febrero-Bande, Manuel; Muñoz, María Pilar; Domínguez, Àngela

    2018-01-01

This paper proposes a novel approach that uses meteorological information to predict the incidence of influenza in Galicia (Spain). It extends Generalized Least Squares (GLS) methods in the multivariate framework to functional regression models with dependent errors. These kinds of models are useful when the recent history of influenza incidence is not readily available (for instance, owing to delays in communication with health informants) and the prediction must be constructed by correcting for the temporal dependence of the residuals and using more accessible variables. A simulation study shows that the GLS estimators render better estimates of the parameters associated with the regression model than the classical models do. They obtain extremely good results from the predictive point of view and are competitive with the classical time series approach for the incidence of influenza. An iterative version of the GLS estimator (called iGLS) is also proposed that can help to model complicated dependence structures. For constructing the model, the distance correlation measure [Formula: see text] was employed to select relevant information for predicting the influenza rate, mixing multivariate and functional variables. These kinds of models are extremely useful to health managers in allocating resources in advance to manage influenza epidemics.
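The GLS idea can be shown in its simplest scalar special case (all numbers below are invented, and the functional-covariate machinery of the paper is omitted): with AR(1) errors of known coefficient rho, quasi-differencing whitens the residuals, and ordinary least squares on the transformed data is the GLS estimator:

```python
import random

random.seed(42)
n, beta, rho = 400, 2.0, 0.8
x = [random.gauss(0, 1) for _ in range(n)]
e = [0.0]
for _ in range(n - 1):                  # AR(1) errors: e_t = rho*e_{t-1} + noise
    e.append(rho * e[-1] + random.gauss(0, 1))
y = [beta * xi + ei for xi, ei in zip(x, e)]

def ols_slope(xs, ys):
    # No-intercept least-squares slope.
    return sum(a * b for a, b in zip(xs, ys)) / sum(a * a for a in xs)

# Naive OLS ignores the serial dependence of the errors.
b_ols = ols_slope(x, y)

# GLS via quasi-differencing: y*_t = y_t - rho*y_{t-1} (likewise x*),
# which whitens AR(1) errors; then run ordinary least squares.
xs_star = [x[t] - rho * x[t - 1] for t in range(1, n)]
ys_star = [y[t] - rho * y[t - 1] for t in range(1, n)]
b_gls = ols_slope(xs_star, ys_star)
print(b_ols, b_gls)   # both near the true slope 2.0
```

The iGLS variant mentioned in the abstract alternates between estimating the dependence structure (here, rho) from residuals and re-solving the whitened regression, rather than assuming rho is known.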

  19. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

    NASA Astrophysics Data System (ADS)

    Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

    2016-12-01

Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. Given the large biodiversity of tropical forests, it is important to account for this functional diversity in order to better predict tropical forest responses to future climate changes. Several next-generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found that surprisingly few categories of size-survival functions emerge. This indicates that fundamental strategies at play across diverse forests constrain the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.

  20. An interdisciplinary approach for earthquake modelling and forecasting

    NASA Astrophysics Data System (ADS)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

Earthquakes are among the most serious natural disasters, and may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) has become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performances of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
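A minimal sketch of such a conditional intensity, with an exponential self-exciting kernel and an optional external (non-catalog) term; the parameter values, event times, and the form of the external term are all invented for illustration:

```python
import math

def intensity(t, events, mu=0.1, alpha=0.5, beta=1.0, external=None):
    """Conditional intensity of a self- and mutually exciting point process:
    lambda(t) = mu + sum over past events of alpha*exp(-beta*(t - t_i))
                   + optional external (non-catalog) excitation term."""
    lam = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)
    if external is not None:
        lam += external(t)
    return lam

events = [1.0, 1.5, 4.0]                     # past earthquake times (toy catalog)
ext = lambda t: 0.2 * math.exp(-0.5 * t)     # toy precursory-signal term
print(intensity(2.0, events, external=ext))  # elevated just after the cluster
print(intensity(10.0, events, external=ext)) # decays back toward the rate mu
```

In a full model the parameters (mu, alpha, beta, and the weighting of the external term) would be fit by maximizing the point-process log-likelihood over the catalog and the non-catalog observations.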

  1. Model-driven discovery of underground metabolic functions in Escherichia coli.

    PubMed

    Guzmán, Gabriela I; Utrilla, José; Nurk, Sergey; Brunk, Elizabeth; Monk, Jonathan M; Ebrahim, Ali; Palsson, Bernhard O; Feist, Adam M

    2015-01-20

Enzyme promiscuity toward substrates has been discussed in evolutionary terms as providing the flexibility to adapt to novel environments. In the present work, we describe an approach toward exploring such enzyme promiscuity in the space of a metabolic network. This approach leverages genome-scale models, which have been widely used for predicting growth phenotypes in various environments or following a genetic perturbation; however, these predictions occasionally fail. Failed predictions of gene essentiality offer an opportunity for targeting biological discovery, suggesting the presence of unknown underground pathways stemming from enzymatic cross-reactivity. We demonstrate a workflow that couples constraint-based modeling and bioinformatic tools with knockout (KO) strain analysis and adaptive laboratory evolution for the purpose of predicting promiscuity at the genome scale. Three cases of genes that are incorrectly predicted as essential in Escherichia coli (aspC, argD, and gltA) are examined, and isozyme functions are uncovered for each to a different extent. Seven isozyme functions based on genetic and transcriptional evidence are suggested between the genes aspC and tyrB, argD and astC, gabT and puuE, and gltA and prpC. This study demonstrates how a targeted model-driven approach to discovery can systematically fill knowledge gaps, characterize underground metabolism, and elucidate regulatory mechanisms of adaptation in response to gene KO perturbations.

  2. Goal-Function Tree Modeling for Systems Engineering and Fault Management

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Johnson, Stephen B.

    2013-01-01

The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function" (Johnson 2011, 605). Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of the relationships between SE, SHM, and FM provides hints toward a modeling approach to

  3. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of the process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the development of the PCS model. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. The use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  4. Model dielectric function for 2D semiconductors including substrate screening

    NASA Astrophysics Data System (ADS)

    Trolle, Mads L.; Pedersen, Thomas G.; Véniard, Valerie

    2017-01-01

Dielectric screening of excitons in 2D semiconductors is known to be a highly non-local effect, which in reciprocal space translates to a strong dependence on momentum transfer q. We present an analytical model dielectric function, including the full nonlinear q dependence, which may be used as an alternative to more numerically taxing ab initio screening functions. By verifying the good agreement between excitonic optical properties calculated using our model dielectric function and those derived from ab initio methods, we demonstrate the versatility of this approach. Our test systems include monolayer hBN, monolayer MoS2, and the surface exciton of a 2 × 1 reconstructed Si(111) surface. Additionally, using our model, we easily take substrate screening effects into account. Hence, we also include a systematic study of the effects of substrate media on the excitonic optical properties of MoS2 and hBN.

  5. "Shape function + memory mechanism"-based hysteresis modeling of magnetorheological fluid actuators

    NASA Astrophysics Data System (ADS)

    Qian, Li-Jun; Chen, Peng; Cai, Fei-Long; Bai, Xian-Xu

    2018-03-01

A hysteresis model based on a "shape function + memory mechanism" is presented and its feasibility is verified through modeling the hysteresis behavior of a magnetorheological (MR) damper. A hysteresis phenomenon in a resistor-capacitor (RC) circuit is first presented and analyzed. In the hysteresis model, the "memory mechanism" originating from the charging and discharging processes of the RC circuit is constructed by adopting a virtual displacement variable and updating laws for the reference points. The "shape function" is derived and generalized from analytical solutions of the simple semi-linear Duhem model. With this approach, the memory mechanism reveals the essence of the specific Duhem model and the general shape function provides a direct and clear means of fitting the hysteresis loop. Within the structure of a "restructured phenomenological model", the original hysteresis operator, i.e., the Bouc-Wen operator, is replaced with the new hysteresis operator. Comparative work with the Bouc-Wen operator based model demonstrates the higher computational efficiency and comparable accuracy of the new hysteresis operator-based model.
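The RC-circuit memory mechanism that motivates the model can be reproduced in a few lines (time constant, period, and step size below are invented): driving an RC low-pass with a triangular voltage makes the capacitor voltage lag the input, so the same input level maps to different outputs on the charging and discharging branches, tracing a hysteresis loop:

```python
# Simulate an RC low-pass driven by a triangular voltage. Plotting Vc
# against Vin would trace a loop: the charging/discharging 'memory'
# underlying the hysteresis model.
tau, dt = 0.5, 0.001
period = 2.0

def v_in(t):
    # Triangular wave between 0 and 1.
    p = (t / period) % 1.0
    return 2 * p if p < 0.5 else 2 * (1 - p)

vc, up, down = 0.0, {}, {}
t = 0.0
for _ in range(int(3 * period / dt)):   # a few cycles to reach steady state
    vin = v_in(t)
    vc += dt * (vin - vc) / tau         # dVc/dt = (Vin - Vc)/tau
    if t > 2 * period:                  # record only the last (steady) cycle
        branch = up if (t / period) % 1.0 < 0.5 else down
        branch[round(vin, 2)] = vc
    t += dt

# Memory: at the same input level, the output differs by branch.
print(up[0.5], down[0.5])   # charging branch sits below the discharging branch
```

The paper's model abstracts exactly this: reference points remember where the last reversal occurred, and a shape function describes how the output relaxes toward the input between reversals.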

  6. A secured e-tendering modeling using misuse case approach

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

Major risk factors relating to electronic transactions may lead to destructive impacts on trust and transparency in the tendering process. Currently, electronic tendering (e-tendering) systems remain uncertain with respect to legal and security compliance and, most importantly, lack a clear security framework. In particular, the available systems do not adequately address integrity, confidentiality, authentication, and non-repudiation in e-tendering requirements. Thus, one of the challenges in developing an e-tendering system is to ensure the system requirements include functions for a secured and trusted environment. Therefore, this paper aims to model a secured e-tendering system using the misuse case approach. The modeling process begins with identifying the e-tendering process, which is based on the Australian Standard Code of Tendering (AS 4120-1994). This is followed by identifying security threats and their countermeasures. Then, the e-tendering process is modeled using the misuse case approach. The model can contribute to e-tendering developers and also to other researchers or experts in the e-tendering domain.

  7. Modeling and Circumventing the Effect of Sediments and Water Column on Receiver Functions

    NASA Astrophysics Data System (ADS)

    Audet, P.

    2017-12-01

    Teleseismic P-wave receiver functions are routinely used to resolve crust and mantle structure in various geologic settings. Receiver functions are approximations to the Earth's Green's functions and are composed of various scattered phase arrivals, depending on the complexity of the underlying Earth structure. For simple structure, the dominant arrivals (converted and back-scattered P-to-S phases) are well separated in time and can be reliably used in estimating crustal velocity structure. In the presence of sedimentary layers, strong reverberations typically produce high-amplitude oscillations that contaminate the early part of the wave train, and receiver functions can be difficult to interpret in terms of underlying structure. The effect of a water column also limits the interpretability of under-water receiver functions, due to the additional acoustic wave propagating within the water column that can contaminate structural arrivals. We perform numerical modeling of teleseismic Green's functions and receiver functions using a reflectivity technique for a range of Earth models that include thin sedimentary layers and an overlying water column. These modeling results indicate that, as expected, receiver functions are difficult to interpret in the presence of sediments, but the contaminating effect of the water column is dependent on the thickness of the water layer. To circumvent these effects and recover source-side structure, we propose using an approach based on transfer function modeling that bypasses receiver functions altogether and estimates crustal properties directly from the waveforms (Frederiksen and Delayney, 2015). Using this approach, reasonable assumptions about the properties of the sedimentary layer can be included in forward calculations of the Green's functions that are convolved with radial waveforms to predict vertical waveforms. Exploration of model space using Monte Carlo-style search and least-squares waveform misfits can be performed to
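
    The transfer-function idea (predict one component from the other and score the fit, bypassing receiver functions) can be sketched as follows; the function name and synthetic waveforms are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def transfer_misfit(radial, vertical, g):
    """Least-squares misfit between the observed vertical waveform and the
    radial waveform convolved with a candidate transfer function g.

    In a Monte Carlo-style search, g is generated from candidate crustal
    models (with assumed sediment properties) and the misfit is minimized."""
    pred = np.convolve(radial, g)[:len(vertical)]
    return float(np.sum((vertical - pred) ** 2))
```

    A model-space search then simply evaluates `transfer_misfit` for each candidate model's predicted transfer function and keeps the best-fitting ones.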

  8. MADGiC: a model-based approach for identifying driver genes in cancer

    PubMed Central

    Korthauer, Keegan D.; Kendziorski, Christina

    2015-01-01

    Motivation: Identifying and prioritizing somatic mutations is an important and challenging area of cancer research that can provide new insights into gene function as well as new targets for drug development. Most methods for prioritizing mutations rely primarily on frequency-based criteria, where a gene is identified as having a driver mutation if it is altered in significantly more samples than expected according to a background model. Although useful, frequency-based methods are limited in that all mutations are treated equally. It is well known, however, that some mutations have no functional consequence, while others may have a major deleterious impact. The spatial pattern of mutations within a gene provides further insight into their functional consequence. Properly accounting for these factors improves both the power and accuracy of inference. Also important is an accurate background model. Results: Here, we develop a Model-based Approach for identifying Driver Genes in Cancer (termed MADGiC) that incorporates both frequency and functional impact criteria and accommodates a number of factors to improve the background model. Simulation studies demonstrate advantages of the approach, including a substantial increase in power over competing methods. Further advantages are illustrated in an analysis of ovarian and lung cancer data from The Cancer Genome Atlas (TCGA) project. Availability and implementation: R code to implement this method is available at http://www.biostat.wisc.edu/~kendzior/MADGiC/. Contact: kendzior@biostat.wisc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25573922
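
    For contrast with MADGiC, the frequency-only baseline that the abstract critiques can be stated directly. The function name and the binomial background model below are illustrative assumptions; note that this test treats all mutations in a gene as equal, which is exactly the limitation MADGiC addresses.

```python
from math import comb

def driver_pvalue(k_mutated, n_samples, background_rate):
    """Frequency-based screen: P(X >= k) under a binomial background model,
    where background_rate is the per-sample probability that the gene is
    mutated by chance (e.g. gene length x per-base mutation rate)."""
    return sum(comb(n_samples, i)
               * background_rate**i * (1 - background_rate)**(n_samples - i)
               for i in range(k_mutated, n_samples + 1))
```

    A gene altered in many more samples than the background predicts gets a small p-value, regardless of where in the gene the mutations fall or whether they are functionally consequential.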

  9. Two approaches to estimating the effect of parenting on the development of executive function in early childhood.

    PubMed

    Blair, Clancy; Raver, C Cybele; Berry, Daniel J

    2014-02-01

    In the current article, we contrast 2 analytical approaches to estimating the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory control, working memory, and attention shifting. Residualized change analysis indicated that higher-quality parenting, as indicated by higher scores on widely used measures of parenting at both earlier and later time points, predicted greater gains in executive function at 60 months. Latent change score models, in which parenting and executive function over time were held to standards of longitudinal measurement invariance, provided additional evidence of the association between change in parenting quality and change in executive function. In these models, cross-lagged paths indicated that in addition to parenting predicting change in executive function, executive function bidirectionally predicted change in parenting quality. Results were robust to the addition of covariates, including child sex, race, maternal education, and household income-to-needs ratio. Strengths and drawbacks of the 2 analytic approaches are discussed, and the findings are considered in light of emerging methodological innovations for testing the extent to which executive function is malleable and open to the influence of experience.
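
    A minimal sketch of residualized change analysis on simulated data (variable names and effect sizes are invented for illustration; this is not the study's dataset): regressing the later score on the earlier score plus parenting makes the parenting coefficient an estimate of its association with *change* in executive function.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
parenting = rng.normal(size=n)              # parenting quality score
ef_36 = rng.normal(size=n)                  # executive function at 36 months
# simulated truth: EF at 60 months depends on earlier EF plus parenting
ef_60 = 0.6 * ef_36 + 0.3 * parenting + 0.1 * rng.normal(size=n)

X = np.column_stack([np.ones(n), ef_36, parenting])
beta, *_ = np.linalg.lstsq(X, ef_60, rcond=None)
# beta[2] estimates the association of parenting with residualized change
```

    Latent change score models go further by imposing longitudinal measurement invariance and allowing cross-lagged (bidirectional) paths, which plain residualized regression cannot represent.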

  10. Robust, Adaptive Functional Regression in Functional Mixed Model Framework.

    PubMed

    Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S

    2011-09-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient

  11. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient

  12. Fast Geometric Consensus Approach for Protein Model Quality Assessment

    PubMed Central

    Adamczak, Rafal; Pillardy, Jaroslaw; Vallat, Brinda K.

    2011-01-01

    Model quality assessment (MQA) is an integral part of protein structure prediction methods that typically generate multiple candidate models. The challenge lies in ranking and selecting the best models using a variety of physical, knowledge-based, and geometric consensus (GC)-based scoring functions. In particular, 3D-Jury and related GC methods assume that well-predicted (sub-)structures are more likely to occur frequently in a population of candidate models than incorrectly folded fragments. While this approach is very successful for diversified sets of models, identifying similar substructures is computationally expensive, since all pairs of models need to be superimposed using MaxSub or related heuristics for structure-to-structure alignment. Here, we consider a fast alternative in which structural similarity is assessed using 1D profiles, e.g., consisting of relative solvent accessibilities and secondary structures of equivalent amino acid residues in the respective models. We show that the new approach, dubbed 1D-Jury, makes it possible to implicitly compare and rank N models in O(N) time, as opposed to the quadratic complexity of 3D-Jury and related clustering-based methods. In addition, 1D-Jury avoids computationally expensive 3D superposition of pairs of models. At the same time, structural similarity scores based on 1D profiles are shown to correlate strongly with those obtained using MaxSub. In terms of the ability to select the best models as top candidates, 1D-Jury performs on par with other GC methods. Other potential applications of the new approach, including fast clustering of large numbers of intermediate structures generated by folding simulations, are discussed as well. PMID:21244273
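
    A minimal sketch of a 1D-Jury-style consensus, under the assumption that each model is summarized by a string of per-residue states (e.g. secondary structure); the scoring rule shown (total support of each model's profile in the position-wise consensus) is a simplification of the published method, but it makes the O(N) structure of the computation concrete.

```python
from collections import Counter

def one_d_jury_rank(profiles):
    """Rank candidate models by agreement of their 1D profiles with the
    position-wise consensus of the whole population.

    Building the per-position counts and scoring every model are both
    linear in the number of models N, versus the O(N^2) pairwise 3D
    superpositions needed by 3D-Jury-style methods."""
    length = len(profiles[0])
    columns = [Counter(p[i] for p in profiles) for i in range(length)]

    def support(p):  # how often each residue state occurs in the population
        return sum(columns[i][p[i]] for i in range(length))

    return sorted(range(len(profiles)),
                  key=lambda k: support(profiles[k]), reverse=True)
```

    Models whose profiles resemble the bulk of the population rank first; outliers (here, the all-E profile) rank last, mirroring the consensus assumption that frequent substructures are more likely correct.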

  13. An endorsement-based approach to student modeling for planner-controlled intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Murray, William R.

    1990-01-01

    An approach is described to student modeling for intelligent tutoring systems based on an explicit representation of the tutor's beliefs about the student and the arguments for and against those beliefs (called endorsements). A lexicographic comparison of arguments, sorted according to evidence reliability, provides a principled means of determining which beliefs are considered true, false, or uncertain. Each of these beliefs is ultimately justified by underlying assessment data. The endorsement-based approach to student modeling is particularly appropriate for tutors controlled by instructional planners, which place greater demands on a student model than opportunistic tutors. Numerical calculi approaches are less well suited because it is difficult to correctly assign numbers for evidence reliability and rule plausibility; it may also be difficult to interpret final results and provide suitable combining functions. When numeric measures of uncertainty are used, arbitrary numeric thresholds are often required for planning decisions, which is inappropriate when robust, context-sensitive planning decisions must be made. A TMS-based implementation of the endorsement-based approach to student modeling is presented, the approach is compared to alternatives, and a project history is provided describing its evolution.
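
    The lexicographic comparison of endorsements can be sketched as follows; the reliability classes and the three-way outcome are illustrative assumptions, not the paper's actual taxonomy. The key property is that a single highly reliable argument outweighs any number of less reliable ones, with no numeric weights to tune.

```python
def belief_status(for_args, against_args):
    """Compare endorsements for/against a belief lexicographically by
    evidence-reliability class (larger class = more reliable evidence).
    Arguments are (reliability_class, description) pairs."""
    fors = sorted((r for r, _ in for_args), reverse=True)
    againsts = sorted((r for r, _ in against_args), reverse=True)
    if fors > againsts:        # Python compares lists lexicographically
        return "believed"
    if againsts > fors:
        return "disbelieved"
    return "uncertain"
```

    Unlike a numerical calculus, the result needs no threshold: the ordering of reliability classes alone decides whether the belief is held true, false, or left uncertain.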

  14. 3D geometric modeling and simulation of laser propagation through turbulence with plenoptic functions

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Nelson, William; Davis, Christopher C.

    2014-10-01

    Plenoptic functions are functions that preserve all the necessary light field information of optical events. Theoretical work has demonstrated that geometry-based plenoptic functions can serve equally well in the traditional wave propagation equation known as the "scalar stochastic Helmholtz equation". However, in addressing problems of 3D turbulence simulation, the dominant methods using phase screen models have limitations both in explaining the choice of parameters (on the transverse plane) in real-world measurements, and in finding proper correlations between neighboring phase screens (the Markov assumption breaks down). Though possible corrections to phase screen models are still promising, the equivalent geometric approach based on plenoptic functions begins to show some advantages. In fact, in these geometric approaches, a continuous wave problem is reduced to discrete trajectories of rays. This allows for convenience in parallel computing and guarantees conservation of energy. Besides the pairwise independence of simulated rays, the assigned refractive index grids can be directly tested by temperature measurements with tiny thermoprobes combined with other parameters such as humidity level and wind speed. Furthermore, without loss of generality one can break the causal chain in phase screen models by defining regional refractive centers to allow rays that are less affected to propagate through directly. As a result, our work shows that the 3D geometric approach serves as an efficient and accurate method in assessing relevant turbulence problems with inputs of several environmental measurements and reasonable guesses (such as Cn² levels). This approach will facilitate analysis and possible corrections in lateral wave propagation problems, such as image de-blurring, prediction of laser propagation over long ranges, and improvement of free space optic communication systems. In this paper, the plenoptic function model and relevant parallel algorithm computing
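
    A minimal paraxial ray-marching sketch through a smooth refractive-index field; the small-angle update rule and all parameters are assumptions for illustration, not the authors' algorithm. Because each ray depends only on the index field, bundles of rays parallelize trivially, which is the computational advantage the abstract notes.

```python
def trace_ray(n, x, y, theta, ds=0.01, steps=500, eps=1e-4):
    """March one ray through a refractive-index field n(x, y) using the
    paraxial (small-angle) ray equations:
        dy/ds = theta,   d(theta)/ds ~ (1/n) * dn/dy
    The vertical index gradient is estimated by central differences."""
    for _ in range(steps):
        dn_dy = (n(x, y + eps) - n(x, y - eps)) / (2 * eps)
        theta += ds * dn_dy / n(x, y)
        x += ds
        y += ds * theta
    return x, y, theta
```

    In a uniform field the ray travels in a straight line; a vertical index gradient (e.g. from a temperature gradient) bends the ray toward higher index, as expected.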

  15. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
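
    The priority heuristic itself is simple enough to state as code. The sketch below covers two-outcome gain gambles and omits the published refinement of rounding aspiration levels to prominent numbers; it illustrates the limited, reason-wise search the process predictions are about.

```python
def priority_heuristic(g1, g2):
    """Priority heuristic for gain gambles (after Brandstaetter,
    Gigerenzer & Hertwig, 2006). A gamble is a list of (outcome,
    probability) pairs. Reasons are checked in a fixed order, stopping
    at the first one that discriminates:
      1. minimum gains                    (aspiration: 1/10 of max gain)
      2. probability of the minimum gains (aspiration: 0.1)
      3. maximum gains
    Returns 0 or 1, the index of the chosen gamble."""
    def min_gain(g):  return min(o for o, _ in g)
    def p_min(g):     return sum(p for o, p in g if o == min_gain(g))
    def max_gain(g):  return max(o for o, _ in g)

    aspiration = 0.1 * max(max_gain(g1), max_gain(g2))
    if abs(min_gain(g1) - min_gain(g2)) >= aspiration:
        return 0 if min_gain(g1) > min_gain(g2) else 1
    if abs(p_min(g1) - p_min(g2)) >= 0.1:
        return 0 if p_min(g1) < p_min(g2) else 1
    return 0 if max_gain(g1) >= max_gain(g2) else 1
```

    Note the contrast with an expectation model: no probabilities and outcomes are multiplied; search stops at the first discriminating reason, which is exactly the acquisition-frequency and direction-of-search behavior the experiments probe.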

  16. Viticulture microzoning: a functional approach aiming to grape and wine qualities

    NASA Astrophysics Data System (ADS)

    Bonfante, A.; Agrillo, A.; Albrizio, R.; Basile, A.; Buonomo, R.; De Mascellis, R.; Gambuti, A.; Giorio, P.; Guida, G.; Langella, G.; Manna, P.; Minieri, L.; Moio, L.; Siani, T.; Terribile, F.

    2014-12-01

    This paper aims to test a new physically oriented approach to viticulture zoning at the farm scale, strongly rooted in hydropedology and aiming to achieve a better use of environmental features with respect to plant requirements and wine production. The physical basis of our approach lies in the use of soil-plant-atmosphere simulation models, which apply physically based equations to describe soil hydrological processes and solve the soil-plant water status. This study (ZOVISA project) was conducted on a farm devoted to the production of high-quality wines (Aglianico DOC), located in southern Italy (Campania region, Mirabella Eclano-AV). The soil spatial distribution was obtained from a standard soil survey informed by a geophysical survey. Two homogeneous zones (HZs) were identified; in each of these, a physically based model was applied to solve the soil water balance and estimate the soil functional behaviour (crop water stress index, CWSI), defining the functional homogeneous zones (fHZs). In the latter, experimental plots were established and monitored to investigate soil-plant water status, crop development (biometric and physiological parameters) and daily climate variables (temperature, solar radiation, rainfall, wind). The effects of crop water status on crop response over must and wine quality were then evaluated in the fHZs. This was performed by comparing crop water stress with (i) crop physiological measurements (leaf gas exchange, chlorophyll a fluorescence, leaf water potential, chlorophyll content, LAI measurements), (ii) grape bunch measurements (berry weight, sugar content, titratable acidity, etc.) and (iii) wine quality (aromatic response). Ultimately, this experiment proved the usefulness of the physically based approach also for mapping viticulture microzones.

  17. Gaussian functional regression for output prediction: Model assimilation and experimental design

    NASA Astrophysics Data System (ADS)

    Nguyen, N. C.; Peraire, J.

    2016-03-01

    In this paper, we introduce a Gaussian functional regression (GFR) technique that integrates multi-fidelity models with model reduction to efficiently predict the input-output relationship of a high-fidelity model. The GFR method combines the high-fidelity model with a low-fidelity model to provide an estimate of the output of the high-fidelity model in the form of a posterior distribution that can characterize uncertainty in the prediction. A reduced basis approximation is constructed upon the low-fidelity model and incorporated into the GFR method to yield an inexpensive posterior distribution of the output estimate. As this posterior distribution depends crucially on a set of training inputs at which the high-fidelity models are simulated, we develop a greedy sampling algorithm to select the training inputs. Our approach results in an output prediction model that inherits the fidelity of the high-fidelity model and has the computational complexity of the reduced basis approximation. Numerical results are presented to demonstrate the proposed approach.
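
    A minimal Gaussian-process regression sketch of the kind GFR builds on; the squared-exponential kernel, 1-D inputs, and the closing discrepancy comment are simplifying assumptions for illustration, not the paper's formulation (which works with functional inputs and a reduced basis approximation of the low-fidelity model).

```python
import numpy as np

def gp_posterior(X_train, y_train, X_test, length=1.0, sigma=1.0, noise=1e-6):
    """Posterior mean/variance of GP regression with a squared-exponential
    kernel on 1-D inputs; the variance quantifies prediction uncertainty."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sigma**2 * np.exp(-0.5 * (d / length) ** 2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = k(X_test, X_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = sigma**2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

# multi-fidelity flavour: place the GP prior on the discrepancy d(x) between
# high- and low-fidelity outputs, so y_hi(x) ~ y_lo(x) + d(x); the expensive
# model is then only evaluated at the (greedily chosen) training inputs
```

    The posterior variance is what a greedy sampling algorithm can exploit: the next high-fidelity simulation is placed where the current prediction is most uncertain.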

  18. Reconstructing Regional Ionospheric Electron Density: A Combined Spherical Slepian Function and Empirical Orthogonal Function Approach

    NASA Astrophysics Data System (ADS)

    Farzaneh, Saeed; Forootan, Ehsan

    2018-03-01

    Computerized ionospheric tomography is a method for imaging the Earth's ionosphere using a sounding technique and computing slant total electron content (STEC) values from global positioning system (GPS) data. The most common approach to ionospheric tomography is the voxel-based model, in which (1) the ionosphere is divided into voxels, (2) the STEC is measured along (many) satellite signal paths, and (3) an inversion procedure is applied to reconstruct the electron density distribution of the ionosphere. In this study, a computationally efficient approach is introduced that improves the inversion procedure of step 3. Our proposed method combines empirical orthogonal functions and spherical Slepian base functions to describe the vertical and horizontal distributions of electron density, respectively; thus, it can be applied to regional and global case studies. A numerical application is demonstrated using ground-based GPS data over South America. Our results are validated against ionospheric tomography obtained from Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) observations and the global ionosphere map estimated by international centers, as well as by comparison with STEC derived from independent GPS stations. Using the proposed approach, we find that with 30 GPS measurements in South America, one can achieve accuracy comparable to that of COSMIC data within the reported accuracy (1 × 10¹¹ el/cm³) of the product. Comparisons with real observations from two GPS stations indicate that the absolute difference is less than 2 TECU (where 1 total electron content unit, TECU, is 10¹⁶ electrons/m²).
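
    The vertical (EOF) half of the method can be sketched with synthetic profiles; the two assumed Gaussian modes stand in for real electron-density structure, and the horizontal Slepian basis is omitted. The point of the decomposition is that a handful of vertical basis functions captures nearly all profile variability, which shrinks the inversion.

```python
import numpy as np

rng = np.random.default_rng(1)
alts = np.linspace(100, 1000, 50)                 # altitude grid (km)
# two assumed vertical modes standing in for real electron-density structure
mode1 = np.exp(-((alts - 300.0) / 100.0) ** 2)
mode2 = np.exp(-((alts - 600.0) / 150.0) ** 2)
coefs = rng.normal(size=(200, 2))                 # 200 synthetic profiles
profiles = coefs @ np.vstack([mode1, mode2])

# EOFs = right singular vectors of the centered profile matrix
mean = profiles.mean(axis=0)
U, s, Vt = np.linalg.svd(profiles - mean, full_matrices=False)
var_explained = s**2 / np.sum(s**2)
# truncate to the leading EOFs: they carry nearly all vertical variability
recon = mean + (U[:, :2] * s[:2]) @ Vt[:2]
```

    With the vertical behaviour compressed into a few EOF coefficients (and the horizontal behaviour into a few Slepian coefficients), the tomographic inversion estimates far fewer unknowns than one per voxel.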

  19. A mixed-mode traffic assignment model with new time-flow impedance function

    NASA Astrophysics Data System (ADS)

    Lin, Gui-Hua; Hu, Yu; Zou, Yuan-Yang

    2018-01-01

    Recently, with the wide adoption of electric vehicles, transportation networks have shown different characteristics and have been further developed. In this paper, we present a new time-flow impedance function, which may be more realistic than existing time-flow impedance functions. Based on this new impedance function, we present an optimization model for a mixed-mode traffic network in which travelers choose between battery electric vehicles (BEVs) and gasoline vehicles (GVs). We suggest two approaches to handle the model: one uses the interior point (IP) algorithm and the other employs the sequential quadratic programming (SQP) algorithm. Three numerical examples are presented to illustrate the efficiency of these approaches. In particular, our numerical results show that more travelers prefer BEVs when the distance limit of BEVs is long enough and the unit operating cost of GVs is higher than that of BEVs, and that the SQP algorithm is faster than the IP algorithm.
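
    Since the paper's new impedance function is not reproduced in the abstract, the sketch below uses the classical BPR time-flow impedance and a two-route toy network to illustrate what an impedance-based equilibrium assignment computes; all names and parameters are illustrative, and the paper's mixed BEV/GV structure is not modeled.

```python
def bpr(t0, flow, capacity, alpha=0.15, beta=4):
    """Classical BPR time-flow impedance: travel time grows with the
    flow-to-capacity ratio. (The paper's new impedance function is not
    reproduced here.)"""
    return t0 * (1 + alpha * (flow / capacity) ** beta)

def equilibrium_split(demand, t0_a, cap_a, t0_b, cap_b, tol=1e-9):
    """Two-route user equilibrium by bisection on the flow sent to route A:
    at equilibrium, neither route offers a shorter travel time."""
    lo, hi = 0.0, demand
    while hi - lo > tol:
        flow_a = 0.5 * (lo + hi)
        if bpr(t0_a, flow_a, cap_a) > bpr(t0_b, demand - flow_a, cap_b):
            hi = flow_a          # route A too slow: shift flow toward B
        else:
            lo = flow_a
    return 0.5 * (lo + hi)
```

    The full network problem replaces this bisection with a constrained optimization, which is where general-purpose solvers such as IP and SQP algorithms come in.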

  20. An approach to multiscale modelling with graph grammars.

    PubMed

    Ong, Yongzhi; Streit, Katarína; Henke, Michael; Kurth, Winfried

    2014-09-01

    Functional-structural plant models (FSPMs) simulate biological processes at different spatial scales. Methods exist for multiscale data representation and modification, but the advantages of using multiple scales in the dynamic aspects of FSPMs remain unclear. Results from multiscale models in various other areas of science that share fundamental modelling issues with FSPMs suggest that potential advantages do exist, and this study therefore aims to introduce an approach to multiscale modelling in FSPMs. A three-part graph data structure and grammar is revisited, and presented with a conceptual framework for multiscale modelling. The framework is used for identifying roles, categorizing and describing scale-to-scale interactions, thus allowing alternative approaches to model development as opposed to correlation-based modelling at a single scale. Reverse information flow (from macro- to micro-scale) is catered for in the framework. The methods are implemented within the programming language XL. Three example models are implemented using the proposed multiscale graph model and framework. The first illustrates the fundamental usage of the graph data structure and grammar, the second uses probabilistic modelling for organs at the fine scale in order to derive crown growth, and the third combines multiscale plant topology with ozone trends and metabolic network simulations in order to model juvenile beech stands under exposure to a toxic trace gas. The graph data structure supports data representation and grammar operations at multiple scales. The results demonstrate that multiscale modelling is a viable method in FSPM and an alternative to correlation-based modelling. Advantages and disadvantages of multiscale modelling are illustrated by comparisons with single-scale implementations, leading to motivations for further research in sensitivity analysis and run-time efficiency for these models.

  1. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Gaspar, James L.; Lazor, Daniel R.; Parks, Russell A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of non-conventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pretest predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.
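
    The response-surface-in-lieu-of-FE idea can be sketched as follows; the quadratic surrogate and the stand-in "finite element" response are illustrative assumptions. Once fitted, the cheap surface replaces the expensive solver inside the many evaluations an ANOVA-based sensitivity study requires.

```python
import numpy as np

rng = np.random.default_rng(2)

def fe_response(k1, k2):
    """Stand-in for an expensive finite element frequency-response output
    as a function of two normalized stiffness parameters."""
    return 3.0 * k1 + 0.5 * k2 + 0.2 * k1 * k2

# sample the "expensive" model, then fit a quadratic response surface
K = rng.uniform(-1.0, 1.0, size=(200, 2))
y = fe_response(K[:, 0], K[:, 1])
A = np.column_stack([np.ones(len(K)), K[:, 0], K[:, 1],
                     K[:, 0] * K[:, 1], K[:, 0]**2, K[:, 1]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
# the fitted coefficients expose parameter importance directly (here k1
# dominates), and the surface is cheap to evaluate in ANOVA sampling loops
```

    In the actual calibration the surface is refit per frequency band, so parameter importance can be assessed as a function of frequency, as the abstract describes.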

  2. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Lazor, Daniel R.; Gaspar, James L.; Parks, Russel A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of nonconventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pre-test predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.

  3. Evaluating scaling models in biology using hierarchical Bayesian approaches

    PubMed Central

    Price, Charles A; Ogle, Kiona; White, Ethan P; Weitz, Joshua S

    2009-01-01

    Theoretical models for allometric relationships between organismal form and function are typically tested by comparing a single predicted relationship with empirical data. Several prominent models, however, predict more than one allometric relationship, and comparisons among alternative models have not taken this into account. Here we evaluate several different scaling models of plant morphology within a hierarchical Bayesian framework that simultaneously fits multiple scaling relationships to three large allometric datasets. The scaling models include: inflexible universal models derived from biophysical assumptions (e.g. elastic similarity or fractal networks), a flexible variation of a fractal network model, and a highly flexible model constrained only by basic algebraic relationships. We demonstrate that variation in intraspecific allometric scaling exponents is inconsistent with the universal models, and that more flexible approaches that allow for biological variability at the species level outperform universal models, even when accounting for relative increases in model complexity. PMID:19453621
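
    A non-Bayesian stand-in can illustrate why species-level flexibility matters: when the data contain species-specific scaling exponents, a single "universal" exponent fits none of the species well. The data and exponents below are invented for illustration; the paper's actual machinery is a hierarchical Bayesian fit of multiple scaling relationships simultaneously.

```python
import numpy as np

rng = np.random.default_rng(4)
true_slopes = [0.9, 1.1, 1.4]          # species-level scaling exponents
x_all, y_all, labels = [], [], []
for i, b in enumerate(true_slopes):
    x = rng.uniform(0.0, 2.0, 60)      # log size measure, 60 plants/species
    x_all.append(x)
    y_all.append(b * x + 0.05 * rng.normal(size=60))   # log allometric trait
    labels.append(np.full(60, i))
x_all, y_all, labels = map(np.concatenate, (x_all, y_all, labels))

# universal model: one shared exponent; flexible model: one per species
b_universal = np.polyfit(x_all, y_all, 1)[0]
b_species = [np.polyfit(x_all[labels == i], y_all[labels == i], 1)[0]
             for i in range(3)]
```

    A hierarchical model sits between these extremes, partially pooling the species-level exponents toward a shared distribution while penalizing the added complexity.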

  4. Moving mode shape function approach for spinning disk and asymmetric disc brake squeal

    NASA Astrophysics Data System (ADS)

    Kang, Jaeyoung

    2018-06-01

    The solution approach for an asymmetric spinning disk under stationary friction loads requires mode shape functions fixed in the disk in the assumed mode method when the equations of motion are described in the space-fixed frame. This model description is termed the 'moving mode shape function approach', and it allows us to formulate the stationary contact load problem for both axisymmetric and asymmetric disks. Numerical results show that the eigenvalues of the time-periodic axisymmetric disk system are time-invariant. When the axisymmetry of the disk is broken, the positive real parts of the eigenvalues vary strongly with the rotation of the disk at slow speeds, as in applications such as disc brake squeal. Using Floquet stability analysis, it is also shown that breaking the axisymmetry of the disc alters the stability boundaries of the system.
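
    The Floquet machinery can be sketched on a Mathieu-type toy oscillator standing in for the time-periodic disc equations; the scalar equation, its coefficients, and the 2×2 reduction are illustrative assumptions, not the paper's model. Stability is read off the monodromy matrix (the state-transition matrix over one period).

```python
import math

def monodromy(a, b, steps=2000):
    """Monodromy matrix over one period T = 2*pi for the Mathieu-type
    oscillator x'' + (a + b*cos t) x = 0, computed by RK4 integration of
    the two fundamental solutions."""
    T, h = 2.0 * math.pi, 2.0 * math.pi / steps

    def deriv(t, y):
        x, v = y
        return (v, -(a + b * math.cos(t)) * x)

    def propagate(y):
        t = 0.0
        for _ in range(steps):
            k1 = deriv(t, y)
            k2 = deriv(t + h/2, (y[0] + h/2*k1[0], y[1] + h/2*k1[1]))
            k3 = deriv(t + h/2, (y[0] + h/2*k2[0], y[1] + h/2*k2[1]))
            k4 = deriv(t + h, (y[0] + h*k3[0], y[1] + h*k3[1]))
            y = (y[0] + h/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
                 y[1] + h/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))
            t += h
        return y

    c1 = propagate((1.0, 0.0))   # column 1: x(0)=1, x'(0)=0
    c2 = propagate((0.0, 1.0))   # column 2: x(0)=0, x'(0)=1
    return [[c1[0], c2[0]], [c1[1], c2[1]]]

def is_stable(M):
    """Floquet criterion for this undamped system (det M = 1): the
    multipliers lie on the unit circle iff |trace M| <= 2."""
    return abs(M[0][0] + M[1][1]) <= 2.0
```

    With b = 0 the system is a plain harmonic oscillator and the monodromy matrix is the identity; choosing (a, b) inside a parametric instability tongue (e.g. a = 0.25) pushes a Floquet multiplier off the unit circle.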

  5. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    PubMed

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships between a ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in the strength

  6. Gene-centric approach to integrating environmental genomics and biogeochemical models.

    PubMed

    Reed, Daniel C; Algar, Christopher K; Huber, Julie A; Dick, Gregory J

    2014-02-04

    Rapid advances in molecular microbial ecology have yielded an unprecedented amount of data about the evolutionary relationships and functional traits of microbial communities that regulate global geochemical cycles. Biogeochemical models, however, are trailing in the wake of the environmental genomics revolution, and such models rarely incorporate explicit representations of bacteria and archaea, nor are they compatible with nucleic acid or protein sequence data. Here, we present a functional gene-based framework for describing microbial communities in biogeochemical models by incorporating genomics data to provide predictions that are readily testable. To demonstrate the approach in practice, nitrogen cycling in the Arabian Sea oxygen minimum zone (OMZ) was modeled to examine key questions about cryptic sulfur cycling and dinitrogen production pathways in OMZs. Simulations support previous assertions that denitrification dominates over anammox in the central Arabian Sea, which has important implications for the loss of fixed nitrogen from the oceans. Furthermore, cryptic sulfur cycling was shown to attenuate the secondary nitrite maximum often observed in OMZs owing to changes in the composition of the chemolithoautotrophic community and dominant metabolic pathways. Results underscore the need to explicitly integrate microbes into biogeochemical models rather than just the metabolisms they mediate. By directly linking geochemical dynamics to the genetic composition of microbial communities, the method provides a framework for achieving mechanistic insights into patterns and biogeochemical consequences of marine microbes. Such an approach is critical for informing our understanding of the key role microbes play in modulating Earth's biogeochemistry.

  7. An Integrated model for Product Quality Development—A case study on Quality functions deployment and AHP based approach

    NASA Astrophysics Data System (ADS)

    Maitra, Subrata; Banerjee, Debamalya

    2010-10-01

    The present article concerns the application of product quality and design improvement related to the nature of failure of machinery and to plant operational problems of an industrial blower fan company. The project aims at developing the product on the basis of standardized production parameters for selling its products in the market. Special attention is also paid to blower fans ordered directly by the customer on the basis of the installed capacity of air to be provided by the fan. Quality function deployment is primarily a customer-oriented approach. The proposed model integrates QFD with AHP to select and rank decision criteria on commercial and technical factors, and to measure the decision parameters for selecting the best product in a competitive environment. The present AHP-QFD model justifies the selection of a blower fan with the help of a group of experts' opinions through pairwise comparison of the customer's and ergonomics-based technical design requirements. The steps involved in the implementation of QFD-AHP and the selection of weighted criteria may be helpful for all similar industries balancing cost and utility for a competitive product.
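The AHP step described above reduces to extracting priority weights from a reciprocal pairwise-comparison matrix, typically via its principal eigenvector, and checking Saaty's consistency ratio. A minimal sketch with a hypothetical three-criterion comparison (the criteria and judgments are illustrative, not from the study):

```python
import numpy as np

def ahp_weights(P):
    """Priority weights from a pairwise-comparison matrix P
    (P[i, j] = importance of criterion i over j, P[j, i] = 1/P[i, j]),
    via the principal eigenvector, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(P)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(P):
    """Saaty consistency ratio CR = CI / RI; CR < 0.1 is acceptable."""
    ri = {3: 0.58, 4: 0.90, 5: 1.12}          # Saaty's random indices
    n = P.shape[0]
    lam = np.max(np.linalg.eigvals(P).real)   # principal eigenvalue
    ci = (lam - n) / (n - 1)
    return ci / ri[n]

# Hypothetical 3-criterion comparison (e.g. cost, capacity, noise)
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w = ahp_weights(P)
print("weights:", w.round(3), "CR:", round(consistency_ratio(P), 3))
```

In a QFD-AHP model these weights would then scale the customer/technical requirements in the house of quality.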

  8. System Behavior Models: A Survey of Approaches

    DTIC Science & Technology

    2016-06-01

    System Behavior Models: A Survey of Approaches, by Scott R. Ruppel, June 2016. Thesis Advisor: Kristin Giammarco; Second Reader: John M. Green. Subject terms: Monterey Phoenix, Petri nets, behavior modeling, model-based systems engineering, modeling approaches, modeling survey. 85 pages.

  9. Combining Formal and Functional Approaches to Topic Structure

    ERIC Educational Resources Information Center

    Zellers, Margaret; Post, Brechtje

    2012-01-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to…

  10. Development of a structured approach for decomposition of complex systems on a functional basis

    NASA Astrophysics Data System (ADS)

    Yildirim, Unal; Felician Campean, I.

    2014-07-01

    The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses the practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).

  11. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    NASA Astrophysics Data System (ADS)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and the flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such an extrapolation is to use a mechanistic approach based on physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach is based on experimental data collected by using a number of treatments that allow a function to be derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data of current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach relies on the estimate of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles including the idea that metabolism is organised in the same way within all animals. The (standard) DEB model aims to describe
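The ASM idea of turning aerobic scope into habitat suitability can be sketched with toy response curves: standard metabolism rising with temperature, maximum metabolism peaking near an optimum, and suitability taken as scope relative to its maximum. All functional forms and parameter values below are illustrative placeholders, not fitted to any species:

```python
import numpy as np

def aerobic_scope(T, T_opt=18.0, T_crit=28.0, mmr_peak=12.0,
                  smr10=1.0, q10=2.0):
    """Toy aerobic scope AS(T) = MMR(T) - SMR(T) in mg O2/kg/h.
    SMR rises exponentially with temperature (Q10 rule); MMR is a
    dome peaking at T_opt and collapsing toward T_crit."""
    smr = smr10 * q10 ** ((T - 10.0) / 10.0)
    mmr = mmr_peak * np.exp(-((T - T_opt) / (T_crit - T_opt)) ** 2)
    return np.maximum(mmr - smr, 0.0)

# Habitat suitability on a toy temperature transect: scope vs. its max
temps = np.linspace(5, 30, 6)
scope = aerobic_scope(temps)
suitability = scope / scope.max()
for T, s in zip(temps, suitability):
    print(f"{T:5.1f} C  suitability {s:.2f}")
```

Applied to gridded ocean temperatures for current and future scenarios, the same mapping yields the habitat suitability maps described above.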

  12. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    PubMed Central

    Seeley, Matthew K.; Francom, Devin; Reese, C. Shane; Hopkins, J. Ty

    2017-01-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function. PMID:29339984

  13. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach.

    PubMed

    Park, Jihong; Seeley, Matthew K; Francom, Devin; Reese, C Shane; Hopkins, J Ty

    2017-12-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.
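The contrast between the discrete-point and functional analyses can be illustrated with a simplified pointwise stand-in for a full FANOVA (synthetic gait curves, not the study's data; the condition effect is deliberately placed away from the peak, so a peak-only test has nothing to detect):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = np.linspace(0, 100, 101)            # % of stance phase
n = 19                                   # paired subjects, as in the study

# Hypothetical joint-angle curves: conditions differ only in early stance
base = 20 * np.sin(np.pi * t / 100)
control = base + rng.normal(0, 2, (n, t.size))
effusion = base + 5 * np.exp(-((t - 20) / 8) ** 2) + rng.normal(0, 2, (n, t.size))

# Discrete-point analysis: paired t-test on peak values only
t_peak, p_peak = stats.ttest_rel(control.max(axis=1), effusion.max(axis=1))

# Functional (pointwise) analysis: paired t-test at every % of stance
p_curve = stats.ttest_rel(control, effusion, axis=0).pvalue
sig = t[p_curve < 0.05 / t.size]         # crude Bonferroni correction

print(f"peak-only p = {p_peak:.3f}; "
      f"significant region = {sig.min():.0f}-{sig.max():.0f}% of stance")
```

The pointwise test localizes the early-stance difference that the peak summary discards, which is the advantage (1)-(2) claimed above.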

  14. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihong; Seeley, Matthew K.; Francom, Devin

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus, when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  15. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE PAGES

    Park, Jihong; Seeley, Matthew K.; Francom, Devin; ...

    2017-12-28

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus, when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  16. Time shift in slope failure prediction between unimodal and bimodal modeling approaches

    NASA Astrophysics Data System (ADS)

    Ciervo, Fabio; Casini, Francesca; Nicolina Papa, Maria; Medina, Vicente

    2016-04-01

    Together with the need for more appropriate mathematical expressions to describe hydro-mechanical soil processes, a challenging issue is the need to consider the effects of terrain heterogeneities on the physical mechanisms; taking into account the implications of these heterogeneities for time-dependent hydro-mechanical variables would improve the predictive capacity of models, such as those used in early warning systems. The presence of heterogeneities in partially saturated slopes results in irregular propagation of the moisture and suction front. To mathematically represent the 'dual-implication' behaviour generally induced by heterogeneities in the hydraulic response of the terrain, several bimodal hydraulic models have been presented in the literature to replace the conventional sigmoidal/unimodal functions; this presupposes that the scale of the macrostructure is comparable with the local (Darcy) scale, so that the Richards model can be considered adequate to reproduce the processes mathematically. The purpose of this work is to focus on the differences in simulated flow infiltration processes and slope stability conditions arising from the preliminary choice of hydraulic model and, contextually, from different approaches to evaluating the factor of safety (FoS). In particular, the results of two approaches are compared. The first includes the conventional expression of the FoS under saturated conditions and the widely used hydraulic model of van Genuchten-Mualem. The second includes a generalized FoS equation for the infinite-slope model under variably saturated soil conditions (Lu and Godt, 2008) and the bimodal functions of Romano et al. (2011) to describe the hydraulic response. The extension of the above approach to the bimodal context is based on an analytical method to assess the effects of the hydraulic properties on soil shear, developed by integrating a bimodal lognormal hydraulic function
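The generalized infinite-slope FoS under variably saturated conditions can be sketched in one common form in the spirit of Lu and Godt (2008), where a suction-stress term augments the classical friction and cohesion terms. The exact expression and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def fs_infinite_slope(z, beta, c=5e3, phi=30.0, gamma=18e3, sigma_s=0.0):
    """Factor of safety for an infinite slope at depth z (m), slope angle
    beta (deg), effective cohesion c (Pa), friction angle phi (deg),
    unit weight gamma (N/m^3) and suction stress sigma_s (Pa, negative
    under suction). One common variably saturated extension of the
    classical infinite-slope equation (cf. Lu and Godt, 2008)."""
    b, p = np.radians(beta), np.radians(phi)
    return (np.tan(p) / np.tan(b)                               # friction
            + 2 * c / (gamma * z * np.sin(2 * b))               # cohesion
            - sigma_s * (np.tan(b) + 1 / np.tan(b))             # suction
              * np.tan(p) / (gamma * z))

# Suction (negative sigma_s) stabilizes; a wetting front removes it
fs_unsat = fs_infinite_slope(2.0, 40.0, sigma_s=-10e3)
fs_sat = fs_infinite_slope(2.0, 40.0, sigma_s=0.0)
print(f"FoS unsaturated: {fs_unsat:.2f}, saturated: {fs_sat:.2f}")
```

With these illustrative numbers the slope is stable (FoS > 1) while suction persists and fails (FoS < 1) once the infiltration front saturates the failure depth, which is precisely the time-shift mechanism the abstract examines.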

  17. Behavioral facilitation: a cognitive model of individual differences in approach motivation.

    PubMed

    Robinson, Michael D; Meier, Brian P; Tamir, Maya; Wilkowski, Benjamin M; Ode, Scott

    2009-02-01

    Approach motivation consists of the active, engaged pursuit of one's goals. The purpose of the present three studies (N = 258) was to examine whether approach motivation could be cognitively modeled, thereby providing process-based insights into personality functioning. Behavioral facilitation was assessed in terms of faster (or facilitated) reaction time with practice. As hypothesized, such tendencies predicted higher levels of approach motivation, higher levels of positive affect, and lower levels of depressive symptoms, and did so across cognitive, behavioral, self-reported, and peer-reported outcomes. Tendencies toward behavioral facilitation, on the other hand, did not correlate with self-reported traits (Study 1) and did not predict avoidance motivation or negative affect (all studies). The results indicate a systematic relationship between behavioral facilitation in cognitive tasks and approach motivation in daily life. Results are discussed in terms of the benefits of modeling the cognitive processes hypothesized to underlie individual differences in motivation, affect, and depression. (c) 2009 APA, all rights reserved

  18. Weighted functional linear regression models for gene-based association analysis.

    PubMed

    Belonogova, Nadezhda M; Svishcheva, Gulnara R; Wilson, James F; Campbell, Harry; Axenovich, Tatiana I

    2018-01-01

    Functional linear regression models are effectively used in gene-based association analysis of complex traits. These models combine information about individual genetic variants, taking into account their positions and reducing the influence of noise and/or observation errors. To increase the power of methods, where several differently informative components are combined, weights are introduced to give the advantage to more informative components. Allele-specific weights have been introduced to collapsing and kernel-based approaches to gene-based association analysis. Here we have for the first time introduced weights to functional linear regression models adapted for both independent and family samples. Using data simulated on the basis of GAW17 genotypes and weights defined by allele frequencies via the beta distribution, we demonstrated that type I errors correspond to declared values and that increasing the weights of causal variants allows the power of functional linear models to be increased. We applied the new method to real data on blood pressure from the ORCADES sample. Five of the six known genes with P < 0.1 in at least one analysis had lower P values with weighted models. Moreover, we found an association between diastolic blood pressure and the VMP1 gene (P = 8.18×10⁻⁶), when we used a weighted functional model. For this gene, the unweighted functional and weighted kernel-based models had P = 0.004 and 0.006, respectively. The new method has been implemented in the program package FREGAT, which is freely available at https://cran.r-project.org/web/packages/FREGAT/index.html.
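Weights defined by allele frequencies via the beta distribution, as above, are commonly obtained by evaluating a beta density at each variant's minor allele frequency. A minimal sketch using the widely used Beta(1, 25) convention, which up-weights rare variants (these defaults are the common SKAT-style choice, not necessarily the values used in FREGAT):

```python
from scipy import stats

def beta_weights(mafs, a=1.0, b=25.0):
    """Variant weights w_j = Beta(maf_j; a, b): with the default
    Beta(1, 25), rare variants receive much larger weights."""
    return [stats.beta.pdf(m, a, b) for m in mafs]

w = beta_weights([0.001, 0.01, 0.05, 0.20])
print([round(x, 2) for x in w])  # weights fall sharply as MAF grows
```

These weights then multiply the corresponding variant terms (genotype scores or basis-expanded genotype functions) before the regression or kernel test is fitted.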

  19. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…
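The core of an odds ratio approach to differential distractor functioning is a 2×2 comparison, among incorrect responders, of how often reference- and focal-group examinees select a given distractor (in practice this is done within ability strata; the single-table sketch and counts below are illustrative, not from the study):

```python
import math

def distractor_odds_ratio(ref_pick, ref_other, foc_pick, foc_other):
    """Odds ratio that focal-group examinees select a given distractor
    relative to the reference group, among incorrect responders, with
    a Wald 95% CI on the log scale (0.5 added to each cell as a
    continuity correction). OR far from 1 flags potential DDF."""
    a, b, c, d = (x + 0.5 for x in (foc_pick, foc_other, ref_pick, ref_other))
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
    return or_, (lo, hi)

# Hypothetical counts: distractor C chosen by 40/100 focal vs 20/100
# reference incorrect responders
or_, ci = distractor_odds_ratio(20, 80, 40, 60)
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A confidence interval excluding 1 would flag the distractor for follow-up under the chosen (nested logit or nominal response) model.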

  20. Modelling the Impact of Soil Management on Soil Functions

    NASA Astrophysics Data System (ADS)

    Vogel, H. J.; Weller, U.; Rabot, E.; Stößel, B.; Lang, B.; Wiesmeier, M.; Urbanski, L.; Wollschläger, U.

    2017-12-01

    Due to increasing soil loss and an increasing demand for food and energy, there is enormous pressure on soils as the central resource for agricultural production. Besides the importance of soils for biomass production, there are other essential soil functions: filtering and buffering of water, carbon sequestration, provision and recycling of nutrients, and habitat for biological activity. All these functions feed back directly to biogeochemical cycles and climate. To render agricultural production efficient and sustainable, we need to develop model tools capable of quantitatively predicting the impact of a multitude of management measures on these soil functions. These functions are considered emergent properties produced by soils as complex systems. The major challenge is to handle the multitude of physical, chemical and biological processes interacting in a non-linear manner. A large number of validated models for specific soil processes are available. However, it is not possible to simulate soil functions by coupling all the relevant processes at the detailed (i.e. molecular) level where they are well understood. A new systems perspective is required to evaluate the ensemble of soil functions and their sensitivity to external forcing. Another challenge is that soils are spatially heterogeneous systems by nature. Soil processes are highly dependent on local soil properties and, hence, any model to predict soil functions needs to account for site-specific conditions. For upscaling towards regional scales, the spatial distribution of functional soil types needs to be taken into account. We propose a new systemic model approach based on a thorough analysis of the interactions between physical, chemical and biological processes, considering their site-specific characteristics. It is demonstrated for the example of soil compaction and the recovery of soil structure, water capacity and carbon stocks as a result of plant growth and biological

  1. Bayesian spatiotemporal model of fMRI data using transfer functions.

    PubMed

    Quirós, Alicia; Diez, Raquel Montes; Wilson, Simon P

    2010-09-01

    This research describes a new Bayesian spatiotemporal model to analyse BOLD fMRI studies. In the temporal dimension, we describe the shape of the hemodynamic response function (HRF) with a transfer function model. The spatial continuity and local homogeneity of the evoked responses are modelled by a Gaussian Markov random field prior on the parameter indicating activations. The proposal constitutes an extension of the spatiotemporal model presented in a previous approach [Quirós, A., Montes Diez, R. and Gamerman, D., 2010. Bayesian spatiotemporal model of fMRI data, Neuroimage, 49: 442-456], offering more flexibility in the estimation of the HRF and computational advantages in the resulting MCMC algorithm. Simulations from the model are performed in order to ascertain the performance of the sampling scheme and the ability of the posterior to estimate model parameters, as well as to check the model sensitivity to signal to noise ratio. Results are shown on synthetic data and on a real data set from a block-design fMRI experiment. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  2. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  3. Relationship of amotivation to neurocognition, self-efficacy and functioning in first-episode psychosis: a structural equation modeling approach.

    PubMed

    Chang, W C; Kwong, V W Y; Hui, C L M; Chan, S K W; Lee, E H M; Chen, E Y H

    2017-03-01

    Better understanding of the complex interplay among key determinants of functional outcome is crucial to promoting recovery in psychotic disorders. However, this is understudied in the early course of illness. We aimed to examine the relationships among negative symptoms, neurocognition, general self-efficacy and global functioning in first-episode psychosis (FEP) patients using structural equation modeling (SEM). Three hundred and twenty-one Chinese patients aged 26-55 years presenting with FEP to an early intervention program in Hong Kong were recruited. Assessments encompassing symptom profiles, functioning, perceived general self-efficacy and a battery of neurocognitive tests were conducted. Negative symptom measurement was subdivided into amotivation and diminished expression (DE) domain scores based on the ratings in the Scale for the Assessment of Negative Symptoms. An initial SEM model showed no significant association between functioning and DE which was removed from further analysis. A final trimmed model yielded very good model fit (χ2 = 15.48, p = 0.63; comparative fit index = 1.00; root mean square error of approximation <0.001) and demonstrated that amotivation, neurocognition and general self-efficacy had a direct effect on global functioning. Amotivation was also found to mediate a significant indirect effect of neurocognition and general self-efficacy on functioning. Neurocognition was not significantly related to general self-efficacy. Our results indicate a critical intermediary role of amotivation in linking neurocognitive impairment to functioning in FEP. General self-efficacy may represent a promising treatment target for improvement of motivational deficits and functional outcome in the early illness stage.

  4. Charge transport calculations by a wave-packet dynamical approach using maximally localized Wannier functions based on density functional theory: Application to high-mobility organic semiconductors

    NASA Astrophysics Data System (ADS)

    Ishii, Hiroyuki; Kobayashi, Nobuhiko; Hirose, Kenji

    2017-01-01

    We present a wave-packet dynamical approach to charge transport using maximally localized Wannier functions based on density functional theory including van der Waals interactions. We apply it to the transport properties of pentacene and rubrene single crystals and show the temperature-dependent natures from bandlike to thermally activated behaviors as a function of the magnitude of external static disorder. We compare the results with those obtained by the conventional band and hopping models and experiments.

  5. Two Approaches to Estimating the Effect of Parenting on the Development of Executive Function in Early Childhood

    PubMed Central

    Blair, Clancy; Raver, C. Cybele; Berry, Daniel J.

    2015-01-01

    In the current article, we contrast 2 analytical approaches to estimate the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory control, working memory, and attention shifting. Residualized change analysis indicated that higher-quality parenting, as indicated by higher scores on widely used measures of parenting at both earlier and later time points, predicted more positive gain in executive function at 60 months. Latent change score models in which parenting and executive function over time were held to standards of longitudinal measurement invariance provided additional evidence of the association between change in parenting quality and change in executive function. In these models, cross-lagged paths indicated that in addition to parenting predicting change in executive function, executive function bidirectionally predicted change in parenting quality. Results were robust with the addition of covariates, including child sex, race, maternal education, and household income-to-need. Strengths and drawbacks of the 2 analytic approaches are discussed, and the findings are considered in light of emerging methodological innovations for testing the extent to which executive function is malleable and open to the influence of experience. PMID:23834294
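Residualized change analysis, the first of the two approaches above, regresses time-2 scores on time-1 scores and then asks whether a predictor explains the residual gain. A minimal sketch with synthetic data (the variable names and effect sizes are hypothetical, not the study's estimates):

```python
import numpy as np

def residualized_change(y_t1, y_t2, predictor):
    """Regress time-2 scores on time-1 scores, then regress the
    residual change on the predictor; return the predictor's slope."""
    # Stage 1: residualize y_t2 on y_t1
    X1 = np.column_stack([np.ones_like(y_t1), y_t1])
    resid = y_t2 - X1 @ np.linalg.lstsq(X1, y_t2, rcond=None)[0]
    # Stage 2: regress residual change on the predictor
    X2 = np.column_stack([np.ones_like(predictor), predictor])
    return np.linalg.lstsq(X2, resid, rcond=None)[0][1]

# Hypothetical: EF at 36 and 60 months; parenting quality adds true gain
rng = np.random.default_rng(2)
ef36 = rng.normal(0, 1, 500)
parenting = rng.normal(0, 1, 500)
ef60 = 0.6 * ef36 + 0.3 * parenting + rng.normal(0, 0.5, 500)

print(round(residualized_change(ef36, ef60, parenting), 2))
```

A latent change score model generalizes this by modeling the change itself as a latent variable under measurement invariance, which also permits the cross-lagged (bidirectional) paths described above.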

  6. MODELING OF METAL BINDING ON HUMIC SUBSTANCES USING THE NIST DATABASE: AN A PRIORI FUNCTIONAL GROUP APPROACH

    EPA Science Inventory

    Various modeling approaches have been developed for metal binding on humic substances. However, most of these models are still curve-fitting exercises-- the resulting set of parameters such as affinity constants (or the distribution of them) is found to depend on pH, ionic stren...

  7. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    PubMed Central

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  8. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    PubMed

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Many of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  9. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    PubMed

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  10. In Silico Modeling of Indigo and Tyrian Purple Single-Electron Nano-Transistors Using Density Functional Theory Approach

    NASA Astrophysics Data System (ADS)

    Shityakov, Sergey; Roewer, Norbert; Förster, Carola; Broscheit, Jens-Albert

    2017-07-01

    The purpose of this study was to develop and implement an in silico model of indigoid-based single-electron transistor (SET) nanodevices, which consist of indigoid molecules from natural dye weakly coupled to gold electrodes that function in a Coulomb blockade regime. The electronic properties of the indigoid molecules were investigated using the optimized density-functional theory (DFT) with a continuum model. Higher electron transport characteristics were determined for Tyrian purple, consistent with experimentally derived data. Overall, these results can be used to correctly predict and emphasize the electron transport functions of organic SETs, demonstrating their potential for sustainable nanoelectronics comprising the biodegradable and biocompatible materials.

  11. Determination of excitation profile and dielectric function spatial nonuniformity in porous silicon by using WKB approach.

    PubMed

    He, Wei; Yurkevich, Igor V; Canham, Leigh T; Loni, Armando; Kaplan, Andrey

    2014-11-03

    We develop an analytical model based on the WKB approach to evaluate the experimental results of the femtosecond pump-probe measurements of the transmittance and reflectance obtained on thin membranes of porous silicon. The model allows us to retrieve a pump-induced nonuniform complex dielectric function change along the membrane depth. We show that the model fitting to the experimental data requires a minimal number of fitting parameters while still complying with the restriction imposed by the Kramers-Kronig relation. The developed model has a broad range of applications for experimental data analysis and practical implementation in the design of devices involving a spatially nonuniform dielectric function, such as in biosensing, wave-guiding, solar energy harvesting, photonics and electro-optical devices.

  12. A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.

    PubMed

    Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C

    2017-07-01

    Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.

  13. A class of stochastic optimization problems with one quadratic & several linear objective functions and extended portfolio selection model

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Li, Jun

    2002-09-01

    In this paper a class of stochastic multiple-objective programming problems with one quadratic, several linear objective functions and linear constraints is introduced. This model is transformed into a deterministic multiple-objective nonlinear programming model by introducing the expectations of the random variables. The reference direction approach is used to deal with the linear objectives and results in a linear parametric optimization formula with a single linear objective function. This objective function is combined with the quadratic function using weighted sums. The quadratic problem is transformed into a linear (parametric) complementary problem, which is the basis of the proposed approach. The sufficient and necessary conditions for (properly, weakly) efficient solutions and some construction characteristics of (weakly) efficient solution sets are obtained. An interactive algorithm is proposed based on reference direction and weighted sums. Varying the parameter vector on the right-hand side of the model, the decision maker (DM) can freely search the efficient frontier with the model. An extended portfolio selection model is formed when liquidity is considered as another objective to be optimized besides expectation and risk. The interactive approach is illustrated with a practical example.
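    The weighted-sums step described above, combining the quadratic objective with a linear objective into a single scalar objective, can be illustrated on a toy one-dimensional problem. The weights and objective functions below are hypothetical, not from the paper.

```python
# Toy illustration of the weighted-sums step: one quadratic and one linear
# objective are combined into a single scalar objective and minimized over
# a grid (weights and objective functions are hypothetical).
import numpy as np

w_quad, w_lin = 0.7, 0.3

def f_quad(x):
    return (x - 1.0) ** 2      # quadratic objective (e.g. risk)

def f_lin(x):
    return 2.0 * x             # linear objective (e.g. cost)

def scalarized(x):
    return w_quad * f_quad(x) + w_lin * f_lin(x)

xs = np.linspace(-2.0, 2.0, 4001)
x_star = xs[np.argmin(scalarized(xs))]
# Analytic optimum: 2*w_quad*(x - 1) + 2*w_lin = 0  =>  x = 1 - w_lin/w_quad
print(f"grid minimizer: {x_star:.3f}")
```

    Sweeping the weights traces out the efficient frontier of the toy problem, which is the role the weighted sum plays in the interactive algorithm above.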

  14. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    PubMed

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Function-specific and Enhanced Brain Structural Connectivity Mapping via Joint Modeling of Diffusion and Functional MRI.

    PubMed

    Chu, Shu-Hsien; Parhi, Keshab K; Lenglet, Christophe

    2018-03-16

    A joint structural-functional brain network model is presented, which enables the discovery of function-specific brain circuits, and recovers structural connections that are under-estimated by diffusion MRI (dMRI). Incorporating information from functional MRI (fMRI) into diffusion MRI to estimate brain circuits is a challenging task. Usually, seed regions for tractography are selected from fMRI activation maps to extract the white matter pathways of interest. The proposed method jointly analyzes whole brain dMRI and fMRI data, allowing the estimation of complete function-specific structural networks instead of interactively investigating the connectivity of individual cortical/sub-cortical areas. Additionally, tractography techniques are prone to limitations, which can result in erroneous pathways. The proposed framework explicitly models the interactions between structural and functional connectivity measures thereby improving anatomical circuit estimation. Results on Human Connectome Project (HCP) data demonstrate the benefits of the approach by successfully identifying function-specific anatomical circuits, such as the language and resting-state networks. In contrast to correlation-based or independent component analysis (ICA) functional connectivity mapping, detailed anatomical connectivity patterns are revealed for each functional module. Results on a phantom (Fibercup) also indicate improvements in structural connectivity mapping by rejecting false-positive connections with insufficient support from fMRI, and enhancing under-estimated connectivity with strong functional correlation.

  16. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

    Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach is proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model, composed of overlapping sliding windows of fMRI signals and incorporating the fact that overlapping windows are not independent, is described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
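    The sliding-window correlation baseline against which the hierarchical model is compared can be sketched as follows, using synthetic signals whose connectivity is switched on and off by an external modulation. The window length, noise levels, and signal construction are illustrative, not taken from the paper.

```python
# Sketch of the sliding-window correlation baseline (synthetic signals;
# window length and noise levels are illustrative).
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(300)                                   # e.g. 300 fMRI volumes
modulation = (np.sin(2 * np.pi * t / 150) > 0).astype(float)
shared = rng.normal(size=t.size)                     # common driving signal
x = shared * modulation + 0.5 * rng.normal(size=t.size)
y = shared * modulation + 0.5 * rng.normal(size=t.size)

def sliding_window_corr(a, b, width=30):
    """Pearson correlation of a and b in overlapping windows of `width`."""
    return np.array([
        np.corrcoef(a[i:i + width], b[i:i + width])[0, 1]
        for i in range(a.size - width + 1)
    ])

dyn = sliding_window_corr(x, y)
# Connectivity should be high while the modulation is "on" (roughly the
# first 75 samples) and near zero while it is "off".
print(dyn.shape)
```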

  17. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    PubMed

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  18. A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Alameddine, I.; Anderson, R. M.

    2009-12-01

    Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predicting flow from ungauged basins. In particular, these approaches allow for predicting flows under uncertain and potentially variable future conditions due to rapid land cover changes, variable climate conditions, and other factors. Despite the broad range of literature on estimating rainfall-runoff model parameters, however, the absence of a robust set of modeling tools for identifying and quantifying uncertainties in (and correlation between) rainfall-runoff model parameters represents a significant gap in current hydrological modeling research. Here, we build upon a series of recent publications promoting novel Bayesian and probabilistic modeling strategies for quantifying rainfall-runoff model parameter estimation uncertainty. Our approach applies alternative measures of rainfall-runoff model parameter joint likelihood (including Nash-Sutcliffe efficiency, among others) to simulate samples from the joint parameter posterior probability density function. We then use these correlated samples as response variables in a Bayesian hierarchical model with land use coverage data as predictor variables in order to develop a robust land use-based tool for forecasting flow in ungauged basins while accounting for, and explicitly acknowledging, parameter estimation uncertainty. We apply this modeling strategy to low-relief coastal watersheds of Eastern North Carolina, an area representative of coastal resource waters throughout the world because of its sensitive embayments and because of the abundant (but currently threatened) natural resources it hosts. 
Consequently, this area is the subject of several ongoing studies and large-scale planning initiatives, including those conducted through the United
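    Nash-Sutcliffe efficiency, one of the parameter-likelihood measures mentioned above, has a standard closed form. A minimal implementation, with illustrative data, is:

```python
# Nash-Sutcliffe efficiency (NSE): standard definition, illustrative data.
import numpy as np

def nse(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 is no better than predicting the mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(nse(obs, obs))                       # perfect model -> 1.0
print(nse(obs, np.full(5, obs.mean())))    # mean-only model -> 0.0
```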

  19. Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach

    PubMed Central

    Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.

    2013-01-01

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756

  20. Gene function hypotheses for the Campylobacter jejuni glycome generated by a logic-based approach.

    PubMed

    Sternberg, Michael J E; Tamaddoni-Nezhad, Alireza; Lesk, Victor I; Kay, Emily; Hitchen, Paul G; Cootes, Adrian; van Alphen, Lieke B; Lamoureux, Marc P; Jarrell, Harold C; Rawlings, Christopher J; Soo, Evelyn C; Szymanski, Christine M; Dell, Anne; Wren, Brendan W; Muggleton, Stephen H

    2013-01-09

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning: the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Optimizing Experimental Design for Comparing Models of Brain Function

    PubMed Central

    Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas

    2011-01-01

    This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485

  2. Arbitrariness is not enough: towards a functional approach to the genetic code.

    PubMed

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently, it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than that of the nucleobase chains. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only one possible and we propose a more appropriate model from a semiotic point of view.

  3. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models previously not

  4. Goal-Function Tree Modeling for Systems Engineering and Fault Management

    NASA Technical Reports Server (NTRS)

    Johnson, Stephen B.; Breckenridge, Jonathan T.

    2013-01-01

    The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of relationships between SE, SHM, and FM provides hints to a modeling approach to

  5. A new polytopic approach for the unknown input functional observer design

    NASA Astrophysics Data System (ADS)

    Bezzaoucha, Souad; Voos, Holger; Darouach, Mohamed

    2018-03-01

    In this paper, a constructive procedure to design Functional Unknown Input Observers for nonlinear continuous time systems is proposed under the Polytopic Takagi-Sugeno framework. An equivalent representation for the nonlinear model is achieved using the sector nonlinearity transformation. Applying the Lyapunov theory and the H∞ attenuation, linear matrix inequality (LMI) conditions are deduced which are solved for feasibility to obtain the observer design matrices. To cope with the effect of unknown inputs, the classical approach of decoupling the unknown input for the linear case is used. Both algebraic and solver-based solutions are proposed (relaxed conditions). Necessary and sufficient conditions for the existence of the functional polytopic observer are given. For both approaches, the general and particular cases (measurable premise variables, full state estimation with full and reduced order cases) are considered and it is shown that the proposed conditions correspond to the ones presented for the standard linear case. To illustrate the proposed theoretical results, detailed numerical simulations are presented for a Quadrotor Aerial Robot Landing and a Waste Water Treatment Plant. Both systems are highly nonlinear and represented in a T-S polytopic form with unmeasurable premise variables and unknown inputs.

  6. Examining the Relations between Executive Function, Math, and Literacy during the Transition to Kindergarten: A Multi-Analytic Approach

    ERIC Educational Resources Information Center

    Schmitt, Sara A.; Geldhof, G. John; Purpura, David J.; Duncan, Robert; McClelland, Megan M.

    2017-01-01

    The present study explored the bidirectional and longitudinal associations between executive function (EF) and early academic skills (math and literacy) across 4 waves of measurement during the transition from preschool to kindergarten using 2 complementary analytical approaches: cross-lagged panel modeling and latent growth curve modeling (LCGM).…

  7. A hidden Markov model approach to neuron firing patterns.

    PubMed Central

    Camproux, A C; Saunier, F; Chouvet, G; Thalabard, J C; Thomas, G

    1996-01-01

    Analysis and characterization of neuronal discharge patterns are of interest to neurophysiologists and neuropharmacologists. In this paper we present a hidden Markov model approach to modeling single neuron electrical activity. Basically the model assumes that each interspike interval corresponds to one of several possible states of the neuron. Fitting the model to experimental series of interspike intervals by maximum likelihood allows estimation of the number of possible underlying neuron states, the probability density functions of interspike intervals corresponding to each state, and the transition probabilities between states. We present an application to the analysis of recordings of a locus coeruleus neuron under three pharmacological conditions. The model distinguishes two states during halothane anesthesia and during recovery from halothane anesthesia, and four states after administration of clonidine. The transition probabilities yield additional insights into the mechanisms of neuron firing. PMID:8913581
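
    A minimal sketch of the forward algorithm underlying such a model: a two-state HMM whose states emit exponentially distributed interspike intervals. The transition matrix and firing rates below are illustrative, not the paper's fitted values; a full treatment would wrap this likelihood in an EM or direct maximum-likelihood loop.

```python
import numpy as np

def hmm_loglik(intervals, trans, rates, init):
    """Log-likelihood of interspike intervals under an HMM whose states emit
    exponential intervals (forward algorithm with per-step scaling)."""
    # Emission densities: f_k(x) = rate_k * exp(-rate_k * x)
    emis = rates[None, :] * np.exp(-np.outer(intervals, rates))
    alpha = init * emis[0]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(intervals)):
        alpha = (alpha @ trans) * emis[t]
        c = alpha.sum()
        loglik += np.log(c)
        alpha /= c
    return loglik

# Simulate a neuron alternating between a fast-bursting and a slow tonic state
rng = np.random.default_rng(0)
trans = np.array([[0.95, 0.05], [0.10, 0.90]])
rates = np.array([5.0, 0.5])             # illustrative firing rates (1/s)
init = np.array([0.5, 0.5])
states = [0]
for _ in range(499):
    states.append(rng.choice(2, p=trans[states[-1]]))
intervals = rng.exponential(1.0 / rates[np.array(states)])

ll_true = hmm_loglik(intervals, trans, rates, init)
ll_poor = hmm_loglik(intervals, trans, np.array([1.0, 1.0]), init)
```

As expected, the likelihood discriminates between the generating two-state model and a degenerate single-rate alternative.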

  8. A hidden Markov model approach to neuron firing patterns.

    PubMed

    Camproux, A C; Saunier, F; Chouvet, G; Thalabard, J C; Thomas, G

    1996-11-01

    Analysis and characterization of neuronal discharge patterns are of interest to neurophysiologists and neuropharmacologists. In this paper we present a hidden Markov model approach to modeling single neuron electrical activity. Basically the model assumes that each interspike interval corresponds to one of several possible states of the neuron. Fitting the model to experimental series of interspike intervals by maximum likelihood allows estimation of the number of possible underlying neuron states, the probability density functions of interspike intervals corresponding to each state, and the transition probabilities between states. We present an application to the analysis of recordings of a locus coeruleus neuron under three pharmacological conditions. The model distinguishes two states during halothane anesthesia and during recovery from halothane anesthesia, and four states after administration of clonidine. The transition probabilities yield additional insights into the mechanisms of neuron firing.

  9. Comparing soil functions for a wide range of agriculture soils focusing on production for bioenergy using a combined isotope-based observation and modelling approach

    NASA Astrophysics Data System (ADS)

    Leistert, Hannes; Herbstritt, Barbara; Weiler, Markus

    2017-04-01

    Increasing crop production for bioenergy will result in changes in land use and in the resulting soil functions, and may generate new opportunities and risks. However, detailed data and information on how soil functions may be altered under changing crop production for bioenergy are still missing, in particular for a wide range of agricultural soils, since most data are currently derived from individual experimental sites studying different bioenergy crops at one location. We developed a new, rapid measurement approach to investigate the influence of bioenergy plants on the water cycle and different soil functions (filtering and buffering of water and N-cycling). For this approach, we drilled 89 soil cores (1-3 m deep) in spring and fall at 11 sites with different soil properties and climatic conditions, comparing different crops (grass, corn, willow, poplar, and other less common bioenergy crops) and analyzing 1150 soil samples for water content, nitrate concentration and stable water isotopes. We benchmarked a soil hydrological model (1-D numerical Richards equation, ADE, water isotope fractionation including liquid and vapour composition of isotopes) using longer-term climate variables and water isotopes in precipitation to derive crop-specific parameterizations and to validate the differences in water transport and in the partitioning of water into evaporation, transpiration and groundwater recharge among the sites and crops, using the water isotopes in particular. The model simulations were in good agreement with the observed isotope profiles and allowed us to differentiate among the different crops. We defined different indicators for the soil functions considered in this study. These indicators included the proportion of groundwater recharge, the transit time of water (different percentiles) through the upper 2 m, and the nutrient (e.g. nitrate) leaching potential from the rooting zone during the dormant season. The parameterized model was first used to calculate the indicators for the

  10. Multi-subject hierarchical inverse covariance modelling improves estimation of functional brain networks.

    PubMed

    Colclough, Giles L; Woolrich, Mark W; Harrison, Samuel J; Rojas López, Pedro A; Valdes-Sosa, Pedro A; Smith, Stephen M

    2018-05-07

    A Bayesian model for sparse, hierarchical, inverse-covariance estimation is presented, and applied to multi-subject functional connectivity estimation in the human brain. It enables simultaneous inference of the strength of connectivity between brain regions at both subject and population level, and is applicable to fMRI, MEG and EEG data. Two versions of the model can encourage sparse connectivity, either using continuous priors to suppress irrelevant connections, or using an explicit description of the network structure to estimate the connection probability between each pair of regions. A large evaluation of this model, and thirteen methods that represent the state of the art of inverse covariance modelling, is conducted using both simulated and resting-state functional imaging datasets. Our novel Bayesian approach has similar performance to the best extant alternative, Ng et al.'s Sparse Group Gaussian Graphical Model algorithm, which is also based on a hierarchical structure. Using data from the Human Connectome Project, we show that these hierarchical models are able to reduce the measurement error in MEG beta-band functional networks by 10%, producing concomitant increases in estimates of the genetic influence on functional connectivity. Copyright © 2018. Published by Elsevier Inc.
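
    The Bayesian hierarchical model itself is not reproduced here; as a point of reference, a standard (non-Bayesian) sparse inverse-covariance estimate of a small "functional network" can be sketched with the graphical lasso, representative of the class of methods the paper benchmarks against. The four-node chain network below is synthetic.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(42)
# Ground-truth precision matrix for 4 "regions": a sparse chain network,
# so only adjacent regions are conditionally dependent.
prec = np.array([[2.0, 0.6, 0.0, 0.0],
                 [0.6, 2.0, 0.6, 0.0],
                 [0.0, 0.6, 2.0, 0.6],
                 [0.0, 0.0, 0.6, 2.0]])
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(4), cov, size=2000)

# L1-penalized maximum-likelihood estimate of the precision matrix
model = GraphicalLasso(alpha=0.05).fit(X)
est_prec = model.precision_
```

The estimated precision matrix recovers the chain: entries for adjacent regions are clearly larger than those for non-adjacent pairs, which the lasso shrinks toward zero.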

  11. Connection between two statistical approaches for the modelling of particle velocity and concentration distributions in turbulent flow: The mesoscopic Eulerian formalism and the two-point probability density function method

    NASA Astrophysics Data System (ADS)

    Simonin, Olivier; Zaichik, Leonid I.; Alipchenkov, Vladimir M.; Février, Pierre

    2006-12-01

    The objective of the paper is to elucidate a connection between two approaches that have been separately proposed for modelling the statistical spatial properties of inertial particles in turbulent fluid flows. One of the approaches proposed recently by Février, Simonin, and Squires [J. Fluid Mech. 533, 1 (2005)] is based on the partitioning of particle turbulent velocity field into spatially correlated (mesoscopic Eulerian) and random-uncorrelated (quasi-Brownian) components. The other approach stems from a kinetic equation for the two-point probability density function of the velocity distributions of two particles [Zaichik and Alipchenkov, Phys. Fluids 15, 1776 (2003)]. Comparisons between these approaches are performed for isotropic homogeneous turbulence and demonstrate encouraging agreement.

  12. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  13. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    DOE PAGES

    Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...

    2015-02-05

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y = η(θ) + ε, where ε accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(·), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(·). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
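
    A minimal sketch of the emulator-based calibration idea, with a cheap stand-in for the expensive model η(θ): a Gaussian-process emulator is trained on a small ensemble of model runs, and Metropolis MCMC is then run against the emulator rather than the model itself. All functions and numbers are illustrative, not the paper's density functional theory application.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in for an expensive physics model eta(theta); in practice each run
# might take hours, so eta is only evaluated on a small design ensemble.
def eta(theta):
    return theta ** 3 + theta

rng = np.random.default_rng(1)
theta_true, sigma = 0.6, 0.05
y_obs = eta(theta_true) + rng.normal(0.0, sigma)   # one noisy "measurement"

# Ensemble of model runs -> emulator (the statistical response surface)
design = np.linspace(0.0, 1.0, 15)[:, None]
emulator = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-8)
emulator.fit(design, eta(design.ravel()))

def log_post(theta):
    if not 0.0 <= theta <= 1.0:                    # uniform prior on [0, 1]
        return -np.inf
    pred = emulator.predict(np.array([[theta]]))[0]
    return -0.5 * ((y_obs - pred) / sigma) ** 2    # Gaussian likelihood

# Metropolis sampling against the cheap emulator instead of eta itself
samples, cur = [], 0.5
lp = log_post(cur)
for _ in range(2000):
    prop = cur + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        cur, lp = prop, lp_prop
    samples.append(cur)
post_mean = float(np.mean(samples[500:]))
```

The posterior mean recovers θ close to its true value while only 15 "expensive" model evaluations were ever made; the thousands of MCMC evaluations all hit the emulator.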

  14. Intercomparison Of Approaches For Modeling Second Order Ionospheric Corrections Using Gnss Measurements

    NASA Astrophysics Data System (ADS)

    Garcia Fernandez, M.; Butala, M.; Komjathy, A.; Desai, S. D.

    2012-12-01

    Correcting GNSS tracking data for second-order ionospheric effects has been shown to cause a southward shift in GNSS-based precise point positioning solutions by as much as 10 mm, depending on solar cycle conditions. The most commonly used approaches for modeling the higher-order ionospheric effect include (a) using global ionosphere maps (GIMs) to determine vertical total electron content (VTEC) and convert it to slant TEC (STEC) assuming a thin-shell ionosphere, and (b) using the dual-frequency measurements themselves to determine STEC. The latter approach benefits from not requiring ionospheric mapping functions between VTEC and STEC. However, it requires calibration with receiver and transmitter Differential Code Biases (DCBs). We present results from comparisons of the two approaches. For the first approach, we also compare the use of VTEC observations from IONEX maps with climatological model-derived VTEC as provided by the International Reference Ionosphere (IRI2012). We consider various metrics to evaluate the relative performance of the different approaches, including station repeatability, GNSS-based reference frame recovery, and post-fit measurement residuals. Overall, the GIM-based approaches tend to provide lower noise in the second-order ionosphere corrections and positioning solutions. The use of IONEX and IRI2012 models of VTEC provides similar results, especially during periods of low solar activity. The use of the IRI2012 model provides a convenient approach for operational scenarios by eliminating the dependence on routine updates of the GIMs, and it also serves as a useful source of VTEC when IONEX maps may not be readily available.
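
    For approach (a), the conversion from VTEC to STEC with the standard single-layer (thin-shell) mapping function can be sketched as follows; the 450 km shell height is a common convention, not a value taken from this abstract.

```python
import math

R_EARTH_KM = 6371.0
SHELL_HEIGHT_KM = 450.0   # common choice for the thin-shell ionosphere height

def vtec_to_stec(vtec_tecu, elevation_deg, shell_km=SHELL_HEIGHT_KM):
    """Map vertical TEC to slant TEC with the single-layer mapping function:
    STEC = VTEC / cos(z'), where z' is the zenith angle at the ionospheric
    pierce point, sin(z') = R/(R+h) * sin(z)."""
    z = math.radians(90.0 - elevation_deg)        # zenith angle at the receiver
    sin_zp = R_EARTH_KM / (R_EARTH_KM + shell_km) * math.sin(z)
    return vtec_tecu / math.sqrt(1.0 - sin_zp ** 2)

# At zenith the factor is 1; at low elevation the slant path inflates TEC
stec_zenith = vtec_to_stec(10.0, 90.0)
stec_low = vtec_to_stec(10.0, 10.0)
```

This mapping-function step is exactly what approach (b) avoids by forming STEC directly from the dual-frequency observables.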

  15. Understanding Individual-Level Change through the Basis Functions of a Latent Curve Model

    ERIC Educational Resources Information Center

    Blozis, Shelley A.; Harring, Jeffrey R.

    2017-01-01

    Latent curve models have become a popular approach to the analysis of longitudinal data. At the individual level, the model expresses an individual's response as a linear combination of what are called "basis functions" that are common to all members of a population and weights that may vary among individuals. This article uses…

  16. A parallel approach of COFFEE objective function to multiple sequence alignment

    NASA Astrophysics Data System (ADS)

    Zafalon, G. F. D.; Visotaky, J. M. V.; Amorim, A. R.; Valêncio, C. R.; Neves, L. A.; de Souza, R. C. G.; Machado, J. M.

    2015-09-01

    Computational tools to assist genomic analyses have become ever more necessary due to the rapidly increasing amount of available data. Given the high computational cost of deterministic algorithms for sequence alignment, many works concentrate their efforts on the development of heuristic approaches to multiple sequence alignment. However, selecting an approach that offers solutions with good biological significance and feasible execution time is a great challenge. This work presents the parallelization, using the multithread paradigm, of the processing steps of the MSA-GA tool in the execution of the COFFEE objective function. The standard objective function implemented in the tool is the Weighted Sum of Pairs (WSP), which produces some distortions in the final alignments when sets of sequences with low similarity are aligned. In previous studies we therefore implemented the COFFEE objective function in the tool to smooth these distortions. Although the nature of the COFFEE objective function implies an increase in execution time, the approach contains steps that can be executed in parallel. With the improvements implemented in this work, the new approach is 24% faster than the sequential approach with COFFEE. Moreover, the COFFEE multithreaded approach is more efficient than WSP because, besides being slightly faster, it yields better biological results.
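
    The parallelization strategy, scoring independent parts of the objective concurrently, can be sketched with a toy column-wise sum-of-pairs score. The actual COFFEE function scores consistency against a library of pairwise alignments and is not reproduced here; note also that in CPython a real speedup would require processes rather than threads because of the GIL.

```python
from concurrent.futures import ThreadPoolExecutor

def pair_score(a, b):
    """Toy pairwise score: +1 match, -1 mismatch, 0 against a gap."""
    if a == '-' or b == '-':
        return 0
    return 1 if a == b else -1

def column_score(column):
    # Sum over all sequence pairs within one alignment column
    return sum(pair_score(column[i], column[j])
               for i in range(len(column)) for j in range(i + 1, len(column)))

def objective(alignment, workers=4):
    """Evaluate the alignment objective by scoring columns concurrently.
    Column scores are independent, which is what makes the objective easy
    to split across workers."""
    columns = list(zip(*alignment))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(column_score, columns))

alignment = ["ACG-T",
             "ACGAT",
             "AC-AT"]
score = objective(alignment)
```

Each column contributes independently (3 + 3 + 1 + 1 + 3 here), so the work splits cleanly across workers with no shared state.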

  17. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    USGS Publications Warehouse

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  18. Stochastic theory of size exclusion chromatography by the characteristic function approach.

    PubMed

    Dondi, Francesco; Cavazzini, Alberto; Remelli, Maurizio; Felinger, Attila; Martin, Michel

    2002-01-18

    A general stochastic theory of size exclusion chromatography (SEC), able to account for size dependence of both pore ingress and egress processes, moving zone dispersion and pore size distribution, was developed. The relationship between stochastic-chromatographic and batch equilibrium conditions is discussed, and the fundamental role of the 'ergodic' hypothesis in establishing a link between them is emphasized. SEC models are solved by means of the characteristic function method, and chromatographic parameters such as plate height, peak skewness and excess are derived. The peak shapes are obtained by numerical inversion of the characteristic function under the most general conditions of the exploited models. Separate size effects on pore ingress and pore egress processes are investigated, and their effects on both retention selectivity and efficiency are clearly shown. The peak splitting phenomenon and the peak tailing due to incomplete sample sorption near the exclusion limit are discussed. An SEC model for columns with two types of pores is discussed, and several effects on retention selectivity and efficiency arising from pore size differences and their relative abundance are singled out. The relevance of moving zone dispersion to separation is investigated. The present approach proves to be general and able to account for more complex SEC conditions such as continuous pore size distributions and mixed retention mechanisms.
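
    The numerical inversion step can be sketched generically: given a characteristic function φ(ω), the peak shape follows from the Fourier inversion f(x) = (1/2π) ∫ e^{-iωx} φ(ω) dω. A Gaussian φ is used below purely as a check case with a known answer; the SEC-specific characteristic functions of the paper are not reproduced.

```python
import numpy as np

def invert_cf(phi, x_grid, omega_max=50.0, n=4096):
    """Numerically invert a characteristic function phi(omega) to a density
    via the inversion integral f(x) = (1/2pi) * int e^{-i w x} phi(w) dw,
    approximated by a Riemann sum over a truncated omega grid."""
    omega = np.linspace(-omega_max, omega_max, n)
    dw = omega[1] - omega[0]
    vals = phi(omega)
    return np.array([np.real(np.sum(np.exp(-1j * omega * x) * vals)) * dw
                     for x in x_grid]) / (2.0 * np.pi)

# Check case: Gaussian CF, whose inversion is the N(mu, sigma^2) density
mu, sigma = 2.0, 0.5
phi_gauss = lambda w: np.exp(1j * mu * w - 0.5 * (sigma * w) ** 2)
x = np.linspace(0.0, 4.0, 401)
pdf = invert_cf(phi_gauss, x)
peak_x = float(x[np.argmax(pdf)])
```

The recovered peak sits at x = μ with the correct Gaussian height, confirming the inversion machinery before it is applied to less tractable characteristic functions.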

  19. Charge redistribution in QM:QM ONIOM model systems: a constrained density functional theory approach

    NASA Astrophysics Data System (ADS)

    Beckett, Daniel; Krukau, Aliaksandr; Raghavachari, Krishnan

    2017-11-01

    The ONIOM hybrid method has found considerable success in QM:QM studies designed to approximate a high level of theory at a significantly reduced cost. This cost reduction is achieved by treating only a small model system with the target level of theory and the rest of the system with a low, inexpensive level of theory. However, the choice of an appropriate model system is a limiting factor in ONIOM calculations, and effects such as charge redistribution across the model system boundary must be considered as a source of error. In an effort to increase the general applicability of the ONIOM model, a method to treat the charge redistribution effect is developed using constrained density functional theory (CDFT) to constrain the charge experienced by the model system in the full calculation to the link atoms in the truncated model system calculations. Two separate CDFT-ONIOM schemes are developed and tested on a set of 20 reactions with eight combinations of levels of theory. It is shown that a scheme using a scaled Lagrange multiplier term obtained from the low-level CDFT model calculation outperforms ONIOM by 32% to 70% at each combination of levels of theory.
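
    The two-layer ONIOM extrapolation that the scheme builds on is simple to state: E(ONIOM) = E_high(model) + E_low(real) - E_low(model). A sketch with hypothetical single-point energies (the values are invented for illustration, not taken from the paper):

```python
def oniom2_energy(e_high_model, e_low_model, e_low_real):
    """Two-layer ONIOM extrapolation:
    E(ONIOM) = E_high(model) + E_low(real) - E_low(model).
    The subtraction cancels the low-level description of the model region,
    leaving the high-level treatment of the model plus the low-level
    treatment of the environment."""
    return e_high_model + e_low_real - e_low_model

# Hypothetical single-point energies in hartree (illustration only)
e_oniom = oniom2_energy(e_high_model=-154.820,
                        e_low_model=-154.100,
                        e_low_real=-310.450)
```

The CDFT correction discussed in the abstract targets the error this extrapolation makes when charge redistributes across the model/real boundary.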

  20. An exemplar-based approach to individualized parcellation reveals the need for sex specific functional networks

    PubMed Central

    Salehi, Mehraveh; Karbasi, Amin; Shen, Xilin; Scheinost, Dustin; Constable, R. Todd

    2018-01-01

    Recent work with functional connectivity data has led to significant progress in understanding the functional organization of the brain. While the majority of the literature has focused on group-level parcellation approaches, there is ample evidence that the brain varies in both structure and function across individuals. In this work, we introduce a parcellation technique that incorporates delineation of functional networks both at the individual- and group-level. The proposed technique deploys the notion of “submodularity” to jointly parcellate the cerebral cortex while establishing an inclusive correspondence between the individualized functional networks. Using this parcellation technique, we successfully established a cross-validated predictive model that predicts individuals’ sex, solely based on the parcellation schemes (i.e. the node-to-network assignment vectors). The sex prediction finding illustrates that individualized parcellation of functional networks can reveal subgroups in a population and suggests that the use of a global network parcellation may overlook fundamental differences in network organization. This is a particularly important point to consider in studies comparing patients versus controls or even patient subgroups. Network organization may differ between individuals and global configurations should not be assumed. This approach to the individualized study of functional organization in the brain has many implications for both neuroscience and clinical applications. PMID:28882628
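
    The submodular machinery mentioned above can be illustrated with the greedy algorithm for a facility-location objective, a standard exemplar-selection formulation; this shows only the core idea, not the paper's joint individual- and group-level parcellation.

```python
import numpy as np

def greedy_exemplars(X, k):
    """Greedy selection of k exemplars maximizing the submodular
    facility-location objective F(S) = sum_i max_{j in S} sim(i, j).
    For monotone submodular F, greedy attains the classic (1 - 1/e)
    approximation guarantee."""
    sim = -np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # neg. distance
    best_cover = np.full(len(X), -1e18)   # current coverage of each point
    chosen = []
    for _ in range(k):
        # Total coverage if each candidate j were added to the chosen set
        totals = np.maximum(best_cover[:, None], sim).sum(axis=0)
        totals[chosen] = -np.inf          # never re-pick an exemplar
        j = int(np.argmax(totals))
        chosen.append(j)
        best_cover = np.maximum(best_cover, sim[:, j])
    return chosen

# Two well-separated clusters: greedy should pick one exemplar in each
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
picks = greedy_exemplars(X, 2)
```

Because the marginal gain of a second exemplar in an already-covered cluster is small, the greedy step naturally spreads exemplars across distinct groups, the same diminishing-returns property the parcellation method exploits.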

  1. The Relationship Between Approach to Activity Engagement, Specific Aspects of Physical Function, and Pain Duration in Chronic Pain.

    PubMed

    Andrews, Nicole E; Strong, Jenny; Meredith, Pamela J

    2016-01-01

    To examine: (1) the relationships between habitual approach to activity engagement and specific aspects of physical functioning in chronic pain; and (2) whether or not these relationships differ according to pain duration. Outpatients (N=169) with generalized chronic pain completed a set of written questionnaires. Categories of "approach to activity engagement" were created using the confronting and avoidance subscales of the Pain and Activity Relations Questionnaire. An interaction term between "approach to activity engagement" categories and pain duration was entered into analysis with age, sex, pain intensity, the categorical "approach to activity engagement" variable, and pain duration, in 9 ordinal regression models investigating functioning in a variety of daily activities. The "approach to activity engagement" category predicted the personal care, lifting, sleeping, social life, and traveling aspects of physical functioning but, interestingly, not the performance skills used during these activities, that is, walking, sitting, and standing. The interaction term was significant in 2 models; however, the effect of pain duration on associations was the inverse of that theorized, with the relationship between variables becoming less pronounced with increasing duration of pain. The results of this study do not support the commonly held notion that avoidance and/or overactivity behavior leads to deconditioning and reduced physical capacity over time. Findings do, however, suggest that a relationship exists between avoidance and/or overactivity behavior and reduced participation in activities. Implications for the clinical management of chronic pain and directions for further research are discussed.

  2. Structured penalties for functional linear models-partially empirical eigenvectors for regression.

    PubMed

    Randolph, Timothy W; Harezlak, Jaroslaw; Feng, Ziding

    2012-01-01

    One of the challenges with functional data is incorporating geometric structure, or local correlation, into the analysis. This structure is inherent in the output from an increasing number of biomedical technologies, and a functional linear model is often used to estimate the relationship between the predictor functions and scalar responses. Common approaches to the problem of estimating a coefficient function typically involve two stages: regularization and estimation. Regularization is usually done via dimension reduction, projecting onto a predefined span of basis functions or a reduced set of eigenvectors (principal components). In contrast, we present a unified approach that directly incorporates geometric structure into the estimation process by exploiting the joint eigenproperties of the predictors and a linear penalty operator. In this sense, the components in the regression are 'partially empirical' and the framework is provided by the generalized singular value decomposition (GSVD). The form of the penalized estimation is not new, but the GSVD clarifies the process and informs the choice of penalty by making explicit the joint influence of the penalty and predictors on the bias, variance and performance of the estimated coefficient function. Laboratory spectroscopy data and simulations are used to illustrate the concepts.
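
    The penalized estimation referred to here has the familiar closed form b = (X'X + λL'L)⁻¹X'y for a penalty operator L. A minimal sketch with a second-difference (smoothness) penalty on a synthetic smooth coefficient function; the data and λ are illustrative, and the GSVD analysis of the paper is not reproduced.

```python
import numpy as np

def penalized_coef(X, y, L, lam):
    """Penalized least squares with a structured penalty operator L:
    minimize ||y - X b||^2 + lam * ||L b||^2, whose closed-form solution is
    b = (X'X + lam L'L)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * (L.T @ L), X.T @ y)

rng = np.random.default_rng(3)
p, n = 50, 200
t = np.linspace(0.0, 1.0, p)
beta_true = np.sin(2.0 * np.pi * t)       # smooth coefficient function
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(0.0, 0.5, n)

# Second-difference penalty operator: penalizes discrete curvature of b
L = np.diff(np.eye(p), n=2, axis=0)
beta_smooth = penalized_coef(X, y, L, lam=50.0)
beta_ols = penalized_coef(X, y, L, lam=0.0)   # lam = 0 recovers OLS
```

Because the true coefficient function is smooth, the curvature penalty trades a little bias for a large variance reduction relative to unpenalized least squares.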

  3. A novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China

    PubMed Central

    Yang, Yanzheng; Zhu, Qiuan; Peng, Changhui; Wang, Han; Xue, Wei; Lin, Guanghui; Wen, Zhongming; Chang, Jie; Wang, Meng; Liu, Guobin; Li, Shiqing

    2016-01-01

    Increasing evidence indicates that current dynamic global vegetation models (DGVMs) have suffered from insufficient realism and are difficult to improve, particularly because they are built on plant functional type (PFT) schemes. Therefore, new approaches, such as plant trait-based methods, are urgently needed to replace PFT schemes when predicting the distribution of vegetation and investigating vegetation sensitivity. As an important direction towards constructing next-generation DGVMs based on plant functional traits, we propose a novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China. The results demonstrated that a Gaussian mixture model (GMM) trained with an LMA-Nmass-LAI data combination yielded an accuracy of 72.82% in simulating vegetation distribution, providing more detailed parameter information regarding community structures and ecosystem functions. The new approach also performed well in analyses of vegetation sensitivity to different climatic scenarios. Although the trait-climate relationship is not the only candidate useful for predicting vegetation distributions and analysing climatic sensitivity, it sheds new light on the development of next-generation trait-based DGVMs. PMID:27052108
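
    The GMM classification step can be sketched with synthetic trait data; the trait names follow the abstract (LMA, Nmass, LAI), but the numbers and the two hypothetical vegetation types are invented for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Synthetic (LMA, Nmass, LAI) triples for two hypothetical vegetation types;
# the means and spreads are illustrative, not the paper's calibration data.
forest = rng.normal([120.0, 1.8, 5.5], [10.0, 0.2, 0.5], (200, 3))
grass = rng.normal([60.0, 2.8, 2.0], [8.0, 0.3, 0.4], (200, 3))
X = np.vstack([forest, grass])

# Fit a two-component Gaussian mixture in trait space and assign each
# sample to the most probable component
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)

# Component labels are arbitrary (0/1 may swap), so measure cluster purity
purity = max(labels[:200].mean(), 1.0 - labels[:200].mean())
```

With well-separated trait distributions the mixture recovers the two vegetation types almost perfectly; real trait-climate data would of course overlap far more.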

  4. Functionally relevant climate variables for arid lands: A climatic water deficit approach for modelling desert shrub distributions

    Treesearch

    Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers

    2015-01-01

    We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...

  5. Modelling the Constraints of Spatial Environment in Fauna Movement Simulations: Comparison of a Boundaries Accurate Function and a Cost Function

    NASA Astrophysics Data System (ADS)

    Jolivet, L.; Cohen, M.; Ruas, A.

    2015-08-01

    Landscape influences fauna movement at different levels, from habitat selection to the choice of movement direction. Our goal is to provide a development frame in which to test simulation functions for animal movement. We describe our approach for such simulations and we compare two types of functions for calculating trajectories. To do so, we first modelled the role of landscape elements, differentiating between elements that facilitate movements and those that hinder them. Different influences are identified depending on the landscape elements and on the animal species. Knowledge was gathered from ecologists, the literature and observation datasets. Second, we analysed descriptions of animal movement recorded with GPS at fine scale, corresponding to high temporal frequency and good location accuracy. Analysing this type of data provides information on the relation between landscape features and movements. We implemented an agent-based simulation approach to calculate potential trajectories constrained by the spatial environment and the individual's behaviour. We tested two functions that consider space differently: one function takes into account the geometry and the types of landscape elements, while the other, a cost function, sums up the spatial surroundings of an individual. Results highlight the fact that the cost function exaggerates the distances travelled by an individual and simplifies movement patterns. The geometry-accurate function represents a good bottom-up approach for discovering areas of interest or obstacles to movements.
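
    A minimal sketch of a cost-function mover of the kind compared here: a biased random walk on a raster in which step probabilities are inversely proportional to cell cost. The landscape, costs and step count are toy values, not those of the study.

```python
import numpy as np

def simulate_walk(cost, start, steps, rng):
    """Biased random walk on a cost raster: at each step the individual moves
    to one of the 4-neighbour cells with probability inversely proportional
    to that cell's movement cost (a simple cost-function mover)."""
    pos = start
    path = [pos]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for _ in range(steps):
        cand = [(pos[0] + dr, pos[1] + dc) for dr, dc in moves]
        cand = [c for c in cand
                if 0 <= c[0] < cost.shape[0] and 0 <= c[1] < cost.shape[1]]
        w = np.array([1.0 / cost[c] for c in cand])
        pos = cand[rng.choice(len(cand), p=w / w.sum())]
        path.append(pos)
    return path

# Toy landscape: cheap open ground (cost 1) with a costly barrier column (cost 50)
cost = np.ones((20, 20))
cost[:, 10] = 50.0
rng = np.random.default_rng(0)
path = simulate_walk(cost, (10, 2), 200, rng)
```

A geometry-accurate mover would instead query the shapes and types of landscape elements directly, which is exactly the contrast the abstract evaluates.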

  6. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    PubMed

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to develop the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrated example is cited to demonstrate the application and performance of the proposed approach.
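
    The bargaining formulation can be sketched as a Nash-type problem: maximize the product of the ECs' membership degrees under a shared resource budget. The linear memberships and the budget below are hypothetical stand-ins, not the paper's fuzzy formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical membership functions for two engineering characteristics:
# satisfaction rises linearly with the (normalized) target level.
def memberships(x):
    return np.clip(x, 0.0, 1.0)

def neg_bargain(x):
    """Negative Nash-type bargaining objective: the product of the ECs'
    membership degrees (negated, since scipy minimizes)."""
    return -np.prod(memberships(x))

budget = 1.0
cons = {"type": "ineq", "fun": lambda x: budget - x.sum()}  # x1 + x2 <= budget
res = minimize(neg_bargain, x0=[0.3, 0.3],
               bounds=[(0.0, 1.0), (0.0, 1.0)], constraints=cons)
targets = res.x
```

With symmetric linear memberships the bargaining solution splits the budget evenly (targets near 0.5 each), illustrating the Pareto-optimal compromise between conflicting ECs.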

  7. Cooperative Fuzzy Games Approach to Setting Target Levels of ECs in Quality Function Deployment

    PubMed Central

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to develop the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrated example is cited to demonstrate the application and performance of the proposed approach. PMID:25097884

  8. The Notional-Functional Approach: Teaching the Real Language in Its Natural Context.

    ERIC Educational Resources Information Center

    Laine, Elaine

    This study of the notional-functional approach to second language teaching reviews the history and theoretical background of the method, current issues, and implementation of a notional-functional syllabus. Chapter 1 discusses the history and theory of the approach and the organization and advantages of the notional-functional syllabus. Chapter 2…

  9. The Feynman-Vernon Influence Functional Approach in QED

    NASA Astrophysics Data System (ADS)

    Biryukov, Alexander; Shleenkov, Mark

    2016-10-01

    In the path integral approach we describe the evolution of interacting electromagnetic and fermionic fields using the density matrix formalism. The equation for the density matrix and the transition probability of the fermionic field is obtained as an average over the electromagnetic field influence functional. We obtain a formula for calculating the electromagnetic field influence functional for various initial and final states, and derive it explicitly for the case where both states are the vacuum. We also present the Lagrangian for a relativistic fermionic field under the influence of the electromagnetic field vacuum.

  10. A functional language approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Lu, S.-L.

    1983-01-01

    A functional programming approach for a multi-microprocessor architecture is presented. The language (based on Backus' FP), its intermediate form, and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.

  11. Breaking Functional Connectivity into Components: A Novel Approach Using an Individual-Based Model, and First Outcomes

    PubMed Central

    Pe'er, Guy; Henle, Klaus; Dislich, Claudia; Frank, Karin

    2011-01-01

    Landscape connectivity is a key factor determining the viability of populations in fragmented landscapes. Predicting ‘functional connectivity’, namely whether a patch or a landscape functions as connected from the perspective of a focal species, poses various challenges. First, empirical data on the movement behaviour of species are often scarce. Second, animal-landscape interactions are bound to yield complex patterns. Lastly, functional connectivity involves various components that are rarely assessed separately. We introduce the spatially explicit, individual-based model FunCon as a means to distinguish between components of functional connectivity and to assess how each of them affects the sensitivity of species and communities to landscape structures. We then present the results of exploratory simulations over six landscapes of different fragmentation levels and across a range of hypothetical bird species that differ in their response to habitat edges. i) Our results demonstrate that estimations of functional connectivity depend not only on the response of species to edges (avoidance versus penetration into the matrix), the movement mode investigated (home range movements versus dispersal), and the way in which the matrix is being crossed (random walk versus gap crossing), but also on the choice of connectivity measure (in this case, the model output examined). ii) We further show a strong effect of the mortality scenario applied, indicating that movement decisions that do not fully match the mortality risks are likely to reduce connectivity and enhance sensitivity to fragmentation. iii) Despite these complexities, some consistent patterns emerged. For instance, the ranking order of landscapes in terms of functional connectivity was mostly consistent across the entire range of hypothetical species, indicating that simple landscape indices can potentially serve as valuable surrogates for functional connectivity. Yet such simplifications must be carefully
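
    The interplay of matrix crossing and mortality described in point ii) can be caricatured with a one-dimensional random walker. This is a hedged toy sketch, not FunCon itself: the gap width, movement bias, and mortality rates are invented; it only illustrates that higher per-step matrix mortality lowers the fraction of successful crossings, i.e. the functional connectivity of the same physical landscape.

```python
import random

def crossing_rate(mortality, gap=10, trials=2000, seed=3):
    """Fraction of walkers that cross a matrix gap between two patches.
    Each step: die with prob `mortality`, else move +1 (toward the far
    patch, prob 0.6) or -1 (back; reaching -1 means returning home)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        pos, alive = 0, True
        while alive and 0 <= pos < gap:
            if rng.random() < mortality:
                alive = False
            else:
                pos += 1 if rng.random() < 0.6 else -1
        ok += alive and pos >= gap
    return ok / trials

low_risk = crossing_rate(0.01)
high_risk = crossing_rate(0.10)
```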

  12. Transient high frequency signal estimation: A model-based processing approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, F.L.

    1985-03-22

    By utilizing the superposition property of linear systems, a method of estimating the incident signal from reflective nondispersive data is developed. One of the basic merits of this approach is that the reflections were removed by direct application of a Wiener-type estimation algorithm, after the appropriate input was synthesized. The structure of the nondispersive signal model is well documented, and thus its credence is established. The model is stated and more effort is devoted to practical methods of estimating the model parameters. Though a general approach was developed for obtaining the reflection weights, a simpler approach was employed here, since a fairly good reflection model is available. The technique essentially consists of calculating ratios of the autocorrelation function at lag zero and at the lag where the incident signal and first reflection coincide. We initially performed our processing procedure on a measurement of a single signal. Multiple application of the processing procedure was required when we applied the reflection removal technique to a measurement containing information from the interaction of two physical phenomena. All processing was performed using SIG, an interactive signal processing package. One of the many consequences of using SIG was that repetitive operations were, for the most part, automated. A custom menu was designed to perform the deconvolution process.
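
    The autocorrelation-ratio step described above can be sketched as follows. This is a minimal illustration under stated assumptions (a white incident signal, a single reflection of known lag, invented amplitudes), not the SIG-based procedure itself: for y(t) = s(t) + a·s(t-d) with white s, the ratio R_y(d)/R_y(0) is approximately a/(1+a²), which can be inverted for the reflection weight a.

```python
import random

def autocorr(y, lag):
    """Biased sample autocorrelation at the given lag."""
    n = len(y) - lag
    return sum(y[i] * y[i + lag] for i in range(n)) / n

random.seed(1)
a_true, d = 0.5, 40                                   # invented weight and lag
s = [random.gauss(0.0, 1.0) for _ in range(20000)]    # white incident signal
y = [s[i] + (a_true * s[i - d] if i >= d else 0.0) for i in range(len(s))]

# For white s: R_y(d) / R_y(0) ~= a / (1 + a^2); invert for |a| < 1.
r = autocorr(y, d) / autocorr(y, 0)
a_hat = (1 - (1 - 4 * r * r) ** 0.5) / (2 * r)
```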

  13. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and

  14. A Comparison of Functional Models for Use in the Function-Failure Design Method

    NASA Technical Reports Server (NTRS)

    Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.

    2006-01-01

    When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur in products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and it is therefore of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs as well as redesigns.
High level and more detailed functional descriptions are derived for each failed component based

  15. Integrating models with data in ecology and palaeoecology: advances towards a model-data fusion approach.

    PubMed

    Peng, Changhui; Guiot, Joel; Wu, Haibin; Jiang, Hong; Luo, Yiqi

    2011-05-01

    It is increasingly being recognized that global ecological research requires novel methods and strategies in which to combine process-based ecological models and data in cohesive, systematic ways. Model-data fusion (MDF) is an emerging area of research in ecology and palaeoecology. It provides a new quantitative approach that offers a high level of empirical constraint over model predictions based on observations, using inverse modelling and data assimilation (DA) techniques. Increasing demands to integrate model and data methods over the past decade have led to MDF utilization in palaeoecology, ecology and earth system sciences. This paper reviews key features and principles of MDF and highlights different approaches with regard to DA. After providing a critical evaluation of the numerous benefits of MDF and its current applications in palaeoecology (i.e., palaeoclimatic reconstruction, palaeovegetation and palaeocarbon storage) and ecology (i.e., parameter and uncertainty estimation, model error identification, remote sensing and ecological forecasting), the paper discusses method limitations, current challenges and future research directions. In the ongoing data-rich era of today's world, MDF could become an important diagnostic and prognostic tool with which to improve our understanding of ecological processes while testing ecological theory and hypotheses and forecasting changes in ecosystem structure, function and services. © 2011 Blackwell Publishing Ltd/CNRS.
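
    The inverse-modelling idea at the core of MDF can be sketched with a one-parameter example. This is a hedged illustration only: the decay model, the noise level, and the grid search are invented stand-ins for the formal data-assimilation machinery the review discusses; it shows observations constraining a process-model parameter by minimizing a sum-of-squares misfit.

```python
import math
import random

# Synthetic "observations" from a known decay process plus noise.
random.seed(7)
k_true, x0 = 0.3, 10.0
ts = [0.5 * i for i in range(20)]
obs = [x0 * math.exp(-k_true * t) + random.gauss(0, 0.05) for t in ts]

def sse(k):
    """Sum-of-squares misfit between model x(t) = x0*exp(-k*t) and data."""
    return sum((x0 * math.exp(-k * t) - y) ** 2 for t, y in zip(ts, obs))

# Inverse step: pick the parameter that best explains the observations.
k_hat = min((0.01 * i for i in range(1, 100)), key=sse)
```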

  16. Priming effect and microbial diversity in ecosystem functioning and response to global change: a modeling approach using the SYMPHONY model.

    PubMed

    Perveen, Nazia; Barot, Sébastien; Alvarez, Gaël; Klumpp, Katja; Martin, Raphael; Rapaport, Alain; Herfurth, Damien; Louault, Frédérique; Fontaine, Sébastien

    2014-04-01

    Integration of the priming effect (PE) in ecosystem models is crucial to better predict the consequences of global change on ecosystem carbon (C) dynamics and its feedbacks on climate. Over the last decade, many attempts have been made to model PE in soil. However, PE has not yet been incorporated into any ecosystem models. Here, we build plant/soil models to explore how PE and microbial diversity influence soil/plant interactions and ecosystem C and nitrogen (N) dynamics in response to global change (elevated CO2 and atmospheric N deposition). Our results show that plant persistence, soil organic matter (SOM) accumulation, and low N leaching in undisturbed ecosystems rely on a fine adjustment of microbial N mineralization to plant N uptake. This adjustment can be modeled in the SYMPHONY model by considering the destruction of SOM through PE, and the interactions between two microbial functional groups: SOM decomposers and SOM builders. After estimation of parameters, SYMPHONY provided realistic predictions on forage production, soil C storage and N leaching for a permanent grassland. Consistent with recent observations, SYMPHONY predicted a CO2 -induced modification of soil microbial communities leading to an intensification of SOM mineralization and a decrease in the soil C stock. SYMPHONY also indicated that atmospheric N deposition may promote SOM accumulation via changes in the structure and metabolic activities of microbial communities. Collectively, these results suggest that the PE and functional role of microbial diversity may be incorporated in ecosystem models with a few additional parameters, improving accuracy of predictions. © 2013 John Wiley & Sons Ltd.
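
    The core mechanism, SOM destruction accelerated by microbes growing on fresh carbon, can be caricatured in a few lines. This is a hedged toy model, not the SYMPHONY equations: the pools, rate constants, and Monod-style growth term are invented for illustration. It only demonstrates the qualitative priming effect: more fresh C input means a larger decomposer pool and faster SOM loss.

```python
def simulate(fresh_input, steps=2000, dt=0.01):
    """Toy priming-effect model: decomposer biomass D grows on fresh C
    input F and co-metabolizes SOM S, so higher F => faster SOM loss."""
    S, D = 100.0, 1.0
    for _ in range(steps):
        growth = 0.5 * D * fresh_input / (fresh_input + 1.0)  # Monod-style
        dS = -0.01 * D * S      # SOM mineralization scales with D (priming)
        dD = growth - 0.2 * D   # growth minus mortality
        S += dS * dt
        D += dD * dt
    return S

s_no_input = simulate(0.0)
s_with_input = simulate(5.0)
```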

  17. A unified perspective on robot control - The energy Lyapunov function approach

    NASA Technical Reports Server (NTRS)

    Wen, John T.

    1990-01-01

    A unified framework for the stability analysis of robot tracking control is presented. By using an energy-motivated Lyapunov function candidate, closed-loop stability is shown for a large family of control laws sharing a common structure of proportional and derivative feedback and a model-based feedforward. The feedforward can be zero, partial or complete linearized dynamics, partial or complete nonlinear dynamics, or linearized or nonlinear dynamics with parameter adaptation. As a result, the dichotomous approaches to the robot control problem based on open-loop linearization and nonlinear Lyapunov analysis are both included in this treatment. Furthermore, quantitative estimates of the trade-offs between different schemes in terms of tracking performance, steady-state error, domain of convergence, real-time computation load and required a priori model information are derived.
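
    The common structure named above, proportional and derivative feedback plus a model-based feedforward, can be sketched for a unit-mass single joint. This is an illustrative sketch, not the paper's multi-link analysis: the gains, the sinusoidal reference, and the Euler integration are all assumptions made for the example.

```python
import math

def track(kp=100.0, kd=20.0, dt=0.001, T=5.0):
    """PD + exact model-based feedforward on a unit-mass 1-DOF joint.
    Desired trajectory: q_d(t) = sin(t), so the feedforward acceleration
    is q_d''(t) = -sin(t). Returns the final tracking error."""
    q, qd = 0.5, 0.0                 # start off the trajectory
    t = 0.0
    while t < T:
        e = math.sin(t) - q          # position error
        ed = math.cos(t) - qd        # velocity error
        u = -math.sin(t) + kp * e + kd * ed   # feedforward + PD feedback
        qd += u * dt                 # unit mass: q'' = u
        q += qd * dt
        t += dt
    return abs(math.sin(t) - q)

final_err = track()
```

    With the exact feedforward the tracking error contracts toward zero, consistent with the Lyapunov argument for the PD-plus-feedforward family.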

  18. The use of algorithmic behavioural transfer functions in parametric EO system performance models

    NASA Astrophysics Data System (ADS)

    Hickman, Duncan L.; Smith, Moira I.

    2015-10-01

    The use of mathematical models to predict the overall performance of an electro-optic (EO) system is well established as a methodology and is used widely to support requirements definition and system design, and to produce performance predictions. Traditionally these models have been based upon cascades of transfer functions derived from established physical theory, such as the calculation of signal levels from radiometry equations, as well as upon statistical models. However, the performance of an EO system is increasingly dominated by the on-board processing of the image data, and this automated interpretation of image content is complex in nature and presents significant modelling challenges. Models and simulations of EO systems tend either to involve processing of image data as part of a performance simulation (image-flow) or else a series of mathematical functions that attempt to define the overall system characteristics (parametric). The former approach is generally more accurate but statistically and theoretically weak in terms of specific operational scenarios, and is also time consuming. The latter approach is generally faster but is unable to provide accurate predictions of a system's performance under operational conditions. An alternative and novel architecture is presented in this paper which combines the processing speed attributes of parametric models with the accuracy of image-flow representations in a statistically valid framework. An additional dimension needed to create an effective simulation is a robust software design whose architecture reflects the structure of the EO system and its interfaces. As such, the design of the simulator can be viewed as a software prototype of a new EO system or an abstraction of an existing design. This new approach has been used successfully to model a number of complex military systems and has been shown to combine improved performance estimation with speed of computation. 
Within the paper details of the approach

  19. Metal mixture modeling evaluation project: 2. Comparison of four modeling approaches.

    PubMed

    Farley, Kevin J; Meyer, Joseph S; Balistrieri, Laurie S; De Schamphelaere, Karel A C; Iwasaki, Yuichi; Janssen, Colin R; Kamo, Masashi; Lofts, Stephen; Mebane, Christopher A; Naito, Wataru; Ryan, Adam C; Santore, Robert C; Tipping, Edward

    2015-04-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the US Geological Survey (USA), HDR|HydroQual (USA), and the Centre for Ecology and Hydrology (United Kingdom) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME workshop in Brussels, Belgium (May 2012), is provided in the present study. Overall, the models were found to be similar in structure (free ion activities computed by the Windermere humic aqueous model [WHAM]; specific or nonspecific binding of metals/cations in or on the organism; specification of metal potency factors or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single vs. multiple types of binding sites on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong interrelationships among the model parameters (binding constants, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed. © 2014 SETAC.
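
    The additive-toxicity baseline against which the four models are compared can be sketched with toxic units. This is a minimal sketch only: the real MMME models compute free-ion activities with WHAM and organism binding, whereas here the EC50 values and concentrations are invented; concentration addition simply predicts a half-maximal effect when the toxic units sum to 1.

```python
# Hypothetical single-metal EC50 values (concentration units arbitrary).
ec50 = {"Cu": 10.0, "Zn": 100.0}

def toxic_units(conc):
    """Concentration-addition model: each metal contributes C_i / EC50_i."""
    return sum(c / ec50[m] for m, c in conc.items())

# Half an EC50 of each metal: additivity predicts exactly 1 toxic unit.
tu = toxic_units({"Cu": 5.0, "Zn": 50.0})
```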

  20. Finding your inner modeler: An NSF-sponsored workshop to introduce cell biologists to modeling/computational approaches.

    PubMed

    Stone, David E; Haswell, Elizabeth S; Sztul, Elizabeth

    2017-01-01

    In classical Cell Biology, fundamental cellular processes are revealed empirically, one experiment at a time. While this approach has been enormously fruitful, our understanding of cells is far from complete. In fact, the more we know, the more keenly we perceive our ignorance of the profoundly complex and dynamic molecular systems that underlie cell structure and function. Thus, it has become apparent to many cell biologists that experimentation alone is unlikely to yield major new paradigms, and that empiricism must be combined with theory and computational approaches to yield major new discoveries. To facilitate those discoveries, three workshops will convene annually for one day in three successive summers (2017-2019) to promote the use of computational modeling by cell biologists currently unconvinced of its utility or unsure how to apply it. The first of these workshops was held at the University of Illinois, Chicago in July 2017. Organized to facilitate interactions between traditional cell biologists and computational modelers, it provided a unique educational opportunity: a primer on how cell biologists with little or no relevant experience can incorporate computational modeling into their research. Here, we report on the workshop and describe how it addressed key issues that cell biologists face when considering modeling including: (1) Is my project appropriate for modeling? (2) What kind of data do I need to model my process? (3) How do I find a modeler to help me in integrating modeling approaches into my work? And, perhaps most importantly, (4) why should I bother?

  1. Using combined hydrological variables for extracting functional signatures of catchments to better assess the acceptability of model structures in conceptual catchment modelling

    NASA Astrophysics Data System (ADS)

    Fovet, O.; Hrachowitz, M.; RUIZ, L.; Gascuel-odoux, C.; Savenije, H.

    2013-12-01

    While most hydrological models reproduce the general flow dynamics of a system, they frequently fail to adequately mimic system internal processes. This is likely to make them inadequate to simulate solutes transport. For example, the hysteresis between storage and discharge, which is often observed in shallow hard-rock aquifers, is rarely well reproduced by models. One main reason is that this hysteresis has little weight in the calibration because objective functions are based on time series of individual variables. This reduces the ability of classical calibration/validation procedures to assess the relevance of the conceptual hypothesis associated with hydrological models. Calibrating models on variables derived from the combination of different individual variables (like stream discharge and groundwater levels) is a way to insure that models will be accepted based on their consistency. Here we therefore test the value of this more systems-like approach to test different hypothesis on the behaviour of a small experimental low-land catchment in French Brittany (ORE AgrHys) where a high hysteresis is observed on the stream flow vs. shallow groundwater level relationship. Several conceptual models were applied to this site, and calibrated using objective functions based on metrics of this hysteresis. The tested model structures differed with respect to the storage function in each reservoir, the storage-discharge function in each reservoir, the deep loss expressions (as constant or variable fraction), the number of reservoirs (from 1 to 4) and their organization (parallel, series). The observed hysteretic groundwater level-discharge relationship was not satisfactorily reproduced by most of the tested models except for the most complex ones. Those were thus more consistent, their underlying hypotheses are probably more realistic even though their performance for simulating observed stream flow was decreased. 
Selecting models based on such systems-like approach is

  2. Adaptive cruise control with stop&go function using the state-dependent nonlinear model predictive control approach.

    PubMed

    Shakouri, Payman; Ordys, Andrzej; Askari, Mohamad R

    2012-09-01

    In the design of adaptive cruise control (ACC) systems, two separate control loops - an outer loop to maintain the safe distance from the vehicle traveling in front and an inner loop to control the brake pedal and throttle opening position - are commonly used. In this paper a different approach is proposed in which a single control loop is utilized. The objective of distance tracking is incorporated into a single nonlinear model predictive control (NMPC) scheme by extending the original linear time-invariant (LTI) models obtained by linearizing the nonlinear dynamic model of the vehicle. This is achieved by introducing additional states corresponding to the relative distance between the leading and following vehicles, and also the velocity of the leading vehicle. Control of the brake and throttle position is implemented by taking the state-dependent approach. The model proves more effective in tracking the speed and distance by eliminating the necessity of switching between two controllers. It also offers smooth variation in the brake and throttle control signals, which subsequently results in a more uniform acceleration of the vehicle. The results of the proposed method are compared with other ACC systems using two separate control loops. Furthermore, ACC simulation results using a stop&go scenario are shown, demonstrating a better fulfillment of the design requirements. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
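
    The state-augmentation idea, folding relative distance and leader velocity into one control loop, can be sketched with a single linear feedback law. This is a hedged illustration, not the paper's NMPC: the gains, initial conditions, safe distance, and acceleration limits are all invented; it only shows one controller simultaneously regulating spacing and speed.

```python
def simulate(v0_f=20.0, v_lead=15.0, d0=50.0, d_safe=30.0,
             k_d=0.2, k_v=0.8, dt=0.1, T=60.0):
    """Single-loop ACC sketch on the augmented state
    [v_follower, d_rel, v_leader] with hypothetical gains: one law
    u = k_d*(d - d_safe) + k_v*(v_lead - v_f) handles both objectives."""
    v_f, d = v0_f, d0
    t = 0.0
    while t < T:
        u = k_d * (d - d_safe) + k_v * (v_lead - v_f)  # accel command
        u = max(-3.0, min(2.0, u))                     # brake/throttle limits
        v_f += u * dt
        d += (v_lead - v_f) * dt                       # relative-distance state
        t += dt
    return d, v_f

d_end, v_end = simulate()
```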

  3. Basis function models for animal movement

    USGS Publications Warehouse

    Hooten, Mevin B.; Johnson, Devin S.

    2017-01-01

    Advances in satellite-based data collection techniques have served as a catalyst for new statistical methodology to analyze these data. In wildlife ecological studies, satellite-based data and methodology have provided a wealth of information about animal space use and the investigation of individual-based animal–environment relationships. With the technology for data collection improving dramatically over time, we are left with massive archives of historical animal telemetry data of varying quality. While many contemporary statistical approaches for inferring movement behavior are specified in discrete time, we develop a flexible continuous-time stochastic integral equation framework that is amenable to reduced-rank second-order covariance parameterizations. We demonstrate how the associated first-order basis functions can be constructed to mimic behavioral characteristics in realistic trajectory processes using telemetry data from mule deer and mountain lion individuals in western North America. Our approach is parallelizable and provides inference for heterogenous trajectories using nonstationary spatial modeling techniques that are feasible for large telemetry datasets. Supplementary materials for this article are available online.
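
    The reduced-rank idea, representing a trajectory as a weighted sum of a small set of basis functions, can be sketched directly. This is an illustrative sketch, not the authors' stochastic integral equation framework: the Gaussian bumps, their centers and widths, and the weights are all invented for the example.

```python
import math

# Hypothetical basis centers along the time axis.
centers = [0.0, 2.5, 5.0, 7.5, 10.0]

def phi(t, c, width=1.5):
    """Gaussian basis function centered at c."""
    return math.exp(-((t - c) / width) ** 2)

def position(t, weights):
    """Trajectory coordinate as a weighted sum of basis functions."""
    return sum(w * phi(t, c) for w, c in zip(weights, centers))

# Symmetric weights produce a path that peaks at the middle of the track.
w = [0.0, 1.0, 3.0, 1.0, 0.0]
path = [position(t / 10.0, w) for t in range(101)]
```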

  4. Verifying the functional ability of microstructured surfaces by model-based testing

    NASA Astrophysics Data System (ADS)

    Hartmann, Wito; Weckenmann, Albert

    2014-09-01

    Micro- and nanotechnology enables the use of new product features such as improved light absorption, self-cleaning or protection, which are based, on the one hand, on the size of functional nanostructures and the other hand, on material-specific properties. With the need to reliably measure progressively smaller geometric features, coordinate and surface-measuring instruments have been refined and now allow high-resolution topography and structure measurements down to the sub-nanometre range. Nevertheless, in many cases it is not possible to make a clear statement about the functional ability of the workpiece or its topography because conventional concepts of dimensioning and tolerancing are solely geometry oriented and standardized surface parameters are not sufficient to consider interaction with non-geometric parameters, which are dominant for functions such as sliding, wetting, sealing and optical reflection. To verify the functional ability of microstructured surfaces, a method was developed based on a parameterized mathematical-physical model of the function. From this model, function-related properties can be identified and geometric parameters can be derived, which may be different for the manufacturing and verification processes. With this method it is possible to optimize the definition of the shape of the workpiece regarding the intended function by applying theoretical and experimental knowledge, as well as modelling and simulation. Advantages of this approach will be discussed and demonstrated by the example of a microstructured inking roll.

  5. Distribution function approach to redshift space distortions. Part IV: perturbation theory applied to dark matter

    NASA Astrophysics Data System (ADS)

    Vlah, Zvonimir; Seljak, Uroš; McDonald, Patrick; Okumura, Teppei; Baldauf, Tobias

    2012-11-01

    We develop a perturbative approach to redshift space distortions (RSD) using the phase space distribution function approach and apply it to the dark matter redshift space power spectrum and its moments. RSD can be written as a sum over density weighted velocity moments correlators, with the lowest order being density, momentum density and stress energy density. We use standard and extended perturbation theory (PT) to determine their auto and cross correlators, comparing them to N-body simulations. We show which of the terms can be modeled well with the standard PT and which need additional terms that include higher order corrections which cannot be modeled in PT. Most of these additional terms are related to the small scale velocity dispersion effects, the so called finger of god (FoG) effects, which affect some, but not all, of the terms in this expansion, and which can be approximately modeled using a simple physically motivated ansatz such as the halo model. We point out that there are several velocity dispersions that enter into the detailed RSD analysis with very different amplitudes, which can be approximately predicted by the halo model. In contrast to previous models our approach systematically includes all of the terms at a given order in PT and provides a physical interpretation for the small scale dispersion values. We investigate RSD power spectrum as a function of μ, the cosine of the angle between the Fourier mode and line of sight, focusing on the lowest order powers of μ and multipole moments which dominate the observable RSD power spectrum. Overall we find considerable success in modeling many, but not all, of the terms in this expansion. This is similar to the situation in real space, but predicting power spectrum in redshift space is more difficult because of the explicit influence of small scale dispersion type effects in RSD, which extend to very large scales.
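
    As a minimal illustration of the μ-dependence discussed above, the lowest-order (Kaiser) limit of the redshift-space power spectrum has a closed-form monopole that can be checked numerically. This is a standard textbook limit, not the paper's full velocity-moment expansion; the growth rate f is an invented value and P(k) is set to 1.

```python
# Kaiser limit for an unbiased tracer: P_s(k, mu) = (1 + f*mu^2)^2 * P(k).
# Its monopole over mu in [0, 1] is (1 + 2f/3 + f^2/5) * P(k).
f = 0.55  # assumed growth rate

def kaiser(mu):
    return (1.0 + f * mu * mu) ** 2   # times P(k), taken as 1 here

# Midpoint-rule average over mu in [0, 1] versus the closed form.
n = 100000
mono = sum(kaiser((i + 0.5) / n) for i in range(n)) / n
analytic = 1.0 + 2.0 * f / 3.0 + f * f / 5.0
```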

  6. Approach to functional magnetic resonance imaging of language based on models of language organization.

    PubMed

    McGraw, P; Mathews, V P; Wang, Y; Phillips, M D

    2001-05-01

    Functional MR imaging (fMRI) has been a useful tool in the evaluation of language both in normal individuals and patient populations. The purpose of this article is to use various models of language as a framework to review fMRI studies. Specifically, fMRI language studies are subdivided into the following categories: word generation or fluency, passive listening, orthography, phonology, semantics, and syntax.

  7. Modeling corneal surfaces with rational functions for high-speed videokeratoscopy data compression.

    PubMed

    Schneider, Martin; Iskander, D Robert; Collins, Michael J

    2009-02-01

    High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this new technique results in a very large amount of digital data for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions that have been traditionally used for modeling corneal surfaces may not necessarily correctly represent given corneal surface data in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error as well as the point spread function cross-correlation. The parameters of approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
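
    The rational-function fitting step can be sketched in one dimension. This is a hedged stand-in for the paper's method: instead of Zernike-based rational functions fitted with Levenberg-Marquardt, it fits a first-order rational y ≈ (a0 + a1·x)/(1 + b1·x) by the classical linearized rewrite y = a0 + a1·x - b1·x·y, which is an ordinary linear least-squares problem.

```python
def fit_rational(xs, ys):
    """Fit y ~ (a0 + a1*x) / (1 + b1*x) via the linearized rewrite
    y = a0 + a1*x - b1*x*y, solved by 3x3 normal equations."""
    rows = [[1.0, x, -x * y] for x, y in zip(xs, ys)]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Gauss-Jordan elimination with partial pivoting on the 3x3 system.
    m = [ata[i] + [aty[i]] for i in range(3)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(3):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [m[r][k] - f * m[c][k] for k in range(4)]
    return [m[i][3] / m[i][i] for i in range(3)]  # a0, a1, b1

# Noise-free data from a known rational function is recovered exactly.
xs = [i / 10.0 for i in range(11)]
ys = [(2.0 + 0.5 * x) / (1.0 + 0.3 * x) for x in xs]
a0, a1, b1 = fit_rational(xs, ys)
```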

  8. A Comparison of Two-Stage Approaches for Fitting Nonlinear Ordinary Differential Equation Models with Mixed Effects.

    PubMed

    Chow, Sy-Miin; Bendezú, Jason J; Cole, Pamela M; Ram, Nilam

    2016-01-01

Several approaches exist for estimating the derivatives of observed data for model exploration purposes, including functional data analysis (FDA; Ramsay & Silverman, 2005), generalized local linear approximation (GLLA; Boker, Deboeck, Edler, & Peel, 2010), and generalized orthogonal local derivative approximation (GOLD; Deboeck, 2010). These derivative estimation procedures can be used in a two-stage process to fit mixed effects ordinary differential equation (ODE) models. While the performance and utility of these routines for estimating linear ODEs have been established, they have not yet been evaluated in the context of nonlinear ODEs with mixed effects. We compared properties of the GLLA and GOLD to an FDA-based two-stage approach denoted herein as functional ordinary differential equation with mixed effects (FODEmixed) in a Monte Carlo (MC) study using a nonlinear coupled oscillators model with mixed effects. Simulation results showed that overall, the FODEmixed outperformed both the GLLA and GOLD across all the embedding dimensions considered, but a novel use of a fourth-order GLLA approach combined with very high embedding dimensions yielded estimation results that almost paralleled those from the FODEmixed. We discuss the strengths and limitations of each approach and demonstrate how output from each stage of FODEmixed may be used to inform empirical modeling of young children's self-regulation.
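As a rough illustration of the first stage of such two-stage approaches, the sketch below estimates derivatives by fitting a straight line over a sliding window of the series, a simplified first-order version of GLLA. The function name, window width, and test signal are illustrative, not taken from the cited papers:

```python
def local_linear_derivative(x, dt, window=5):
    """Estimate first derivatives of a time series by fitting a straight
    line (ordinary least squares slope) within each sliding window of
    `window` points -- a simplified, first-order local linear
    approximation in the spirit of GLLA."""
    half = window // 2
    # centred time offsets for one window, e.g. [-2, -1, 0, 1, 2] * dt
    t = [(i - half) * dt for i in range(window)]
    t_mean = sum(t) / window
    denom = sum((ti - t_mean) ** 2 for ti in t)
    derivs = []
    for c in range(half, len(x) - half):
        seg = x[c - half:c + half + 1]
        s_mean = sum(seg) / window
        slope = sum((t[i] - t_mean) * (seg[i] - s_mean)
                    for i in range(window)) / denom
        derivs.append(slope)
    return derivs

# example: x(t) = t**2 sampled at dt = 0.1; the true derivative is 2t,
# and a symmetric linear fit recovers it exactly for a quadratic signal
dt = 0.1
xs = [(i * dt) ** 2 for i in range(50)]
dx = local_linear_derivative(xs, dt)  # dx[j] estimates x'((j + 2) * dt)
```

In a second stage, the estimated derivatives would be regressed on the states to recover ODE parameters, with random effects added across individuals.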

  9. A trust region approach with multivariate Padé model for optimal circuit design

    NASA Astrophysics Data System (ADS)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.
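To illustrate why a Padé surrogate can outperform a linear model at comparable cost, the sketch below builds a one-dimensional [1/1] Padé approximant from the first three Taylor coefficients of a function and compares its error against the first-order (linear) model for exp(x). This is a generic textbook construction, not the multivariate model of the article:

```python
import math

def pade_1_1(c0, c1, c2):
    """Build the [1/1] Pade approximant R(x) = (a0 + a1*x)/(1 + b1*x)
    from the first three Taylor coefficients c0, c1, c2.
    Matching the series through x**2 gives b1 = -c2/c1."""
    b1 = -c2 / c1
    a0 = c0
    a1 = c1 + c0 * b1
    return lambda x: (a0 + a1 * x) / (1.0 + b1 * x)

# Taylor coefficients of exp(x) about 0: 1, 1, 1/2
r = pade_1_1(1.0, 1.0, 0.5)   # -> (1 + x/2) / (1 - x/2)

x = 0.5
pade_err = abs(r(x) - math.exp(x))
linear_err = abs((1.0 + x) - math.exp(x))   # first-order Taylor model
```

With the same three pieces of local information as a quadratic model, the rational form captures curvature that a linear model misses, which is the motivation for using Padé surrogates inside a trust-region loop.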

  10. Systems engineering interfaces: A model based approach

    NASA Astrophysics Data System (ADS)

    Fosse, E.; Delp, C. L.

    The engineering of interfaces is a critical function of the discipline of Systems Engineering. Included in interface engineering are instances of interaction. Interfaces provide the specifications of the relevant properties of a system or component that can be connected to other systems or components while instances of interaction are identified in order to specify the actual integration to other systems or components. Current Systems Engineering practices rely on a variety of documents and diagrams to describe interface specifications and instances of interaction. The SysML[1] specification provides a precise model based representation for interfaces and interface instance integration. This paper will describe interface engineering as implemented by the Operations Revitalization Task using SysML, starting with a generic case and culminating with a focus on a Flight System to Ground Interaction. The reusability of the interface engineering approach presented as well as its extensibility to more complex interfaces and interactions will be shown. Model-derived tables will support the case studies shown and are examples of model-based documentation products.

  11. Neural modeling and functional neuroimaging.

    PubMed

    Horwitz, B; Sporns, O

    1994-01-01

Two research areas that so far have had little interaction with one another are functional neuroimaging and computational neuroscience. The application of computational models and techniques to the inherently rich data sets generated by "standard" neurophysiological methods has proven useful for interpreting these data sets and for providing predictions and hypotheses for further experiments. We suggest that both theory- and data-driven computational modeling of neuronal systems can help to interpret data generated by functional neuroimaging methods, especially those used with human subjects. In this article, we point out four sets of questions, addressable by computational neuroscientists, whose answers would be of value and interest to those who perform functional neuroimaging. The first set consists of determining the neurobiological substrate of the signals measured by functional neuroimaging. The second set concerns developing systems-level models of functional neuroimaging data. The third set of questions involves integrating functional neuroimaging data across modalities, with a particular emphasis on relating electromagnetic with hemodynamic data. The last set asks how one can relate systems-level models to those at the neuronal and neural ensemble levels. We feel that there are ample reasons to link functional neuroimaging and neural modeling, and that combining the results from the two disciplines will further our understanding of the central nervous system. This article is a US Government work and, as such, is in the public domain in the United States of America. Copyright © 1994 Wiley-Liss, Inc.

  12. Modeling continuous covariates with a "spike" at zero: Bivariate approaches.

    PubMed

    Jenkner, Carolin; Lorenz, Eva; Becher, Heiko; Sauerbrei, Willi

    2016-07-01

In epidemiology and clinical research, predictors often take the value zero for a large proportion of observations while the distribution of the remaining observations is continuous. These predictors are called variables with a spike at zero. Examples include smoking or alcohol consumption. Recently, an extension of the fractional polynomial (FP) procedure, a technique for modeling nonlinear relationships, was proposed to deal with such situations. To indicate whether or not a value is zero, a binary variable is added to the model. In a two-stage procedure, called FP-spike, the necessity of the binary variable and/or the continuous FP function for the positive part is assessed to obtain a suitable fit. In univariate analyses, the FP-spike procedure usually leads to functional relationships that are easy to interpret. This paper introduces four approaches for dealing with two variables with a spike at zero (SAZ). The methods depend on the bivariate distribution of zero and nonzero values. Bi-Sep is the simplest of the four bivariate approaches. It uses the univariate FP-spike procedure separately for the two SAZ variables. In Bi-D3, Bi-D1, and Bi-Sub, the proportions of zeros in both variables are considered simultaneously in the binary indicators. Therefore, these strategies can account for correlated variables. The methods can be used for arbitrary distributions of the covariates. For illustration and comparison of results, data from a case-control study on laryngeal cancer, with smoking and alcohol intake as two SAZ variables, are considered. In addition, a possible extension to three or more SAZ variables is outlined. The use of log-linear models for analyzing the correlation structure in combination with the bivariate approaches is proposed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
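A minimal sketch of the univariate FP-spike idea: augment the design with a binary indicator z = 1(x > 0) and an FP transform of the positive part (here the power-0 transform, log x, chosen purely for illustration; the real procedure selects the power), then fit by least squares. The data and helper names are hypothetical:

```python
import math

def spike_at_zero_design(x):
    """Build FP-spike covariates for a spike-at-zero variable:
    a binary indicator z = 1(x > 0) plus a fractional-polynomial
    transform of the positive part (FP power 0, i.e. log x, used here
    for illustration only)."""
    z = [1.0 if v > 0 else 0.0 for v in x]
    fp = [math.log(v) if v > 0 else 0.0 for v in x]
    return z, fp

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian
    elimination with partial pivoting (fine for a few covariates)."""
    n, p = len(y), len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
         for a in range(p)]
    rhs = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for col in range(p):                      # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for cc in range(col, p):
                A[r][cc] -= f * A[col][cc]
            rhs[r] -= f * rhs[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):            # back substitution
        beta[r] = (rhs[r] - sum(A[r][c] * beta[c]
                                for c in range(r + 1, p))) / A[r][r]
    return beta

# toy data generated from y = 1 + 2*z + 0.5*log(x) for x > 0, y = 1 at x = 0
xs = [0, 0, 0, 1, 2, 4, 8, 16]
z, fp = spike_at_zero_design(xs)
ys = [1 + 2 * zi + 0.5 * fi for zi, fi in zip(z, fp)]
X = [[1.0, zi, fi] for zi, fi in zip(z, fp)]
beta = ols(X, ys)   # recovers [1, 2, 0.5]
```

The bivariate strategies in the paper differ in how the two binary indicators are combined, but each reduces to a design-matrix construction of this kind.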

  13. A novel approach to multihazard modeling and simulation.

    PubMed

    Smith, Silas W; Portelli, Ian; Narzisi, Giuseppe; Nelson, Lewis S; Menges, Fabian; Rekow, E Dianne; Mincer, Joshua S; Mishra, Bhubaneswar; Goldfrank, Lewis R

    2009-06-01

    To develop and apply a novel modeling approach to support medical and public health disaster planning and response using a sarin release scenario in a metropolitan environment. An agent-based disaster simulation model was developed incorporating the principles of dose response, surge response, and psychosocial characteristics superimposed on topographically accurate geographic information system architecture. The modeling scenarios involved passive and active releases of sarin in multiple transportation hubs in a metropolitan city. Parameters evaluated included emergency medical services, hospital surge capacity (including implementation of disaster plan), and behavioral and psychosocial characteristics of the victims. In passive sarin release scenarios of 5 to 15 L, mortality increased nonlinearly from 0.13% to 8.69%, reaching 55.4% with active dispersion, reflecting higher initial doses. Cumulative mortality rates from releases in 1 to 3 major transportation hubs similarly increased nonlinearly as a function of dose and systemic stress. The increase in mortality rate was most pronounced in the 80% to 100% emergency department occupancy range, analogous to the previously observed queuing phenomenon. Effective implementation of hospital disaster plans decreased mortality and injury severity. Decreasing ambulance response time and increasing available responding units reduced mortality among potentially salvageable patients. Adverse psychosocial characteristics (excess worry and low compliance) increased demands on health care resources. Transfer to alternative urban sites was possible. An agent-based modeling approach provides a mechanism to assess complex individual and systemwide effects in rare events.

  14. The significance of the choice of radiobiological (NTCP) models in treatment plan objective functions.

    PubMed

    Miller, J; Fuller, M; Vinod, S; Suchowerska, N; Holloway, L

    2009-06-01

A clinician's discrimination between radiation therapy treatment plans is traditionally a subjective process, based on experience and existing protocols. A more objective and quantitative approach to distinguishing between treatment plans is to use radiobiological or dosimetric objective functions, based on radiobiological or dosimetric models. The efficacy of these models is not well understood, nor is the correlation between the plan rankings they produce and those of the traditional subjective approach. One such radiobiological model is the normal tissue complication probability (NTCP). Dosimetric models or indicators are more widely accepted in clinical practice. In this study, three radiobiological models, the Lyman NTCP, critical volume NTCP, and relative seriality NTCP models, and three dosimetric models, mean lung dose (MLD) and the lung volumes irradiated to at least 10 Gy (V10) and 20 Gy (V20), were used to rank a series of treatment plans using harm to normal (lung) tissue as the objective criterion. None of the models considered in this study showed consistent correlation with the Radiation Oncologists' plan ranking. If radiobiological or dosimetric models are to be used in objective functions for lung treatments, this study suggests the Lyman NTCP model, because it provides the most consistency with traditional clinician ranking.
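The Lyman NTCP model referenced above has a compact closed form: NTCP = Φ((gEUD − TD50)/(m·TD50)), where Φ is the standard normal CDF and gEUD = (Σᵢ vᵢ Dᵢ^(1/n))ⁿ reduces a dose-volume histogram to a generalized equivalent uniform dose. A minimal sketch follows; the parameter values and histogram are illustrative, not clinically validated:

```python
import math

def geud(dvh, n):
    """Generalized equivalent uniform dose from a list of
    (dose_in_Gy, fractional_volume) bins; n is the LKB volume parameter."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lyman_ntcp(eud, td50, m):
    """Lyman NTCP: the standard normal CDF evaluated at
    t = (EUD - TD50) / (m * TD50)."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# illustrative (NOT clinically validated) lung-like parameters
td50, m, n = 24.5, 0.18, 0.87
dvh = [(10.0, 0.4), (20.0, 0.4), (40.0, 0.2)]  # dose bins, fractional volumes
p = lyman_ntcp(geud(dvh, n), td50, m)          # complication probability
```

By construction the model returns 0.5 when the equivalent uniform dose equals TD50, which is the defining property of the tolerance-dose parameter.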

  15. From Process Understanding Via Soil Functions to Sustainable Soil Management - A Systemic Approach

    NASA Astrophysics Data System (ADS)

    Wollschlaeger, U.; Bartke, S.; Bartkowski, B.; Daedlow, K.; Helming, K.; Kogel-Knabner, I.; Lang, B.; Rabot, E.; Russell, D.; Stößel, B.; Weller, U.; Wiesmeier, M.; Vogel, H. J.

    2017-12-01

Fertile soils are central resources for the production of biomass and the provision of food and energy. A growing world population and recent climate targets lead to an increasing demand for both food and bio-energy, which requires preserving and improving the long-term productivity of soils as a bio-economic resource. At the same time, other soil functions and ecosystem services need to be maintained: filtering for clean water, carbon sequestration, provision and recycling of nutrients, and habitat for biological activity. All these soil functions result from the interaction of a multitude of physical, chemical, and biological processes that are not yet sufficiently understood. In addition, we lack understanding of the interplay between the socio-economic system and the soil system and of how soil functions benefit human wellbeing. Hence, a solid and integrated assessment of soil quality requires consideration of the ensemble of soil functions and their relation to soil management, so that site-specific options for sustainable soil management can be developed. We present an integrated modeling approach that investigates the influence of soil management on the ensemble of soil functions. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. As the evidence base required for feeding the model is for the most part stored in the existing scientific literature, another central component of our work is to set up a public "knowledge portal" providing the infrastructure for a community effort towards a comprehensive knowledge base on soil processes as a basis for model development. The connection to the socio-economic system is established using the Drivers-Pressures-States-Impacts-Responses (DPSIR) framework, in which our improved understanding of soil ecosystem processes is linked to ecosystem services and resource efficiency via the soil functions.

  16. Asymptotic-preserving Lagrangian approach for modeling anisotropic transport in magnetized plasmas

    NASA Astrophysics Data System (ADS)

    Chacon, Luis; Del-Castillo-Negrete, Diego

    2012-03-01

Modeling electron transport in magnetized plasmas is extremely challenging due to the extreme anisotropy between the parallel (to the magnetic field) and perpendicular directions (the transport-coefficient ratio χ∥/χ⊥ ~ 10^10 in fusion plasmas). Recently, a novel Lagrangian Green's function method has been proposed [D. del-Castillo-Negrete and L. Chacón, PRL 106, 195004 (2011); D. del-Castillo-Negrete and L. Chacón, Phys. Plasmas, submitted (2011)] to solve the local and non-local purely parallel transport equation in general 3D magnetic fields. The approach avoids numerical pollution, is inherently positivity-preserving, and is scalable algorithmically (i.e., work per degree of freedom is grid-independent). In this poster, we discuss the extension of the Lagrangian Green's function approach to include perpendicular transport terms and sources. We present an asymptotic-preserving numerical formulation, which ensures a consistent numerical discretization temporally and spatially for arbitrary χ∥/χ⊥ ratios. We will demonstrate the potential of the approach with various challenging configurations, including the case of transport across a magnetic island in cylindrical geometry.

  17. From Network Analysis to Functional Metabolic Modeling of the Human Gut Microbiota.

    PubMed

    Bauer, Eugen; Thiele, Ines

    2018-01-01

    An important hallmark of the human gut microbiota is its species diversity and complexity. Various diseases have been associated with a decreased diversity leading to reduced metabolic functionalities. Common approaches to investigate the human microbiota include high-throughput sequencing with subsequent correlative analyses. However, to understand the ecology of the human gut microbiota and consequently design novel treatments for diseases, it is important to represent the different interactions between microbes with their associated metabolites. Computational systems biology approaches can give further mechanistic insights by constructing data- or knowledge-driven networks that represent microbe interactions. In this minireview, we will discuss current approaches in systems biology to analyze the human gut microbiota, with a particular focus on constraint-based modeling. We will discuss various community modeling techniques with their advantages and differences, as well as their application to predict the metabolic mechanisms of intestinal microbial communities. Finally, we will discuss future perspectives and current challenges of simulating realistic and comprehensive models of the human gut microbiota.

  18. A system decomposition approach to the design of functional observers

    NASA Astrophysics Data System (ADS)

    Fernando, Tyrone; Trinh, Hieu

    2014-09-01

    This paper reports a system decomposition that allows the construction of a minimum-order functional observer using a state observer design approach. The system decomposition translates the functional observer design problem to that of a state observer for a smaller decomposed subsystem. Functional observability indices are introduced, and a closed-form expression for the minimum order required for a functional observer is derived in terms of those functional observability indices.

  19. Wavelet-based functional linear mixed models: an application to measurement error-corrected distributed lag models.

    PubMed

    Malloy, Elizabeth J; Morris, Jeffrey S; Adar, Sara D; Suh, Helen; Gold, Diane R; Coull, Brent A

    2010-07-01

    Frequently, exposure data are measured over time on a grid of discrete values that collectively define a functional observation. In many applications, researchers are interested in using these measurements as covariates to predict a scalar response in a regression setting, with interest focusing on the most biologically relevant time window of exposure. One example is in panel studies of the health effects of particulate matter (PM), where particle levels are measured over time. In such studies, there are many more values of the functional data than observations in the data set so that regularization of the corresponding functional regression coefficient is necessary for estimation. Additional issues in this setting are the possibility of exposure measurement error and the need to incorporate additional potential confounders, such as meteorological or co-pollutant measures, that themselves may have effects that vary over time. To accommodate all these features, we develop wavelet-based linear mixed distributed lag models that incorporate repeated measures of functional data as covariates into a linear mixed model. A Bayesian approach to model fitting uses wavelet shrinkage to regularize functional coefficients. We show that, as long as the exposure error induces fine-scale variability in the functional exposure profile and the distributed lag function representing the exposure effect varies smoothly in time, the model corrects for the exposure measurement error without further adjustment. Both these conditions are likely to hold in the environmental applications we consider. We examine properties of the method using simulations and apply the method to data from a study examining the association between PM, measured as hourly averages for 1-7 days, and markers of acute systemic inflammation. We use the method to fully control for the effects of confounding by other time-varying predictors, such as temperature and co-pollutants.
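The wavelet shrinkage used to regularize the functional coefficients can be illustrated in its simplest form: a Haar decomposition followed by soft-thresholding of the detail coefficients, which suppresses fine-scale variability while keeping the smooth trend. This is a self-contained sketch, not the authors' Bayesian mixed-model implementation:

```python
def haar_forward(x):
    """Full Haar wavelet decomposition of a length-2**k signal.
    Returns [details_level1, details_level2, ..., [final_approximation]]."""
    coeffs, a = [], list(x)
    while len(a) > 1:
        half = len(a) // 2
        approx = [(a[2 * i] + a[2 * i + 1]) / 2.0 for i in range(half)]
        detail = [(a[2 * i] - a[2 * i + 1]) / 2.0 for i in range(half)]
        coeffs.append(detail)
        a = approx
    coeffs.append(a)
    return coeffs

def soft_threshold(coeffs, lam):
    """Wavelet shrinkage: soft-threshold every detail coefficient,
    leaving the final approximation untouched."""
    def s(c):
        return (abs(c) - lam) * (1 if c > 0 else -1) if abs(c) > lam else 0.0
    return [[s(c) for c in level] for level in coeffs[:-1]] + [coeffs[-1]]

def haar_inverse(coeffs):
    """Invert haar_forward."""
    a = list(coeffs[-1])
    for detail in reversed(coeffs[:-1]):
        nxt = []
        for ai, di in zip(a, detail):
            nxt.extend([ai + di, ai - di])
        a = nxt
    return a

# a flat signal with one small noisy bump: shrinkage removes the fine detail
x = [1.0, 1.0, 1.0, 1.2, 1.0, 1.0, 1.0, 1.0]
rec = haar_inverse(soft_threshold(haar_forward(x), lam=0.15))
```

Because fine-scale measurement error loads almost entirely on the small detail coefficients, thresholding them is what gives the measurement-error-correction property discussed in the abstract.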

  20. Hybrid modeling in biochemical systems theory by means of functional petri nets.

    PubMed

    Wu, Jialiang; Voit, Eberhard

    2009-02-01

    Many biological systems are genuinely hybrids consisting of interacting discrete and continuous components and processes that often operate at different time scales. It is therefore desirable to create modeling frameworks capable of combining differently structured processes and permitting their analysis over multiple time horizons. During the past 40 years, Biochemical Systems Theory (BST) has been a very successful approach to elucidating metabolic, gene regulatory, and signaling systems. However, its foundation in ordinary differential equations has precluded BST from directly addressing problems containing switches, delays, and stochastic effects. In this study, we extend BST to hybrid modeling within the framework of Hybrid Functional Petri Nets (HFPN). First, we show how the canonical GMA and S-system models in BST can be directly implemented in a standard Petri Net framework. In a second step we demonstrate how to account for different types of time delays as well as for discrete, stochastic, and switching effects. Using representative test cases, we validate the hybrid modeling approach through comparative analyses and simulations with other approaches and highlight the feasibility, quality, and efficiency of the hybrid method.
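The continuous (BST) part that the article embeds into Hybrid Functional Petri Nets is the canonical S-system form dXᵢ/dt = αᵢ Πⱼ Xⱼ^gᵢⱼ − βᵢ Πⱼ Xⱼ^hᵢⱼ. Below is a minimal forward-Euler sketch of such a system; the discrete, delay, and stochastic HFPN extensions described in the abstract are omitted:

```python
def s_system_step(X, alpha, g, beta, h, dt):
    """One forward-Euler step of an S-system:
    dX_i/dt = alpha_i * prod_j X_j**g[i][j] - beta_i * prod_j X_j**h[i][j]."""
    def power_law(rate, exps):
        prod = rate
        for xj, e in zip(X, exps):
            prod *= xj ** e
        return prod
    return [x + dt * (power_law(alpha[i], g[i]) - power_law(beta[i], h[i]))
            for i, x in enumerate(X)]

# one-variable example: dX/dt = 2*X**0.5 - X, whose steady state is X = 4
X = [1.0]
for _ in range(20000):
    X = s_system_step(X, [2.0], [[0.5]], [1.0], [[1.0]], 0.01)
```

In a Petri-net realization, each power-law term becomes the speed of a continuous transition, which is what makes the GMA and S-system forms directly expressible in the HFPN framework.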

  1. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    NASA Astrophysics Data System (ADS)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  2. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  3. Novel approaches in function-driven single-cell genomics.

    PubMed

    Doud, Devin F R; Woyke, Tanja

    2017-07-01

    Deeper sequencing and improved bioinformatics in conjunction with single-cell and metagenomic approaches continue to illuminate undercharacterized environmental microbial communities. This has propelled the 'who is there, and what might they be doing' paradigm to the uncultivated and has already radically changed the topology of the tree of life and provided key insights into the microbial contribution to biogeochemistry. While characterization of 'who' based on marker genes can describe a large fraction of the community, answering 'what are they doing' remains the elusive pinnacle for microbiology. Function-driven single-cell genomics provides a solution by using a function-based screen to subsample complex microbial communities in a targeted manner for the isolation and genome sequencing of single cells. This enables single-cell sequencing to be focused on cells with specific phenotypic or metabolic characteristics of interest. Recovered genomes are conclusively implicated for both encoding and exhibiting the feature of interest, improving downstream annotation and revealing activity levels within that environment. This emerging approach has already improved our understanding of microbial community functioning and facilitated the experimental analysis of uncharacterized gene product space. Here we provide a comprehensive review of strategies that have been applied for function-driven single-cell genomics and the future directions we envision. © FEMS 2017.

  4. Novel approaches in function-driven single-cell genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doud, Devin F. R.; Woyke, Tanja

Deeper sequencing and improved bioinformatics in conjunction with single-cell and metagenomic approaches continue to illuminate undercharacterized environmental microbial communities. This has propelled the 'who is there, and what might they be doing' paradigm to the uncultivated and has already radically changed the topology of the tree of life and provided key insights into the microbial contribution to biogeochemistry. While characterization of 'who' based on marker genes can describe a large fraction of the community, answering 'what are they doing' remains the elusive pinnacle for microbiology. Function-driven single-cell genomics provides a solution by using a function-based screen to subsample complex microbial communities in a targeted manner for the isolation and genome sequencing of single cells. This enables single-cell sequencing to be focused on cells with specific phenotypic or metabolic characteristics of interest. Recovered genomes are conclusively implicated for both encoding and exhibiting the feature of interest, improving downstream annotation and revealing activity levels within that environment. This emerging approach has already improved our understanding of microbial community functioning and facilitated the experimental analysis of uncharacterized gene product space. Here we provide a comprehensive review of strategies that have been applied for function-driven single-cell genomics and the future directions we envision.

  5. Novel approaches in function-driven single-cell genomics

    DOE PAGES

    Doud, Devin F. R.; Woyke, Tanja

    2017-06-07

Deeper sequencing and improved bioinformatics in conjunction with single-cell and metagenomic approaches continue to illuminate undercharacterized environmental microbial communities. This has propelled the 'who is there, and what might they be doing' paradigm to the uncultivated and has already radically changed the topology of the tree of life and provided key insights into the microbial contribution to biogeochemistry. While characterization of 'who' based on marker genes can describe a large fraction of the community, answering 'what are they doing' remains the elusive pinnacle for microbiology. Function-driven single-cell genomics provides a solution by using a function-based screen to subsample complex microbial communities in a targeted manner for the isolation and genome sequencing of single cells. This enables single-cell sequencing to be focused on cells with specific phenotypic or metabolic characteristics of interest. Recovered genomes are conclusively implicated for both encoding and exhibiting the feature of interest, improving downstream annotation and revealing activity levels within that environment. This emerging approach has already improved our understanding of microbial community functioning and facilitated the experimental analysis of uncharacterized gene product space. Here we provide a comprehensive review of strategies that have been applied for function-driven single-cell genomics and the future directions we envision.

  6. A connectivity-based modeling approach for representing hysteresis in macroscopic two-phase flow properties

    DOE PAGES

    Cihan, Abdullah; Birkholzer, Jens; Trevisan, Luca; ...

    2014-12-31

During CO2 injection and storage in deep reservoirs, the injected CO2 enters an initially brine-saturated porous medium, and after the injection stops, natural groundwater flow eventually displaces the injected mobile-phase CO2, leaving behind residual non-wetting fluid. Accurate modeling of two-phase flow processes is needed for predicting the fate and transport of injected CO2, evaluating environmental risks, and designing more effective storage schemes. The entrapped non-wetting fluid saturation is typically a function of the spatially varying maximum saturation at the end of injection. At the pore scale, the distribution of void sizes and the connectivity of void space play a major role in the macroscopic hysteresis behavior and capillary entrapment of wetting and non-wetting fluids. This paper presents the development of an approach based on the connectivity of void space for modeling hysteretic capillary pressure-saturation-relative permeability relationships. The new approach uses the void-size distribution and a measure of void-space connectivity to compute the hysteretic constitutive functions and to predict entrapped fluid phase saturations. Two functions, the drainage connectivity function and the wetting connectivity function, are introduced to characterize the connectivity of fluids in void space during drainage and wetting processes. These functions can be estimated through pore-scale simulations in computer-generated porous media or from traditional experimental measurements of primary drainage and main wetting curves. The hysteresis model for saturation-capillary pressure is tested successfully by comparing the model-predicted residual saturation and scanning curves with actual data sets obtained from column experiments found in the literature. A numerical two-phase model simulator with the new hysteresis functions is tested against laboratory experiments conducted in a quasi-two-dimensional flow cell (91.4 cm × 5.6 cm × 61 cm), packed with homogeneous and
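The paper's connectivity-based functions predict entrapped saturation from the maximum historical saturation; a widely used empirical counterpart for that same relationship is Land's (1968) trapping model, sketched below. This is the classical model shown for context, not the connectivity approach of the paper:

```python
def land_residual(s_max, c):
    """Land (1968) trapping model: residual non-wetting saturation as a
    function of the maximum (historical) non-wetting saturation s_max,
    with Land trapping coefficient c fitted from data."""
    return s_max / (1.0 + c * s_max)

# the trapping coefficient follows from the maximum residual saturation
# s_gr_max reached after complete drainage (s_max = 1): c = 1/s_gr_max - 1
s_gr_max = 0.3                      # illustrative value
c = 1.0 / s_gr_max - 1.0
r_full = land_residual(1.0, c)      # equals s_gr_max by construction
r_half = land_residual(0.5, c)      # less trapping after partial drainage
```

Hysteretic simulators interpolate scanning curves between the drainage and imbibition branches using exactly this kind of s_max-to-residual mapping; the cited paper replaces the empirical fit with connectivity functions derived from the void-space structure.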

  7. QMEANclust: estimation of protein model quality by combining a composite scoring function with structural density information.

    PubMed

    Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce

    2009-05-20

    The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction both in template-based and ab initio approaches. Scoring functions have been developed which can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so called consensus methods have been shown to perform considerably better in selecting good candidate models, but tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function called QMEANclust achieves a correlation coefficient of predicted quality score and GDT_TS of 0.9 averaged over the 98 CASP7 targets and perform significantly better in selecting good models from the ensemble of server models than any other groups participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set consisting of 20 target proteins each with 300 alternatives models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations. 
We also present a local version of QMEAN for the per-residue estimation of model quality (QMEANlocal
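
The pre-filtering idea described above (rank models with a single-model score, then compute the structural consensus only against the top-ranked subset) can be sketched in a few lines. The scores, the pairwise similarity matrix, and the 50% retention fraction below are illustrative stand-ins, not values or code from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: single-model quality scores (QMEAN-like) for 10 decoys
# and a symmetric pairwise structural-similarity matrix (GDT_TS-like, in [0, 1]).
qmean = rng.uniform(0.3, 0.9, size=10)
sim = rng.uniform(0.2, 1.0, size=(10, 10))
sim = (sim + sim.T) / 2
np.fill_diagonal(sim, 1.0)

def consensus_scores(single_scores, similarity, keep_frac=0.5):
    """Score each model by its mean similarity to a pre-filtered reference set."""
    n = len(single_scores)
    n_keep = max(1, int(n * keep_frac))
    reference = np.argsort(single_scores)[::-1][:n_keep]  # top-ranked models only
    return similarity[:, reference].mean(axis=1)

best = int(np.argmax(consensus_scores(qmean, sim)))
```

The pre-filter is what guards against the failure mode mentioned above: decoys far from the dominant cluster cannot dominate the reference set if they score poorly individually.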

  8. An exemplar-based approach to individualized parcellation reveals the need for sex specific functional networks.

    PubMed

    Salehi, Mehraveh; Karbasi, Amin; Shen, Xilin; Scheinost, Dustin; Constable, R Todd

    2018-04-15

    Recent work with functional connectivity data has led to significant progress in understanding the functional organization of the brain. While the majority of the literature has focused on group-level parcellation approaches, there is ample evidence that the brain varies in both structure and function across individuals. In this work, we introduce a parcellation technique that incorporates delineation of functional networks both at the individual- and group-level. The proposed technique deploys the notion of "submodularity" to jointly parcellate the cerebral cortex while establishing an inclusive correspondence between the individualized functional networks. Using this parcellation technique, we successfully established a cross-validated predictive model that predicts individuals' sex, solely based on the parcellation schemes (i.e. the node-to-network assignment vectors). The sex prediction finding illustrates that individualized parcellation of functional networks can reveal subgroups in a population and suggests that the use of a global network parcellation may overlook fundamental differences in network organization. This is a particularly important point to consider in studies comparing patients versus controls or even patient subgroups. Network organization may differ between individuals and global configurations should not be assumed. This approach to the individualized study of functional organization in the brain has many implications for both neuroscience and clinical applications. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Advance Preparation in Task-Switching: Converging Evidence from Behavioral, Brain Activation, and Model-Based Approaches

    PubMed Central

    Karayanidis, Frini; Jamadar, Sharna; Ruge, Hannes; Phillips, Natalie; Heathcote, Andrew; Forstmann, Birte U.

    2010-01-01

    Recent research has taken advantage of the temporal and spatial resolution of event-related brain potentials (ERPs) and functional magnetic resonance imaging (fMRI) to identify the time course and neural circuitry of preparatory processes required to switch between different tasks. Here we overview some key findings contributing to understanding strategic processes in advance preparation. Findings from these methodologies are compatible with advance preparation conceptualized as a set of processes activated for both switch and repeat trials, but with substantial variability as a function of individual differences and task requirements. We then highlight new approaches that attempt to capitalize on this variability to link behavior and brain activation patterns. One approach examines correlations among behavioral, ERP and fMRI measures. A second “model-based” approach accounts for differences in preparatory processes by estimating quantitative model parameters that reflect latent psychological processes. We argue that integration of behavioral and neuroscientific methodologies is key to understanding the complex nature of advance preparation in task-switching. PMID:21833196

  10. Imputation approaches for animal movement modeling

    USGS Publications Warehouse

    Scharf, Henry; Hooten, Mevin B.; Johnson, Devin S.

    2017-01-01

    The analysis of telemetry data is common in animal ecological studies. While the collection of telemetry data for individual animals has improved dramatically, the methods to properly account for inherent uncertainties (e.g., measurement error, dependence, barriers to movement) have lagged behind. Still, many new statistical approaches have been developed to infer unknown quantities affecting animal movement or predict movement based on telemetry data. Hierarchical statistical models are useful to account for some of the aforementioned uncertainties, as well as provide population-level inference, but they often come with an increased computational burden. For certain types of statistical models, it is straightforward to provide inference if the latent true animal trajectory is known, but challenging otherwise. In these cases, approaches related to multiple imputation have been employed to account for the uncertainty associated with our knowledge of the latent trajectory. Despite the increasing use of imputation approaches for modeling animal movement, the general sensitivity and accuracy of these methods have not been explored in detail. We provide an introduction to animal movement modeling and describe how imputation approaches may be helpful for certain types of models. We also assess the performance of imputation approaches in two simulation studies. Our simulation studies suggest that inference for model parameters directly related to the location of an individual may be more accurate than inference for parameters associated with higher-order processes such as velocity or acceleration. Finally, we apply these methods to analyze a telemetry data set involving northern fur seals (Callorhinus ursinus) in the Bering Sea. Supplementary materials accompanying this paper appear online.
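
The imputation workflow described above (draw several plausible latent trajectories, fit the movement model to each, then pool the estimates) can be illustrated with a toy one-dimensional example. The noise levels, the posterior-draw stand-in, and the "speed" statistic are all hypothetical choices for illustration, not the authors' models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical telemetry: noisy 1-D locations observed at unit time steps.
true_path = np.cumsum(rng.normal(0, 1.0, size=50))
obs = true_path + rng.normal(0, 0.5, size=50)  # measurement error, sd = 0.5

def fit_speed(path):
    """Parameter of interest: mean absolute step length (a crude speed)."""
    return np.mean(np.abs(np.diff(path)))

# Multiple imputation: draw K plausible latent trajectories, fit each,
# then pool the estimates (a simplified version of Rubin's rules).
K = 20
estimates = []
for _ in range(K):
    imputed = obs + rng.normal(0, 0.5, size=obs.shape)  # stand-in for a posterior draw
    estimates.append(fit_speed(imputed))

pooled_mean = np.mean(estimates)
between_var = np.var(estimates, ddof=1)  # imputation uncertainty component
```

Because the speed statistic here is a higher-order (velocity-like) quantity, it inherits the measurement noise directly, which echoes the paper's finding that such parameters are harder to recover than location-level ones.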

  11. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasqualini, Donatella

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and the implementation of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.

  12. Interaction Models for Functional Regression.

    PubMed

    Usset, Joseph; Staicu, Ana-Maria; Maity, Arnab

    2016-02-01

    A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and lost prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data.
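
A minimal numerical sketch of a scalar-on-function regression with a two-way interaction: the integral terms are approximated by Riemann sums on a common grid, and a single ridge penalty stands in for the paper's penalized-spline and tensor-product machinery. All data and tuning values are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

n, p = 200, 30                      # samples, grid points per curve
t = np.linspace(0, 1, p)
X1 = rng.normal(size=(n, p))        # two functional predictors on a common grid
X2 = rng.normal(size=(n, p))

# Design matrix: main effects are Riemann-sum approximations of the integral
# terms; the interaction is the (flattened) outer product X1(s) * X2(t),
# mimicking a tensor-product representation of the surface coefficient.
dt = t[1] - t[0]
Z_main = np.hstack([X1 * dt, X2 * dt])
Z_int = np.einsum('is,it->ist', X1, X2).reshape(n, -1) * dt * dt
Z = np.hstack([np.ones((n, 1)), Z_main, Z_int])

beta_true = rng.normal(size=Z.shape[1]) * 0.1
y = Z @ beta_true + rng.normal(scale=0.1, size=n)

# Ridge fit as a crude stand-in for penalized-spline estimation.
lam = 1.0
coef = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
```

Dropping the `Z_int` columns from the design turns this into the additive model whose estimation and prediction losses the numerical studies quantify.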

  13. A Systematic Approach for Real-Time Operator Functional State Assessment

    NASA Technical Reports Server (NTRS)

    Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean; Li, Jiang; Li, Feng; McKenzie, Frederick

    2012-01-01

    A task overload condition often leads to high stress for an operator, causing performance degradation and possibly disastrous consequences. Just as dangerous, with automated flight systems, an operator may experience a task underload condition (during the en-route flight phase, for example), becoming easily bored and finding it difficult to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, the disengaged operator may neglect, misunderstand, or respond slowly or inappropriately to the situation. In this paper, we discuss an approach for Operator Functional State (OFS) monitoring in a typical aviation environment. A systematic ground truth finding procedure has been designed based on subjective evaluations, performance measures, and strong physiological indicators. The derived OFS ground truth is continuous in time, in contrast to the very sparse estimates of OFS based on expert review or subjective evaluations. It can capture the variations in OFS during a mission to better guide the training of the OFS assessment model. Furthermore, an OFS assessment model framework based on advanced machine learning techniques was designed, and the systematic approach was then verified and validated with experimental data collected in a high-fidelity Boeing 737 simulator. Preliminary results show highly accurate engagement/disengagement detection, making the approach suitable for real-time assessment of pilot engagement.

  14. A Functional Subnetwork Approach to Designing Synthetic Nervous Systems That Control Legged Robot Locomotion

    PubMed Central

    Szczecinski, Nicholas S.; Hunt, Alexander J.; Quinn, Roger D.

    2017-01-01

    A dynamical model of an animal’s nervous system, or synthetic nervous system (SNS), is a potentially transformational control method. Due to increasingly detailed data on the connectivity and dynamics of both mammalian and insect nervous systems, controlling a legged robot with an SNS is largely a problem of parameter tuning. Our approach to this problem is to design functional subnetworks that perform specific operations, and then assemble them into larger models of the nervous system. In this paper, we present networks that perform addition, subtraction, multiplication, division, differentiation, and integration of incoming signals. Parameters are set within each subnetwork to produce the desired output by utilizing the operating range of neural activity, R, the gain of the operation, k, and bounds based on biological values. The assembly of large networks from functional subnetworks underpins our recent results with MantisBot. PMID:28848419
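
As a rough illustration of a functional subnetwork, the leaky-integrator sketch below sets unit synaptic gains so that the output neuron's steady-state potential approximates the sum of two input activities. The neuron model and every parameter value are simplified assumptions for illustration, not the paper's tuned SNS networks:

```python
# Minimal non-spiking "addition" subnetwork: two input neurons drive one
# output neuron whose equilibrium potential approximates U1 + U2.
R = 20.0            # operating range of neural activity, as in the paper (illustrative)
Cm, Gm = 5.0, 1.0   # membrane capacitance and leak conductance (illustrative)
dt = 0.1

def simulate(U1, U2, steps=2000):
    U_out = 0.0
    for _ in range(steps):
        # Each synapse injects current proportional to presynaptic activity
        # (gain k = 1 per input); at equilibrium the leak current Gm * U_out
        # balances the summed drive, so U_out -> U1 + U2.
        I_syn = Gm * U1 + Gm * U2
        U_out += dt / Cm * (I_syn - Gm * U_out)
    return U_out

print(round(simulate(5.0, 7.0), 2))  # approaches 12.0
```

Subtraction follows the same pattern with one inhibitory synapse, and the dynamic operations (differentiation, integration) come from choosing membrane time constants rather than gains.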

  15. Application of various FLD modelling approaches

    NASA Astrophysics Data System (ADS)

    Banabic, D.; Aretz, H.; Paraianu, L.; Jurco, P.

    2005-07-01

    This paper focuses on a comparison between different modelling approaches to predict the forming limit diagram (FLD) for sheet metal forming under a linear strain path using the recently introduced orthotropic yield criterion BBC2003 (Banabic D et al 2005 Int. J. Plasticity 21 493-512). The FLD models considered here are a finite element based approach, the well known Marciniak-Kuczynski model, the modified maximum force criterion according to Hora et al (1996 Proc. Numisheet'96 Conf. (Dearborn/Michigan) pp 252-6), Swift's diffuse (Swift H W 1952 J. Mech. Phys. Solids 1 1-18) and Hill's classical localized necking approach (Hill R 1952 J. Mech. Phys. Solids 1 19-30). The FLD of an AA5182-O aluminium sheet alloy has been determined experimentally in order to quantify the predictive capabilities of the models mentioned above.

  16. A functional approach to emotion in autonomous systems.

    PubMed

    Sanz, Ricardo; Hernández, Carlos; Gómez, Jaime; Hernando, Adolfo

    2010-01-01

    The construction of fully effective systems seems to depend on the proper exploitation of goal-centric self-evaluative capabilities that let the system teleologically self-manage. Emotions seem to provide this kind of functionality to biological systems; hence the interest in emotion for sustaining function in artificial systems performing in changing and uncertain environments, far beyond the media hullabaloo of displaying human-like emotion-laden faces in robots. This chapter provides a brief analysis of the scientific theories of emotion and presents an engineering approach for developing technology for robust autonomy by implementing functionality inspired by that of biological emotions.

  17. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    NASA Astrophysics Data System (ADS)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging and in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG) threshold. The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes.

    Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, temporal variability in CH4 emissions varied by an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and the model results (single horizontally homogeneous peat column). This approach should be favoured over the two other, more widely used ebullition modelling approaches, and researchers are encouraged to implement it into their CH4 emission models.
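
The concentration-threshold (ECT) variant, the simplest of the three schemes, can be sketched in a few lines: CH4 produced in a peat layer accumulates in pore water, and any excess above a fixed threshold is released as a bubble flux. The threshold and production rate below are illustrative numbers, not calibrated values from the study:

```python
# Sketch of a concentration-threshold (ECT) ebullition scheme.
THRESHOLD = 0.5     # mol CH4 m^-3 pore water (hypothetical)
production = 0.02   # mol m^-3 per time step (hypothetical)

conc = 0.0
ebullition_flux = []
for step in range(100):
    conc += production
    release = max(0.0, conc - THRESHOLD)  # bubbling removes only the excess
    conc -= release
    ebullition_flux.append(release)
```

The pulsed, on/off character of the resulting flux series is exactly the kind of temporal variability that differs between the ECT, EPT, and EBG formulations even when the annual totals agree.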

  18. Minimally invasive functional approach for cholesteatoma surgery.

    PubMed

    Hanna, Bassem M; Kivekäs, Ilkka; Wu, Yi-Hsuan; Guo, Lee J; Lin, Huang; Guidi, Jessica; Poe, Dennis

    2014-10-01

    Report the efficacy of a functional minimally invasive approach for cholesteatoma surgery. Retrospective review of surgical cases performed between 1996 and 2008. One hundred sixty-nine patient charts were reviewed in which ears with primary cholesteatomas that extended beyond the mesotympanum were operated on with a plan for canal wall up (CWU) mastoidectomy. The surgical approach consisted of progressive exposure from transcanal to postauricular tympanoplasty to CWU mastoidectomy, as needed, to identify and lyse the fibrous attachments that bind the capsule to the surrounding mucosa. Endoscopic guidance was employed as appropriate to minimize exposure needs. Any planned second-stage operations were attempted with a transcanal approach if appropriate and with endoscopic assistance. One hundred eighty-four ears of 169 patients were included. The median age was 32 years (range, 1-79 years). The mean follow-up was 3.2 years (range, 1-11 years). Eighty-three (45%) were planned for a second-look operation, and three (2%) required unplanned second operations. The overall recurrence rate was 24/184 (13%), and the unexpected residual rate was 5/184 (3%). The residual rates with endoscopy (5/119; 4%) and without endoscopy (1/65; 2%) were not significantly different. Hearing results in 156 ears improved significantly, from a preoperative pure-tone average (PTA) of 41 dB to a postoperative PTA of 29 dB (P < .0001). A functional minimally invasive approach to cholesteatoma surgery provided equivalent residual rates but higher recurrence rates compared with published canal wall down mastoidectomy results. Endoscopic techniques were helpful in providing adequate views while minimizing exposure. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  19. Polaron mobility obtained by a variational approach for lattice Fröhlich models

    NASA Astrophysics Data System (ADS)

    Kornjača, Milan; Vukmirović, Nenad

    2018-04-01

    Charge carrier mobility for a class of lattice models with long-range electron-phonon interaction was investigated. The approach for mobility calculation is based on a suitably chosen unitary transformation of the model Hamiltonian which transforms it into a form where the remaining interaction part can be treated as a perturbation. Relevant spectral functions were then obtained using the Matsubara Green's functions technique, and charge carrier mobility was evaluated using Kubo's linear response formula. Numerical results were presented for a wide range of electron-phonon interaction strengths and temperatures for the one-dimensional version of the model. The results indicate that the mobility decreases with increasing temperature for all electron-phonon interaction strengths in the investigated range, while a longer interaction range leads to more mobile carriers.

  20. New approaches to probing Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Munshi, D.; Smidt, J.; Cooray, A.; Renzi, A.; Heavens, A.; Coles, P.

    2013-10-01

    We generalize the concept of the ordinary skew-spectrum to probe the effect of non-Gaussianity on the morphology of cosmic microwave background (CMB) maps in several domains: in real space (where they are commonly known as cumulant-correlators), and in harmonic and needlet bases. The essential aim is to retain more information than normally contained in these statistics, in order to assist in determining the source of any measured non-Gaussianity, in the same spirit in which the Munshi & Heavens skew-spectra were used to identify foreground contaminants to the CMB bispectrum in Planck data. Using a perturbative series to construct the Minkowski functionals (MFs), we provide a pseudo-C_ℓ based approach in both harmonic and needlet representations to estimate these spectra in the presence of a mask and inhomogeneous noise. Assuming homogeneous noise, we present approximate expressions for the error covariance for the purpose of joint estimation of these spectra. We present specific results for four different models of primordial non-Gaussianity (local, equilateral, orthogonal and enfolded), as well as non-Gaussianity caused by unsubtracted point sources. Closed-form results for next-order corrections to the MFs are also obtained in terms of a quadruplet of kurt-spectra. We also use the method of modal decomposition of the bispectrum and trispectrum to reconstruct the MFs, as an alternative route to the morphological properties of CMB maps. Finally, we introduce the odd-parity skew-spectra to probe the odd-parity bispectrum and its impact on the morphology of the CMB sky. Although developed for the CMB, the generic results obtained here can be useful in other areas of cosmology.

  1. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.

    PubMed

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf

    2018-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.

  2. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models

    PubMed Central

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S.; Wu, Xiaowei; Müller, Rolf

    2017-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design. PMID:29749977

  3. Investigating different approaches to develop informative priors in hierarchical Bayesian safety performance functions.

    PubMed

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-07-01

    The Bayesian inference method has been frequently adopted to develop safety performance functions. One advantage of the Bayesian inference is that prior information for the independent variables can be included in the inference procedures. However, few studies have discussed how to formulate informative priors for the independent variables and evaluated the effects of incorporating informative priors in developing safety performance functions. This paper addresses this deficiency by introducing four approaches for developing informative priors for the independent variables based on historical data and expert experience. Merits of these informative priors have been tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). Deviance information criterion (DIC), R-square values, and coefficients of variation for the estimates were utilized as evaluation measures to select the best model(s). Comparison across the models indicated that the Poisson-gamma model is superior, with a better model fit, and is much more robust with informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracy. Furthermore, informative priors for the inverse dispersion parameter have also been introduced and tested. The effects of different types of informative priors on model estimation and goodness-of-fit have been compared and summarized. Finally, based on the results, recommendations for future research topics and study applications have been made. Copyright © 2013 Elsevier Ltd. All rights reserved.
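
The two-stage updating idea (use the posterior from historical data as the informative prior for the current data) is easiest to see with a conjugate Poisson-gamma pair, the model family the paper favours. The counts and hyperparameters below are synthetic illustrations, not the paper's data or exact procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-stage Bayesian updating with a conjugate Poisson-gamma model:
# the stage-1 posterior becomes the informative prior for stage 2.
alpha0, beta0 = 1.0, 1.0                   # vague initial prior Gamma(1, 1)

historical = rng.poisson(4.0, size=50)     # stage 1: historical crash counts
alpha1 = alpha0 + historical.sum()         # informative prior = stage-1 posterior
beta1 = beta0 + len(historical)

current = rng.poisson(4.0, size=10)        # stage 2: current-period counts
alpha2 = alpha1 + current.sum()
beta2 = beta1 + len(current)

posterior_mean = alpha2 / beta2            # should sit near the true rate of 4.0
```

With conjugacy the updating is closed form; the hierarchical models in the paper require MCMC, but the role of the informative prior is the same.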

  4. Anger, Fear, Uncertainty, and Attitudes: A Test of the Cognitive-Functional Model.

    ERIC Educational Resources Information Center

    Nabi, Robin L.

    2002-01-01

    Explains that the cognitive-functional model of discrete negative emotions and attitude change attempts to bridge the theoretical gap between "emotional" and "rational" approaches to persuasion by focusing on how emotions motivate attention to and processing of persuasive messages. Explores the effects of two emotions, anger and fear, and two levels…

  5. Does Surgical Approach Affect Patient-reported Function After Primary THA?

    PubMed

    Graves, Sara C; Dropkin, Benjamin M; Keeney, Benjamin J; Lurie, Jon D; Tomek, Ivan M

    2016-04-01

    Total hip arthroplasty (THA) relieves pain and improves physical function in patients with hip osteoarthritis, but requires a year or more for full postoperative recovery. Proponents of intermuscular surgical approaches believe that the direct-anterior approach may restore physical function more quickly than transgluteal approaches, perhaps because of diminished muscle trauma. To evaluate this, we compared patient-reported physical function and other outcome metrics during the first year after surgery between groups of patients who underwent primary THA either through the direct-anterior approach or posterior approach. We asked: (1) Is a primary THA using a direct-anterior approach associated with better patient-reported physical function at early postoperative times (1 and 3 months) compared with a THA performed through the posterior approach? (2) Is the direct-anterior approach THA associated with shorter operative times and higher rates of noninstitutional discharge than a posterior approach THA? Between October 2008 and February 2010, an arthroplasty fellowship-trained surgeon performed 135 THAs. All 135 were performed using the posterior approach. During that period, we used this approach when patients had any moderate to severe degenerative joint disease of the hip attributable to any type of arthritis refractory to nonoperative treatment measures. Of the patients who were treated with this approach, 21 (17%; 23 hips) were lost to followup, whereas 109 (83%; 112 hips) were available for followup at 1 year. Between February and September 2011, the same surgeon performed 86 THAs. All 86 were performed using the direct-anterior approach. During that period, we used this approach for patients with any type of moderate to severe degenerative joint disease in whom nonoperative treatment measures had failed. Of the patients who were treated with this approach, 35 (41%; 35 hips) were lost to followup, whereas 51 (59%; 51 hips) were available for followup at 1 year. THAs

  6. Air Pollution and Lung Function in Dutch Children: A Comparison of Exposure Estimates and Associations Based on Land Use Regression and Dispersion Exposure Modeling Approaches

    PubMed Central

    Gehring, Ulrike; Hoek, Gerard; Keuken, Menno; Jonkers, Sander; Beelen, Rob; Eeftens, Marloes; Postma, Dirkje S.; Brunekreef, Bert

    2015-01-01

    2015. Air pollution and lung function in Dutch children: a comparison of exposure estimates and associations based on land use regression and dispersion exposure modeling approaches. Environ Health Perspect 123:847–851; http://dx.doi.org/10.1289/ehp.1408541 PMID:25839747

  7. A Unified Approach to Modeling Multidisciplinary Interactions

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Bhatia, Kumar G.

    2000-01-01

    There are a number of existing methods to transfer information among various disciplines. For a multidisciplinary application with n disciplines, the traditional methods may be required to model (n^2 - n) interactions. This paper presents a unified three-dimensional approach that reduces the number of interactions from (n^2 - n) to 2n by using a computer-aided design model. The proposed modeling approach unifies the interactions among various disciplines. The approach is independent of specific discipline implementation, and a number of existing methods can be reformulated in the context of the proposed unified approach. This paper provides an overview of the proposed unified approach and reformulations for two existing methods. The unified approach is specially tailored for application environments where the geometry is created and managed through a computer-aided design system. Results are presented for a blended-wing body and a high-speed civil transport.
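
The interface counts behind the reduction from (n^2 - n) to 2n can be checked directly: pairwise coupling needs one transfer per ordered discipline pair, while routing everything through a central CAD model needs only one transfer to and one from each discipline:

```python
def pairwise_interactions(n):
    """One transfer per ordered pair of distinct disciplines: n^2 - n."""
    return n * n - n

def hub_interactions(n):
    """One transfer to and one from the shared CAD model per discipline: 2n."""
    return 2 * n

for n in (3, 5, 10):
    print(n, pairwise_interactions(n), hub_interactions(n))
# With 10 disciplines: 90 pairwise transfers versus 20 through the hub.
```

The hub scheme wins as soon as n exceeds 3, and the gap grows quadratically, which is why the savings matter most for large multidisciplinary applications.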

  8. Influence Function Learning in Information Diffusion Networks.

    PubMed

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2014-06-01

    Can we learn the influence of a set of people in a social network from cascades of information diffusion? This question is often addressed by a two-stage approach: first learn a diffusion model, and then calculate the influence based on the learned model. Thus, the success of this approach relies heavily on the correctness of the diffusion model which is hard to verify for real world data. In this paper, we exploit the insight that the influence functions in many diffusion models are coverage functions, and propose a novel parameterization of such functions using a convex combination of random basis functions. Moreover, we propose an efficient maximum likelihood based algorithm to learn such functions directly from cascade data, and hence bypass the need to specify a particular diffusion model in advance. We provide both theoretical and empirical analysis for our approach, showing that the proposed approach can provably learn the influence function with low sample complexity, be robust to the unknown diffusion models, and significantly outperform existing approaches in both synthetic and real world data.
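
The coverage-function parameterization can be sketched as follows: each basis function counts the nodes reachable from a seed set under one sampled reachability pattern, and influence is a convex combination of those counts. The random patterns and the uniform weights below are placeholders for the quantities the paper learns by maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(4)

n_nodes, n_basis = 8, 50

# Each basis function uses one random boolean reachability pattern:
# entry [s, j] says whether seed s reaches target j (purely illustrative).
reach = rng.random((n_basis, n_nodes, n_nodes)) < 0.3

def basis_coverage(S, pattern):
    """Coverage under one pattern: number of nodes reached from seed set S."""
    covered = np.zeros(n_nodes, dtype=bool)
    for s in S:
        covered |= pattern[s]
    return covered.sum()

# Influence = convex combination of the basis coverage functions;
# the learned weights are replaced here by a uniform combination.
weights = np.full(n_basis, 1.0 / n_basis)

def influence(S):
    return sum(w * basis_coverage(S, reach[b]) for b, w in enumerate(weights))

print(influence({0, 1}) >= influence({0}))  # coverage functions are monotone: True
```

Because every basis function is a coverage function, the combination inherits monotonicity and submodularity regardless of the weights, which is what makes the parameterization robust to the unknown diffusion model.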

  9. Influence Function Learning in Information Diffusion Networks

    PubMed Central

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2015-01-01

    Can we learn the influence of a set of people in a social network from cascades of information diffusion? This question is often addressed by a two-stage approach: first learn a diffusion model, and then calculate the influence based on the learned model. Thus, the success of this approach relies heavily on the correctness of the diffusion model which is hard to verify for real world data. In this paper, we exploit the insight that the influence functions in many diffusion models are coverage functions, and propose a novel parameterization of such functions using a convex combination of random basis functions. Moreover, we propose an efficient maximum likelihood based algorithm to learn such functions directly from cascade data, and hence bypass the need to specify a particular diffusion model in advance. We provide both theoretical and empirical analysis for our approach, showing that the proposed approach can provably learn the influence function with low sample complexity, be robust to the unknown diffusion models, and significantly outperform existing approaches in both synthetic and real world data. PMID:25973445

  10. Volumetric brain magnetic resonance imaging predicts functioning in bipolar disorder: A machine learning approach.

    PubMed

    Sartori, Juliana M; Reckziegel, Ramiro; Passos, Ives Cavalcante; Czepielewski, Leticia S; Fijtman, Adam; Sodré, Leonardo A; Massuda, Raffael; Goi, Pedro D; Vianna-Sulzbach, Miréia; Cardoso, Taiane de Azevedo; Kapczinski, Flávio; Mwangi, Benson; Gama, Clarissa S

    2018-08-01

    Neuroimaging studies have been steadily explored in Bipolar Disorder (BD) over the last decades. Neuroanatomical changes tend to be more pronounced in patients with repeated episodes. Although the role of such changes in cognition and memory is well established, daily-life functioning impairments stand out among the consequences of the proposed progression. The objective of this study was to analyze MRI volumetric modifications in BD and healthy controls (HC) as possible predictors of daily-life functioning through a machine learning approach. Ninety-four participants (35 DSM-IV BD type I and 59 HC) underwent clinical and functioning assessments, and structural MRI. Functioning was assessed using the Functioning Assessment Short Test (FAST). The machine learning analysis was used to identify candidate regional brain volumes that could predict functioning status, through a support vector regression algorithm. Patients with BD and HC did not differ in age, education and marital status. There were significant differences between groups in gender, BMI, FAST score, and employment status. There was a significant correlation between observed and predicted FAST scores for patients with BD, but not for controls. According to the model, the brain structure volumes that could predict FAST scores were: left superior frontal cortex, left rostral medial frontal cortex, right white matter total volume and right lateral ventricle volume. The machine learning approach demonstrated that brain volume changes in MRI were predictors of FAST score in patients with BD and could identify specific brain areas related to functioning impairment. Copyright © 2018 Elsevier Ltd. All rights reserved.
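
    As a rough illustration of predicting a functioning score from regional volumes, the sketch below fits a linear model with ridge regularization to synthetic data. The study itself used support vector regression; ridge regression is a deliberately simpler stand-in, and all variable names and numbers here are hypothetical.

```python
import numpy as np

def fit_ridge(X, y, lam=1.0):
    """Fit y ≈ X w + b with an L2 penalty; a simple stand-in for the
    support vector regression used in the study."""
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])  # append intercept column
    A = X1.T @ X1 + lam * np.eye(X1.shape[1])
    return np.linalg.solve(A, X1.T @ y)

def predict(X, w):
    """Apply the fitted weights (last entry is the intercept)."""
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])
    return X1 @ w
```

    In the paper's setting, the rows of `X` would be per-subject regional volumes and `y` the observed FAST scores; the correlation between `predict(X, w)` and `y` on held-out subjects is then the quantity of interest.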

  11. Determining the mechanical constitutive properties of metals as a function of strain rate and temperature: A combined experimental and modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    I. M. Robertson; A. Beaudoin; J. Lambros

    2004-01-05

    OAK-135 Development and validation of constitutive models for polycrystalline materials subjected to high strain rate loading over a range of temperatures are needed to predict the response of engineering materials to in-service type conditions (foreign object damage, high-strain rate forging, high-speed sheet forming, deformation behavior during forming, response to extreme conditions, etc.). Accurately accounting for the complex effects that can occur during extreme and variable loading conditions requires significant and detailed computational and modeling efforts. These efforts must be closely coupled with precise and targeted experimental measurements that not only verify the predictions of the models, but also provide input about the fundamental processes responsible for the macroscopic response. Achieving this coupling between modeling and experimentation is the guiding principle of this program. Specifically, this program seeks to bridge the length scale between discrete dislocation interactions with grain boundaries and continuum models for polycrystalline plasticity. Achieving this goal requires incorporating these complex dislocation-interface interactions into the well-defined behavior of single crystals. Despite the widespread study of metal plasticity, this aspect is not well understood for simple loading conditions, let alone extreme ones. Our experimental approach includes determining the high-strain rate response as a function of strain and temperature with post-mortem characterization of the microstructure, quasi-static testing of pre-deformed material, and direct observation of the dislocation behavior during reloading by using the in situ transmission electron microscope deformation technique. These experiments will provide the basis for development and validation of physically-based constitutive models, which will include dislocation-grain boundary interactions for polycrystalline systems. One aspect of the program will involve the

  12. Function Model for Community Health Service Information

    NASA Astrophysics Data System (ADS)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify the information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information, comprising 4 super-classes, 15 classes and 28 sub-classes of business functions, 43 business processes and 168 business activities, was then established. This model can facilitate information management system development and workflow refinement.
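
    The top-down coding of an IDEF0 decomposition can be illustrated with a short helper that labels a function tree the way IDEF0 numbers activities (A0 at the top, A1..An below, then A11, A12, ...). The tree contents below are invented placeholders, not the paper's actual CHS classes.

```python
def assign_idef0_codes(name, children, code="A0", codes=None):
    """Label a function-decomposition tree with IDEF0-style activity codes.
    `children` is a dict mapping sub-function names to their own child dicts."""
    if codes is None:
        codes = {}
    codes[name] = code
    prefix = "A" if code == "A0" else code  # children of A0 are A1, A2, ...
    for i, (child, sub) in enumerate(children.items(), start=1):
        assign_idef0_codes(child, sub, prefix + str(i), codes)
    return codes
```
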

  13. Elucidating the functional relationship between working memory capacity and psychometric intelligence: a fixed-links modeling approach for experimental repeated-measures designs.

    PubMed

    Thomas, Philipp; Rammsayer, Thomas; Schweizer, Karl; Troche, Stefan

    2015-01-01

    Numerous studies reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ in respect to how close these two constructs are related to each other. In the present study, we used a WMC task with five levels of task demands to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow for representing these two kinds of processes by two independent latent variables. In contrast to traditional CFA where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise or purified representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data of 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable which reflected processes that varied as a function of experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) as well as the constant processes involved in the task (β = .45) were related to Gf. Taken

  14. Towards new approaches in phenological modelling

    NASA Astrophysics Data System (ADS)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now, there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised against observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). The limiting factor for a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes of metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr.) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. Our suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
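
    The temperature-sum (growing degree-day) forcing model mentioned above is easy to state in code: accumulate daily warmth above a base temperature until a forcing requirement is met. The base temperature and requirement below are illustrative values, not calibrated parameters from the study.

```python
def growing_degree_days(daily_mean_temps, base=5.0):
    """Forcing accumulation: sum of daily mean temperature excess over a base."""
    return sum(max(t - base, 0.0) for t in daily_mean_temps)

def predict_stage_day(daily_mean_temps, forcing_requirement, base=5.0):
    """First day index on which accumulated forcing meets the requirement,
    or None if the requirement is never met."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps):
        total += max(t - base, 0.0)
        if total >= forcing_requirement:
            return day
    return None
```

    A semi-mechanistic model of the kind criticized in the abstract would add a chilling-sum gate before this forcing phase; the metabolomic program described here aims to replace such fitted thresholds with measured physiology.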

  15. An integrative approach to ortholog prediction for disease-focused and other functional studies.

    PubMed

    Hu, Yanhui; Flockhart, Ian; Vinayagam, Arunachalam; Bergwitz, Clemens; Berger, Bonnie; Perrimon, Norbert; Mohr, Stephanie E

    2011-08-31

    Mapping of orthologous genes among species serves an important role in functional genomics by allowing researchers to develop hypotheses about gene function in one species based on what is known about the functions of orthologs in other species. Several tools for predicting orthologous gene relationships are available. However, these tools can give different results and identification of predicted orthologs is not always straightforward. We report a simple but effective tool, the Drosophila RNAi Screening Center Integrative Ortholog Prediction Tool (DIOPT; http://www.flyrnai.org/diopt), for rapid identification of orthologs. DIOPT integrates existing approaches, facilitating rapid identification of orthologs among human, mouse, zebrafish, C. elegans, Drosophila, and S. cerevisiae. As compared to individual tools, DIOPT shows increased sensitivity with only a modest decrease in specificity. Moreover, the flexibility built into the DIOPT graphical user interface allows researchers with different goals to appropriately 'cast a wide net' or limit results to highest confidence predictions. DIOPT also displays protein and domain alignments, including percent amino acid identity, for predicted ortholog pairs. This helps users identify the most appropriate matches among multiple possible orthologs. To facilitate using model organisms for functional analysis of human disease-associated genes, we used DIOPT to predict high-confidence orthologs of disease genes in Online Mendelian Inheritance in Man (OMIM) and genes in genome-wide association study (GWAS) data sets. The results are accessible through the DIOPT diseases and traits query tool (DIOPT-DIST; http://www.flyrnai.org/diopt-dist). DIOPT and DIOPT-DIST are useful resources for researchers working with model organisms, especially those who are interested in exploiting model organisms such as Drosophila to study the functions of human disease genes.
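
    DIOPT's core idea of integrating existing prediction tools can be caricatured as vote counting across predictors: a pair reported by more tools gets a higher confidence score. The tool names and gene identifiers below are made up, and the real DIOPT scoring is considerably more elaborate.

```python
from collections import Counter

def integrate_predictions(pairs_by_tool):
    """Count, for each (query gene, candidate ortholog) pair, how many
    prediction tools report it; the count serves as a confidence score."""
    return Counter(p for preds in pairs_by_tool.values() for p in preds)

def best_matches(votes, gene):
    """Highest-confidence orthologs of `gene`: candidates tied at the top
    vote count, mirroring a 'highest confidence only' filter."""
    candidates = {pair: v for pair, v in votes.items() if pair[0] == gene}
    if not candidates:
        return []
    top = max(candidates.values())
    return sorted(pair[1] for pair, v in candidates.items() if v == top)
```
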

  16. A modular approach for item response theory modeling with the R package flirt.

    PubMed

    Jeon, Minjeong; Rijmen, Frank

    2016-06-01

    The new R package flirt is introduced for flexible item response theory (IRT) modeling of psychological, educational, and behavior assessment data. flirt integrates a generalized linear and nonlinear mixed modeling framework with graphical model theory. The graphical model framework allows for efficient maximum likelihood estimation. The key feature of flirt is its modular approach to facilitate convenient and flexible model specifications. Researchers can construct customized IRT models by simply selecting various modeling modules, such as parametric forms, number of dimensions, item and person covariates, person groups, link functions, etc. In this paper, we describe major features of flirt and provide examples to illustrate how flirt works in practice.
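
    flirt's modular idea, assembling an IRT model from a parametric form plus a chosen link function, can be sketched outside R. This is not flirt's API, just a toy Python illustration: item parameters a (discrimination), b (difficulty) and c (guessing) select among 1PL/2PL/3PL-style forms, and the link is swappable.

```python
import math

LINKS = {
    "logit": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "probit": lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))),
}

def irt_prob(theta, item, link="logit"):
    """Modular response probability: defaults a=1, c=0 give a Rasch (1PL)
    model; supplying a and c yields 2PL/3PL forms under the chosen link."""
    a = item.get("a", 1.0)
    c = item.get("c", 0.0)
    p = LINKS[link](a * (theta - item["b"]))
    return c + (1.0 - c) * p
```
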

  17. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered, including 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) the Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed. One is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; and the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions of water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility for formulating proper objective functions and constraints for various optimization problems. On the other hand, it has been demonstrated that surrogate modeling approaches can pave the way for such incorporation in real
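
    A minimal response-surface surrogate loop in the spirit of the DYCORS/SOIM approaches above might look like the following, with a Gaussian RBF interpolant standing in for the expensive integrated hydrological model. All function names and settings are illustrative assumptions, not the algorithms as published.

```python
import numpy as np

def rbf_surrogate(X, y, eps=1.0):
    """Gaussian RBF interpolant through sampled (design, response) pairs,
    acting as the cheap response surface for an expensive model."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.linalg.solve(np.exp(-eps * d2) + 1e-10 * np.eye(len(X)), y)
    return lambda x: np.exp(-eps * ((X - x) ** 2).sum(-1)) @ w

def surrogate_search(expensive_f, bounds, n_init=12, n_iter=10, seed=0):
    """Surrogate-assisted minimization: refit the response surface each round
    and spend the expensive evaluation only on the most promising candidate."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))
    y = np.array([expensive_f(x) for x in X])
    for _ in range(n_iter):
        surrogate = rbf_surrogate(X, y)
        candidates = rng.uniform(lo, hi, size=(200, len(lo)))
        x_new = min(candidates, key=surrogate)   # cheap screening step
        X = np.vstack([X, x_new])
        y = np.append(y, expensive_f(x_new))     # one expensive run per round
    best = int(np.argmin(y))
    return X[best], y[best]
```

    The design choice this illustrates is the one emphasized in the abstract: hundreds of candidate decisions are screened on the surrogate, while the IHM itself is run only a handful of times.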

  18. Functional model of biological neural networks.

    PubMed

    Lo, James Ting-Ho

    2010-12-01

    A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieving, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations for neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justifications of these functional models and their processing operations are required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  19. A Comparison of Two-Stage Approaches for Fitting Nonlinear Ordinary Differential Equation (ODE) Models with Mixed Effects

    PubMed Central

    Chow, Sy-Miin; Bendezú, Jason J.; Cole, Pamela M.; Ram, Nilam

    2016-01-01

    Several approaches currently exist for estimating the derivatives of observed data for model exploration purposes, including functional data analysis (FDA), generalized local linear approximation (GLLA), and generalized orthogonal local derivative approximation (GOLD). These derivative estimation procedures can be used in a two-stage process to fit mixed effects ordinary differential equation (ODE) models. While the performance and utility of these routines for estimating linear ODEs have been established, they have not yet been evaluated in the context of nonlinear ODEs with mixed effects. We compared properties of the GLLA and GOLD to an FDA-based two-stage approach denoted herein as functional ordinary differential equation with mixed effects (FODEmixed) in a Monte Carlo study using a nonlinear coupled oscillators model with mixed effects. Simulation results showed that overall, the FODEmixed outperformed both the GLLA and GOLD across all the embedding dimensions considered, but a novel use of a fourth-order GLLA approach combined with very high embedding dimensions yielded estimation results that almost paralleled those from the FODEmixed. We discuss the strengths and limitations of each approach and demonstrate how output from each stage of FODEmixed may be used to inform empirical modeling of young children’s self-regulation. PMID:27391255
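
    The GLLA derivative-estimation step of such two-stage approaches can be sketched directly: build a time-delay embedding of the series and project it onto fixed polynomial weights, so each embedding row yields estimates of position, first derivative, and so on. The helper below assumes evenly spaced observations and is a simplified reading of GLLA, not the authors' code.

```python
import math
import numpy as np

def glla_derivatives(x, dt=1.0, embed=5, order=2):
    """Generalized local linear approximation (simplified): estimate
    derivatives up to `order` from a time-delay embedding of series x.
    Column k of the result holds the k-th derivative estimates."""
    n = len(x) - embed + 1
    X = np.stack([x[i:i + embed] for i in range(n)])       # embedding matrix
    t = (np.arange(embed) - (embed - 1) / 2.0) * dt        # centred time offsets
    W = np.stack([t ** k / math.factorial(k) for k in range(order + 1)], axis=1)
    L = W @ np.linalg.inv(W.T @ W)                         # fixed projection weights
    return X @ L
```

    In the two-stage workflow of the abstract, these estimated derivatives would then be regressed on the hypothesized ODE's right-hand side, with mixed effects capturing person-level variation.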

  20. An Approach toward the Development of a Functional Encoding Model of Short Term Memory during Reading.

    ERIC Educational Resources Information Center

    Herndon, Mary Anne

    1978-01-01

    In a model of the functioning of short term memory, the encoding of information for subsequent storage in long term memory is simulated. In the encoding process, semantically equivalent paragraphs are detected for recombination into a macro information unit. (HOD)

  1. Electronic Excitations in Solution: The Interplay between State Specific Approaches and a Time-Dependent Density Functional Theory Description.

    PubMed

    Guido, Ciro A; Jacquemin, Denis; Adamo, Carlo; Mennucci, Benedetta

    2015-12-08

    We critically analyze the performances of continuum solvation models when coupled to time-dependent density functional theory (TD-DFT) to predict solvent effects on both absorption and emission energies of chromophores in solution. Different polarization schemes of the polarizable continuum model (PCM), such as linear response (LR) and three different state specific (SS) approaches, are considered and compared. We show the necessity of introducing a SS model in cases where large electron density rearrangements are involved in the excitations, such as charge-transfer transitions in both twisted and quadrupolar compounds, and underline the very delicate interplay between the selected polarization method and the chosen exchange-correlation functional. This interplay originates in the different descriptions of the transition and ground/excited state multipolar moments by the different functionals. As a result, the choice of both the DFT functional and the solvent polarization scheme has to be consistent with the nature of the studied electronic excitation.

  2. First-Principles Approach to Model Electrochemical Reactions: Understanding the Fundamental Mechanisms behind Mg Corrosion

    NASA Astrophysics Data System (ADS)

    Surendralal, Sudarsan; Todorova, Mira; Finnis, Michael W.; Neugebauer, Jörg

    2018-06-01

    Combining concepts of semiconductor physics and corrosion science, we develop a novel approach that allows us to perform ab initio calculations under controlled potentiostat conditions for electrochemical systems. The proposed approach can be straightforwardly applied in standard density functional theory codes. To demonstrate the performance and the opportunities opened by this approach, we study the chemical reactions that take place during initial corrosion at the water-Mg interface under anodic polarization. Based on this insight, we derive an atomistic model that explains the origin of the anodic hydrogen evolution.

  3. Improved Model Fitting for the Empirical Green's Function Approach Using Hierarchical Models

    NASA Astrophysics Data System (ADS)

    Van Houtte, Chris; Denolle, Marine

    2018-04-01

    Stress drops calculated from source spectral studies currently show larger variability than what is implied by empirical ground motion models. One of the potential origins of the inflated variability is the simplified model-fitting techniques used in most source spectral studies. This study examines a variety of model-fitting methods and shows that the choice of method can explain some of the discrepancy. The preferred method is Bayesian hierarchical modeling, which can reduce bias, better quantify uncertainties, and allow additional effects to be resolved. Two case study earthquakes are examined, the 2016 MW7.1 Kumamoto, Japan earthquake and a MW5.3 aftershock of the 2016 MW7.8 Kaikōura earthquake. By using hierarchical models, the variation of the corner frequency, fc, and the falloff rate, n, across the focal sphere can be retrieved without overfitting the data. Other methods commonly used to calculate corner frequencies may give substantial biases. In particular, if fc was calculated for the Kumamoto earthquake using an ω-square model, the obtained fc could be twice as large as a realistic value.
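
    A non-hierarchical baseline for the spectral fits discussed above is a grid search of the ω-square model, where for each trial corner frequency the best-fitting plateau follows in closed form from linear least squares. The grid and amplitudes below are illustrative; the paper's Bayesian hierarchical approach goes well beyond this simplistic fit, which is exactly the kind of method it argues can bias fc.

```python
import numpy as np

def brune_spectrum(f, omega0, fc, n=2.0):
    """ω-square source model: displacement amplitude Ω(f) = Ω0 / (1 + (f/fc)^n)."""
    return omega0 / (1.0 + (f / fc) ** n)

def fit_corner_frequency(f, amp, fc_grid):
    """Grid-search the corner frequency with falloff fixed at n = 2; for each
    trial fc the optimal Ω0 is the closed-form least-squares amplitude."""
    best_params, best_resid = None, np.inf
    for fc in fc_grid:
        shape = 1.0 / (1.0 + (f / fc) ** 2)
        omega0 = (amp @ shape) / (shape @ shape)
        resid = float(((amp - omega0 * shape) ** 2).sum())
        if resid < best_resid:
            best_params, best_resid = (omega0, fc), resid
    return best_params
```
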

  4. A systematic approach for finding the objective function and active constraints for dynamic flux balance analysis.

    PubMed

    Nikdel, Ali; Braatz, Richard D; Budman, Hector M

    2018-05-01

    Dynamic flux balance analysis (DFBA) has become an instrumental modeling tool for describing the dynamic behavior of bioprocesses. DFBA involves the maximization of a biologically meaningful objective subject to kinetic constraints on the rate of consumption/production of metabolites. In this paper, we propose a systematic data-based approach for finding both the biological objective function and a minimum set of active constraints necessary for matching the model predictions to the experimental data. The proposed algorithm accounts for the errors in the experiments and eliminates the need for ad hoc choices of objective function and constraints as done in previous studies. The method is illustrated for two cases: (1) for in silico (simulated) data generated by a mathematical model for Escherichia coli and (2) for actual experimental data collected from the batch fermentation of Bordetella pertussis (whooping cough).
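
    The inner optimization of (D)FBA is a linear program: maximize an objective over fluxes v subject to steady-state stoichiometry S v = 0 and flux bounds. The sketch below assumes SciPy is available and uses an invented two-reaction toy network, not the networks from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def fba(S, lb, ub, objective):
    """Core FBA linear program: maximize objective·v subject to S v = 0
    (steady state) and bounds lb <= v <= ub. linprog minimizes, so the
    objective is negated."""
    res = linprog(c=-np.asarray(objective, dtype=float),
                  A_eq=S, b_eq=np.zeros(len(S)),
                  bounds=list(zip(lb, ub)), method="highs")
    return res.x, -res.fun
```

    In DFBA, an LP like this is re-solved at each time step with bounds (the kinetic constraints of the abstract) updated from the current metabolite concentrations.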

  5. JuPOETs: a constrained multiobjective optimization approach to estimate biochemical model ensembles in the Julia programming language.

    PubMed

    Bassen, David M; Vilkhovoy, Michael; Minot, Mason; Butcher, Jonathan T; Varner, Jeffrey D

    2017-01-25

    Ensemble modeling is a promising approach for obtaining robust predictions and coarse-grained population behavior in deterministic mathematical models. Ensemble approaches address model uncertainty by using parameter or model families instead of single best-fit parameters or fixed model structures. Parameter ensembles can be selected based upon simulation error, along with other criteria such as diversity or steady-state performance. Simulations using parameter ensembles can estimate confidence intervals on model variables, and robustly constrain model predictions, despite having many poorly constrained parameters. In this software note, we present a multiobjective-based technique to estimate parameter or model ensembles, the Pareto Optimal Ensemble Technique in the Julia programming language (JuPOETs). JuPOETs integrates simulated annealing with Pareto optimality to estimate ensembles on or near the optimal tradeoff surface between competing training objectives. We demonstrate JuPOETs on a suite of multiobjective problems, including test functions with parameter bounds and system constraints as well as for the identification of a proof-of-concept biochemical model with four conflicting training objectives. JuPOETs identified optimal or near-optimal solutions approximately six-fold faster than a corresponding implementation in Octave for the suite of test functions. For the proof-of-concept biochemical model, JuPOETs produced an ensemble of parameters that captured the mean of the training data across conflicting data sets, while simultaneously estimating parameter sets that performed well on each of the individual objective functions. JuPOETs is a promising approach for the estimation of parameter and model ensembles using multiobjective optimization. JuPOETs can be adapted to solve many problem types, including mixed binary and continuous variable types, bilevel optimization problems and constrained problems without altering the base algorithm. JuPOETs is open
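
    The Pareto-optimality test at the heart of such multiobjective ensemble methods is compact. The sketch below (in Python rather than JuPOETs' Julia) filters a candidate ensemble down to its non-dominated members, assuming all objectives are minimized; it illustrates the dominance concept only, not JuPOETs' annealing loop.

```python
def dominates(a, b):
    """a Pareto-dominates b if it is no worse on every objective and strictly
    better on at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated members of a candidate ensemble: the estimated
    tradeoff surface between competing training objectives."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```
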

  6. A DYNAMIC DENSITY FUNCTIONAL THEORY APPROACH TO DIFFUSION IN WHITE DWARFS AND NEUTRON STAR ENVELOPES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaw, A.; Murillo, M. S.

    2016-09-20

    We develop a multicomponent hydrodynamic model based on moments of the Born–Bogolyubov–Green–Kirkwood–Yvon hierarchy equations for physical conditions relevant to astrophysical plasmas. These equations incorporate strong correlations through a density functional theory closure, while transport enters through a relaxation approximation. This approach enables the introduction of Coulomb coupling correction terms into the standard Burgers equations. The diffusive currents for these strongly coupled plasmas are self-consistently derived. The settling of impurities and its impact on cooling can be greatly affected by strong Coulomb coupling, which we show can be quantified using the direct correlation function.

  7. A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsap, L V; Zhang, Y; Goldgof, D B

    2004-04-02

    A modeling approach is presented for quantitative burn scar assessment. Emphases are given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that knowledge of the scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potential for image-based quantitative burn scar assessment.

  8. Fragment approach to constrained density functional theory calculations using Daubechies wavelets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ratcliff, Laura E.; Genovese, Luigi; Mohr, Stephan

    2015-06-21

    In a recent paper, we presented a linear scaling Kohn-Sham density functional theory (DFT) code based on Daubechies wavelets, where a minimal set of localized support functions are optimized in situ and therefore adapted to the chemical properties of the molecular system. Thanks to the systematically controllable accuracy of the underlying basis set, this approach is able to provide an optimal contracted basis for a given system: accuracies for ground state energies and atomic forces are of the same quality as an uncontracted, cubic scaling approach. This basis set offers, by construction, a natural subset where the density matrix of the system can be projected. In this paper, we demonstrate the flexibility of this minimal basis formalism in providing a basis set that can be reused as-is, i.e., without reoptimization, for charge-constrained DFT calculations within a fragment approach. Support functions, represented in the underlying wavelet grid, of the template fragments are roto-translated with high numerical precision to the required positions and used as projectors for the charge weight function. We demonstrate the interest of this approach in performing highly precise and efficient calculations for preparing diabatic states and for the computational setup of systems in complex environments.

  9. Computational functional genomics-based approaches in analgesic drug discovery and repurposing.

    PubMed

    Lippmann, Catharina; Kringel, Dario; Ultsch, Alfred; Lötsch, Jörn

    2018-06-01

    Persistent pain is a major healthcare problem affecting a fifth of adults worldwide with still limited treatment options. The search for new analgesics increasingly includes the novel research area of functional genomics, which combines data derived from various processes related to DNA sequence, gene expression or protein function and uses advanced methods of data mining and knowledge discovery with the goal of understanding the relationship between the genome and the phenotype. Its use in drug discovery and repurposing for analgesic indications has so far been performed using knowledge discovery in gene function and drug target-related databases; next-generation sequencing; and functional proteomics-based approaches. Here, we discuss recent efforts in functional genomics-based approaches to analgesic drug discovery and repurposing and highlight the potential of computational functional genomics in this field including a demonstration of the workflow using a novel R library 'dbtORA'.

  10. Virtual Plants Need Water Too: Functional-Structural Root System Models in the Context of Drought Tolerance Breeding

    PubMed Central

    Ndour, Adama; Vadez, Vincent; Pradal, Christophe; Lucas, Mikaël

    2017-01-01

    Developing a sustainable agricultural model is one of the great challenges of the coming years. The agricultural practices inherited from the Green Revolution of the 1960s show their limits today, and new paradigms need to be explored to counter rising issues such as the multiplication of climate-change related drought episodes. Two such new paradigms are the use of functional-structural plant models to complement and rationalize breeding approaches and a renewed focus on root systems as untapped sources of plant amelioration. Since the late 1980s, numerous functional and structural models of root systems have been developed and used to investigate the properties of root systems in soil or lab conditions. In this review, we focus on the conception and use of such root models in the broader context of research on root-driven drought tolerance, on the basis of root system architecture (RSA) phenotyping. Such models result from the integration of architectural, physiological and environmental data. Here, we consider the different phenotyping techniques allowing for root architectural and physiological study and their limits. We discuss how QTL and breeding studies support the manipulation of RSA as a way to improve drought resistance. We then go over the integration of the generated data within architectural models, how those architectural models can be coupled with functional hydraulic models, and how functional parameters can be measured to feed those models. We then consider the assessment and validation of those hydraulic models through comparison of simulations with experiments. Finally, we discuss the upcoming challenges facing root systems functional-structural modeling approaches in the context of breeding. PMID:29018456

  11. Virtual Plants Need Water Too: Functional-Structural Root System Models in the Context of Drought Tolerance Breeding.

    PubMed

    Ndour, Adama; Vadez, Vincent; Pradal, Christophe; Lucas, Mikaël

    2017-01-01

    Developing a sustainable agricultural model is one of the great challenges of the coming years. The agricultural practices inherited from the Green Revolution of the 1960s show their limits today, and new paradigms need to be explored to counter rising issues such as the multiplication of climate-change-related drought episodes. Two such new paradigms are the use of functional-structural plant models to complement and rationalize breeding approaches and a renewed focus on root systems as untapped sources of plant amelioration. Since the late 1980s, numerous functional and structural models of root systems have been developed and used to investigate the properties of root systems in soil or lab conditions. In this review, we focus on the conception and use of such root models in the broader context of research on root-driven drought tolerance, on the basis of root system architecture (RSA) phenotyping. Such models result from the integration of architectural, physiological and environmental data. Here, we consider the different phenotyping techniques allowing for root architectural and physiological study and their limits. We discuss how QTL and breeding studies support the manipulation of RSA as a way to improve drought resistance. We then go over the integration of the generated data within architectural models, how those architectural models can be coupled with functional hydraulic models, and how functional parameters can be measured to feed those models. We then consider the assessment and validation of those hydraulic models through comparison of simulations with experiments. Finally, we discuss the upcoming challenges facing root system functional-structural modeling approaches in the context of breeding.

  12. Comparison of penalty functions on a penalty approach to mixed-integer optimization

    NASA Astrophysics Data System (ADS)

    Francisco, Rogério B.; Costa, M. Fernanda P.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.

    2016-06-01

    In this paper, we present a comparative study involving several penalty functions that can be used in a penalty approach for globally solving bound mixed-integer nonlinear programming (bMINLP) problems. The penalty approach relies on a continuous reformulation of the bMINLP problem by adding a particular penalty term to the objective function. A penalty function based on the `erf' function is proposed. The continuous nonlinear optimization problems are sequentially solved by the population-based firefly algorithm. Preliminary numerical experiments are carried out in order to analyze the quality of the produced solutions, when compared with other penalty functions available in the literature.
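    The penalty idea above can be sketched in a few lines: an erf-shaped term that vanishes at integer values is added to the objective, so a continuous solver can then be applied to the reformulated problem. The exact shape of the penalty and the parameters `mu` and `rho` below are illustrative assumptions, not the function proposed in the paper.

    ```python
    import math

    def erf_integrality_penalty(x, mu=0.05):
        """Hypothetical erf-based penalty: ~0 when x is (near-)integer,
        approaching 1 between integers. Sharpness is controlled by mu."""
        d = abs(x - round(x))          # distance to nearest integer, in [0, 0.5]
        return math.erf(d / mu)

    def penalized_objective(f, x, integer_idx, rho=10.0, mu=0.05):
        """Continuous reformulation: original objective plus a weighted
        penalty on the integrality violation of selected coordinates."""
        penalty = sum(erf_integrality_penalty(x[i], mu) for i in integer_idx)
        return f(x) + rho * penalty
    ```

    A population-based solver such as the firefly algorithm would then minimize `penalized_objective` over the continuous box, with `rho` typically increased across iterations.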

  13. The barrier function of organotypic non-melanoma skin cancer models.

    PubMed

    Zoschke, Christian; Ulrich, Martina; Sochorová, Michaela; Wolff, Christopher; Vávrová, Kateřina; Ma, Nan; Ulrich, Claas; Brandner, Johanna M; Schäfer-Korting, Monika

    2016-07-10

    Non-melanoma skin cancer (NMSC) is the most frequent human cancer with continuously rising incidences worldwide. Herein, we investigated the molecular basis for the impaired skin barrier function of organotypic NMSC models. We unraveled disturbed epidermal differentiation by reflectance confocal microscopy and histopathological evaluation. While the presence of claudin-4 and occludin were distinctly reduced, zonula occludens protein-1 was more wide-spread, and claudin-1 was heterogeneously distributed within the NMSC models compared with normal reconstructed human skin. Moreover, the cancer altered stratum corneum lipid packing and profile with decreased cholesterol content, increased phospholipid amount, and altered ceramide subclasses. These alterations contributed to increased surface pH and to 1.5 to 2.6-fold enhanced caffeine permeability of the NMSC models. Three topical applications of ingenol mebutate gel (0.015%) caused abundant epidermal cell necrosis, decreased Ki-67 indices, and increased lactate dehydrogenase activity. Taken together, our study provides new biological insights into the microenvironment of organotypic NMSC models, improves the understanding of the disease model by revealing causes for impaired skin barrier function in NMSC models at the molecular level, and fosters human cell-based approaches in preclinical drug evaluation. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Functional genomics approaches in parasitic helminths.

    PubMed

    Hagen, J; Lee, E F; Fairlie, W D; Kalinna, B H

    2012-01-01

    As research on parasitic helminths is moving into the post-genomic era, an enormous effort is directed towards deciphering gene function and to achieve gene annotation. The sequences that are available in public databases undoubtedly hold information that can be utilized for new interventions and control but the exploitation of these resources has until recently remained difficult. Only now, with the emergence of methods to genetically manipulate and transform parasitic worms will it be possible to gain a comprehensive understanding of the molecular mechanisms involved in nutrition, metabolism, developmental switches/maturation and interaction with the host immune system. This review focuses on functional genomics approaches in parasitic helminths that are currently used, to highlight potential applications of these technologies in the areas of cell biology, systems biology and immunobiology of parasitic helminths. © 2011 Blackwell Publishing Ltd.

  15. Transfer function modeling of damping mechanisms in viscoelastic plates

    NASA Technical Reports Server (NTRS)

    Slater, J. C.; Inman, D. J.

    1991-01-01

    This work formulates a method for the modeling of material damping characteristics in plates. The Sophie Germain equation of classical plate theory is modified to incorporate hysteresis effects represented by complex stiffness using the transfer function approach proposed by Golla and Hughes (1985). However, this procedure is not limited to this representation. The governing characteristic equation is decoupled through separation of variables, yielding a solution similar to that of undamped classical plate theory, allowing solution of the steady-state as well as the transient response problem.

  16. A simplified approach to quasi-linear viscoelastic modeling

    PubMed Central

    Nekouzadeh, Ali; Pryse, Kenneth M.; Elson, Elliot L.; Genin, Guy M.

    2007-01-01

    The fitting of quasi-linear viscoelastic (QLV) constitutive models to material data often involves somewhat cumbersome numerical convolution. A new approach to treating quasi-linearity in one dimension is described and applied to characterize the behavior of reconstituted collagen. This approach is based on a new principle for including nonlinearity and requires considerably less computation than other comparable models for both model calibration and response prediction, especially for smoothly applied stretching. Additionally, the approach allows relaxation to adapt with the strain history. The modeling approach is demonstrated through tests on pure reconstituted collagen. Sequences of “ramp-and-hold” stretching tests were applied to rectangular collagen specimens. The relaxation force data from the “hold” was used to calibrate a new “adaptive QLV model” and several models from literature, and the force data from the “ramp” was used to check the accuracy of model predictions. Additionally, the ability of the models to predict the force response on a reloading of the specimen was assessed. The “adaptive QLV model” based on this new approach predicts collagen behavior comparably to or better than existing models, with much less computation. PMID:17499254

  17. A Deep Learning based Approach to Reduced Order Modeling of Fluids using LSTM Neural Networks

    NASA Astrophysics Data System (ADS)

    Mohan, Arvind; Gaitonde, Datta

    2017-11-01

    Reduced Order Modeling (ROM) can be used as a surrogate for prohibitively expensive simulations to model flow behavior over long time periods. ROM is predicated on extracting dominant spatio-temporal features of the flow from CFD or experimental datasets. We explore ROM development with a deep learning approach, which comprises learning functional relationships between different variables in large datasets for predictive modeling. Although deep learning and related artificial-intelligence-based predictive modeling techniques have shown varied success in other fields, such approaches are in their initial stages of application to fluid dynamics. Here, we explore the application of the Long Short-Term Memory (LSTM) neural network to sequential data, specifically to predict the time coefficients of Proper Orthogonal Decomposition (POD) modes of the flow for future timesteps, by training it on data at previous timesteps. The approach is demonstrated by constructing ROMs of several canonical flows. Additionally, we show that statistical estimates of stationarity in the training data can indicate a priori how amenable a given flow-field is to this approach. Finally, the potential and limitations of deep learning based ROM approaches will be elucidated and further developments discussed.
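    The POD-plus-sequence-prediction pipeline described above can be sketched as follows: time coefficients of the POD modes are extracted by SVD of a snapshot matrix and then framed as a supervised lag-window prediction problem. The synthetic snapshot data is an assumption, and a least-squares autoregressor stands in for the LSTM to keep the sketch dependency-free; an actual LSTM would consume the same lag windows.

    ```python
    import numpy as np

    # Snapshot matrix: rows = grid points, columns = timesteps (synthetic stand-in data).
    rng = np.random.default_rng(0)
    t = np.linspace(0, 8 * np.pi, 400)
    snapshots = np.outer(rng.standard_normal(64), np.sin(t)) \
              + np.outer(rng.standard_normal(64), np.cos(2 * t))

    # POD via SVD: columns of U are spatial modes, rows of (S @ Vt) are time coefficients.
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    r = 2                                  # retained modes
    a = (np.diag(s[:r]) @ Vt[:r]).T        # shape (n_timesteps, r)

    # Supervised framing: predict a[k] from the previous `lag` coefficient vectors.
    lag = 4
    X = np.hstack([a[i:len(a) - lag + i] for i in range(lag)])
    Y = a[lag:]
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # linear autoregressor in place of the LSTM
    pred = X @ W
    rmse = float(np.sqrt(np.mean((pred - Y) ** 2)))
    print("coefficient RMSE:", rmse)
    ```

    Replacing the `lstsq` fit with an LSTM trained on the same `(X, Y)` windows gives the nonlinear version the abstract describes.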

  18. Functional Connectivity Mapping in the Animal Model: Principles and Applications of Resting-State fMRI

    PubMed Central

    Gorges, Martin; Roselli, Francesco; Müller, Hans-Peter; Ludolph, Albert C.; Rasche, Volker; Kassubek, Jan

    2017-01-01

    “Resting-state” fMRI has substantially contributed to the understanding of human and non-human functional brain organization through the analysis of correlated patterns in spontaneous activity within dedicated brain systems. Spontaneous neural activity is indirectly measured from the blood oxygenation level-dependent signal as acquired by echo planar imaging while subjects rest quietly in the scanner. Animal models, including disease or knockout models, allow a broad spectrum of experimental manipulations not applicable in humans. The non-invasive fMRI approach provides a promising tool for cross-species comparative investigations. This review focuses on the principles of “resting-state” functional connectivity analysis and its applications to living animals. The translational aspect from in vivo animal models toward clinical applications in humans is emphasized. We introduce the fMRI-based investigation of the non-human brain’s hemodynamics, the methodological issues in data postprocessing, and the interpretation of functional data at different levels of abstraction. The longer-term goal of integrating fMRI connectivity data with structural connectomes obtained with tracing and optical imaging approaches is also presented; such integration will allow the interrogation of fMRI data in terms of directional information flow and may identify the structural underpinnings of observed functional connectivity patterns. PMID:28539914

  19. Uncovering stability mechanisms in microbial ecosystems - combining microcosm experiments, computational modelling and ecological theory in a multidisciplinary approach

    NASA Astrophysics Data System (ADS)

    Worrich, Anja; König, Sara; Banitz, Thomas; Centler, Florian; Frank, Karin; Kästner, Matthias; Miltner, Anja; Thullner, Martin; Wick, Lukas

    2015-04-01

    Although bacterial degraders in soil are commonly exposed to fluctuating environmental conditions, the functional performance of biodegradation processes can often be maintained by resistance and resilience mechanisms. However, there is still a gap in the mechanistic understanding of the key factors contributing to the stability of such an ecosystem service. We therefore developed an integrated approach combining microcosm experiments, simulation models and ecological theory, directly drawing on the strengths of these disciplines. In a continuous interplay, data, hypotheses and central questions are exchanged between disciplines to initiate new experiments and models and, ultimately, to identify buffer mechanisms and factors providing functional stability. We focus on drying and rewetting cycles in soil ecosystems, which are a major abiotic driver of bacterial activity. Functional recovery of the system was found to depend on different spatial processes in the computational model. In particular, bacterial motility is a prerequisite for biodegradation if either bacteria or substrate are heterogeneously distributed. Hence, laboratory experiments focusing on bacterial dispersal processes were conducted and confirmed this finding for functional resistance as well. The results obtained will be incorporated into the model in the next step. Overall, the combination of computational modelling and laboratory experiments identified spatial processes as the main driving force for functional stability in the considered system, and has proved a powerful methodological approach.

  20. Pharmacological approaches to restore mitochondrial function

    PubMed Central

    Andreux, Pénélope A.; Houtkooper, Riekelt H.; Auwerx, Johan

    2014-01-01

    Mitochondrial dysfunction is not only a hallmark of rare inherited mitochondrial disorders, but is also implicated in age-related diseases, including those that affect the metabolic and nervous system, such as type 2 diabetes and Parkinson’s disease. Numerous pathways maintain and/or restore proper mitochondrial function, including mitochondrial biogenesis, mitochondrial dynamics, mitophagy, and the mitochondrial unfolded protein response. New and powerful phenotypic assays in cell-based models, as well as multicellular organisms, have been developed to explore these different aspects of mitochondrial function. Modulating mitochondrial function has therefore emerged as an attractive therapeutic strategy for a range of diseases, which has spurred active drug discovery efforts in this area. PMID:23666487

  1. Development on electromagnetic impedance function modeling and its estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2015-09-01

    Today, electromagnetic methods such as magnetotellurics (MT) and controlled-source audio MT (CSAMT) are used in a broad variety of applications. Their usefulness in poor seismic areas and their negligible environmental impact are integral parts of effective exploration at minimum cost. As exploration has been forced into more difficult areas, the importance of MT and CSAMT, in conjunction with other techniques, has tended to grow continuously. However, important and difficult problems obviously remain to be solved concerning our ability to collect, process and interpret MT as well as CSAMT data in complex 3D structural environments. This talk aims at reviewing and discussing recent developments in MT and CSAMT impedance function modeling, as well as some improvements in estimation procedures for the corresponding impedance functions. In MT impedance modeling, research efforts focus on developing numerical methods for computing the impedance functions of three-dimensional (3-D) earth resistivity models. For that reason, 3-D finite-element numerical modeling of the impedances has been developed based on the edge element method. In the CSAMT case, the efforts were focused on the non-plane-wave problem in the corresponding impedance functions. Concerning estimation of MT and CSAMT impedance functions, research was focused on improving the quality of the estimates. To that end, a non-linear regression approach based on robust M-estimators and the Hilbert transform operating on the causal transfer functions was used to deal with outliers (abnormal data) which are frequently superimposed on normal ambient MT and CSAMT noise fields. As validated, the proposed MT impedance modeling method gives acceptable results for standard three-dimensional resistivity models, whilst the full-solution-based modeling that accommodates the non-plane-wave effect for CSAMT impedances is applied for all measurement zones, including near-, transition

  2. Development on electromagnetic impedance function modeling and its estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutarno, D., E-mail: Sutarno@fi.itb.ac.id

    2015-09-30

    Today, electromagnetic methods such as magnetotellurics (MT) and controlled-source audio MT (CSAMT) are used in a broad variety of applications. Their usefulness in poor seismic areas and their negligible environmental impact are integral parts of effective exploration at minimum cost. As exploration has been forced into more difficult areas, the importance of MT and CSAMT, in conjunction with other techniques, has tended to grow continuously. However, important and difficult problems obviously remain to be solved concerning our ability to collect, process and interpret MT as well as CSAMT data in complex 3D structural environments. This talk aims at reviewing and discussing recent developments in MT and CSAMT impedance function modeling, as well as some improvements in estimation procedures for the corresponding impedance functions. In MT impedance modeling, research efforts focus on developing numerical methods for computing the impedance functions of three-dimensional (3-D) earth resistivity models. For that reason, 3-D finite-element numerical modeling of the impedances has been developed based on the edge element method. In the CSAMT case, the efforts were focused on the non-plane-wave problem in the corresponding impedance functions. Concerning estimation of MT and CSAMT impedance functions, research was focused on improving the quality of the estimates. To that end, a non-linear regression approach based on robust M-estimators and the Hilbert transform operating on the causal transfer functions was used to deal with outliers (abnormal data) which are frequently superimposed on normal ambient MT and CSAMT noise fields. As validated, the proposed MT impedance modeling method gives acceptable results for standard three-dimensional resistivity models, whilst the full-solution-based modeling that accommodates the non-plane-wave effect for CSAMT impedances is applied for all measurement zones, including near

  3. Robust functional regression model for marginal mean and subject-specific inferences.

    PubMed

    Cao, Chunzheng; Shi, Jian Qing; Lee, Youngjo

    2017-01-01

    We introduce flexible robust functional regression models, using various heavy-tailed processes, including a Student t-process. We propose efficient algorithms in estimating parameters for the marginal mean inferences and in predicting conditional means as well as interpolation and extrapolation for the subject-specific inferences. We develop bootstrap prediction intervals (PIs) for conditional mean curves. Numerical studies show that the proposed model provides a robust approach against data contamination or distribution misspecification, and the proposed PIs maintain the nominal confidence levels. A real data application is presented as an illustrative example.

  4. Reducing equifinality of hydrological models by integrating Functional Streamflow Disaggregation

    NASA Astrophysics Data System (ADS)

    Lüdtke, Stefan; Apel, Heiko; Nied, Manuela; Carl, Peter; Merz, Bruno

    2014-05-01

    A universal problem in the calibration of hydrological models is the equifinality of different parameter sets derived from calibrating models against total runoff values. This is an intrinsic problem stemming from the quality of the calibration data and the simplified process representation of the model. However, discharge data contain additional information which can be extracted by signal processing methods. An analysis specifically developed for the disaggregation of runoff time series into flow components is the Functional Streamflow Disaggregation (FSD; Carl & Behrendt, 2008). This method is used in the calibration of an implementation of the hydrological model SWIM in a medium-sized watershed in Thailand. FSD is applied to disaggregate the discharge time series into three flow components, which are interpreted as base flow, interflow and surface runoff. In addition to total runoff, the model is calibrated against these three components in a modified GLUE analysis, with the aim of identifying structural model deficiencies, assessing the internal process representation and tackling equifinality. We developed a model-dependent (MDA) approach calibrating the model runoff components against the FSD components, and a model-independent (MIA) approach comparing the FSD of the model results with the FSD of the calibration data. The results indicate that the decomposition provides valuable information for the calibration. In particular, MDA highlights and discards a number of standard GLUE behavioural models that underestimate the contribution of soil water to river discharge. Both MDA and MIA yield a reduction of the parameter ranges by a factor of up to 3 in comparison to standard GLUE. Based on these results, we conclude that the developed calibration approach is able to reduce the equifinality of hydrological model parameterizations. The effect on the uncertainty of the model predictions is strongest for MDA and shows only minor reductions for MIA. Besides

  5. A Self-Organizing State-Space-Model Approach for Parameter Estimation in Hodgkin-Huxley-Type Models of Single Neurons

    PubMed Central

    Vavoulis, Dimitrios V.; Straub, Volko A.; Aston, John A. D.; Feng, Jianfeng

    2012-01-01

    Traditional approaches to the problem of parameter estimation in biophysical models of neurons and neural networks usually adopt a global search algorithm (for example, an evolutionary algorithm), often in combination with a local search method (such as gradient descent), in order to minimize the value of a cost function, which measures the discrepancy between various features of the available experimental data and model output. In this study, we approach the problem of parameter estimation in conductance-based models of single neurons from a different perspective. By adopting a hidden-dynamical-systems formalism, we expressed parameter estimation as an inference problem in these systems, which can then be tackled using a range of well-established statistical inference methods. The particular method we used was Kitagawa's self-organizing state-space model, which was applied to a number of Hodgkin-Huxley-type models using simulated or actual electrophysiological data. We showed that the algorithm can be used to estimate a large number of parameters, including maximal conductances, reversal potentials, kinetics of ionic currents, and measurement and intrinsic noise, based on low-dimensional experimental data and sufficiently informative priors in the form of pre-defined constraints imposed on model parameters. The algorithm remained operational even when very noisy experimental data were used. Importantly, by combining the self-organizing state-space model with an adaptive sampling algorithm akin to the Covariance Matrix Adaptation Evolution Strategy, we achieved a significant reduction in the variance of parameter estimates. The algorithm did not require the explicit formulation of a cost function and it was straightforward to apply to compartmental models and multiple data sets. Overall, the proposed methodology is particularly suitable for resolving high-dimensional inference problems based on noisy electrophysiological data and, therefore, a potentially useful tool in

  6. Developing the snow component of a distributed hydrological model: a step-wise approach based on multi-objective analysis

    NASA Astrophysics Data System (ADS)

    Dunn, S. M.; Colohan, R. J. E.

    1999-09-01

    A snow component has been developed for the distributed hydrological model, DIY, using an approach that sequentially evaluates the behaviour of different functions as they are implemented in the model. The evaluation is performed using multi-objective functions to ensure that the internal structure of the model is correct. The development of the model, using a sub-catchment in the Cairngorm Mountains in Scotland, demonstrated that the degree-day model can be enhanced for hydroclimatic conditions typical of those found in Scotland, without increasing meteorological data requirements. An important element of the snow model is a function to account for wind re-distribution. This causes large accumulations of snow in small pockets, which are shown to be important in sustaining baseflows in the rivers during the late spring and early summer, long after the snowpack has melted from the bulk of the catchment. The importance of the wind function would not have been identified using a single objective function of total streamflow to evaluate the model behaviour.
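    A minimal sketch of the degree-day melt scheme underlying the snow component described above (parameter values are illustrative; the wind-redistribution function and the paper's hydroclimatic enhancements are not reproduced here):

    ```python
    def degree_day_melt(temp_c, swe_mm, ddf=3.0, t_base=0.0):
        """Classic degree-day melt: melt is proportional to degrees above a
        base temperature, capped by the available snow water equivalent.
        ddf (mm per degC per day) and t_base are illustrative values."""
        melt = max(ddf * (temp_c - t_base), 0.0)
        return min(melt, swe_mm)

    def step_snowpack(swe_mm, temp_c, precip_mm):
        """One daily step: precipitation accumulates as snow at or below
        freezing, otherwise falls as rain; melt then depletes the pack.
        Returns (new SWE in mm, water released to runoff in mm)."""
        snow = precip_mm if temp_c <= 0.0 else 0.0
        rain = precip_mm - snow
        melt = degree_day_melt(temp_c, swe_mm + snow)
        return swe_mm + snow - melt, melt + rain
    ```

    The wind re-distribution enhancement in the paper would redistribute `swe_mm` between neighboring cells before the melt step, producing the localized deep pockets that sustain late-season baseflow.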

  7. Macroscopic dielectric function within time-dependent density functional theory—Real time evolution versus the Casida approach

    NASA Astrophysics Data System (ADS)

    Sander, Tobias; Kresse, Georg

    2017-02-01

    Linear optical properties can be calculated by solving the time-dependent density functional theory equations. Linearization of the equation of motion around the ground-state orbitals results in the so-called Casida equation, which is formally very similar to the Bethe-Salpeter equation. Alternatively, one can determine the spectral functions by applying an infinitely short electric field in time and then following the evolution of the electron orbitals and of the dipole moments. The long-wavelength response function is then given by the Fourier transform of the evolution of the dipole moments in time. In this work, we compare the results and performance of these two approaches for the projector augmented wave method. To allow for large time steps and still rely on a simple difference scheme to solve the differential equation, we correct for the errors in the frequency domain, using a simple analytic equation. In general, we find that both approaches yield virtually indistinguishable results. For standard density functionals, the time evolution approach is clearly superior in computational performance to the solution of the Casida equation. However, for functionals including nonlocal exchange, the direct solution of the Casida equation is usually much more efficient, even though it scales less favorably with the system size. We relate this to the large computational prefactors in evaluating the nonlocal exchange, which render the time evolution algorithm fairly inefficient.

  8. Fuzzy set approach to quality function deployment: An investigation

    NASA Technical Reports Server (NTRS)

    Masud, Abu S. M.

    1992-01-01

    The final report of the 1992 NASA/ASEE Summer Faculty Fellowship at the Space Exploration Initiative Office (SEIO) at Langley Research Center is presented. Quality Function Deployment (QFD) is a process focused on facilitating the integration of the customer's voice in the design and development of a product or service. Various inputs, in the form of judgements and evaluations, are required during the QFD analyses. All the input variables in these analyses are treated as numeric variables. The purpose of the research was to investigate how QFD analyses can be performed when some or all of the input variables are treated as linguistic variables with values expressed as fuzzy numbers. The reason for this consideration is that human judgement, perception, and cognition are often ambiguous and are better represented as fuzzy numbers. Two approaches for using fuzzy sets in QFD have been proposed. In both cases, all the input variables are considered as linguistic variables with values indicated as linguistic expressions. These expressions are then converted to fuzzy numbers. The difference between the two approaches lies in how the QFD computations are performed with these fuzzy numbers. In Approach 1, the fuzzy numbers are first converted to their equivalent crisp scores and the QFD computations are then performed using these crisp scores. As a result, the outputs of this approach are crisp numbers, similar to those in traditional QFD. In Approach 2, all the QFD computations are performed with the fuzzy numbers and the outputs are fuzzy numbers as well. Both approaches have been explained with the help of illustrative examples of QFD application. Approach 2 has also been applied in a QFD application exercise at SEIO, involving a 'mini moon rover' design. The mini moon rover is a proposed tele-operated vehicle that will traverse and perform various tasks, including autonomous operations, on the moon surface. The output of the moon rover application exercise is a
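    The two approaches can be illustrated with triangular fuzzy numbers: Approach 1 defuzzifies each linguistic rating to a crisp score before computing, while Approach 2 carries fuzzy arithmetic through the computation. The rating scale and the centroid defuzzification rule below are common textbook choices, not necessarily those used in the report.

    ```python
    # A triangular fuzzy number (l, m, u): support [l, u], membership peak at m.

    def crisp_score(tfn):
        """Approach-1-style defuzzification via the centroid of a
        triangular fuzzy number (one common choice among several)."""
        l, m, u = tfn
        return (l + m + u) / 3.0

    def fuzzy_add(a, b):
        """Approach-2-style fuzzy arithmetic: addition of triangular
        fuzzy numbers is component-wise."""
        return tuple(x + y for x, y in zip(a, b))

    def fuzzy_scale(k, a):
        """Scaling a triangular fuzzy number by a nonnegative crisp weight."""
        return tuple(k * x for x in a)

    # Linguistic ratings mapped to illustrative triangular fuzzy numbers.
    RATING = {"low": (0, 1, 3), "medium": (3, 5, 7), "high": (7, 9, 10)}

    # Weighted importance of one design requirement across two customer needs.
    score = fuzzy_add(fuzzy_scale(0.6, RATING["high"]),
                      fuzzy_scale(0.4, RATING["medium"]))
    print(score, crisp_score(score))
    ```

    In Approach 1 `crisp_score` would be applied to each rating before the weighted sum; in Approach 2, as here, the fuzzy result is kept and may be defuzzified only for final ranking.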

  9. Functional Medicine Approach to Traumatic Brain Injury.

    PubMed

    Richer, Alice C

    2017-08-01

    Background: The U.S. military has seen dramatic increases in traumatic brain injuries (TBIs) among military personnel due to the nature of modern-day conflicts. Conventional TBI treatment for secondary brain injuries has suboptimal success rates, and patients, families, and healthcare professionals are increasingly turning to alternative medicine treatments. Objective: Effective treatments for the secondary injury cascades that occur after an initial brain trauma are unclear at this time. The goal of successful treatment options for secondary TBI injuries is to reduce oxidative stress, excitotoxicity, and inflammation while supporting mitochondrial functions and repair of membranes, synapses, and axons. Intervention: A new paradigm of medical care, known as functional medicine, is increasing in popularity and acceptance. Functional medicine combines conventional treatment methods with complementary, genetic, holistic, and nutritional therapies. The approach is to assess the patient as a whole person, taking into account the interconnectedness of the body and its unique reaction to disease, injury, and illness while working to restore balance and optimal health. Functional medicine treatment recommendations often include the use of acupuncture, Ayurveda, chiropractic manipulation, detoxification programs, herbal and homeopathic supplements, specialized diets, massage, meditation and mindfulness practices, neurobiofeedback, nutritional supplements, t'ai chi, and yoga. At present, some of these alternative treatments appear to be beneficial, but more research is needed to validate reported outcomes. Conclusions: Few clinical studies validate the effectiveness of alternative therapies for TBIs. However, further clinical trials and empirical studies are warranted, based on some reported positive results from research studies, case histories, anecdotal evidence, and the widespread popularity of some approaches. To date, only nutritional therapies and

  10. Functional Medicine Approach to Traumatic Brain Injury

    PubMed Central

    2017-01-01

    Abstract Background: The U.S. military has seen dramatic increases in traumatic brain injuries (TBIs) among military personnel due to the nature of modern-day conflicts. Conventional TBI treatment for secondary brain injuries has suboptimal success rates, and patients, families, and healthcare professionals are increasingly turning to alternative medicine treatments. Objective: Effective treatments for the secondary injury cascades that occur after an initial brain trauma are unclear at this time. The goal of successful treatment options for secondary TBI injuries is to reduce oxidative stress, excitotoxicity, and inflammation while supporting mitochondrial functions and repair of membranes, synapses, and axons. Intervention: A new paradigm of medical care, known as functional medicine, is increasing in popularity and acceptance. Functional medicine combines conventional treatment methods with complementary, genetic, holistic, and nutritional therapies. The approach is to assess the patient as a whole person, taking into account the interconnectedness of the body and its unique reaction to disease, injury, and illness while working to restore balance and optimal health. Functional medicine treatment recommendations often include the use of acupuncture, Ayurveda, chiropractic manipulation, detoxification programs, herbal and homeopathic supplements, specialized diets, massage, meditation and mindfulness practices, neurobiofeedback, nutritional supplements, t'ai chi, and yoga. At present, some of these alternative treatments appear to be beneficial, but more research is needed to validate reported outcomes. Conclusions: Few clinical studies validate the effectiveness of alternative therapies for TBIs. However, further clinical trials and empirical studies are warranted, based on some reported positive results from research studies, case histories, anecdotal evidence, and the widespread popularity of some approaches.
To date, only nutritional therapies and

  11. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with the water security indicators across the multi-model framework and across the uncertainty estimation approaches. The method is general and can easily be extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
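As a rough illustration of the block bootstrap of model residuals mentioned in this abstract: resampling residuals in contiguous blocks preserves their short-range autocorrelation, and adding the resampled residuals to a simulated value yields an empirical confidence interval. The block length, interval level, and the toy residual series below are illustrative assumptions, not values from the study.

```python
import random

def block_bootstrap(residuals, block_len, n_out, rng):
    """Resample a residual series in contiguous blocks so that
    short-range autocorrelation is preserved in each replicate."""
    blocks = [residuals[i:i + block_len]
              for i in range(len(residuals) - block_len + 1)]
    out = []
    while len(out) < n_out:
        out.extend(rng.choice(blocks))
    return out[:n_out]

def bootstrap_interval(simulated, residuals, block_len=5, reps=1000,
                       alpha=0.05, seed=42):
    """Empirical (1 - alpha) interval for one simulated flow value,
    obtained by adding bootstrapped residual realisations and taking
    empirical quantiles."""
    rng = random.Random(seed)
    draws = sorted(simulated + block_bootstrap(residuals, block_len, 1, rng)[0]
                   for _ in range(reps))
    return draws[int(alpha / 2 * reps)], draws[int((1 - alpha / 2) * reps) - 1]

# Illustrative residual series (not data from the study)
resid = [0.3, -0.1, 0.2, -0.4, 0.1, 0.0, -0.2, 0.5, -0.3, 0.1] * 3
low, high = bootstrap_interval(10.0, resid)
```

In the study this idea is applied per time step of the simulated hydrograph; here a single value stands in for the whole series.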

  12. Geometric and electrostatic modeling using molecular rigidity functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mu, Lin; Xia, Kelin; Wei, Guowei

Geometric and electrostatic modeling is an essential component in computational biophysics and molecular biology. Commonly used geometric representations admit geometric singularities such as cusps, tips and self-intersecting facets that lead to computational instabilities in the molecular modeling. Our present work explores the use of flexibility and rigidity index (FRI), which has a proved superiority in protein B-factor prediction, for biomolecular geometric representation and associated electrostatic analysis. FRI rigidity surfaces are free of geometric singularities. We propose a rigidity based Poisson–Boltzmann equation for biomolecular electrostatic analysis. These approaches to surface and electrostatic modeling are validated by a set of 21 proteins. Our results are compared with those of established methods. Finally, being smooth and analytically differentiable, FRI rigidity functions offer excellent curvature analysis, which characterizes concave and convex regions on protein surfaces. Polarized curvatures constructed by using the product of minimum curvature and electrostatic potential is shown to predict potential protein–ligand binding sites.

  13. Geometric and electrostatic modeling using molecular rigidity functions

    DOE PAGES

    Mu, Lin; Xia, Kelin; Wei, Guowei

    2017-03-01

Geometric and electrostatic modeling is an essential component in computational biophysics and molecular biology. Commonly used geometric representations admit geometric singularities such as cusps, tips and self-intersecting facets that lead to computational instabilities in the molecular modeling. Our present work explores the use of flexibility and rigidity index (FRI), which has a proved superiority in protein B-factor prediction, for biomolecular geometric representation and associated electrostatic analysis. FRI rigidity surfaces are free of geometric singularities. We propose a rigidity based Poisson–Boltzmann equation for biomolecular electrostatic analysis. These approaches to surface and electrostatic modeling are validated by a set of 21 proteins. Our results are compared with those of established methods. Finally, being smooth and analytically differentiable, FRI rigidity functions offer excellent curvature analysis, which characterizes concave and convex regions on protein surfaces. Polarized curvatures constructed by using the product of minimum curvature and electrostatic potential is shown to predict potential protein–ligand binding sites.
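A minimal sketch of the rigidity-index idea behind FRI: each atom's rigidity is a kernel-weighted sum over its neighbours, so densely packed atoms score high and exposed atoms score low. The generalised exponential kernel with parameters eta, kappa and unit weights below are illustrative choices, not the paper's fitted values.

```python
import math

def fri_rigidity(coords, eta=3.0, kappa=2.0, weights=None):
    """FRI-style rigidity index for each atom:
    mu_i = sum_{j != i} w_j * Phi(||r_i - r_j||),
    with kernel Phi(r) = exp(-(r / eta)**kappa)."""
    n = len(coords)
    w = weights or [1.0] * n
    mu = []
    for i in range(n):
        total = 0.0
        for j in range(n):
            if i == j:
                continue
            r = math.dist(coords[i], coords[j])
            total += w[j] * math.exp(-((r / eta) ** kappa))
        mu.append(total)
    return mu

# Three collinear "atoms": the middle one has more close neighbours,
# so its rigidity index is highest.
mu = fri_rigidity([(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (4.0, 0.0, 0.0)])
```

Because each mu_i is a smooth function of the coordinates, level sets of the rigidity field give the singularity-free surfaces the abstract describes.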

  14. A metabolomics and mouse models approach to study inflammatory and immune responses to radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fornace, Albert J.; Li, Henghong

    2013-12-02

The three-year project entitled "A Metabolomics and Mouse Models Approach to Study Inflammatory and Immune Responses to Radiation" was initiated in September 2009. The overall objectives of this project were to investigate the acute and persistent effects of low dose radiation on T cell lymphocyte function and physiology, as well as the contributions of these cells to radiation-induced inflammatory responses. Inflammation after ionizing radiation (IR), even at low doses, may impact a variety of disease processes, including infectious disease, cardiovascular disease, cancer, and other potentially inflammatory disorders. There were three overall specific aims: 1. To investigate acute and persistent effects of low dose radiation on T cell subsets and function; 2. A genetic approach with mouse models to investigate p38 MAPK pathways that are involved in radiation-induced inflammatory signaling; 3. To investigate the effect of radiation quality on the inflammatory response. We have completed the work proposed in these aims.

  15. Value function in economic growth model

    NASA Astrophysics Data System (ADS)

    Bagno, Alexander; Tarasyev, Alexandr A.; Tarasyev, Alexander M.

    2017-11-01

Properties of the value function are examined in an infinite horizon optimal control problem with an unbounded integrand appearing in the quality functional with a discount factor. Optimal control problems of this type describe solutions in models of economic growth. Necessary and sufficient conditions are derived to ensure that the value function satisfies the infinitesimal stability properties. It is proved that the value function coincides with the minimax solution of the Hamilton-Jacobi equation. A description of the asymptotic growth behavior of the value function is provided for logarithmic, power and exponential quality functionals, and an example is given to illustrate construction of the value function in economic growth models.

  16. Function-Task-Competency Approach to Curriculum Development in Vocational Education in Agriculture: Research Report No. 1. Project Background, Plan, and Model Development.

    ERIC Educational Resources Information Center

    Matteson, Harold R.

    The report explains the construction of the function-task-competency method of developing vocational education curricula in agriculture at the secondary and postsecondary levels. It discusses at some length five approaches to the development of vocational education curricula used in the past: the subject approach (which centers on subjects taught…

  17. A generalized nonlinear model-based mixed multinomial logit approach for crash data analysis.

    PubMed

    Zeng, Ziqiang; Zhu, Wenbo; Ke, Ruimin; Ash, John; Wang, Yinhai; Xu, Jiuping; Xu, Xinxin

    2017-02-01

The mixed multinomial logit (MNL) approach, which can account for unobserved heterogeneity, is a promising unordered model that has been employed in analyzing the effect of factors contributing to crash severity. However, its basic assumption of using a linear function to explore the relationship between the probability of crash severity and its contributing factors can be violated in reality. This paper develops a generalized nonlinear model-based mixed MNL approach which is capable of capturing non-monotonic relationships by developing nonlinear predictors for the contributing factors in the context of unobserved heterogeneity. The crash data on seven Interstate freeways in Washington between January 2011 and December 2014 are collected to develop the nonlinear predictors in the model. Thirteen contributing factors in terms of traffic characteristics, roadway geometric characteristics, and weather conditions are identified to have significant mixed (fixed or random) effects on the crash density in three crash severity levels: fatal, injury, and property damage only. The proposed model is compared with the standard mixed MNL model. The comparison results suggest a slight superiority of the new approach in terms of model fit measured by the Akaike Information Criterion (12.06 percent decrease) and Bayesian Information Criterion (9.11 percent decrease). The predicted crash densities for all three levels of crash severities of the new approach are also closer (on average) to the observations than the ones predicted by the standard mixed MNL model. Finally, the significance and impacts of the contributing factors are analyzed.
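The MNL backbone of this approach assigns each severity level a probability via a softmax over systematic utilities; the paper's contribution is letting those utilities be nonlinear in the contributing factors. A minimal sketch, where the quadratic traffic-volume term, the factor names and all coefficients are purely illustrative, not estimated values from the study:

```python
import math

def mnl_probabilities(utilities):
    """Multinomial-logit choice probabilities: a softmax over the
    systematic utilities of the severity levels."""
    m = max(utilities)                      # subtract max for stability
    exps = [math.exp(u - m) for u in utilities]
    s = sum(exps)
    return [e / s for e in exps]

def utility_injury(aadt, curve_density):
    """Hypothetical nonlinear predictor for the 'injury' level: the
    quadratic volume term allows a non-monotonic effect, which a
    purely linear-in-parameters utility cannot capture."""
    return -1.0 + 0.8 * aadt - 0.15 * aadt ** 2 + 0.3 * curve_density

# Utilities for (fatal, injury, PDO); fatal and PDO kept linear/fixed
# here for brevity.
u = [-3.0, utility_injury(aadt=2.0, curve_density=1.0), 0.0]
p_fatal, p_injury, p_pdo = mnl_probabilities(u)
```

The "mixed" part of the model additionally draws some coefficients from random distributions across observations; that layer is omitted here.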

  18. Modeling microbial community structure and functional diversity across time and space.

    PubMed

    Larsen, Peter E; Gibbons, Sean M; Gilbert, Jack A

    2012-07-01

Microbial communities exhibit exquisitely complex structure. Many aspects of this complexity, from the number of species to the total number of interactions, are currently very difficult to examine directly. However, extraordinary efforts are being made to make these systems accessible to scientific investigation. While recent advances in high-throughput sequencing technologies have improved accessibility to the taxonomic and functional diversity of complex communities, monitoring the dynamics of these systems over time and space - using appropriate experimental design - is still expensive. Fortunately, modeling can be used as a lens to focus low-resolution observations of community dynamics to enable mathematical abstractions of functional and taxonomic dynamics across space and time. Here, we review the approaches for modeling bacterial diversity at both the very large and the very small scales at which microbial systems interact with their environments. We show that modeling can help to connect biogeochemical processes to specific microbial metabolic pathways.

  19. Protein loop modeling using a new hybrid energy function and its application to modeling in inaccurate structural environments.

    PubMed

    Park, Hahnbeom; Lee, Gyu Rie; Heo, Lim; Seok, Chaok

    2014-01-01

    Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.
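The combination idea can be caricatured as scoring candidate loop conformations with a weighted sum of a physics-like term and a knowledge-like term, then keeping the lowest-scoring candidate. Everything below (the soft-clash penalty, the torsion preference near -60 degrees, the 3 Å clash radius, and the weight) is an illustrative stand-in, not the paper's actual energy function.

```python
import math

def hybrid_energy(coords, torsions, w_physics=0.4):
    """Toy hybrid score: a physics-like steric-clash penalty on
    pairwise atom distances plus a knowledge-like statistical
    preference for torsions near -60 degrees."""
    clash = sum(max(0.0, 3.0 - math.dist(a, b)) ** 2
                for i, a in enumerate(coords)
                for b in coords[i + 1:])
    torsion_pref = sum(((t + 60.0) / 100.0) ** 2 for t in torsions)
    return w_physics * clash + (1.0 - w_physics) * torsion_pref

# Pick the lower-scoring of two candidate loop conformations: the
# collapsed one is penalised by the clash term.
open_loop = ([(0, 0, 0), (4, 0, 0), (8, 0, 0)], [-65.0, -55.0])
collapsed = ([(0, 0, 0), (1, 0, 0), (2, 0, 0)], [-65.0, -55.0])
best = min([open_loop, collapsed], key=lambda c: hybrid_energy(*c))
```

The design point mirrors the abstract: the knowledge-based term tolerates small environmental errors that would dominate a pure force-field score, while the physics-based term keeps high-resolution predictions sharp.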

  20. A function-based approach to cockpit procedure aids

    NASA Technical Reports Server (NTRS)

    Phatak, Anil V.; Jain, Parveen; Palmer, Everett

    1990-01-01

    The objective of this research is to develop and test a cockpit procedural aid that can compose and present procedures that are appropriate for the given flight situation. The procedure would indicate the status of the aircraft engineering systems, and the environmental conditions. Prescribed procedures already exist for normal as well as for a number of non-normal and emergency situations, and can be presented to the crew using an interactive cockpit display. However, no procedures are prescribed or recommended for a host of plausible flight situations involving multiple malfunctions compounded by adverse environmental conditions. Under these circumstances, the cockpit procedural aid must review the prescribed procedures for the individual malfunction (when available), evaluate the alternatives or options, and present one or more composite procedures (prioritized or unprioritized) in response to the given situation. A top-down function-based conceptual approach towards composing and presenting cockpit procedures is being investigated. This approach is based upon the thought process that an operating crew must go through while attempting to meet the flight objectives given the current flight situation. In order to accomplish the flight objectives, certain critical functions must be maintained during each phase of the flight, using the appropriate procedures or success paths. The viability of these procedures depends upon the availability of required resources. If resources available are not sufficient to meet the requirements, alternative procedures (success paths) using the available resources must be constructed to maintain the critical functions and the corresponding objectives. If no success path exists that can satisfy the critical functions/objectives, then the next level of critical functions/objectives must be selected and the process repeated. Information is given in viewgraph form.

  1. Recent developments of the quantum chemical cluster approach for modeling enzyme reactions.

    PubMed

    Siegbahn, Per E M; Himo, Fahmi

    2009-06-01

    The quantum chemical cluster approach for modeling enzyme reactions is reviewed. Recent applications have used cluster models much larger than before which have given new modeling insights. One important and rather surprising feature is the fast convergence with cluster size of the energetics of the reactions. Even for reactions with significant charge separation it has in some cases been possible to obtain full convergence in the sense that dielectric cavity effects from outside the cluster do not contribute to any significant extent. Direct comparisons between quantum mechanics (QM)-only and QM/molecular mechanics (MM) calculations for quite large clusters in a case where the results differ significantly have shown that care has to be taken when using the QM/MM approach where there is strong charge polarization. Insights from the methods used, generally hybrid density functional methods, have also led to possibilities to give reasonable error limits for the results. Examples are finally given from the most extensive study using the cluster model, the one of oxygen formation at the oxygen-evolving complex in photosystem II.

  2. Integrative and systemic approaches for evaluating PPARβ/δ (PPARD) function

    PubMed Central

    Giordano Attianese, Greta MP

    2015-01-01

    The peroxisome proliferator-activated receptors (PPARs) are a group of nuclear receptors that function as transcription factors regulating the expression of genes involved in cellular differentiation, development, metabolism and also tumorigenesis. Three PPAR isotypes (α, β/δ and γ) have been identified, among which PPARβ/δ is the most difficult to functionally examine due to its tissue-specific diversity in cell fate determination, energy metabolism and housekeeping activities. PPARβ/δ acts both in a ligand-dependent and -independent manner. The specific type of regulation, activation or repression, is determined by many factors, among which the type of ligand, the presence/absence of PPARβ/δ-interacting corepressor or coactivator complexes and PPARβ/δ protein post-translational modifications play major roles. Recently, new global approaches to the study of nuclear receptors have made it possible to evaluate their molecular activity in a more systemic fashion, rather than deeply digging into a single pathway/function. This systemic approach is ideally suited for studying PPARβ/δ, due to its ubiquitous expression in various organs and its overlapping and tissue-specific transcriptomic signatures. The aim of the present review is to present in detail the diversity of PPARβ/δ function, focusing on the different information gained at the systemic level, and describing the global and unbiased approaches that combine a systems view with molecular understanding. PMID:25945080

  3. Physical and JIT Model Based Hybrid Modeling Approach for Building Thermal Load Prediction

    NASA Astrophysics Data System (ADS)

    Iino, Yutaka; Murai, Masahiko; Murayama, Dai; Motoyama, Ichiro

Energy conservation in buildings is a key environmental issue, as it is in the industrial, transportation and residential sectors. HVAC (Heating, Ventilating and Air Conditioning) systems account for half of the total energy consumption in a building. In order to realize energy conservation in HVAC systems, a thermal load prediction model for the building is required. This paper proposes a hybrid modeling approach combining a physical model with a Just-in-Time (JIT) model for building thermal load prediction. The proposed method has the following features and benefits: (1) it is applicable to cases in which past operation data for load prediction model learning are scarce; (2) it has a self-checking function that continuously supervises whether the data-driven load prediction and the physics-based one are consistent, so it can detect when something is wrong in the load prediction procedure; (3) it can adjust the load prediction in real time against sudden changes in model parameters and environmental conditions. The proposed method is evaluated with real operation data of an existing building, and the improvement in load prediction performance is illustrated.

  4. Asymptotic correlation functions and FFLO signature for the one-dimensional attractive Hubbard model

    NASA Astrophysics Data System (ADS)

    Cheng, Song; Jiang, Yuzhu; Yu, Yi-Cong; Batchelor, Murray T.; Guan, Xi-Wen

    2018-04-01

    We study the long-distance asymptotic behavior of various correlation functions for the one-dimensional (1D) attractive Hubbard model in a partially polarized phase through the Bethe ansatz and conformal field theory approaches. We particularly find the oscillating behavior of these correlation functions with spatial power-law decay, of which the pair (spin) correlation function oscillates with a frequency ΔkF (2 ΔkF). Here ΔkF = π (n↑ -n↓) is the mismatch in the Fermi surfaces of spin-up and spin-down particles. Consequently, the pair correlation function in momentum space has peaks at the mismatch k = ΔkF, which has been observed in recent numerical work on this model. These singular peaks in momentum space together with the spatial oscillation suggest an analog of the Fulde-Ferrell-Larkin-Ovchinnikov (FFLO) state in the 1D Hubbard model. The parameter β representing the lattice effect becomes prominent in critical exponents which determine the power-law decay of all correlation functions. We point out that the backscattering of unpaired fermions and bound pairs within their own Fermi points gives a microscopic origin of the FFLO pairing in 1D.

  5. Trait-based approaches for understanding microbial biodiversity and ecosystem functioning

    PubMed Central

    Krause, Sascha; Le Roux, Xavier; Niklaus, Pascal A.; Van Bodegom, Peter M.; Lennon, Jay T.; Bertilsson, Stefan; Grossart, Hans-Peter; Philippot, Laurent; Bodelier, Paul L. E.

    2014-01-01

In ecology, biodiversity-ecosystem functioning (BEF) research has seen a shift in perspective from taxonomy to function in the last two decades, with successful application of trait-based approaches. This shift offers opportunities for a deeper mechanistic understanding of the role of biodiversity in maintaining multiple ecosystem processes and services. In this paper, we highlight studies that have focused on BEF of microbial communities with an emphasis on integrating trait-based approaches to microbial ecology. In doing so, we explore some of the inherent challenges and opportunities of understanding BEF using microbial systems. For example, microbial biologists characterize communities using gene phylogenies that are often unable to resolve functional traits. Additionally, experimental designs of existing microbial BEF studies are often inadequate to unravel BEF relationships. We argue that combining eco-physiological studies with contemporary molecular tools in a trait-based framework can reinforce our ability to link microbial diversity to ecosystem processes. We conclude that such trait-based approaches are a promising framework to increase the understanding of microbial BEF relationships and thus generate systematic principles in microbial ecology and, more generally, in ecology. PMID:24904563

  6. Building a reference functional model for EHR systems.

    PubMed

    Sumita, Yuki; Takata, Mami; Ishitsuka, Keiju; Tominaga, Yasuyuki; Ohe, Kazuhiko

    2007-09-01

Our aim was to develop a reference functional model for electronic health record systems (RFM). Such an RFM is built from functions using functional descriptive elements (FDEs) and represents the static relationships between them. This paper presents a new format for describing electronic health record (EHR) system functions. A questionnaire and field interview survey was conducted in five hospitals in Japan and one in the USA to collect data on EHR system functions. Based on the survey results, a reference functional list (RFL) was created, in which each EHR system function was listed and divided into 13 FDE types. By analyzing the RFL, we built the meta-functional model and the functional model using UML class diagrams. The former defines the language for expressing the functional model, while the latter represents functions, FDEs and their static relationships. A total of 385 functions were represented in the RFL. Six patterns were found for the relationships between functions. The meta-functional model was created as a new format for describing functions. Examples of the functional model, which included the six patterns in the relationships between functions and 11 verbs, were created. We present the meta-functional model, which is a new description format for the functional structure and relationships. Although a more detailed description is required to apply the RFM to the semiautomatic generation of functional specification documents, our RFM can visualize functional structures and functional relationships, classify functions using multiple axes and identify the similarities and differences between functions. The RFM will promote not only the standardization of EHR systems, but also communication between system developers and healthcare providers in EHR system-design processes.

  7. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    PubMed Central

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.

    2014-01-01

    The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

  8. A computational modeling approach for the characterization of mechanical properties of 3D alginate tissue scaffolds.

    PubMed

    Nair, K; Yan, K C; Sun, W

    2008-01-01

Scaffold-guided tissue engineering is an innovative approach wherein cells are seeded onto biocompatible and biodegradable materials to form 3-dimensional (3D) constructs that, when implanted in the body, facilitate the regeneration of tissue. Tissue scaffolds act as an artificial extracellular matrix providing an environment conducive to tissue growth. Characterization of scaffold properties is necessary to better understand the underlying processes involved in controlling cell behavior and the formation of functional tissue. We report a computational modeling approach to characterize the mechanical properties of a 3D gel-like biomaterial, specifically a 3D alginate scaffold encapsulated with cells. Alginate's inherent nonlinearity, and variations arising from minute changes in its concentration and viscosity, make experimental evaluation of its mechanical properties a challenging and time-consuming task. We developed an in silico model to determine the stress-strain relationship of alginate-based scaffolds from experimental data. In particular, we compared the Ogden hyperelastic model to other hyperelastic material models and determined that this model was the most suitable for characterizing the nonlinear behavior of alginate. We further propose a mathematical model that represents the alginate material constants in the Ogden model as a function of concentration and viscosity. This study demonstrates the model's capability to predict the mechanical properties of 3D alginate scaffolds.
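For reference, the Ogden model named in this abstract has a simple closed form in the incompressible uniaxial case: the Cauchy stress is a sum over terms mu_p * (lambda**alpha_p - lambda**(-alpha_p / 2)) in the stretch ratio lambda. A sketch with a single illustrative (mu, alpha) pair, not fitted alginate constants:

```python
def ogden_uniaxial_stress(stretch, mu_alpha_pairs):
    """Cauchy stress for an incompressible Ogden material under
    uniaxial loading:
    sigma = sum_p mu_p * (lam**a_p - lam**(-a_p / 2))."""
    lam = stretch
    return sum(mu * (lam ** a - lam ** (-a / 2.0))
               for mu, a in mu_alpha_pairs)

# One-term illustrative parameter set (mu in Pa, alpha dimensionless)
params = [(0.9e3, 4.0)]
sigma = ogden_uniaxial_stress(1.1, params)   # tensile stress at 10% stretch
```

Fitting (mu_p, alpha_p) to measured stress-strain data, and then expressing them as functions of alginate concentration and viscosity, is the step the abstract describes.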

  9. Defining and Applying a Functionality Approach to Intellectual Disability

    ERIC Educational Resources Information Center

    Luckasson, R.; Schalock, R. L.

    2013-01-01

    Background: The current functional models of disability do not adequately incorporate significant changes of the last three decades in our understanding of human functioning, and how the human functioning construct can be applied to clinical functions, professional practices and outcomes evaluation. Methods: The authors synthesise current…

  10. Model free approach to kinetic analysis of real-time hyperpolarized 13C magnetic resonance spectroscopy data.

    PubMed

    Hill, Deborah K; Orton, Matthew R; Mariotti, Erika; Boult, Jessica K R; Panek, Rafal; Jafar, Maysam; Parkes, Harold G; Jamin, Yann; Miniotis, Maria Falck; Al-Saffar, Nada M S; Beloueche-Babari, Mounia; Robinson, Simon P; Leach, Martin O; Chung, Yuen-Li; Eykyn, Thomas R

    2013-01-01

Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formulism based on the ratio of total areas under the curve (AUC) of the injected and product metabolite, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized 13C metabolic imaging in humans, where measurement of the input function can be problematic.

  11. Model Free Approach to Kinetic Analysis of Real-Time Hyperpolarized 13C Magnetic Resonance Spectroscopy Data

    PubMed Central

    Mariotti, Erika; Boult, Jessica K. R.; Panek, Rafal; Jafar, Maysam; Parkes, Harold G.; Jamin, Yann; Miniotis, Maria Falck; Al-Saffar, Nada M. S.; Beloueche-Babari, Mounia; Robinson, Simon P.; Leach, Martin O.; Chung, Yuen-Li; Eykyn, Thomas R.

    2013-01-01

    Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formulism based on the ratio of total areas under the curve (AUC) of the injected and product metabolite, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized 13C metabolic imaging in humans, where measurement of the input function can be problematic. PMID:24023724
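The model-free metric described in these two records reduces to numerical integration: AUC(product) / AUC(substrate), e.g. lactate over pyruvate, which the authors show is proportional to the forward rate constant k. A sketch with trapezoidal integration on toy time courses (arbitrary units, not experimental data):

```python
def trapezoid_auc(times, values):
    """Area under a sampled curve by the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2.0
               for t0, t1, v0, v1 in zip(times, times[1:], values, values[1:]))

def auc_ratio(times, product, substrate):
    """Model-free metric of the abstract: AUC(product) / AUC(substrate),
    requiring no compartmental model or input-function measurement."""
    return trapezoid_auc(times, product) / trapezoid_auc(times, substrate)

# Toy hyperpolarized time courses (illustrative numbers only)
t = [0, 1, 2, 3, 4, 5]
pyruvate = [10.0, 7.0, 5.0, 3.5, 2.5, 1.8]
lactate = [0.0, 2.0, 3.2, 3.8, 3.9, 3.7]
ratio = auc_ratio(t, lactate, pyruvate)
```

Because both AUCs scale identically with the injected dose and polarization, their ratio cancels the input function, which is the robustness property the abstract emphasizes.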

  12. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on Multi-Agent approach often use directly translated, and quantitatively less precise if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2×10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
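The hybridization described here (cells as discrete rule-following agents, molecules as continuous quantities governed by differential equations) can be sketched in one dimension: the chemoattractant field evolves by a forward-Euler step of a diffusion-emission-decay equation, while each cell agent applies an if-then rule to climb the local gradient. The grid, rates and rule below are illustrative assumptions, not the paper's chemotaxis model.

```python
def step(cells, chemo, source_bin=4, emit=1.0, decay=0.2, diff=1.0, dt=0.1):
    """One hybrid update: continuous part first (discretised
    diffusion + point-source emission - first-order decay), then the
    discrete agent rule (move toward the richer neighbouring bin)."""
    n = len(chemo)
    pad = [0.0] + chemo + [0.0]                 # absorbing boundaries
    chemo = [q + dt * (diff * (pad[i] - 2 * q + pad[i + 2])
                       + (emit if i == source_bin else 0.0)
                       - decay * q)
             for i, q in enumerate(chemo)]
    moved = []
    for c in cells:
        left = chemo[c - 1] if c > 0 else -1.0
        right = chemo[c + 1] if c < n - 1 else -1.0
        if max(left, right) > chemo[c]:
            c = c - 1 if left > right else c + 1
        moved.append(c)
    return moved, chemo

# Cells start at the edges; the molecular field draws them to bin 4.
cells, chemo = [0, 8], [0.0] * 9
for _ in range(60):
    cells, chemo = step(cells, chemo)
```

The balance the abstract mentions shows up even here: the molecule field costs O(grid) arithmetic per step regardless of molecule count, while each cell stays an individually addressable agent.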

  13. Electrification Futures Study Modeling Approach | Energy Analysis | NREL

    Science.gov Websites

    Electrification Futures Study Modeling Approach: To quantitatively answer the research questions of the Electrification Futures Study, researchers will use multiple models, accounting for infrastructure inertia through stock turnover.

  14. Neural network approach to quantum-chemistry data: accurate prediction of density functional theory energies.

    PubMed

    Balabin, Roman M; Lomakina, Ekaterina I

    2009-08-21

    An artificial neural network (ANN) approach has been applied to estimate the density functional theory (DFT) energy with a large basis set using lower-level energy values and molecular descriptors. A total of 208 different molecules were used for the ANN training, cross validation, and testing by applying the BLYP, B3LYP, and BMK density functionals. Hartree-Fock results were reported for comparison. Furthermore, constitutional molecular descriptors (CDs) and quantum-chemical molecular descriptors (QDs) were used for building the calibration model. The neural network structure optimization, leading to four to five hidden neurons, was also carried out. The use of several low-level energy values was found to greatly reduce the prediction error. An expected error, mean absolute deviation, for the ANN approximation to DFT energies was 0.6 ± 0.2 kcal mol⁻¹. In addition, a comparison of the different density functionals with the basis sets and a comparison with multiple linear regression results were also provided. The CDs were found to overcome the limitations of the QDs. Furthermore, an effective ANN model for DFT/6-311G(3df,3pd) and DFT/6-311G(2df,2pd) energy estimation was developed, and benchmark results were provided.
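    The calibration idea, low-level energy plus descriptors in, high-level energy out, can be sketched with a small hand-rolled network. The data and target mapping below are synthetic inventions; only the shape of the approach (a few inputs, a handful of hidden tanh neurons, batch gradient descent) mirrors the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: a cheap "low-level" energy plus two molecular
# descriptors, mapped to a "high-level" energy by an invented smooth function.
X = rng.normal(size=(200, 3))
y = X[:, 0] + 0.3 * np.tanh(X[:, 1]) - 0.1 * X[:, 2] ** 2

# One hidden layer with 4 neurons (within the study's reported 4-5 range).
W1 = rng.normal(scale=0.5, size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=4);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.05
for _ in range(2000):                    # plain batch gradient descent on MSE
    pred, h = forward(X)
    err = pred - y
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    gW1 = X.T @ gh / len(X)
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mae = np.abs(forward(X)[0] - y).mean()
print("training MAE:", mae)
```

    In practice one would also hold out molecules for cross validation and testing, as the study does, rather than report training error alone.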

  15. Ligand-guided optimization of CXCR4 homology models for virtual screening using a multiple chemotype approach

    NASA Astrophysics Data System (ADS)

    Neves, Marco A. C.; Simões, Sérgio; Sá e Melo, M. Luisa

    2010-12-01

    CXCR4 is a G-protein-coupled receptor for CXCL12 that plays an important role in human immunodeficiency virus infection, cancer growth and metastasis, immune cell trafficking, and WHIM syndrome. In the absence of an X-ray crystal structure, theoretical modeling of the CXCR4 receptor remains an important tool for structure-function analysis and to guide the discovery of new antagonists with potential clinical use. In this study, the combination of experimental data and molecular modeling approaches allowed the development of optimized ligand-receptor models useful for elucidation of the molecular determinants of small molecule binding and functional antagonism. The ligand-guided homology modeling approach used in this study explicitly re-shaped the CXCR4 binding pocket in order to improve discrimination between known CXCR4 antagonists and random decoys. Refinement based on multiple test sets with small compounds from single chemotypes provided the best early enrichment performance. These results provide an important tool for structure-based drug design and virtual ligand screening of new CXCR4 antagonists.

  16. Physical models have gender-specific effects on student understanding of protein structure-function relationships.

    PubMed

    Forbes-Lorman, Robin M; Harris, Michelle A; Chang, Wesley S; Dent, Erik W; Nordheim, Erik V; Franzen, Margaret A

    2016-07-08

    Understanding how basic structural units influence function is identified as a foundational/core concept for undergraduate biological and biochemical literacy. It is essential for students to understand this concept at all size scales, but it is often more difficult for students to understand structure-function relationships at the molecular level, which they cannot visualize as effectively. Students need to develop accurate, 3-dimensional mental models of biomolecules to understand how biomolecular structure affects cellular functions at the molecular level, yet most traditional curricular tools such as textbooks include only 2-dimensional representations. We used a controlled, backward design approach to investigate how hand-held physical molecular model use affected students' ability to logically predict structure-function relationships. Brief (one class period) physical model use increased quiz scores for females, whereas there was no significant increase for males using physical models. Females also self-reported higher learning gains in their understanding of context-specific protein function. Gender differences in spatial visualization may explain the gender-specific benefits of physical model use observed. © 2016 The Authors. Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of the International Union of Biochemistry and Molecular Biology, 44(4):326-335, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.

  17. Functionalized anatomical models for EM-neuron Interaction modeling

    NASA Astrophysics Data System (ADS)

    Neufeld, Esra; Cassará, Antonino Mario; Montanaro, Hazael; Kuster, Niels; Kainz, Wolfgang

    2016-06-01

    The understanding of interactions between electromagnetic (EM) fields and nerves is crucial in contexts ranging from therapeutic neurostimulation to low frequency EM exposure safety. To properly consider the impact of in vivo induced field inhomogeneity on non-linear neuronal dynamics, coupled EM-neuronal dynamics modeling is required. For that purpose, novel functionalized computable human phantoms have been developed. Their implementation and the systematic verification of the integrated anisotropic quasi-static EM solver and neuronal dynamics modeling functionality, based on the method of manufactured solutions and numerical reference data, are described. Electric and magnetic stimulation of the ulnar and sciatic nerves was modeled to help resolve a range of controversial issues related to the magnitude and optimal determination of strength-duration (SD) time constants. The results indicate the importance of considering the stimulation-specific inhomogeneous field distributions (especially at tissue interfaces), realistic models of non-linear neuronal dynamics, very short pulses, and suitable SD extrapolation models. These results and the functionalized computable phantoms will influence and support the development of safe and effective neuroprosthetic devices and novel electroceuticals. Furthermore, they will assist the evaluation of existing low frequency exposure standards for the entire population under all exposure conditions.

  18. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Treesearch

    John S. Hogland; Nathaniel M. Anderson; J .Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  19. A Unified Approach to Model-Based Planning and Execution

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Norvig, Peter (Technical Monitor)

    2000-01-01

    Writing autonomous software is complex, requiring the coordination of functionally and technologically diverse software modules. System and mission engineers must rely on specialists familiar with the different software modules to translate requirements into application software. Also, each module often encodes the same requirement in different forms. The results are high costs and reduced reliability due to the difficulty of tracking discrepancies in these encodings. In this paper we describe a unified approach to planning and execution that we believe provides a single representational and computational framework for an autonomous agent. We identify the four main components whose interplay provides the basis for the agent's autonomous behavior: the domain model, the plan database, the plan running module, and the planner modules. This representational and problem solving approach can be applied at all levels of the architecture of a complex agent, such as Remote Agent. In the rest of the paper we briefly describe the Remote Agent architecture. The new agent architecture proposed here aims at achieving the full Remote Agent functionality. We then give the fundamental ideas behind the new agent architecture and point out some implications of the structure of the architecture, mainly in the area of reactivity and the interaction between reactive and deliberative decision making. We conclude with related work and current status.

  20. Trait Approach and Avoidance Motivation: Lateralized Neural Activity Associated with Executive Function

    PubMed Central

    Spielberg, Jeffrey M.; Miller, Gregory A.; Engels, Anna S.; Herrington, John D.; Sutton, Bradley P.; Banich, Marie T.; Heller, Wendy

    2010-01-01

    Motivation and executive function are both necessary for the completion of goal-directed behavior. Research investigating the manner in which these processes interact is beginning to emerge and has implicated middle frontal gyrus (MFG) as a site of interaction for relevant neural mechanisms. However, this research has focused on state motivation, and it has not examined functional lateralization. The present study examined the impact of trait levels of approach and avoidance motivation on neural processes associated with executive function. Functional magnetic resonance imaging was conducted while participants performed a color-word Stroop task. Analyses identified brain regions in which trait approach and avoidance motivation (measured by questionnaires) moderated activation associated with executive control. Approach was hypothesized to be associated with left-lateralized MFG activation, whereas avoidance was hypothesized to be associated with right-lateralized MFG activation. Results supported both hypotheses. Present findings implicate areas of middle frontal gyrus in top-down control to guide behavior in accordance with motivational goals. PMID:20728552

  1. A Goal Oriented Approach for Modeling and Analyzing Security Trade-Offs

    NASA Astrophysics Data System (ADS)

    Elahi, Golnaz; Yu, Eric

    In designing software systems, security is typically only one design objective among many. It may compete with other objectives such as functionality, usability, and performance. Too often, security mechanisms such as firewalls, access control, or encryption are adopted without explicit recognition of competing design objectives and their origins in stakeholder interests. Recently, there is increasing acknowledgement that security is ultimately about trade-offs. One can only aim for "good enough" security, given the competing demands from many parties. In this paper, we examine how conceptual modeling can provide explicit and systematic support for analyzing security trade-offs. After considering the desirable criteria for conceptual modeling methods, we examine several existing approaches for dealing with security trade-offs. From analyzing the limitations of existing methods, we propose an extension to the i* framework for security trade-off analysis, taking advantage of its multi-agent and goal orientation. The method was applied to several case studies used to exemplify existing approaches.

  2. The negotiated equilibrium model of spinal cord function.

    PubMed

    Wolpaw, Jonathan R

    2018-04-16

    The belief that the spinal cord is hardwired is no longer tenable. Like the rest of the CNS, the spinal cord changes during growth and aging, when new motor behaviours are acquired, and in response to trauma and disease. This paper describes a new model of spinal cord function that reconciles its recently appreciated plasticity with its long recognized reliability as the final common pathway for behaviour. According to this model, the substrate of each motor behaviour comprises brain and spinal plasticity: the plasticity in the brain induces and maintains the plasticity in the spinal cord. Each time a behaviour occurs, the spinal cord provides the brain with performance information that guides changes in the substrate of the behaviour. All the behaviours in the repertoire undergo this process concurrently; each repeatedly induces plasticity to preserve its key features despite the plasticity induced by other behaviours. The aggregate process is a negotiation among the behaviours: they negotiate the properties of the spinal neurons and synapses that they all use. The ongoing negotiation maintains the spinal cord in an equilibrium - a negotiated equilibrium - that serves all the behaviours. This new model of spinal cord function is supported by laboratory and clinical data, makes predictions borne out by experiment, and underlies a new approach to restoring function to people with neuromuscular disorders. Further studies are needed to test its generality, to determine whether it may apply to other CNS areas such as the cerebral cortex, and to develop its therapeutic implications. This article is protected by copyright. All rights reserved.

  3. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand suitable approaches to the development and validation of such models. A qualitative review of the aims, methods, and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
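    As a sketch of the statistical side of such a model, and of the development/validation split the review recommends, here is a minimal logistic-regression risk model on synthetic data (cohort size, predictors, and coefficients are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic cohort: two risk factors and a binary outcome generated from
# a known logistic model (purely illustrative).
n = 1000
X = rng.normal(size=(n, 2))
true_beta = np.array([1.2, -0.8])
p = 1 / (1 + np.exp(-(X @ true_beta - 0.5)))
y = rng.binomial(1, p)

# Split into development and validation sets.
X_dev, y_dev = X[:700], y[:700]
X_val, y_val = X[700:], y[700:]

# Fit logistic regression by gradient ascent on the log-likelihood.
beta = np.zeros(2); intercept = 0.0
for _ in range(3000):
    pred = 1 / (1 + np.exp(-(X_dev @ beta + intercept)))
    beta += 0.5 * (X_dev.T @ (y_dev - pred) / len(y_dev))
    intercept += 0.5 * (y_dev - pred).mean()

def c_statistic(scores, outcomes):
    """Discrimination: probability a random event outscores a random non-event."""
    pos, neg = scores[outcomes == 1], scores[outcomes == 0]
    return (pos[:, None] > neg[None, :]).mean()

val_scores = X_val @ beta + intercept
print("validation c-statistic:", c_statistic(val_scores, y_val))
```

    An ANN-based risk model would replace the linear score with a learned nonlinear one, but the development/validation workflow, and the discrimination metric, stay the same.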

  4. A Bio-Inspired Model-Based Approach for Context-Aware Post-WIMP Tele-Rehabilitation.

    PubMed

    López-Jaquero, Víctor; Rodríguez, Arturo C; Teruel, Miguel A; Montero, Francisco; Navarro, Elena; Gonzalez, Pascual

    2016-10-13

    Tele-rehabilitation is one of the main domains where Information and Communication Technologies (ICT) have proven useful for moving healthcare from care centers to patients' homes. Patients, especially those carrying out physical therapy, cannot use a traditional Window, Icon, Menu, Pointer (WIMP) system; they need to interact in a natural way, that is, there is a need to move from WIMP systems to Post-WIMP ones. Moreover, tele-rehabilitation systems should be developed following the context-aware approach, so that they are able to adapt to the patients' context to provide them with usable and effective therapies. In this work a model-based approach is presented to assist stakeholders in the development of context-aware Post-WIMP tele-rehabilitation systems. It entails three different models: (i) a task model for designing the rehabilitation tasks; (ii) a context model to facilitate the adaptation of these tasks to the context; and (iii) a bio-inspired presentation model to specify thoroughly how such tasks should be performed by the patients. Our proposal overcomes one of the limitations of the model-based approach for the development of context-aware systems by supporting the specification of non-functional requirements. Finally, a case study is used to illustrate how this proposal can be put into practice to design a real world rehabilitation task.

  5. Modelling the multidimensional niche by linking functional traits to competitive performance

    PubMed Central

    Maynard, Daniel S.; Leonard, Kenneth E.; Drake, John M.; Hall, David W.; Crowther, Thomas W.; Bradford, Mark A.

    2015-01-01

    Linking competitive outcomes to environmental conditions is necessary for understanding species' distributions and responses to environmental change. Despite this importance, generalizable approaches for predicting competitive outcomes across abiotic gradients are lacking, driven largely by the highly complex and context-dependent nature of biotic interactions. Here, we present and empirically test a novel niche model that uses functional traits to model the niche space of organisms and predict competitive outcomes of co-occurring populations across multiple resource gradients. The model makes no assumptions about the underlying mode of competition and instead applies to those settings where relative competitive ability across environments correlates with a quantifiable performance metric. To test the model, a series of controlled microcosm experiments were conducted using genetically related strains of a widespread microbe. The model identified trait microevolution and performance differences among strains, with the predicted competitive ability of each organism mapped across a two-dimensional carbon and nitrogen resource space. Areas of coexistence and competitive dominance between strains were identified, and the predicted competitive outcomes were validated in approximately 95% of the pairings. By linking trait variation to competitive ability, our work demonstrates a generalizable approach for predicting and modelling competitive outcomes across changing environmental contexts. PMID:26136444

  6. Pesticide fate at regional scale: Development of an integrated model approach and application

    NASA Astrophysics Data System (ADS)

    Herbst, M.; Hardelauf, H.; Harms, R.; Vanderborght, J.; Vereecken, H.

    As a result of agricultural practice many soils and aquifers are contaminated with pesticides. In order to quantify the side-effects of these anthropogenic impacts on groundwater quality at regional scale, a process-based, integrated model approach was developed. The Richards’ equation based numerical model TRACE calculates the three-dimensional saturated/unsaturated water flow. For the modeling of regional scale pesticide transport we linked TRACE with the plant module SUCROS and with 3DLEWASTE, a hybrid Lagrangian/Eulerian approach to solve the convection/dispersion equation. We used measurements, standard methods like pedotransfer functions, or parameters from literature to derive the model input for the process model. A first-step application of TRACE/3DLEWASTE to the 20 km² test area ‘Zwischenscholle’ for the period 1983-1993 reveals the behaviour of the pesticide isoproturon. The selected test area is characterised by an intense agricultural use and shallow groundwater, resulting in a high vulnerability of the groundwater to pesticide contamination. The model results stress the importance of the unsaturated zone for the occurrence of pesticides in groundwater. Remarkable isoproturon concentrations in groundwater are predicted for locations with thin layered and permeable soils. For four selected locations we used measured piezometric heads to validate predicted groundwater levels. In general, the model results are consistent and reasonable. Thus the developed integrated model approach is seen as a promising tool for the quantification of the agricultural practice impact on groundwater quality.
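    The transport component can be illustrated in one dimension: 3DLEWASTE itself is three-dimensional and Lagrangian/Eulerian, but a simple explicit finite-difference sketch of the same convection-dispersion equation (all grid and parameter values below are hypothetical) shows the basic behaviour, a solute pulse advecting at the pore velocity while dispersing:

```python
import numpy as np

# 1-D convection-dispersion equation dc/dt = D d2c/dx2 - v dc/dx,
# explicit scheme: central difference for dispersion, upwind for convection.
nx, dx = 200, 0.1            # grid: 200 nodes, 0.1 m spacing
v, D = 0.05, 0.01            # pore velocity (m/d), dispersion coeff (m2/d)
dt = 0.05                    # d, satisfies both CFL and diffusion stability limits

c = np.zeros(nx)
c[10:20] = 1.0               # initial solute pulse near the inlet

def step(c):
    new = c.copy()           # boundary nodes held fixed
    new[1:-1] = (c[1:-1]
                 + dt * D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx ** 2
                 - dt * v * (c[1:-1] - c[:-2]) / dx)
    return new

x = np.arange(nx) * dx
centre0 = (x * c).sum() / c.sum()
for _ in range(2000):        # simulate 100 days
    c = step(c)
centre = (x * c).sum() / c.sum()
print(centre0, centre)       # the pulse centre advects downstream at ~v
```

    In the integrated model this transport step is driven by the flow field from the Richards’ equation solver rather than by a constant velocity.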

  7. Zebrafish models for the functional genomics of neurogenetic disorders.

    PubMed

    Kabashi, Edor; Brustein, Edna; Champagne, Nathalie; Drapeau, Pierre

    2011-03-01

    In this review, we consider recent work using zebrafish to validate and study the functional consequences of mutations of human genes implicated in a broad range of degenerative and developmental disorders of the brain and spinal cord. Also we present technical considerations for those wishing to study their own genes of interest by taking advantage of this easily manipulated and clinically relevant model organism. Zebrafish permit mutational analyses of genetic function (gain or loss of function) and the rapid validation of human variants as pathological mutations. In particular, neural degeneration can be characterized at genetic, cellular, functional, and behavioral levels. Zebrafish have been used to knock down or express mutations in zebrafish homologs of human genes and to directly express human genes bearing mutations related to neurodegenerative disorders such as spinal muscular atrophy, ataxia, hereditary spastic paraplegia, amyotrophic lateral sclerosis (ALS), epilepsy, Huntington's disease, Parkinson's disease, fronto-temporal dementia, and Alzheimer's disease. More recently, we have been using zebrafish to validate mutations of synaptic genes discovered by large-scale genomic approaches in developmental disorders such as autism, schizophrenia, and non-syndromic mental retardation. Advances in zebrafish genetics such as multigenic analyses and chemical genetics now offer a unique potential for disease research. Thus, zebrafish hold much promise for advancing the functional genomics of human diseases, the understanding of the genetics and cell biology of degenerative and developmental disorders, and the discovery of therapeutics. This article is part of a Special Issue entitled Zebrafish Models of Neurological Diseases. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Models of Neuronal Stimulus-Response Functions: Elaboration, Estimation, and Evaluation

    PubMed Central

    Meyer, Arne F.; Williamson, Ross S.; Linden, Jennifer F.; Sahani, Maneesh

    2017-01-01

    Rich, dynamic, and dense sensory stimuli are encoded within the nervous system by the time-varying activity of many individual neurons. A fundamental approach to understanding the nature of the encoded representation is to characterize the function that relates the moment-by-moment firing of a neuron to the recent history of a complex sensory input. This review provides a unifying and critical survey of the techniques that have been brought to bear on this effort thus far—ranging from the classical linear receptive field model to modern approaches incorporating normalization and other nonlinearities. We address separately the structure of the models; the criteria and algorithms used to identify the model parameters; and the role of regularizing terms or “priors.” In each case we consider benefits or drawbacks of various proposals, providing examples for when these methods work and when they may fail. Emphasis is placed on key concepts rather than mathematical details, so as to make the discussion accessible to readers from outside the field. Finally, we review ways in which the agreement between an assumed model and the neuron's response may be quantified. Re-implemented and unified code for many of the methods is made freely available. PMID:28127278
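    For readers outside the field, the classical linear receptive field model with a Gaussian prior (ridge regularization) fits in a few lines; the simulated neuron, filter shape, and noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a neuron whose response depends linearly on the recent stimulus
# history through a known temporal filter, plus noise.
T, lags = 5000, 15
stim = rng.normal(size=T)
true_filter = np.exp(-np.arange(lags) / 4.0) * np.sin(np.arange(lags) / 2.0)

# Design matrix of lagged stimulus values (one column per time lag).
X = np.stack([np.concatenate([np.zeros(k), stim[:T - k]]) for k in range(lags)],
             axis=1)
response = X @ true_filter + 0.5 * rng.normal(size=T)

# Ridge-regularized estimate of the receptive field.
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(lags), X.T @ response)

corr = np.corrcoef(w, true_filter)[0, 1]
print("filter recovery correlation:", corr)
```

    The nonlinear models the review surveys (normalization, spike-history terms, multi-filter models) extend this same regression scaffold rather than replace it.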

  9. Uncovering Local Trends in Genetic Effects of Multiple Phenotypes via Functional Linear Models.

    PubMed

    Vsevolozhskaya, Olga A; Zaykin, Dmitri V; Barondess, David A; Tong, Xiaoren; Jadhav, Sneha; Lu, Qing

    2016-04-01

    Recent technological advances equipped researchers with capabilities that go beyond traditional genotyping of loci known to be polymorphic in a general population. Genetic sequences of study participants can now be assessed directly. This capability removed technology-driven bias toward scoring predominantly common polymorphisms and let researchers reveal a wealth of rare and sample-specific variants. Although the relative contributions of rare and common polymorphisms to trait variation are being debated, researchers are faced with the need for new statistical tools for simultaneous evaluation of all variants within a region. Several research groups demonstrated flexibility and good statistical power of the functional linear model approach. In this work we extend previous developments to allow inclusion of multiple traits and adjustment for additional covariates. Our functional approach is unique in that it provides a nuanced depiction of effects and interactions for the variables in the model by representing them as curves varying over a genetic region. We demonstrate flexibility and competitive power of our approach by contrasting its performance with commonly used statistical tools and illustrate its potential for discovery and characterization of genetic architecture of complex traits using sequencing data from the Dallas Heart Study. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  10. Teaching Mathematical Functions Using Geometric Functions Approach and Its Effect on Ninth Grade Students' Motivation

    ERIC Educational Resources Information Center

    Akçakin, Veysel

    2018-01-01

    The purpose of this study is to investigate the effects of using geometric functions approach on 9th grade students' motivation levels toward mathematics in functions unit. Participants of this study were 87 students who were ongoing in the first year of high school in Turkey. In this research, pretest and posttest control group quasiexperimental…

  11. Role of theory of mind and executive function in explaining social intelligence: a structural equation modeling approach.

    PubMed

    Yeh, Zai-Ting

    2013-01-01

    Social intelligence is the ability to understand others and the social context effectively and thus to interact with people successfully. Research has suggested that the theory of mind (ToM) and executive function may play important roles in explaining social intelligence. The specific aim of the present study was to test with structural equation modeling (SEM) the hypothesis that performance on ToM tasks is more associated with social intelligence in the elderly than is performance on executive functions. One hundred and seventy-seven participants (age 56-96) completed ToM, executive function, and other basic cognition tasks, and were rated with social intelligence scales. The SEM results showed that ToM and executive function were strongly correlated (0.54); however, only the path coefficient from ToM to social intelligence, and not from executive function, was significant (0.37). ToM performance, but not executive function, was strongly correlated with social intelligence among elderly individuals. ToM and executive function might play different roles in social behavior during normal aging; however, based on the present results, it is possible that ToM might play an important role in social intelligence.

  12. Measurement of Function Post Hip Fracture: Testing a Comprehensive Measurement Model of Physical Function

    PubMed Central

    Gruber-Baldini, Ann L.; Hicks, Gregory; Ostir, Glen; Klinedinst, N. Jennifer; Orwig, Denise; Magaziner, Jay

    2015-01-01

    Background: Measurement of physical function post hip fracture has been conceptualized using multiple different measures. Purpose: This study tested a comprehensive measurement model of physical function. Design: This was a descriptive secondary data analysis including 168 men and 171 women post hip fracture. Methods: Using structural equation modeling, a measurement model of physical function which included grip strength, activities of daily living, instrumental activities of daily living, and performance was tested for fit at 2 and 12 months post hip fracture and among male and female participants, and the validity of the measurement model of physical function was evaluated based on how well the model explained physical activity, exercise, and social activities post hip fracture. Findings: The measurement model of physical function fit the data. The amount of variance the model or individual factors of the model explained varied depending on the activity. Conclusion: Decisions about the ideal way in which to measure physical function should be based on the outcomes considered and the participant. Clinical Implications: The measurement model of physical function is a reliable and valid method to comprehensively measure physical function across the hip fracture recovery trajectory. Practical but useful assessment of function should be considered and monitored over the recovery trajectory post hip fracture. PMID:26492866

  13. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    PubMed

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. 
Although the intent of the HGM approach is to use level of functioning as a

  14. Studying dyadic structure-function relationships: a review of current modeling approaches and new insights into Ca2+ (mis)handling.

    PubMed

    Maleckar, Mary M; Edwards, Andrew G; Louch, William E; Lines, Glenn T

    2017-01-01

    Excitation-contraction coupling in cardiac myocytes requires calcium influx through L-type calcium channels in the sarcolemma, which gates calcium release through sarcoplasmic reticulum ryanodine receptors in a process known as calcium-induced calcium release, producing a myoplasmic calcium transient and enabling cardiomyocyte contraction. The spatio-temporal dynamics of calcium release, buffering, and reuptake into the sarcoplasmic reticulum play a central role in excitation-contraction coupling in both normal and diseased cardiac myocytes. However, further quantitative understanding of these cells' calcium machinery and the study of mechanisms that underlie both normal cardiac function and calcium-dependent etiologies in heart disease requires accurate knowledge of cardiac ultrastructure, protein distribution and subcellular function. As current imaging techniques are limited in spatial resolution, limiting insight into changes in calcium handling, computational models of excitation-contraction coupling have been increasingly employed to probe these structure-function relationships. This review will focus on the development of structural models of cardiac calcium dynamics at the subcellular level, orienting the reader broadly towards the development of models of subcellular calcium handling in cardiomyocytes. Specific focus will be given to progress in recent years in terms of multi-scale modeling employing resolved spatial models of subcellular calcium machinery. A review of the state-of-the-art will be followed by a review of emergent insights into calcium-dependent etiologies in heart disease and, finally, we will offer a perspective on future directions for related computational modeling and simulation efforts.

  15. Pharmacokinetic/Pharmacodynamic Modeling and Simulation of Cefiderocol, a Parenteral Siderophore Cephalosporin, for Dose Adjustment Based on Renal Function.

    PubMed

    Katsube, Takayuki; Wajima, Toshihiro; Ishibashi, Toru; Arjona Ferreira, Juan Camilo; Echols, Roger

    2017-01-01

    Cefiderocol, a novel parenteral siderophore cephalosporin, exhibits potent efficacy against most Gram-negative bacteria, including carbapenem-resistant strains. Since cefiderocol is excreted primarily via the kidneys, this study was conducted to develop a population pharmacokinetics (PK) model to determine dose adjustment based on renal function. Population PK models were developed based on data for cefiderocol concentrations in plasma, urine, and dialysate with a nonlinear mixed-effects model approach. Monte-Carlo simulations were conducted to calculate the probability of target attainment (PTA) for the fraction of time during the dosing interval that the free drug concentration in plasma exceeds the MIC (%fT>MIC) over an MIC range of 0.25 to 16 μg/ml. For the simulations, dose regimens were selected to compare cefiderocol exposure among groups with different levels of renal function. The developed models described the PK of cefiderocol well for each renal function group. A dose of 2 g every 8 h with 3-h infusions provided >90% PTA for 75% fT>MIC for an MIC of ≤4 μg/ml for patients with normal renal function, while a more frequent dose (every 6 h) could be used for patients with augmented renal function. A reduced dose and/or extended dosing interval was selected for patients with impaired renal function. A supplemental dose immediately after intermittent hemodialysis was proposed for patients requiring intermittent hemodialysis. The PK of cefiderocol could be adequately modeled, and the modeling-and-simulation approach suggested dose regimens based on renal function, ensuring drug exposure with adequate bactericidal effect. Copyright © 2016 American Society for Microbiology.
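
    The Monte-Carlo PTA calculation can be sketched with a one-compartment steady-state infusion model. The PK parameters, variability, and protein binding below are invented for the sketch and are not the published cefiderocol population model:

```python
import math
import random

def pta_ft_mic(mic, dose=2000.0, tau=8.0, t_inf=3.0, fu=0.4,
               cl_mean=4.0, cl_cv=0.3, v=18.0, n=200, target=0.75, seed=1):
    """Monte-Carlo probability of target attainment for %fT>MIC under a
    one-compartment IV-infusion model at steady state. All PK values
    (clearance, volume, free fraction) are illustrative placeholders."""
    rng = random.Random(seed)
    steps, past_doses = 120, 12
    hits = 0
    for _ in range(n):
        cl = cl_mean * math.exp(rng.gauss(0.0, cl_cv))  # between-subject CL
        ke = cl / v
        r0 = dose / t_inf                               # infusion rate, mg/h
        above = 0
        for i in range(steps):
            t = tau * i / steps
            c = 0.0
            for k in range(past_doses):                 # superpose past doses
                ts = t + k * tau                        # time since that dose
                if ts < t_inf:
                    c += (r0 / cl) * (1.0 - math.exp(-ke * ts))
                else:
                    c += ((r0 / cl) * (1.0 - math.exp(-ke * t_inf))
                          * math.exp(-ke * (ts - t_inf)))
            if fu * c > mic:
                above += 1
        if above / steps >= target:
            hits += 1
    return hits / n

pta_low = pta_ft_mic(4.0)    # a susceptible MIC
pta_high = pta_ft_mic(64.0)  # far above the achievable free exposure
```

    Sweeping `mic` over the 0.25 to 16 μg/ml range and repeating per renal-function group (different `cl_mean`) reproduces the shape of the analysis, not its numbers.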

  16. Computational models of basal-ganglia pathway functions: focus on functional neuroanatomy

    PubMed Central

    Schroll, Henning; Hamker, Fred H.

    2013-01-01

    Over the past 15 years, computational models have had a considerable impact on basal-ganglia research. Most of these models implement multiple distinct basal-ganglia pathways and assume them to fulfill different functions. As there is now a multitude of different models, it has become difficult to keep track of their various, sometimes only marginally different assumptions on pathway functions. Moreover, it has become a challenge to assess the extent to which individual assumptions are corroborated or challenged by empirical data. Focusing on computational, but also considering non-computational, models, we review influential concepts of pathway functions and show to what extent they are compatible with or contradict each other. Moreover, we outline how empirical evidence favors or challenges specific model assumptions and propose experiments that allow testing assumptions against each other. PMID:24416002

  17. Functional mixture regression.

    PubMed

    Yao, Fang; Fu, Yuejiao; Lee, Thomas C M

    2011-04-01

    In functional linear models (FLMs), the relationship between the scalar response and the functional predictor process is often assumed to be identical for all subjects. Motivated by both practical and methodological considerations, we relax this assumption and propose a new class of functional regression models that allow the regression structure to vary for different groups of subjects. By projecting the predictor process onto its eigenspace, the new functional regression model is simplified to a framework that is similar to classical mixture regression models. This leads to the proposed approach, termed functional mixture regression (FMR). The estimation of FMR can be readily carried out using existing software implemented for functional principal component analysis and mixture regression. The practical necessity and performance of FMR are illustrated through applications to a longevity analysis of female medflies and a human growth study. Theoretical investigations concerning the consistent estimation and prediction properties of FMR, along with simulation experiments illustrating its empirical properties, are presented in the supplementary material available at Biostatistics online. Corresponding results demonstrate that the proposed approach could potentially achieve substantial gains over traditional FLMs.
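
    Once the functional predictor is projected onto its leading eigenfunction, the mixture step reduces to a classical mixture of linear regressions, which can be fit by EM. The sketch below works on a single hypothetical FPC score, with synthetic data and a deliberately asymmetric initialization:

```python
import math
import random

def em_mixture_regression(x, y, iters=100):
    """EM for a two-component mixture of simple linear regressions, i.e.
    the classical-mixture step of FMR once the functional predictor has
    been reduced to an FPC score x. Toy sketch, not the paper's software."""
    n = len(x)
    pi = [0.5, 0.5]
    beta = [(0.0, 1.0), (0.0, -1.0)]   # deliberately asymmetric start
    sigma2 = [1.0, 1.0]
    for _ in range(iters):
        # E-step: posterior component probability for each subject
        resp = []
        for xi, yi in zip(x, y):
            dens = []
            for k in range(2):
                b0, b1 = beta[k]
                q = (yi - b0 - b1 * xi) ** 2 / (2.0 * sigma2[k])
                dens.append(pi[k] * math.exp(-q) / math.sqrt(2 * math.pi * sigma2[k]))
            s = sum(dens) or 1e-300
            resp.append([d / s for d in dens])
        # M-step: weighted least squares per component
        for k in range(2):
            w = [r[k] for r in resp]
            sw = sum(w)
            xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
            yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
            sxx = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
            sxy = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
            b1 = sxy / sxx
            b0 = yb - b1 * xb
            beta[k] = (b0, b1)
            sigma2[k] = max(1e-6, sum(wi * (yi - b0 - b1 * xi) ** 2
                                      for wi, xi, yi in zip(w, x, y)) / sw)
            pi[k] = sw / n
    return beta, pi

# two latent groups whose regression slopes on the score differ in sign
rng = random.Random(42)
x = [rng.gauss(0.0, 1.0) for _ in range(400)]
y = [(2.0 if i % 2 == 0 else -2.0) * xi + rng.gauss(0.0, 0.3)
     for i, xi in enumerate(x)]
beta, pi = em_mixture_regression(x, y)
slopes = sorted(b1 for _, b1 in beta)
```

    The EM recovers the two group-specific slopes that a single FLM would average away.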

  18. Model-based metrics of human-automation function allocation in complex work environments

    NASA Astrophysics Data System (ADS)

    Kim, So Young

    issues with function allocation. Then, based on the eight issues, eight types of metrics are established. The purpose of these metrics is to assess the extent to which each issue exists with a given function allocation. Specifically, the eight types of metrics assess workload, coherency of a function allocation, mismatches between responsibility and authority, interruptive automation, automation boundary conditions, human adaptation to context, stability of the human's work environment, and mission performance. Finally, to validate the modeling framework and the metrics, a case study was conducted modeling four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight. A range of pilot cognitive control modes and maximum human taskload limits were also included in the model. The metrics were assessed for these four function allocations and analyzed to validate the ability of the metrics to identify important issues in given function allocations. In addition, the design insights provided by the metrics are highlighted. This thesis concludes with a discussion of mechanisms for further validating the modeling framework and function allocation metrics developed here, and highlights where these developments can be applied in research and in the design of function allocations in complex work environments such as aviation operations.

  19. A harmonic analysis approach to joint inversion of P-receiver functions and wave dispersion data in high dense seismic profiles

    NASA Astrophysics Data System (ADS)

    Molina-Aguilera, A.; Mancilla, F. D. L.; Julià, J.; Morales, J.

    2017-12-01

    Joint inversion of P-receiver functions and wave dispersion data implicitly assumes an isotropic, radially stratified earth. The conventional approach inverts stacked radial-component receiver functions from different back-azimuths to obtain a laterally homogeneous single-velocity model. However, in the presence of strong lateral heterogeneities, such as anisotropic layers and/or dipping interfaces, receiver functions are considerably perturbed, and both the radial and transverse components exhibit back-azimuthal dependence. Harmonic analysis methods exploit these azimuthal periodicities to separate the effects of the isotropic flat-layered structure from those caused by lateral heterogeneities. We implement a harmonic analysis method based on the radial and transverse receiver-function components and carry out a synthetic study to illuminate the capabilities of the method in isolating the isotropic flat-layered part of receiver functions and in constraining the geometry and strength of lateral heterogeneities. The back-azimuth-independent P-receiver functions are then jointly inverted with phase and group dispersion curves using a linearized inversion procedure. We apply this approach to densely spaced seismic profiles (2 km inter-station distance, see figure) located in the central Betics (western Mediterranean region), a region that has experienced complex geodynamic processes and exhibits strong variations in Moho topography. The technique presented here is robust and can be applied systematically to construct a 3-D model of the crust and uppermost mantle across large networks.
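
    The azimuthal separation at the heart of the method can be sketched as a Fourier fit over back-azimuth: the constant term estimates the isotropic flat-layered receiver-function amplitude at a given delay time, while the k = 1 and k = 2 harmonics absorb dipping-interface and anisotropy effects. The function below assumes reasonably even back-azimuth coverage; the synthetic amplitudes are invented:

```python
import math

def harmonic_decompose(baz_deg, amp, degrees=(1, 2)):
    """Decompose receiver-function amplitude at one delay time into
    back-azimuth harmonics: A0 + sum_k [Ak cos(k*baz) + Bk sin(k*baz)].
    A0 estimates the isotropic flat-layered response; the k=1 and k=2
    terms capture dipping-interface and anisotropy signatures."""
    n = len(amp)
    a0 = sum(amp) / n
    coeffs = {}
    for k in degrees:
        ak = 2.0 / n * sum(a * math.cos(k * math.radians(b))
                           for b, a in zip(baz_deg, amp))
        bk = 2.0 / n * sum(a * math.sin(k * math.radians(b))
                           for b, a in zip(baz_deg, amp))
        coeffs[k] = (ak, bk)
    return a0, coeffs

# synthetic check: isotropic level 0.5 plus k=1 and k=2 periodicities
baz = [i * 10.0 for i in range(36)]
amp = [0.5 + 0.2 * math.cos(math.radians(b)) + 0.1 * math.sin(2 * math.radians(b))
       for b in baz]
a0, coeffs = harmonic_decompose(baz, amp)
```

    Repeating this at every delay time and keeping only `a0` yields the back-azimuth-independent receiver function that enters the joint inversion.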

  20. Ecological prediction with nonlinear multivariate time-frequency functional data models

    USGS Publications Warehouse

    Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.

    2013-01-01

    Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the volume of high-frequency signals collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.

  1. An approach to solving large reliability models

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Veeraraghavan, Malathi; Dugan, Joanne Bechta; Trivedi, Kishor S.

    1988-01-01

    This paper describes a unified approach to the problem of solving large realistic reliability models. The methodology integrates behavioral decomposition, state truncation, and efficient sparse matrix-based numerical methods. The use of fault trees, together with ancillary information regarding dependencies, to automatically generate the underlying Markov model state space is proposed. The effectiveness of this approach is illustrated by modeling a state-of-the-art flight control system and a multiprocessor system. Nonexponential distributions for times to failure of components are assumed in the latter example. The modeling tool used for most of this analysis is HARP (the Hybrid Automated Reliability Predictor).
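
    A minimal instance of the underlying Markov machinery, for a two-component parallel system with constant failure rates: integrate the Kolmogorov forward equations and read reliability off the non-failed states. Tools like HARP work on far larger, automatically generated state spaces with sparse solvers; the rate below is arbitrary:

```python
import math

def reliability_parallel(lam=0.001, t_end=1000.0, dt=0.1):
    """Reliability of a two-component parallel system (states: 2 up, 1 up,
    failed/absorbing) by forward-Euler integration of the Kolmogorov
    forward equations. A toy stand-in for sparse Markov solvers."""
    p2, p1, pf = 1.0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        d2 = -2.0 * lam * p2                 # both up -> one component fails
        d1 = 2.0 * lam * p2 - lam * p1       # one up -> system failure
        df = lam * p1
        p2 += dt * d2
        p1 += dt * d1
        pf += dt * df
        t += dt
    return p2 + p1                           # P(system still operational)

r = reliability_parallel()
# closed form for this tiny chain: R(t) = 2 e^{-lam t} - e^{-2 lam t}
r_exact = 2.0 * math.exp(-0.001 * 1000.0) - math.exp(-2.0 * 0.001 * 1000.0)
```

    The closed form exists only because the chain is trivial; realistic models require the numerical route the paper describes.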

  2. [Neuropsychological models of autism spectrum disorders - behavioral evidence and functional imaging].

    PubMed

    Dziobek, Isabel; Bölte, Sven

    2011-03-01

    To review neuropsychological models of theory of mind (ToM), executive functions (EF), and central coherence (CC) as a framework for cognitive abnormalities in autism spectrum disorders (ASD). Behavioral and functional imaging studies are described that assess social-cognitive, emotional, and executive functions as well as locally oriented perception in ASD. Impairments in ToM and EF as well as alterations in CC are frequently replicated phenomena in ASD. Especially problems concerning social perception and ToM have high explanatory value for clinical symptomatology. Brain activation patterns differ between individuals with and without ASD for ToM, EF, and CC functions. An approach focussing on reduced cortical connectivity seems to be increasingly favored over explanations focussing on single affected brain sites. A better understanding of the complexities of ASD in future research demands the integration of clinical, neuropsychological, functional imaging, and molecular genetics evidence. Weaknesses in ToM and EF as well as strengths in detail-focussed perception should be used for individual intervention planning.

  3. A traveling salesman approach for predicting protein functions.

    PubMed

    Johnson, Olin; Liu, Jing

    2006-10-12

    Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful example of using computer science knowledge and algorithms to study biological problems.
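
    The pipeline can be caricatured in a few lines: build a tour over an "interaction distance" matrix, then cut the tour at long edges so each segment becomes a putative functional cluster. The greedy nearest-neighbour heuristic below stands in for the authors' global optimizer, and the toy matrix is invented:

```python
def nn_tour(dist):
    """Greedy nearest-neighbour TSP tour over a distance matrix; a simple
    stand-in for a global TSP optimizer."""
    n = len(dist)
    tour = [0]
    unvisited = set(range(1, n))
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[last][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def cut_clusters(tour, dist, threshold):
    """Split the tour wherever consecutive proteins are farther apart than
    threshold; each segment becomes a putative functional cluster."""
    clusters, cur = [], [tour[0]]
    for a, b in zip(tour, tour[1:]):
        if dist[a][b] > threshold:
            clusters.append(cur)
            cur = []
        cur.append(b)
    clusters.append(cur)
    return clusters

# toy "interaction distances": proteins 0-2 tightly linked, 3-4 tightly linked
D = [
    [0, 1, 1, 9, 9],
    [1, 0, 1, 9, 9],
    [1, 1, 0, 9, 9],
    [9, 9, 9, 0, 1],
    [9, 9, 9, 1, 0],
]
tour = nn_tour(D)
clusters = cut_clusters(tour, D, threshold=5)
```

    Function labels would then propagate within each cluster, rather than only from a query protein's immediate neighbours.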

  4. A traveling salesman approach for predicting protein functions

    PubMed Central

    Johnson, Olin; Liu, Jing

    2006-01-01

    Background Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Results Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Conclusion Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful example of using computer science knowledge and algorithms to study biological problems. PMID:17147783

  5. Nutritional approach for designing meat-based functional food products with nuts.

    PubMed

    Olmedilla-Alonso, B; Granado-Lorencio, F; Herrero-Barbudo, C; Blanco-Navarro, I

    2006-01-01

    Meat and meat products are essential components of diets in developed countries, and despite the convincing evidence relating them to an increased risk of cardiovascular disease (CVD), a growing consumption of meat products is foreseen. Epidemiological studies show that regular consumption of nuts in general, and walnuts in particular, correlates inversely with myocardial infarction and ischaemic vascular disease. We assess the nutritional basis for, and technological approach to, the development of functional meat-based products potentially relevant to CVD risk reduction. Using the strategies available in the meat industry (reformulation processes) and a food-based approach, we address the design and development of restructured beef steak with added walnuts, potentially functional for CVD risk reduction. Its adequacy as a vehicle for active nutrients is confirmed by a pharmacokinetic pilot study in humans using gamma-tocopherol as an exposure biomarker in chylomicrons during the post-prandial state. Effect and potential "functionality" are being assessed in a dietary intervention study in subjects at risk, and markers and indicators related to CVD are being evaluated. Within the conceptual framework of evidence-based medicine, the development of meat-based functional products may become a useful approach for specific applications, with a potential market and health benefits of great importance at the population level.

  6. Was Hercules Happy? Some Answers from a Functional Model of Human Well-Being

    ERIC Educational Resources Information Center

    Vitterso, Joar; Soholt, Yngvil; Hetland, Audun; Thoresen, Irina Alekseeva; Roysamb, Espen

    2010-01-01

    The article proposes a functional approach as a framework for the analysis of human well-being. The model posits that the adaptive role of hedonic feelings is to regulate stability and homeostasis in human systems, and that these feelings basically are created in states of equilibrium or assimilation. To regulate change and growth, a distinct set…

  7. Benchmarking Inverse Statistical Approaches for Protein Structure and Design with Exactly Solvable Models.

    PubMed

    Jacquin, Hugo; Gilson, Amy; Shakhnovich, Eugene; Cocco, Simona; Monasson, Rémi

    2016-05-01

    Inverse statistical approaches to determine protein structure and function from Multiple Sequence Alignments (MSA) are emerging as powerful tools in computational biology. However, the underlying assumptions about the relationship between the inferred effective Potts Hamiltonian and real protein structure and energetics remain untested so far. Here we use a lattice protein (LP) model to benchmark those inverse statistical approaches. We build MSAs of highly stable sequences in target LP structures, and infer the effective pairwise Potts Hamiltonians from those MSAs. We find that inferred Potts Hamiltonians reproduce many important aspects of 'true' LP structures and energetics. Careful analysis reveals that effective pairwise couplings in inferred Potts Hamiltonians depend not only on the energetics of the native structure but also on competing folds; in particular, the coupling values reflect both positive design (stabilization of the native conformation) and negative design (destabilization of competing folds). In addition to providing detailed structural information, the inferred Potts models, used as protein Hamiltonians for the design of new sequences, are able to generate with high probability completely new sequences with the desired folds, which is not possible using independent-site models. These are remarkable results, as the effective LP Hamiltonians used to generate the MSAs are not simple pairwise models, owing to the competition between the folds. Our findings elucidate the reasons for the success of inverse approaches to the modelling of proteins from sequence data, and their limitations.
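
    The flavour of such inverse statistical inference can be shown with a two-state (binary) caricature: estimate couplings from a synthetic alignment via the naive mean-field approximation J ≈ -C⁻¹, where C is the empirical covariance between positions. This is a much simpler estimator than full Potts inference, and the alignment, ridge regularizer, and co-evolution strength below are all invented:

```python
import random

def invert(m):
    """Gauss-Jordan inverse of a small dense matrix (partial pivoting)."""
    n = len(m)
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(m)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(a[r][c]))
        a[c], a[piv] = a[piv], a[c]
        a[c] = [v / a[c][c] for v in a[c]]
        for r in range(n):
            if r != c:
                a[r] = [v - a[r][c] * w for v, w in zip(a[r], a[c])]
    return [row[n:] for row in a]

def mean_field_couplings(seqs):
    """Naive mean-field inverse-Ising estimate J = -C^{-1} from a binary
    'alignment', a two-state caricature of Potts inference. The 0.01
    diagonal ridge is an arbitrary regularizer."""
    n, L = len(seqs), len(seqs[0])
    f = [sum(s[i] for s in seqs) / n for i in range(L)]
    C = [[sum(s[i] * s[j] for s in seqs) / n - f[i] * f[j]
          + (0.01 if i == j else 0.0) for j in range(L)] for i in range(L)]
    Cinv = invert(C)
    return [[0.0 if i == j else -Cinv[i][j] for j in range(L)]
            for i in range(L)]

# synthetic alignment: positions 0 and 1 co-evolve, 2 and 3 are independent
random.seed(3)
seqs = []
for _ in range(2000):
    s0 = random.randint(0, 1)
    s1 = s0 if random.random() < 0.9 else 1 - s0
    seqs.append([s0, s1, random.randint(0, 1), random.randint(0, 1)])
J = mean_field_couplings(seqs)
strongest = max(((i, j) for i in range(4) for j in range(i + 1, 4)),
                key=lambda p: abs(J[p[0]][p[1]]))
```

    The inferred coupling map singles out the co-evolving pair, which is the basic signal that contact-prediction and design applications build on.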

  8. Structure, function, and behaviour of computational models in systems biology

    PubMed Central

    2013-01-01

    Background Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such “bio-models” necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. Results We present a conceptual framework – the meaning facets – which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model’s components (structure), the meaning of the model’s intended use (function), and the meaning of the model’s dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. Conclusions The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research

  9. Trait approach and avoidance motivation: lateralized neural activity associated with executive function.

    PubMed

    Spielberg, Jeffrey M; Miller, Gregory A; Engels, Anna S; Herrington, John D; Sutton, Bradley P; Banich, Marie T; Heller, Wendy

    2011-01-01

    Motivation and executive function are both necessary for the completion of goal-directed behavior. Research investigating the manner in which these processes interact is beginning to emerge and has implicated middle frontal gyrus (MFG) as a site of interaction for relevant neural mechanisms. However, this research has focused on state motivation, and it has not examined functional lateralization. The present study examined the impact of trait levels of approach and avoidance motivation on neural processes associated with executive function. Functional magnetic resonance imaging was conducted while participants performed a color-word Stroop task. Analyses identified brain regions in which trait approach and avoidance motivation (measured by questionnaires) moderated activation associated with executive control. Approach was hypothesized to be associated with left-lateralized MFG activation, whereas avoidance was hypothesized to be associated with right-lateralized MFG activation. Results supported both hypotheses. Present findings implicate areas of middle frontal gyrus in top-down control to guide behavior in accordance with motivational goals. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. Classical Testing in Functional Linear Models.

    PubMed

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications.
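
    In the re-expressed model, the classical F test applies directly to a regression of the response on a handful of FPC scores. The sketch below solves the normal equations for hypothetical two-dimensional scores and compares the F statistic under association and under the null; the data are synthetic:

```python
import random

def f_stat_fpc(scores, y):
    """Classical F statistic for H0: no association between a scalar
    response y and a small set of FPC scores (rows of `scores`).
    OLS via the normal equations; the scores here are hypothetical."""
    n, k = len(scores), len(scores[0])
    Z = [[1.0] + row for row in scores]          # design with intercept
    p = k + 1
    # augmented system [Z'Z | Z'y], solved by Gauss-Jordan elimination
    A = [[sum(Z[r][i] * Z[r][j] for r in range(n)) for j in range(p)]
         + [sum(Z[r][i] * y[r] for r in range(n))] for i in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        A[c] = [v / A[c][c] for v in A[c]]
        for r in range(p):
            if r != c:
                A[r] = [v - A[r][c] * w for v, w in zip(A[r], A[c])]
    b = [A[i][p] for i in range(p)]
    fit = [sum(bi * zi for bi, zi in zip(b, row)) for row in Z]
    rss1 = sum((yi - fi) ** 2 for yi, fi in zip(y, fit))
    ybar = sum(y) / n
    rss0 = sum((yi - ybar) ** 2 for yi in y)     # intercept-only model
    return ((rss0 - rss1) / k) / (rss1 / (n - p))

random.seed(7)
scores = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(100)]
y_alt = [1.5 * s[0] - 1.0 * s[1] + random.gauss(0, 1) for s in scores]
y_null = [random.gauss(0, 1) for _ in scores]
f_alt = f_stat_fpc(scores, y_alt)
f_null = f_stat_fpc(scores, y_null)
```

    Under H0 the statistic follows an F distribution with (k, n - k - 1) degrees of freedom; the paper's contribution is the theory for letting the number of components diverge.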

  11. Classical Testing in Functional Linear Models

    PubMed Central

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications. PMID:28955155

  12. A Model For Change: An Approach for Forecasting Well-Being ...

    EPA Pesticide Factsheets

    Every community decision incorporates a "forecasting" strategy (whether formal or implicit) to help visualize expected results and evaluate the potential “feelings” that people living in that community may have about those results. With more communities seeking to make decisions based on sustainable alternatives, forecasting efforts that examine potential impacts of decisions on overall community well-being may prove valuable not only for gauging future benefits and trade-offs, but also for recognizing a community’s affective response to the outcomes of those decisions. This paper describes a forecasting approach based on concepts introduced in the development of the U.S. Environmental Protection Agency’s (US EPA) Human Well-Being Index (HWBI) (Smith et al. 2014; Summers et al. 2014). The approach examines the relationships among selected economic, environmental and social services that can be directly impacted by community decisions and eight domains of human well-being. Using models developed from constructed- or fixed-effect step-wise and multiple regressions and eleven years of data (2000-2010), these relationship functions may be used to characterize likely direct impacts of decisions on future well-being, as well as the possible intended and unintended secondary and tertiary effects relative to any main decision effects. This paper describes an approach to using HWBI in decision making models to characterize likely impacts of decisions on fut

  13. Determining the functional form of density dependence: deductive approaches for consumer-resource systems having a single resource.

    PubMed

    Abrams, Peter A

    2009-09-01

    Consumer-resource models are used to deduce the functional form of density dependence in the consumer population. A general approach to determining the form of consumer density dependence is proposed; this involves determining the equilibrium (or average) population size for a series of different harvest rates. The relationship between a consumer's mortality and its equilibrium population size is explored for several one-consumer/one-resource models. The shape of density dependence in the resource and the shapes of the numerical and functional responses all tend to be "inherited" by the consumer's density dependence. Consumer-resource models suggest that density dependence will very often have both concave and convex segments, something that is impossible under the commonly used theta-logistic model. A range of consumer-resource models predicts that consumer population size often declines at a decelerating rate with mortality at low mortality rates, is insensitive to or increases with mortality over a wide range of intermediate mortalities, and declines at a rapidly accelerating rate with increased mortality when mortality is high. This has important implications for management and conservation of natural populations.
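
    The proposed deduction, mapping equilibrium consumer density against a series of added mortality (harvest) rates, can be carried out numerically for a standard consumer-resource model. The sketch below uses a logistic resource and a type-II functional response, with illustrative parameter values chosen to give a stable equilibrium:

```python
def equilibrium_consumer(mort, t_end=1000.0, dt=0.01):
    """Equilibrium consumer density under added per-capita mortality mort
    in a Rosenzweig-MacArthur consumer-resource model (logistic resource,
    type-II functional response). All parameter values are illustrative."""
    r, K, a, h, e, d = 1.0, 10.0, 0.5, 0.2, 0.5, 0.2
    R, N = 5.0, 1.0
    t = 0.0
    while t < t_end:
        f = a * R / (1.0 + a * h * R)        # type-II functional response
        dR = r * R * (1.0 - R / K) - f * N   # logistic growth minus predation
        dN = e * f * N - (d + mort) * N      # conversion minus total mortality
        R += dt * dR
        N += dt * dN
        t += dt
    return N

# equilibrium consumer density for a series of "harvest" rates
n_low = equilibrium_consumer(0.0)
n_mid = equilibrium_consumer(0.4)
n_high = equilibrium_consumer(0.9)
```

    With these parameters the decline in equilibrium density is slight at low-to-intermediate mortality and accelerates sharply at high mortality, the qualitative pattern the abstract describes.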

  14. Challenges in structural approaches to cell modeling.

    PubMed

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A

    2016-07-31

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Challenges in structural approaches to cell modeling

    PubMed Central

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A.

    2016-01-01

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field. PMID:27255863

  16. Occupant behavior models: A critical review of implementation and representation approaches in building performance simulation programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Chen, Yixing; Belafi, Zsofia

    Occupant behavior (OB) in buildings is a leading factor influencing energy use in buildings. Quantifying this influence requires the integration of OB models with building performance simulation (BPS). This study reviews approaches to representing and implementing OB models in today’s popular BPS programs, and discusses weaknesses and strengths of these approaches and key issues in the integration of OB models with BPS programs. Two of the key findings are: (1) a common data model is needed to standardize the representation of OB models, enabling their flexibility and exchange among BPS programs and user applications; the data model can be implemented using a standard syntax (e.g., in the form of an XML schema), and (2) a modular software implementation of OB models, such as functional mock-up units for co-simulation, adopting the common data model, has advantages in providing a robust and interoperable integration with multiple BPS programs. Such common OB model representation and implementation approaches help standardize the input structures of OB models, enable collaborative development of a shared library of OB models, and allow for rapid and widespread integration of OB models with BPS programs to improve the simulation of occupant behavior and quantification of their impact on building performance.

  18. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. 
This approach can provide guidance for conservation planning based on analysis of animal movement data using
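The finite-mixture machinery behind this kind of analysis can be sketched with a toy example: EM estimation of a two-component one-dimensional Gaussian mixture, standing in for the movement-response component densities. The paper's actual model combines particle swarm optimization with EM and covariate-dependent mixing proportions; the data and starting values below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic observations drawn from two "movement modes"
x = np.concatenate([rng.normal(0.0, 0.4, 300),
                    rng.normal(2.0, 0.4, 150)])

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def em_mixture(x, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture (a toy stand-in for the
    movement-response component densities in the abstract)."""
    pi, mu, sd = 0.5, np.array([-1.0, 3.0]), np.array([1.0, 1.0])
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation
        p1 = pi * normal_pdf(x, mu[0], sd[0])
        p2 = (1 - pi) * normal_pdf(x, mu[1], sd[1])
        r = p1 / (p1 + p2)
        # M-step: update mixing proportion, means, standard deviations
        pi = r.mean()
        mu = np.array([np.average(x, weights=r),
                       np.average(x, weights=1 - r)])
        sd = np.array([np.sqrt(np.average((x - mu[0]) ** 2, weights=r)),
                       np.sqrt(np.average((x - mu[1]) ** 2, weights=1 - r))])
    return pi, mu, sd

pi, mu, sd = em_mixture(x)
print(round(pi, 2), np.round(mu, 2))
```

In the paper the mixing proportion is further modeled as a function of landscape covariates (e.g., proportion of urban cover), which turns this plain mixture into the behavioral-landscape map.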

  19. Formulation of consumables management models. Development approach for the mission planning processor working model

    NASA Technical Reports Server (NTRS)

    Connelly, L. C.

    1977-01-01

    The mission planning processor is a user-oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA-approved software development standards. This development approach: (1) promotes cost-effective software development, (2) enhances the quality and reliability of the working model, (3) encourages sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.

  20. Ethnicity and Changing Functional Health in Middle and Late Life: A Person-Centered Approach

    PubMed Central

    Xu, Xiao; Bennett, Joan M.; Ye, Wen; Quiñones, Ana R.

    2010-01-01

    Objectives. Following a person-centered approach, this research aims to depict distinct courses of disability and to ascertain how the probabilities of experiencing these trajectories vary across Black, Hispanic, and White middle-aged and older Americans. Methods. Data came from the 1995–2006 Health and Retirement Study, which involved a national sample of 18,486 Americans older than 50 years of age. Group-based semiparametric mixture models (Proc Traj) were used for data analysis. Results. Five trajectories were identified: (a) excellent functional health (61%), (b) good functional health with small increasing disability (25%), (c) accelerated increase in disability (7%), (d) high but stable disability (4%), and (e) persistent severe impairment (3%). However, when time-varying covariates (e.g., marital status and health conditions) were controlled, only 3 trajectories emerged: (a) healthy functioning (53%), (b) moderate functional decrement (40%), and (c) large functional decrement (8%). Black and Hispanic Americans were significantly more likely than White Americans to experience poor functional health trajectories, with Blacks at greater risk than Hispanics. Conclusions. Parallel to the concepts of successful aging, usual aging, and pathological aging, there exist distinct courses of changing functional health over time. The mechanisms underlying changes in disability may vary between Black and Hispanic Americans. PMID:20008483

  1. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
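The core idea of formally testing a toolbox strategy against a flexible alternative can be sketched with a toy Bayesian model comparison: a fixed-accuracy heuristic versus a model with a free accuracy parameter, compared by marginal likelihood. The data and the 0.8 accuracy value are invented for illustration; the paper's hierarchical treatment generalizes this to groups of participants.

```python
import numpy as np

# Toy choice data: 1 = the choice matched the heuristic's prediction
choices = np.array([1,1,1,0,1,1,1,1,0,1,1,1,1,1,0,1,1,1,1,1])
k, n = int(choices.sum()), choices.size

# Strategy 1: fixed-accuracy heuristic that predicts choices with p = 0.8
m1 = 0.8 ** k * 0.2 ** (n - k)

# Strategy 2: free accuracy theta with a Uniform(0, 1) prior; the marginal
# likelihood integrates the binomial kernel over the prior (grid sum)
theta = np.linspace(1e-4, 1 - 1e-4, 10000)
m2 = np.sum(theta ** k * (1 - theta) ** (n - k)) * (theta[1] - theta[0])

# Bayes factor > 1 favors the parameter-free heuristic: marginalization
# automatically penalizes the extra flexibility of strategy 2
bayes_factor = m1 / m2
print(round(bayes_factor, 2))
```

This built-in complexity penalty is what lets the Bayesian approach contain "strategy sprawl": adding ever more flexible strategies to the toolbox does not automatically win the comparison.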

  2. Functional insights from proteome-wide structural modeling of Treponema pallidum subspecies pallidum, the causative agent of syphilis.

    PubMed

    Houston, Simon; Lithgow, Karen Vivien; Osbak, Kara Krista; Kenyon, Chris Richard; Cameron, Caroline E

    2018-05-16

    Syphilis continues to be a major global health threat, with 11 million new infections each year and a global burden of 36 million cases. The causative agent of syphilis, Treponema pallidum subspecies pallidum, is a highly virulent bacterium; however, the molecular mechanisms underlying T. pallidum pathogenesis remain to be definitively identified. This is because T. pallidum is currently uncultivatable, inherently fragile and thus difficult to work with, and phylogenetically distinct, with no conventional virulence factor homologs found in other pathogens. In fact, approximately 30% of its predicted protein-coding genes have no known orthologs or assigned functions. Here we employed a structural bioinformatics approach using Phyre2-based tertiary structure modeling to improve our understanding of T. pallidum protein function on a proteome-wide scale. Phyre2-based tertiary structure modeling generated high-confidence predictions for 80% of the T. pallidum proteome (780/978 predicted proteins). Tertiary structure modeling also inferred the same function as primary structure-based annotations from genome sequencing pipelines for 525/605 proteins (87%), which represents 54% (525/978) of all T. pallidum proteins. Of the 175 T. pallidum proteins modeled with high confidence that were not assigned functions in the previously annotated published proteome, 167 (95%) could be assigned predicted functions. Twenty-one of the 175 hypothetical proteins modeled with high confidence were also predicted to exhibit significant structural similarity with proteins experimentally confirmed to be required for virulence in other pathogens. Phyre2-based structural modeling is a powerful bioinformatics tool that has provided insight into the potential structure and function of the majority of T. pallidum proteins and helped validate the primary structure-based annotation of more than 50% of all T. pallidum proteins with high confidence. This work represents the first T

  3. Models of Protocellular Structure, Function and Evolution

    NASA Technical Reports Server (NTRS)

    New, Michael H.; Pohorille, Andrew; Szostak, Jack W.; Keefe, Tony; Lanyi, Janos K.; DeVincenzi, Donald L. (Technical Monitor)

    2001-01-01

    In the absence of any record of protocells, the most direct way to test our understanding of the origin of cellular life is to construct laboratory models that capture important features of protocellular systems. Such efforts are currently underway in a collaborative project between NASA-Ames, Harvard Medical School and the University of California. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures. The centerpiece of this project is a method for the in vitro evolution of protein enzymes toward arbitrary catalytic targets. A similar approach has already been developed for nucleic acids, in which a small number of functional molecules are selected from a large, random population of candidates. The selected molecules are then greatly amplified using the polymerase chain reaction.

  4. Hypothesis testing in functional linear regression models with Neyman's truncation and wavelet thresholding for longitudinal data.

    PubMed

    Yang, Xiaowei; Nie, Kun

    2008-03-15

    Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative is to apply functional data analysis, which directly targets the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend these strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since an FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and its computational tools should enhance the capacity of medical statistics for longitudinal data.
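The three-step procedure (transform, linear model in the transformed domain, adaptive Neyman statistic) can be sketched on synthetic curves. Here simple two-sample z statistics stand in for the full FLRM fit, and the coefficient ordering is a simplification of the adaptive Neyman construction.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 64)
# Two groups of noisy longitudinal curves; group B carries an extra trend
curves_a = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, (30, 64))
curves_b = np.sin(2 * np.pi * t) + 0.4 * t + rng.normal(0, 0.3, (30, 64))

# Step 1: Fourier transform each curve; stack real and imaginary parts
def to_coeffs(curves):
    f = np.fft.rfft(curves, axis=1)
    return np.hstack([f.real, f.imag])

ca, cb = to_coeffs(curves_a), to_coeffs(curves_b)

# Step 2: per-coefficient two-sample z statistics (a minimal stand-in for
# fitting a multivariate linear model in the transformed domain)
diff = ca.mean(0) - cb.mean(0)
se = np.sqrt(ca.var(0, ddof=1) / len(ca) + cb.var(0, ddof=1) / len(cb))
keep = se > 1e-12            # drop structurally zero imaginary parts
z = diff[keep] / se[keep]

# Step 3: adaptive Neyman statistic, scanning the truncation point m
m = np.arange(1, z.size + 1)
t_an = np.max(np.cumsum(z ** 2 - 1) / np.sqrt(2 * m))
print(t_an > 5)
```

A large statistic (assessed against its null distribution, e.g. by permutation) indicates that the group curves differ; the simulated trend difference here is easily detected.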

  5. A Novel Approach to Implement Takagi-Sugeno Fuzzy Models.

    PubMed

    Chang, Chia-Wen; Tao, Chin-Wang

    2017-09-01

    This paper proposes new algorithms based on the fuzzy c-regression model algorithm for Takagi-Sugeno (T-S) fuzzy modeling of complex nonlinear systems. The fuzzy c-regression state model (FCRSM) algorithm yields a T-S fuzzy model in which a functional antecedent and a state-space-model-type consequent are considered with the available input-output data. The antecedent and consequent forms of the proposed FCRSM offer two main advantages: first, the FCRSM has a low computational load because only one input variable is considered in the antecedent part; second, the unknown system can be modeled not only in polynomial form but also in state-space form. Moreover, the FCRSM can be extended to the FCRSM-ND and FCRSM-Free algorithms. The FCRSM-ND algorithm is presented to find the T-S fuzzy state-space model of a nonlinear system when the input-output data cannot be precollected and an assumed effective controller is available. In practical applications, the mathematical model of the controller may be hard to obtain. In this case, an online tuning algorithm, FCRSM-Free, is designed such that the parameters of a T-S fuzzy controller and the T-S fuzzy state model of an unknown system can be tuned online simultaneously. Four numerical simulations are given to demonstrate the effectiveness of the proposed approach.
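A minimal sketch of T-S fuzzy inference with a single antecedent variable, in the spirit of the FCRSM antecedent structure described above; the two rules and their parameters are invented for illustration, not produced by the paper's clustering algorithms.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function for the antecedent."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Two illustrative T-S rules: one antecedent input (as in FCRSM) and
# linear consequents y = a*x + b
rules = [
    {"center": -1.0, "sigma": 1.0, "a": 0.5, "b": -0.2},
    {"center":  1.0, "sigma": 1.0, "a": 2.0, "b":  0.3},
]

def ts_output(x):
    """Weighted average of rule consequents (standard T-S defuzzification)."""
    w = np.array([gauss(x, r["center"], r["sigma"]) for r in rules])
    y = np.array([r["a"] * x + r["b"] for r in rules])
    return float(np.dot(w, y) / w.sum())

print(ts_output(0.0))
```

Between the rule centers the model blends the two local linear models; far from a center, the nearer rule dominates, which is how a T-S model approximates a nonlinear system piecewise.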

  6. Improvement and comparison of likelihood functions for model calibration and parameter uncertainty analysis within a Markov chain Monte Carlo scheme

    NASA Astrophysics Data System (ADS)

    Cheng, Qin-Bo; Chen, Xi; Xu, Chong-Yu; Reinhardt-Imjela, Christian; Schulte, Achim

    2014-11-01

    In this study, the likelihood functions for uncertainty analysis of hydrological models are compared and improved through the following steps: (1) the equivalence between the Nash-Sutcliffe Efficiency coefficient (NSE) and the likelihood function with Gaussian independent and identically distributed residuals is proved; (2) a new estimation method for the Box-Cox transformation (BC) parameter is developed to improve the elimination of heteroscedasticity in model residuals; and (3) three likelihood functions, NSE, Generalized Error Distribution with BC (BC-GED), and Skew Generalized Error Distribution with BC (BC-SGED), are applied for SWAT-WB-VSA (Soil and Water Assessment Tool - Water Balance - Variable Source Area) model calibration in the Baocun watershed, Eastern China. The performances of the calibrated models are compared using observed river discharges and groundwater levels. The results show that the minimum variance constraint can effectively estimate the BC parameter. The form of the likelihood function significantly affects the calibrated parameters and the simulated high- and low-flow components. SWAT-WB-VSA with the NSE approach simulates floods well but baseflow poorly, owing to the assumption of a Gaussian error distribution, under which large errors have low probability while small errors around zero are nearly equiprobable. By contrast, SWAT-WB-VSA with the BC-GED or BC-SGED approach mimics baseflow well, as confirmed by the groundwater level simulations. The assumption of skewness in the error distribution may be unnecessary, because the results of the BC-SGED approach are nearly the same as those of the BC-GED approach.
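The NSE and Box-Cox building blocks can be sketched directly. The discharge values below are made up, and the lambda = 0.3 transform parameter is arbitrary rather than estimated by the paper's minimum-variance method.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency; maximizing NSE is equivalent to maximizing
    a Gaussian i.i.d. residual likelihood, as the abstract notes."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def boxcox(q, lam):
    """Box-Cox transform, used to reduce heteroscedasticity of residuals
    before forming the likelihood."""
    return (q ** lam - 1) / lam if lam != 0 else np.log(q)

# Illustrative observed and simulated discharges (made-up values)
obs = np.array([1.2, 3.4, 8.9, 4.1, 2.2, 1.5])
sim = np.array([1.0, 3.9, 8.1, 4.5, 2.0, 1.7])

print(round(nse(obs, sim), 3))
resid = boxcox(obs, 0.3) - boxcox(sim, 0.3)  # residuals in transformed space
```

Because the Box-Cox transform compresses large flows more than small ones, residuals of high- and low-flow periods become more comparable, which is why BC-based likelihoods fit baseflow better than raw NSE.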

  7. Steady state conductance in a double quantum dot array: the nonequilibrium equation-of-motion Green function approach.

    PubMed

    Levy, Tal J; Rabani, Eran

    2013-04-28

    We study steady state transport through a double quantum dot array using the equation-of-motion approach to the nonequilibrium Green functions formalism. This popular technique relies on uncontrolled approximations to obtain a closure for a hierarchy of equations; however, its accuracy is questioned. We focus on 4 different closures, 2 of which were previously proposed in the context of the single quantum dot system (Anderson impurity model) and were extended to the double quantum dot array, and we develop 2 new closures. Results for the differential conductance are compared to those attained by a master equation approach known to be accurate for weak system-lead couplings and high temperatures. While all 4 closures provide an accurate description of the Coulomb blockade and other transport properties in the single quantum dot case, they differ in the case of the double quantum dot array, where only one of the developed closures provides satisfactory results. This is rationalized by comparing the poles of the Green functions to the exact many-particle energy differences for the isolated system. Our analysis provides means to extend the equation-of-motion technique to more elaborate models of large bridge systems with strong electronic interactions.

  8. Longitudinal Relationships Between Productive Activities and Functional Health in Later Years: A Multivariate Latent Growth Curve Modeling Approach.

    PubMed

    Choi, Eunhee; Tang, Fengyan; Kim, Sung-Geun; Turk, Phillip

    2016-10-01

    This study examined the longitudinal relationships between functional health in later years and three types of productive activities: volunteering, full-time work, and part-time work. Using data from five waves (2000-2008) of the Health and Retirement Study, we applied multivariate latent growth curve modeling to examine the longitudinal relationships among individuals aged 50 or over. Functional health was measured by limitations in activities of daily living. Individuals who volunteered or worked either full time or part time exhibited a slower decline in functional health than nonparticipants. Significant associations were also found between initial functional health and longitudinal changes in productive activity participation. This study provides additional support for the benefits of productive activities later in life; engagement in volunteering and employment is indeed associated with better functional health in middle and old age. © The Author(s) 2016.
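The growth-curve idea can be sketched with synthetic trajectories: per-person OLS slopes stand in for the latent slope factor of a full multivariate latent growth curve model, and all group differences below are simulated, not HRS estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
waves = np.arange(5.0)  # five biennial waves, as in HRS 2000-2008

def simulate(n, slope):
    """ADL-limitation trajectories: random intercept + linear change + noise."""
    icept = rng.normal(1.0, 0.3, (n, 1))
    return icept + slope * waves + rng.normal(0, 0.2, (n, 5))

adl_volunteers = simulate(200, 0.05)       # slower simulated decline
adl_nonparticipants = simulate(200, 0.15)  # faster simulated decline

def mean_slope(y):
    # Per-person OLS slope over waves, then averaged: a crude stand-in
    # for the latent growth-curve slope factor
    X = np.vstack([np.ones_like(waves), waves]).T
    beta = np.linalg.lstsq(X, y.T, rcond=None)[0]
    return beta[1].mean()

print(round(mean_slope(adl_volunteers), 2),
      round(mean_slope(adl_nonparticipants), 2))
```

A real latent growth curve model would estimate the slope factors, their variances, and cross-domain covariances jointly in a structural equation framework rather than by two-stage OLS.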

  9. Functionalization of carbon nanotubes: Characterization, modeling and composite applications

    NASA Astrophysics Data System (ADS)

    Wang, Shiren

    Carbon nanotubes have demonstrated exceptional mechanical, thermal and electrical properties, and are regarded as one of the most promising reinforcement materials for the next generation of high-performance structural and multifunctional composites. However, to date, most application attempts have been hindered by several technical roadblocks, such as poor dispersion and weak interfacial bonding. In this dissertation, several innovative functionalization methods were proposed and studied to overcome these technical issues in order to realize the full potential of nanotubes as reinforcement. These functionalization methods included precision sectioning of nanotubes using an ultra-microtome, electron-beam irradiation, and amino and epoxide group grafting. Characterization results from atomic force microscopy, transmission electron microscopy and Raman spectroscopy suggested that aligned carbon nanotubes can be precisely sectioned with controlled length and minimal sidewall damage. This study also designed and demonstrated new covalent functionalization approaches through unique epoxy grafting and one-step amino grafting, which have the potential to scale up for composite applications. In addition, the dissertation successfully tailored the structure and properties of thin nanotube films through electron-beam irradiation. Significant improvement of both the mechanical and the electrical conducting properties of the irradiated nanotube films, or buckypapers, was achieved. All these methods demonstrated effectiveness in improving dispersion and interfacial bonding in the epoxy resin, resulting in considerable improvements in composite mechanical properties. Modeling of the functionalization methods also provided further understanding and offered reasonable explanations of the SWNT length distribution as well as the carbon nanostructure transformation upon electron-beam irradiation. 
Both experimental and modeling results provide important foundations for further comprehensive investigation of

  10. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    PubMed Central

    2011-01-01

    Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing processes. A complete

  11. Functional modeling of the human auditory brainstem response to broadband stimulation

    PubMed Central

    Verhulst, Sarah; Bharadwaj, Hari M.; Mehraei, Golbarg; Shera, Christopher A.; Shinn-Cunningham, Barbara G.

    2015-01-01

    Population responses such as the auditory brainstem response (ABR) are commonly used for hearing screening, but the relationship between single-unit physiology and scalp-recorded population responses is not well understood. Computational models that integrate physiologically realistic models of single-unit auditory-nerve (AN), cochlear nucleus (CN) and inferior colliculus (IC) cells with models of broadband peripheral excitation can be used to simulate ABRs and thereby link detailed knowledge of animal physiology to human applications. Existing functional ABR models fail to capture the empirically observed 1.2–2 ms decrease of ABR wave-V latency with intensity that is thought to arise from level-dependent changes in cochlear excitation and firing synchrony across different tonotopic sections. This paper proposes an approach in which level-dependent cochlear excitation patterns, which reflect human cochlear filter tuning parameters, drive AN fibers to yield realistic level-dependent properties of the ABR wave-V. The number of free model parameters is minimal, producing a model in which various sources of hearing impairment can easily be simulated on an individualized and frequency-dependent basis. The model fits latency-vs-intensity functions observed in human ABRs and otoacoustic emissions while maintaining the rate-level and threshold characteristics of single-unit AN fibers. The simulations help to reveal which tonotopic regions dominate ABR waveform peaks at different stimulus intensities. PMID:26428802

  12. Functional networks inference from rule-based machine learning models.

    PubMed

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, and complementary to, those detected by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods, and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The
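FuNeL's co-occurrence idea can be sketched in a few lines: treat each classification rule as a set of genes and add an edge for every pair of genes that appear in the same rule. The rules and gene names below are invented for illustration.

```python
from itertools import combinations
from collections import Counter

# Hypothetical rule set from a rule-based classifier: each rule is the set
# of genes whose expression it tests (names are illustrative placeholders)
rules = [
    {"TP53", "PTEN", "AR"},
    {"PTEN", "AR"},
    {"TP53", "MYC"},
    {"PTEN", "AR", "MYC"},
]

# Core assumption (as stated in the abstract): genes used together in a
# rule may be functionally related, so each co-occurrence adds an edge
edges = Counter()
for rule in rules:
    for a, b in combinations(sorted(rule), 2):
        edges[(a, b)] += 1

network = {pair for pair, n in edges.items() if n >= 1}
print(edges[("AR", "PTEN")])
```

Edge weights (co-occurrence counts) could then be thresholded to keep only the strongest relationships before comparing against a co-expression network of equal size.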

  13. Reward from bugs to bipeds: a comparative approach to understanding how reward circuits function

    PubMed Central

    Scaplen, Kristin M.; Kaun, Karla R.

    2016-01-01

    In a complex environment, animals learn from their responses to stimuli and events. Appropriate responses to reward and punishment can promote survival and reproduction and increase evolutionary fitness. Interestingly, the neural processes underlying these responses are remarkably similar across phyla. In all species, dopamine is central to encoding reward and directing motivated behaviors; however, a comprehensive understanding of how circuits encode reward and direct motivated behaviors is still lacking. In part, this is a result of the sheer diversity of neurons, the heterogeneity of their responses and the complexity of the neural circuits within which they are found. We argue that general features of reward circuitry are common across model organisms, and thus principles learned from invertebrate model organisms can inform research across species. In particular, we discuss circuit motifs that appear to be functionally equivalent from flies to primates. We argue that a comparative approach to studying and understanding reward circuit function provides a more comprehensive understanding of reward circuitry and informs research on disorders that affect the brain's reward circuitry. PMID:27328845

  14. The treatment of climate science in Integrated Assessment Modelling: integration of climate step function response in an energy system integrated assessment model.

    NASA Astrophysics Data System (ADS)

    Dessens, Olivier

    2016-04-01

    Integrated Assessment Models (IAMs) are used as crucial inputs to policy-making on climate change. These models simulate aspects of the economy and the climate system to deliver future projections and to explore the impact of mitigation and adaptation policies. The IAMs' climate representation is extremely important, as it can greatly influence future political action. The step-function response is a simple climate model recently developed by the UK Met Office and is an alternative method of estimating the climate response to an emission trajectory directly from global climate model step simulations. Good et al. (2013) formulated a method of reconstructing general circulation model (GCM) climate responses to emission trajectories through an idealized experiment. This method, called the "step-response approach", is based on the results of an idealized abrupt CO2 step experiment. TIAM-UCL is a technology-rich model that belongs to the family of partial-equilibrium, bottom-up models, developed at University College London to represent a wide spectrum of energy systems in 16 regions of the globe (Anandarajah et al. 2011). The model uses optimisation functions to obtain cost-efficient solutions in meeting an exogenously defined set of energy-service demands, given certain technological and environmental constraints. Furthermore, it employs linear programming techniques, making the step-function representation of the climate response well suited to the model's mathematical formulation. For the first time, we have introduced the "step-response approach" developed at the UK Met Office into an IAM, the TIAM-UCL energy system model, and we investigate the main consequences of this modification on the results of the model in terms of climate and energy-system responses. The main advantage of this approach (apart from the low computational cost it entails) is that its results are directly traceable to the GCM involved and closely connected to well-known methods of
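The step-response superposition can be sketched numerically: the response to an arbitrary forcing pathway is reconstructed as a sum of lagged step responses weighted by the year-on-year forcing increments. The two-exponential step response below is an illustrative stand-in, not an actual GCM fit.

```python
import numpy as np

years = np.arange(100)

def step_response(t):
    """Idealized temperature response to a unit forcing step: a fast and a
    slow relaxation timescale (illustrative parameters, not Met Office data)."""
    return 1.5 * (1 - np.exp(-t / 4.0)) + 1.0 * (1 - np.exp(-t / 100.0))

# Forcing pathway in units of the step forcing: 50-year ramp, then constant
forcing = np.minimum(years * 0.02, 1.0)

# Step-response approach: superpose lagged copies of the step response,
# each weighted by that year's forcing increment
dF = np.diff(forcing, prepend=0.0)
temp = np.array([np.sum(dF[:t + 1] * step_response(t - np.arange(t + 1)))
                 for t in years])
print(round(float(temp[-1]), 2))
```

Because the emulator is a linear superposition, it is cheap enough to embed inside a linear-programming energy-system model while remaining traceable to the underlying step experiment.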

  15. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, earning it the name "mathematical microscope," that allow hydrologic and meteorologic series to be decomposed and reconstructed by the wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition-layer series. The final extension is obtained by summing the extensions of all layers. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influences of human activity, from six (three pairs of) representative stations, are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, artificial neural network methods (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and it outperforms the other four methods compared. The theory employed and the approach developed here can be applied to the extension of data in other areas as well.
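The decompose-extend-reconstruct pipeline can be sketched with a one-level Haar transform; a trivial trend/persistence extension stands in for the cloud model component, and the series values are made up.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet transform: approximation and detail layers."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_reconstruct(a, d):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

series = np.array([3.0, 3.2, 3.1, 3.4, 3.3, 3.6, 3.5, 3.8])
approx, detail = haar_decompose(series)
assert np.allclose(haar_reconstruct(approx, detail), series)  # sanity check

# WA-CM idea: extend each decomposition layer separately (here a simple
# linear-trend / persistence rule stands in for the cloud model), then
# reconstruct the extended series from the extended layers
approx_ext = np.append(approx, approx[-1] + (approx[-1] - approx[-2]))
detail_ext = np.append(detail, detail[-1])
extended = haar_reconstruct(approx_ext, detail_ext)
print(np.round(extended[-2:], 2))
```

The real method uses a multi-level wavelet decomposition and fits a cloud model per layer, but the layer-wise extend-then-sum structure is the same.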

  16. Models in palaeontological functional analysis

    PubMed Central

    Anderson, Philip S. L.; Bright, Jen A.; Gill, Pamela G.; Palmer, Colin; Rayfield, Emily J.

    2012-01-01

Models are a principal tool of modern science. By definition, and in practice, models are not literal representations of reality but provide simplifications of, or substitutes for, the events, scenarios or behaviours that are being studied or predicted. All models make assumptions, and palaeontological models in particular require additional assumptions to study unobservable events in deep time. In the case of functional analysis, the degree of missing data associated with reconstructing musculoskeletal anatomy and neuronal control in extinct organisms has, in the eyes of some scientists, rendered detailed functional analysis of fossils intractable. Such a prognosis may indeed be realized if palaeontologists attempt to recreate elaborate biomechanical models based on missing data and loosely justified assumptions. Yet multiple enabling methodologies and techniques now exist: tools for bracketing the boundaries of reality; more rigorous consideration of soft tissues and missing data; and methods drawing on physical principles to which all organisms must adhere. As with many aspects of science, the utility of such biomechanical models depends on the questions they seek to address, and on the accuracy and validity of the models themselves. PMID:21865242

  17. EFICAz2: enzyme function inference by a combined approach enhanced by machine learning.

    PubMed

    Arakaki, Adrian K; Huang, Ying; Skolnick, Jeffrey

    2009-04-13

We previously developed EFICAz, an enzyme function inference approach that combines predictions from non-completely overlapping component methods. Two of the four components in the original EFICAz are based on the detection of functionally discriminating residues (FDRs). FDRs distinguish between members of an enzyme family that are homofunctional (classified under the EC number of interest) and those that are heterofunctional (annotated with another EC number or lacking enzymatic activity). Each of the two FDR-based components is associated with one of two specific kinds of enzyme families. EFICAz exhibits high precision, except when the maximal test-to-training sequence identity (MTTSI) is lower than 30%. To improve EFICAz's performance in this regime, we: i) increased the number of predictive components and ii) took advantage of consensus information from the different components to make the final EC number assignment. We developed two new EFICAz components, analogous to the two FDR-based components, in which the discrimination between homofunctional and heterofunctional members is based on evaluating, via Support Vector Machine models, all aligned positions between the query sequence and the multiple sequence alignments associated with the enzyme families. Benchmark results indicate that: i) the new SVM-based components outperform their FDR-based counterparts, and ii) both SVM-based and FDR-based components generate unique predictions. We developed classification tree models to optimally combine the results from the six EFICAz components into a final EC number prediction. The new implementation of our approach, EFICAz2, exhibits highly improved prediction precision at MTTSI < 30% compared to the original EFICAz, with only a slight decrease in prediction recall. A comparative analysis of enzyme function annotation of the human proteome by EFICAz2 and KEGG shows that: i) when both sources make EC number assignments for the same protein sequence, the assignments tend to

  18. Combining formal and functional approaches to topic structure.

    PubMed

    Zellers, Margaret; Post, Brechtje

    2012-03-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.

  19. Study of cumulative fatigue damage detection for used parts with nonlinear output frequency response functions based on NARMAX modelling

    NASA Astrophysics Data System (ADS)

    Huang, Honglan; Mao, Hanying; Mao, Hanling; Zheng, Weixue; Huang, Zhenfeng; Li, Xinxin; Wang, Xianghong

    2017-12-01

Cumulative fatigue damage detection for used parts plays a key role in remanufacturing engineering and is related to the service safety of the remanufactured parts. In light of the nonlinear properties of used parts caused by cumulative fatigue damage, a detection approach based on nonlinear output frequency response functions (NOFRFs) offers a breakthrough for this key problem. First, a modified PSO-adaptive lasso algorithm is introduced to improve the accuracy of the NARMAX model under impulse hammer excitation; an effective new algorithm is then derived to estimate the NOFRFs under rectangular pulse excitation, and an NOFRF-based index is introduced to detect cumulative fatigue damage in used parts. On this basis, a novel damage detection approach that integrates the NARMAX model and the rectangular pulse is proposed for NOFRF identification and cumulative fatigue damage detection of used parts. Finally, experimental studies of fatigued plate specimens and used connecting rod parts are conducted to verify the validity of the approach. The results reveal that the new approach can detect cumulative fatigue damage in used parts effectively and efficiently, and that the values of the NOFRF-based index can be used to distinguish different degrees of fatigue damage or working time. Since the proposed approach can extract the nonlinear properties of a system from only a single excitation of the inspected system, it shows great promise for remanufacturing engineering applications.

  20. Non-parametric identification of multivariable systems: A local rational modeling approach with application to a vibration isolation benchmark

    NASA Astrophysics Data System (ADS)

    Voorhoeve, Robbert; van der Maas, Annemiek; Oomen, Tom

    2018-05-01

Frequency response function (FRF) identification is often used as a basis for control system design and as a starting point for subsequent parametric system identification. The aim of this paper is to develop a multiple-input multiple-output (MIMO) local parametric modeling approach for FRF identification of lightly damped mechanical systems with improved speed and accuracy. The proposed method is based on local rational models, which can efficiently handle the lightly damped resonant dynamics. A key aspect herein is the freedom in the multivariable rational model parametrizations. Several choices for such parametrizations are proposed and investigated. For systems with many inputs and outputs, the required number of model parameters can rapidly increase, adversely affecting the performance of the local modeling approach. Therefore, low-order model structures are investigated. The structure of these low-order parametrizations leads to an undesired directionality in the identification problem. To address this, an iterative local rational modeling algorithm is proposed. As a special case, recently developed SISO algorithms are recovered. The proposed approach is successfully demonstrated on simulations and on an active vibration isolation system benchmark, confirming good performance of the method using significantly fewer parameters than alternative approaches.

  1. A weakly-constrained data assimilation approach to address rainfall-runoff model structural inadequacy in streamflow prediction

    NASA Astrophysics Data System (ADS)

    Lee, Haksu; Seo, Dong-Jun; Noh, Seong Jin

    2016-11-01

This paper presents a simple yet effective weakly-constrained (WC) data assimilation (DA) approach for hydrologic models which accounts for model structural inadequacies associated with rainfall-runoff transformation processes. Compared to strongly-constrained (SC) DA, WC DA adjusts the control variables less while producing similarly or more accurate analyses; hence the adjusted model states are dynamically more consistent with those of the base model. The inadequacy of a rainfall-runoff model was modeled as an additive error to runoff components prior to routing and penalized in the objective function. Two example modeling applications, distributed and lumped, were carried out to investigate the effects of the WC DA approach on DA results. For distributed modeling, the distributed Sacramento Soil Moisture Accounting (SAC-SMA) model was applied to the TIFM7 Basin in Missouri, USA. For lumped modeling, the lumped SAC-SMA model was applied to nineteen basins in Texas. In both cases, the variational DA (VAR) technique was used to assimilate discharge data at the basin outlet. For distributed SAC-SMA, spatially homogeneous error modeling yielded updated states that are spatially much more similar to the a priori states, as quantified by the Earth Mover's Distance (EMD), than spatially heterogeneous error modeling, by up to ∼10 times. DA experiments using both lumped and distributed SAC-SMA modeling indicated that assimilating outlet flow using the WC approach generally produces smaller mean absolute differences as well as higher correlations between the a priori and the updated states than the SC approach, while producing similar or smaller root mean square errors of streamflow analysis and prediction. Large differences were found, in both lumped and distributed modeling cases, between the updated and the a priori lower zone tension and primary free water contents for both WC and SC approaches, indicating possible model structural deficiency in describing low flows or

  2. The generalized Hill model: A kinematic approach towards active muscle contraction

    NASA Astrophysics Data System (ADS)

    Göktepe, Serdar; Menzel, Andreas; Kuhl, Ellen

    2014-12-01

Excitation-contraction coupling is the physiological process of converting an electrical stimulus into a mechanical response. In muscle, the electrical stimulus is an action potential and the mechanical response is active contraction. The classical Hill model characterizes muscle contraction through one contractile element, activated by electrical excitation, and two non-linear springs, one in series and one in parallel. This rheology translates into an additive decomposition of the total stress into a passive and an active part. Here we supplement this additive decomposition of the stress by a multiplicative decomposition of the deformation gradient into a passive and an active part. We generalize the one-dimensional Hill model to the three-dimensional setting and constitutively define the passive stress as a function of the total deformation gradient and the active stress as a function of both the total deformation gradient and its active part. We show that this novel approach combines the features of both the classical stress-based Hill model and the recent active-strain models. While the notion of active stress is rather phenomenological in nature, active strain is micro-structurally motivated, physically measurable, and straightforward to calibrate. We demonstrate that our model is capable of simulating excitation-contraction coupling in cardiac muscle with its characteristic features of wall thickening, apical lift, and ventricular torsion.
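The kinematic core of the generalized model described above can be summarized in a few lines; the notation below (F for the deformation gradient, σ for stress, superscripts p and a for passive and active parts) is an interpretation of the abstract in standard continuum-mechanics symbols, not the paper's exact equations:

```latex
% Additive stress decomposition (classical Hill rheology)
\boldsymbol{\sigma} = \boldsymbol{\sigma}^{p} + \boldsymbol{\sigma}^{a}
% Multiplicative split of the deformation gradient (the new ingredient)
\mathbf{F} = \mathbf{F}^{p}\,\mathbf{F}^{a}
% Constitutive assignments as stated in the abstract
\boldsymbol{\sigma}^{p} = \boldsymbol{\sigma}^{p}(\mathbf{F}), \qquad
\boldsymbol{\sigma}^{a} = \boldsymbol{\sigma}^{a}(\mathbf{F}, \mathbf{F}^{a})
```

The multiplicative split mirrors the treatment of plasticity and growth in finite-strain mechanics: the active part F^a is prescribed by the excitation, and the passive part carries the elastic response.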

  3. Elliptic supersymmetric integrable model and multivariable elliptic functions

    NASA Astrophysics Data System (ADS)

    Motegi, Kohei

    2017-12-01

We investigate the elliptic integrable model introduced by Deguchi and Martin [Int. J. Mod. Phys. A 7, Suppl. 1A, 165 (1992)], which is an elliptic extension of the Perk-Schultz model. We introduce and study a class of partition functions of the elliptic model by using the Izergin-Korepin analysis. We show that the partition functions are expressed as a product of elliptic factors and elliptic Schur-type symmetric functions. This result resembles recent work by number theorists in which the correspondence between the partition functions of trigonometric models and the product of the deformed Vandermonde determinant and Schur functions was established.

  4. A Functional Varying-Coefficient Single-Index Model for Functional Response Data

    PubMed Central

    Li, Jialiang; Huang, Chao; Zhu, Hongtu

    2016-01-01

    Motivated by the analysis of imaging data, we propose a novel functional varying-coefficient single index model (FVCSIM) to carry out the regression analysis of functional response data on a set of covariates of interest. FVCSIM represents a new extension of varying-coefficient single index models for scalar responses collected from cross-sectional and longitudinal studies. An efficient estimation procedure is developed to iteratively estimate varying coefficient functions, link functions, index parameter vectors, and the covariance function of individual functions. We systematically examine the asymptotic properties of all estimators including the weak convergence of the estimated varying coefficient functions, the asymptotic distribution of the estimated index parameter vectors, and the uniform convergence rate of the estimated covariance function and their spectrum. Simulation studies are carried out to assess the finite-sample performance of the proposed procedure. We apply FVCSIM to investigating the development of white matter diffusivities along the corpus callosum skeleton obtained from Alzheimer’s Disease Neuroimaging Initiative (ADNI) study. PMID:29200540

  5. A Functional Varying-Coefficient Single-Index Model for Functional Response Data.

    PubMed

    Li, Jialiang; Huang, Chao; Zhu, Hongtu

    2017-01-01

    Motivated by the analysis of imaging data, we propose a novel functional varying-coefficient single index model (FVCSIM) to carry out the regression analysis of functional response data on a set of covariates of interest. FVCSIM represents a new extension of varying-coefficient single index models for scalar responses collected from cross-sectional and longitudinal studies. An efficient estimation procedure is developed to iteratively estimate varying coefficient functions, link functions, index parameter vectors, and the covariance function of individual functions. We systematically examine the asymptotic properties of all estimators including the weak convergence of the estimated varying coefficient functions, the asymptotic distribution of the estimated index parameter vectors, and the uniform convergence rate of the estimated covariance function and their spectrum. Simulation studies are carried out to assess the finite-sample performance of the proposed procedure. We apply FVCSIM to investigating the development of white matter diffusivities along the corpus callosum skeleton obtained from Alzheimer's Disease Neuroimaging Initiative (ADNI) study.

  6. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    NASA Astrophysics Data System (ADS)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. It is therefore important to use the most accurate tropospheric delay models to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function must be assigned correctly to reduce such errors. Several mapping function models can treat the tropospheric slant delay, but the recent models have not been evaluated under Egyptian local climate conditions, so an assessment is needed to choose the most suitable one. The goal of this paper is to test which global mapping function agrees most closely with precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and three further models with separate dry and wet mapping functions. The results indicate that the models are very close up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag in accuracy, the Niell model performs better than the VMF model, and the model of Black and Eisner performs well. The results also indicate that the geometric range error has an insignificant effect on the slant delay and that the azimuthally anti-symmetric fluctuation is about 1%.
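The simple 1/cos z mapping function mentioned above, and the continued-fraction form on which modern mapping functions (Niell, VMF) are built, can be sketched as follows; the coefficients a, b, c are illustrative placeholders, not fitted values from any published model:

```python
import math

def mf_cosecant(z_deg):
    """Simple 1/cos(z) mapping function (flat-atmosphere approximation)."""
    return 1.0 / math.cos(math.radians(z_deg))

def mf_continued_fraction(z_deg, a=1.2e-3, b=2.9e-3, c=62.6e-3):
    """Marini/Herring-style continued-fraction mapping function.
    The coefficients a, b, c here are illustrative placeholders only."""
    e = math.radians(90.0 - z_deg)           # elevation angle
    s = math.sin(e)
    num = 1.0 + a / (1.0 + b / (1.0 + c))    # normalization: mf = 1 at zenith
    den = s + a / (s + b / (s + c))
    return num / den

# The two forms agree closely at moderate zenith angles and diverge
# toward the horizon, where the curved-atmosphere correction matters.
for z in (0, 30, 60, 80):
    print(z, round(mf_cosecant(z), 3), round(mf_continued_fraction(z), 3))
```

In operational models, the a, b, c coefficients are functions of site latitude, height, season, or numerical-weather-model data, and separate coefficient sets are used for the dry and wet delay components.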

  7. Using A Model-Based Systems Engineering Approach For Exploration Medical System Development

    NASA Technical Reports Server (NTRS)

    Hanson, A.; Mindock, J.; McGuire, K.; Reilly, J.; Cerro, J.; Othon, W.; Rubin, D.; Urbina, M.; Canga, M.

    2017-01-01

    NASA's Human Research Program's Exploration Medical Capabilities (ExMC) element is defining the medical system needs for exploration class missions. ExMC's Systems Engineering (SE) team will play a critical role in successful design and implementation of the medical system into exploration vehicles. The team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." Development of the medical system is being conducted in parallel with exploration mission architecture and vehicle design development. Successful implementation of the medical system in this environment will require a robust systems engineering approach to enable technical communication across communities to create a common mental model of the emergent engineering and medical systems. Model-Based Systems Engineering (MBSE) improves shared understanding of system needs and constraints between stakeholders and offers a common language for analysis. The ExMC SE team is using MBSE techniques to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. Systems Modeling Language (SysML) is the specific language the SE team is utilizing, within an MBSE approach, to model the medical system functional needs, requirements, and architecture. Modeling methods are being developed through the practice of MBSE within the team, and tools are being selected to support meta-data exchange as integration points to other system models are identified. Use of MBSE is supporting the development of relationships across disciplines and NASA Centers to build trust and enable teamwork, enhance visibility of team goals, foster a culture of unbiased learning and serving, and be responsive to customer needs. The MBSE approach to medical system design offers a paradigm shift toward greater integration between

  8. Controlled grafting of vinylic monomers on polyolefins: a robust mathematical modeling approach

    PubMed Central

    Saeb, Mohammad Reza; Rezaee, Babak; Shadman, Alireza; Formela, Krzysztof; Ahmadi, Zahed; Hemmati, Farkhondeh; Kermaniyan, Tayebeh Sadat; Mohammadi, Yousef

    2017-01-01

Experimental and mathematical modeling analyses were used to control melt free-radical grafting of vinylic monomers on polyolefins and thereby reduce the disturbance of undesired cross-linking of polyolefins. Response surface, desirability function, and artificial intelligence methodologies were combined for modeling and optimization of the grafting reaction in terms of vinylic monomer content, peroxide initiator concentration, and melt-processing time. An in-house code was developed based on an artificial neural network that learns and mimics the processing torque and the grafting of glycidyl methacrylate (GMA), a typical vinylic monomer, on high-density polyethylene (HDPE). Application of response surface and desirability functions enabled concurrent optimization of processing torque and GMA grafting on HDPE, through which we quantified for the first time the competition between parallel reactions taking place during melt processing: (i) desirable grafting of GMA on HDPE and (ii) undesirable cross-linking of HDPE. The proposed robust mathematical modeling approach can precisely learn the behavior of the grafting reaction of vinylic monomers on polyolefins and can be put into practice to find the exact operating conditions needed for efficient grafting of reactive monomers on polyolefins. PMID:29491797

  9. Controlled grafting of vinylic monomers on polyolefins: a robust mathematical modeling approach.

    PubMed

    Saeb, Mohammad Reza; Rezaee, Babak; Shadman, Alireza; Formela, Krzysztof; Ahmadi, Zahed; Hemmati, Farkhondeh; Kermaniyan, Tayebeh Sadat; Mohammadi, Yousef

    2017-01-01

Experimental and mathematical modeling analyses were used to control melt free-radical grafting of vinylic monomers on polyolefins and thereby reduce the disturbance of undesired cross-linking of polyolefins. Response surface, desirability function, and artificial intelligence methodologies were combined for modeling and optimization of the grafting reaction in terms of vinylic monomer content, peroxide initiator concentration, and melt-processing time. An in-house code was developed based on an artificial neural network that learns and mimics the processing torque and the grafting of glycidyl methacrylate (GMA), a typical vinylic monomer, on high-density polyethylene (HDPE). Application of response surface and desirability functions enabled concurrent optimization of processing torque and GMA grafting on HDPE, through which we quantified for the first time the competition between parallel reactions taking place during melt processing: (i) desirable grafting of GMA on HDPE and (ii) undesirable cross-linking of HDPE. The proposed robust mathematical modeling approach can precisely learn the behavior of the grafting reaction of vinylic monomers on polyolefins and can be put into practice to find the exact operating conditions needed for efficient grafting of reactive monomers on polyolefins.

  10. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    NASA Astrophysics Data System (ADS)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
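The error-propagation idea in the abstract can be illustrated with plain Monte Carlo sampling in place of the paper's adaptive sparse grids; the two-step toy kinetics and the ±0.2 eV uniform error bounds below are assumptions for illustration, not the Co3O4 microkinetic model:

```python
import numpy as np

kB_T = 0.025852  # eV at ~300 K

def tof(E1, E2):
    """Toy turnover frequency for two sequential steps with barriers
    E1, E2 (eV); the functional form is illustrative only."""
    k1 = np.exp(-E1 / kB_T)
    k2 = np.exp(-E2 / kB_T)
    return k1 * k2 / (k1 + k2)  # steady state of two steps in series

# Maximum-entropy-style error model: independent uniform DFT errors
# within bounds of +/-0.2 eV around assumed nominal barriers.
rng = np.random.default_rng(0)
n = 20000
E1 = 0.6 + rng.uniform(-0.2, 0.2, n)
E2 = 0.8 + rng.uniform(-0.2, 0.2, n)
log_tof = np.log10(tof(E1, E2))

# The spread of log10(TOF) shows the orders-of-magnitude uncertainty
# that modest eV-scale errors induce through the exponential rate law.
spread = log_tof.max() - log_tof.min()
print(f"log10(TOF) spans about {spread:.1f} orders of magnitude")
```

Sparse-grid quadrature reaches comparable integral estimates with far fewer model evaluations than naive sampling, which matters when each evaluation is an expensive microkinetic or kinetic Monte Carlo run.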

  11. A flexible model for the mean and variance functions, with application to medical cost data.

    PubMed

    Chen, Jinsong; Liu, Lei; Zhang, Daowen; Shih, Ya-Chen T

    2013-10-30

    Medical cost data are often skewed to the right and heteroscedastic, having a nonlinear relation with covariates. To tackle these issues, we consider an extension to generalized linear models by assuming nonlinear associations of covariates in the mean function and allowing the variance to be an unknown but smooth function of the mean. We make no further assumption on the distributional form. The unknown functions are described by penalized splines, and the estimation is carried out using nonparametric quasi-likelihood. Simulation studies show the flexibility and advantages of our approach. We apply the model to the annual medical costs of heart failure patients in the clinical data repository at the University of Virginia Hospital System. Copyright © 2013 John Wiley & Sons, Ltd.
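The modeling idea above (nonlinear mean, variance an unknown smooth function of the mean, no distributional assumption) can be sketched with simple stand-ins: a cubic polynomial in place of penalized splines and a power law in place of an arbitrary smooth variance function, both flagged as assumptions in the comments:

```python
import numpy as np

# Simulated right-skewed, heteroscedastic "cost" data: sd grows with the mean.
rng = np.random.default_rng(2)
n = 2000
x = rng.uniform(0, 1, n)
mu = np.exp(1 + 2 * x)                    # nonlinear mean function
y = mu + rng.standard_normal(n) * mu      # variance = mu^2

def fit_mean(x, y, w=None):
    """Cubic-polynomial stand-in for a penalized-spline mean function."""
    return np.polynomial.polynomial.Polynomial.fit(x, y, 3, w=w)

m0 = fit_mean(x, y)                        # unweighted first pass
mu_hat = np.maximum(m0(x), 1e-6)           # guard against non-positive fits
resid2 = (y - m0(x)) ** 2

# Smooth variance-vs-mean relation: a power law V(mu) = a * mu^b fitted on
# the log scale, standing in for an unknown smooth variance function.
b, log_a = np.polyfit(np.log(mu_hat), np.log(resid2 + 1e-12), 1)
V = np.exp(log_a) * mu_hat ** b

# Quasi-likelihood-style refit: reweight by the estimated variance.
m1 = fit_mean(x, y, w=1.0 / np.sqrt(V))

print(f"estimated variance power: b = {b:.2f} (data generated with b = 2)")
```

The reweighted fit m1 downweights high-cost, high-variance observations, which is the practical payoff of modeling the variance rather than assuming it constant.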

  12. A generalized estimating equations approach for resting-state functional MRI group analysis.

    PubMed

    D'Angelo, Gina M; Lazar, Nicole A; Eddy, William F; Morris, John C; Sheline, Yvette I

    2011-01-01

An Alzheimer's fMRI study has motivated us to evaluate inter-regional correlations between groups. The overall objective is to assess inter-regional correlations in the resting state, with no stimulus or task. We propose using a generalized estimating equation (GEE) transition model and a GEE marginal model to model the within-subject correlation for each region. Residuals calculated from the GEE models are used to correlate brain regions and assess between-group differences. The standard pooling approach, which averages Fisher-z-transformed correlations within groups under an assumption of temporal independence, is a typical way to compare group correlations. The GEE approaches and the standard Fisher-z pooling approach are demonstrated with an Alzheimer's disease (AD) connectivity study in a population of AD subjects and healthy control subjects. We also compare these methods using simulation studies and show that the transition model may have better statistical properties.
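The contrast between pooling raw correlations via Fisher-z and correlating residuals after a transition-style (autoregressive) fit can be sketched on synthetic data; the AR(1) fit below is a simplified stand-in for the GEE transition model, not the paper's estimator:

```python
import numpy as np

def ar1_residuals(x):
    """Residuals from a first-order autoregressive fit, a simplified
    stand-in for the GEE transition model's within-region correction."""
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    return x[1:] - phi * x[:-1]

def fisher_z(r):
    """Fisher z-transformation of a correlation coefficient."""
    return np.arctanh(r)

rng = np.random.default_rng(1)
T = 500

def ar1_noise(phi=0.6):
    """Temporally autocorrelated region-specific noise."""
    e = rng.standard_normal(T)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + e[t]
    return x

# Two "regions" sharing a common signal plus independent AR(1) noise.
common = rng.standard_normal(T)
region_a = common + ar1_noise()
region_b = common + ar1_noise()

# Naive approach: correlate the raw series, then Fisher-z.
r_raw = np.corrcoef(region_a, region_b)[0, 1]
# Residual approach: remove within-region temporal dependence first.
r_res = np.corrcoef(ar1_residuals(region_a), ar1_residuals(region_b))[0, 1]
print(f"raw r={r_raw:.2f} (z={fisher_z(r_raw):.2f}), "
      f"residual r={r_res:.2f} (z={fisher_z(r_res):.2f})")
```

The residual-based correlation is computed on approximately temporally independent series, so its Fisher-z values have closer-to-nominal variance when averaged across subjects, which is the motivation for the GEE-based group comparison.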

  13. Multimodal Light Microscopy Approaches to Reveal Structural and Functional Properties of Promyelocytic Leukemia Nuclear Bodies.

    PubMed

    Hoischen, Christian; Monajembashi, Shamci; Weisshart, Klaus; Hemmerich, Peter

    2018-01-01

The promyelocytic leukemia (pml) gene product PML is a tumor suppressor localized mainly in the nucleus of mammalian cells. In the cell nucleus, PML seeds the formation of macromolecular multiprotein complexes known as PML nuclear bodies (PML NBs). While PML NBs have been implicated in many cellular functions, including cell cycle regulation, survival, and apoptosis, their role as signaling hubs along major genome maintenance pathways has emerged more clearly. However, despite extensive research over the past decades, the precise biochemical function of PML in these pathways is still elusive. It remains a big challenge to unify all the different previously suggested cellular functions of PML NBs into one mechanistic model. With the advent of genetically encoded fluorescent proteins, it became possible to trace protein function in living specimens. In parallel, a variety of fluorescence fluctuation microscopy (FFM) approaches have been developed which allow precise determination of the biophysical and interaction properties of cellular factors at the single-molecule level in living cells. In this report, we summarize the current knowledge on PML nuclear bodies and describe several fluorescence imaging, manipulation, FFM, and super-resolution techniques suitable for analyzing PML body assembly and function. These include fluorescence redistribution after photobleaching, fluorescence resonance energy transfer, fluorescence correlation spectroscopy, raster image correlation spectroscopy, ultraviolet laser microbeam-induced DNA damage, erythrocyte-mediated force application, and super-resolution microscopy approaches. Since most if not all of the microscopic equipment needed to perform these techniques may be available in an institutional or nearby facility, we hope to encourage more researchers to exploit sophisticated imaging tools for their research in cancer biology.

  14. Fracture and fatigue analysis of functionally graded and homogeneous materials using singular integral equation approach

    NASA Astrophysics Data System (ADS)

    Zhao, Huaqing

    functionally graded materials. This work provides a solid foundation for further applications of the singular integral equation approach to fracture and fatigue problems in advanced composites. The concept of crack bridging is a unifying theory for fracture at various length scales, from atomic cleavage to rupture of concrete structures. However, most of the previous studies are limited to small scale bridging analyses although large scale bridging conditions prevail in engineering materials. In this work, a large scale bridging analysis is included within the framework of singular integral equation approach. This allows us to study fracture, fatigue and toughening mechanisms in advanced materials with crack bridging. As an example, the fatigue crack growth of grain bridging ceramics is studied. With the advent of composite materials technology, more complex material microstructures are being introduced, and more mechanics issues such as inhomogeneity and nonlinearity come into play. Improved mathematical and numerical tools need to be developed to allow theoretical modeling of these materials. This thesis work is an attempt to meet these challenges by making contributions to both micromechanics modeling and applied mathematics. It sets the stage for further investigations of a wide range of problems in the deformation and fracture of advanced engineering materials.

  15. Parent-child Communication-centered Rehabilitative Approach for Pediatric Functional Somatic Symptoms.

    PubMed

    Gerner, Maya; Barak, Sharon; Landa, Jana; Eisenstein, Etzyona

    2016-01-01

Functional somatic symptoms (FSS) are a type of somatization phenomenon. Integrative rehabilitation approaches are the preferred treatment for pediatric FSS, but parental roles in the treatment process have not been established. The aims were to present 1) a parent-focused treatment (PFT) for pediatric FSS and 2) the approach's preliminary results. The sample included 50 children with physical disabilities due to FSS. All children received PFT, including physical and psychological therapy. A detailed description of the program's course and guiding principles is provided. Outcome measures were FSS extinction and age-appropriate functioning. Post-program, 84% of participants did not exhibit FSS and 94% returned to age-appropriate functioning. At one-year follow-up, only 5% of participants experienced symptom recurrence. No associations were found between pre-admission symptoms and intervention duration. PFT is beneficial in treating pediatric FSS. Therefore, intensive parental involvement in rehabilitation may be cardinal.

  16. Questionnaire of Executive Function for Dancers: An Ecological Approach

    ERIC Educational Resources Information Center

    Wong, Alina; Rodriguez, Mabel; Quevedo, Liliana; de Cossio, Lourdes Fernandez; Borges, Ariel; Reyes, Alicia; Corral, Roberto; Blanco, Florentino; Alvarez, Miguel

    2012-01-01

    There is a current debate about the ecological validity of executive function (EF) tests. Consistent with the verisimilitude approach, this research proposes the Ballet Executive Scale (BES), a self-rating questionnaire that assimilates idiosyncratic executive behaviors of the classical dance community. The BES was administered to 149 adolescents,…

  17. HABITAT MODELING APPROACHES FOR RESTORATION SITE SELECTION

    EPA Science Inventory

    Numerous modeling approaches have been used to develop predictive models of species-environment and species-habitat relationships. These models have been used in conservation biology and habitat or species management, but their application to restoration efforts has been minimal...

  18. Designing a model for trauma system management using public health approach: the case of Iran.

    PubMed

    Tarighi, Payam; Tabibi, Seyed Jamaledin; Motevalian, Seyed Abbas; Tofighi, Shahram; Maleki, Mohammad Reza; Delgoshaei, Bahram; Panahi, Farzad; Masoomi, Gholam Reza

    2012-01-01

    Trauma is a leading cause of death and disability around the world. Injuries are responsible for about six million deaths annually, of which ninety percent occur in developing countries. In Iran, injuries are the most common cause of death among people under fifty. Trauma system development is a systematic and comprehensive approach to injury prevention and treatment whose effectiveness has been proven. The present study aims at designing a trauma system management model as the first step toward establishing a trauma system in Iran. In this qualitative research, a conceptual framework was developed based on the public health approach and three well-known trauma system models. We used Benchmarks, Indicators and Scoring (BIS) to analyze the current situation of Iran's trauma care system. The trauma system management model was then designed using the policy development phase of the public health approach. The model, validated by a panel of experts, describes the lead agency, trauma system plan, policy-making councils, and data-based control according to the four main functions of management: leading, planning, organizing and controlling. It may be implemented in two phases: the exclusive phase, focusing on resource integration, and the inclusive phase, which concentrates on system development. The model could facilitate the development of a trauma system in Iran through pilot studies as the assurance phase of the public health approach. Furthermore, it can provide a practical framework for trauma system management at the international level.

  19. Robust scoring functions for protein-ligand interactions with quantum chemical charge models.

    PubMed

    Wang, Jui-Chih; Lin, Jung-Hsin; Chen, Chung-Ming; Perryman, Alex L; Olson, Arthur J

    2011-10-24

    Ordinary least-squares (OLS) regression has been used widely for constructing scoring functions for protein-ligand interactions. However, OLS is very sensitive to outliers, and models constructed with it are easily affected by outliers or even by the choice of the data set. On the other hand, the determination of atomic charges is of central importance, because the electrostatic interaction is known to be a key contributing factor in biomolecular association. In the development of the AutoDock4 scoring function, only OLS was conducted, and the simple Gasteiger method was adopted. It is therefore of considerable interest to see whether more rigorous charge models could improve the statistical performance of the AutoDock4 scoring function. In this study, we have employed two well-established quantum chemical approaches, namely the restrained electrostatic potential (RESP) and the Austin-model 1-bond charge correction (AM1-BCC) methods, to obtain atomic partial charges, and we have compared how different charge models affect the performance of AutoDock4 scoring functions. In combination with robust regression analysis and outlier exclusion, our new protein-ligand free energy regression model with AM1-BCC charges for ligands and Amber99SB charges for proteins achieves the lowest root-mean-squared errors: 1.637 kcal/mol for the training set of 147 complexes and 2.176 kcal/mol for the external test set of 1427 complexes. The assessment of binding pose prediction with the 100 external decoy sets indicates a very high success rate of 87% under the criterion of a predicted root-mean-squared deviation of less than 2 Å. The success rates and statistical performance of our robust scoring functions are only weakly class-dependent (hydrophobic, hydrophilic, or mixed).
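    The robust-regression idea the abstract relies on can be illustrated with a generic Huber-type iteratively reweighted least-squares fit. This is a minimal sketch on synthetic data, not the authors' actual AutoDock4 training pipeline; the function name, tuning constant, and toy data set are all illustrative.

    ```python
    import numpy as np

    def huber_irls(X, y, delta=1.345, n_iter=50):
        """Robust linear regression via iteratively reweighted least squares.
        Residuals beyond `delta` robust scale units are downweighted, so a
        few gross outliers no longer dominate the fit the way they do in OLS."""
        w = np.ones(len(y))
        for _ in range(n_iter):
            # weighted least-squares step: solve X'WX beta = X'Wy
            WX = X * w[:, None]
            beta = np.linalg.solve(X.T @ WX, WX.T @ y)
            r = y - X @ beta
            # robust residual scale from the median absolute deviation
            s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
            u = np.abs(r) / (s * delta)
            w = np.where(u <= 1.0, 1.0, 1.0 / u)   # Huber weight function
        return beta

    # Toy data: y = 2*x + 1 plus one gross outlier that would skew OLS.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 40)
    y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(40)
    y[5] += 50.0
    X = np.column_stack([x, np.ones_like(x)])
    beta = huber_irls(X, y)   # recovers slope ~2 and intercept ~1
    ```

    The first pass (unit weights) is plain OLS; subsequent passes shrink the outlier's weight toward zero, which is the qualitative behavior the abstract exploits.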

  20. Modeling for (physical) biologists: an introduction to the rule-based approach

    PubMed Central

    Chylek, Lily A; Harris, Leonard A; Faeder, James R; Hlavacek, William S

    2015-01-01

    Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions. PMID:26178138
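    The notion of a rule as a generalized reaction, i.e., a reactant pattern over local site states plus a transformation and a rate constant, can be sketched in a few lines. The classes, site names, and rule below are illustrative only and are not BioNetGen or Kappa syntax.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Molecule:
        name: str
        sites: dict   # site name -> state, e.g. {"Y": "u"} (u = unphosphorylated)

    @dataclass
    class Rule:
        """A rule pairs a reactant pattern (a predicate on local site states)
        with a transformation and a rate constant; it is instantiated against
        whatever concrete molecules happen to match the pattern."""
        pattern: callable
        transform: callable
        rate_constant: float

        def matches(self, pool):
            return [m for m in pool if self.pattern(m)]

    # Rule: any molecule with an unphosphorylated site "Y" gets phosphorylated,
    # regardless of the states of any of its other sites (a local pattern).
    phosphorylate = Rule(
        pattern=lambda m: m.sites.get("Y") == "u",
        transform=lambda m: m.sites.update({"Y": "p"}),
        rate_constant=1.0,
    )

    pool = [
        Molecule("A", {"Y": "u", "S": "u"}),
        Molecule("B", {"Y": "p"}),          # already phosphorylated: no match
        Molecule("C", {"Y": "u"}),
    ]

    reactants = phosphorylate.matches(pool)  # A and C match; B does not
    for m in reactants:
        phosphorylate.transform(m)
    ```

    A single rule here stands in for the many concrete reactions it generates, which is the combinatorial economy that makes the rule-based approach attractive.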