Sample records for functional models

  1. Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2013-01-01

    A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.
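The frequency-domain starting point of this method, Fourier-transforming input/output time series and working with their ratio, can be sketched in a few lines. This is only a minimal illustration of estimating a frequency response as G(jω) = Y(ω)/U(ω) from discrete Fourier transforms, not Morelli's orthogonal-function algorithm; the one-sample circular-delay test system is hypothetical.

```python
import cmath
import math
import random

def dft(x):
    """Plain discrete Fourier transform (O(N^2), fine for a sketch)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Hypothetical test system: a one-sample circular delay, whose exact
# frequency response at DFT bin k is exp(-j*2*pi*k/N).
random.seed(0)
n = 16
u = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [u[(t - 1) % n] for t in range(n)]

U, Y = dft(u), dft(y)
# Empirical frequency response at each bin with usable input energy.
G = {k: Y[k] / U[k] for k in range(n) if abs(U[k]) > 1e-3}
for k, g in G.items():
    expected = cmath.exp(-2j * math.pi * k / n)
    assert abs(g - expected) < 1e-9  # delay dynamics recovered bin by bin
```

With noisy data, as in the paper, the per-bin ratio becomes an estimate rather than an identity, which is where model structure determination and uncertainty quantification come in.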

  2. Functional CAR models for large spatially correlated functional datasets.

    PubMed

    Zhang, Lin; Baladandayuthapani, Veerabhadran; Zhu, Hongxiao; Baggerly, Keith A; Majewski, Tadeusz; Czerniak, Bogdan A; Morris, Jeffrey S

    2016-01-01

    We develop a functional conditional autoregressive (CAR) model for spatially correlated data for which functions are collected on areal units of a lattice. Our model performs functional response regression while accounting for spatial correlations with potentially nonseparable and nonstationary covariance structure, in both the space and functional domains. We show theoretically that our construction leads to a CAR model at each functional location, with spatial covariance parameters varying and borrowing strength across the functional domain. Using basis transformation strategies, the nonseparable spatial-functional model is computationally scalable to enormous functional datasets, generalizable to different basis functions, and can be used on functions defined on higher dimensional domains such as images. Through simulation studies, we demonstrate that accounting for the spatial correlation in our modeling leads to improved functional regression performance. Applied to a high-throughput spatially correlated copy number dataset, the model identifies genetic markers not identified by comparable methods that ignore spatial correlations.
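The areal building block here, a conditional autoregressive model on a lattice, can be made concrete through its standard precision-matrix form Q = τ(D − ρW), where W is the adjacency matrix and D holds neighbor counts; the full conditional mean of each site is then a weighted average of its neighbors. The sketch below is the generic CAR block only, not the authors' functional extension, and the 2×2 lattice and parameter values are illustrative.

```python
# Proper CAR precision matrix Q = tau * (D - rho * W) on a 2x2 rook lattice:
# sites 0-1
#       2-3
neighbors = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
tau, rho = 2.0, 0.8  # illustrative precision and spatial-dependence values

n = len(neighbors)
W = [[1.0 if j in neighbors[i] else 0.0 for j in range(n)] for i in range(n)]
D = [[float(len(neighbors[i])) if i == j else 0.0 for j in range(n)]
     for i in range(n)]
Q = [[tau * (D[i][j] - rho * W[i][j]) for j in range(n)] for i in range(n)]

# CAR full conditional: E[x_i | x_-i] = -(1/Q_ii) * sum_{j != i} Q_ij x_j,
# which reduces to (rho / d_i) * (sum of neighboring values).
x = [1.0, 2.0, 0.5, -1.0]
i = 0
cond_mean = -sum(Q[i][j] * x[j] for j in range(n) if j != i) / Q[i][i]
assert abs(cond_mean - rho / len(neighbors[i]) * (x[1] + x[2])) < 1e-12
```

In the paper this structure holds at each functional location, with τ and ρ varying smoothly across the functional domain.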

  3. Functional Status Outperforms Comorbidities as a Predictor of 30-Day Acute Care Readmissions in the Inpatient Rehabilitation Population.

    PubMed

    Shih, Shirley L; Zafonte, Ross; Bates, David W; Gerrard, Paul; Goldstein, Richard; Mix, Jacqueline; Niewczyk, Paulette; Greysen, S Ryan; Kazis, Lewis; Ryan, Colleen M; Schneider, Jeffrey C

    2016-10-01

    Functional status is associated with patient outcomes, but is rarely included in hospital readmission risk models. The objective of this study was to determine whether functional status is a better predictor of 30-day acute care readmission than traditionally investigated variables including demographics and comorbidities. Retrospective database analysis between 2002 and 2011. 1158 US inpatient rehabilitation facilities. 4,199,002 inpatient rehabilitation facility admissions comprising patients from 16 impairment groups within the Uniform Data System for Medical Rehabilitation database. Logistic regression models predicting 30-day readmission were developed based on age, gender, comorbidities (Elixhauser comorbidity index, Deyo-Charlson comorbidity index, and Medicare comorbidity tier system), and functional status [Functional Independence Measure (FIM)]. We hypothesized that (1) function-based models would outperform demographic- and comorbidity-based models and (2) the addition of demographic and comorbidity data would not significantly enhance function-based models. For each impairment group, Function Only Models were compared against Demographic-Comorbidity Models and Function Plus Models (Function-Demographic-Comorbidity Models). The primary outcome was 30-day readmission, and the primary measure of model performance was the c-statistic. All-cause 30-day readmission rate from inpatient rehabilitation facilities to acute care hospitals was 9.87%. C-statistics for the Function Only Models were 0.64 to 0.70. For all 16 impairment groups, the Function Only Model demonstrated better c-statistics than the Demographic-Comorbidity Models (c-statistic difference: 0.03-0.12). The best-performing Function Plus Models exhibited negligible improvements in model performance compared to Function Only Models, with c-statistic improvements of only 0.01 to 0.05. 
Readmissions are currently used as a marker of hospital performance, with recent financial penalties to hospitals for excessive readmissions. Function-based readmission models outperform models based only on demographics and comorbidities. Readmission risk models would benefit from the inclusion of functional status as a primary predictor. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
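The c-statistic used as the performance measure above is the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted patient, with ties counted as one half. A minimal pairwise computation, with made-up risk scores, looks like this:

```python
def c_statistic(pos_scores, neg_scores):
    """Concordance (AUC): fraction of positive/negative pairs ranked correctly."""
    concordant = 0.0
    for p in pos_scores:
        for q in neg_scores:
            if p > q:
                concordant += 1.0
            elif p == q:
                concordant += 0.5  # ties credited half
    return concordant / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted readmission risks.
readmitted = [0.9, 0.8]
not_readmitted = [0.7, 0.8, 0.1]
c = c_statistic(readmitted, not_readmitted)
assert abs(c - 5.5 / 6.0) < 1e-12  # 5 concordant pairs + 1 tie out of 6
```

A c-statistic of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which is why differences of 0.03 to 0.12 between models are meaningful.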

  4. Functional status predicts acute care readmission in the traumatic spinal cord injury population.

    PubMed

    Huang, Donna; Slocum, Chloe; Silver, Julie K; Morgan, James W; Goldstein, Richard; Zafonte, Ross; Schneider, Jeffrey C

    2018-03-29

    Context/objective Acute care readmission has been identified as an important marker of healthcare quality. Most previous models assessing risk prediction of readmission incorporate variables for medical comorbidity. We hypothesized that functional status is a more robust predictor of readmission in the spinal cord injury population than medical comorbidities. Design Retrospective cross-sectional analysis. Setting Inpatient rehabilitation facilities, Uniform Data System for Medical Rehabilitation data from 2002 to 2012. Participants Traumatic spinal cord injury patients. Outcome measures A logistic regression model for predicting acute care readmission based on demographic variables and functional status (Functional Model) was compared with models incorporating demographics, functional status, and medical comorbidities (Functional-Plus) or models including demographics and medical comorbidities (Demographic-Comorbidity). The primary outcomes were 3- and 30-day readmission, and the primary measure of model performance was the c-statistic. Results There were a total of 68,395 patients with 1,469 (2.15%) readmitted at 3 days and 7,081 (10.35%) readmitted at 30 days. The c-statistics for the Functional Model were 0.703 and 0.654 for 3 and 30 days. The Functional Model outperformed Demographic-Comorbidity models at 3 days (c-statistic difference: 0.066-0.096) and outperformed two of the three Demographic-Comorbidity models at 30 days (c-statistic difference: 0.029-0.056). The Functional-Plus models exhibited negligible improvements (0.002-0.010) in model performance compared to the Functional models. Conclusion Readmissions are used as a marker of hospital performance. Function-based readmission models in the spinal cord injury population outperform models incorporating medical comorbidities. Readmission risk models for this population would benefit from the inclusion of functional status.

  5. Building a reference functional model for EHR systems.

    PubMed

    Sumita, Yuki; Takata, Mami; Ishitsuka, Keiju; Tominaga, Yasuyuki; Ohe, Kazuhiko

    2007-09-01

    Our aim was to develop a reference functional model for electronic health record systems (RFM). Such an RFM is built from functions using functional descriptive elements (FDEs) and represents the static relationships between them. This paper presents a new format for describing electronic health record (EHR) system functions. A questionnaire and field interview survey was conducted in five hospitals in Japan and one in the USA to collect data on EHR system functions. Based on the survey results, a reference functional list (RFL) was created, in which each EHR system function was listed and divided into 13 FDE types. By analyzing the RFL, we built the meta-functional model and the functional model using UML class diagrams. The former defines the language for expressing the functional model, while the latter represents functions, FDEs, and their static relationships. A total of 385 functions were represented in the RFL. Six patterns were found for the relationships between functions. The meta-functional model was created as a new format for describing functions. Examples of the functional model, which included the six patterns in the relationships between functions and 11 verbs, were created. We present the meta-functional model, which is a new description format for the functional structure and relationships. Although a more detailed description is required to apply the RFM to the semiautomatic generation of functional specification documents, our RFM can visualize functional structures and functional relationships, classify functions using multiple axes, and identify the similarities and differences between functions. The RFM will promote not only the standardization of EHR systems, but also communication between system developers and healthcare providers in the EHR system-design process. 2006 Elsevier Ireland Ltd.

  6. Measurement of Function Post Hip Fracture: Testing a Comprehensive Measurement Model of Physical Function

    PubMed Central

    Gruber-Baldini, Ann L.; Hicks, Gregory; Ostir, Glen; Klinedinst, N. Jennifer; Orwig, Denise; Magaziner, Jay

    2015-01-01

    Background Measurement of physical function post hip fracture has been conceptualized using multiple different measures. Purpose This study tested a comprehensive measurement model of physical function. Design This was a descriptive secondary data analysis including 168 men and 171 women post hip fracture. Methods Using structural equation modeling, a measurement model of physical function that included grip strength, activities of daily living, instrumental activities of daily living, and performance was tested for fit at 2 and 12 months post hip fracture and among male and female participants. Validity of the measurement model was evaluated based on how well the model explained physical activity, exercise, and social activities post hip fracture. Findings The measurement model of physical function fit the data. The amount of variance explained by the model, or by its individual factors, varied depending on the activity. Conclusion Decisions about the ideal way in which to measure physical function should be based on the outcomes considered and the participants. Clinical Implications The measurement model of physical function is a reliable and valid method to comprehensively measure physical function across the hip fracture recovery trajectory. Practical but useful assessment of function should be considered and monitored over the recovery trajectory post hip fracture. PMID:26492866

  7. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Treesearch

    John S. Hogland; Nathaniel M. Anderson; J. Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  8. Functional linear models for zero-inflated count data with application to modeling hospitalizations in patients on dialysis.

    PubMed

    Sentürk, Damla; Dalrymple, Lorien S; Nguyen, Danh V

    2014-11-30

    We propose functional linear models for zero-inflated count data with a focus on the functional hurdle and functional zero-inflated Poisson (ZIP) models. Whereas the hurdle model assumes the counts come from a mixture of a degenerate distribution at zero and a zero-truncated Poisson distribution, the ZIP model considers a mixture of a degenerate distribution at zero and a standard Poisson distribution. We extend the generalized functional linear model framework with a functional predictor and multiple cross-sectional predictors to model counts generated by a mixture distribution. We propose an estimation procedure for functional hurdle and ZIP models, called penalized reconstruction, geared towards error-prone and sparsely observed longitudinal functional predictors. The approach relies on dimension reduction and pooling of information across subjects involving basis expansions and penalized maximum likelihood techniques. The developed functional hurdle model is applied to modeling hospitalizations within the first 2 years from initiation of dialysis, with a high percentage of zeros, in the Comprehensive Dialysis Study participants. Hospitalization counts are modeled as a function of sparse longitudinal measurements of serum albumin concentrations, patient demographics, and comorbidities. Simulation studies are used to study finite sample properties of the proposed method and include comparisons with an adaptation of standard principal components regression. Copyright © 2014 John Wiley & Sons, Ltd.
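The two mixtures contrasted above have simple closed-form probability mass functions: the ZIP model mixes a point mass at zero with a standard Poisson, while the hurdle model mixes a point mass at zero with a zero-truncated Poisson. A minimal sketch (the zero-mass and rate values are illustrative):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def zip_pmf(k, pi0, lam):
    """Zero-inflated Poisson: point mass at 0 mixed with a standard Poisson."""
    extra = pi0 if k == 0 else 0.0
    return extra + (1.0 - pi0) * poisson_pmf(k, lam)

def hurdle_pmf(k, pi0, lam):
    """Hurdle: point mass at 0 mixed with a zero-truncated Poisson."""
    if k == 0:
        return pi0
    return (1.0 - pi0) * poisson_pmf(k, lam) / (1.0 - math.exp(-lam))

pi0, lam = 0.4, 2.5  # illustrative zero mass and Poisson rate
for pmf in (zip_pmf, hurdle_pmf):
    total = sum(pmf(k, pi0, lam) for k in range(60))
    assert abs(total - 1.0) < 1e-9  # each mixture is a proper distribution
```

The functional extension replaces the constant λ (and possibly π₀) with a linear predictor containing the integral of a coefficient function against the longitudinal covariate.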

  9. Function Model for Community Health Service Information

    NASA Astrophysics Data System (ADS)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify CHS information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information, which includes 4 super-classes, 15 classes, and 28 sub-classes of business function, 43 business processes, and 168 business activities, was then established. This model can facilitate information management system development and workflow refinement.

  10. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    NASA Astrophysics Data System (ADS)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

    For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. Thus, it is important to use the most accurate models for the tropospheric delay to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function should be assigned correctly to reduce such errors. Several mapping function models can treat the troposphere slant delay, but the recent models have not been evaluated for Egyptian local climate conditions, and an assessment of these models is needed to choose the most suitable one. The goal of this paper is to test which global mapping function provides the highest consistency with precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and three separate dry and wet mapping function models. The results of the research indicate that the models are very close up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag in accuracy, the Niell model performs better than the VMF model, and the Black and Eisner model performs well. The results also indicate that the geometric range error has an insignificant effect on slant delay and that the azimuthally anti-symmetric fluctuation is about 1%.
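The simplest mapping function compared above is the plane-parallel 1/cos z factor that scales a zenith delay to a slant path; the modern alternatives (Niell, VMF) are normalized continued fractions in cos z of the Marini/Herring form. A sketch comparing the two shapes; the continued-fraction coefficients below are merely order-of-magnitude illustrative values, not any published model's tabulated coefficients:

```python
import math

def mf_cosecant(z_deg):
    """Plane-parallel mapping function: slant delay = zenith delay / cos z."""
    return 1.0 / math.cos(math.radians(z_deg))

def mf_continued_fraction(z_deg, a, b, c):
    """Normalized three-term continued fraction (Marini/Herring form).

    The normalization makes the mapping factor exactly 1 at zenith (z = 0).
    """
    cz = math.cos(math.radians(z_deg))
    numer = 1.0 + a / (1.0 + b / (1.0 + c))
    denom = cz + a / (cz + b / (cz + c))
    return numer / denom

# Illustrative coefficients only; real models tabulate a, b, c by
# latitude, height, and day of year.
a, b, c = 1.2e-3, 2.9e-3, 62.6e-3

assert abs(mf_continued_fraction(0.0, a, b, c) - 1.0) < 1e-12
# At low elevation the two diverge: the continued fraction stays below
# 1/cos z, which is where the simple model loses accuracy.
assert mf_continued_fraction(80.0, a, b, c) < mf_cosecant(80.0)
```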

  11. Parametric correlation functions to model the structure of permanent environmental (co)variances in milk yield random regression models.

    PubMed

    Bignardi, A B; El Faro, L; Cardoso, V L; Machado, P F; Albuquerque, L G

    2009-09-01

    The objective of the present study was to estimate milk yield genetic parameters applying random regression models and parametric correlation functions combined with a variance function to model animal permanent environmental effects. A total of 152,145 test-day milk yields from 7,317 first lactations of Holstein cows belonging to herds located in the southeastern region of Brazil were analyzed. Test-day milk yields were divided into 44 weekly classes of days in milk. Contemporary groups were defined by herd-test-day comprising a total of 2,539 classes. The model included direct additive genetic, permanent environmental, and residual random effects. The following fixed effects were considered: contemporary group, age of cow at calving (linear and quadratic regressions), and the population average lactation curve modeled by fourth-order orthogonal Legendre polynomial. Additive genetic effects were modeled by random regression on orthogonal Legendre polynomials of days in milk, whereas permanent environmental effects were estimated using a stationary or nonstationary parametric correlation function combined with a variance function of different orders. The structure of residual variances was modeled using a step function containing 6 variance classes. The genetic parameter estimates obtained with the model using a stationary correlation function associated with a variance function to model permanent environmental effects were similar to those obtained with models employing orthogonal Legendre polynomials for the same effect. A model using a sixth-order polynomial for additive effects and a stationary parametric correlation function associated with a seventh-order variance function to model permanent environmental effects would be sufficient for data fitting.
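The permanent-environmental covariance structure described here, a stationary parametric correlation function combined with a variance function over days in milk, can be sketched generically. Below, a first-order autoregressive-style correlation ρ^|Δt| stands in for the stationary correlation function and a quadratic polynomial for the variance function; both functional choices and all numeric values are illustrative, not the fitted functions from the paper.

```python
def variance_fn(t, coeffs):
    """Polynomial variance function of (scaled) days in milk."""
    return sum(c * t ** i for i, c in enumerate(coeffs))

def pe_covariance(times, rho, coeffs):
    """Cov(t_i, t_j) = sigma(t_i) * sigma(t_j) * rho^|t_i - t_j| (stationary)."""
    sigmas = [variance_fn(t, coeffs) ** 0.5 for t in times]
    return [[sigmas[i] * sigmas[j] * rho ** abs(times[i] - times[j])
             for j in range(len(times))] for i in range(len(times))]

weeks = [0, 1, 2, 3]       # illustrative weekly classes of days in milk
rho = 0.9                  # illustrative stationary correlation parameter
coeffs = [4.0, 0.5, 0.1]   # illustrative quadratic variance function

V = pe_covariance(weeks, rho, coeffs)
assert all(abs(V[i][j] - V[j][i]) < 1e-12 for i in range(4) for j in range(4))
assert abs(V[0][0] - 4.0) < 1e-12   # diagonal equals the variance function
assert V[0][1] > V[0][3]            # correlation decays with lag
```

The appeal of this parameterization over a high-order Legendre polynomial is parsimony: one correlation parameter plus a low-order variance function instead of a full random-regression covariance matrix.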

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hesheng, E-mail: hesheng@umich.edu; Feng, Mary; Jackson, Andrew

    Purpose: To develop a local and global function model in the liver based on regional and organ function measurements to support individualized adaptive radiation therapy (RT). Methods and Materials: A local and global model for liver function was developed to include both functional volume and the effect of functional variation of subunits. Adopting the assumption of parallel architecture in the liver, the global function was composed of a sum of local function probabilities of subunits, varying between 0 and 1. The model was fit to 59 datasets of liver regional and organ function measures from 23 patients obtained before, during, and 1 month after RT. The local function probabilities of subunits were modeled by a sigmoid function relating to MRI-derived portal venous perfusion values. The global function was fitted to a logarithm of the indocyanine green retention rate at 15 minutes (an overall liver function measure). Cross-validation was performed by leave-m-out tests. The model was further evaluated by fitting it to the data divided according to whether the patients had hepatocellular carcinoma (HCC) or not. Results: The liver function model showed that (1) a perfusion value of 68.6 mL/(100 g · min) yielded a local function probability of 0.5; (2) the probability reached 0.9 at a perfusion value of 98 mL/(100 g · min); and (3) at a probability of 0.03 [corresponding perfusion of 38 mL/(100 g · min)] or lower, the contribution to global function was lost. Cross-validations showed that the model parameters were stable. The model fitted to the data from the patients with HCC indicated that the same amount of portal venous perfusion was translated into less local function probability than in the patients with non-HCC tumors. Conclusions: The developed liver function model could provide a means to better assess individual and regional dose-responses of hepatic functions, and provide guidance for individualized treatment planning of RT.
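The reported anchor points allow a sketch of the local-function sigmoid: a curve in portal venous perfusion passing through probability 0.5 at 68.6 mL/(100 g·min) and 0.9 at 98 mL/(100 g·min), with global function taken as a sum of local probabilities (parallel architecture). The logistic parameterization and the resulting slope below are assumptions chosen to reproduce those two anchor points; the abstract does not give the paper's exact sigmoid form, and this simple logistic does not reproduce the 0.03-at-38 point exactly.

```python
import math

F50 = 68.6                        # perfusion giving local probability 0.5
K = math.log(9.0) / (98.0 - F50)  # slope chosen so p(98) = 0.9 (assumption)

def local_function_probability(perfusion):
    """Assumed logistic local-function curve in portal venous perfusion."""
    return 1.0 / (1.0 + math.exp(-K * (perfusion - F50)))

def global_function(perfusions):
    """Parallel architecture: global function = sum of local probabilities."""
    return sum(local_function_probability(p) for p in perfusions)

assert abs(local_function_probability(68.6) - 0.5) < 1e-12
assert abs(local_function_probability(98.0) - 0.9) < 1e-9
# A liver with uniformly higher perfusion has higher modeled global function.
assert global_function([90.0] * 10) > global_function([60.0] * 10)
```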

  13. Modified hyperbolic sine model for titanium dioxide-based memristive thin films

    NASA Astrophysics Data System (ADS)

    Abu Bakar, Raudah; Syahirah Kamarozaman, Nur; Fazlida Hanim Abdullah, Wan; Herman, Sukreen Hana

    2018-03-01

    Since the emergence of the memristor as the newest fundamental circuit element, studies on memristor modeling have evolved. To date, the developed models have been based on the linear model, the linear ionic drift model using different window functions, the tunnelling barrier model, and hyperbolic-sine-function-based models. Although the hyperbolic-sine function model could predict the memristor's electrical properties, the model was not well fitted to the experimental data. In order to improve the performance of the hyperbolic-sine function model, the state variable equation was modified. The addition of a window function alone could not provide an improved fit. Multiplying Yakopcic's state variable model into Chang's model, on the other hand, resulted in closer agreement with the TiO2 thin film experimental data. The percentage error was approximately 2.15%.
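In its generic shape, a hyperbolic-sine memristor model couples a sinh I-V nonlinearity to an internal state variable x ∈ [0, 1] driven by the applied voltage. The sketch below shows only that generic shape, with Euler integration and made-up coefficients; the actual Yakopcic and Chang state equations, and the modified combination proposed here, are more elaborate.

```python
import math

def simulate(v_of_t, dt, a=1e-4, b=3.0, eta=5.0):
    """Generic hyperbolic-sine memristor: i = a * x * sinh(b * v),
    with a window-free state equation dx/dt = eta * v, clipped to [0, 1].
    All coefficients are illustrative."""
    x, xs, currents = 0.5, [], []
    for v in v_of_t:
        currents.append(a * x * math.sinh(b * v))
        x = min(1.0, max(0.0, x + eta * v * dt))  # explicit Euler step
        xs.append(x)
    return xs, currents

dt = 1e-3
# One full cycle of a sinusoidal voltage sweep.
drive = [0.8 * math.sin(2 * math.pi * t * dt) for t in range(1000)]
xs, currents = simulate(drive, dt)

assert all(0.0 <= x <= 1.0 for x in xs)  # state stays bounded
assert currents[0] == 0.0                # sinh nonlinearity: i = 0 at v = 0
```

Because the state lags the voltage, sweeping the drive up and down traces the pinched hysteresis loop characteristic of memristive devices.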

  14. Elliptic supersymmetric integrable model and multivariable elliptic functions

    NASA Astrophysics Data System (ADS)

    Motegi, Kohei

    2017-12-01

    We investigate the elliptic integrable model introduced by Deguchi and Martin [Int. J. Mod. Phys. A 7, Suppl. 1A, 165 (1992)], which is an elliptic extension of the Perk-Schultz model. We introduce and study a class of partition functions of the elliptic model by using the Izergin-Korepin analysis. We show that the partition functions are expressed as a product of elliptic factors and elliptic Schur-type symmetric functions. This result resembles recent work by number theorists in which the correspondence between the partition functions of trigonometric models and the product of the deformed Vandermonde determinant and Schur functions were established.

  15. Probability Weighting Functions Derived from Hyperbolic Time Discounting: Psychophysical Models and Their Individual Level Testing.

    PubMed

    Takemura, Kazuhisa; Murakami, Hajime

    2016-01-01

    A probability weighting function (w(p)) is considered to be a nonlinear function of probability (p) in behavioral decision theory. This study proposes a psychophysical model of probability weighting functions derived from a hyperbolic time discounting model and a geometric distribution. The aim of the study is to show probability weighting functions from the point of view of waiting time for a decision maker. Since the expected value of a geometrically distributed random variable X is 1/p, we formulated the probability weighting function of the expected value model for hyperbolic time discounting as w(p) = (1 - k log p)^(-1). Moreover, the probability weighting function is derived from Loewenstein and Prelec's (1992) generalized hyperbolic time discounting model. The latter model is proved to be equivalent to the hyperbolic-logarithmic weighting function considered by Prelec (1998) and Luce (2001). In this study, we derive a model from the generalized hyperbolic time discounting model assuming Fechner's (1860) psychophysical law of time and a geometric distribution of trials. In addition, we develop median models of hyperbolic time discounting and generalized hyperbolic time discounting. To assess the fit of each model, a psychological experiment was conducted to assess the probability weighting and value functions at the level of the individual participant. The participants were 50 university students. The results of the individual analysis indicated that the expected value model of generalized hyperbolic discounting fitted better than previous probability weighting decision-making models. The theoretical implications of this finding are discussed.
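The expected-value weighting function stated in the abstract, w(p) = (1 − k log p)⁻¹, is easy to compute directly. The sketch below evaluates it and checks the qualitative properties expected of it (certainty undistorted, strictly increasing, below the identity on (0, 1)); k = 1 is an illustrative choice of the discounting parameter.

```python
import math

def w(p, k=1.0):
    """Hyperbolic-discounting probability weighting: w(p) = (1 - k*log(p))^-1."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return 1.0 / (1.0 - k * math.log(p))

assert w(1.0) == 1.0              # certainty is not distorted
assert w(0.5) < w(0.8) < 1.0      # strictly increasing, below identity
# With k = 1, w(0.5) = 1 / (1 + ln 2).
assert abs(w(0.5) - 1.0 / (1.0 + math.log(2.0))) < 1e-12
```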

  16. The Functional Transitions Model: Maximizing Ability in the Context of Progressive Disability Associated with Alzheimer's Disease

    ERIC Educational Resources Information Center

    Slaughter, Susan; Bankes, Jane

    2007-01-01

    The Functional Transitions Model (FTM) integrates the theoretical notions of progressive functional decline associated with Alzheimer's disease (AD), excess disability, and transitions occurring intermittently along the trajectory of functional decline. Application of the Functional Transitions Model to clinical practice encompasses the paradox of…

  17. Continuous time random walk model with asymptotical probability density of waiting times via inverse Mittag-Leffler function

    NASA Astrophysics Data System (ADS)

    Liang, Yingjie; Chen, Wen

    2018-04-01

    The mean squared displacement (MSD) of traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model has been employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotical waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, a special case of which is the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function and is observed to increase even more slowly than that of the logarithmic function model. Very long waiting times occur with the largest probability under the inverse Mittag-Leffler function, compared with the power-law and logarithmic function models. Monte Carlo simulations of the one-dimensional sample path of a single particle were also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting general ultraslow random motion.

  18. An Arrhenius-type viscosity function to model sintering using the Skorohod Olevsky viscous sintering model within a finite element code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewsuk, Kevin Gregory; Arguello, Jose Guadalupe, Jr.; Reiterer, Markus W.

    2006-02-01

    The ease and ability to predict sintering shrinkage and densification with the Skorohod-Olevsky viscous sintering (SOVS) model within a finite-element (FE) code have been improved with the use of an Arrhenius-type viscosity function. The need for a better viscosity function was identified by evaluating SOVS model predictions made using a previously published polynomial viscosity function. Predictions made using the original, polynomial viscosity function do not accurately reflect experimentally observed sintering behavior. To more easily and better predict sintering behavior using FE simulations, a thermally activated viscosity function based on creep theory was used with the SOVS model. In comparison with the polynomial viscosity function, SOVS model predictions made using the Arrhenius-type viscosity function are more representative of experimentally observed viscosity and sintering behavior. Additionally, the effects of changes in heating rate on densification can easily be predicted with the Arrhenius-type viscosity function. Another attribute of the Arrhenius-type viscosity function is that it provides the potential to link different sintering models. For example, the apparent activation energy, Q, for densification used in the construction of the master sintering curve for a low-temperature cofire ceramic dielectric has been used as the apparent activation energy for material flow in the Arrhenius-type viscosity function to predict heating rate-dependent sintering behavior using the SOVS model.
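An Arrhenius-type viscosity function has the familiar thermally activated form η(T) = η₀·exp(Q/RT): viscosity drops as temperature rises, with the apparent activation energy Q setting how fast. A minimal sketch with illustrative parameter values (not the fitted SOVS parameters reported here):

```python
import math

R = 8.314  # J/(mol*K), gas constant

def viscosity(T, eta0, Q):
    """Arrhenius-type viscosity: eta(T) = eta0 * exp(Q / (R * T))."""
    return eta0 * math.exp(Q / (R * T))

# Illustrative pre-factor (Pa*s) and apparent activation energy (J/mol).
eta0, Q = 1.0e2, 300e3

# Viscosity decreases monotonically on heating from 1200 K to 1500 K.
temps = [1200.0, 1300.0, 1400.0, 1500.0]
etas = [viscosity(T, eta0, Q) for T in temps]
assert all(etas[i] > etas[i + 1] for i in range(len(etas) - 1))

# The ratio between two temperatures depends only on Q, not on eta0.
ratio = viscosity(1200.0, eta0, Q) / viscosity(1400.0, eta0, Q)
assert abs(ratio / math.exp(Q / R * (1.0 / 1200.0 - 1.0 / 1400.0)) - 1.0) < 1e-9
```

The last property is what links the models: a Q estimated from a master sintering curve can be reused directly as the activation energy in this viscosity function.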

  19. Predicting cognitive function from clinical measures of physical function and health status in older adults.

    PubMed

    Bolandzadeh, Niousha; Kording, Konrad; Salowitz, Nicole; Davis, Jennifer C; Hsu, Liang; Chan, Alison; Sharma, Devika; Blohm, Gunnar; Liu-Ambrose, Teresa

    2015-01-01

    Current research suggests that the neuropathology of dementia-including brain changes leading to memory impairment and cognitive decline-is evident years before the onset of this disease. Older adults with cognitive decline have reduced functional independence and quality of life, and are at greater risk for developing dementia. Therefore, identifying biomarkers that can be easily assessed within the clinical setting and predict cognitive decline is important. Early recognition of cognitive decline could promote timely implementation of preventive strategies. We included 89 community-dwelling adults aged 70 years and older in our study, and collected 32 measures of physical function, health status and cognitive function at baseline. We utilized an L1-L2 regularized regression model (elastic net) to identify which of the 32 baseline measures were strongly predictive of cognitive function after one year. We built three linear regression models: 1) based on baseline cognitive function, 2) based on variables consistently selected in every cross-validation loop, and 3) a full model based on all the 32 variables. Each of these models was carefully tested with nested cross-validation. Our model with the six variables consistently selected in every cross-validation loop had a mean squared prediction error of 7.47. This number was smaller than that of the full model (115.33) and the model with baseline cognitive function (7.98). Our model explained 47% of the variance in cognitive function after one year. We built a parsimonious model based on a selected set of six physical function and health status measures strongly predictive of cognitive function after one year. In addition to reducing the complexity of the model without changing the model significantly, our model with the top variables improved the mean prediction error and R-squared. These six physical function and health status measures can be easily implemented in a clinical setting.

  20. Functional Enzyme-Based Approach for Linking Microbial Community Functions with Biogeochemical Process Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Minjing; Qian, Wei-jun; Gao, Yuqian

    The kinetics of biogeochemical processes in natural and engineered environmental systems are typically described using Monod-type or modified Monod-type models. These models rely on biomass as a surrogate for the functional enzymes in a microbial community that catalyze biogeochemical reactions. A major challenge in applying such models is the difficulty of quantitatively measuring functional biomass to constrain and validate them. On the other hand, omics-based approaches have been increasingly used to characterize microbial community structure, functions, and metabolites. Here we propose an enzyme-based model that can incorporate omics data to link microbial community functions with biogeochemical process kinetics. The model treats enzymes as time-variable catalysts for biogeochemical reactions and applies a biogeochemical reaction network to incorporate intermediate metabolites. The sequences of genes and proteins from metagenomes, as well as those from the UniProt database, were used for targeted enzyme quantification and to provide insights into the dynamic linkage among functional genes, enzymes, and metabolites that needs to be incorporated in the model. The application of the model was demonstrated using denitrification as an example by comparing model-simulated with measured functional enzymes, genes, and denitrification substrates and intermediates.
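The core substitution the model makes, a time-variable enzyme concentration in place of biomass in Monod-type kinetics, can be sketched as a rate law r(t) = k_cat·E(t)·S/(K_s + S) integrated forward in time. This is a generic Michaelis-Menten/Monod-style sketch with invented numbers, not the authors' full denitrification reaction network.

```python
def simulate(E_series, S0, dt, k_cat=2.0, Ks=0.5):
    """Enzyme-based kinetics: dS/dt = -k_cat * E(t) * S / (Ks + S).

    E_series is a measured (here: invented) enzyme time course replacing
    biomass as the catalyst surrogate; explicit Euler integration.
    """
    S, trajectory = S0, [S0]
    for E in E_series:
        rate = k_cat * E * S / (Ks + S)
        S = max(0.0, S - rate * dt)
        trajectory.append(S)
    return trajectory

dt = 0.01
# Invented enzyme time course: induction ramp, then slow decay.
E_series = [min(1.0, 0.01 * t) * 0.98 ** (t * dt) for t in range(500)]
traj = simulate(E_series, S0=10.0, dt=dt)

# Substrate is only consumed, never produced, and never goes negative.
assert all(traj[i] >= traj[i + 1] for i in range(len(traj) - 1))
assert 0.0 <= traj[-1] < traj[0]
```

In the full model, each reaction in the denitrification chain gets such a term, with omics-derived enzyme abundances driving E(t) and intermediates (e.g., nitrite, nitrous oxide) passed along the reaction network.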

  1. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
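Crediting functional diversity amounts to standard redundancy algebra: if a function is provided independently by several connected elements, its availability is 1 − Π(1 − Aᵢ), which exceeds any single element's availability. A minimal sketch with illustrative availabilities:

```python
def function_availability(element_availabilities):
    """Availability of a function provided redundantly by independent elements:
    the function is lost only if every providing element is down."""
    p_all_down = 1.0
    for a in element_availabilities:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

# Illustrative: a rover and a habitat both provide, say, power generation.
rover, habitat = 0.9, 0.8
combined = function_availability([rover, habitat])

assert abs(combined - 0.98) < 1e-12    # 1 - (0.1 * 0.2)
assert combined > max(rover, habitat)  # diverse backup credited, not ignored
```

A string-level model that ignored this sharing would report at most 0.9 for the function; tracking functions instead of systems is what recovers the 0.98.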

  2. From data to function: functional modeling of poultry genomics data.

    PubMed

    McCarthy, F M; Lyons, E

    2013-09-01

    One of the challenges of functional genomics is to create a better understanding of the biological system being studied so that the data produced are leveraged to provide gains for agriculture, human health, and the environment. Functional modeling enables researchers to make sense of these data as it reframes a long list of genes or gene products (mRNA, ncRNA, and proteins) by grouping based upon function, be it individual molecular functions or interactions between these molecules or broader biological processes, including metabolic and signaling pathways. However, poultry researchers have been hampered by a lack of functional annotation data, tools, and training to use these data and tools. Moreover, this lack is becoming more critical as new sequencing technologies enable us to generate data not only for an increasingly diverse range of species but also individual genomes and populations of individuals. We discuss the impact of these new sequencing technologies on poultry research, with a specific focus on what functional modeling resources are available for poultry researchers. We also describe key strategies for researchers who wish to functionally model their own data, providing background information about functional modeling approaches, the data and tools to support these approaches, and the strengths and limitations of each. Specifically, we describe methods for functional analysis using Gene Ontology (GO) functional summaries, functional enrichment analysis, and pathways and network modeling. 
As annotation efforts begin to provide the fundamental data that underpin poultry functional modeling (such as improved gene identification, standardized gene nomenclature, temporal and spatial expression data and gene product function), tool developers are incorporating these data into new and existing tools that are used for functional modeling, and cyberinfrastructure is being developed to provide the necessary extendibility and scalability for storing and analyzing these data. This process will support the efforts of poultry researchers to make sense of their functional genomics data sets, and we provide here a starting point for researchers who wish to take advantage of these tools.
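    Of the functional modeling approaches listed above, GO functional enrichment analysis is the most mechanical: each term is scored with a one-sided hypergeometric test against the annotated background. A minimal sketch (the gene counts below are invented for illustration):

    ```python
    from math import comb

    def enrichment_pvalue(N, K, n, k):
        """One-sided hypergeometric tail P(X >= k): the chance that a random
        gene list of size n drawn from a background of N genes hits at least
        k of the K genes annotated to a given GO term."""
        total = comb(N, n)
        upper = min(K, n)
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, upper + 1)) / total

    # e.g. 10 of 100 listed genes carry a term annotated to 200 of 20,000 genes
    p = enrichment_pvalue(N=20_000, K=200, n=100, k=10)
    ```

    In practice a multiple-testing correction (e.g. Benjamini-Hochberg across all tested terms) is applied to these raw p-values.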

  3. Computational Modeling of Airway and Pulmonary Vascular Structure and Function: Development of a “Lung Physiome”

    PubMed Central

    Tawhai, M. H.; Clark, A. R.; Donovan, G. M.; Burrowes, K. S.

    2011-01-01

    Computational models of lung structure and function necessarily span multiple spatial and temporal scales, i.e., dynamic molecular interactions give rise to whole organ function, and the link between these scales cannot be fully understood if only molecular or organ-level function is considered. Here, we review progress in constructing multiscale finite element models of lung structure and function that are aimed at providing a computational framework for bridging the spatial scales from molecular to whole organ. These include structural models of the intact lung, embedded models of the pulmonary airways that couple to model lung tissue, and models of the pulmonary vasculature that account for distinct structural differences at the extra- and intra-acinar levels. Biophysically based functional models for tissue deformation, pulmonary blood flow, and airway bronchoconstriction are also described. The development of these advanced multiscale models has led to a better understanding of complex physiological mechanisms that govern regional lung perfusion and emergent heterogeneity during bronchoconstriction. PMID:22011236

  4. Robust, Adaptive Functional Regression in Functional Mixed Model Framework.

    PubMed

    Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S

    2011-09-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.

  5. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets. PMID:22308015

  6. Root structural and functional dynamics in terrestrial biosphere models--evaluation and recommendations.

    PubMed

    Warren, Jeffrey M; Hanson, Paul J; Iversen, Colleen M; Kumar, Jitendra; Walker, Anthony P; Wullschleger, Stan D

    2015-01-01

    There is a wide breadth of root function within ecosystems that should be considered when modeling the terrestrial biosphere. Root structure and function are closely associated with control of plant water and nutrient uptake from the soil, plant carbon (C) assimilation, partitioning and release to the soils, and control of biogeochemical cycles through interactions within the rhizosphere. Root function is extremely dynamic and dependent on internal plant signals, root traits and morphology, and the physical, chemical and biotic soil environment. While plant roots have significant structural and functional plasticity in response to changing environmental conditions, their dynamics are noticeably absent from the land component of process-based Earth system models used to simulate global biogeochemical cycling. Their dynamic representation in large-scale models should improve model veracity. Here, we describe current root inclusion in models across scales, ranging from mechanistic processes of single roots to parameterized root processes operating at the landscape scale. With this foundation we discuss how existing and future root functional knowledge, new data compilation efforts, and novel modeling platforms can be leveraged to enhance root functionality in large-scale terrestrial biosphere models by improving parameterization within models, and introducing new components such as dynamic root distribution and root functional traits linked to resource extraction. No claim to original US Government works. New Phytologist © 2014 New Phytologist Trust.

  7. Calibration and prediction of removal function in magnetorheological finishing.

    PubMed

    Dai, Yifan; Song, Ci; Peng, Xiaoqiang; Shi, Feng

    2010-01-20

    A calibrated and predictive model of the removal function has been established based on analysis of a magnetorheological finishing (MRF) process. By introducing an efficiency coefficient of the removal function, the model can be used to calibrate the removal function in an MRF figuring process and to accurately predict the removal function of a workpiece to be polished whose material differs from that of the spot part. Its correctness and feasibility have been validated by simulations. Furthermore, applying this model to MRF figuring experiments, the efficiency coefficient of the removal function can be identified accurately to make the MRF figuring process deterministic and controllable. Therefore, all the results indicate that the calibrated and predictive model of the removal function can improve finishing determinacy and increase the model's applicability in an MRF process.
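    The calibration idea reduces, in its simplest form, to estimating a single scale factor between the modeled and measured removal profiles of a test spot. The sketch below shows only that least-squares scaling; the paper's coefficient additionally accounts for material differences between the spot part and the workpiece, which is not modeled here, and the profile values are invented:

    ```python
    import numpy as np

    def efficiency_coefficient(measured, modeled):
        """Least-squares scale factor k minimizing ||measured - k*modeled||,
        i.e. a calibrated removal efficiency for a polishing spot."""
        measured = np.asarray(measured, dtype=float)
        modeled = np.asarray(modeled, dtype=float)
        return float(modeled @ measured / (modeled @ modeled))

    # calibrate on a spot part, then rescale the modeled removal profile
    spot_model = np.array([0.2, 0.5, 1.0, 0.5, 0.2])  # modeled removal, arb. units
    spot_meas = 0.8 * spot_model                      # measured removal
    k = efficiency_coefficient(spot_meas, spot_model)
    predicted_removal = k * spot_model
    ```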

  8. Modelling protein functional domains in signal transduction using Maude

    NASA Technical Reports Server (NTRS)

    Sriram, M. G.

    2003-01-01

    Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.

  9. Confirmatory factor analysis of the female sexual function index.

    PubMed

    Opperman, Emily A; Benson, Lindsay E; Milhausen, Robin R

    2013-01-01

    The Female Sexual Functioning Index (Rosen et al., 2000) was designed to assess the key dimensions of female sexual functioning using six domains: desire, arousal, lubrication, orgasm, satisfaction, and pain. A full-scale score was proposed to represent women's overall sexual function. The fifth revision of the Diagnostic and Statistical Manual (DSM) is currently underway and includes a proposal to combine desire and arousal problems. The objective of this article was to evaluate and compare four models of the Female Sexual Functioning Index: (a) a single-factor model, (b) a six-factor model, (c) a second-order factor model, and (d) a five-factor model combining the desire and arousal subscales. Cross-sectional and observational data from 85 women were used to conduct a confirmatory factor analysis on the Female Sexual Functioning Index. Local and global goodness-of-fit measures, the chi-square test of differences, squared multiple correlations, and regression weights were used. The single-factor model fit was not acceptable. The original six-factor model was confirmed, and good model fit was found for the second-order and five-factor models. Delta chi-square tests of differences supported best fit for the six-factor model, validating usage of the six domains. However, when revisions are made to the DSM-5, the Female Sexual Functioning Index can adapt to reflect these changes and remain a valid assessment tool for women's sexual functioning, as the five-factor structure was also supported.

  10. Functional Competency Development Model for Academic Personnel Based on International Professional Qualification Standards in Computing Field

    ERIC Educational Resources Information Center

    Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon

    2016-01-01

    This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…

  11. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS Raster Utility coding library

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that...

  12. Estimating FIA plot characteristics using NAIP imagery, function modeling, and the RMRS raster utility coding library

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2015-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  13. Random regression analyses using B-spline functions to model growth from birth to adult age in Canchim cattle.

    PubMed

    Baldi, F; Alencar, M M; Albuquerque, L G

    2010-12-01

    The objective of this work was to estimate covariance functions using random regression models on B-spline functions of animal age, for weights from birth to adult age in Canchim cattle. Data comprised 49,011 records on 2435 females. The model of analysis included fixed effects of contemporary groups, age of dam as a quadratic covariable, and the population mean trend taken into account by a cubic regression on orthogonal polynomials of animal age. Residual variances were modelled through a step function with four classes. The direct and maternal additive genetic effects, and animal and maternal permanent environmental effects, were included as random effects in the model. A total of seventeen analyses, considering linear, quadratic and cubic B-spline functions and up to seven knots, were carried out. B-spline functions of the same order were considered for all random effects. Random regression models on B-spline functions were compared to a random regression model on Legendre polynomials and with a multitrait model. Results from the different models of analysis were compared using the REML form of the Akaike information criterion and Schwarz's Bayesian information criterion. In addition, the variance components and genetic parameters estimated for each random regression model were also used as criteria to choose the most adequate model to describe the covariance structure of the data. A model fitting quadratic B-splines, with four knots or three segments for the direct additive genetic effect and animal permanent environmental effect and two knots for the maternal additive genetic effect and maternal permanent environmental effect, was the most adequate to describe the covariance structure of the data. Random regression models using B-spline functions as base functions fitted the data better than Legendre polynomials, especially at mature ages, but a higher number of parameters needs to be estimated with B-spline functions. © 2010 Blackwell Verlag GmbH.
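    The building block of such analyses is the B-spline basis itself, evaluated by the Cox-de Boor recursion and then fitted by least squares. A self-contained sketch (the growth-curve shape, knot placement, and scaled age axis are illustrative assumptions, not the paper's data or mixed-model machinery):

    ```python
    import numpy as np

    def bspline_basis(x, knots, degree):
        """Cox-de Boor recursion: evaluate all B-spline basis functions of the
        given degree at points x; boundary knots are repeated `degree` times."""
        t = np.concatenate(([knots[0]] * degree, knots,
                            [knots[-1]] * degree)).astype(float)
        # degree 0: indicator of each knot span (zero-width spans stay zero)
        B = np.array([((x >= t[j]) & (x < t[j + 1])).astype(float)
                      for j in range(len(t) - 1)]).T
        for d in range(1, degree + 1):
            nxt = np.zeros((len(x), B.shape[1] - 1))
            for j in range(B.shape[1] - 1):
                left = (x - t[j]) / (t[j + d] - t[j]) if t[j + d] > t[j] else 0.0
                right = ((t[j + d + 1] - x) / (t[j + d + 1] - t[j + 1])
                         if t[j + d + 1] > t[j + 1] else 0.0)
                nxt[:, j] = left * B[:, j] + right * B[:, j + 1]
            B = nxt
        return B

    # quadratic B-splines with two segments, fit to a Gompertz-like growth curve
    age = np.linspace(0.0, 1.0, 100, endpoint=False)    # scaled age, open at 1
    weight = 500.0 * np.exp(-3.0 * np.exp(-5.0 * age))  # illustrative weights
    X = bspline_basis(age, np.array([0.0, 0.5, 1.0]), degree=2)
    coef, *_ = np.linalg.lstsq(X, weight, rcond=None)
    fitted = X @ coef
    ```

    The local support of each basis function (non-zero over only a few knot spans) is what lets B-splines track mature-age weights better than global Legendre polynomials, at the cost of more parameters.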

  14. Stability and the Evolvability of Function in a Model Protein

    PubMed Central

    Bloom, Jesse D.; Wilke, Claus O.; Arnold, Frances H.; Adami, Christoph

    2004-01-01

    Functional proteins must fold with some minimal stability to a structure that can perform a biochemical task. Here we use a simple model to investigate the relationship between the stability requirement and the capacity of a protein to evolve the function of binding to a ligand. Although our model contains no built-in tradeoff between stability and function, proteins evolved function more efficiently when the stability requirement was relaxed. Proteins with both high stability and high function evolved more efficiently when the stability requirement was gradually increased than when there was constant selection for high stability. These results show that in our model, the evolution of function is enhanced by allowing proteins to explore sequences corresponding to marginally stable structures, and that it is easier to improve stability while maintaining high function than to improve function while maintaining high stability. Our model also demonstrates that even in the absence of a fundamental biophysical tradeoff between stability and function, the speed with which function can evolve is limited by the stability requirement imposed on the protein. PMID:15111394

  15. An Evidence-Based Construction of the Models of Decline of Functioning. Part 1: Two Major Models of Decline of Functioning

    ERIC Educational Resources Information Center

    Okawa, Yayoi; Nakamura, Shigemi; Kudo, Minako; Ueda, Satoshi

    2009-01-01

    The purpose of this study is to confirm the working hypothesis on two major models of functioning decline and two corresponding models of rehabilitation program in an older population through detailed interviews with the persons who have functioning declines and on-the-spot observations of key activities on home visits. A total of 542…

  16. The basis function approach for modeling autocorrelation in ecological data

    USGS Publications Warehouse

    Hefley, Trevor J.; Broms, Kristin M.; Brost, Brian M.; Buderman, Frances E.; Kay, Shannon L.; Scharf, Henry; Tipton, John; Williams, Perry J.; Hooten, Mevin B.

    2017-01-01

    Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data.
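    The core idea, that many autocorrelation methods are regressions on basis functions, can be sketched with an ordinary least-squares fit on a trend-plus-Fourier basis. The simulated series, frequencies, and basis size below are illustrative assumptions, not from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    t = np.linspace(0.0, 1.0, 200)
    # simulated ecological series: linear trend + smooth seasonal cycle + noise
    y = (2.0 + 1.5 * t + np.sin(2.0 * np.pi * 3.0 * t)
         + 0.2 * rng.standard_normal(t.size))

    def trend_fourier_basis(t, n_pairs):
        """Design matrix: intercept, linear trend, and sine/cosine pairs that
        absorb the smooth temporal structure creating the autocorrelation."""
        cols = [np.ones_like(t), t]
        for k in range(1, n_pairs + 1):
            cols.append(np.sin(2.0 * np.pi * k * t))
            cols.append(np.cos(2.0 * np.pi * k * t))
        return np.column_stack(cols)

    X = trend_fourier_basis(t, n_pairs=4)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    ```

    Once the basis absorbs the smooth structure, the residuals are close to white noise; the same design-matrix trick applies with spline or spatial basis functions in place of the Fourier pairs.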

  17. Nonparametric Transfer Function Models

    PubMed Central

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584

  18. Modeling multivariate time series on manifolds with skew radial basis functions.

    PubMed

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters--in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.
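    The iterative model-building loop can be sketched in a simplified form: add one radial basis function per step at the current worst-fit point and refit all weights. This is a stand-in for the paper's method; the statistical hypothesis test for placement, the autocorrelation-based scale selection, and the skew/compactly supported variants are all replaced here by a fixed scale and a max-residual rule:

    ```python
    import numpy as np

    def gaussian_rbf(x, center, scale):
        return np.exp(-0.5 * ((x - center) / scale) ** 2)

    def greedy_rbf_fit(x, y, n_funcs, scale):
        """Iteratively add one Gaussian RBF at the point of largest absolute
        residual, refitting all weights by least squares after each addition."""
        design = [np.ones_like(x)]  # start from a constant model
        for _ in range(n_funcs):
            X = np.column_stack(design)
            w, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ w
            center = x[np.argmax(np.abs(resid))]
            design.append(gaussian_rbf(x, center, scale))
        X = np.column_stack(design)
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        return w, y - X @ w

    x = np.linspace(0.0, 2.0 * np.pi, 150)
    y = np.sin(3.0 * x)  # illustrative target with several oscillations
    w, resid = greedy_rbf_fit(x, y, n_funcs=10, scale=0.4)
    ```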

  19. Definition of Historical Models of Gene Function and Their Relation to Students' Understanding of Genetics

    ERIC Educational Resources Information Center

    Gericke, Niklas Markus; Hagberg, Mariana

    2007-01-01

    Models are often used when teaching science. In this paper historical models and students' ideas about genetics are compared. The historical development of the scientific idea of the gene and its function is described and categorized into five historical models of gene function. Differences and similarities between these historical models are made…

  20. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example, the response is disease status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.

  1. Reflection and emission models for deserts derived from Nimbus-7 ERB scanner measurements

    NASA Technical Reports Server (NTRS)

    Staylor, W. F.; Suttles, J. T.

    1986-01-01

    Broadband shortwave and longwave radiance measurements obtained from the Nimbus-7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara-Arabian, Gibson, and Saudi Deserts. The models were established by fitting the satellite measurements to analytic functions. For the shortwave, the model function is based on an approximate solution to the radiative transfer equation. The bidirectional-reflectance function was obtained from a single-scattering approximation with a Rayleigh-like phase function. The directional-reflectance model followed from integration of the bidirectional model and is a function of the sum and product of cosine solar and viewing zenith angles, thus satisfying reciprocity between these angles. The emittance model was based on a simple power-law of cosine viewing zenith angle.

  2. Finite-element modeling of the human neurocranium under functional anatomical aspects.

    PubMed

    Mall, G; Hubig, M; Koebke, J; Steinbuch, R

    1997-08-01

    Due to its functional significance, the human skull plays an important role in biomechanical research. The present work describes a new finite-element model of the human neurocranium. The dry skull of a middle-aged woman served as the pattern. The model was developed using only the preprocessor (Mentat) of a commercial FE system (Marc). Unlike other FE models of the human skull described in the literature, the geometry of this model was designed according to functional anatomical findings. Functionally important morphological structures representing loci minoris resistentiae, especially the foramina and fissures of the skull base, were included in the model. The results of two linear static load-case analyses in the region of the skull base underline the importance of modeling from the functional anatomical point of view.

  3. The Esophagiome: concept, status, and future perspectives.

    PubMed

    Gregersen, Hans; Liao, Donghua; Brasseur, James G

    2016-09-01

    The term "Esophagiome" is meant to imply a holistic, multiscale treatment of esophageal function from cellular and muscle physiology to the mechanical responses that transport and mix fluid contents. The development and application of multiscale mathematical models of esophageal function are central to the Esophagiome concept. These model elements underlie the development of a "virtual esophagus" modeling framework to characterize and analyze function and disease by quantitatively contrasting normal and pathophysiological function. Functional models incorporate anatomical details with sensory-motor properties and functional responses, especially related to biomechanical functions, such as bolus transport and gastrointestinal fluid mixing. This brief review provides insight into Esophagiome research. Future advanced models can provide predictive evaluations of the therapeutic consequences of surgical and endoscopic treatments and will aim to facilitate clinical diagnostics and treatment. © 2016 New York Academy of Sciences.

  4. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with the Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with the Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. 
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
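The contrast between likelihood assumptions can be sketched numerically. The snippet below compares the iid Gaussian log-likelihood implicit in least squares with a hypothetical AR(1)-prewhitened variant; this is a much-simplified stand-in for the formal generalized likelihood of Schoups and Vrugt (2010), which additionally accommodates heteroscedasticity, skew, and heavy tails.

```python
import numpy as np

def gaussian_loglik(residuals, sigma):
    # iid Gaussian log-likelihood: the implicit assumption of least squares
    r = np.asarray(residuals)
    n = len(r)
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - np.sum(r**2) / (2 * sigma**2))

def ar1_loglik(residuals, sigma, phi):
    # Pre-whiten with an AR(1) filter, then score the innovations as
    # Gaussian. A simplified stand-in for a generalized likelihood that
    # accounts for temporal correlation in the residuals.
    r = np.asarray(residuals)
    innovations = r[1:] - phi * r[:-1]
    return gaussian_loglik(innovations, sigma)

# Strongly autocorrelated residuals score very differently under the two.
rng = np.random.default_rng(0)
r = np.cumsum(rng.normal(0.0, 0.2, 200))  # random-walk residuals
ll_iid = gaussian_loglik(r, 1.0)
ll_ar1 = ar1_loglik(r, 1.0, 1.0)
```

For residuals like these, the correlation-aware likelihood is far larger, which is exactly the kind of mismatch that biases parameter distributions when a Gaussian iid likelihood is imposed.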

  5. Bayesian function-on-function regression for multilevel functional data.

    PubMed

    Meyer, Mark J; Coull, Brent A; Versace, Francesco; Cinciripini, Paul; Morris, Jeffrey S

    2015-09-01

    Medical and public health research increasingly involves the collection of complex and high dimensional data. In particular, functional data-where the unit of observation is a curve or set of curves that are finely sampled over a grid-is frequently obtained. Moreover, researchers often sample multiple curves per person resulting in repeated functional measures. A common question is how to analyze the relationship between two functional variables. We propose a general function-on-function regression model for repeatedly sampled functional data on a fine grid, presenting a simple model as well as a more extensive mixed model framework, and introducing various functional Bayesian inferential procedures that account for multiple testing. We examine these models via simulation and a data analysis with data from a study that used event-related potentials to examine how the brain processes various types of images. © 2015, The International Biometric Society.
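On a fine common grid, function-on-function regression can be discretized so that the coefficient surface becomes a matrix. The sketch below uses a ridge-penalized least-squares fit as a non-Bayesian, single-level stand-in for the paper's mixed-model framework; all names and dimensions are illustrative.

```python
import numpy as np

# Discretized function-on-function regression: with curves sampled on fine
# grids, Y_i(t) = integral X_i(s) beta(s, t) ds becomes Y ~= X @ B, and a
# ridge estimate of the coefficient surface B is one simple sketch.
rng = np.random.default_rng(1)
n, S, T = 50, 30, 25                      # subjects, predictor grid, response grid
X = rng.normal(size=(n, S))               # predictor curves (rows)
B_true = np.outer(np.sin(np.linspace(0, 3, S)),
                  np.cos(np.linspace(0, 2, T)))
Y = X @ B_true + 0.01 * rng.normal(size=(n, T))

lam = 1e-3                                # small ridge penalty for stability
B_hat = np.linalg.solve(X.T @ X + lam * np.eye(S), X.T @ Y)
```

A Bayesian treatment, as in the paper, would additionally place priors on the (basis-expanded) surface and propagate posterior uncertainty for inference and multiplicity adjustment.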

  6. Production Functions for Water Delivery Systems: Analysis and Estimation Using Dual Cost Function and Implicit Price Specifications

    NASA Astrophysics Data System (ADS)

    Teeples, Ronald; Glyer, David

    1987-05-01

    Both policy and technical analysis of water delivery systems have been based on cost functions that are inconsistent with or are incomplete representations of the neoclassical production functions of economics. We present a full-featured production function model of water delivery which can be estimated from a multiproduct, dual cost function. The model features implicit prices for own-water inputs and is implemented as a jointly estimated system of input share equations and a translog cost function. Likelihood ratio tests are performed showing that a minimally constrained, full-featured production function is a necessary specification of the water delivery operations in our sample. This, plus the model's highly efficient and economically correct parameter estimates, confirms the usefulness of a production function approach to modeling the economic activities of water delivery systems.

  7. Representing uncertainty in objective functions: extension to include the influence of serial correlation

    NASA Astrophysics Data System (ADS)

    Croke, B. F.

    2008-12-01

The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends this previous work, addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
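One way to fold both observational uncertainty and serial correlation into a sum-of-squares objective is to weight residuals by the observation variances and then pre-whiten them with an AR(1) filter. The sketch below is a hypothetical illustration of that idea, not the specific form presented in the paper.

```python
import numpy as np

def nse_serial(obs, sim, obs_var, rho):
    # Nash-Sutcliffe-style efficiency with (a) per-point observation
    # variances used as weights and (b) AR(1) pre-whitening of the
    # weighted residuals. Illustrative sketch only.
    obs, sim, obs_var = map(np.asarray, (obs, sim, obs_var))
    w = 1.0 / obs_var                      # inverse-variance weights
    rw = np.sqrt(w) * (obs - sim)          # weighted residuals
    e = rw[1:] - rho * rw[:-1]             # remove lag-1 serial correlation
    denom = np.sum(w * (obs - np.average(obs, weights=w))**2)
    return 1.0 - np.sum(e**2) / denom
```

As with the standard NSE, a perfect fit scores 1; unlike the standard form, correlated errors are no longer double-counted as independent evidence of misfit.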

  8. Assessing the predictive value of a neuropsychological model on concurrent function in acute stroke recovery and rehabilitation.

    PubMed

    Leitner, Damian; Miller, Harry; Libben, Maya

    2018-06-25

Few studies have examined the relationship between cognition and function for acute stroke inpatients utilizing comprehensive methods. This study aimed to assess the relationship of a neuropsychological model, above and beyond a baseline model, with concurrent functional status across multiple domains in the early weeks of stroke recovery and rehabilitation. Seventy-four acute stroke patients were administered a comprehensive neuropsychological assessment. Functional domains of ability, adjustment, and participation were assessed using the Mayo-Portland Adaptability Inventory - 4 (MPAI-4). Hierarchical linear regression was used to assess a neuropsychological model comprised of cognitive test scores on domains of executive function, memory, and visuospatial-constructional skills (VSC), after accounting for a baseline model comprised of common demographic and stroke variables used to predict outcome. The neuropsychological model was significantly associated, above and beyond the baseline model, with MPAI-4 Ability, Participation, and Total scores (all p-values < .05). The strength of association varied across functional domains. Among tests of executive function, the Color Trails Test-Part 2 predicted MPAI-4 Participation (β = -.46, p = .001) and Total score (β = -.32, p = .02). Neuropsychological assessment contributes independently to the determination of multiple domains of functional status, above and beyond common medical variables of stroke, in the early weeks of recovery and rehabilitation. Multiple tests of executive function are recommended to develop a greater appreciation of a patient's concurrent functional abilities.

  9. The basis function approach for modeling autocorrelation in ecological data.

    PubMed

    Hefley, Trevor J; Broms, Kristin M; Brost, Brian M; Buderman, Frances E; Kay, Shannon L; Scharf, Henry R; Tipton, John R; Williams, Perry J; Hooten, Mevin B

    2017-03-01

    Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data. © 2016 by the Ecological Society of America.
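The core idea — that adding basis-function columns to a regression design matrix absorbs smooth autocorrelation — can be sketched with a Fourier basis (one of several choices, alongside splines or spatial eigenvector bases):

```python
import numpy as np

def fourier_basis(t, K):
    # K sine/cosine pairs on [0, 1): columns that absorb smooth temporal
    # autocorrelation when appended to a regression design matrix.
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    return np.column_stack(cols)

# Toy response: a mean of 2 plus a smooth seasonal signal. The basis
# columns recover both the intercept and the smooth structure.
t = np.linspace(0, 1, 200, endpoint=False)
y = 2.0 + 0.5 * np.sin(2 * np.pi * t)
X = fourier_basis(t, 3)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

In a real analysis the basis coefficients are usually treated as penalized or random effects rather than fixed effects, which is what connects this construction to the spatial and time-series models the paper surveys.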

  10. Variable-Domain Functional Regression for Modeling ICU Data.

    PubMed

    Gellar, Jonathan E; Colantuoni, Elizabeth; Needham, Dale M; Crainiceanu, Ciprian M

    2014-12-01

    We introduce a class of scalar-on-function regression models with subject-specific functional predictor domains. The fundamental idea is to consider a bivariate functional parameter that depends both on the functional argument and on the width of the functional predictor domain. Both parametric and nonparametric models are introduced to fit the functional coefficient. The nonparametric model is theoretically and practically invariant to functional support transformation, or support registration. Methods were motivated by and applied to a study of association between daily measures of the Intensive Care Unit (ICU) Sequential Organ Failure Assessment (SOFA) score and two outcomes: in-hospital mortality, and physical impairment at hospital discharge among survivors. Methods are generally applicable to a large number of new studies that record a continuous variables over unequal domains.

  11. Influence Function Learning in Information Diffusion Networks.

    PubMed

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2014-06-01

    Can we learn the influence of a set of people in a social network from cascades of information diffusion? This question is often addressed by a two-stage approach: first learn a diffusion model, and then calculate the influence based on the learned model. Thus, the success of this approach relies heavily on the correctness of the diffusion model which is hard to verify for real world data. In this paper, we exploit the insight that the influence functions in many diffusion models are coverage functions, and propose a novel parameterization of such functions using a convex combination of random basis functions. Moreover, we propose an efficient maximum likelihood based algorithm to learn such functions directly from cascade data, and hence bypass the need to specify a particular diffusion model in advance. We provide both theoretical and empirical analysis for our approach, showing that the proposed approach can provably learn the influence function with low sample complexity, be robust to the unknown diffusion models, and significantly outperform existing approaches in both synthetic and real world data.

  12. Large memory capacity in chaotic artificial neural networks: a view of the anti-integrable limit.

    PubMed

    Lin, Wei; Chen, Guanrong

    2009-08-01

    In the literature, it was reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity as well as a remarkable ability of retrieving the stored patterns, better than the conventional chaotic model with only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism inducing the superiority of the model with periodic activation functions that includes sinusoidal functions. Particularly, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters has much more abundant chaotic dynamics that truly determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper mathematically and numerically demonstrates that an appropriate choice of the activation functions and control scheme can lead to a large memory capacity and better pattern-retrieval ability of the artificial neural network models.

  13. Enhancements to the SSME transfer function modeling code

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Mitchell, Jerrel R.; Bartholomew, David L.; Glenn, Russell D.

    1995-01-01

    This report details the results of a one year effort by Ohio University to apply the transfer function modeling and analysis tools developed under NASA Grant NAG8-167 (Irwin, 1992), (Bartholomew, 1992) to attempt the generation of Space Shuttle Main Engine High Pressure Turbopump transfer functions from time domain data. In addition, new enhancements to the transfer function modeling codes which enhance the code functionality are presented, along with some ideas for improved modeling methods and future work. Section 2 contains a review of the analytical background used to generate transfer functions with the SSME transfer function modeling software. Section 2.1 presents the 'ratio method' developed for obtaining models of systems that are subject to single unmeasured excitation sources and have two or more measured output signals. Since most of the models developed during the investigation use the Eigensystem Realization Algorithm (ERA) for model generation, Section 2.2 presents an introduction of ERA, and Section 2.3 describes how it can be used to model spectral quantities. Section 2.4 details the Residue Identification Algorithm (RID) including the use of Constrained Least Squares (CLS) and Total Least Squares (TLS). Most of this information can be found in the report (and is repeated for convenience). Section 3 chronicles the effort of applying the SSME transfer function modeling codes to the a51p394.dat and a51p1294.dat time data files to generate transfer functions from the unmeasured input to the 129.4 degree sensor output. Included are transfer function modeling attempts using five methods. The first method is a direct application of the SSME codes to the data files and the second method uses the underlying trends in the spectral density estimates to form transfer function models with less clustering of poles and zeros than the models obtained by the direct method. 
In the third approach, the time data is low pass filtered prior to the modeling process in an effort to filter out high frequency characteristics. The fourth method removes the presumed system excitation and its harmonics in order to investigate the effects of the excitation on the modeling process. The fifth method is an attempt to apply constrained RID to obtain better transfer functions through more accurate modeling over certain frequency ranges. Section 4 presents some new C main files which were created to round out the functionality of the existing SSME transfer function modeling code. It is now possible to go from time data to transfer function models using only the C codes; it is not necessary to rely on external software. The new C main files and instructions for their use are included. Section 5 presents current and future enhancements to the XPLOT graphics program which was delivered with the initial software. Several new features which have been added to the program are detailed in the first part of this section. The remainder of Section 5 then lists some possible features which may be added in the future. Section 6 contains the conclusion section of this report. Section 6.1 is an overview of the work including a summary and observations relating to finding transfer functions with the SSME code. Section 6.2 contains information relating to future work on the project.

  14. Relief of depression and pain improves daily functioning and quality of life in patients with major depressive disorder.

    PubMed

    Lin, Ching-Hua; Yen, Yung-Chieh; Chen, Ming-Chao; Chen, Cheng-Chung

    2013-12-02

    The objective of this study was to investigate the effects of depression relief and pain relief on the improvement in daily functioning and quality of life (QOL) for depressed patients receiving a 6-week treatment of fluoxetine. A total of 131 acutely ill inpatients with major depressive disorder (MDD) were enrolled to receive 20mg of fluoxetine daily for 6 weeks. Depression severity, pain severity, daily functioning, and health-related QOL were assessed at baseline and again at week 6. Depression severity, pain severity, and daily functioning were assessed using the 17-item Hamilton Depression Rating Scale, the Short-Form 36 (SF-36) Body Pain Index, and the Work and Social Adjustment Scale. Health-related QOL was assessed by three primary domains of the SF-36, including social functioning, vitality, and general health perceptions. Pearson's correlation and structural equation modeling were used to examine relationships among the study variables. Five models were proposed. In model 1, depression relief alone improved daily functioning and QOL. In model 2, pain relief alone improved daily functioning and QOL. In model 3, depression relief, mediated by pain relief, improved daily functioning and QOL. In model 4, pain relief, mediated by depression relief, improved daily functioning and QOL. In model 5, both depression relief and pain relief improved daily functioning and QOL. One hundred and six patients completed all the measures at baseline and at week 6. Model 5 was the most fitted structural equation model (χ(2) = 8.62, df = 8, p = 0.376, GFI = 0.975, AGFI = 0.935, TLI = 0.992, CFI = 0.996, RMSEA = 0.027). Interventions which relieve depression and pain improve daily functioning and QOL among patients with MDD. The proposed model can provide quantitative estimates of improvement in treating patients with MDD. © 2013 Elsevier Inc. All rights reserved.

  15. Didactic Model--Bridging a Concept with Phenomena

    ERIC Educational Resources Information Center

    Shternberg, Beba; Yerushalmy, Michal

    2004-01-01

    The article focuses on a specific method of constructing the concept of function. The core of this method is a didactic model that plays two roles together--on the one hand a role of a model of the concept of function and on the other hand a role of a model of physical phenomena that functions can represent. This synergy of modeling situations and…

  16. Why are you telling me that? A conceptual model of the social function of autobiographical memory.

    PubMed

    Alea, Nicole; Bluck, Susan

    2003-03-01

    In an effort to stimulate and guide empirical work within a functional framework, this paper provides a conceptual model of the social functions of autobiographical memory (AM) across the lifespan. The model delineates the processes and variables involved when AMs are shared to serve social functions. Components of the model include: lifespan contextual influences, the qualitative characteristics of memory (emotionality and level of detail recalled), the speaker's characteristics (age, gender, and personality), the familiarity and similarity of the listener to the speaker, the level of responsiveness during the memory-sharing process, and the nature of the social relationship in which the memory sharing occurs (valence and length of the relationship). These components are shown to influence the type of social function served and/or, the extent to which social functions are served. Directions for future empirical work to substantiate the model and hypotheses derived from the model are provided.

  17. Delay functions in trip assignment for transport planning process

    NASA Astrophysics Data System (ADS)

    Leong, Lee Vien

    2017-10-01

In the transportation planning process, volume-delay and turn-penalty functions are the functions needed in traffic assignment to determine travel time on road network links. The volume-delay function is the delay function describing the speed-flow relationship, while the turn-penalty function is the delay function associated with making a turn at an intersection. The volume-delay function used in this study is the revised Bureau of Public Roads (BPR) function with constant parameters α and β of 0.8298 and 3.361, respectively, while the turn-penalty functions for signalized intersections were developed based on uniform, random and overflow delay models. Parameters such as green time, cycle time and saturation flow were used in the development of the turn-penalty functions. In order to assess the accuracy of the delay functions, the road network in the areas of Nibong Tebal, Penang and Parit Buntar, Perak was developed and modelled using transportation demand forecasting software. In order to calibrate the models, phase times and traffic volumes at fourteen signalised intersections within the study area were collected during morning and evening peak hours. The prediction of assigned volumes using the revised BPR function and the developed turn-penalty functions shows close agreement with actual recorded traffic volumes, with the lowest accuracy being 80.08% and the highest 93.04% for the morning peak model. For the evening peak model, the lowest and highest accuracies were 75.59% and 95.33%, respectively. As for the yield left-turn lanes, the lowest accuracies obtained for the morning and evening peak models were 60.94% and 69.74% respectively, while the highest accuracy obtained for both models was 100%.
It can therefore be concluded that the development and utilisation of delay functions based on local road conditions are important, as localised delay functions can produce better estimates of link travel times and hence better planning for future scenarios.
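For reference, a BPR-type volume-delay function with the constants reported above can be written as:

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.8298, beta=3.361):
    # Revised BPR volume-delay function: congested link travel time as a
    # function of the volume-to-capacity ratio. t0 is the free-flow travel
    # time; alpha and beta default to the constants from the abstract.
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)

# At v/c = 1, travel time is multiplied by (1 + alpha).
t_congested = bpr_travel_time(60.0, 1000, 1000)  # 60 * 1.8298 = 109.788
```

A turn-penalty term (uniform, random, or overflow delay at the signal) would then be added on top of the link time during assignment.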

  18. TH-A-9A-04: Incorporating Liver Functionality in Radiation Therapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, V; Epelman, M; Feng, M

    2014-06-15

Purpose: Liver SBRT patients have both variable pretreatment liver function (e.g., due to degree of cirrhosis and/or prior treatments) and sensitivity to radiation, leading to high variability in potential liver toxicity with similar doses. This work aims to explicitly incorporate liver perfusion into treatment planning to redistribute dose to preserve well-functioning areas without compromising target coverage. Methods: Voxel-based liver perfusion, a measure of functionality, was computed from dynamic contrast-enhanced MRI. Two optimization models with different cost functions subject to the same dose constraints (e.g., minimum target EUD and maximum critical structure EUDs) were compared. The cost functions minimized were EUD (standard model) and functionality-weighted EUD (functional model) to the liver. The resulting treatment plans delivering the same target EUD were compared with respect to their DVHs, their dose wash difference, the average dose delivered to voxels of a particular perfusion level, and change in number of high-/low-functioning voxels receiving a particular dose. Two-dimensional synthetic and three-dimensional clinical examples were studied. Results: The DVHs of all structures of plans from each model were comparable. In contrast, in plans obtained with the functional model, the average dose delivered to high-/low-functioning voxels was lower/higher than in plans obtained with its standard counterpart. The number of high-/low-functioning voxels receiving high/low dose was lower in the plans that considered perfusion in the cost function than in the plans that did not. Redistribution of dose can be observed in the dose wash differences. Conclusion: Liver perfusion can be used during treatment planning potentially to minimize the risk of toxicity during liver SBRT, resulting in better global liver function.
Relative to the standard model, the functional model redistributes dose from higher- to lower-functioning voxels while achieving the same target EUD and satisfying dose limits to critical structures. This project is funded by MCubed and grant R01-CA132834.
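The two cost functions can be sketched as generalized EUD with and without functionality weights. The specific weighting scheme below is an illustrative guess, not necessarily the exact form used in the study:

```python
import numpy as np

def eud(dose, a):
    # Generalized equivalent uniform dose over voxel doses; the parameter
    # a encodes organ seriality (a > 1 emphasizes hot spots).
    d = np.asarray(dose, float)
    return np.mean(d ** a) ** (1.0 / a)

def functional_eud(dose, perfusion, a):
    # Functionality-weighted EUD: voxels with higher perfusion (better
    # function) contribute more to the cost, so the optimizer is pushed
    # to spare them. Hypothetical weighting for illustration.
    d = np.asarray(dose, float)
    w = np.asarray(perfusion, float)
    w = w / w.sum()                      # normalize weights
    return np.sum(w * d ** a) ** (1.0 / a)
```

For a uniform dose both measures reduce to that dose; they diverge exactly when dose and perfusion are spatially correlated, which is what the functional model exploits.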

  19. Functional model of biological neural networks.

    PubMed

    Lo, James Ting-Ho

    2010-12-01

A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieving, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations for neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justifications of these functional models and their processing operations are required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  20. Use of DAVID algorithms for gene functional classification in a non-model organism, rainbow trout

    USDA-ARS?s Scientific Manuscript database

    Gene functional clustering is essential in transcriptome data analysis but software programs are not always suitable for use with non-model species. The DAVID Gene Functional Classification Tool has been widely used for soft clustering in model species, but requires adaptations for use in non-model ...

  1. Optimization of canopy conductance models from concurrent measurements of sap flow and stem water potential on Drooping Sheoak in South Australia

    NASA Astrophysics Data System (ADS)

    Wang, Hailong; Guan, Huade; Deng, Zijuan; Simmons, Craig T.

    2014-07-01

Canopy conductance (gc) is a critical component in hydrological modeling for transpiration estimation. It is often formulated as functions of environmental variables. These functions are climate and vegetation specific. Thus, it is important to determine the appropriate functions in gc models and corresponding parameter values for a specific environment. In this study, sap flow, stem water potential, and microclimatic variables were measured for three Drooping Sheoak (Allocasuarina verticillata) trees in the years 2011, 2012, and 2014. Canopy conductance was calculated from the inverted Penman-Monteith (PM) equation, which was then used to examine 36 gc models that comprise different response functions. Parameters were optimized using the DiffeRential Evolution Adaptive Metropolis (DREAM) model based on a training data set in 2012. Use of proper predawn stem water potential, vapor pressure deficit, and temperature functions improves model performance significantly, while no pronounced difference is observed between models that differ in solar radiation functions. The best model gives a correlation coefficient of 0.97 and a root-mean-square error of 0.0006 m/s in comparison to the PM-calculated gc. The optimized temperature function shows different characteristics from its counterparts in other similar studies. This is likely due to strong interdependence between air temperature and vapor pressure deficit in the study area or Sheoak tree physiology. Supported by the measurements and optimization results, we suggest that the effects of air temperature and vapor pressure deficit on canopy conductance should be represented together.
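A Jarvis-type multiplicative gc model of the kind examined here combines independent environmental response functions. The specific response forms and constants below are illustrative placeholders, not the optimized functions from the study:

```python
import numpy as np

def canopy_conductance(g_max, R, D, T, psi_pd):
    # Jarvis-type multiplicative model: gc = g_max * f(R) f(D) f(T) f(psi).
    # Each f() maps its driver to [0, 1]; forms and constants are
    # hypothetical examples for illustration only.
    f_R = R / (R + 200.0)                              # solar radiation (W m^-2)
    f_D = np.exp(-0.5 * D)                             # vapor pressure deficit (kPa)
    f_T = np.clip(1.0 - ((T - 25.0) / 15.0) ** 2, 0, 1)  # air temperature (deg C)
    f_psi = np.clip(1.0 + psi_pd / 3.0, 0, 1)          # predawn water potential (MPa, <= 0)
    return g_max * f_R * f_D * f_T * f_psi
```

The study's 36 candidate models correspond to swapping alternative forms for each f(); the suggestion that temperature and vapor pressure deficit be represented together would replace f_D and f_T by a single joint response.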

  2. Muscle function may depend on model selection in forward simulation of normal walking

    PubMed Central

    Xiao, Ming; Higginson, Jill S.

    2008-01-01

    The purpose of this study was to quantify how the predicted muscle function would change in a muscle-driven forward simulation of normal walking when changing the number of degrees of freedom in the model. Muscle function was described by individual muscle contributions to the vertical acceleration of the center of mass (COM). We built a two-dimensional (2D) sagittal plane model and a three-dimensional (3D) model in OpenSim and used both models to reproduce the same normal walking data. Perturbation analysis was applied to deduce muscle function in each model. Muscle excitations and contributions to COM support were compared between the 2D and 3D models. We found that the 2D model was able to reproduce similar joint kinematics and kinetics patterns as the 3D model. Individual muscle excitations were different for most of the hip muscles but ankle and knee muscles were able to attain similar excitations. Total induced vertical COM acceleration by muscles and gravity was the same for both models. However, individual muscle contributions to COM support varied, especially for hip muscles. Although there is currently no standard way to validate muscle function predictions, a 3D model seems to be more appropriate for estimating individual hip muscle function. PMID:18804767

  3. The Functionally-Assembled Terrestrial Ecosystem Simulator Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Chonggang; Christoffersen, Bradley

The Functionally-Assembled Terrestrial Ecosystem Simulator (FATES) is a vegetation model for use in Earth system models (ESMs). The model includes a size- and age-structured representation of tree dynamics, competition between functionally diverse plant functional types, and the biophysics underpinning plant growth, competition, and mortality, as well as the carbon, water, and energy exchange with the atmosphere. The FATES model is designed as a modular vegetation model that can be integrated within a host land model for inclusion in ESMs. The model is designed for use in global change studies to understand and project the responses and feedbacks between terrestrial ecosystems and the Earth system under changing climate and other forcings.

  4. Diet models with linear goal programming: impact of achievement functions.

    PubMed

    Gerdessen, J C; de Vries, J H M

    2015-11-01

Diet models based on goal programming (GP) are valuable tools in designing diets that comply with nutritional, palatability and cost constraints. Results derived from GP models are usually very sensitive to the type of achievement function that is chosen. This paper aims to provide a methodological insight into several achievement functions. It describes the extended GP (EGP) achievement function, which enables the decision maker to use either a MinSum achievement function (which minimizes the sum of the unwanted deviations) or a MinMax achievement function (which minimizes the largest unwanted deviation), or a compromise between both. An additional advantage of EGP models is that multiple solutions can be obtained from one set of data and weights. We use small numerical examples to illustrate the 'mechanics' of achievement functions. Then, the EGP achievement function is demonstrated on a diet problem with 144 foods, 19 nutrients and several types of palatability constraints, in which the nutritional constraints are modeled with fuzzy sets. The choice of achievement function affects the results of diet models. MinSum achievement functions can give rise to solutions that are sensitive to weight changes and that pile all unwanted deviations on a limited number of nutritional constraints. MinMax achievement functions spread the unwanted deviations as evenly as possible, but may create many (small) deviations. EGP comprises both types of achievement functions, as well as compromises between them. It can thus, from one data set, find a range of solutions with various properties.
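The difference between MinSum and MinMax achievement functions can be illustrated on a toy diet problem. A real GP model solves a linear program; the brute-force grid search below is only meant to make the two objectives concrete, and the foods and goals are made up:

```python
# Toy goal program: choose grams of two hypothetical foods to approach
# three nutrient goals; compare MinSum vs MinMax achievement functions
# by grid search (a stand-in for the LP solver a real GP model uses).
foods = {"A": (0.10, 0.02, 0.30), "B": (0.05, 0.08, 0.10)}  # nutrients per gram
goals = (20.0, 10.0, 40.0)                                  # nutrient targets

def deviations(xa, xb):
    # Unwanted deviations: absolute gap between achieved and target levels.
    totals = [xa * fa + xb * fb for fa, fb in zip(foods["A"], foods["B"])]
    return [abs(t - g) for t, g in zip(totals, goals)]

grid = [(xa, xb) for xa in range(0, 301, 5) for xb in range(0, 301, 5)]
min_sum = min(grid, key=lambda x: sum(deviations(*x)))  # MinSum achievement
min_max = min(grid, key=lambda x: max(deviations(*x)))  # MinMax achievement
```

By construction, the MinSum solution has the smaller total deviation while the MinMax solution has the smaller worst-case deviation; the EGP achievement function interpolates between these two behaviors.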

  5. Evaluation of Empirical Tropospheric Models Using Satellite-Tracking Tropospheric Wet Delays with Water Vapor Radiometer at Tongji, China

    PubMed Central

    Wang, Miaomiao; Li, Bofeng

    2016-01-01

    An empirical tropospheric delay model, together with a mapping function, is commonly used to correct the tropospheric errors in global navigation satellite system (GNSS) processing. As is well known, the accuracy of tropospheric delay models relies mainly on the correction efficiency for tropospheric wet delays. In this paper, we evaluate the accuracy of three tropospheric delay models together with five mapping functions in wet delay calculation. The evaluations are conducted by comparing their slant wet delays with those measured by a water vapor radiometer using its satellite-tracking function (data collected with a large liquid water path are removed). For all 15 combinations of three tropospheric models and five mapping functions, their accuracies as a function of elevation are statistically analyzed using nine days of data in two scenarios, with and without meteorological data. The results show that (1) with or without meteorological data, there is no practical difference among the mapping functions, i.e., Chao, Ifadis, Vienna Mapping Function 1 (VMF1), Niell Mapping Function (NMF), and MTT Mapping Function (MTT); (2) without meteorological data, the UNB3 model is much better than the Saastamoinen and Hopfield models, while the Saastamoinen model performs slightly better than the Hopfield model; (3) with meteorological data, the accuracies of all three tropospheric delay models improve to comparable levels, especially at lower elevations. In addition, kinematic precise point positioning, with no parameter set up for tropospheric delay correction, is conducted to further evaluate the performance of the tropospheric delay models in positioning accuracy. It is shown that the UNB3 model is best and can achieve about 10 cm accuracy for the N and E coordinate components and 20 cm accuracy for the U component, whether or not meteorological data are available. This accuracy can be obtained by the Saastamoinen model only when meteorological data are available, degrading to 46 cm for the U component otherwise. PMID:26848662
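    To make the zenith-delay-plus-mapping-function pipeline concrete, here is a minimal Python sketch pairing the standard Saastamoinen zenith hydrostatic delay formula with a crude flat-atmosphere 1/sin(e) mapping function. The function names are mine, and real processing would use tabulated VMF1/NMF coefficients rather than this simple mapping.

```python
import math

def saastamoinen_zhd(pressure_hpa, lat_rad, height_m):
    """Zenith hydrostatic delay in metres (standard Saastamoinen form)."""
    return 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.28e-6 * height_m)

def slant_delay(zenith_delay_m, elevation_rad):
    """Map a zenith delay to the slant direction with a flat-atmosphere
    1/sin(e) mapping function (a crude stand-in for VMF1, NMF, etc.)."""
    return zenith_delay_m / math.sin(elevation_rad)
```

    At 30 degrees elevation the 1/sin(e) factor is exactly 2, which illustrates why accuracy differences between models grow at the lower elevations analyzed in the paper.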

  6. An alternative perspective on assistive technology: the Person-Environment-Tool (PET) model.

    PubMed

    Jarl, Gustav; Lundqvist, Lars-Olov

    2018-04-20

    The medical and social models of disability are based on a dichotomy that categorizes people as able-bodied or disabled. In contrast, the biopsychosocial model, which forms the basis for the International Classification of Functioning, Disability and Health (ICF), suggests a universalistic perspective on human functioning, encompassing all human beings. In this article, we argue that the artificial separation of function-enhancing technology into assistive technology (AT) and mainstream technology might be one of the barriers to a universalistic view of human functioning. Thus, an alternative view of AT is needed. The aim of this article was to construct a conceptual model to demonstrate how all human activities and participation depend on factors related to the person, environment, and tools, emphasizing a universalistic perspective on human functioning. In the Person-Environment-Tool (PET) model, a person's activity and participation are described as a function of factors related to the person, environment, and tool, drawing on various ICF components. Importantly, the PET model makes no distinction between people of different ability levels, between environmental modifications intended for people of different ability levels, or between different function-enhancing technologies (AT and mainstream technology). A fictitious patient case is used to illustrate how the universalistic view of the PET model leads to a different approach in rehabilitation. The PET model supports a universalistic view of technology use, environmental adaptations, and variations in human functioning.

  7. Value function in economic growth model

    NASA Astrophysics Data System (ADS)

    Bagno, Alexander; Tarasyev, Alexandr A.; Tarasyev, Alexander M.

    2017-11-01

    Properties of the value function are examined in an infinite horizon optimal control problem with an unbounded integrand appearing in the discounted quality functional. Optimal control problems of this type describe solutions in models of economic growth. Necessary and sufficient conditions are derived to ensure that the value function satisfies the infinitesimal stability properties. It is proved that the value function coincides with the minimax solution of the Hamilton-Jacobi equation. The asymptotic growth behavior of the value function is described for logarithmic, power, and exponential quality functionals, and an example is given to illustrate construction of the value function in economic growth models.
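    In generic notation (not necessarily the authors' exact formulation), the discounted problem and the stationary Hamilton-Jacobi equation that the value function solves in the minimax (viscosity) sense read:

```latex
% Discounted infinite-horizon optimal control problem:
V(x_0) = \sup_{u(\cdot)} \int_0^{\infty} e^{-\rho t}\, g\bigl(x(t),u(t)\bigr)\,dt,
\qquad \dot{x}(t) = f\bigl(x(t),u(t)\bigr), \quad x(0) = x_0 .

% Stationary Hamilton--Jacobi equation satisfied by V
% in the minimax (viscosity) sense:
\rho\, V(x) = \sup_{u} \bigl[\, g(x,u) + \nabla V(x) \cdot f(x,u) \,\bigr].
```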

  8. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
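    The functional kernel regression step can be sketched in a few lines of Python. This is a bare Nadaraya-Watson estimator with a plain L2 semi-metric and the asymmetric quadratic kernel; the names are mine, and the study's semi-metrics are derivative- and FPCA-based.

```python
import numpy as np

def l2_semimetric(curve_a, curve_b):
    """Plain L2 distance between two discretised daily curves."""
    return np.linalg.norm(curve_a - curve_b)

def functional_nw_forecast(history, responses, new_curve, h):
    """Nadaraya-Watson functional regression: past days are weighted by
    the kernelised distance of their curve to the new day's curve."""
    d = np.array([l2_semimetric(c, new_curve) for c in history])
    u = d / h                                   # h: cross-validated bandwidth
    w = np.where(u < 1.0, 1.0 - u ** 2, 0.0)    # asymmetric quadratic kernel
    return np.sum(w * responses) / np.sum(w)
```

    Days whose curves resemble the new day's curve receive the largest weights, so the forecast is a locally weighted average of their observed responses.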

  9. A Novel Higher Order Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Xu, Shuxiang

    2010-05-01

    In this paper a new Higher Order Neural Network (HONN) model is introduced and applied in several data mining tasks. Data Mining extracts hidden patterns and valuable information from large databases. A hyperbolic tangent function is used as the neuron activation function for the new HONN model. Experiments are conducted to demonstrate the advantages and disadvantages of the new HONN model, when compared with several conventional Artificial Neural Network (ANN) models: Feedforward ANN with the sigmoid activation function; Feedforward ANN with the hyperbolic tangent activation function; and Radial Basis Function (RBF) ANN with the Gaussian activation function. The experimental results suggest that the new HONN offers higher generalization capability as well as better handling of missing data.
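    The defining feature of a HONN, higher-order (here pairwise product) terms inside the neuron, combined with the hyperbolic tangent activation, can be sketched as follows; this is a generic second-order neuron for illustration, not the paper's exact architecture.

```python
import math
from itertools import combinations

def honn_neuron(x, w_linear, w_pairs, bias=0.0):
    """Second-order neuron: tanh of the linear terms plus all pairwise
    product terms x_i * x_j (i < j), each with its own weight."""
    s = bias + sum(w * xi for w, xi in zip(w_linear, x))
    for w, (i, j) in zip(w_pairs, combinations(range(len(x)), 2)):
        s += w * x[i] * x[j]
    return math.tanh(s)
```

    The product terms let a single neuron represent multiplicative feature interactions that a first-order feedforward neuron must approximate with extra hidden units.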

  10. A Bayesian spatial model for neuroimaging data based on biologically informed basis functions.

    PubMed

    Huertas, Ismael; Oldehinkel, Marianne; van Oort, Erik S B; Garcia-Solis, David; Mir, Pablo; Beckmann, Christian F; Marquand, Andre F

    2017-11-01

    The dominant approach to neuroimaging data analysis employs the voxel as the unit of computation. While convenient, voxels lack biological meaning and their size is arbitrarily determined by the resolution of the image. Here, we propose a multivariate spatial model in which neuroimaging data are characterised as a linearly weighted combination of multiscale basis functions which map onto underlying brain nuclei or networks. In this model, the elementary building blocks are derived to reflect the functional anatomy of the brain during the resting state. This model is estimated using a Bayesian framework which accurately quantifies uncertainty and automatically finds the most accurate and parsimonious combination of basis functions describing the data. We demonstrate the utility of this framework by predicting quantitative SPECT images of striatal dopamine function and we compare a variety of basis sets including generic isotropic functions, anatomical representations of the striatum derived from structural MRI, and two different soft functional parcellations of the striatum derived from resting-state fMRI (rfMRI). We found that a combination of ∼50 multiscale functional basis functions accurately represented the striatal dopamine activity, and that functional basis functions derived from an advanced parcellation technique known as Instantaneous Connectivity Parcellation (ICP) provided the most parsimonious models of dopamine function. Importantly, functional basis functions derived from resting fMRI were more accurate than both structural and generic basis sets in representing dopamine function in the striatum for a fixed model order. We demonstrate the translational validity of our framework by constructing classification models for discriminating parkinsonian disorders and their subtypes. Here, we show that the ICP approach is the only basis set that performs well across all comparisons and performs better overall than the classical voxel-based approach.
This spatial model constitutes an elegant alternative to voxel-based approaches in neuroimaging studies; not only are their atoms biologically informed, they are also adaptive to high resolutions, represent high dimensions efficiently, and capture long-range spatial dependencies, which are important and challenging objectives for neuroimaging data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
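    The core estimation step, a linearly weighted combination of basis functions fitted in a Bayesian framework, can be illustrated with a generic conjugate Gaussian sketch (my own minimal version; the paper's framework additionally compares basis sets and selects model order automatically):

```python
import numpy as np

def posterior_weights(Phi, y, alpha=1.0, sigma2=1.0):
    """Posterior mean and covariance of basis weights for y = Phi @ w + noise,
    under a Gaussian prior w ~ N(0, alpha^-1 I) and noise variance sigma2."""
    A = alpha * np.eye(Phi.shape[1]) + Phi.T @ Phi / sigma2
    cov = np.linalg.inv(A)
    mean = cov @ Phi.T @ y / sigma2
    return mean, cov
```

    The posterior covariance is what supplies the uncertainty quantification the abstract emphasises; with a weak prior (small alpha) the posterior mean approaches the least-squares fit.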

  11. Defining Function in the Functional Medicine Model.

    PubMed

    Bland, Jeffrey

    2017-02-01

    In the functional medicine model, the word function is aligned with the evolving understanding that disease is an endpoint and function is a process. Function can move both forward and backward. The vector of change in function through time is, in part, determined by the unique interaction of an individual's genome with their environment, diet, and lifestyle. The functional medicine model for health care is concerned less with what we call the dysfunction or disease , and more about the dynamic processes that resulted in the person's dysfunction. The previous concept of functional somatic syndromes as psychosomatic in origin has now been replaced with a new concept of function that is rooted in the emerging 21st-century understanding of systems network-enabled biology.

  12. Defining Function in the Functional Medicine Model

    PubMed Central

    Bland, Jeffrey

    2017-01-01

    In the functional medicine model, the word function is aligned with the evolving understanding that disease is an endpoint and function is a process. Function can move both forward and backward. The vector of change in function through time is, in part, determined by the unique interaction of an individual’s genome with their environment, diet, and lifestyle. The functional medicine model for health care is concerned less with what we call the dysfunction or disease, and more about the dynamic processes that resulted in the person’s dysfunction. The previous concept of functional somatic syndromes as psychosomatic in origin has now been replaced with a new concept of function that is rooted in the emerging 21st-century understanding of systems network-enabled biology. PMID:28223904

  13. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention, and it is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database, and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. For enhancing the capability to introduce new design variables into the conceptual design process, and to dig out more innovative physical structure schemes, the indirect function-structure matching strategy based on reconstructing the combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstructing process, and a great number of simpler physical structure schemes that organically accomplish the overall function are obtained. The creativity enhanced conceptual design model presented has a dominant capability in introducing new design variables in the function domain and digging out simpler physical structures to accomplish the overall function, therefore it can be utilized to solve non-routine conceptual design problems.

  14. An adaptive radiation model for the origin of new gene functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francino, M. Pilar

    2004-10-18

    The evolution of new gene functions is one of the keys to evolutionary innovation. Most novel functions result from gene duplication followed by divergence. However, the models hitherto proposed to account for this process are not fully satisfactory. The classic model of neofunctionalization holds that the two paralogous gene copies resulting from a duplication are functionally redundant, such that one of them can evolve under no functional constraints and occasionally acquire a new function. This model lacks a convincing mechanism for the new gene copies to increase in frequency in the population and survive the mutational load expected to accumulate under neutrality, before the acquisition of the rare beneficial mutations that would confer new functionality. The subfunctionalization model has been proposed as an alternative way to generate genes with altered functions. This model also assumes that new paralogous gene copies are functionally redundant and therefore neutral, but it predicts that relaxed selection will affect both gene copies such that some of the capabilities of the parent gene will disappear in one of the copies and be retained in the other. Thus, the functions originally present in a single gene will be partitioned between the two descendant copies. However, although this model can explain increases in gene number, it does not really address the main evolutionary question, which is the development of new biochemical capabilities. Recently, a new concept has been introduced into the gene evolution literature which is most likely to help solve this dilemma. The key point is to allow for a period of natural selection for the duplication per se, before new function evolves, rather than considering gene duplication to be neutral as in the previous models. Here, I suggest a new model that draws on the advantage of postulating selection for gene duplication, and proposes that bursts of adaptive gene amplification in response to specific selection pressures provide the raw material for the evolution of new function.

  15. Generalized neurofuzzy network modeling algorithms using Bézier-Bernstein polynomial functions and additive decomposition.

    PubMed

    Hong, X; Harris, C J

    2000-01-01

    This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bézier-Bernstein polynomial functions. The algorithm is generalized in that it copes with n-dimensional inputs, utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. This new construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like the B-spline expansion based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. This new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for both univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data based modeling approach.
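    The basis-function properties the abstract leans on, nonnegativity and summing to one (so each basis function can be read as a fuzzy membership function), are easy to verify for the Bernstein basis:

```python
from math import comb

def bernstein_basis(n, x):
    """Degree-n Bernstein polynomials B_{i,n}(x) = C(n,i) x^i (1-x)^(n-i),
    evaluated at x in [0, 1]; nonnegative and summing to one."""
    return [comb(n, i) * x**i * (1.0 - x) ** (n - i) for i in range(n + 1)]
```

    Because the basis values at any x form a partition of unity, a weighted sum of them behaves like a fuzzy inference with normalised memberships.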

  16. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  17. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.
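    The reduction at the heart of functional response regression, expanding the coefficient curves in a basis so the whole fit becomes one linear model, can be sketched with a bare least-squares toy (polynomial basis and my own function name; pffr() instead uses penalised splines, mixed-model machinery, and functional random effects):

```python
import numpy as np

def fit_function_on_scalar(Y, x, n_basis=5):
    """Function-on-scalar regression sketch: Y[i, :] ~ beta0(t) + x[i]*beta1(t),
    with both coefficient curves expanded in a degree-(n_basis-1) polynomial
    basis, solved as vec(Y) = (B kron X) vec(C) by least squares."""
    n, T = Y.shape
    t = np.linspace(0.0, 1.0, T)
    B = np.vander(t, n_basis, increasing=True)    # T x K basis matrix
    X = np.column_stack([np.ones(n), x])          # n x 2 scalar design
    c, *_ = np.linalg.lstsq(np.kron(B, X), Y.flatten("F"), rcond=None)
    C = c.reshape(2, n_basis, order="F")          # 2 x K coefficient matrix
    return C @ B.T                                # rows: beta0(t), beta1(t)
```

    When the true coefficient curves lie in the basis span and there is no noise, this recovers them exactly; the penalised-spline machinery in the paper handles noise, sparsity, and random effects on top of this same reduction.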

  18. A long term model of circulation. [human body

    NASA Technical Reports Server (NTRS)

    White, R. J.

    1974-01-01

    A quantitative approach to modeling human physiological function, with a view toward ultimate application to long duration space flight experiments, was undertaken. Data was obtained on the effect of weightlessness on certain aspects of human physiological function during 1-3 month periods. Modifications in the Guyton model are reviewed. Design considerations for bilateral interface models are discussed. Construction of a functioning whole body model was studied, as well as the testing of the model versus available data.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, L.F.

    Calculations for the two-point correlation functions in the scaling limit for two statistical models are presented. In Part I, the Ising model with a linear defect is studied for T < T_c and T > T_c. The transfer matrix method of Onsager and Kaufman is used. The energy-density correlation is given by functions related to the modified Bessel functions. The dispersion expansions for the spin-spin correlation functions are derived. The dominant behavior for large separations at T ≠ T_c is extracted. It is shown that these expansions lead to systems of Fredholm integral equations. In Part II, the electric correlation function of the eight-vertex model for T < T_c is studied. The eight-vertex model decouples into two independent Ising models when the four-spin coupling vanishes. To first order in the four-spin coupling, the electric correlation function is related to a three-point function of the Ising model. This relation is systematically investigated and the full dispersion expansion (to first order in the four-spin coupling) is obtained. The result is a new kind of structure which, unlike those of many solvable models, is apparently not expressible in terms of linear integral equations.

  20. COPEWELL: A Conceptual Framework and System Dynamics Model for Predicting Community Functioning and Resilience After Disasters.

    PubMed

    Links, Jonathan M; Schwartz, Brian S; Lin, Sen; Kanarek, Norma; Mitrani-Reiser, Judith; Sell, Tara Kirk; Watson, Crystal R; Ward, Doug; Slemp, Cathy; Burhans, Robert; Gill, Kimberly; Igusa, Tak; Zhao, Xilei; Aguirre, Benigno; Trainor, Joseph; Nigg, Joanne; Inglesby, Thomas; Carbone, Eric; Kendra, James M

    2018-02-01

    Policy-makers and practitioners have a need to assess community resilience in disasters. Prior efforts conflated resilience with community functioning, combined resistance and recovery (the components of resilience), and relied on a static model for what is inherently a dynamic process. We sought to develop linked conceptual and computational models of community functioning and resilience after a disaster. We developed a system dynamics computational model that predicts community functioning after a disaster. The computational model outputted the time course of community functioning before, during, and after a disaster, which was used to calculate resistance, recovery, and resilience for all US counties. The conceptual model explicitly separated resilience from community functioning and identified all key components for each, which were translated into a system dynamics computational model with connections and feedbacks. The components were represented by publicly available measures at the county level. Baseline community functioning, resistance, recovery, and resilience evidenced a range of values and geographic clustering, consistent with hypotheses based on the disaster literature. The work is transparent, motivates ongoing refinements, and identifies areas for improved measurements. After validation, such a model can be used to identify effective investments to enhance community resilience. (Disaster Med Public Health Preparedness. 2018;12:127-137).
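    The separation the authors draw between resilience and functioning can be pictured with a deliberately toy closed-form trajectory (my own illustration, not the COPEWELL system dynamics equations): functioning drops to the "resistance" level at the event and then relaxes back to baseline at the "recovery" rate.

```python
import math

def community_functioning(t, event_time=0.0, resistance=0.6, recovery_rate=0.5):
    """Toy functioning trajectory on a 0..1 scale: baseline 1.0 before the
    disaster, an immediate drop to `resistance`, then exponential recovery."""
    if t < event_time:
        return 1.0
    return 1.0 - (1.0 - resistance) * math.exp(-recovery_rate * (t - event_time))
```

    Resilience summarises this whole curve: how far functioning falls (resistance) and how fast it returns (recovery), which is why a static snapshot cannot capture it.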

  1. Model Based Predictive Control of Multivariable Hammerstein Processes with Fuzzy Logic Hypercube Interpolated Models

    PubMed Central

    Coelho, Antonio Augusto Rodrigues

    2016-01-01

    This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system where membership functions act as kernel functions of an interpolator. Conjunction of membership functions in a unitary hypercube space enables multivariable interpolation in N dimensions. Membership functions act as interpolation kernels, such that the choice of membership functions determines interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline, or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the modeling problem of static nonlinearities since it is capable of modeling both a function and its inverse function. Three study cases from literature are presented, a single-input single-output (SISO) system, a MISO and a MIMO system. Good results are obtained regarding performance metrics such as set-point tracking, control variation and robustness. Results demonstrate applicability of the proposed method in modeling Hammerstein nonlinearities and their inverse functions for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC). PMID:27657723
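    The kernel-as-membership idea is easiest to see in one dimension, where triangular memberships over a grid reproduce linear interpolation; swapping the membership shape would give cubic, spline, or Lanczos behaviour. This sketch and its names are mine; FLHI itself operates on the N-dimensional unitary hypercube.

```python
import bisect

def flhi_1d(grid, values, x):
    """1-D membership-function interpolation: triangular memberships over
    a sorted grid reduce to ordinary linear interpolation."""
    j = min(max(bisect.bisect_right(grid, x) - 1, 0), len(grid) - 2)
    u = (x - grid[j]) / (grid[j + 1] - grid[j])   # position within the cell
    mu_left, mu_right = 1.0 - u, u                # triangular memberships
    return mu_left * values[j] + mu_right * values[j + 1]
```

    In N dimensions the memberships of the 2^N cell corners are conjoined (multiplied), which is the hypercube interpolation the abstract describes.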

  2. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    PubMed

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints, validation for each model, and comparison of model performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second one, on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water to a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent each species' growth requirements within the group. Each model's performance is quantitatively assessed by three diagnostic measures. Parameter estimation results for seasonal dynamics of the phytoplankton community and main biogeochemical variables for a one-year time horizon are presented and compared for both models, showing the functional group model's enhanced performance. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Quantifying uncertainty in partially specified biological models: how can optimal control theory help us?

    PubMed

    Adamson, M W; Morozov, A Y; Kuzenkov, O A

    2016-09-01

    Mathematical models in biology are highly simplified representations of a complex underlying reality and there is always a high degree of uncertainty with regards to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project infinite dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method for uncertainty analysis of biological models.
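    Structural sensitivity is easy to demonstrate numerically: two saturating functional responses with the same slope at the origin nearly agree over a small observed range yet diverge when extrapolated. The parameters below are hand-picked for illustration, not fitted to any dataset.

```python
import math

# Holling type II and Ivlev functional responses, calibrated to share
# the same slope (2.0) at the origin and the same asymptote (2.0):
def holling2(x):
    return 2.0 * x / (1.0 + x)

def ivlev(x):
    return 2.0 * (1.0 - math.exp(-x))
```

    On the "observed" range near zero the two curves are nearly indistinguishable, but any prediction that depends on behaviour at larger densities inherits the divergence, which is exactly the uncertainty the partially specified framework is designed to quantify.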

  4. Defining a Model for Mitochondrial Function in mESC Differentiation

    EPA Science Inventory

    Defining a Model for Mitochondrial Function in mESC Differentiation. Differentiating embryonic stem cells (ESCs) undergo mitochondrial maturation leading to a switch from a system dependent upon glycolysis to a re...

  5. Flexible link functions in nonparametric binary regression with Gaussian process priors.

    PubMed

    Li, Dan; Wang, Xia; Lin, Lizhen; Dey, Dipak K

    2016-09-01

    In many scientific fields, it is a common practice to collect a sequence of 0-1 binary responses from a subject across time, space, or a collection of covariates. Researchers are interested in finding out how the expected binary outcome is related to covariates, and aim at better prediction in the future 0-1 outcomes. Gaussian processes have been widely used to model nonlinear systems; in particular to model the latent structure in a binary regression model allowing nonlinear functional relationship between covariates and the expectation of binary outcomes. A critical issue in modeling binary response data is the appropriate choice of link functions. Commonly adopted link functions such as probit or logit links have fixed skewness and lack the flexibility to allow the data to determine the degree of the skewness. To address this limitation, we propose a flexible binary regression model which combines a generalized extreme value link function with a Gaussian process prior on the latent structure. Bayesian computation is employed in model estimation. Posterior consistency of the resulting posterior distribution is demonstrated. The flexibility and gains of the proposed model are illustrated through detailed simulation studies and two real data examples. Empirical results show that the proposed model outperforms a set of alternative models, which only have either a Gaussian process prior on the latent regression function or a Dirichlet prior on the link function. © 2015, The International Biometric Society.
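    The proposed link can be sketched via the GEV distribution function, whose shape parameter xi controls skewness; xi -> 0 recovers the Gumbel (log-log) response, while probit and logit have no analogous free skewness parameter. The function name and boundary handling below are my own.

```python
import math

def gev_link(eta, xi):
    """GEV response function p = F(eta; xi) mapping the latent scale
    to [0, 1]; xi is the skewness-controlling shape parameter."""
    if abs(xi) < 1e-12:                 # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-eta))
    t = 1.0 + xi * eta
    if t <= 0.0:                        # outside the GEV support
        return 0.0 if xi > 0.0 else 1.0
    return math.exp(-t ** (-1.0 / xi))
```

    Letting the data inform xi is what gives the model its flexibility: different xi values tilt the response curve toward 0 or 1 asymmetrically, unlike the fixed-skewness probit and logit links.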

  6. Flexible Link Functions in Nonparametric Binary Regression with Gaussian Process Priors

    PubMed Central

    Li, Dan; Lin, Lizhen; Dey, Dipak K.

    2015-01-01

    Summary In many scientific fields, it is a common practice to collect a sequence of 0-1 binary responses from a subject across time, space, or a collection of covariates. Researchers are interested in finding out how the expected binary outcome is related to covariates, and aim at better prediction in the future 0-1 outcomes. Gaussian processes have been widely used to model nonlinear systems; in particular to model the latent structure in a binary regression model allowing nonlinear functional relationship between covariates and the expectation of binary outcomes. A critical issue in modeling binary response data is the appropriate choice of link functions. Commonly adopted link functions such as probit or logit links have fixed skewness and lack the flexibility to allow the data to determine the degree of the skewness. To address this limitation, we propose a flexible binary regression model which combines a generalized extreme value link function with a Gaussian process prior on the latent structure. Bayesian computation is employed in model estimation. Posterior consistency of the resulting posterior distribution is demonstrated. The flexibility and gains of the proposed model are illustrated through detailed simulation studies and two real data examples. Empirical results show that the proposed model outperforms a set of alternative models, which only have either a Gaussian process prior on the latent regression function or a Dirichlet prior on the link function. PMID:26686333

  7. Influence Function Learning in Information Diffusion Networks

    PubMed Central

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2015-01-01

    Can we learn the influence of a set of people in a social network from cascades of information diffusion? This question is often addressed by a two-stage approach: first learn a diffusion model, and then calculate the influence based on the learned model. Thus, the success of this approach relies heavily on the correctness of the diffusion model which is hard to verify for real world data. In this paper, we exploit the insight that the influence functions in many diffusion models are coverage functions, and propose a novel parameterization of such functions using a convex combination of random basis functions. Moreover, we propose an efficient maximum likelihood based algorithm to learn such functions directly from cascade data, and hence bypass the need to specify a particular diffusion model in advance. We provide both theoretical and empirical analysis for our approach, showing that the proposed approach can provably learn the influence function with low sample complexity, be robust to the unknown diffusion models, and significantly outperform existing approaches in both synthetic and real world data. PMID:25973445
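    The coverage-function insight can be pictured with a "sampled worlds" toy: the influence of a seed set is a convex combination of 0/1 indicators of whether the set reaches each world. This sketch is mine; the paper learns the combination weights directly from cascade data by maximum likelihood.

```python
def coverage_value(seeds, weights, covered_by):
    """Influence as a coverage function: world k (weight weights[k]) is
    counted when the seed set intersects covered_by[k], the set of
    nodes from which that world is reachable."""
    return sum(w for w, c in zip(weights, covered_by) if seeds & c)
```

    Because this representation never commits to a particular diffusion model, it stays valid whatever process actually generated the cascades, which is the robustness property the abstract highlights.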

  8. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    PubMed

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
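    The MRL function defined above, m(t) = E[T - t | T > t] = (1/S(t)) * the integral of S(u) from t to infinity, is straightforward to compute numerically for any mixture survival function. A minimal sketch (the two-component gamma mixture below is illustrative, not from the paper):

```python
import numpy as np
from scipy import integrate, stats

def mean_residual_life(t, survival):
    """m(t) = E[T - t | T > t] = (1 / S(t)) * integral of S(u) du from t to infinity."""
    tail, _ = integrate.quad(survival, t, np.inf)
    return tail / survival(t)

def mixture_sf(u):
    """Survival function of an illustrative two-component gamma mixture
    (gamma kernels, as in the model above; weights and shapes are made up)."""
    return (0.6 * stats.gamma.sf(u, a=2.0, scale=1.0)
            + 0.4 * stats.gamma.sf(u, a=5.0, scale=0.5))
```

    A useful sanity check: for an exponential distribution, the MRL is constant and equal to the mean, e.g. mean_residual_life(t, stats.expon(scale=0.5).sf) returns 0.5 for any t.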

  9. Model-based metrics of human-automation function allocation in complex work environments

    NASA Astrophysics Data System (ADS)

    Kim, So Young

    Function allocation is the design decision that assigns work functions to all agents in a team, both human and automated. Efforts to guide function allocation systematically have been made in many fields such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary issues with function allocation. Four distinctive perspectives emerged from a review of these fields: technology-centered, human-centered, team-oriented, and work-oriented. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), team structure and processes, and work structure and the work environment. Together, these perspectives identify the following eight issues with function allocation: 1) workload, 2) incoherency in function allocations, 3) mismatches between responsibility and authority, 4) interruptive automation, 5) automation boundary conditions, 6) function allocation preventing human adaptation to context, 7) function allocation destabilizing the humans' work environment, and 8) mission performance. Addressing these issues systematically requires formal models and simulations that include all necessary aspects of human-automation function allocation: the work environment, the dynamics inherent to the work, the agents, and the relationships among them. Addressing these issues also requires not only a (static) model but also a (dynamic) simulation that captures temporal aspects of work, such as the timing of actions and their impact on an agent's work.
Therefore, with work properly modeled in terms of the work environment, the dynamics inherent to the work, the agents, and the relationships among them, the modeling framework developed in this thesis, which includes static work models and dynamic simulation, can capture these issues with function allocation. Based on the eight issues, eight types of metrics are then established. The purpose of these metrics is to assess the extent to which each issue exists in a given function allocation. Specifically, the eight types of metrics assess workload, coherency of a function allocation, mismatches between responsibility and authority, interruptive automation, automation boundary conditions, human adaptation to context, stability of the human's work environment, and mission performance. Finally, to validate the modeling framework and the metrics, a case study was conducted modeling four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight. A range of pilot cognitive control modes and maximum human taskload limits were also included in the model. The metrics were assessed for these four function allocations and analyzed to validate the capability of the metrics to identify important issues in given function allocations. In addition, the design insights provided by the metrics are highlighted. This thesis concludes with a discussion of mechanisms for further validating the modeling framework and function allocation metrics developed here, and highlights where these developments can be applied in research and in the design of function allocations in complex work environments such as aviation operations.

  10. Perspective Space as a Model for Distance and Size Perception.

    PubMed

    Erkelens, Casper J

    2017-01-01

    In the literature, perspective space has been introduced as a model of visual space. Perspective space is grounded on the perspective nature of visual space during both binocular and monocular vision. A single parameter, that is, the distance of the vanishing point, transforms the geometry of physical space into that of perspective space. The perspective-space model predicts perceived angles, distances, and sizes. The model is compared with other models for distance and size perception. Perspective space predicts that perceived distance and size as a function of physical distance are described by hyperbolic functions. Alternatively, power functions have been widely used to describe perceived distance and size. Comparison of power and hyperbolic functions shows that both functions are equivalent within the range of distances that have been judged in experiments. Two models describing perceived distance on the ground plane appear to be equivalent with the perspective-space model too. The conclusion is that perspective space unifies a number of models of distance and size perception.

  11. Perspective Space as a Model for Distance and Size Perception

    PubMed Central

    2017-01-01

    In the literature, perspective space has been introduced as a model of visual space. Perspective space is grounded on the perspective nature of visual space during both binocular and monocular vision. A single parameter, that is, the distance of the vanishing point, transforms the geometry of physical space into that of perspective space. The perspective-space model predicts perceived angles, distances, and sizes. The model is compared with other models for distance and size perception. Perspective space predicts that perceived distance and size as a function of physical distance are described by hyperbolic functions. Alternatively, power functions have been widely used to describe perceived distance and size. Comparison of power and hyperbolic functions shows that both functions are equivalent within the range of distances that have been judged in experiments. Two models describing perceived distance on the ground plane appear to be equivalent with the perspective-space model too. The conclusion is that perspective space unifies a number of models of distance and size perception. PMID:29225765

  12. Time prediction of failure of a type of lamps by using a general composite hazard rate model

    NASA Astrophysics Data System (ADS)

    Riaman; Lesmana, E.; Subartini, B.; Supian, S.

    2018-03-01

    This paper discusses basic survival model estimation to obtain the average predicted value of lamp failure time. The estimate is for a parametric model, the general composite hazard rate model. The random time variable model used as the basis is the exponential distribution, which has a constant hazard function. We discuss an example of survival model estimation for a composite hazard function, using an exponential model as its basis. The model is estimated by estimating its parameters through the construction of the survival function and the empirical cumulative function. The resulting model is then used to predict the average failure time for this type of lamp. The data are grouped into several intervals and the average failure value is obtained on each interval; the average failure time of the model is then calculated per interval, and the p-value obtained from the test result is 0.3296.
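    The exponential base model with constant hazard admits a closed-form estimate, which makes the "average failure time" prediction easy to sketch. The failure times below are hypothetical, not the paper's data:

```python
import numpy as np

def fit_exponential(times):
    """MLE for a constant-hazard (exponential) failure-time model:
    lam_hat = n / sum(t); the predicted mean failure time is 1 / lam_hat,
    and the survival function is S(t) = exp(-lam_hat * t)."""
    times = np.asarray(times, dtype=float)
    lam_hat = times.size / times.sum()
    return lam_hat, 1.0 / lam_hat

# Illustrative lamp failure times in hours (hypothetical data).
lam_hat, mean_failure_time = fit_exponential([120.0, 340.0, 95.0, 410.0, 230.0])
```

    For the exponential model the predicted mean failure time is simply the sample mean, here 239 hours; the composite hazard model in the paper builds on this constant-hazard basis.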

  13. Propulsive Reaction Control System Model

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Phan, Linh H.; Serricchio, Frederick; San Martin, Alejandro M.

    2011-01-01

    This software models a propulsive reaction control system (RCS) for guidance, navigation, and control simulation purposes. The model includes the drive electronics, the electromechanical valve dynamics, the combustion dynamics, and thrust. This innovation follows the Mars Science Laboratory entry reaction control system design, and has been created to meet the Mars Science Laboratory (MSL) entry, descent, and landing simulation needs. It has been built to be plug-and-play on multiple MSL testbeds [analysis, Monte Carlo, flight software development, hardware-in-the-loop, and ATLO (assembly, test and launch operations) testbeds]. This RCS model is a C language program. It contains two main functions: the RCS electronics model function that models the RCS FPGA (field-programmable-gate-array) processing and commanding of the RCS valve, and the RCS dynamic model function that models the valve and combustion dynamics. In addition, this software provides support functions to initialize the model states, set parameters, access model telemetry, and access calculated thruster forces.
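    The two-function structure described above (an electronics model turning FPGA commands into valve commands, and a dynamic model for the valve and thrust) can be sketched schematically. This is a Python analogue of the C design for illustration only; every constant is a hypothetical placeholder, not an MSL value:

```python
from dataclasses import dataclass

@dataclass
class RcsState:
    valve_open: float = 0.0          # valve opening fraction, 0 (closed) to 1 (open)

def rcs_electronics_model(command_on: bool) -> float:
    """Stand-in for the FPGA processing: maps an on/off thruster
    command to a commanded valve position."""
    return 1.0 if command_on else 0.0

def rcs_dynamics_model(state: RcsState, valve_cmd: float, dt: float,
                       tau: float = 0.02, max_thrust: float = 250.0) -> float:
    """First-order valve lag plus thrust proportional to valve opening.
    tau (valve time constant) and max_thrust are hypothetical numbers."""
    state.valve_open += (valve_cmd - state.valve_open) * dt / tau
    return max_thrust * state.valve_open

state = RcsState()
for _ in range(100):                  # 100 ms of an "on" command at 1 ms steps
    thrust = rcs_dynamics_model(state, rcs_electronics_model(True), dt=0.001)
```

    Keeping the electronics and dynamics in separate functions with an explicit state object mirrors the plug-and-play property the abstract describes: a testbed can call the pair at its own rate and read out thrust each step.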

  14. Computational models of basal-ganglia pathway functions: focus on functional neuroanatomy

    PubMed Central

    Schroll, Henning; Hamker, Fred H.

    2013-01-01

    Over the past 15 years, computational models have had a considerable impact on basal-ganglia research. Most of these models implement multiple distinct basal-ganglia pathways and assume them to fulfill different functions. As there is now a multitude of different models, it has become complex to keep track of their various, sometimes just marginally different assumptions on pathway functions. Moreover, it has become a challenge to oversee to what extent individual assumptions are corroborated or challenged by empirical data. Focusing on computational, but also considering non-computational models, we review influential concepts of pathway functions and show to what extent they are compatible with or contradict each other. Moreover, we outline how empirical evidence favors or challenges specific model assumptions and propose experiments that allow testing assumptions against each other. PMID:24416002

  15. Functional Behavioral Assessment: A School Based Model.

    ERIC Educational Resources Information Center

    Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.

    2002-01-01

    This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…

  16. RAPID ASSESSMENT OF URBAN WETLANDS: FUNCTIONAL ASSESSMENT MODEL DEVELOPMENT AND EVALUATION

    EPA Science Inventory

    The objective of this study was to test the ability of existing hydrogeomorphic (HGM) functional assessment models and our own proposed models to predict rates of nitrate production and removal, functions critical to water quality protection, in forested riparian wetlands in nort...

  17. Bayesian Inference for Functional Dynamics Exploring in fMRI Data.

    PubMed

    Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing

    2016-01-01

    This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more sophisticated Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.

  18. A robust and fast active contour model for image segmentation with intensity inhomogeneity

    NASA Astrophysics Data System (ADS)

    Ding, Keyan; Weng, Guirong

    2018-04-01

    In this paper, a robust and fast active contour model is proposed for image segmentation in the presence of intensity inhomogeneity. By introducing the local image intensity fitting functions before the evolution of the curve, the proposed model can effectively segment images with intensity inhomogeneity. The computational cost is low because the fitting functions do not need to be updated in each iteration. Experiments have shown that the proposed model has higher segmentation efficiency than some well-known active contour models based on local region fitting energy. In addition, the proposed model is robust to initialization, which allows the initial level set function to be a small constant function.

  19. Functional enzyme-based modeling approach for dynamic simulation of denitrification process in hyporheic zone sediments: Genetically structured microbial community model

    NASA Astrophysics Data System (ADS)

    Song, H. S.; Li, M.; Qian, W.; Song, X.; Chen, X.; Scheibe, T. D.; Fredrickson, J.; Zachara, J. M.; Liu, C.

    2016-12-01

    Modeling environmental microbial communities at the individual-organism level is currently intractable due to overwhelming structural complexity. Functional guild-based approaches alleviate this problem by lumping microorganisms into fewer groups based on their functional similarities. This reduction may become ineffective, however, when individual species perform multiple functions as environmental conditions vary. In contrast, the functional enzyme-based modeling approach we present here describes microbial community dynamics based on identified functional enzymes (rather than individual species or their groups). Previous studies in the literature along this line used biomass or functional genes as surrogate measures of enzymes due to the lack of analytical methods for quantifying enzymes in environmental samples. Leveraging our recent development of a signature peptide-based technique enabling sensitive quantification of functional enzymes in environmental samples, we developed a genetically structured microbial community model (GSMCM) to incorporate enzyme concentrations and various other omics measurements (if available) as key modeling input. We formulated the GSMCM based on the cybernetic metabolic modeling framework to rationally account for cellular regulation without relying on empirical inhibition kinetics. In the case study of modeling the denitrification process in Columbia River hyporheic zone sediments collected from the Hanford Reach, our GSMCM provided a quantitative fit to complex experimental data in denitrification, including the delayed response of enzyme activation to the change in substrate concentration. Our future goal is to extend the modeling scope to the prediction of carbon and nitrogen cycles and contaminant fate. Integration of a simpler version of the GSMCM with PFLOTRAN for multi-scale field simulations is in progress.

  20. flexsurv: A Platform for Parametric Survival Modeling in R

    PubMed Central

    Jackson, Christopher H.

    2018-01-01

    flexsurv is an R package for fully-parametric modeling of survival data. Any parametric time-to-event distribution may be fitted if the user supplies a probability density or hazard function, and ideally also their cumulative versions. Standard survival distributions are built in, including the three and four-parameter generalized gamma and F distributions. Any parameter of any distribution can be modeled as a linear or log-linear function of covariates. The package also includes the spline model of Royston and Parmar (2002), in which both baseline survival and covariate effects can be arbitrarily flexible parametric functions of time. The main model-fitting function, flexsurvreg, uses the familiar syntax of survreg from the standard survival package (Therneau 2016). Censoring or left-truncation are specified in ‘Surv’ objects. The models are fitted by maximizing the full log-likelihood, and estimates and confidence intervals for any function of the model parameters can be printed or plotted. flexsurv also provides functions for fitting and predicting from fully-parametric multi-state models, and connects with the mstate package (de Wreede, Fiocco, and Putter 2011). This article explains the methods and design principles of the package, giving several worked examples of its use. PMID:29593450
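    flexsurv itself is an R package, but its fitting strategy of maximizing the full log-likelihood, in which events contribute log f(t) and right-censored observations contribute log S(t), can be sketched outside R. The exponential case below (with made-up data) has the closed-form check lam = number of events / total follow-up time:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_loglik(lam, times, events):
    """Negative full log-likelihood with right censoring, exponential model:
    events contribute log f(t) = log(lam) - lam*t;
    censored observations contribute log S(t) = -lam*t."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)
    return -(events.sum() * np.log(lam) - lam * times.sum())

times = np.array([2.0, 5.0, 3.0, 8.0, 4.0, 7.0])     # follow-up times (hypothetical)
events = np.array([1, 1, 0, 1, 0, 1], dtype=bool)    # True = event, False = censored
fit = minimize_scalar(neg_loglik, bounds=(1e-6, 10.0),
                      args=(times, events), method="bounded")
```

    With 4 events over 29 units of follow-up, the maximizer recovers lam close to 4/29, matching the closed-form censored-exponential MLE.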

  1. Wind Tunnel Database Development using Modern Experiment Design and Multivariate Orthogonal Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; DeLoach, Richard

    2003-01-01

    A wind tunnel experiment for characterizing the aerodynamic and propulsion forces and moments acting on a research model airplane is described. The model airplane called the Free-flying Airplane for Sub-scale Experimental Research (FASER), is a modified off-the-shelf radio-controlled model airplane, with 7 ft wingspan, a tractor propeller driven by an electric motor, and aerobatic capability. FASER was tested in the NASA Langley 12-foot Low-Speed Wind Tunnel, using a combination of traditional sweeps and modern experiment design. Power level was included as an independent variable in the wind tunnel test, to allow characterization of power effects on aerodynamic forces and moments. A modeling technique that employs multivariate orthogonal functions was used to develop accurate analytic models for the aerodynamic and propulsion force and moment coefficient dependencies from the wind tunnel data. Efficient methods for generating orthogonal modeling functions, expanding the orthogonal modeling functions in terms of ordinary polynomial functions, and analytical orthogonal blocking were developed and discussed. The resulting models comprise a set of smooth, differentiable functions for the non-dimensional aerodynamic force and moment coefficients in terms of ordinary polynomials in the independent variables, suitable for nonlinear aircraft simulation.
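    The orthogonalize-then-expand step described above can be illustrated with a QR decomposition, one standard way to generate orthogonal modeling functions from ordinary polynomial regressors and then expand the fit back into polynomial coefficients. The data and true coefficients below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
X = np.column_stack([x**0, x, x**2, x**3])   # ordinary polynomial modeling functions
y = 1.0 + 2.0 * x - 0.5 * x**3 + 0.01 * rng.standard_normal(x.size)

# Columns of Q are mutually orthogonal modeling functions spanning the same space.
Q, R = np.linalg.qr(X)
a = Q.T @ y                   # each coordinate estimated independently (Q orthonormal)
beta = np.linalg.solve(R, a)  # expand back into ordinary polynomial coefficients
```

    Because the columns of Q are orthonormal, each coordinate of a is estimated independently of the others, which is what makes orthogonal-function model structure determination convenient; solving against R recovers the ordinary polynomial coefficients, here close to (1, 2, 0, -0.5).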

  2. Fitness landscapes, heuristics and technological paradigms: A critique on random search models in evolutionary economics

    NASA Astrophysics Data System (ADS)

    Frenken, Koen

    2001-06-01

    The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents randomly search for new technological designs by trial-and-error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements in a complex system. These models of random search are legitimate for reasons of modeling simplicity, but remain limited because they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK-model is developed. In this model, core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
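    The random hill-climbing search critiqued above is simple to reproduce on a standard NK landscape. A minimal sketch follows; N, K, the neighborhood structure, and the step count are arbitrary illustrative choices:

```python
import random

def make_nk_fitness(N, K, seed=0):
    """Random NK landscape: locus i's contribution depends on its own state
    and the states of its K circular neighbors (contributions drawn lazily)."""
    rng = random.Random(seed)
    tables = [{} for _ in range(N)]
    def fitness(genome):
        total = 0.0
        for i in range(N):
            key = tuple(genome[(i + j) % N] for j in range(K + 1))
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N
    return fitness

def hill_climb(fitness, N, steps=500, seed=1):
    """Random one-bit mutation; keep the mutant only if fitness does not drop."""
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(N)]
    best = fitness(genome)
    for _ in range(steps):
        j = rng.randrange(N)
        genome[j] ^= 1
        f = fitness(genome)
        if f >= best:
            best = f
        else:
            genome[j] ^= 1               # revert a deleterious mutation
    return best
```

    With larger K (stronger interdependence) the landscape grows more rugged and such one-bit search is more likely to stall at a local optimum, which is exactly the risk the abstract describes.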

  3. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment, so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that, if provided with the possibility of modifying their utility functions, agents will not choose to do so under some usual assumptions.

  4. The correlation function for density perturbations in an expanding universe. III The three-point and predictions of the four-point and higher order correlation functions

    NASA Technical Reports Server (NTRS)

    Mcclelland, J.; Silk, J.

    1978-01-01

    Higher-order correlation functions for the large-scale distribution of galaxies in space are investigated. It is demonstrated that the three-point correlation function observed by Peebles and Groth (1975) is not consistent with a distribution of perturbations that at present are randomly distributed in space. The two-point correlation function is shown to be independent of how the perturbations are distributed spatially, and a model of clustered perturbations is developed which incorporates a nonuniform perturbation distribution and which explains the three-point correlation function. A model with hierarchical perturbations incorporating the same nonuniform distribution is also constructed; it is found that this model also explains the three-point correlation function, but predicts different results for the four-point and higher-order correlation functions than does the model with clustered perturbations. It is suggested that the model of hierarchical perturbations might be explained by the single assumption of having density fluctuations or discrete objects all of the same mass randomly placed at some initial epoch.

  5. Exploiting the functional and taxonomic structure of genomic data by probabilistic topic modeling.

    PubMed

    Chen, Xin; Hu, Xiaohua; Lim, Tze Y; Shen, Xiajiong; Park, E K; Rosen, Gail L

    2012-01-01

    In this paper, we present a method that enables both homology-based and composition-based approaches to further study the functional core (i.e., the microbial core and the gene core, respectively). In the proposed method, the identification of major functionality groups is achieved by generative topic modeling, which is able to extract useful information from unlabeled data. We first show that a generative topic model can be used to model the taxon abundance information obtained by the homology-based approach and study the microbial core. The model considers each sample as a “document,” which has a mixture of functional groups, while each functional group (also known as a “latent topic”) is a weighted mixture of species. Therefore, estimating the generative topic model for taxon abundance data will uncover the distribution over latent functions (latent topics) in each sample. Second, we show that a generative topic model can also be used to study the genome-level composition of “N-mer” features (DNA subreads obtained by composition-based approaches). The model considers each genome as a mixture of latent genetic patterns (latent topics), while each functional pattern is a weighted mixture of the “N-mer” features; thus the existence of core genomes can be indicated by a set of common N-mer features. After studying the mutual information between latent topics and gene regions, we provide an explanation of the functional roles of the uncovered latent genetic patterns. The experimental results demonstrate the effectiveness of the proposed method.
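    The "sample as a document, functional group as a latent topic" setup can be sketched with an off-the-shelf LDA implementation. The toy abundance matrix below is fabricated for illustration, and scikit-learn's estimator is assumed as a stand-in for the paper's own model:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Rows are samples ("documents"); columns are species counts (fabricated numbers).
counts = np.array([
    [30,  2,  1, 25,  0],
    [28,  1,  0, 30,  2],
    [ 1, 40, 35,  0,  1],
    [ 0, 38, 30,  2,  0],
])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(counts)   # per-sample mixture over latent functional groups
```

    Each row of doc_topic is the uncovered distribution over latent functions for one sample; on this deliberately separated toy data the first two samples share one dominant topic and the last two share the other.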

  6. A suggestion for computing objective function in model calibration

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang

    2014-01-01

    A parameter-optimization process (model calibration) is usually required for numerical model applications, which involves the use of an objective function to determine the model cost (model-data errors). The sum of square errors (SSR) has been widely adopted as the objective function in various optimization procedures. However, the ‘square error’ calculation was found to be more sensitive to extreme or high values. Thus, we proposed that the sum of absolute errors (SAR) may be a better option than SSR for model calibration. To test this hypothesis, we used two case studies—a hydrological model calibration and a biogeochemical model calibration—to investigate the behavior of a group of potential objective functions: SSR, SAR, sum of squared relative deviation (SSRD), and sum of absolute relative deviation (SARD). Mathematical evaluation of model performance demonstrates that the ‘absolute error’ measures (SAR and SARD) are superior to the ‘square error’ measures (SSR and SSRD) as objective functions for model calibration, and SAR behaved the best (with the least error and highest efficiency). This study suggests that SSR may be overused in real applications, and that SAR is a reasonable choice in common optimization implementations that do not emphasize either high or low values (e.g., modeling to support resources management).
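    The contrast between ‘square error’ and ‘absolute error’ is easy to see on toy numbers: for the same total misfit, SSR penalizes an error concentrated on one extreme point far more than SAR does. The data here are made up purely for illustration:

```python
import numpy as np

def ssr(obs, sim):
    """Sum of squared errors."""
    return float(np.sum((np.asarray(obs) - np.asarray(sim)) ** 2))

def sar(obs, sim):
    """Sum of absolute errors."""
    return float(np.sum(np.abs(np.asarray(obs) - np.asarray(sim))))

obs = np.array([1.0, 2.0, 3.0, 4.0, 50.0])   # last observation is an extreme value
sim_spread = obs + 1.0                        # total error of 5, spread evenly
sim_peak = obs.copy()
sim_peak[-1] += 5.0                           # same total error, all on the extreme point
```

    SAR scores the two simulations identically (5 each), while SSR weights the concentrated error five times more heavily (25 versus 5), which is exactly the sensitivity to extreme values the authors flag.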

  7. Models and algorithm of optimization launch and deployment of virtual network functions in the virtual data center

    NASA Astrophysics Data System (ADS)

    Bolodurina, I. P.; Parfenov, D. I.

    2017-10-01

    The goal of our investigation is the optimization of network operation in a virtual data center. The advantage of modern infrastructure virtualization lies in the possibility of using software-defined networks. However, existing algorithmic optimization solutions do not take into account the specific features of working with multiple classes of virtual network functions. The current paper describes models characterizing the basic structures of the objects of a virtual data center, including: a level-distribution model of the software-defined infrastructure of a virtual data center, a generalized model of a virtual network function, and a neural network model for the identification of virtual network functions. We also developed an efficient algorithm for the containerization of virtual network functions in a virtual data center, and we propose an efficient algorithm for placing virtual network functions. In our investigation we also generalize the well-known heuristic and deterministic algorithms of Karmarkar-Karp.
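    The Karmarkar-Karp (largest differencing) heuristic referenced above can be sketched in a few lines for the two-way case, e.g. balancing VNF loads across two hosts; the load values are arbitrary:

```python
import heapq

def karmarkar_karp_difference(loads):
    """Largest differencing method for two-way partitioning: repeatedly
    replace the two largest values with their difference; the single
    remaining value is the achieved imbalance between the two parts."""
    heap = [-x for x in loads]           # max-heap via negation
    heapq.heapify(heap)
    while len(heap) > 1:
        a = -heapq.heappop(heap)
        b = -heapq.heappop(heap)
        heapq.heappush(heap, -(a - b))
    return -heap[0]
```

    The heuristic is fast but not exact: for loads [8, 7, 6, 5, 4] it reports an imbalance of 2, although the optimal split ({8, 7} versus {6, 5, 4}) is perfectly balanced, which is one reason generalizations of the basic scheme are of interest.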

  8. Chip level modeling of LSI devices

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1984-01-01

    The advent of Very Large Scale Integration (VLSI) technology has rendered the gate level model impractical for many simulation activities critical to the design automation process. As an alternative, an approach to the modeling of VLSI devices at the chip level is described, including the specification of modeling language constructs important to the modeling process. A model structure is presented in which models of the LSI devices are constructed as single entities. The modeling structure is two layered. The functional layer in this structure is used to model the input/output response of the LSI chip. A second layer, the fault mapping layer, is added, if fault simulations are required, in order to map the effects of hardware faults onto the functional layer. Modeling examples for each layer are presented. Fault modeling at the chip level is described. Approaches to realistic functional fault selection and defining fault coverage for functional faults are given. Application of the modeling techniques to single chip and bit slice microprocessors is discussed.

  9. Enabling complex queries to drug information sources through functional composition.

    PubMed

    Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier

    2013-01-01

    Our objective was to enable an end-user to create complex queries to drug information sources through functional composition, by creating sequences of functions from application program interfaces (API) to drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their input and output. A set of rules were developed to define the interoperable conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.

  10. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    PubMed

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Understanding the contribution of phytoplankton phase functions to uncertainties in the water colour signal.

    PubMed

    Lain, Lisl Robertson; Bernard, Stewart; Matthews, Mark W

    2017-02-20

    The accurate description of a water body's volume scattering function (VSF), and hence its phase functions, is critical to the determination of the constituent inherent optical properties (IOPs), the associated spectral water-leaving reflectance, and consequently the retrieval of phytoplankton functional type (PFT) information. The equivalent algal populations (EAP) model has previously been evaluated for phytoplankton-dominated waters, and offers the ability to provide phytoplankton population-specific phase functions, unveiling a new opportunity to further understanding of the causality of the PFT signal. This study presents and evaluates the wavelength-dependent, spectrally variable EAP particle phase functions and the subsequent effects on water-leaving reflectance. Comparisons are made with frequently used phase function approximations, e.g., the Fournier-Forand formulation, as well as with phase functions inferred from measured VSFs in coastal waters. Relative differences in shape and magnitude are quantified. Reflectance modelled with the EAP phase functions is then compared against measured reflectance data from phytoplankton-dominated waters. Further examples of modelled phytoplankton-dominated waters are discussed with reference to choice of phase function for two PFTs (eukaryote and prokaryote) across a range of biomass. Finally a demonstration of the sensitivity of reflectance to the choice of phase function is presented. The EAP model phase functions account for both spectral and angular variability in phytoplankton backscattering, i.e., they display variability which is both spectral and shape-related. It is concluded that phase functions modelled in this way are necessary for investigating the effects of assemblage variability on the ocean colour signal, and should be considered for model closure even in relatively low scattering conditions where phytoplankton dominate the IOPs.

  12. Early post-stroke cognition in stroke rehabilitation patients predicts functional outcome at 13 months.

    PubMed

    Wagle, Jørgen; Farner, Lasse; Flekkøy, Kjell; Bruun Wyller, Torgeir; Sandvik, Leiv; Fure, Brynjar; Stensrød, Brynhild; Engedal, Knut

    2011-01-01

    To identify prognostic factors associated with functional outcome at 13 months in a sample of stroke rehabilitation patients. Specifically, we hypothesized that cognitive functioning early after stroke would predict long-term functional outcome independently of other factors. 163 stroke rehabilitation patients underwent a structured neuropsychological examination 2-3 weeks after hospital admittance, and their functional status was subsequently evaluated 13 months later with the modified Rankin Scale (mRS) as outcome measure. Three predictive models were built using linear regression analyses: a biological model (sociodemographics, apolipoprotein E genotype, prestroke vascular factors, lesion characteristics and neurological stroke-related impairment); a functional model (pre- and early post-stroke cognitive functioning, personal and instrumental activities of daily living, ADL, and depressive symptoms), and a combined model (including significant variables, with p value <0.05, from the biological and functional models). A combined model of 4 variables best predicted long-term functional outcome with explained variance of 49%: neurological impairment (National Institute of Health Stroke Scale; β = 0.402, p < 0.001), age (β = 0.233, p = 0.001), post-stroke cognitive functioning (Repeatable Battery of Neuropsychological Status, RBANS; β = -0.248, p = 0.001) and prestroke personal ADL (Barthel Index; β = -0.217, p = 0.002). Further linear regression analyses of which RBANS indexes and subtests best predicted long-term functional outcome showed that Coding (β = -0.484, p < 0.001) and Figure Copy (β = -0.233, p = 0.002) raw scores at baseline explained 42% of the variance in mRS scores at follow-up. Early post-stroke cognitive functioning as measured by the RBANS is a significant and independent predictor of long-term functional post-stroke outcome. Copyright © 2011 S. Karger AG, Basel.

  13. A function space approach to smoothing with applications to model error estimation for flexible spacecraft control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1981-01-01

    A function space approach to smoothing is used to obtain a set of model error estimates inherent in a reduced-order model. By establishing knowledge of inevitable deficiencies in the truncated model, the error estimates provide a foundation for updating the model and thereby improving system performance. The function space smoothing solution leads to a specification of a method for computation of the model error estimates and development of model error analysis techniques for comparison between actual and estimated errors. The paper summarizes the model error estimation approach as well as an application arising in the area of modeling for spacecraft attitude control.

  14. High-throughput screening of chemicals as functional ...

    EPA Pesticide Factsheets

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer

  15. Implementation of the zooplankton functional response in plankton models: State of the art, recent challenges and future directions

    NASA Astrophysics Data System (ADS)

    Morozov, Andrew; Poggiale, Jean-Christophe; Cordoleani, Flora

    2012-09-01

    The conventional way of describing grazing in plankton models is based on a zooplankton functional response framework, according to which the consumption rate is computed as the product of a certain function of food (the functional response) and the density/biomass of herbivorous zooplankton. A large amount of literature on experimental feeding reports the existence of a zooplankton functional response in microcosms and small mesocosms, which goes a long way towards explaining the popularity of this framework both in mean-field (e.g. NPZD models) and spatially resolved models. On the other hand, the complex foraging behaviour of zooplankton (feeding cycles) as well as spatial heterogeneity of food and grazer distributions (plankton patchiness) across time and space scales raise questions as to the existence of a functional response of herbivores in vivo. In the current review, we discuss limitations of the ‘classical’ zooplankton functional response and consider possible ways to amend this framework to cope with the complexity of real planktonic ecosystems. Our general conclusion is that although the functional response of herbivores often does not exist in real ecosystems (especially in the form observed in the laboratory), this framework can be rather useful in modelling - but it does need some amendment which can be made based on various techniques of model reduction. We also show that the shape of the functional response depends on the spatial resolution (‘frame’) of the model. We argue that incorporating foraging behaviour and spatial heterogeneity in plankton models would not necessarily require the use of individual based modelling - an approach which is now becoming dominant in the literature. Finally, we list concrete future directions and challenges and emphasize the importance of a closer collaboration between plankton biologists and modellers in order to make further progress towards better descriptions of zooplankton grazing.
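A minimal sketch of the 'classical' grazing closure described above, assuming a Holling type II functional response (one common choice among several; parameter values are illustrative):

```python
def holling_type_II(P, Imax=1.0, K=0.5):
    """Holling type II (saturating) functional response: per-capita
    ingestion rate as a function of food density P, with maximum
    rate Imax and half-saturation constant K."""
    return Imax * P / (K + P)

def grazing_rate(P, Z, Imax=1.0, K=0.5):
    """The conventional closure in NPZD-type plankton models:
    total grazing = functional response of food times the
    density/biomass Z of herbivorous zooplankton."""
    return holling_type_II(P, Imax, K) * Z
```

At P = K the per-capita rate is half of Imax. The review's point is that this laboratory-derived form may not hold in heterogeneous, patchy field conditions, and that the effective shape of f(P) changes with the spatial resolution of the model.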

  16. ERS-1 and Seasat scatterometer measurements of ocean winds: Model functions and the directional distribution of short waves

    NASA Technical Reports Server (NTRS)

    Freilich, Michael H.; Dunbar, R. Scott

    1993-01-01

    Calculation of accurate vector winds from scatterometers requires knowledge of the relationship between backscatter cross-section and the geophysical variable of interest. As the detailed dynamics of wind generation of centimetric waves and radar-sea surface scattering at moderate incidence angles are not well known, empirical scatterometer model functions relating backscatter to winds must be developed. Less well appreciated is the fact that, given an accurate model function and some knowledge of the dominant scattering mechanisms, significant information on the amplitudes and directional distributions of centimetric roughness elements on the sea surface can be inferred. Accurate scatterometer model functions can thus be used to investigate wind generation of short waves under realistic conditions. The present investigation involves developing an empirical model function for the C-band (5.3 GHz) ERS-1 scatterometer and comparing Ku-band model functions with the C-band model to infer information on the two-dimensional spectrum of centimetric roughness elements in the ocean. The C-band model function development is based on collocations of global backscatter measurements with operational surface analyses produced by meteorological agencies. Strengths and limitations of the method are discussed, and the resulting model function is validated in part through comparison with the actual distributions of backscatter cross-section triplets. Details of the directional modulation as well as the wind speed sensitivity at C-band are investigated. Analysis of persistent outliers in the data is used to infer the magnitudes of non-wind effects (such as atmospheric stratification, swell, etc.). The ERS-1 C-band instrument and the Seasat Ku-band (14.6 GHz) scatterometer both imaged waves of approximately 3.4 cm wavelength, assuming that Bragg scattering is the dominant mechanism.
Comparisons of the C-band and Ku-band model functions are used both to test the validity of the postulated Bragg mechanism and to investigate the directional distribution of the imaged waves under a variety of conditions where Bragg scatter is dominant.
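The first-order Bragg condition behind the shared 3.4 cm figure can be checked with a short calculation; the formula is the standard Bragg resonance relation, and any specific incidence angles plugged in are illustrative rather than instrument specifications:

```python
import math

C = 2.998e8  # speed of light, m/s

def bragg_wavelength(freq_hz, incidence_deg):
    """First-order Bragg resonance: a radar of wavelength lambda_r
    viewing the surface at incidence angle theta resonates with
    surface waves of wavelength lambda_B = lambda_r / (2 sin theta)."""
    lam_r = C / freq_hz
    return lam_r / (2.0 * math.sin(math.radians(incidence_deg)))

# C-band (5.3 GHz) and Ku-band (14.6 GHz) sample the same ~3.4 cm
# surface waves only at different incidence angles, since lambda_r
# differs by nearly a factor of three between the two instruments.
```

Because the imaged surface wavelength is the same, differences between the two model functions probe the directional distribution of those waves rather than different parts of the wavenumber spectrum.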

  17. Fractional calculus in biomechanics: a 3D viscoelastic model using regularized fractional derivative kernels with application to the human calcaneal fat pad.

    PubMed

    Freed, A D; Diethelm, K

    2006-11-01

    A viscoelastic model of the K-BKZ (Kaye, Technical Report 134, College of Aeronautics, Cranfield 1962; Bernstein et al., Trans Soc Rheol 7: 391-410, 1963) type is developed for isotropic biological tissues and applied to the fat pad of the human heel. To facilitate this pursuit, a class of elastic solids is introduced through a novel strain-energy function whose elements possess strong ellipticity, and therefore lead to stable material models. This elastic potential - via the K-BKZ hypothesis - also produces the tensorial structure of the viscoelastic model. Candidate sets of functions are proposed for the elastic and viscoelastic material functions present in the model, including two functions whose origins lie in the fractional calculus. The Akaike information criterion is used to perform multi-model inference, enabling an objective selection to be made as to the best material function from within a candidate set.
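The Akaike-information-criterion step of the multi-model inference can be sketched as below, using the least-squares form of the AIC; the residual sums of squares in the usage example are made up for illustration:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares fit with
    residual sum of squares rss, n data points, and k parameters:
    AIC = n * ln(rss/n) + 2k (up to an additive constant)."""
    return n * math.log(rss / n) + 2 * k

def akaike_weights(aics):
    """Akaike weights: the relative likelihood of each candidate
    model, normalized to sum to one. The best model (lowest AIC)
    receives the largest weight."""
    amin = min(aics)
    rel = [math.exp(-0.5 * (a - amin)) for a in aics]
    s = sum(rel)
    return [r / s for r in rel]

# Illustrative comparison of three candidate material functions
weights = akaike_weights([aic(rss, n=50, k=k)
                          for rss, k in [(1.2, 3), (1.1, 4), (1.05, 6)]])
```

The weights penalize extra parameters, so a fractional-calculus material function is preferred only if its improvement in fit outweighs its added complexity.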

  18. A quark model analysis of the transversity distribution

    NASA Astrophysics Data System (ADS)

    Scopetta, Sergio; Vento, Vicente

    1998-04-01

    The feasibility of measuring chiral-odd parton distribution functions in polarized Drell-Yan and semi-inclusive experiments has renewed theoretical interest in their study. Models of hadron structure have proven successful in describing the gross features of the chiral-even structure functions. Similar expectations motivated our study of the transversity parton distributions in the Isgur-Karl and MIT bag models. We confirm, by performing a NLO calculation, the diverse low x behaviors of the transversity and spin structure functions at the experimental scale and show that it is fundamentally a consequence of the different behaviors under evolution of these functions. The inequalities of Soffer establish constraints between data and model calculations of the chiral-odd transversity function. The approximate compatibility of our model calculations with these constraints confers credibility to our estimates.

  19. Validation of a Node-Centered Wall Function Model for the Unstructured Flow Code FUN3D

    NASA Technical Reports Server (NTRS)

    Carlson, Jan-Renee; Vasta, Veer N.; White, Jeffery

    2015-01-01

    In this paper, the implementation of two wall function models in the Reynolds-averaged Navier-Stokes (RANS) computational fluid dynamics (CFD) code FUN3D is described. FUN3D is a node-centered method for solving the three-dimensional Navier-Stokes equations on unstructured computational grids. The first wall function model, based on the work of Knopp et al., is used in conjunction with the one-equation turbulence model of Spalart-Allmaras. The second wall function model, also based on the work of Knopp, is used in conjunction with the two-equation k-ω turbulence model of Menter. The wall function models compute the wall momentum and energy flux, which are used to weakly enforce the wall velocity and pressure flux boundary conditions in the mean flow momentum and energy equations. These wall conditions are implemented in an implicit form where the contribution of the wall function model to the Jacobian is also included. The boundary conditions of the turbulence transport equations are enforced explicitly (strongly) on all solid boundaries. The use of the wall function models is demonstrated on four test cases: a flat plate boundary layer, a subsonic diffuser, a 2D airfoil, and a 3D semi-span wing. Where possible, different near-wall viscous spacing tactics are examined. Iterative residual convergence was obtained in most cases. Solution results are compared with theoretical and experimental data for several variations of grid spacing. In general, very good comparisons with data were achieved.
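Wall function models of this kind replace the resolved near-wall velocity profile with an algebraic law of the wall. A generic sketch in wall units (the constants κ ≈ 0.41 and B ≈ 5.0 are the standard log-law values; this is an illustration of the idea, not FUN3D's actual Knopp-based formulation):

```python
import math

def viscous_sublayer_uplus(y_plus):
    """In the viscous sublayer (y+ small), u+ = y+."""
    return y_plus

def log_law_uplus(y_plus, kappa=0.41, B=5.0):
    """Log law of the wall: u+ = (1/kappa) * ln(y+) + B.
    A wall function evaluates a relation like this at the first
    off-wall grid point instead of resolving the sublayer, and the
    resulting wall flux enters the momentum equations."""
    return math.log(y_plus) / kappa + B
```

The practical payoff is that the first grid point can sit at y+ of order 30-100 rather than y+ ≈ 1, greatly relaxing near-wall grid spacing requirements.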

  20. An integrative model of evolutionary covariance: a symposium on body shape in fishes.

    PubMed

    Walker, Jeffrey A

    2010-12-01

    A major direction of current and future biological research is to understand how multiple, interacting functional systems coordinate in producing a body that works. This understanding is complicated by the fact that organisms need to work well in multiple environments, with both predictable and unpredictable environmental perturbations. Furthermore, organismal design reflects a history of past environments and not a plan for future environments. How complex, interacting functional systems evolve, then, is a truly grand challenge. In accepting the challenge, an integrative model of evolutionary covariance is developed. The model combines quantitative genetics, functional morphology/physiology, and functional ecology. The model is used to convene scientists ranging from geneticists, to physiologists, to ecologists, to engineers to facilitate the emergence of body shape in fishes as a model system for understanding how complex, interacting functional systems develop and evolve. Body shape of fish is a complex morphology that (1) results from many developmental paths and (2) functions in many different behaviors. Understanding the coordination and evolution of the many paths from genes to body shape, body shape to function, and function to a working fish body in a dynamic environment is now possible given new technologies from genetics to engineering and new theoretical models that integrate the different levels of biological organization (from genes to ecology).

  1. Observing and modeling dynamics in terrestrial gross primary productivity and phenology from remote sensing: An assessment using in-situ measurements

    NASA Astrophysics Data System (ADS)

    Verma, Manish K.

    Terrestrial gross primary productivity (GPP) is the largest and most variable component of the carbon cycle and is strongly influenced by phenology. Realistic characterization of spatio-temporal variation in GPP and phenology is therefore crucial for understanding dynamics in the global carbon cycle. In the last two decades, remote sensing has become a widely-used tool for this purpose. However, no study has comprehensively examined how well remote sensing models capture spatiotemporal patterns in GPP, and validation of remote sensing-based phenology models is limited. Using in-situ data from 144 eddy covariance towers located in all major biomes, I assessed the ability of 10 remote sensing-based methods to capture spatio-temporal variation in GPP at annual and seasonal scales. The models are based on different hypotheses regarding ecophysiological controls on GPP and span a range of structural and computational complexity. The results lead to four main conclusions: (i) at the annual time scale, models were more successful at capturing spatial variability than temporal variability; (ii) at the seasonal scale, models were more successful at capturing average seasonal variability than interannual variability; (iii) simpler models performed as well as or better than complex models; and (iv) the models that were best at explaining seasonal variability in GPP were different from those that were best able to explain variability in annual-scale GPP. Seasonal phenology of vegetation follows bounded growth and decay, and is widely modeled using growth functions. However, the specific form of the growth function affects how phenological dynamics are represented in ecosystem and remote sensing-based models. To examine this, four different growth functions (the logistic, Gompertz, Mirror-Gompertz and Richards function) were assessed using remotely sensed and in-situ data collected at several deciduous forest sites. 
All of the growth functions provided good statistical representation of in-situ and remote sensing time series. However, the Richards function captured observed asymmetric dynamics that were not captured by the other functions. The timing of key phenophase transitions derived using the Richards function therefore agreed best with observations. This suggests that ecosystem models and remote-sensing algorithms would benefit from using the Richards function to represent phenological dynamics.
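Three of the four growth functions compared above can be sketched as follows; with ν = 1 the Richards function reduces to the logistic, and other ν values shift the inflection point to produce the asymmetry the study found important (parameter values are illustrative, and the Mirror-Gompertz variant is omitted):

```python
import math

def logistic(t, A=1.0, k=1.0, t0=0.0):
    """Symmetric sigmoid: inflection at exactly A/2."""
    return A / (1.0 + math.exp(-k * (t - t0)))

def gompertz(t, A=1.0, k=1.0, t0=0.0):
    """Asymmetric sigmoid: inflection fixed at A/e."""
    return A * math.exp(-math.exp(-k * (t - t0)))

def richards(t, A=1.0, k=1.0, t0=0.0, nu=0.5):
    """Generalized logistic (Richards): the shape parameter nu moves
    the inflection point (value A / (1 + nu)^(1/nu)), so green-up
    and senescence need not mirror each other."""
    return A / (1.0 + nu * math.exp(-k * (t - t0))) ** (1.0 / nu)
```

Because the logistic and Gompertz forms pin the inflection to a fixed fraction of the asymptote, only the Richards function can track the asymmetric transitions seen in the deciduous-forest time series.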

  2. A bayesian hierarchical model for classification with selection of functional predictors.

    PubMed

    Zhu, Hongxiao; Vannucci, Marina; Cox, Dennis D

    2010-06-01

    In functional data classification, functional observations are often contaminated by various systematic effects, such as random batch effects caused by device artifacts, or fixed effects caused by sample-related factors. These effects may lead to classification bias and thus should not be neglected. Another issue of concern is the selection of functions when predictors consist of multiple functions, some of which may be redundant. The above issues arise in a real data application where we use fluorescence spectroscopy to detect cervical precancer. In this article, we propose a Bayesian hierarchical model that takes into account random batch effects and selects effective functions among multiple functional predictors. Fixed effects or predictors in nonfunctional form are also included in the model. The dimension of the functional data is reduced through orthonormal basis expansion or functional principal components. For posterior sampling, we use a hybrid Metropolis-Hastings/Gibbs sampler, which suffers from slow mixing. An evolutionary Monte Carlo algorithm is applied to improve the mixing. Simulation and real data application show that the proposed model provides accurate selection of functional predictors as well as good classification.

  3. Accurate lithography simulation model based on convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to calculate an entire chip in realistic time, a compact resist model, formulated for fast calculation, is commonly used. To obtain an accurate compact resist model, it is necessary to fit a complicated non-linear model function. However, it is difficult to choose an appropriate function manually because there are many options. This paper proposes a new compact resist model using CNNs (convolutional neural networks), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show that the CNN model can reduce CD prediction errors by 70% compared with the conventional model.

  4. Improving plant functional groups for dynamic models of biodiversity: at the crossroads between functional and community ecology

    PubMed Central

    Isabelle, Boulangeat; Pauline, Philippe; Sylvain, Abdulhak; Roland, Douzet; Luc, Garraud; Sébastien, Lavergne; Sandra, Lavorel; Jérémie, Van Es; Pascal, Vittoz; Wilfried, Thuiller

    2013-01-01

    The pace of on-going climate change calls for reliable plant biodiversity scenarios. Traditional dynamic vegetation models use plant functional types that are summarized to such an extent that they become meaningless for biodiversity scenarios. Hybrid dynamic vegetation models of intermediate complexity (hybrid-DVMs) have recently been developed to address this issue. These models, at the crossroads between phenomenological and process-based models, are able to involve an intermediate number of well-chosen plant functional groups (PFGs). The challenge is to build meaningful PFGs that are representative of plant biodiversity, and consistent with the parameters and processes of hybrid-DVMs. Here, we propose and test a framework based on few selected traits to define a limited number of PFGs, which are both representative of the diversity (functional and taxonomic) of the flora in the Ecrins National Park, and adapted to hybrid-DVMs. This new classification scheme, together with recent advances in vegetation modeling, constitutes a step forward for mechanistic biodiversity modeling. PMID:24403847

  5. Comparison of Regression Analysis and Transfer Function in Estimating the Parameters of Central Pulse Waves from Brachial Pulse Wave.

    PubMed

    Chai, Rui; Xu, Li-Sheng; Yao, Yang; Hao, Li-Ling; Qi, Lin

    2017-01-01

    This study analyzed the ascending branch slope (A_slope), dicrotic notch height (Hn), diastolic area (Ad), systolic area (As), diastolic blood pressure (DBP), systolic blood pressure (SBP), pulse pressure (PP), subendocardial viability ratio (SEVR), waveform parameter (k), stroke volume (SV), cardiac output (CO), and peripheral resistance (RS) of central pulse waves measured invasively and non-invasively. Invasively measured parameters were compared with parameters estimated from brachial pulse waves by a regression model and by a transfer function model, and the accuracies of the two estimation approaches were compared as well. Findings showed that the k value and the invasively measured central and brachial pulse wave parameters correlated positively. Regression model parameters including A_slope, DBP, and SEVR, as well as transfer function model parameters, showed good consistency with the invasively measured parameters, with a comparable degree of consistency between the two models. SBP, PP, SV, and CO could be calculated through the regression model, but their accuracies were worse than those of the transfer function model.

  6. A discriminant function model as an alternative method to spirometry for COPD screening in primary care settings in China.

    PubMed

    Cui, Jiangyu; Zhou, Yumin; Tian, Jia; Wang, Xinwang; Zheng, Jingping; Zhong, Nanshan; Ran, Pixin

    2012-12-01

    COPD is often underdiagnosed in primary care settings where spirometry is unavailable. This study aimed to develop a simple, economical and applicable model for COPD screening in those settings. First we established a discriminant function model based on Bayes' rule by stepwise discriminant analysis, using data from 243 COPD patients and 112 non-COPD subjects from our COPD survey in urban and rural communities and local primary care settings in Guangdong Province, China. We then used this model to discriminate COPD in an additional 150 subjects (50 non-COPD and 100 COPD) who had been recruited by the same methods used to establish the model. All participants completed pre- and post-bronchodilator spirometry and questionnaires. COPD was diagnosed according to the Global Initiative for Chronic Obstructive Lung Disease criteria. The sensitivity and specificity of the discriminant function model were assessed. The established discriminant function model included nine variables: age, gender, smoking index, body mass index, occupational exposure, living environment, wheezing, cough and dyspnoea. The sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, accuracy and error rate of the function model for discriminating COPD were 89.00%, 82.00%, 4.94, 0.13, 86.66% and 13.34%, respectively. The accuracy and kappa value of the function model for predicting COPD stage were 70% and 0.61 (95% CI, 0.50 to 0.71). This discriminant function model may be used for COPD screening in primary care settings in China as an alternative option to spirometry.
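The reported screening statistics are internally consistent and can be reproduced from a 2×2 confusion matrix. The counts below are inferred from the stated validation sample (100 COPD, 50 non-COPD) and the reported sensitivity and specificity, not taken directly from the paper:

```python
def screening_metrics(tp, fn, tn, fp):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    lr_pos = sens / (1.0 - spec)          # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec          # negative likelihood ratio
    acc = (tp + tn) / (tp + fn + tn + fp) # overall accuracy
    return sens, spec, lr_pos, lr_neg, acc

# Inferred counts: 89 true positives, 11 false negatives (sens 89%),
# 41 true negatives, 9 false positives (spec 82%)
sens, spec, lr_pos, lr_neg, acc = screening_metrics(89, 11, 41, 9)
```

These counts reproduce LR+ = 4.94, LR- = 0.13 and accuracy ≈ 86.7%, matching the abstract's figures to rounding.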

  7. Effect of Microgravity on Several Visual Functions During STS Shuttle Missions: Visual Function Tester-model 1 (VFT-1)

    NASA Technical Reports Server (NTRS)

    Oneal, Melvin R.; Task, H. Lee; Genco, Louis V.

    1992-01-01

    Viewgraphs on the effect of microgravity on several visual functions during STS shuttle missions are presented, covering the purpose, methods, results, and discussion. The Visual Function Tester-Model 1 (VFT-1) was used.

  8. The extended Lennard-Jones potential energy function: A simpler model for direct-potential-fit analysis

    NASA Astrophysics Data System (ADS)

    Hajigeorgiou, Photos G.

    2016-12-01

    An analytical model for the diatomic potential energy function that was recently tested as a universal function (Hajigeorgiou, 2010) has been further modified and tested as a suitable model for direct-potential-fit analysis. Applications are presented for the ground electronic states of three diatomic molecules: oxygen, carbon monoxide, and hydrogen fluoride. The adjustable parameters of the extended Lennard-Jones potential model are determined through nonlinear regression by fits to calculated rovibrational energy term values or experimental spectroscopic line positions. The model is shown to lead to reliable, compact and simple representations for the potential energy functions of these systems and could therefore be classified as a suitable and attractive model for direct-potential-fit analysis.
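For reference, the classic (12,6) Lennard-Jones form that the extended model generalizes can be written so that its minimum is -De at r = re. This sketch is the textbook two-parameter potential, not the extended model itself, in which the exponent becomes an r-dependent function fitted to spectroscopic data:

```python
def lennard_jones(r, De=1.0, re=1.0):
    """Classic (12,6) Lennard-Jones potential, parameterized by well
    depth De and equilibrium separation re so that V(re) = -De and
    V(r) -> 0 as r -> infinity. The extended Lennard-Jones model
    replaces the fixed exponents with a flexible n(r) determined by
    nonlinear regression against rovibrational term values."""
    x = re / r
    return De * (x ** 12 - 2.0 * x ** 6)
```

The appeal of the extended form for direct-potential-fit analysis is that it keeps this compact two-branch shape while gaining enough flexibility to reproduce experimental line positions.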

  9. Structural equation modeling of motor impairment, gross motor function, and the functional outcome in children with cerebral palsy.

    PubMed

    Park, Eun-Young; Kim, Won-Ho

    2013-05-01

    Physical therapy intervention for children with cerebral palsy (CP) is focused on reducing neurological impairments, improving strength, and preventing the development of secondary impairments in order to improve functional outcomes. However, the relationship between motor impairments and functional outcome has not been definitively established. This study confirmed the construct of motor impairment and performed structural equation modeling (SEM) among motor impairment, gross motor function, and functional outcomes regarding activities of daily living in children with CP. Ninety-eight children (59 boys, 39 girls) with CP participated in this cross-sectional study. Mean age was 11 y 5 mo (SD 1 y 9 mo). The Manual Muscle Test (MMT), the Modified Ashworth Scale (MAS), range of motion (ROM) measurement, and the selective motor control (SMC) scale were used to assess motor impairments. Gross motor function and functional outcomes were measured using the Gross Motor Function Measure (GMFM) and the Functional Skills domain of the Pediatric Evaluation of Disability Inventory (PEDI), respectively. Measurement of motor impairment consisted of strength, spasticity, ROM, and SMC. The construct of motor impairment was confirmed through examination of a measurement model. The proposed SEM model showed good fit indices. Motor impairment affected gross motor function (β=-0.869). Gross motor function and motor impairment affected functional outcomes directly (β=0.890) and indirectly (β=-0.773), respectively. We confirmed that the construct of motor impairment consists of strength, spasticity, ROM, and SMC, as identified through measurement model analysis. Functional outcomes are best predicted by gross motor function, and motor impairments have indirect effects on functional outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.

  11. A probabilistic framework to infer brain functional connectivity from anatomical connections.

    PubMed

    Deligianni, Fani; Varoquaux, Gael; Thirion, Bertrand; Robinson, Emma; Sharp, David J; Edwards, A David; Rueckert, Daniel

    2011-01-01

    We present a novel probabilistic framework to learn across several subjects a mapping from brain anatomical connectivity to functional connectivity, i.e. the covariance structure of brain activity. This prediction problem must be formulated as a structured-output learning task, as the predicted parameters are strongly correlated. We introduce a model selection framework based on cross-validation with a parametrization-independent loss function suitable to the manifold of covariance matrices. Our model is based on constraining the conditional independence structure of functional activity by the anatomical connectivity. Subsequently, we learn a linear predictor of a stationary multivariate autoregressive model. This natural parameterization of functional connectivity also enforces the positive-definiteness of the predicted covariance and thus matches the structure of the output space. Our results show that functional connectivity can be explained by anatomical connectivity on a rigorous statistical basis, and that a proper model of functional connectivity is essential to assess this link.

  12. Developing models that analyze the economic/environmental trade-offs implicit in water resource management

    NASA Astrophysics Data System (ADS)

    Howitt, R. E.

    2016-12-01

    Hydro-economic models have been used to analyze optimal supply management and groundwater use for the past 25 years. They are characterized by an objective function that usually maximizes economic measures such as consumer and producer surplus subject to hydrologic equations of motion or water distribution systems. The hydrologic and economic components are sometimes fully integrated; alternatively, they may be linked through an iterative interactive process. Environmental considerations have been included in hydro-economic models as inequality constraints. Representing environmental requirements as constraints is a rigid approximation of the range of management alternatives that could be used to implement environmental objectives. The next generation of hydro-economic models, currently being developed, requires that environmental alternatives be represented by continuous or semi-continuous functions relating the water resource use allocated to the environment to the probabilities of achieving environmental objectives. These functions will be generated by process models of environmental and biological systems, which are now sufficiently advanced to represent environmental systems realistically and to interact flexibly with economic models. Examples are crop growth models, climate models, and biological models of forest, fish, and fauna systems. These process models can represent environmental outcomes in a form similar to economic production functions. When combined with economic models, the interacting process models can reproduce a range of trade-offs between economic and environmental objectives, and thus optimize the social value of many water and environmental resources. Some examples of this next generation of hydro-enviro-economic models are reviewed. In these models, implicit production functions for environmental goods are combined with hydrologic equations of motion and economic response functions. We discuss models that show interaction between environmental goods and agricultural production, and others that address alternative climate change policies or habitat provision.

  13. Modeling and Circumventing the Effect of Sediments and Water Column on Receiver Functions

    NASA Astrophysics Data System (ADS)

    Audet, P.

    2017-12-01

    Teleseismic P-wave receiver functions are routinely used to resolve crust and mantle structure in various geologic settings. Receiver functions are approximations to the Earth's Green's functions and are composed of various scattered phase arrivals, depending on the complexity of the underlying Earth structure. For simple structure, the dominant arrivals (converted and back-scattered P-to-S phases) are well separated in time and can be reliably used in estimating crustal velocity structure. In the presence of sedimentary layers, strong reverberations typically produce high-amplitude oscillations that contaminate the early part of the wave train, and receiver functions can be difficult to interpret in terms of underlying structure. A water column also limits the interpretability of under-water receiver functions, because the additional acoustic wave propagating within the water column can contaminate structural arrivals. We perform numerical modeling of teleseismic Green's functions and receiver functions using a reflectivity technique for a range of Earth models that include thin sedimentary layers and an overlying water column. These modeling results indicate that, as expected, receiver functions are difficult to interpret in the presence of sediments, but the contaminating effect of the water column depends on the thickness of the water layer. To circumvent these effects and recover source-side structure, we propose an approach based on transfer function modeling that bypasses receiver functions altogether and estimates crustal properties directly from the waveforms (Frederiksen and Delayney, 2015). Using this approach, reasonable assumptions about the properties of the sedimentary layer can be included in forward calculations of the Green's functions, which are convolved with radial waveforms to predict vertical waveforms. Exploration of model space using a Monte Carlo-style search and least-squares waveform misfits can be performed to estimate any model parameter of interest, including those of the sedimentary or water layer. We show how this method can be applied to OBS data using broadband stations from the Cascadia Initiative to recover oceanic plate structure.
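    As a toy illustration of the transfer-function idea above, the sketch below convolves a radial waveform with a candidate Green's function to predict the vertical waveform, then grid-searches the candidate delay by least-squares misfit. The delayed-spike Green's function and all numbers are illustrative assumptions, not the reflectivity computations of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def greens_function(n, delay, amp):
    """Toy 1-D transfer function: a delayed, scaled spike (a stand-in for
    the reflectivity-computed Green's function of a layered model)."""
    g = np.zeros(n)
    g[delay] = amp
    return g

# Synthetic "radial" waveform and the "vertical" waveform it implies
# under a true (delay, amplitude) pair.
radial = rng.standard_normal(200)
true_delay, true_amp = 7, 0.6
vertical = np.convolve(radial, greens_function(20, true_delay, true_amp))[:200]

# Grid search over candidate delays (a stand-in for the Monte Carlo-style
# model-space search), scoring each by least-squares waveform misfit.
def misfit(delay):
    pred = np.convolve(radial, greens_function(20, delay, true_amp))[:200]
    return np.sum((pred - vertical) ** 2)

best = min(range(20), key=misfit)
print(best)  # recovers the true delay
```

The same search loop generalizes to any forward-model parameter; only the Green's function calculation changes.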

  14. Developing the snow component of a distributed hydrological model: a step-wise approach based on multi-objective analysis

    NASA Astrophysics Data System (ADS)

    Dunn, S. M.; Colohan, R. J. E.

    1999-09-01

    A snow component has been developed for the distributed hydrological model, DIY, using an approach that sequentially evaluates the behaviour of different functions as they are implemented in the model. The evaluation is performed using multi-objective functions to ensure that the internal structure of the model is correct. The development of the model, using a sub-catchment in the Cairngorm Mountains in Scotland, demonstrated that the degree-day model can be enhanced for hydroclimatic conditions typical of those found in Scotland, without increasing meteorological data requirements. An important element of the snow model is a function to account for wind re-distribution. This causes large accumulations of snow in small pockets, which are shown to be important in sustaining baseflows in the rivers during the late spring and early summer, long after the snowpack has melted from the bulk of the catchment. The importance of the wind function would not have been identified using a single objective function of total streamflow to evaluate the model behaviour.
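    A minimal sketch of the two ingredients discussed above: a degree-day melt function and a wind re-distribution step that piles snow into sheltered pockets. The degree-day factor, the redistribution fraction, and the even/odd exposure pattern are illustrative assumptions, not parameters of the DIY model.

```python
def degree_day_melt(temp_c, ddf=3.0, t_base=0.0):
    """Potential melt (mm/day) from a degree-day model."""
    return ddf * max(temp_c - t_base, 0.0)

def redistribute(pack, frac=0.2):
    """Move a fraction of snow from exposed cells (even indices) into
    sheltered pockets (odd indices); total mass is conserved."""
    moved = [frac * s if i % 2 == 0 else 0.0 for i, s in enumerate(pack)]
    return [s - m + (moved[i - 1] if i % 2 == 1 else 0.0)
            for i, (s, m) in enumerate(zip(pack, moved))]

pack = [10.0, 10.0, 10.0, 10.0]            # snow water equivalent (mm) per cell
pack = redistribute(pack)                  # wind piles snow into pockets
pack = [max(s - degree_day_melt(2.0), 0.0) for s in pack]  # 6 mm melt at +2 °C
print(pack)                                # pockets retain snow after melt
```

The sheltered pockets end the day with more snow than exposed cells, which is the mechanism sustaining late-season baseflow in the abstract.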

  15. Supervised nonlinear spectral unmixing using a postnonlinear mixing model for hyperspectral imagery.

    PubMed

    Altmann, Yoann; Halimi, Abderrahim; Dobigeon, Nicolas; Tourneret, Jean-Yves

    2012-06-01

    This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polynomial functions leading to a polynomial postnonlinear mixing model. A Bayesian algorithm and optimization methods are proposed to estimate the parameters involved in the model. The performance of the unmixing strategies is evaluated by simulations conducted on synthetic and real data.
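    The forward model described above, a linear mixture passed through a polynomial nonlinearity plus Gaussian noise, can be sketched as follows. The endmember spectra, abundances, and nonlinearity coefficient are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

L = 50                                              # number of spectral bands
endmembers = np.stack([np.linspace(0.2, 0.8, L),    # two invented endmember spectra
                       np.linspace(0.9, 0.1, L)])
abundances = np.array([0.3, 0.7])                   # non-negative, sum to one

linear = abundances @ endmembers                    # standard linear mixture
b = 0.5                                             # assumed nonlinearity coefficient
pixel = linear + b * linear ** 2                    # second-order polynomial term
pixel = pixel + 0.001 * rng.standard_normal(L)      # additive white Gaussian noise
```

Unmixing then amounts to inverting this map for the abundances and b, which the paper does with Bayesian and optimization methods.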

  16. Effective Biot theory and its generalization to poroviscoelastic models

    NASA Astrophysics Data System (ADS)

    Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Greenhalgh, Mark

    2018-02-01

    A method is suggested to express the effective bulk modulus of the solid frame of a poroelastic material as a function of the saturated bulk modulus. This method enables effective Biot theory to be described through the use of seismic dispersion measurements or other models developed for the effective saturated bulk modulus. The effective Biot theory is generalized to a poroviscoelastic model of which the moduli are represented by the relaxation functions of the generalized fractional Zener model. The latter covers the general Zener and the Cole-Cole models as special cases. A global search method is described to determine the parameters of the relaxation functions, and a simple deterministic method is also developed to find the defining parameters of the single Cole-Cole model. These methods enable poroviscoelastic models to be constructed, which are based on measured seismic attenuation functions, and ensure that the model dispersion characteristics match the observations.
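    As a sketch of the kind of relaxation function involved, the following implements a single Cole-Cole complex modulus; the functional form and the parameter values are standard textbook assumptions, not those fitted in the paper.

```python
import numpy as np

def cole_cole_modulus(omega, m0, m_inf, tau, alpha):
    """Complex modulus of a single Cole-Cole element:
    M(w) = M0 + (Minf - M0) * (i w tau)^alpha / (1 + (i w tau)^alpha),
    with relaxed/unrelaxed moduli m0/m_inf and alpha in (0, 1]."""
    s = (1j * omega * tau) ** alpha
    return m0 + (m_inf - m0) * s / (1.0 + s)

omega = np.logspace(-2, 4, 200)                    # angular frequency (rad/s)
m = cole_cole_modulus(omega, m0=10e9, m_inf=12e9, tau=0.01, alpha=0.8)
q = m.real / m.imag                                # quality factor Q(w)
```

The attenuation (1/Q) peaks near w = 1/tau, and summing several such elements gives the generalized fractional Zener behavior discussed in the abstract.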

  17. A Hermite-based lattice Boltzmann model with artificial viscosity for compressible viscous flows

    NASA Astrophysics Data System (ADS)

    Qiu, Ruofan; Chen, Rongqian; Zhu, Chenxiang; You, Yancheng

    2018-05-01

    A lattice Boltzmann model on Hermite basis for compressible viscous flows is presented in this paper. The model is developed in the framework of double-distribution-function approach, which has adjustable specific-heat ratio and Prandtl number. It contains a density distribution function for the flow field and a total energy distribution function for the temperature field. The equilibrium distribution function is determined by Hermite expansion, and the D3Q27 and D3Q39 three-dimensional (3D) discrete velocity models are used, in which the discrete velocity model can be replaced easily. Moreover, an artificial viscosity is introduced to enhance the model for capturing shock waves. The model is tested through several cases of compressible flows, including 3D supersonic viscous flows with boundary layer. The effect of artificial viscosity is estimated. Besides, D3Q27 and D3Q39 models are further compared in the present platform.

  18. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Justin; Hund, Lauren

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down, requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration, including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output, and propose sensitivity analyses using the notion of 'modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.
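    One way to read the effective-sample-size idea is sketched below: the independent-Gaussian log-likelihood of a residual trace is scaled by n_eff/n, with n_eff taken from the lag-1 autocorrelation via the AR(1) formula. Both the ESS formula and the Gaussian error model are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def ess_scaled_loglik(resid, sigma):
    """Independent-Gaussian log-likelihood of a residual trace, scaled by
    an effective sample size from the lag-1 autocorrelation
    (AR(1)-style: n_eff = n * (1 - rho) / (1 + rho))."""
    n = resid.size
    r = resid - resid.mean()
    rho = float(np.dot(r[:-1], r[1:]) / np.dot(r, r))
    rho = min(max(rho, 0.0), 0.99)          # keep the scale factor in (0, 1]
    n_eff = n * (1.0 - rho) / (1.0 + rho)
    loglik = -0.5 * np.sum((resid / sigma) ** 2) - n * np.log(sigma)
    return (n_eff / n) * loglik

rng = np.random.default_rng(0)
white = rng.standard_normal(400)                            # nearly independent residual
smooth = np.convolve(white, np.ones(25) / 25, mode="same")  # strongly correlated residual
# The correlated trace receives a much smaller effective sample size, which
# shrinks the magnitude of its log-likelihood (widening posteriors downstream).
```

This keeps functional residuals from being over-counted as independent observations without fitting an explicit autocorrelation model.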

  19. Long-term effects of psychosocial work stress in midlife on health functioning after labor market exit--results from the GAZEL study.

    PubMed

    Wahrendorf, Morten; Sembajwe, Grace; Zins, Marie; Berkman, Lisa; Goldberg, Marcel; Siegrist, Johannes

    2012-07-01

    To study long-term effects of psychosocial work stress in mid-life on health functioning after labor market exit using two established work stress models. In the frame of the prospective French Gazel cohort study, data on psychosocial work stress were assessed using the full questionnaires measuring the demand-control-support model (in 1997 and 1999) and the effort-reward imbalance model (in 1998). In 2007, health functioning was assessed, using the Short Form 36 mental and physical component scores. Multivariate regressions were calculated to predict health functioning in 2007, controlling for age, gender, social position, and baseline self-perceived health. Consistent effects of both work stress models and their single components on mental and physical health functioning during retirement were observed. Effects remained significant after adjustment including baseline self-perceived health. Whereas the predictive power of both work stress models was similar in the case of the physical composite score, in the case of the mental health score, values of model fit were slightly higher for the effort-reward imbalance model (R(2): 0.13) compared with the demand-control model (R²: 0.11). Findings underline the importance of working conditions in midlife not only for health in midlife but also for health functioning after labor market exit.

  20. Psychometric Properties on Lecturers' Beliefs on Teaching Function: Rasch Model Analysis

    ERIC Educational Resources Information Center

    Mofreh, Samah Ali Mohsen; Ghafar, Mohammed Najib Abdul; Omar, Abdul Hafiz Hj; Mosaku, Monsurat; Ma'ruf, Amar

    2014-01-01

    This paper focuses on the psychometric analysis of lecturers' beliefs on teaching function (LBTF) survey using Rasch Model analysis. The sample comprised 34 Community Colleges' lecturers. The Rasch Model is applied to produce specific measurements on the lecturers' beliefs on teaching function in order to generalize results and inferential…

  1. Matrix models and stochastic growth in Donaldson-Thomas theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szabo, Richard J.; Tierz, Miguel; Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  2. Modeling personnel turnover in the parametric organization

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1991-01-01

    A model is developed for simulating the dynamics of a newly formed organization, credible during all phases of organizational development. The model development process is broken down into the activities of determining the tasks required for parametric cost analysis (PCA), determining the skills required for each PCA task, determining the skills available in the applicant marketplace, determining the structure of the model, implementing the model, and testing it. The model, parameterized by the likelihood of job function transition, has demonstrated the capability to represent the transition of personnel across functional boundaries within a parametric organization using a linear dynamical system, and the ability to predict required staffing profiles to meet functional needs at the desired time. The model can be extended by revisions of the state and transition structure to provide refinements in functional definition for the parametric and extended organization.
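    The linear dynamical system mentioned above can be sketched as a staffing vector propagated by a job-function transition matrix; the matrix entries and headcounts below are hypothetical, not values from the study.

```python
import numpy as np

# Hypothetical job-function transition matrix (rows: from, cols: to) for
# three functions; each row sums to one, so total headcount is conserved.
T = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])

staff = np.array([20.0, 10.0, 5.0])   # initial headcount by function

# Linear dynamical system: propagate the staffing profile over 12 periods.
for _ in range(12):
    staff = staff @ T

print(staff.round(2))   # predicted staffing profile by function
```

Adding rows/columns for hires and departures extends the same structure to open systems, as the abstract's closing sentence suggests.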

  3. Modelling soil water retention using support vector machines with genetic algorithm optimisation.

    PubMed

    Lamorski, Krzysztof; Sławiński, Cezary; Moreno, Felix; Barna, Gyöngyi; Skierucha, Wojciech; Arrue, José L

    2014-01-01

    This work presents point pedotransfer function (PTF) models of the soil water retention curve. The developed models allowed estimation of the soil water content at the specified soil water potentials -0.98, -3.10, -9.81, -31.02, -491.66, and -1554.78 kPa, based on the following soil characteristics: soil granulometric composition, total porosity, and bulk density. Support Vector Machine (SVM) methodology was used for model development. A new methodology for elaboration of retention function models is proposed: in contrast to previous approaches reported in the literature, the ν-SVM method was used for model development, and the results were compared with the previously used C-SVM method. Genetic algorithms were used as the optimisation framework for the model parameter search. A new form of the aim function for the parameter search is proposed, which allowed development of models with better prediction capabilities; it avoids the model overestimation typically encountered when root mean squared error is used as the aim function. The elaborated models showed good agreement with measured soil water retention data, with coefficients of determination in the range 0.67-0.92. The studies demonstrated the usability of the ν-SVM methodology together with genetic algorithm optimisation for retention modelling, which gave better performing models than the other tested approaches.
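    To illustrate the genetic-algorithm side of the approach in isolation, the sketch below fits the parameters of a retention curve with a minimal elitist GA. The van Genuchten form, the synthetic data, and all GA settings are assumptions for illustration; the paper itself searches SVM model parameters, not curve parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Water content at suction h (kPa) for the van Genuchten retention
    curve (used here only as an illustrative retention function)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Synthetic retention data at the potentials listed in the abstract.
h = np.array([0.98, 3.10, 9.81, 31.02, 491.66, 1554.78])
theta = van_genuchten(h, 0.05, 0.45, 0.10, 1.8)     # "measured" water contents

def fitness(p):
    return -np.mean((van_genuchten(h, *p) - theta) ** 2)   # negative MSE

# Minimal elitist GA: keep the 10 best, mutate each into 3 children.
pop = [np.abs(rng.normal([0.1, 0.4, 0.2, 1.5], 0.1)) + 1e-3 for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [np.abs(p + rng.normal(0.0, 0.02, 4)) + 1e-6
                     for p in parents for _ in range(3)]
best = max(pop, key=fitness)
```

A custom aim function, as in the paper, would simply replace the negative-MSE `fitness` above.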

  4. Graphic comparison of reserve-growth models for conventional oil and accumulation

    USGS Publications Warehouse

    Klett, T.R.

    2003-01-01

    The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recoverable oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term "reserve growth" refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. Comparisons were also made between forecast functions based on field-level and reservoir-level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Forecast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.
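    The growth-factor bookkeeping described above can be sketched in a few lines: annual fractional changes give growth factors, and their cumulative product is the forecast-function multiplier applied to a field-size estimate. The size series is invented for illustration, not USGS or EIA data.

```python
# Hypothetical size estimates (e.g. MMBO) by years since discovery.
sizes = [100.0, 108.0, 113.0, 116.0, 118.0]

# Annual growth factors: year-over-year ratios of the size estimate.
growth_factors = [b / a for a, b in zip(sizes, sizes[1:])]

# Forecast function: cumulative multipliers relative to the initial estimate.
forecast = [1.0]
for g in growth_factors:
    forecast.append(forecast[-1] * g)

print(forecast[-1])   # 1.18: ultimate size is 1.18x the initial estimate
```

Comparing such multiplier curves built from different datasets (EIA vs. NRG vintages) is essentially what the comparisons in the abstract do.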

  5. Subsystem functional and the missing ingredient of confinement physics in density functionals.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armiento, Rickard Roberto; Mattsson, Ann Elisabet; Hao, Feng

    2010-08-01

    The subsystem functional scheme is a promising approach recently proposed for constructing exchange-correlation density functionals. In this scheme, the physics in each part of real materials is described by mapping to a characteristic model system. The 'confinement physics,' an essential physical ingredient that has been left out in present functionals, is studied by employing the harmonic-oscillator (HO) gas model. By performing the potential → density and the density → exchange energy per particle mappings based on two model systems characterizing the physics in the interior (uniform electron-gas model) and surface regions (Airy gas model) of materials for the HO gases, we show that the confinement physics emerges when only the lowest subband of the HO gas is occupied by electrons. We examine the approximations of the exchange energy by several state-of-the-art functionals for the HO gas, and none of them produces adequate accuracy in the confinement dominated cases. A generic functional that incorporates the description of the confinement physics is needed.

  6. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    PubMed

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.
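    The product structure of the edge model, a basic pressure profile times an edge-correcting function, can be sketched as below. Both functional forms, the overhang value, and the Gaussian edge boost are invented placeholders, not the finite-element-derived functions of the paper.

```python
import numpy as np

r = np.linspace(0, 1, 101)            # normalized distance from pad center
basic = np.maximum(1 - r ** 2, 0.0)   # illustrative interior pressure profile

overhang = 0.3                        # assumed fraction of pad past the edge
# Edge-correcting function: unity in the interior, boosted near the edge.
correction = 1 + overhang * np.exp(-((r - 1) / 0.1) ** 2)

pressure = basic * correction         # product model of the edge pressure
```

Removal (the TIF) would then follow from this pressure via a Preston-type law; only the two factors need recalibrating for a different overhang ratio.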

  7. GeoChip-based analysis of the microbial community functional structures in simultaneous desulfurization and denitrification process.

    PubMed

    Yu, Hao; Chen, Chuan; Ma, Jincai; Liu, Wenzong; Zhou, Jizhong; Lee, Duu-Jong; Ren, Nanqi; Wang, Aijie

    2014-07-01

    The elemental sulfur (S°) recovery was evaluated in the presence of nitrate in two development models of the simultaneous desulfurization and denitrification (SDD) process. At loading rates of 0.9 kg S/(m³·day) for sulfide and 0.4 kg N/(m³·day) for nitrate, the S° conversion rate was 91.1% in the denitrifying sulfide removal (DSR) model, higher than in the integrated simultaneous desulfurization and denitrification (ISDD) model (25.6%). A comprehensive analysis of the functional diversity, structure, and metabolic potential of the microbial communities in the two models was carried out using a functional gene array (GeoChip 2.0). GeoChip data indicated that diversity indices, community structure, and abundance of functional genes were distinct between the two models. Diversity indices (Simpson's diversity index (1/D) and Shannon-Weaver index (H')) of all detected genes showed that, with elevated influent loading rate, the functional diversity decreased in the ISDD model but increased in the DSR model. In contrast to the ISDD model, the overall abundance of dsr genes was lower in the DSR model, while some functional genes from nitrate-reducing sulfide-oxidizing bacteria (NR-SOB), such as Thiobacillus denitrificans, Sulfurimonas denitrificans, and Paracoccus pantotrophus, were more abundant in the DSR model and were highly associated with the change in S° conversion rate observed in the two models. The results obtained in this study provide additional insights into the microbial metabolic mechanisms involved in the ISDD and DSR models, which in turn will improve the overall performance of the SDD process. Copyright © 2014. Published by Elsevier B.V.

  8. Dynamic prediction in functional concurrent regression with an application to child growth.

    PubMed

    Leroux, Andrew; Xiao, Luo; Crainiceanu, Ciprian; Checkley, William

    2018-04-15

    In many studies, it is of interest to predict the future trajectory of subjects based on their historical data, referred to as dynamic prediction. Mixed effects models have traditionally been used for dynamic prediction. However, the commonly used random intercept and slope model is often not sufficiently flexible for modeling subject-specific trajectories. In addition, there may be useful exposures/predictors of interest that are measured concurrently with the outcome, complicating dynamic prediction. To address these problems, we propose a dynamic functional concurrent regression model to handle the case where both the functional response and the functional predictors are irregularly measured. Currently, such a model cannot be fit by existing software. We apply the model to dynamically predict children's length conditional on prior length, weight, and baseline covariates. Inference on model parameters and subject-specific trajectories is conducted using the mixed effects representation of the proposed model. An extensive simulation study shows that the dynamic functional regression model provides more accurate estimation and inference than existing methods. Methods are supported by fast, flexible, open source software that uses heavily tested smoothing techniques. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  9. Function-based payment model for inpatient medical rehabilitation: an evaluation.

    PubMed

    Sutton, J P; DeJong, G; Wilkerson, D

    1996-07-01

    To describe the components of a function-based prospective payment model for inpatient medical rehabilitation that parallels diagnosis-related groups (DRGs), to evaluate this model in relation to stakeholder objectives, and to detail the components of a quality of care incentive program that, when combined with this payment model, creates an incentive for providers to maximize functional outcomes. This article describes a conceptual model, involving no data collection or data synthesis. The basic payment model described parallels DRGs. Information on the potential impact of this model on medical rehabilitation is gleaned from the literature evaluating the impact of DRGs. The conceptual model described is evaluated against the results of a Delphi survey of rehabilitation providers, consumers, policymakers, and researchers previously conducted by members of the research team. The major shortcoming of a function-based prospective payment model for inpatient medical rehabilitation is that it contains no inherent incentive to maximize functional outcomes. Linking reimbursement to outcomes, however, by withholding a fixed proportion of the standard FRG payment amount, placing that amount in a "quality of care" pool, and distributing that pool annually among providers whose predesignated, facility-level, case-mix-adjusted outcomes are attained, may be one strategy for maximizing outcome goals.

  10. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. 
© 2013 WILEY PERIODICALS, INC.

  11. Classical Testing in Functional Linear Models.

    PubMed

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis, that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications.
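    A minimal numerical sketch of the re-expression step above: functional covariates are reduced to their leading principal component scores, and a standard linear-model F test compares the score regression against the null model. The simulated data, the choice of K, and the sine basis are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated dense functional covariates: 100 curves on 50 grid points.
n, p, K = 100, 50, 3
t = np.linspace(0, 1, p)
basis = np.stack([np.sin((k + 1) * np.pi * t) for k in range(K)])
X = rng.standard_normal((n, K)) @ basis + 0.05 * rng.standard_normal((n, p))
y = X @ np.sin(np.pi * t) / p + 0.1 * rng.standard_normal(n)  # scalar response

# FPCA: eigendecompose the empirical covariance, keep the K leading scores.
Xc = X - X.mean(0)
evals, evecs = np.linalg.eigh(Xc.T @ Xc / n)       # ascending eigenvalues
scores = Xc @ evecs[:, ::-1][:, :K]                # leading PC scores

# Standard F test of H0: no association (all K score coefficients zero).
Z = np.column_stack([np.ones(n), scores])
beta = np.linalg.lstsq(Z, y, rcond=None)[0]
rss_full = np.sum((y - Z @ beta) ** 2)
rss_null = np.sum((y - y.mean()) ** 2)
F = ((rss_null - rss_full) / K) / (rss_full / (n - K - 1))
print(F)   # large F favors rejecting H0
```

The Wald, score, and likelihood-ratio variants in the paper operate on the same reduced linear model, differing only in the test statistic.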

  12. Classical Testing in Functional Linear Models

    PubMed Central

    Kong, Dehan; Staicu, Ana-Maria; Maity, Arnab

    2016-01-01

    We extend four tests common in classical regression - Wald, score, likelihood ratio and F tests - to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically for densely observed functional covariates when the number of principal components diverges. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically for both densely and sparsely observed noisy functional data in simulation experiments and using two real data applications. PMID:28955155

  13. Inter-species activity correlations reveal functional correspondences between monkey and human brain areas

    PubMed Central

    Mantini, Dante; Hasson, Uri; Betti, Viviana; Perrucci, Mauro G.; Romani, Gian Luca; Corbetta, Maurizio; Orban, Guy A.; Vanduffel, Wim

    2012-01-01

    Evolution-driven functional changes in the primate brain are typically assessed by aligning monkey and human activation maps using cortical surface expansion models. These models use putative homologous areas as registration landmarks, assuming they are functionally correspondent. In cases where functional changes have occurred in an area, this assumption makes it impossible to reveal whether other areas may have assumed the lost functions. Here we describe a method to examine functional correspondences across species. Without making spatial assumptions, we assess similarities in sensory-driven functional magnetic resonance imaging responses between monkey (Macaca mulatta) and human brain areas by means of temporal correlation. Using natural vision data, we reveal regions for which functional processing has shifted to topologically divergent locations during evolution. We conclude that substantial evolution-driven functional reorganizations have occurred, not always consistent with cortical expansion processes. This novel framework for evaluating changes in functional architecture is crucial to building more accurate evolutionary models. PMID:22306809

  14. Space station functional relationships analysis

    NASA Technical Reports Server (NTRS)

    Tullis, Thomas S.; Bied, Barbra R.

    1988-01-01

    A systems engineering process is developed to assist Space Station designers in understanding the underlying operational system of the facility so that it can be physically arranged and configured to support crew productivity. The study analyzes the operational system proposed for the Space Station in terms of mission functions, crew activities, and functional relationships in order to develop a quantitative model for evaluation of interior layouts, configuration, and traffic analysis for any Station configuration. Development of the model involved identification of crew functions, required support equipment, criteria for assessing functional relationships, and tools for analyzing functional relationship matrices, as well as analyses of crew transition frequency, sequential dependencies, support equipment requirements, potential for noise interference, need for privacy, and overall compatibility of functions. The model can be used for analyzing crew functions for the Initial Operating Capability of the Station and for detecting relationships among these functions. Note: This process (FRA) was used during Phase B design studies to test optional layouts of the Space Station habitat module. The process is now being automated as a computer model for use in layout testing of the Space Station laboratory modules during Phase C.

  15. Clinical Supervision and Psychological Functions: A New Direction for Theory and Practice.

    ERIC Educational Resources Information Center

    Pajak, Edward

    2002-01-01

    Relates Carl Jung's concept of psychological functions to four families of clinical supervision: the original clinical models, the humanistic/artistic models, the technical/didactic models, and the developmental/reflective models. Differences among clinical supervision models within these families are clarified as representing "communication…

  16. Modeling functional neuroanatomy for an anatomy information system.

    PubMed

    Niggemann, Jörg M; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the "internal wiring" of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Internal wiring as well as functional pathways can correctly be represented and tracked. This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems.

  17. Thermal form-factor approach to dynamical correlation functions of integrable lattice models

    NASA Astrophysics Data System (ADS)

    Göhmann, Frank; Karbach, Michael; Klümper, Andreas; Kozlowski, Karol K.; Suzuki, Junji

    2017-11-01

    We propose a method for calculating dynamical correlation functions at finite temperature in integrable lattice models of Yang-Baxter type. The method is based on an expansion of the correlation functions as a series over matrix elements of a time-dependent quantum transfer matrix rather than the Hamiltonian. In the infinite Trotter-number limit the matrix elements become time independent and turn into the thermal form factors studied previously in the context of static correlation functions. We make this explicit with the example of the XXZ model. We show how the form factors can be summed utilizing certain auxiliary functions solving finite sets of nonlinear integral equations. The case of the XX model is worked out in more detail leading to a novel form-factor series representation of the dynamical transverse two-point function.

  18. A hypothetical universal model of cerebellar function: reconsideration of the current dogma.

    PubMed

    Magal, Ari

    2013-10-01

    The cerebellum is commonly studied in the context of the classical eyeblink conditioning model, which attributes an adaptive motor function to cerebellar learning processes. This model of cerebellar function has quite a few shortcomings and may in fact be somewhat deficient in explaining the myriad functions attributed to the cerebellum, functions ranging from motor sequencing to emotion and cognition. The involvement of the cerebellum in these motor and non-motor functions has been demonstrated in both animals and humans in electrophysiological, behavioral, tracing, functional neuroimaging, and PET studies, as well as in clinical human case studies. A closer look at the cerebellum's evolutionary origin provides a clue to its underlying purpose as a tool which evolved to aid predation rather than as a tool for protection. Based upon this evidence, an alternative model of cerebellar function is proposed, one which might more comprehensively account both for the cerebellum's involvement in a myriad of motor, affective, and cognitive functions and for the relative simplicity and ubiquitous repetitiveness of its circuitry. This alternative model suggests that the cerebellum has the ability to detect coincidences of events, be they sensory, motor, affective, or cognitive in nature, and, after having learned to associate these, it can then trigger (or "mirror") these events after having temporally adjusted their onset based on positive/negative reinforcement. The model also provides for the cerebellum's direction of the proper and uninterrupted sequence of events resulting from this learning through the inhibition of efferent structures (as demonstrated in our lab).

  19. Analysis and control of the METC fluid-bed gasifier. Quarterly report, October 1994--January 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farell, A.E.; Reddy, S.

    1995-03-01

    This document summarizes work performed for the period 10/1/94 to 2/1/95. The initial phase of the work focuses on developing a simple transfer function model of the Fluidized Bed Gasifier (FBG). This transfer function model will be developed based purely on the gasifier responses to step changes in gasifier inputs (including reactor air, convey air, cone nitrogen, FBG pressure, and coal feedrate). This transfer function model will represent a linear, dynamic model that is valid near the operating point at which the data was taken. In addition, a similar transfer function model will be developed using MGAS in order to assess MGAS for use as a model of the FBG for control systems analysis.
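    As a rough illustration of step-response transfer function identification (a generic first-order fit, not the FBG or MGAS models themselves, with invented gain and time-constant values), the classical graphical estimates can be read straight off a simulated step test:

    ```python
    import numpy as np

    # simulated unit-step response of a first-order process K / (tau*s + 1)
    K_true, tau_true = 2.0, 5.0
    t = np.linspace(0, 30, 301)
    y = K_true * (1 - np.exp(-t / tau_true))

    # classical graphical estimates from the step data
    K_hat = y[-1]                                    # steady-state gain
    tau_hat = t[np.searchsorted(y, 0.632 * K_hat)]   # time to 63.2% of final value

    print(round(K_hat, 2), round(tau_hat, 1))        # close to 2.0 and 5.0
    ```

    A model fitted this way is, as the abstract notes, only valid near the operating point where the step data were collected.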

  20. Modified polarized geometrical attenuation model for bidirectional reflection distribution function based on random surface microfacet theory.

    PubMed

    Liu, Hong; Zhu, Jingping; Wang, Kai

    2015-08-24

    The geometrical attenuation model given by Blinn has been widely used in geometrical-optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model, based on the symmetrical V-groove assumption and scalar ray theory, causes obvious inaccuracies in BRDF curves and neglects the effects of polarization. To address these problems, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented by combining masking and shadowing effects with polarization effects. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given as separate expressions and are validated against experimental data from two samples. The results show that the modified polarized geometrical attenuation function is more physically reasonable, improves the precision of the BRDF model, and broadens its applicability to different polarization states.

  1. Interaction Models for Functional Regression.

    PubMed

    Usset, Joseph; Staicu, Ana-Maria; Maity, Arnab

    2016-02-01

    A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and loss of prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data.

  2. On the Validity of the Streaming Model for the Redshift-Space Correlation Function in the Linear Regime

    NASA Astrophysics Data System (ADS)

    Fisher, Karl B.

    1995-08-01

    The relation between the galaxy correlation functions in real-space and redshift-space is derived in the linear regime by an appropriate averaging of the joint probability distribution of density and velocity. The derivation recovers the familiar linear theory result on large scales but has the advantage of clearly revealing the dependence of the redshift distortions on the underlying peculiar velocity field; streaming motions give rise to distortions of order O(Ω^0.6/b) while variations in the anisotropic velocity dispersion yield terms of order O(Ω^1.2/b^2). This probabilistic derivation of the redshift-space correlation function is similar in spirit to the derivation of the commonly used "streaming" model, in which the distortions are given by a convolution of the real-space correlation function with a velocity distribution function. The streaming model is often used to model the redshift-space correlation function on small, highly nonlinear, scales. There have been claims in the literature, however, that the streaming model is not valid in the linear regime. Our analysis confirms this claim, but we show that the streaming model can be made consistent with linear theory provided that the model for the streaming has the functional form predicted by linear theory and that the velocity distribution is chosen to be a Gaussian with the correct linear theory dispersion.
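    For reference, the streaming model discussed above is conventionally written as a convolution of the real-space correlation function with a line-of-sight velocity distribution (generic notation, not necessarily the paper's):

    ```latex
    1 + \xi_s(r_\sigma, r_\pi)
      = \int_{-\infty}^{\infty}
        \left[ 1 + \xi\!\left(\sqrt{r_\sigma^{2} + y^{2}}\right) \right]
        f\!\left(r_\pi - y \mid r\right)\, dy
    ```

    Here \(r_\sigma\) and \(r_\pi\) are the transverse and line-of-sight pair separations in redshift space, and \(f(v \mid r)\) is the distribution of relative line-of-sight velocities at real-space separation \(r\); the paper's consistency condition constrains the mean and dispersion of \(f\).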

  3. Insights into DNA-mediated interparticle interactions from a coarse-grained model

    NASA Astrophysics Data System (ADS)

    Ding, Yajun; Mittal, Jeetain

    2014-11-01

    DNA-functionalized particles have great potential for the design of complex self-assembled materials. The major hurdle in realizing crystal structures from DNA-functionalized particles is expected to be kinetic barriers that trap the system in metastable amorphous states. Therefore, it is vital to explore the molecular details of particle assembly processes in order to understand the underlying mechanisms. Molecular simulations based on coarse-grained models can provide a convenient route to explore these details. Most of the currently available coarse-grained models of DNA-functionalized particles ignore key chemical and structural details of DNA behavior. These models therefore are limited in scope for studying experimental phenomena. In this paper, we present a new coarse-grained model of DNA-functionalized particles which incorporates some of the desired features of DNA behavior. The coarse-grained DNA model used here provides explicit DNA representation (at the nucleotide level) and complementary interactions between Watson-Crick base pairs, which lead to the formation of single-stranded hairpin and double-stranded DNA. Aggregation between multiple complementary strands is also prevented in our model. We study interactions between two DNA-functionalized particles as a function of DNA grafting density, lengths of the hybridizing and non-hybridizing parts of DNA, and temperature. The calculated free energies as a function of pair distance between particles qualitatively resemble experimental measurements of DNA-mediated pair interactions.

  4. Overgeneral autobiographical memory in healthy young and older adults: Differential age effects on components of the capture and rumination, functional avoidance, and impaired executive control (CaRFAX) model.

    PubMed

    Ros, Laura; Latorre, Jose M; Serrano, Juan P; Ricarte, Jorge J

    2017-08-01

    The CaRFAX model (Williams et al., 2007) has been used to explain the causes of overgeneral autobiographical memory (OGM; the difficulty to retrieve specific autobiographical memories), a cognitive phenomenon generally related with different psychopathologies. This model proposes 3 different mechanisms to explain OGM: capture and rumination (CaR), functional avoidance (FA) and impaired executive functions (X). However, the complete CaRFAX model has not been tested in nonclinical populations. This study aims to assess the usefulness of the CaRFAX model to explain OGM in 2 healthy samples: a young sample and an older sample, to test for possible age-related differences in the underlying causes of OGM. A total of 175 young (age range: 19-36 years) and 175 older (age range: 53-88 years) participants completed measures of brooding rumination (CaR), functional avoidance (FA), and executive tasks (X). Using structural equation modeling, we found that memory specificity is mainly associated with lower functional avoidance and higher executive functions in the older group, but only with executive functions in young participants. We discuss the different roles of emotional regulation strategies used by young and older people and their relation to the CaRFAX model to explain OGM in healthy people. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Computing rates of Markov models of voltage-gated ion channels by inverting partial differential equations governing the probability density functions of the conducting and non-conducting states.

    PubMed

    Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew

    2016-07-01

    Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By invoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
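    A toy version of the cost-function inversion idea, using a hypothetical two-state channel whose rates are recovered by brute-force minimization (the rate expressions, values, and grids are all invented for illustration; the paper's method works on probability density functions from PDEs, not this closed-form stationary curve):

    ```python
    import numpy as np

    # hypothetical two-state channel C <-> O: opening rate alpha(V) = a*exp(b*V),
    # fixed closing rate beta; stationary open probability p = alpha/(alpha + beta)
    V = np.linspace(-60.0, 20.0, 17)
    beta = 1.0
    a_true, b_true = 0.5, 0.03
    p_data = a_true * np.exp(b_true * V) / (a_true * np.exp(b_true * V) + beta)

    def cost(a, b):
        """Mismatch between the model's open probability and the (pseudo) data."""
        p = a * np.exp(b * V) / (a * np.exp(b * V) + beta)
        return np.sum((p - p_data) ** 2)

    # invert the rates by brute-force minimization of the cost function
    grid_a = np.linspace(0.1, 1.0, 91)
    grid_b = np.linspace(0.0, 0.06, 61)
    J = np.array([[cost(a, b) for b in grid_b] for a in grid_a])
    ia, ib = np.unravel_index(J.argmin(), J.shape)
    a_hat, b_hat = grid_a[ia], grid_b[ib]
    print(round(a_hat, 2), round(b_hat, 3))          # recovers 0.5 and 0.03
    ```

    Flat directions of the cost surface J would signal non-identifiable rates, which is the diagnostic role the abstract assigns to the cost function.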

  6. Statistical Time Series Models of Pilot Control with Applications to Instrument Discrimination

    NASA Technical Reports Server (NTRS)

    Altschul, R. E.; Nagel, P. M.; Oliver, F.

    1984-01-01

    This report contains a general description of the methodology used to obtain the transfer function models and to verify model fidelity, frequency-domain plots of the modeled transfer functions, numerical results from an analysis of poles and zeros obtained from z-plane to s-plane conversions of the transfer functions, and the results of a study on the sequential introduction of other variables, both exogenous and endogenous, into the loop.

  7. Directional output distance functions: endogenous directions based on exogenous normalization constraints

    USDA-ARS?s Scientific Manuscript database

    In this paper we develop a model for computing directional output distance functions with endogenously determined direction vectors. We show how this model is related to the slacks-based directional distance function introduced by Fare and Grosskopf and show how to use the slacks-based function to e...

  8. Application of a water quality model in the White Cart water catchment, Glasgow, UK.

    PubMed

    Liu, S; Tucker, P; Mansell, M; Hursthouse, A

    2003-03-01

    Water quality models of urban systems have previously focused on point source (sewerage system) inputs. Little attention has been given to diffuse inputs, and research into diffuse pollution has been largely confined to agricultural sources. This paper reports on new research that is aimed at integrating diffuse inputs into an urban water quality model. An integrated model is introduced that is made up of four modules: hydrology, contaminant point sources, nutrient cycling, and leaching. The hydrology module, T&T, consists of TOPMODEL (a TOPography-based hydrological MODEL), which simulates runoff from pervious areas, and a two-tank model, which simulates runoff from impervious urban areas. Linked into the two-tank model, the contaminant point source module simulates the overflow from the sewerage system in heavy rain. The widely known SOILN (SOIL Nitrate model) is the basis of the nitrogen cycle module. Finally, the leaching module consists of two functions: the production function and the transfer function. The production function is based on SLIM (Solute Leaching Intermediate Model) while the transfer function is based on the 'flushing hypothesis', which postulates a relationship between contaminant concentrations in the receiving water course and the extent to which the catchment is saturated. This paper outlines the modelling methodology and the model structures that have been developed. An application of this model in the White Cart catchment (Glasgow) is also included.

  9. A Note on the Equivalence between Observed and Expected Information Functions with Polytomous IRT Models

    ERIC Educational Resources Information Center

    Magis, David

    2015-01-01

    The purpose of this note is to study the equivalence of observed and expected (Fisher) information functions with polytomous item response theory (IRT) models. It is established that observed and expected information functions are equivalent for the class of divide-by-total models (including partial credit, generalized partial credit, rating…

  10. Collaborative Modelling of the Vascular System--Designing and Evaluating a New Learning Method for Secondary Students

    ERIC Educational Resources Information Center

    Haugwitz, Marion; Sandmann, Angela

    2010-01-01

    Understanding biological structures and functions is often difficult because of their complexity and micro-structure. For example, the vascular system is a complex and only partly visible system. Constructing models to better understand biological functions is seen as a suitable learning method. Models function as simplified versions of real…

  11. A Classroom Note on: Modeling Functions with the TI-83/84 Calculator

    ERIC Educational Resources Information Center

    Lubowsky, Jack

    2011-01-01

    In Pre-Calculus courses, students are taught the composition and combination of functions to model physical applications. However, when combining two or more functions into a single more complicated one, students may lose sight of the physical picture which they are attempting to model. A block diagram, or flow chart, in which each block…

  12. Dealing with dissatisfaction in mathematical modelling to integrate QFD and Kano’s model

    NASA Astrophysics Data System (ADS)

    Retno Sari Dewi, Dian; Debora, Joana; Edy Sianto, Martinus

    2017-12-01

    The purpose of the study is to implement the integration of Quality Function Deployment (QFD) and Kano’s model into a mathematical model. Voice of customer data in QFD were collected using a questionnaire developed based on Kano’s model. Operational research methodology was then applied to build the objective function and constraints of the mathematical model. The relationship between the voice of customer and the engineering characteristics was modelled using linear regression. The output of the mathematical model is the detailed engineering characteristics. The objective function of this model is to maximize satisfaction and minimize dissatisfaction. The result of this model is 62%. The major contribution of this research is to implement the existing mathematical model integrating QFD and Kano’s model in a case study of a shoe cabinet.

  13. Nonlinear model for an optical read-only-memory disk readout channel based on an edge-spread function.

    PubMed

    Kobayashi, Seiji

    2002-05-10

    A point-spread function (PSF) is commonly used as a model of an optical disk readout channel. However, the model given by the PSF does not contain the quadratic distortion generated by the photo-detection process. We introduce a model for calculating an approximation of the quadratic component of a signal. We show that this model can be further simplified when a read-only-memory (ROM) disk is assumed. We introduce an edge-spread function by which a simple nonlinear model of an optical ROM disk readout channel is created.

  14. Comparing methods to combine functional loss and mortality in clinical trials for amyotrophic lateral sclerosis

    PubMed Central

    van Eijk, Ruben PA; Eijkemans, Marinus JC; Rizopoulos, Dimitris

    2018-01-01

    Objective Amyotrophic lateral sclerosis (ALS) clinical trials based on single end points only partially capture the full treatment effect when both function and mortality are affected, and may falsely dismiss efficacious drugs as futile. We aimed to investigate the statistical properties of several strategies for the simultaneous analysis of function and mortality in ALS clinical trials. Methods Based on the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database, we simulated longitudinal patterns of functional decline, defined by the revised amyotrophic lateral sclerosis functional rating scale (ALSFRS-R) and conditional survival time. Different treatment scenarios with varying effect sizes were simulated with follow-up ranging from 12 to 18 months. We considered the following analytical strategies: 1) Cox model; 2) linear mixed effects (LME) model; 3) omnibus test based on Cox and LME models; 4) composite time-to-6-point decrease or death; 5) combined assessment of function and survival (CAFS); and 6) test based on joint modeling framework. For each analytical strategy, we calculated the empirical power and sample size. Results Both Cox and LME models have increased false-negative rates when treatment exclusively affects either function or survival. The joint model has superior power compared to other strategies. The composite end point increases false-negative rates among all treatment scenarios. To detect a 15% reduction in ALSFRS-R decline and 34% decline in hazard with 80% power after 18 months, the Cox model requires 524 patients, the LME model 794 patients, the omnibus test 526 patients, the composite end point 1,274 patients, the CAFS 576 patients and the joint model 464 patients. Conclusion Joint models have superior statistical power to analyze simultaneous effects on survival and function and may circumvent pitfalls encountered by other end points. Optimizing trial end points is essential, as selecting suboptimal outcomes may disguise important treatment clues. PMID:29593436

  15. On the development of a semi-nonparametric generalized multinomial logit model for travel-related choices

    PubMed Central

    Ye, Xin; Pendyala, Ram M.; Zou, Yajie

    2017-01-01

    A semi-nonparametric generalized multinomial logit model, formulated using orthonormal Legendre polynomials to extend the standard Gumbel distribution, is presented in this paper. The resulting semi-nonparametric function can represent a probability density function for a large family of multimodal distributions. The model has a closed-form log-likelihood function that facilitates model estimation. The proposed method is applied to model commute mode choice among four alternatives (auto, transit, bicycle and walk) using travel behavior data from Aargau, Switzerland. Comparisons between the multinomial logit model and the proposed semi-nonparametric model show that violations of the standard Gumbel distribution assumption lead to considerable inconsistency in parameter estimates and model inferences. PMID:29073152
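    The core semi-nonparametric construction, a base density reshaped by a squared Legendre polynomial so the result stays nonnegative and integrates to one, can be sketched as follows (the coefficients and the uniform base on [-1, 1] are illustrative; the paper extends the Gumbel distribution instead):

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    coef = np.array([1.0, 0.5, 0.3])            # illustrative expansion coefficients
    x = np.linspace(-1, 1, 2001)
    base = 0.5 * np.ones_like(x)                # uniform base density on [-1, 1]
    poly = legendre.legval(x, coef)             # sum_k coef[k] * P_k(x)
    f = poly ** 2 * base                        # squaring keeps f >= 0 everywhere

    # normalize by trapezoidal integration so f is a proper density
    dx = x[1] - x[0]
    f /= np.sum((f[1:] + f[:-1]) / 2) * dx

    area = np.sum((f[1:] + f[:-1]) / 2) * dx
    print(abs(area - 1.0) < 1e-9, (f >= 0).all())
    ```

    With more than one nonzero coefficient the squared polynomial can put multiple modes into f, which is what lets the family approximate multimodal error distributions.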

  16. On the development of a semi-nonparametric generalized multinomial logit model for travel-related choices.

    PubMed

    Wang, Ke; Ye, Xin; Pendyala, Ram M; Zou, Yajie

    2017-01-01

    A semi-nonparametric generalized multinomial logit model, formulated using orthonormal Legendre polynomials to extend the standard Gumbel distribution, is presented in this paper. The resulting semi-nonparametric function can represent a probability density function for a large family of multimodal distributions. The model has a closed-form log-likelihood function that facilitates model estimation. The proposed method is applied to model commute mode choice among four alternatives (auto, transit, bicycle and walk) using travel behavior data from Aargau, Switzerland. Comparisons between the multinomial logit model and the proposed semi-nonparametric model show that violations of the standard Gumbel distribution assumption lead to considerable inconsistency in parameter estimates and model inferences.

  17. Functional Results-Oriented Healthcare Leadership: A Novel Leadership Model

    PubMed Central

    Al-Touby, Salem Said

    2012-01-01

    This article modifies the traditional functional leadership model to accommodate contemporary needs in healthcare leadership based on two findings. First, the article argues that it is important that the ideal healthcare leadership emphasizes the outcomes of patient care more than the processes and structures used to deliver such care; and secondly, that the leadership must strive to attain effectiveness in their care provision and not merely target the attractive option of efficient operations. Based on these premises, the paper reviews the traditional Functional Leadership Model and the three elements that define the type of leadership an organization has, namely, the tasks, the individuals, and the team. The article argues that concentrating on any one of these elements is not ideal and proposes adding a new element to the model to construct a novel Functional Result-Oriented healthcare leadership model. The recommended Functional-Results Oriented leadership model superimposes the results element on the other three elements so that every effort in healthcare leadership is directed towards attaining excellent patient outcomes. PMID:22496933

  18. Functional results-oriented healthcare leadership: a novel leadership model.

    PubMed

    Al-Touby, Salem Said

    2012-03-01

    This article modifies the traditional functional leadership model to accommodate contemporary needs in healthcare leadership based on two findings. First, the article argues that it is important that the ideal healthcare leadership emphasizes the outcomes of patient care more than the processes and structures used to deliver such care; and secondly, that the leadership must strive to attain effectiveness in their care provision and not merely target the attractive option of efficient operations. Based on these premises, the paper reviews the traditional Functional Leadership Model and the three elements that define the type of leadership an organization has, namely, the tasks, the individuals, and the team. The article argues that concentrating on any one of these elements is not ideal and proposes adding a new element to the model to construct a novel Functional Result-Oriented healthcare leadership model. The recommended Functional-Results Oriented leadership model superimposes the results element on the other three elements so that every effort in healthcare leadership is directed towards attaining excellent patient outcomes.

  19. Influence of the volume and density functions within geometric models for estimating trunk inertial parameters.

    PubMed

    Wicke, Jason; Dumas, Genevieve A

    2010-02-01

    The geometric method combines a volume and a density function to estimate body segment parameters and has the best opportunity for developing the most accurate models. In the trunk, there are many different tissues that greatly differ in density (e.g., bone versus lung). Thus, the density function for the trunk must be particularly sensitive to capture this diversity, such that accurate inertial estimates are possible. Three different models were used to test this hypothesis by estimating trunk inertial parameters of 25 female and 24 male college-aged participants. The outcome of this study indicates that the inertial estimates for the upper and lower trunk are most sensitive to the volume function and not very sensitive to the density function. Although it appears that the uniform density function has a greater influence on inertial estimates in the lower trunk region than in the upper trunk region, this is likely due to the (overestimated) density value used. When geometric models are used to estimate body segment parameters, care must be taken in choosing a model that can accurately estimate segment volumes. Researchers wanting to develop accurate geometric models should focus on the volume function, especially in unique populations (e.g., pregnant or obese individuals).

  20. Determining A Purely Symbolic Transfer Function from Symbol Streams: Theory and Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, Christopher H

    Transfer function modeling is a standard technique in classical Linear Time Invariant systems analysis and Statistical Process Control. The work of Box and Jenkins was seminal in developing methods for identifying parameters associated with classical (r, s, k) transfer functions. Discrete event systems are often used for modeling hybrid control structures and high-level decision problems; examples include discrete time, discrete strategy repeated games. For these games, a discrete transfer function in the form of an accurate hidden Markov model of input-output relations could be used to derive optimal response strategies. In this paper, we develop an algorithm for creating probabilistic Mealy machines that act as transfer function models for discrete event dynamic systems (DEDS). Our models are defined by three parameters, (l1, l2, k), just as the Box-Jenkins transfer function models. Here l1 is the maximal input history length to consider, l2 is the maximal output history length to consider, and k is the response lag. Using related results, we show that our Mealy machine transfer functions are optimal in the sense that they maximize the mutual information between the current known state of the DEDS and the next observed input/output pair.
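
    As a minimal illustration of the (l1, l2, k) idea (not the authors' algorithm, which additionally maximizes mutual information), a probabilistic Mealy-style model can be estimated by counting how often each output symbol follows a given input/output history; the function name and interface below are invented for this sketch:

```python
from collections import defaultdict

def fit_mealy_transfer(u, y, l1=2, l2=1, k=1):
    """Estimate a probabilistic Mealy-style transfer model by counting how
    often each output symbol follows a given (input history, output history)
    state. u and y are aligned input/output symbol sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    start = max(l1 + k, l2)  # earliest t with full histories available
    for t in range(start, len(y)):
        state = (tuple(u[t - k - l1 + 1 : t - k + 1]),  # last l1 inputs, lagged by k
                 tuple(y[t - l2 : t]))                  # last l2 outputs
        counts[state][y[t]] += 1
    # normalise counts into conditional output distributions per state
    return {s: {sym: c / sum(d.values()) for sym, c in d.items()}
            for s, d in counts.items()}
```

    With a deterministic lag-1 relationship between input and output, the estimated conditional distributions collapse to point masses, matching the intuition of a transfer function with response lag k.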

  1. Evaluating simulated functional trait patterns and quantifying modelled trait diversity effects on simulated ecosystem fluxes

    NASA Astrophysics Data System (ADS)

    Pavlick, R.; Schimel, D.

    2014-12-01

    Dynamic Global Vegetation Models (DGVMs) typically employ only a small set of Plant Functional Types (PFTs) to represent the vast diversity of observed vegetation forms and functioning. There is growing evidence, however, that this abstraction may not adequately represent the observed variation in plant functional traits, which is thought to play an important role for many ecosystem functions and for ecosystem resilience to environmental change. The geographic distribution of PFTs in these models is also often based on empirical relationships between present-day climate and vegetation patterns. Projections of future climate change, however, point toward the possibility of novel regional climates, which could lead to no-analog vegetation compositions incompatible with the PFT paradigm. Here, we present results from the Jena Diversity-DGVM (JeDi-DGVM), a novel traits-based vegetation model, which simulates a large number of hypothetical plant growth strategies constrained by functional tradeoffs, thereby allowing for a more flexible temporal and spatial representation of the terrestrial biosphere. First, we compare simulated present-day geographical patterns of functional traits with empirical trait observations (in-situ and from airborne imaging spectroscopy). The observed trait patterns are then used to improve the tradeoff parameterizations of JeDi-DGVM. Finally, focusing primarily on the simulated leaf traits, we run the model with various amounts of trait diversity. We quantify the effects of these modeled biodiversity manipulations on simulated ecosystem fluxes and stocks for both present-day conditions and transient climate change scenarios. The simulation results reveal that the coarse treatment of plant functional traits by current PFT-based vegetation models may contribute substantial uncertainty regarding carbon-climate feedbacks. Further development of trait-based models and further investment in global in-situ and spectroscopic plant trait observations are needed.

  2. Model-free estimation of the psychometric function

    PubMed Central

    Żychaluk, Kamila; Foster, David H.

    2009-01-01

    A subject's response to the strength of a stimulus is described by the psychometric function, from which summary measures, such as a threshold or slope, may be derived. Traditionally, this function is estimated by fitting a parametric model to the experimental data, usually the proportion of successful trials at each stimulus level. Common models include the Gaussian and Weibull cumulative distribution functions. This approach works well if the model is correct, but it can mislead if not. In practice, the correct model is rarely known. Here, a nonparametric approach based on local linear fitting is advocated. No assumption is made about the true model underlying the data, except that the function is smooth. The critical role of the bandwidth is identified, and its optimum value estimated by a cross-validation procedure. As a demonstration, seven vision and hearing data sets were fitted by the local linear method and by several parametric models. The local linear method frequently performed better and never worse than the parametric ones. Supplemental materials for this article can be downloaded from app.psychonomic-journals.org/content/supplemental. PMID:19633355
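
    The local linear approach described above can be sketched with a Gaussian kernel and leave-one-out cross-validation for the bandwidth; the paper's exact kernel and cross-validation criterion may differ, and the function names here are illustrative:

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear estimate of the psychometric function at stimulus level
    x0, using a Gaussian kernel with bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)       # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])
    A = X.T @ (w[:, None] * X)                   # weighted normal equations
    b = X.T @ (w * y)
    beta, *_ = np.linalg.lstsq(A, b, rcond=None)
    return beta[0]  # the intercept is the fitted value at x0

def cv_bandwidth(x, y, candidates):
    """Choose the bandwidth minimising leave-one-out squared prediction error."""
    def loo_err(h):
        preds = [local_linear(np.delete(x, i), np.delete(y, i), x[i], h)
                 for i in range(len(x))]
        return float(np.mean((np.asarray(preds) - y) ** 2))
    return min(candidates, key=loo_err)
```

    Note that no parametric shape (Gaussian, Weibull, etc.) is assumed; only smoothness enters, through the bandwidth.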

  3. Functional diversity supports the physiological tolerance hypothesis for plant species richness along climatic gradients

    USGS Publications Warehouse

    Spasojevic, Marko J.; Grace, James B.; Harrison, Susan; Damschen, Ellen Ingman

    2013-01-01

    1. The physiological tolerance hypothesis proposes that plant species richness is highest in warm and/or wet climates because a wider range of functional strategies can persist under such conditions. Functional diversity metrics, combined with statistical modeling, offer new ways to test whether diversity-environment relationships are consistent with this hypothesis. 2. In a classic study by R. H. Whittaker (1960), herb species richness declined from mesic (cool, moist, northerly) slopes to xeric (hot, dry, southerly) slopes. Building on this dataset, we measured four plant functional traits (plant height, specific leaf area, leaf water content and foliar C:N) and used them to calculate three functional diversity metrics (functional richness, evenness, and dispersion). We then used a structural equation model to ask if ‘functional diversity’ (modeled as the joint responses of richness, evenness, and dispersion) could explain the observed relationship of topographic climate gradients to species richness. We then repeated our model examining the functional diversity of each of the four traits individually. 3. Consistent with the physiological tolerance hypothesis, we found that functional diversity was higher in more favorable climatic conditions (mesic slopes), and that multivariate functional diversity mediated the relationship of the topographic climate gradient to plant species richness. We found similar patterns for models focusing on individual trait functional diversity of leaf water content and foliar C:N. 4. Synthesis. Our results provide trait-based support for the physiological tolerance hypothesis, suggesting that benign climates support more species because they allow for a wider range of functional strategies.

  4. Extending existing structural identifiability analysis methods to mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Improving reliability of aggregation, numerical simulation and analysis of complex systems by empirical data

    NASA Astrophysics Data System (ADS)

    Dobronets, Boris S.; Popova, Olga A.

    2018-05-01

    The paper considers a new approach to regression modeling that uses aggregated data presented in the form of density functions. Approaches to improving the reliability of aggregating empirical data are considered, namely improving accuracy and estimating errors. We discuss data aggregation procedures as a preprocessing stage for subsequent regression modeling. An important feature of the study is a demonstration of how to represent the aggregated data: piecewise polynomial models, including spline aggregate functions, are proposed. We show that the proposed approach to data aggregation can be interpreted as a frequency distribution, and the density function concept is used to study its properties. Various types of mathematical models for data aggregation are discussed. For the construction of regression models, data representation procedures based on piecewise polynomial models are proposed, along with new approaches to modeling functional dependencies based on spline aggregations.
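
    The idea of aggregating raw observations into a density function before modeling can be sketched with a piecewise-constant (histogram) density; the paper proposes richer piecewise polynomial and spline forms, so this is only a simplified stand-in with invented function names:

```python
import numpy as np

def aggregate_density(samples, bins=10):
    """Aggregate raw observations into a normalised histogram density,
    a piecewise-constant stand-in for the spline aggregate functions
    discussed in the abstract."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    return hist, edges

def density_moment(hist, edges, order=1):
    """Moment of the aggregated density, usable as a regression feature
    in place of the raw data."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    widths = np.diff(edges)
    return float(np.sum(centers ** order * hist * widths))
```

    Downstream regression then operates on features of the densities (moments, quantiles, spline coefficients) rather than on the raw observations.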

  6. Resolving Microzooplankton Functional Groups In A Size-Structured Planktonic Model

    NASA Astrophysics Data System (ADS)

    Taniguchi, D.; Dutkiewicz, S.; Follows, M. J.; Jahn, O.; Menden-Deuer, S.

    2016-02-01

    Microzooplankton are important marine grazers, often consuming a large fraction of primary productivity. They consist of a great diversity of organisms with different behaviors, characteristics, and rates. This functional diversity, and its consequences, are not currently reflected in large-scale ocean ecological simulations. How should these organisms be represented, and what are the implications for their biogeography? We develop a size-structured, trait-based model to characterize a diversity of microzooplankton functional groups. We compile and examine size-based laboratory data on the traits, revealing some patterns with size and functional group that we interpret with mechanistic theory. Fitting the model to the data provides parameterizations of key rates and properties, which we employ in a numerical ocean model. The diversity of grazing preference, rates, and trophic strategies enables the coexistence of different functional groups of micro-grazers under various environmental conditions, and the model produces testable predictions of the biogeography.

  7. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    PubMed

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, dynamic brain functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal flexibility among the three. Our proposed methods provide a novel and powerful generative model for investigating dynamic brain connectivity. Copyright © 2017 Elsevier Inc. All rights reserved.
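
    The switching-state machinery underlying BSFA can be illustrated with the standard HMM forward algorithm; BSFA embeds a full factor-analysis emission model in each state and learns everything in a Bayesian framework, so this scalar-Gaussian version is only a toy stand-in:

```python
import numpy as np

def hmm_forward(obs, pi, A, means, var):
    """Forward-algorithm filtering for a toy Gaussian-emission HMM: returns
    the posterior probability of each latent state at every time step, given
    initial distribution pi, transition matrix A, and per-state Gaussian
    emissions (means, shared variance var)."""
    def emit(x):  # likelihood of observation x under each state
        return np.exp(-0.5 * (x - means) ** 2 / var) / np.sqrt(2 * np.pi * var)
    alpha = pi * emit(obs[0])
    alpha /= alpha.sum()
    post = [alpha]
    for x in obs[1:]:
        alpha = (alpha @ A) * emit(x)  # propagate, then weight by evidence
        alpha /= alpha.sum()
        post.append(alpha)
    return np.array(post)
```

    With well-separated emission means, the filtered posteriors track the generating state almost perfectly, which is the behavior BSFA exploits to segment fMRI time-series into brain states.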

  8. Multiple Stressors and the Functioning of Coral Reefs.

    PubMed

    Harborne, Alastair R; Rogers, Alice; Bozec, Yves-Marie; Mumby, Peter J

    2017-01-03

    Coral reefs provide critical services to coastal communities, and these services rely on ecosystem functions threatened by stressors. By summarizing the threats to the functioning of reefs from fishing, climate change, and decreasing water quality, we highlight that these stressors have multiple, conflicting effects on functionally similar groups of species and their interactions, and that the overall effects are often uncertain because of a lack of data or variability among taxa. The direct effects of stressors on links among functional groups, such as predator-prey interactions, are particularly uncertain. Using qualitative modeling, we demonstrate that this uncertainty of stressor impacts on functional groups (whether they are positive, negative, or neutral) can have significant effects on models of ecosystem stability, and reducing uncertainty is vital for understanding changes to reef functioning. This review also provides guidance for future models of reef functioning, which should include interactions among functional groups and the cumulative effect of stressors.

  9. Multiple Stressors and the Functioning of Coral Reefs

    NASA Astrophysics Data System (ADS)

    Harborne, Alastair R.; Rogers, Alice; Bozec, Yves-Marie; Mumby, Peter J.

    2017-01-01

    Coral reefs provide critical services to coastal communities, and these services rely on ecosystem functions threatened by stressors. By summarizing the threats to the functioning of reefs from fishing, climate change, and decreasing water quality, we highlight that these stressors have multiple, conflicting effects on functionally similar groups of species and their interactions, and that the overall effects are often uncertain because of a lack of data or variability among taxa. The direct effects of stressors on links among functional groups, such as predator-prey interactions, are particularly uncertain. Using qualitative modeling, we demonstrate that this uncertainty of stressor impacts on functional groups (whether they are positive, negative, or neutral) can have significant effects on models of ecosystem stability, and reducing uncertainty is vital for understanding changes to reef functioning. This review also provides guidance for future models of reef functioning, which should include interactions among functional groups and the cumulative effect of stressors.

  10. Calibrating the ECCO ocean general circulation model using Green's functions

    NASA Technical Reports Server (NTRS)

    Menemenlis, D.; Fu, L. L.; Lee, T.; Fukumori, I.

    2002-01-01

    Green's functions provide a simple yet effective method to test and calibrate General Circulation Model (GCM) parameterizations, to study and quantify model and data errors, to correct model biases and trends, and to blend estimates from different solutions and data products.
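
    The Green's function approach reduces calibration to linear least squares: if each column of a matrix holds the model's response to a unit perturbation of one parameter (its Green's function), parameter corrections can be estimated from the model-data misfit. A generic sketch, not the ECCO code:

```python
import numpy as np

def greens_function_calibrate(G, misfit):
    """Estimate parameter corrections eta by least squares, assuming the
    model responds linearly: column j of G is the model's response (its
    Green's function) to a unit perturbation of parameter j, and misfit
    is the data-minus-baseline-model vector."""
    eta, *_ = np.linalg.lstsq(G, misfit, rcond=None)
    return eta
```

    In practice each column of G requires one forward model run, which is what makes the method attractive for calibrating a small number of GCM parameters.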

  11. The universal function in color dipole model

    NASA Astrophysics Data System (ADS)

    Jalilian, Z.; Boroun, G. R.

    2017-10-01

    In this work we review the color dipole model and recall the properties of saturation and geometrical scaling within it. Our primary aim is to determine the exact universal function in terms of the introduced scaling variable at distances different from the saturation radius. Including the mass in the calculation, we numerically compute the contribution of heavy production at small x to the total structure function from the fraction of universal functions, and we show that geometrical scaling is established for the scaling variable used in this study.

  12. A functional-dynamic reflection on participatory processes in modeling projects.

    PubMed

    Seidl, Roman

    2015-12-01

    The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it is not well investigated if and how explicitly these functions and the dynamics of a participatory process are reflected by modeling projects in particular. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation-most often, more than one per project can be identified, along with the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling covering diverse approaches and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and perfectly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.

  13. FGWAS: Functional genome wide association analysis.

    PubMed

    Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-10-01

    Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Recurrence relations in one-dimensional Ising models.

    PubMed

    da Conceição, C M Silva; Maia, R N P

    2017-09-01

    The exact finite-size partition function for the nonhomogeneous one-dimensional (1D) Ising model is found through an approach using algebra operators. Specifically, in this paper we show that the partition function can be computed through a trace from a linear second-order recurrence relation with nonconstant coefficients in matrix form. A relation between the finite-size partition function and the generalized Lucas polynomials is found for the simple homogeneous model, thus establishing a recursive formula for the partition function. This is an important property and it might indicate the possible existence of recurrence relations in higher-dimensional Ising models. Moreover, assuming quenched disorder for the interactions within the model, the quenched averaged magnetic susceptibility displays a nontrivial behavior due to changes in the ferromagnetic concentration probability.
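
    The trace-of-a-matrix-product computation described above can be made concrete for a periodic, nonhomogeneous 1D Ising chain with site-dependent couplings and no external field; the sketch below (a standard transfer-matrix calculation, not the authors' operator-algebra formulation) validates the trace against brute-force enumeration:

```python
import numpy as np
from itertools import product

def partition_function(J, beta=1.0):
    """Partition function of a periodic 1D Ising chain with site-dependent
    couplings J[i] (between spins i and i+1), as the trace of a product of
    2x2 transfer matrices T_i[s, s'] = exp(beta * J[i] * s * s')."""
    M = np.eye(2)
    for Ji in J:
        T = np.array([[np.exp(beta * Ji), np.exp(-beta * Ji)],
                      [np.exp(-beta * Ji), np.exp(beta * Ji)]])
        M = M @ T
    return float(np.trace(M))

def brute_force(J, beta=1.0):
    """Direct summation over all 2^n spin configurations, for validation."""
    n = len(J)
    Z = 0.0
    for spins in product([-1, 1], repeat=n):
        E = -sum(J[i] * spins[i] * spins[(i + 1) % n] for i in range(n))
        Z += np.exp(-beta * E)
    return Z
```

    The matrix product is exactly the second-order recurrence with nonconstant coefficients referred to in the abstract: each new coupling multiplies the running 2x2 matrix.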

  15. Comparisons of regional Hydrological Angular Momentum (HAM) of the different models

    NASA Astrophysics Data System (ADS)

    Nastula, J.; Kolaczek, B.; Popinski, W.

    2006-10-01

    In the paper hydrological excitations of the polar motion (HAM) were computed from various hydrological data series (NCEP, ECMWF, CPC water storage and LaD World Simulations of global continental water). HAM series obtained from these four models and the geodetic excitation function GEOD computed from the polar motion COMB03 data were compared in the seasonal spectral band. The results show big differences of these hydrological excitation functions as well as of their spectra in the seasonal spectra band. Seasonal oscillations of the global geophysical excitation functions (AAM + OAM + HAM) in all cases besides the NCEP/NCAR model are smaller than the geodetic excitation function. It means that these models need further improvement and perhaps not only hydrological models need improvements.

  16. Effects of soil water and heat relationship under various snow cover during freezing-thawing periods in Songnen Plain, China.

    PubMed

    Fu, Qiang; Hou, Renjie; Li, Tianxiao; Jiang, Ruiqi; Yan, Peiru; Ma, Ziao; Zhou, Zhaoqiang

    2018-01-22

    In this study, the spatial variations of soil water and heat under bare land (BL), natural snow (NS), compacted snow (CS), and thick snow (TS) treatments were analyzed. The relationship curve between soil temperature and water content conforms to an exponential filtering model; using the functional form of this model, a soil water-heat relation function model was defined. On this basis, function models were established at depths of 10, 20, 40, 60, 100, and 140 cm. Finally, a spatial variation law of the relationship effect was described based on an analysis of the differences between predicted and measured results. During the freezing period, the effects of external factors on the soil were hindered by snow cover, and as snow increased, the accuracy of the function model gradually improved. During the melting period, snowmelt infiltration affected the relationship between soil temperature and moisture, and with increasing snow the accuracy of the function models gradually decreased. The relationship effects of soil water and heat increased with depth within the frozen zone. In contrast, below the frozen layer the soil water-heat relationship was weaker and the function models were less accurate.
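
    The abstract does not give the exact functional form of its exponential filtering model, so as an illustration only, a generic exponential relation between soil temperature T and water content theta can be fitted by linear least squares on the log scale:

```python
import numpy as np

def fit_exponential(T, theta):
    """Fit theta = a * exp(b * T) by linear least squares on log(theta).
    This generic exponential is a stand-in for the (unspecified)
    exponential filtering model of the paper."""
    b, log_a = np.polyfit(np.asarray(T), np.log(theta), 1)
    return np.exp(log_a), b
```

    Model accuracy at each depth and snow treatment could then be compared through the residuals of fits like this one.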

  17. Integrating occupational therapy in treating combat stress reaction within a military unit: An intervention model.

    PubMed

    Gindi, Shahar; Galili, Giora; Volovic-Shushan, Shani; Adir-Pavis, Shirly

    2016-01-01

    Combat stress reaction (CR) is a syndrome with a wide range of symptoms, including changes in soldiers' behaviors, emotional and physiological responses, avoidance, and a decrease in both personal and military functioning. The short-term goal in treating CR is a speedy return to healthy functioning, whereas the long-term goal is to prevent the development of PTSD; previous research has indicated that achieving the short-term goal affects achievement of the long-term goal and vice versa. Effective treatment requires intervention by trained professionals proficient in reinforcing personal and functional identity without psychiatric labelling. The present paper presents a therapeutic model integrating occupational therapy (OT) in treating CR within a military setting. The model emphasizes the importance of preventing fixation on the role of 'patient' and of a rapid return to maximal functioning. It is based on Kielhofner's Model of Human Occupation and aims to promote adaptive and efficient functioning by engaging soldiers in tasks that support their military identity, empower functionality, and increase their perceived competency. The model emphasizes the therapeutic milieu within a military environment, and its practical application focuses on interdisciplinary aspects and client-focused delivery. The paper describes an assessment process for each soldier entering the CR unit and a treatment model integrating OT.

  18. Representing functions/procedures and processes/structures for analysis of effects of failures on functions and operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Leifker, Daniel B.

    1991-01-01

    Current qualitative device and process models represent only the structure and behavior of physical systems. However, systems in the real world include goal-oriented activities that generally cannot be easily represented using current modeling techniques. An extension of a qualitative modeling system, known as functional modeling, is proposed that captures goal-oriented activities explicitly, and it is shown how such models may be used to support intelligent automation and fault management.

  19. Long-term Effects of Psychosocial Work Stress in Midlife on Health Functioning After Labor Market Exit—Results From the GAZEL Study

    PubMed Central

    Sembajwe, Grace; Zins, Marie; Berkman, Lisa; Goldberg, Marcel; Siegrist, Johannes

    2012-01-01

    Objectives. To study long-term effects of psychosocial work stress in mid-life on health functioning after labor market exit using two established work stress models. Methods. In the frame of the prospective French Gazel cohort study, data on psychosocial work stress were assessed using the full questionnaires measuring the demand-control-support model (in 1997 and 1999) and the effort–reward imbalance model (in 1998). In 2007, health functioning was assessed, using the Short Form 36 mental and physical component scores. Multivariate regressions were calculated to predict health functioning in 2007, controlling for age, gender, social position, and baseline self-perceived health. Results. Consistent effects of both work stress models and their single components on mental and physical health functioning during retirement were observed. Effects remained significant after adjustment including baseline self-perceived health. Whereas the predictive power of both work stress models was similar in the case of the physical composite score, in the case of the mental health score, values of model fit were slightly higher for the effort–reward imbalance model (R²: 0.13) compared with the demand-control model (R²: 0.11). Conclusions. Findings underline the importance of working conditions in midlife not only for health in midlife but also for health functioning after labor market exit. PMID:22546992

  20. Predicting nucleic acid binding interfaces from structural models of proteins

    PubMed Central

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2011-01-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to the relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from the surfaces of protein structural models obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acid binding interface on structural models of proteins. Employing a combined patch approach, we show that patches extracted from an ensemble of models predict the real nucleic acid binding interfaces better than patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure. PMID:22086767

  1. mfpa: Extension of mfp using the ACD covariate transformation for enhanced parametric multivariable modeling.

    PubMed

    Royston, Patrick; Sauerbrei, Willi

    2016-01-01

    In a recent article, Royston (2015, Stata Journal 15: 275-291) introduced the approximate cumulative distribution (acd) transformation of a continuous covariate x as a route toward modeling a sigmoid relationship between x and an outcome variable. In this article, we extend the approach to multivariable modeling by modifying the standard Stata program mfp. The result is a new program, mfpa, that has all the features of mfp plus the ability to fit a new model for user-selected covariates that we call fp1(p1, p2). The fp1(p1, p2) model comprises the best-fitting combination of a dimension-one fractional polynomial (fp1) function of x and an fp1 function of acd(x). We describe a new model-selection algorithm called the function-selection procedure with acd transformation, which uses significance testing to attempt to simplify an fp1(p1, p2) model to a submodel, an fp1 or linear model in x or in acd(x). The function-selection procedure with acd transformation is related in concept to the fsp (fp function-selection procedure), which is an integral part of mfp and which is used to simplify a dimension-two (fp2) function. We describe the mfpa command and give univariable and multivariable examples with real data to demonstrate its use.
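
    The fp1 building block can be illustrated outside Stata: a dimension-one fractional polynomial picks one power from the standard set {-2, -1, -0.5, 0, 0.5, 1, 2, 3}, with 0 conventionally meaning log(x). The sketch below selects the power by raw residual sum of squares, whereas mfpa uses a significance-testing procedure, so this is illustrative only:

```python
import numpy as np

FP_POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]  # standard fp1 power set

def fp1_transform(x, p):
    """Fractional-polynomial transform: x**p, with p = 0 meaning log(x)."""
    return np.log(x) if p == 0 else x ** p

def best_fp1(x, y):
    """Choose the fp1 power minimising residual sum of squares (x must be > 0)."""
    best = None
    for p in FP_POWERS:
        X = np.column_stack([np.ones_like(x), fp1_transform(x, p)])
        beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = rss[0] if len(rss) else float(np.sum((y - X @ beta) ** 2))
        if best is None or rss < best[0]:
            best = (rss, p, beta)
    return best[1], best[2]
```

    An fp1(p1, p2) model as described above would simply include two such transformed columns, one of x and one of acd(x).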

  2. Virtual Plants Need Water Too: Functional-Structural Root System Models in the Context of Drought Tolerance Breeding

    PubMed Central

    Ndour, Adama; Vadez, Vincent; Pradal, Christophe; Lucas, Mikaël

    2017-01-01

    Developing a sustainable agricultural model is one of the great challenges of the coming years. The agricultural practices inherited from the Green Revolution of the 1960s are showing their limits today, and new paradigms need to be explored to counter rising issues such as the multiplication of climate-change-related drought episodes. Two such new paradigms are the use of functional-structural plant models to complement and rationalize breeding approaches and a renewed focus on root systems as untapped sources of plant amelioration. Since the late 1980s, numerous functional and structural models of root systems have been developed and used to investigate the properties of root systems in soil or lab conditions. In this review, we focus on the conception and use of such root models in the broader context of research on root-driven drought tolerance, on the basis of root system architecture (RSA) phenotyping. Such models result from the integration of architectural, physiological and environmental data. Here, we consider the different phenotyping techniques allowing for root architectural and physiological study and their limits. We discuss how QTL and breeding studies support the manipulation of RSA as a way to improve drought resistance. We then go over the integration of the generated data within architectural models, how those architectural models can be coupled with functional hydraulic models, and how functional parameters can be measured to feed those models. We then consider the assessment and validation of those hydraulic models through comparison of simulations with experiments. Finally, we discuss the upcoming challenges facing root system functional-structural modeling approaches in the context of breeding. PMID:29018456

  4. Improved Ionospheric Electrodynamic Models and Application to Calculating Joule Heating Rates

    NASA Technical Reports Server (NTRS)

    Weimer, D. R.

    2004-01-01

    Improved techniques have been developed for empirical modeling of the high-latitude electric potentials and magnetic field-aligned currents (FAC) as a function of the solar wind parameters. The FAC model is constructed using scalar magnetic Euler potentials and functions as a twin to the electric potential model. The improved models have more accurate field values as well as more accurate boundary locations. Non-linear saturation effects in the solar wind-magnetosphere coupling are also better reproduced. The models are constructed using a hybrid technique, which has spherical harmonic functions only within a small area at the pole. At lower latitudes the potentials are constructed from multiple Fourier series functions of longitude, at discrete latitudinal steps. It is shown that the two models can be used together in order to calculate the total Poynting flux and Joule heating in the ionosphere. An additional model of the ionospheric conductivity is not required in order to obtain the ionospheric currents and Joule heating, as the conductivity variations as a function of the solar inclination are implicitly contained within the FAC model's data. The model outputs are shown for various input conditions and compared with satellite measurements. The calculations of the total Joule heating are compared with results obtained by the inversion of ground-based magnetometer measurements. Like their predecessors, these empirical models should continue to be useful research and forecasting tools.

  5. An alternative approach for modeling strength differential effect in sheet metals with symmetric yield functions

    NASA Astrophysics Data System (ADS)

    Kurukuri, Srihari; Worswick, Michael J.

    2013-12-01

    An alternative approach is proposed to utilize symmetric yield functions for modeling the tension-compression asymmetry commonly observed in hcp materials. In this work, the strength differential (SD) effect is modeled by choosing separate symmetric plane stress yield functions (for example, Barlat Yld 2000-2d) for tension, i.e., in the first quadrant of principal stress space, and compression, i.e., in the third quadrant of principal stress space. In the second and fourth quadrants, the yield locus is constructed by adopting interpolating functions between the uniaxial tensile and compressive stress states. In this work, different interpolating functions are chosen and the predictive capability of each approach is discussed. The main advantage of the proposed approach is that the yield locus parameters are deterministic and relatively easy to identify when compared to the Cazacu family of yield functions commonly used for modeling the SD effect observed in hcp materials.

  6. Method of and apparatus for determining the similarity of a biological analyte from a model constructed from known biological fluids

    DOEpatents

    Robinson, Mark R.; Ward, Kenneth J.; Eaton, Robert P.; Haaland, David M.

    1990-01-01

    The characteristics of a biological fluid sample having an analyte are determined from a model constructed from plural known biological fluid samples. The model is a function of the concentration of materials in the known fluid samples as a function of absorption of wideband infrared energy. The wideband infrared energy is coupled to the analyte-containing sample so there is differential absorption of the infrared energy as a function of the wavelength of the wideband infrared energy incident on the analyte-containing sample. The differential absorption causes intensity variations of the infrared energy incident on the analyte-containing sample as a function of the wavelength of the energy, and the concentration of the unknown analyte is determined from the thus-derived intensity variations of the infrared energy as a function of wavelength, using the model's absorption-versus-wavelength function.

  7. Functional Freedom: A Psychological Model of Freedom in Decision-Making.

    PubMed

    Lau, Stephan; Hiemisch, Anette

    2017-07-05

    The freedom of a decision is not yet sufficiently described as a psychological variable. We present a model of functional decision freedom that aims to fill that role. The model conceptualizes functional freedom as a capacity of people that varies depending on certain conditions of a decision episode. It denotes an inner capability to consciously shape complex decisions according to one's own values and needs. Functional freedom depends on three compensatory dimensions: it is greatest when the decision-maker is highly rational, when the structure of the decision is highly underdetermined, and when the decision process is strongly based on conscious thought and reflection. We outline possible research questions, argue for psychological benefits of functional decision freedom, and explicate the model's implications for current knowledge and research. In conclusion, we show that functional freedom is a scientific variable, permitting an additional psychological foothold in research on freedom, and that it is compatible with a deterministic worldview.

  8. Developing Students' Reflections on the Function and Status of Mathematical Modeling in Different Scientific Practices: History as a Provider of Cases

    ERIC Educational Resources Information Center

    Kjeldsen, Tinne Hoff; Blomhøj, Morten

    2013-01-01

    Mathematical models and mathematical modeling play different roles in the different areas and problems in which they are used. The function and status of mathematical modeling and models in the different areas depend on the scientific practice as well as the underlying philosophical and theoretical position held by the modeler(s) and the…

  9. A Review of Modeling Pedagogies: Pedagogical Functions, Discursive Acts, and Technology in Modeling Instruction

    ERIC Educational Resources Information Center

    Campbell, Todd; Oh, Phil Seok; Maughn, Milo; Kiriazis, Nick; Zuwallack, Rebecca

    2015-01-01

    The current review examined modeling literature in top science education journals to better understand the pedagogical functions of modeling instruction reported over the last decade. Additionally, the review sought to understand the extent to which different modeling pedagogies were employed, the discursive acts that were identified as important,…

  10. Function modeling improves the efficiency of spatial modeling using big data from remote sensing

    Treesearch

    John Hogland; Nathaniel Anderson

    2017-01-01

    Spatial modeling is an integral component of most geographic information systems (GISs). However, conventional GIS modeling techniques can require substantial processing time and storage space and have limited statistical and machine learning functionality. To address these limitations, many have parallelized spatial models using multiple coding libraries and have...

  11. Symbolic Regression for the Estimation of Transfer Functions of Hydrological Models

    NASA Astrophysics Data System (ADS)

    Klotz, D.; Herrnegger, M.; Schulz, K.

    2017-11-01

    Current concepts for parameter regionalization of spatially distributed rainfall-runoff models rely on the a priori definition of transfer functions that globally map land surface characteristics (such as soil texture, land use, and digital elevation) into the model parameter space. However, these transfer functions are often chosen ad hoc or derived from small-scale experiments. This study proposes and tests an approach for inferring the structure and parametrization of possible transfer functions from runoff data to potentially circumvent these difficulties. The concept uses context-free grammars to generate candidate propositions for transfer functions. The resulting structures can then be parametrized with classical optimization techniques. Several virtual experiments are performed to examine the potential for appropriate estimation of transfer functions, all of them using a very simple conceptual rainfall-runoff model with data from the Austrian Mur catchment. The results suggest that a priori defined transfer functions are in general well identifiable by the method. However, the deduction process might be inhibited, e.g., by noise in the runoff observation data, often leading to transfer function estimates of lower structural complexity.
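
    The grammar-based proposal step can be sketched in a few lines of Python. Everything below is illustrative and not from the paper: the toy grammar, the land-surface variables soil and slope, and the crude random-search parameterization stand in for the paper's grammar and classical optimizers:

```python
import random

# Toy context-free grammar: a transfer function maps land-surface
# attributes (here just "soil" and "slope") to a model parameter.
# "c" marks a numeric constant to be fitted later.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["<expr>", "*", "<expr>"],
               ["<const>"], ["<var>"]],
    "<var>": [["soil"], ["slope"]],
    "<const>": [["c"]],
}

def expand(symbol, rng, depth=0, max_depth=3):
    """Randomly expand a grammar symbol into a flat token list."""
    if symbol not in GRAMMAR:
        return [symbol]                  # terminal token
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        rules = rules[-2:]               # terminal rules only: stop recursion
    out = []
    for tok in rng.choice(rules):
        out.extend(expand(tok, rng, depth + 1, max_depth))
    return out

def evaluate(tokens, consts, env):
    """Evaluate a token list, substituting fitted constants for each "c"."""
    vals = iter(consts)
    expr = " ".join(str(next(vals)) if t == "c" else t for t in tokens)
    return eval(expr, {"__builtins__": {}}, dict(env))

if __name__ == "__main__":
    rng = random.Random(0)
    # Synthetic "truth": parameter = 2*soil + 0.5, observed at a few sites.
    sites = [(s, sl) for s in (0.1, 0.4, 0.7) for sl in (0.0, 1.0)]
    obs = [2.0 * s + 0.5 for s, _ in sites]
    best = None
    for _ in range(300):                 # propose structures from the grammar
        tokens = expand("<expr>", rng)
        for _ in range(30):              # crude random-search parameterization
            consts = [round(rng.uniform(-3, 3), 2) for _ in range(tokens.count("c"))]
            err = sum((evaluate(tokens, consts, {"soil": s, "slope": sl}) - y) ** 2
                      for (s, sl), y in zip(sites, obs))
            if best is None or err < best[0]:
                best = (err, " ".join(tokens), consts)
    print(best)
```

    A real application would replace the random search with a proper optimizer and score candidate structures through the rainfall-runoff model rather than against the parameter directly.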

  12. Thinking, Feeling, Intuiting and Sensing: Using the Four Psychological Functions as a Model to Empower Student Writers.

    ERIC Educational Resources Information Center

    Miller, Lori Ann

    Writing is an act of self construction. Considering how students process information can improve the quality of instruction in composing courses, but only if quantifiable, verified models of cognitive functions are taken to heart and applied to teaching methods in the classroom. C. G. Jung's model of the four functions (thinking, sensation,…

  13. Prediction of Chemical Function: Model Development and ...

    EPA Pesticide Factsheets

    The United States Environmental Protection Agency's Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration in which it is present, and can therefore impact exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning-based models for classifying chemicals in terms of their likely functional roles in products based on structure were developed. This effort required collection, curation, and harmonization of publicly available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
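
    As a minimal stand-in for the random-forest idea (bootstrap resampling plus voting over weak learners), the following pure-Python sketch classifies invented two-descriptor "chemicals" into two functional roles. None of the data, descriptor names, or thresholds come from the EPA models:

```python
import random

def majority(labels):
    """Most common label, with deterministic tie-breaking."""
    return max(sorted(set(labels)), key=labels.count)

def train_stump(X, y):
    """Best single-feature midpoint-threshold split: (feat, thr, left_label, right_label)."""
    best = None
    for f in range(len(X[0])):
        vals = sorted({row[f] for row in X})
        for a, b in zip(vals, vals[1:]):
            thr = (a + b) / 2.0
            left = [lab for row, lab in zip(X, y) if row[f] <= thr]
            right = [lab for row, lab in zip(X, y) if row[f] > thr]
            pl, pr = majority(left), majority(right)
            err = sum(l != pl for l in left) + sum(l != pr for l in right)
            if best is None or err < best[0]:
                best = (err, (f, thr, pl, pr))
    if best is None:                     # all rows identical: constant stump
        lab = majority(y)
        return (0, float("inf"), lab, lab)
    return best[1]

def train_forest(X, y, n_trees=25, seed=0):
    """Bagging: each stump sees a bootstrap resample of the training data."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def predict(forest, row):
    votes = [pl if row[f] <= thr else pr for f, thr, pl, pr in forest]
    return majority(votes)

# Made-up descriptor rows (e.g. a hydrophobicity-like and a size-like value).
X = [[0.2, 1.0], [0.3, 3.0], [0.1, 2.0], [0.9, 1.5], [0.8, 2.5], [0.7, 0.5]]
y = ["solvent", "solvent", "solvent", "plasticizer", "plasticizer", "plasticizer"]
forest = train_forest(X, y)
print([predict(forest, row) for row in X])
```

    A real classifier would use full decision trees, per-split feature subsampling, and thousands of descriptors, but the bootstrap-and-vote structure is the same.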

  14. Independent component model for cognitive functions of multiple subjects using [15O]H2O PET images.

    PubMed

    Park, Hae-Jeong; Kim, Jae-Jin; Youn, Tak; Lee, Dong Soo; Lee, Myung Chul; Kwon, Jun Soo

    2003-04-01

    An independent component model of multiple subjects' positron emission tomography (PET) images is proposed to explore the overall functional components involved in a task and to explain subject-specific variations of metabolic activities under altered experimental conditions, utilizing the independent component analysis (ICA) concept. As PET images represent time-compressed activities of several cognitive components, we derived a mathematical model to decompose functional components from cross-sectional images based on two fundamental hypotheses: (1) all subjects share basic functional components that are common to subjects and spatially independent of each other in relation to the given experimental task, and (2) all subjects share common functional components throughout tasks which are also spatially independent. The variations of hemodynamic activities according to subjects or tasks can be explained by the variations in the usage weights of the functional components. We investigated the plausibility of the model using serial cognitive experiments of simple object perception, object recognition, two-back working memory, and divided attention of a syntactic process. We found that the independent component model satisfactorily explained the functional components involved in the task, and we discuss the application of ICA to multiple subjects' PET images to explore the functional association of brain activations. Copyright 2003 Wiley-Liss, Inc.
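
    The generative model behind hypotheses (1) and (2) can be written compactly. As a sketch (the notation below is mine, not the paper's):

```latex
x_i(v) \;=\; \sum_{k=1}^{K} a_{ik}\, s_k(v), \qquad i = 1, \dots, N
\quad\Longleftrightarrow\quad X = A S,
```

    where \(v\) indexes voxels, the rows \(s_k\) of \(S\) are the spatially independent functional components shared across subjects (or tasks), and the weight \(a_{ik}\) is subject \(i\)'s usage of component \(k\). ICA estimates an unmixing matrix \(W\) with \(S \approx W X\) by maximizing the spatial independence of the recovered components; subject- or task-specific variation is then carried entirely by the weights \(a_{ik}\).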

  15. Spectrum analysis of radar life signal in the three kinds of theoretical models

    NASA Astrophysics Data System (ADS)

    Yang, X. F.; Ma, J. F.; Wang, D.

    2017-02-01

    In a single-frequency continuous-wave radar life detection system based on the Doppler effect, the theoretical model of the radar life signal is usually expressed as a real function, and there is a phenomenon that cannot be confirmed by experiment. When the phase generated by the distance between the measured object and the radar measuring head is an integer multiple of π, the main frequency spectrum of the life signal (respiration and heartbeat) is absent from the radar life signal. If this phase is an odd multiple of π/2, the main frequency spectrum of the breathing and heartbeat frequencies is strongest. In this paper, we take the Doppler effect as the basic theory and use three different mathematical expressions, a real function, a complex exponential function, and a Bessel function expansion, to establish theoretical models of the radar life signal. Simulation analysis reveals that the Bessel expansion model solves the problem of the real-function form. Compared with the theoretical model of the complex exponential function, the derived spectral lines are greatly reduced in the Bessel expansion model, which is more consistent with the actual situation.
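
    The phase dependence described above follows directly from the Jacobi-Anger expansion. As a sketch (notation mine), assume a single sinusoidal chest displacement m(t) = b sin(ω_b t), carrier wavelength λ, and a fixed-range phase φ_0 = 4πd_0/λ, so the received signal and its harmonic expansion are:

```latex
s(t) = \cos\bigl(\varphi_0 + \beta \sin \omega_b t\bigr), \qquad \beta = \frac{4\pi b}{\lambda},
```

```latex
s(t) = \cos\varphi_0 \Bigl[J_0(\beta) + 2\sum_{k=1}^{\infty} J_{2k}(\beta)\cos(2k\omega_b t)\Bigr]
     \;-\; \sin\varphi_0 \,\Bigl[2\sum_{k=0}^{\infty} J_{2k+1}(\beta)\sin\bigl((2k+1)\omega_b t\bigr)\Bigr].
```

    When \(\varphi_0 = n\pi\), the factor \(\sin\varphi_0\) vanishes and with it all odd harmonics, including the fundamental line at the respiration (or heartbeat) frequency \(\omega_b\); when \(\varphi_0\) is an odd multiple of \(\pi/2\), the odd harmonics are maximal. This is precisely the null/maximum behavior the abstract attributes to the real-function model.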

  16. Estimation of parameters of constant elasticity of substitution production functional model

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi

    2017-11-01

    Nonlinear model building has become an increasingly important and powerful tool in mathematical economics. In recent years the popularity of applications of nonlinear models has risen dramatically. Several researchers in econometrics are very often interested in the inferential aspects of nonlinear regression models [6]. The present research study gives a distinct method of estimation for a more complicated and highly nonlinear model, viz. the Constant Elasticity of Substitution (CES) production functional model. Henningsen et al. [5] proposed three solutions in 2012 to avoid serious problems when estimating CES functions: i) removing discontinuities by using the limits of the CES function and its derivatives; ii) circumventing large rounding errors by local linear approximations; iii) handling ill-behaved objective functions by a multi-dimensional grid search. Joel Chongeh et al. [7] discussed the estimation of the impact of capital and labour inputs on the gross output of agri-food products using the constant elasticity of substitution production function in the Tanzanian context. Pol Antras [8] presented new estimates of the elasticity of substitution between capital and labour using data from the private sector of the U.S. economy for the period 1948-1998.
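
    For reference, the two-input CES production function being estimated has the standard form (the notation is the common econometric convention; the abstract itself fixes none):

```latex
Q \;=\; A\,\bigl[\delta K^{-\rho} + (1-\delta)\,L^{-\rho}\bigr]^{-\nu/\rho},
\qquad \sigma = \frac{1}{1+\rho},
```

    where \(A > 0\) is an efficiency parameter, \(\delta \in (0,1)\) the distribution parameter, \(\nu\) the returns-to-scale parameter, and \(\sigma\) the constant elasticity of substitution between capital \(K\) and labour \(L\). The model is intrinsically nonlinear in \(\rho\), and at \(\rho = 0\) the bracket degenerates (the limit is the Cobb-Douglas form), which is exactly the discontinuity and rounding-error trouble that the limit-based and grid-search remedies cited above address.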

  17. Two-Dimensional Magnetotelluric Modelling of Ore Deposits: Improvements in Model Constraints by Inclusion of Borehole Measurements

    NASA Astrophysics Data System (ADS)

    Kalscheuer, Thomas; Juhojuntti, Niklas; Vaittinen, Katri

    2017-12-01

    A combination of magnetotelluric (MT) measurements on the surface and in boreholes (without metal casing) can be expected to enhance resolution and reduce the ambiguity in models of electrical resistivity derived from MT surface measurements alone. In order to quantify potential improvement in inversion models and to aid design of electromagnetic (EM) borehole sensors, we considered two synthetic 2D models containing ore bodies down to 3000 m depth (the first with two dipping conductors in resistive crystalline host rock and the second with three mineralisation zones in a sedimentary succession exhibiting only moderate resistivity contrasts). We computed 2D inversion models from the forward responses based on combinations of surface impedance measurements and borehole measurements such as (1) skin-effect transfer functions relating horizontal magnetic fields at depth to those on the surface, (2) vertical magnetic transfer functions relating vertical magnetic fields at depth to horizontal magnetic fields on the surface and (3) vertical electric transfer functions relating vertical electric fields at depth to horizontal magnetic fields on the surface. Whereas skin-effect transfer functions are sensitive to the resistivity of the background medium and 2D anomalies, the vertical magnetic and electric field transfer functions have the disadvantage that they are comparatively insensitive to the resistivity of the layered background medium. This insensitivity introduces convergence problems in the inversion of data from structures with strong 2D resistivity contrasts. 
Hence, we adjusted the inversion approach to a three-step procedure, where (1) an initial inversion model is computed from surface impedance measurements, (2) this inversion model from surface impedances is used as the initial model for a joint inversion of surface impedances and skin-effect transfer functions and (3) the joint inversion model derived from the surface impedances and skin-effect transfer functions is used as the initial model for the inversion of the surface impedances, skin-effect transfer functions and vertical magnetic and electric transfer functions. For both synthetic examples, the inversion models resulting from surface and borehole measurements have higher similarity to the true models than models computed exclusively from surface measurements. However, the most prominent improvements were obtained for the first example, in which a deep small-sized ore body is more easily distinguished from a shallow main ore body penetrated by a borehole and the extent of the shadow zone (a conductive artefact) underneath the main conductor is strongly reduced. Formal model error and resolution analysis demonstrated that predominantly the skin-effect transfer functions improve model resolution at depth below the sensors and at distances of ~300-1000 m laterally off a borehole, whereas the vertical electric and magnetic transfer functions improve resolution along the borehole and in its immediate vicinity. Furthermore, we studied the signal levels at depth and provided specifications of borehole magnetic and electric field sensors to be developed in a future project. Our results suggest that three-component SQUID and fluxgate magnetometers should be developed to facilitate borehole MT measurements at signal frequencies above and below 1 Hz, respectively.

  18. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

    NASA Astrophysics Data System (ADS)

    Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

    2016-12-01

    Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity of tropical forests, it is important that we account for functional diversity in order to better predict tropical forest responses to future climate changes. Several next-generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found that surprisingly few categories of size-survival functions emerge, indicating that some fundamental strategies are at play across diverse forests, constraining the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe the variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.

  19. Mirror neurons and imitation: a computationally guided review.

    PubMed

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  20. Modeling Functional Neuroanatomy for an Anatomy Information System

    PubMed Central

    Niggemann, Jörg M.; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Objective Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the “internal wiring” of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. Design The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. Measurements The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Results Internal wiring as well as functional pathways can correctly be represented and tracked. Conclusion This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems. PMID:18579841

  1. How to precisely measure the volume velocity transfer function of physical vocal tract models by external excitation

    PubMed Central

    Mainka, Alexander; Kürbis, Steffen; Birkholz, Peter

    2018-01-01

    Recently, 3D printing has been increasingly used to create physical models of the vocal tract with geometries obtained from magnetic resonance imaging. These printed models allow measuring the vocal tract transfer function, which is not reliably possible in vivo for the vocal tract of living humans. The transfer functions enable the detailed examination of the acoustic effects of specific articulatory strategies in speaking and singing, and the validation of acoustic plane-wave models for realistic vocal tract geometries in articulatory speech synthesis. To measure the acoustic transfer function of 3D-printed models, two techniques have been described: (1) excitation of the models with a broadband sound source at the glottis and measurement of the sound pressure radiated from the lips, and (2) excitation of the models with an external source in front of the lips and measurement of the sound pressure inside the models at the glottal end. The former method is more frequently used and more intuitive due to its similarity to speech production. However, the latter method avoids the intricate problem of constructing a suitable broadband glottal source and is therefore more effective. It has been shown to yield a transfer function similar, but not exactly equal, to the volume velocity transfer function between the glottis and the lips, which is usually used to characterize vocal tract acoustics. Here, we revisit this method and show, both theoretically and experimentally, how it can be extended to yield the precise volume velocity transfer function of the vocal tract. PMID:29543829

  2. Robust estimation for ordinary differential equation models.

    PubMed

    Cao, J; Wang, L; Xu, J

    2011-12-01

    Applied scientists often like to use ordinary differential equations (ODEs) to model complex dynamic processes that arise in biology, engineering, medicine, and many other areas. It is interesting but challenging to estimate ODE parameters from noisy data, especially when the data have some outliers. We propose a robust method to address this problem. The dynamic process is represented with a nonparametric function, which is a linear combination of basis functions. The nonparametric function is estimated by a robust penalized smoothing method. The penalty term is defined with the parametric ODE model, which controls the roughness of the nonparametric function and maintains the fidelity of the nonparametric function to the ODE model. The basis coefficients and ODE parameters are estimated in two nested levels of optimization. The coefficient estimates are treated as an implicit function of ODE parameters, which enables one to derive the analytic gradients for optimization using the implicit function theorem. Simulation studies show that the robust method gives satisfactory estimates for the ODE parameters from noisy data with outliers. The robust method is demonstrated by estimating a predator-prey ODE model from real ecological data. © 2011, The International Biometric Society.
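
    The paper's full machinery (nonparametric basis representation, ODE-defined roughness penalty, nested optimization) is beyond a short sketch, but the robust-loss ingredient can be illustrated on an ODE with a known closed-form solution. The decay model, the Huber δ = 0.5, and the grid search below are illustrative simplifications, not the authors' implementation:

```python
import math

def huber(r, delta=0.5):
    """Huber loss: quadratic for small residuals, linear for outliers."""
    a = abs(r)
    return 0.5 * r * r if a <= delta else delta * (a - 0.5 * delta)

# The ODE y' = -theta * y with y(0) = 2 has the solution y(t) = 2*exp(-theta*t).
TRUE_THETA = 0.7
ts = [0.05 * i for i in range(60)]
data = [2.0 * math.exp(-TRUE_THETA * t) for t in ts]
data[5] += 5.0    # inject gross outliers
data[40] += 4.0

def loss(theta, robust):
    return sum(huber(2.0 * math.exp(-theta * t) - y) if robust
               else (2.0 * math.exp(-theta * t) - y) ** 2
               for t, y in zip(ts, data))

def fit(robust):
    """Crude 1-D grid search over theta in (0, 2]."""
    return min((i / 1000.0 for i in range(1, 2001)), key=lambda th: loss(th, robust))

theta_robust, theta_ls = fit(True), fit(False)
print(theta_robust, theta_ls)
```

    With the two injected outliers, the Huber fit stays close to the true θ = 0.7, while the plain least-squares fit is pulled noticeably away from it.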

  3. Examining the influence of link function misspecification in conventional regression models for developing crash modification factors.

    PubMed

    Wu, Lingtao; Lord, Dominique

    2017-05-01

    This study further examined the use of regression models for developing crash modification factors (CMFs), specifically focusing on the misspecification in the link function. The primary objectives were to validate the accuracy of CMFs derived from the commonly used regression models (i.e., generalized linear models or GLMs with additive linear link functions) when some of the variables have nonlinear relationships and quantify the amount of bias as a function of the nonlinearity. Using the concept of artificial realistic data, various linear and nonlinear crash modification functions (CM-Functions) were assumed for three variables. Crash counts were randomly generated based on these CM-Functions. CMFs were then derived from regression models for three different scenarios. The results were compared with the assumed true values. The main findings are summarized as follows: (1) when some variables have nonlinear relationships with crash risk, the CMFs for these variables derived from the commonly used GLMs are all biased, especially around areas away from the baseline conditions (e.g., boundary areas); (2) with the increase in nonlinearity (i.e., nonlinear relationship becomes stronger), the bias becomes more significant; (3) the quality of CMFs for other variables having linear relationships can be influenced when mixed with those having nonlinear relationships, but the accuracy may still be acceptable; and (4) the misuse of the link function for one or more variables can also lead to biased estimates for other parameters. This study raised the importance of the link function when using regression models for developing CMFs. Copyright © 2017 Elsevier Ltd. All rights reserved.
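
    The boundary-bias mechanism is easy to reproduce deterministically. In this toy version (not the paper's artificial realistic data), the true mean is log-quadratic in x, the misspecified model is log-linear, and the fitted CMF is accurate near the baseline but badly biased at the boundary; all numbers are illustrative:

```python
import math

xs = list(range(11))                       # covariate grid 0..10
log_mu_true = [0.05 * x * x for x in xs]   # true relationship: log-quadratic in x

# Least-squares fit of the misspecified log-LINEAR model log(mu) = b0 + b1*x.
n = len(xs)
xbar = sum(xs) / n
ybar = sum(log_mu_true) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, log_mu_true)) \
     / sum((x - xbar) ** 2 for x in xs)
b0 = ybar - b1 * xbar

BASE = 5  # baseline condition for the crash modification factor

def cmf_true(x):
    return math.exp(0.05 * x * x - 0.05 * BASE * BASE)

def cmf_fit(x):
    return math.exp(b1 * (x - BASE))

for x in (5, 6, 10):
    print(x, round(cmf_true(x), 2), round(cmf_fit(x), 2))
```

    Near the baseline (x = 6) the fitted and true CMFs agree to within a few percent, but at the boundary (x = 10) the fitted CMF understates the true one by a factor of about 3.5.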

  4. Evaluating the Functionality of Conceptual Models

    NASA Astrophysics Data System (ADS)

    Mehmood, Kashif; Cherfi, Samira Si-Said

Conceptual models serve as the blueprints of information systems, and their quality plays a decisive role in the success of the end system. It has been observed that the majority of IS change requests result from deficient functionalities in the information systems. Therefore, a good analysis and design method should ensure that conceptual models are functionally correct and complete, as they are the communicating medium between the users and the development team. A conceptual model is said to be functionally complete if it represents all the relevant features of the application domain and covers all the specified requirements. Our approach evaluates the functional aspects on multiple levels of granularity, in addition to providing corrective actions or transformations for improvement. This approach has been empirically validated by practitioners through a survey.

  5. Simulation-based Bayesian inference for latent traits of item response models: Introduction to the ltbayes package for R.

    PubMed

    Johnson, Timothy R; Kuhn, Kristine M

    2015-12-01

    This paper introduces the ltbayes package for R. This package includes a suite of functions for investigating the posterior distribution of latent traits of item response models. These include functions for simulating realizations from the posterior distribution, profiling the posterior density or likelihood function, calculation of posterior modes or means, Fisher information functions and observed information, and profile likelihood confidence intervals. Inferences can be based on individual response patterns or sets of response patterns such as sum scores. Functions are included for several common binary and polytomous item response models, but the package can also be used with user-specified models. This paper introduces some background and motivation for the package, and includes several detailed examples of its use.
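As a rough analogue of what such functions compute (sketched here in Python rather than R, with hypothetical item parameters and response pattern, and a grid approximation rather than the package's simulation machinery), the posterior of a latent trait under a 2PL model can be profiled as follows:

```python
import numpy as np

# Hypothetical 2PL item parameters and one observed binary response pattern
a = np.array([1.0, 1.2, 0.8, 1.5, 1.1])    # discriminations
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # difficulties
u = np.array([1, 1, 0, 1, 0])              # responses (1 = correct)

theta = np.linspace(-4.0, 4.0, 801)        # grid over the latent trait
dt = theta[1] - theta[0]

# 2PL success probabilities P(u_j = 1 | theta) at every grid point
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
like = np.prod(p**u * (1 - p)**(1 - u), axis=1)  # likelihood of the pattern
post = like * np.exp(-0.5 * theta**2)            # times standard-normal prior
post /= post.sum() * dt                          # normalize the density

eap = (theta * post).sum() * dt   # posterior mean (EAP estimate)
map_ = theta[np.argmax(post)]     # posterior mode (MAP estimate)
print(round(eap, 2), round(map_, 2))
```

The package additionally supports polytomous models, sum-score-based inference, information functions, and profile likelihood intervals, none of which this sketch attempts.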

  6. KECSA-Movable Type Implicit Solvation Model (KMTISM)

    PubMed Central

    2015-01-01

    Computation of the solvation free energy for chemical and biological processes has long been of significant interest. The key challenges to effective solvation modeling center on the choice of potential function and configurational sampling. Herein, an energy sampling approach termed the “Movable Type” (MT) method, and a statistical energy function for solvation modeling, “Knowledge-based and Empirical Combined Scoring Algorithm” (KECSA) are developed and utilized to create an implicit solvation model: KECSA-Movable Type Implicit Solvation Model (KMTISM) suitable for the study of chemical and biological systems. KMTISM is an implicit solvation model, but the MT method performs energy sampling at the atom pairwise level. For a specific molecular system, the MT method collects energies from prebuilt databases for the requisite atom pairs at all relevant distance ranges, which by its very construction encodes all possible molecular configurations simultaneously. Unlike traditional statistical energy functions, KECSA converts structural statistical information into categorized atom pairwise interaction energies as a function of the radial distance instead of a mean force energy function. Within the implicit solvent model approximation, aqueous solvation free energies are then obtained from the NVT ensemble partition function generated by the MT method. Validation is performed against several subsets selected from the Minnesota Solvation Database v2012. Results are compared with several solvation free energy calculation methods, including a one-to-one comparison against two commonly used classical implicit solvation models: MM-GBSA and MM-PBSA. Comparison against a quantum mechanics based polarizable continuum model is also discussed (Cramer and Truhlar’s Solvation Model 12). PMID:25691832

  7. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    NASA Technical Reports Server (NTRS)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  8. Modeling Rabbit Responses to Single and Multiple Aerosol ...

    EPA Pesticide Factsheets

    Journal Article Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev
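The structure of the baseline model named above (an exponential dose-response model combined with a Weibull time-to-death distribution) can be written out directly. The sketch below uses illustrative parameter values, not the paper's fitted estimates:

```python
import numpy as np

# Exponential dose-response: P(death | dose) = 1 - exp(-k * dose)
# Weibull time-to-death CDF:  F(t) = 1 - exp(-(t / scale)**shape)
# Parameter values are illustrative only, not the paper's fitted estimates.
k, shape, scale = 1e-6, 2.0, 4.0  # k per spore; TTD scale in days

def p_death(dose):
    # Ultimate probability of death at a given inhaled spore dose
    return 1.0 - np.exp(-k * dose)

def p_death_by(dose, t):
    # Joint probability of dying AND doing so within t days post-exposure
    return p_death(dose) * (1.0 - np.exp(-(t / scale) ** shape))

dose = 2e6  # spores
print(round(p_death(dose), 3))          # lethality at this dose
print(round(p_death_by(dose, 4.0), 3))  # probability of death within 4 days
```

The paper's alternative models replace the dose-response term or let earlier doses modify the hazard of subsequent ones; this sketch shows only the single-dose baseline form.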

  9. Convex reformulation of biologically-based multi-criteria intensity-modulated radiation therapy optimization including fractionation effects

    NASA Astrophysics Data System (ADS)

    Hoffmann, Aswin L.; den Hertog, Dick; Siem, Alex Y. D.; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2008-11-01

    Finding fluence maps for intensity-modulated radiation therapy (IMRT) can be formulated as a multi-criteria optimization problem for which Pareto optimal treatment plans exist. To account for the dose-per-fraction effect of fractionated IMRT, it is desirable to exploit radiobiological treatment plan evaluation criteria based on the linear-quadratic (LQ) cell survival model as a means to balance the radiation benefits and risks in terms of biologic response. Unfortunately, the LQ-model-based radiobiological criteria are nonconvex functions, which make the optimization problem hard to solve. We apply the framework proposed by Romeijn et al (2004 Phys. Med. Biol. 49 1991-2013) to find transformations of LQ-model-based radiobiological functions and establish conditions under which transformed functions result in equivalent convex criteria that do not change the set of Pareto optimal treatment plans. The functions analysed are: the LQ-Poisson-based model for tumour control probability (TCP) with and without inter-patient heterogeneity in radiation sensitivity, the LQ-Poisson-based relative seriality s-model for normal tissue complication probability (NTCP), the equivalent uniform dose (EUD) under the LQ-Poisson model and the fractionation-corrected Probit-based model for NTCP according to Lyman, Kutcher and Burman. These functions differ from those analysed before in that they cannot be decomposed into elementary EUD or generalized-EUD functions. In addition, we show that applying increasing and concave transformations to the convexified functions is beneficial for the piecewise approximation of the Pareto efficient frontier.
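For reference, the textbook forms behind the radiobiological criteria named above are as follows (a sketch in standard notation, not the paper's exact parameterization):

```latex
% LQ cell survival after n fractions of dose d (total dose D = nd):
S(D) = \exp\!\left[-n\left(\alpha d + \beta d^2\right)\right]
     = \exp\!\left[-\alpha D \left(1 + \frac{d}{\alpha/\beta}\right)\right]

% LQ-Poisson tumour control probability for N_0 clonogenic cells:
\mathrm{TCP} = \exp\!\left[-N_0 \, S(D)\right]
```

The nonconvexity discussed in the abstract arises because compositions such as these are neither convex nor concave in the fluence variables in general, which is what motivates the search for convexifying transformations.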

  10. Age at exposure and attained age variations of cancer risk in the Japanese A-bomb and radiotherapy cohorts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Uwe, E-mail: uwe.schneider@uzh.ch; Walsh, Linda

Purpose: Phenomenological risk models for radiation-induced cancer are frequently applied to estimate the risk of radiation-induced cancers at radiotherapy doses. Such models often include the effect modification, of the main risk to radiation dose response, by age at exposure and attained age. The aim of this paper is to compare the patterns in risk effect modification by age, between models obtained from the Japanese atomic-bomb (A-bomb) survivor data and models for cancer risks previously reported for radiotherapy patients. Patterns in risk effect modification by age from the epidemiological studies of radiotherapy patients were also used to refine and extend the risk effect modification by age obtained from the A-bomb survivor data, so that more universal models can be presented here. Methods: Simple log-linear and power functions of age for the risk effect modification applied in models of the A-bomb survivor data are compared to risks from epidemiological studies of second cancers after radiotherapy. These functions of age were also refined and fitted to radiotherapy risks. The resulting age models provide a refined and extended functional dependence of risk with age at exposure and attained age especially beyond 40 and 65 yr, respectively, and provide a better representation than the currently available simple age functions. Results: It was found that the A-bomb models predict risk similarly to the outcomes of testicular cancer survivors. The survivors of Hodgkin’s disease show steeper variations of risk with both age at exposure and attained age. The extended models predict solid cancer risk increase as a function of age at exposure beyond 40 yr and the risk decrease as a function of attained age beyond 65 yr better than the simple models. 
Conclusions: The standard functions for risk effect modification by age, based on the A-bomb survivor data, predict second cancer risk in radiotherapy patients for ages at exposure prior to 40 yr and attained ages before 55 yr reasonably well. However, for larger ages, the refined and extended models can be applied to predict the risk as a function of age.

  11. Age at exposure and attained age variations of cancer risk in the Japanese A-bomb and radiotherapy cohorts.

    PubMed

    Schneider, Uwe; Walsh, Linda

    2015-08-01

    Phenomenological risk models for radiation-induced cancer are frequently applied to estimate the risk of radiation-induced cancers at radiotherapy doses. Such models often include the effect modification, of the main risk to radiation dose response, by age at exposure and attained age. The aim of this paper is to compare the patterns in risk effect modification by age, between models obtained from the Japanese atomic-bomb (A-bomb) survivor data and models for cancer risks previously reported for radiotherapy patients. Patterns in risk effect modification by age from the epidemiological studies of radiotherapy patients were also used to refine and extend the risk effect modification by age obtained from the A-bomb survivor data, so that more universal models can be presented here. Simple log-linear and power functions of age for the risk effect modification applied in models of the A-bomb survivor data are compared to risks from epidemiological studies of second cancers after radiotherapy. These functions of age were also refined and fitted to radiotherapy risks. The resulting age models provide a refined and extended functional dependence of risk with age at exposure and attained age especially beyond 40 and 65 yr, respectively, and provide a better representation than the currently available simple age functions. It was found that the A-bomb models predict risk similarly to the outcomes of testicular cancer survivors. The survivors of Hodgkin's disease show steeper variations of risk with both age at exposure and attained age. The extended models predict solid cancer risk increase as a function of age at exposure beyond 40 yr and the risk decrease as a function of attained age beyond 65 yr better than the simple models. The standard functions for risk effect modification by age, based on the A-bomb survivor data, predict second cancer risk in radiotherapy patients for ages at exposure prior to 40 yr and attained ages before 55 yr reasonably well. 
However, for larger ages, the refined and extended models can be applied to predict the risk as a function of age.
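The "simple log-linear and power functions of age" mentioned in both records of this paper typically enter the excess relative risk in a form like the following (a generic sketch in the style of BEIR VII-type models, not the authors' fitted parameterization):

```latex
% Excess relative risk at dose D, age at exposure e, attained age a,
% with a log-linear age-at-exposure modifier and a power-law
% attained-age modifier (reference ages 30 and 70 yr are conventional):
\mathrm{ERR}(D, e, a) = \beta D \,
    \exp\!\bigl[\gamma \,(e - 30)\bigr]
    \left(\frac{a}{70}\right)^{\eta}
```

The refinement described in the abstract amounts to replacing these simple modifier functions with forms that behave better beyond 40 yr at exposure and 65 yr attained age.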

  12. Neural modeling and functional neuroimaging.

    PubMed

    Horwitz, B; Sporns, O

    1994-01-01

Two research areas that so far have had little interaction with one another are functional neuroimaging and computational neuroscience. The application of computational models and techniques to the inherently rich data sets generated by "standard" neurophysiological methods has proven useful for interpreting these data sets and for providing predictions and hypotheses for further experiments. We suggest that both theory- and data-driven computational modeling of neuronal systems can help to interpret data generated by functional neuroimaging methods, especially those used with human subjects. In this article, we point out four sets of questions, addressable by computational neuroscientists, whose answers would be of value and interest to those who perform functional neuroimaging. The first set consists of determining the neurobiological substrate of the signals measured by functional neuroimaging. The second set concerns developing systems-level models of functional neuroimaging data. The third set of questions involves integrating functional neuroimaging data across modalities, with a particular emphasis on relating electromagnetic with hemodynamic data. The last set asks how one can relate systems-level models to those at the neuronal and neural ensemble levels. We feel that there are ample reasons to link functional neuroimaging and neural modeling, and that combining the results from the two disciplines will further our understanding of the central nervous system. © 1994 Wiley-Liss, Inc. This article is a US Government work and, as such, is in the public domain in the United States of America.

  13. Percutaneous Transcatheter One-Step Mechanical Aortic Disc Valve Prosthesis Implantation: A Preliminary Feasibility Study in Swine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sochman, Jan; Peregrin, Jan H.; Rocek, Miloslav

Purpose. To evaluate the feasibility of one-step implantation of a new type of stent-based mechanical aortic disc valve prosthesis (MADVP) above and across the native aortic valve, and its short-term function in swine with both functional and dysfunctional native valves. Methods. The MADVP consisted of a folding disc valve made of silicone elastomer attached to either a nitinol Z-stent (Z model) or a nitinol cross-braided stent (SX model). Implantation of 10 MADVPs (6 Z and 4 SX models) was attempted in 10 swine: 4 (2 Z and 2 SX models) with a functional native valve and 6 (4 Z and 2 SX models) with aortic regurgitation induced either by intentional valve injury or by MADVP placement across the native valve. MADVP function was observed for up to 3 hr after implantation. Results. MADVP implantation was successful in 9 swine. One animal died of induced massive regurgitation prior to implantation. Four MADVPs implanted above functioning native valves exhibited good function. In 5 swine with regurgitation, MADVP implantation corrected the induced native valve dysfunction and the device's continuous good function was observed in 4 animals. One MADVP (SX model) placed across the native valve gradually migrated into the left ventricle. Conclusion. The tested MADVP can be implanted above and across the native valve in a one-step procedure and can replace the function of a regurgitating native valve. Further technical development and testing are warranted, preferably with a manufactured MADVP.

  14. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.

  15. A Functional Varying-Coefficient Single-Index Model for Functional Response Data

    PubMed Central

    Li, Jialiang; Huang, Chao; Zhu, Hongtu

    2016-01-01

    Motivated by the analysis of imaging data, we propose a novel functional varying-coefficient single index model (FVCSIM) to carry out the regression analysis of functional response data on a set of covariates of interest. FVCSIM represents a new extension of varying-coefficient single index models for scalar responses collected from cross-sectional and longitudinal studies. An efficient estimation procedure is developed to iteratively estimate varying coefficient functions, link functions, index parameter vectors, and the covariance function of individual functions. We systematically examine the asymptotic properties of all estimators including the weak convergence of the estimated varying coefficient functions, the asymptotic distribution of the estimated index parameter vectors, and the uniform convergence rate of the estimated covariance function and their spectrum. Simulation studies are carried out to assess the finite-sample performance of the proposed procedure. We apply FVCSIM to investigating the development of white matter diffusivities along the corpus callosum skeleton obtained from Alzheimer’s Disease Neuroimaging Initiative (ADNI) study. PMID:29200540

  16. A Functional Varying-Coefficient Single-Index Model for Functional Response Data.

    PubMed

    Li, Jialiang; Huang, Chao; Zhu, Hongtu

    2017-01-01

    Motivated by the analysis of imaging data, we propose a novel functional varying-coefficient single index model (FVCSIM) to carry out the regression analysis of functional response data on a set of covariates of interest. FVCSIM represents a new extension of varying-coefficient single index models for scalar responses collected from cross-sectional and longitudinal studies. An efficient estimation procedure is developed to iteratively estimate varying coefficient functions, link functions, index parameter vectors, and the covariance function of individual functions. We systematically examine the asymptotic properties of all estimators including the weak convergence of the estimated varying coefficient functions, the asymptotic distribution of the estimated index parameter vectors, and the uniform convergence rate of the estimated covariance function and their spectrum. Simulation studies are carried out to assess the finite-sample performance of the proposed procedure. We apply FVCSIM to investigating the development of white matter diffusivities along the corpus callosum skeleton obtained from Alzheimer's Disease Neuroimaging Initiative (ADNI) study.

  17. QMEANclust: estimation of protein model quality by combining a composite scoring function with structural density information.

    PubMed

    Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce

    2009-05-20

The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction, in both template-based and ab initio approaches. Scoring functions have been developed which can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so-called consensus methods have been shown to perform considerably better in selecting good candidate models, but they tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function, called QMEANclust, achieves a correlation coefficient between predicted quality score and GDT_TS of 0.9 averaged over the 98 CASP7 targets, and performs significantly better in selecting good models from the ensemble of server models than any other group participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set, consisting of 20 target proteins each with 300 alternative models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations. 
We also present a local version of QMEAN for the per-residue estimation of model quality (QMEANlocal) and compare it to a new local consensus-based approach. Improved model selection is obtained by using a composite scoring function operating on single models in order to enrich higher quality models which are subsequently used to calculate the structural consensus. The performance of consensus-based methods such as QMEANclust highly depends on the composition and quality of the model ensemble to be analysed. Therefore, performance estimates for consensus methods based on large meta-datasets (e.g. CASP) might overrate their applicability in more realistic modelling situations with smaller sets of models based on individual methods.

  18. Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables

    DTIC Science & Technology

    2013-06-01

18th ICCRTS. Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables. ... command in crisis management. Agility can be conceptualized at a number of different levels; for instance at the team ...

  19. Radial basis function and its application in tourism management

    NASA Astrophysics Data System (ADS)

    Hu, Shan-Feng; Zhu, Hong-Bin; Zhao, Lei

    2018-05-01

In this work, several applications and the performance of the radial basis function (RBF) are briefly reviewed first. After that, the binomial function combined with three different RBFs, including the multiquadric (MQ), inverse quadric (IQ) and inverse multiquadric (IMQ) distributions, is adopted to model the tourism data of Huangshan in China. Simulation results show that all the models match the sample data very well. Among the three models, the IMQ-RBF model is found to be more suitable for forecasting the tourist flow.
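The two kernels named in this record are simple to state, and a one-dimensional RBF interpolant takes only a few lines. The sketch below uses made-up monthly visitor numbers (not the Huangshan data) and an assumed shape parameter:

```python
import numpy as np

# Hypothetical monthly tourist-flow samples (illustrative numbers only,
# not the Huangshan data used in the paper)
x = np.array([1., 2., 3., 4., 5., 6., 7., 8.])       # month index
y = np.array([12., 15., 30., 55., 80., 95., 70., 40.])  # visitors (1000s)

c = 1.0  # shape parameter (assumed; the paper would tune this)

def mq(r):   return np.sqrt(r**2 + c**2)        # multiquadric kernel
def imq(r):  return 1.0 / np.sqrt(r**2 + c**2)  # inverse multiquadric kernel

def rbf_fit(phi, x, y):
    # Solve the interpolation system A w = y with A_ij = phi(|x_i - x_j|)
    A = phi(np.abs(x[:, None] - x[None, :]))
    return np.linalg.solve(A, y)

def rbf_eval(phi, w, x, xq):
    # Evaluate the fitted interpolant at query points xq
    return phi(np.abs(xq[:, None] - x[None, :])) @ w

w = rbf_fit(imq, x, y)
xq = np.array([4.5])
print(float(rbf_eval(imq, w, x, xq)))  # interpolated flow between months 4 and 5
```

By construction the interpolant reproduces the sample values exactly at the nodes, which matches the abstract's observation that all three RBF models fit the sample data closely; forecasting quality at unseen points is what separates them.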

  20. Tree-Based Global Model Tests for Polytomous Rasch Models

    ERIC Educational Resources Information Center

    Komboz, Basil; Strobl, Carolin; Zeileis, Achim

    2018-01-01

    Psychometric measurement models are only valid if measurement invariance holds between test takers of different groups. Global model tests, such as the well-established likelihood ratio (LR) test, are sensitive to violations of measurement invariance, such as differential item functioning and differential step functioning. However, these…

  1. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

Mathematical models are useful in various fields of science and engineering. However, it is a challenge to let a model take advantage of the open and growing set of functions (e.g., for model inversion) available on the R platform, because doing so ordinarily requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that converts a model developed in any computer language into an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.

  2. State-space model with deep learning for functional dynamics estimation in resting-state fMRI.

    PubMed

    Suk, Heung-Il; Wee, Chong-Yaw; Lee, Seong-Whan; Shen, Dinggang

    2016-04-01

    Studies on resting-state functional Magnetic Resonance Imaging (rs-fMRI) have shown that different brain regions still actively interact with each other while a subject is at rest, and such functional interaction is not stationary but changes over time. In terms of a large-scale brain network, in this paper, we focus on time-varying patterns of functional networks, i.e., functional dynamics, inherent in rs-fMRI, which is one of the emerging issues along with the network modelling. Specifically, we propose a novel methodological architecture that combines deep learning and state-space modelling, and apply it to rs-fMRI based Mild Cognitive Impairment (MCI) diagnosis. We first devise a Deep Auto-Encoder (DAE) to discover hierarchical non-linear functional relations among regions, by which we transform the regional features into an embedding space, whose bases are complex functional networks. Given the embedded functional features, we then use a Hidden Markov Model (HMM) to estimate dynamic characteristics of functional networks inherent in rs-fMRI via internal states, which are unobservable but can be inferred from observations statistically. By building a generative model with an HMM, we estimate the likelihood of the input features of rs-fMRI as belonging to the corresponding status, i.e., MCI or normal healthy control, based on which we identify the clinical label of a testing subject. In order to validate the effectiveness of the proposed method, we performed experiments on two different datasets and compared with state-of-the-art methods in the literature. We also analyzed the functional networks learned by DAE, estimated the functional connectivities by decoding hidden states in HMM, and investigated the estimated functional connectivities by means of a graph-theoretic approach. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. State-space model with deep learning for functional dynamics estimation in resting-state fMRI

    PubMed Central

    Suk, Heung-Il; Wee, Chong-Yaw; Lee, Seong-Whan; Shen, Dinggang

    2017-01-01

    Studies on resting-state functional Magnetic Resonance Imaging (rs-fMRI) have shown that different brain regions still actively interact with each other while a subject is at rest, and such functional interaction is not stationary but changes over time. In terms of a large-scale brain network, in this paper, we focus on time-varying patterns of functional networks, i.e., functional dynamics, inherent in rs-fMRI, which is one of the emerging issues along with the network modelling. Specifically, we propose a novel methodological architecture that combines deep learning and state-space modelling, and apply it to rs-fMRI based Mild Cognitive Impairment (MCI) diagnosis. We first devise a Deep Auto-Encoder (DAE) to discover hierarchical non-linear functional relations among regions, by which we transform the regional features into an embedding space, whose bases are complex functional networks. Given the embedded functional features, we then use a Hidden Markov Model (HMM) to estimate dynamic characteristics of functional networks inherent in rs-fMRI via internal states, which are unobservable but can be inferred from observations statistically. By building a generative model with an HMM, we estimate the likelihood of the input features of rs-fMRI as belonging to the corresponding status, i.e., MCI or normal healthy control, based on which we identify the clinical label of a testing subject. In order to validate the effectiveness of the proposed method, we performed experiments on two different datasets and compared with state-of-the-art methods in the literature. We also analyzed the functional networks learned by DAE, estimated the functional connectivities by decoding hidden states in HMM, and investigated the estimated functional connectivities by means of a graph-theoretic approach. PMID:26774612

  4. Methodology to Support Dynamic Function Allocation Policies Between Humans and Flight Deck Automation

    NASA Technical Reports Server (NTRS)

    Johnson, Eric N.

    2012-01-01

Function allocation assigns work functions to all agents in a team, both human and automation. Efforts to guide function allocation systematically have been studied in many fields, such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary aspects of function allocation. Four distinctive perspectives have emerged from a comprehensive review of the literature in those fields: the technology-centered, human-centered, team-oriented, and work-oriented perspectives. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), structure and strategy of a team, and work structure and environment. This report offers eight issues with function allocation that can be used to assess the extent to which each issue is present in a given function allocation. A modeling framework using formal models and simulation was developed to model work as described by the environment, agents, their inherent dynamics, and the relationships among them. Finally, to validate the framework and metrics, a case study modeled four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight.

  5. Further support for the role of dysfunctional attitudes in models of real-world functioning in schizophrenia.

    PubMed

    Horan, William P; Rassovsky, Yuri; Kern, Robert S; Lee, Junghee; Wynn, Jonathan K; Green, Michael F

    2010-06-01

    According to A.T. Beck and colleagues' cognitive formulation of poor functioning in schizophrenia, maladaptive cognitive appraisals play a key role in the expression and persistence of negative symptoms and associated real-world functioning deficits. They provided initial support for this model by showing that dysfunctional attitudes are elevated in schizophrenia and account for significant variance in negative symptoms and subjective quality of life. The current study used structural equation modeling to further evaluate the contribution of dysfunctional attitudes to outcome in schizophrenia. One hundred eleven outpatients and 67 healthy controls completed a Dysfunctional Attitudes Scale, and patients completed a competence measure of functional capacity, clinical ratings of negative symptoms, and interview-based ratings of real-world functioning. Patients reported higher defeatist performance beliefs than controls, and these were significantly related to lower functional capacity, higher negative symptoms, and worse community functioning. Consistent with Beck and colleagues' formulation, modeling analyses indicated a significant indirect pathway from functional capacity → dysfunctional attitudes → negative symptoms → real-world functioning. These findings support the value of dysfunctional attitudes for understanding the determinants of outcome in schizophrenia and suggest that therapeutic interventions targeting these attitudes may facilitate functional recovery. (c) 2009 Elsevier Ltd. All rights reserved.

  6. Renal Function Descriptors in Neonates: Which Creatinine-Based Formula Best Describes Vancomycin Clearance?

    PubMed

    Bhongsatiern, Jiraganya; Stockmann, Chris; Yu, Tian; Constance, Jonathan E; Moorthy, Ganesh; Spigarelli, Michael G; Desai, Pankaj B; Sherwin, Catherine M T

    2016-05-01

    Growth and maturational changes have been identified as significant covariates in describing variability in clearance of renally excreted drugs such as vancomycin. Because of immaturity of clearance mechanisms, quantification of renal function in neonates is of importance. Several serum creatinine (SCr)-based renal function descriptors have been developed in adults and children, but none are selectively derived for neonates. This review summarizes development of the neonatal kidney and discusses assessment of the renal function regarding estimation of glomerular filtration rate using renal function descriptors. Furthermore, identification of the renal function descriptors that best describe the variability of vancomycin clearance was performed in a sample study of a septic neonatal cohort. Population pharmacokinetic models were developed applying a combination of age-weight, renal function descriptors, or SCr alone. In addition to age and weight, SCr or renal function descriptors significantly reduced variability of vancomycin clearance. The population pharmacokinetic models with Léger and modified Schwartz formulas were selected as the optimal final models, although the other renal function descriptors and SCr provided reasonably good fit to the data, suggesting further evaluation of the final models using external data sets and cross validation. The present study supports incorporation of renal function descriptors in the estimation of vancomycin clearance in neonates. © 2015, The American College of Clinical Pharmacology.
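Of the creatinine-based descriptors discussed, the modified ("bedside") Schwartz formula has a simple closed form, eGFR = 0.413 × height / SCr. A minimal sketch; the coefficient 0.413 is the standard published value, and the example inputs are illustrative rather than taken from the study cohort:

```python
def egfr_modified_schwartz(height_cm: float, scr_mg_dl: float) -> float:
    """Bedside (modified) Schwartz estimate of GFR in mL/min/1.73 m^2.

    eGFR = 0.413 * height / SCr, with height in cm and serum
    creatinine in mg/dL.
    """
    return 0.413 * height_cm / scr_mg_dl

# A 50 cm neonate with a serum creatinine of 0.8 mg/dL:
print(round(egfr_modified_schwartz(50, 0.8), 1))  # ~25.8
```

In a population pharmacokinetic model such a descriptor would enter as a covariate on clearance, alongside age and weight, as the abstract describes.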

  7. Employee subjective well-being and physiological functioning: An integrative model.

    PubMed

    Kuykendall, Lauren; Tay, Louis

    2015-01-01

    Research shows that worker subjective well-being influences physiological functioning, an early signal of poor health outcomes. While several theoretical perspectives provide insights on this relationship, the literature lacks an integrative framework explaining it. We develop a conceptual model explaining the link between subjective well-being and physiological functioning in the context of work. Integrating positive psychology and occupational stress perspectives, our model explains the relationship between subjective well-being and physiological functioning as a result of the direct influence of subjective well-being on physiological functioning and of their common relationships with work stress and personal resources, both of which are influenced by job conditions.

  8. Indicators of ecosystem function identify alternate states in the sagebrush steppe.

    PubMed

    Kachergis, Emily; Rocca, Monique E; Fernandez-Gimenez, Maria E

    2011-10-01

    Models of ecosystem change that incorporate nonlinear dynamics and thresholds, such as state-and-transition models (STMs), are increasingly popular tools for land management decision-making. However, few models are based on systematic collection and documentation of ecological data, and of these, most rely solely on structural indicators (species composition) to identify states and transitions. As STMs are adopted as an assessment framework throughout the United States, finding effective and efficient ways to create data-driven models that integrate ecosystem function and structure is vital. This study aims to (1) evaluate the utility of functional indicators (indicators of rangeland health, IRH) as proxies for more difficult ecosystem function measurements and (2) create a data-driven STM for the sagebrush steppe of Colorado, USA, that incorporates both ecosystem structure and function. We sampled soils, plant communities, and IRH at 41 plots with similar clayey soils but different site histories to identify potential states and infer the effects of management practices and disturbances on transitions. We found that many IRH were correlated with quantitative measures of functional indicators, suggesting that the IRH can be used to approximate ecosystem function. In addition to a reference state that functions as expected for this soil type, we identified four biotically and functionally distinct potential states, consistent with the theoretical concept of alternate states. Three potential states were related to management practices (chemical and mechanical shrub treatments and seeding history) while one was related only to ecosystem processes (erosion). IRH and potential states were also related to environmental variation (slope, soil texture), suggesting that there are environmental factors within areas with similar soils that affect ecosystem dynamics and should be noted within STMs. 
Our approach generated an objective, data-driven model of ecosystem dynamics for rangeland management. Our findings suggest that the IRH approximate ecosystem processes and can distinguish between alternate states and communities and identify transitions when building data-driven STMs. Functional indicators are a simple, efficient way to create data-driven models that are consistent with alternate state theory. Managers can use them to improve current model-building methods and thus apply state-and-transition models more broadly for land management decision-making.

  9. Enabling Cross-Discipline Collaboration Via a Functional Data Model

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.; Baltzer, T.

    2016-12-01

    Many research disciplines have very specialized data models that are used to express the detailed semantics that are meaningful to that community and easily utilized by their data analysis tools. While invaluable to members of that community, such expressive data structures and metadata are of little value to potential collaborators from other scientific disciplines. Many data interoperability efforts focus on the difficult task of computationally mapping concepts from one domain to another to facilitate discovery and use of data. Although these efforts are important and promising, we have found that a great deal of discovery and dataset understanding still happens at the level of less formal, personal communication. A significant barrier to inter-disciplinary data sharing that remains, however, is data access. Scientists and data analysts continue to spend inordinate amounts of time simply trying to get data into their analysis tools. Providing data in a standard file format is often not sufficient, since data can be structured in many ways. Adhering to more explicit community standards for data structure and metadata does little to help those in other communities. The Functional Data Model specializes the Relational Data Model (used by many database systems) by defining relations as functions between independent (domain) and dependent (codomain) variables. Given that arrays of data in many scientific data formats generally represent functionally related parameters (e.g. temperature as a function of space and time), the Functional Data Model is quite relevant for these datasets as well. The LaTiS software framework implements the Functional Data Model and provides a mechanism to expose an existing data source as a LaTiS dataset. 
LaTiS datasets can be manipulated using a Functional Algebra and output in any number of formats. LASP has successfully used the Functional Data Model and its implementation in the LaTiS software framework to bridge the gap between disparate data sources and communities. This presentation will demonstrate the utility of the Functional Data Model and how it can be used to facilitate cross-discipline collaboration.
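The core idea, treating a dataset as a function from independent to dependent variables and manipulating it with a small algebra, can be illustrated without LaTiS itself. The sketch below is a toy rendering of the concept, not the LaTiS API; the dictionary representation and the operation names are invented for illustration:

```python
# A dataset is a function from an independent variable (time) to a
# dependent variable (temperature): hour -> deg C.
dataset = {0: 11.2, 6: 13.5, 12: 17.8, 18: 14.1}

def select(f, predicate):
    """Functional-algebra style selection: restrict the domain."""
    return {x: y for x, y in f.items() if predicate(x)}

def map_range(f, g):
    """Apply g to every dependent value, preserving the domain."""
    return {x: g(y) for x, y in f.items()}

daytime = select(dataset, lambda hour: 6 <= hour <= 18)
fahrenheit = map_range(daytime, lambda c: c * 9 / 5 + 32)
```

Because operations act on the function as a whole rather than on a format-specific layout, the same algebra can be applied regardless of how the underlying data source stores its arrays, which is the interoperability point the abstract makes.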

  10. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benadjaoud, Mohamed Amine, E-mail: mohamedamine.benadjaoud@gustaveroussy.fr; Université Paris sud, Le Kremlin-Bicêtre; Institut Gustave Roussy, Villejuif

    2014-11-01

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, logistic model based on standard dosimetric parameters (LM), and logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V65Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n=0.12, m = 0.17, and TD50 = 72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated to the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. 
Conclusion: Functional data analysis provides an attractive method for flexibly estimating the dose-volume effect for normal tissues in external radiation therapy.
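The analysis pipeline, discretizing each patient's density function on a common dose grid and then extracting variation modes by functional PCA, can be sketched with a centered-data SVD. The curves below are synthetic stand-ins for differential dose-volume histograms; the grid range and bump parameters are illustrative assumptions, not the clinical data:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0, 80, 200)            # dose axis (Gy), illustrative

# Synthetic "differential DVH" curves: Gaussian bumps with varying center.
centers = rng.uniform(30, 60, size=50)
curves = np.exp(-0.5 * ((grid[None, :] - centers[:, None]) / 8.0) ** 2)
dx = grid[1] - grid[0]
curves /= (curves.sum(axis=1) * dx)[:, None]   # normalize to densities

# Functional PCA on the discretized curves: center, then SVD.
mean_curve = curves.mean(axis=0)
centered = curves - mean_curve
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * s                    # per-patient FPC scores
explained = s ** 2 / np.sum(s ** 2)
```

The leading columns of `scores` are the per-patient functional principal component scores that, in the study's design, would be fed into a logistic regression against the complication outcome.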

  11. Secure and Resilient Functional Modeling for Navy Cyber-Physical Systems

    DTIC Science & Technology

    2017-05-24

    Fragmentary milestone/status listing recovered from the report:
    - Functional Modeling Compiler (SCCT): FM Compiler and Key Performance Indicators (KPI), May 2018, pending.
    - Model Management Backbone (SCCT): MMB demonstration...implement the agent-based distributed runtime.
    - KPIs for single/multicore controllers and temporal/spatial domains.
    - Integration of the model management ...
    - Distributed Runtime (UCI): not started.
    - Model Management Backbone (SCCT): not started.
    Siemens Corporation, Corporate Technology. Unrestricted.

  12. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  13. Using a Functional Model to Develop a Mathematical Formula

    ERIC Educational Resources Information Center

    Otto, Charlotte A.; Everett, Susan A.; Luera, Gail R.

    2008-01-01

    The unifying theme of models was incorporated into a required Science Capstone course for pre-service elementary teachers based on national standards in science and mathematics. A model of a teeter-totter was selected for use as an example of a functional model for gathering data as well as a visual model of a mathematical equation for developing…

  14. Using Stocking or Harvesting to Reverse Period-Doubling Bifurcations in Discrete Population Models

    Treesearch

    James F. Selgrade

    1998-01-01

    This study considers a general class of 2-dimensional, discrete population models where each per capita transition function (fitness) depends on a linear combination of the densities of the interacting populations. The fitness functions are either monotone decreasing functions (pioneer fitnesses) or one-humped functions (climax fitnesses). Four sets of necessary...

  15. Reversing Period-Doubling Bifurcations in Models of Population Interactions Using Constant Stocking or Harvesting

    Treesearch

    James F. Selgrade; James H. Roberds

    1998-01-01

    This study considers a general class of two-dimensional, discrete population models where each per capita transition function (fitness) depends on a linear combination of the densities of the interacting populations. The fitness functions are either monotone decreasing functions (pioneer fitnesses) or one-humped functions (climax fitnesses). Conditions are derived...
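The central claim of these two studies, that a constant stocking term can reverse a period-doubling bifurcation, is easy to reproduce numerically. The sketch below uses a single-species Ricker map as a stand-in for the two-dimensional pioneer/climax models actually analyzed; r = 2.2 and s = 0.8 are illustrative values, not taken from the papers:

```python
import math

def ricker_with_stocking(x, r=2.2, s=0.0):
    """One step of a Ricker map with a constant stocking term s."""
    return x * math.exp(r * (1.0 - x)) + s

def attractor(s, n_transient=2000, n_keep=4):
    """Iterate past the transient, then record a few successive states."""
    x = 0.5
    for _ in range(n_transient):
        x = ricker_with_stocking(x, s=s)
    orbit = []
    for _ in range(n_keep):
        orbit.append(x)
        x = ricker_with_stocking(x, s=s)
    return orbit

no_stock = attractor(0.0)  # settles onto a period-2 cycle at r = 2.2
stocked = attractor(0.8)   # constant stocking restores a stable equilibrium
```

At s = 0 the orbit alternates between two values; at s = 0.8 the fixed point's multiplier falls below 1 in magnitude and the oscillation dies out, the qualitative reversal these papers establish analytically.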

  16. PROGRAPH Diagrams--A New Old System for Teaching Functional Modelling

    ERIC Educational Resources Information Center

    Siller, Hans-Stefan

    2009-01-01

    This paper shows the basic concept of Functional Modelling in mathematics education which has become more and more important in recent years. Hence it is necessary to think about suitable graphical methods to explain the fundamental idea of a function and its influence on values and other functions. PROGRAPH diagrams are a potentially good way to…

  17. Crustal Structure Beneath Taiwan Using Frequency-band Inversion of Receiver Function Waveforms

    NASA Astrophysics Data System (ADS)

    Tomfohrde, D. A.; Nowack, R. L.

    Receiver function analysis is used to determine local crustal structure beneath Taiwan. We have performed preliminary data processing and polarization analysis for the selection of stations and events and to increase overall data quality. Receiver function analysis is then applied to data from the Taiwan Seismic Network to obtain radial and transverse receiver functions. Due to the limited azimuthal coverage, only the radial receiver functions are analyzed in terms of horizontally layered crustal structure for each station. In order to improve convergence of the receiver function inversion, frequency-band inversion (FBI) is implemented, in which an iterative inversion procedure with sequentially higher low-pass corner frequencies is used to stabilize the waveform inversion. Frequency-band inversion is applied to receiver functions at six stations of the Taiwan Seismic Network. Initial 20-layer crustal models are inverted for using prior tomographic results as the initial models. The resulting 20-layer models are then simplified to 4 to 5 layer models and input into an alternating depth and velocity frequency-band inversion. For the six stations investigated, the resulting simplified models provide an average estimate of 38 km for the Moho thickness surrounding the Central Range of Taiwan. Also, the individual station estimates compare well with recent tomographic models and with the refraction results of Rau and Wu (1995) and of Ma and Song (1997).

  18. Fitted Hanbury-Brown Twiss radii versus space-time variances in flow-dominated models

    NASA Astrophysics Data System (ADS)

    Frodermann, Evan; Heinz, Ulrich; Lisa, Michael Annan

    2006-04-01

    The inability of otherwise successful dynamical models to reproduce the Hanbury-Brown Twiss (HBT) radii extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the RHIC HBT Puzzle. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source that can be directly computed from the emission function without having to evaluate, at significant expense, the two-particle correlation function. We here study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behavior. By Fourier transforming the emission function, we compute the two-particle correlation function, and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and the measured HBT radii remain, we show that a more apples-to-apples comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data.
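The contrast drawn here, between a source's exact correlation function and the Gaussian fitted to it, can be seen already in one dimension. The sketch below uses an exponential (non-Gaussian) source whose correlator is known in closed form, and fits the Gaussian form C(q) = 1 + λ exp(−R²q²) by linear least squares in log space; the source size and q-range are illustrative assumptions, not the hydrodynamic models of the paper:

```python
import numpy as np

# Non-Gaussian (exponential) 1-D source: its exact correlator is
# C(q) - 1 = 1 / (1 + (q a)^2)^2, which is not Gaussian in q.
a = 1.0                                   # source size, illustrative
q = np.linspace(0.05, 1.0, 40)            # relative-momentum grid
corr = 1.0 + 1.0 / (1.0 + (q * a) ** 2) ** 2

# Gaussian fit C(q) = 1 + lam * exp(-R^2 q^2), linearized in log space:
# ln(C - 1) = ln(lam) - R^2 q^2  ->  ordinary least squares in q^2.
slope, intercept = np.polyfit(q ** 2, np.log(corr - 1.0), 1)
R_fit = np.sqrt(-slope)
lam_fit = np.exp(intercept)
```

The fitted radius depends on the q-range used, precisely because the source is not Gaussian, which is why fitting the correlator (as done here and in the paper) and computing space-time variances give different radii.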

  19. Fitted Hanbury-Brown-Twiss radii versus space-time variances in flow-dominated models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frodermann, Evan; Heinz, Ulrich; Lisa, Michael Annan

    2006-04-15

    The inability of otherwise successful dynamical models to reproduce the Hanbury-Brown-Twiss (HBT) radii extracted from two-particle correlations measured at the Relativistic Heavy Ion Collider (RHIC) is known as the RHIC HBT Puzzle. Most comparisons between models and experiment exploit the fact that for Gaussian sources the HBT radii agree with certain combinations of the space-time widths of the source that can be directly computed from the emission function without having to evaluate, at significant expense, the two-particle correlation function. We here study the validity of this approach for realistic emission function models, some of which exhibit significant deviations from simple Gaussian behavior. By Fourier transforming the emission function, we compute the two-particle correlation function, and fit it with a Gaussian to partially mimic the procedure used for measured correlation functions. We describe a novel algorithm to perform this Gaussian fit analytically. We find that for realistic hydrodynamic models the HBT radii extracted from this procedure agree better with the data than the values previously extracted from the space-time widths of the emission function. Although serious discrepancies between the calculated and the measured HBT radii remain, we show that a more apples-to-apples comparison of models with data can play an important role in any eventually successful theoretical description of RHIC HBT data.

  20. Factor structure of overall autobiographical memory usage: the directive, self and social functions revisited.

    PubMed

    Rasmussen, Anne S; Habermas, Tilmann

    2011-08-01

    According to theory, autobiographical memory serves three broad functions of overall usage: directive, self, and social. However, there is evidence to suggest that the tripartite model may be better conceptualised in terms of a four-factor model with two social functions. In the present study we examined the two models in Danish and German samples, using the Thinking About Life Experiences Questionnaire (TALE; Bluck, Alea, Habermas, & Rubin, 2005), which measures the overall usage of the three functions generalised across concrete memories. Confirmatory factor analysis supported the four-factor model and rejected the theoretical three-factor model in both samples. The results are discussed in relation to cultural differences in overall autobiographical memory usage as well as sharing versus non-sharing aspects of social remembering.

  1. Bifurcations in a discrete time model composed of Beverton-Holt function and Ricker function.

    PubMed

    Shang, Jin; Li, Bingtuan; Barnard, Michael R

    2015-05-01

    We provide rigorous analysis for a discrete-time model composed of the Ricker function and Beverton-Holt function. This model was proposed by Lewis and Li [Bull. Math. Biol. 74 (2012) 2383-2402] in the study of a population in which reproduction occurs at a discrete instant of time whereas death and competition take place continuously during the season. We show analytically that there exists a period-doubling bifurcation curve in the model. The bifurcation curve divides the parameter space into the region of stability and the region of instability. We demonstrate through numerical bifurcation diagrams that the regions of periodic cycles are intermixed with the regions of chaos. We also study the global stability of the model. Copyright © 2015 Elsevier Inc. All rights reserved.
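A toy version of such a composed map (a discrete Ricker reproduction pulse followed by Beverton-Holt-type within-season mortality; the exact Lewis-Li parameterization differs) already shows the transition from a stable equilibrium to a period-doubled cycle as the growth parameter crosses the bifurcation boundary. Parameter values below are illustrative:

```python
import math

def ricker(x, r):
    """Discrete reproduction pulse."""
    return x * math.exp(r * (1.0 - x))

def beverton_holt(x, k):
    """Within-season survival/competition (Beverton-Holt form)."""
    return x / (1.0 + x / k)

def step(x, r, k):
    """One season: reproduce, then die/compete."""
    return beverton_holt(ricker(x, r), k)

def orbit(r, k, n_transient=2000, n_keep=4):
    x = 0.5
    for _ in range(n_transient):
        x = step(x, r, k)
    out = []
    for _ in range(n_keep):
        out.append(x)
        x = step(x, r, k)
    return out

stable = orbit(0.5, 100.0)     # below the bifurcation: fixed point
two_cycle = orbit(2.2, 100.0)  # above it: period-2 oscillation
```

Scanning r while recording the attractor reproduces the kind of numerical bifurcation diagram the authors use to show periodic windows intermixed with chaos.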

  2. "Shape function + memory mechanism"-based hysteresis modeling of magnetorheological fluid actuators

    NASA Astrophysics Data System (ADS)

    Qian, Li-Jun; Chen, Peng; Cai, Fei-Long; Bai, Xian-Xu

    2018-03-01

    A hysteresis model based on "shape function + memory mechanism" is presented and its feasibility is verified through modeling the hysteresis behavior of a magnetorheological (MR) damper. A hysteresis phenomenon in a resistor-capacitor (RC) circuit is first presented and analyzed. In the hysteresis model, the "memory mechanism" originating from the charging and discharging processes of the RC circuit is constructed by adopting a virtual displacement variable and updating laws for the reference points. The "shape function" is obtained and generalized from analytical solutions of the simple semi-linear Duhem model. In this approach, the memory mechanism reveals the essence of the specific Duhem model, and the general shape function provides a direct and clear means to fit the hysteresis loop. Within the structure of a "restructured phenomenological model", the original hysteresis operator, i.e., the Bouc-Wen operator, is replaced with the new hysteresis operator. Comparison with the Bouc-Wen operator-based model demonstrates that the new hysteresis operator-based model achieves higher computational efficiency with comparable accuracy.

  3. A progress report on seismic model studies

    USGS Publications Warehouse

    Healy, J.H.; Mangan, G.B.

    1963-01-01

    The value of seismic-model studies as an aid to understanding wave propagation in the Earth's crust was recognized by early investigators (Tatel and Tuve, 1955). Preliminary model results were very promising, but progress in model seismology has been restricted by two problems: (1) difficulties in the development of models with continuously variable velocity-depth functions, and (2) difficulties in the construction of models of adequate size to provide a meaningful wave-length to layer-thickness ratio. The problem of a continuously variable velocity-depth function has been partly solved by a technique using two-dimensional plate models constructed by laminating plastic to aluminum, so that the ratio of plastic to aluminum controls the velocity-depth function (Healy and Press, 1960). These techniques provide a continuously variable velocity-depth function, but it is not possible to construct such models large enough to study short-period wave propagation in the crust. This report describes improvements in our ability to machine large models. Two types of models are being used: one is a cylindrical aluminum tube machined on a lathe, and the other is a large plate machined on a precision planer. Both of these modeling techniques give promising results and are a significant improvement over earlier efforts.

  4. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.

    PubMed

    Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric

    2010-07-20

    Accurate protein loop structure models are important to understand the functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD < 0.5 Å from the native) as top-ranked, and selecting at least one near-native model in the top-5-ranked models, respectively. Similar effectiveness of the POC method is also found in the decoy sets from membrane protein loops. Furthermore, the POC method outperforms other popularly used consensus strategies in model ranking, such as rank-by-number, rank-by-rank, rank-by-vote, and regression-based methods. 
By integrating multiple knowledge- and physics-based scoring functions based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.
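Step 1 of the POC procedure, extracting the Pareto-optimal front with respect to several scoring functions, reduces to a standard non-domination test. A minimal sketch with invented decoy names and two illustrative scores (lower = better); the fuzzy-dominance ranking of step 2 is omitted:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every score and strictly
    better in at least one (scores oriented so lower is better)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(models):
    """Return the score vectors not dominated by any other model."""
    return [m for m in models
            if not any(dominates(o, m) for o in models if o is not m)]

# Each decoy scored by two hypothetical scoring functions.
decoys = {"d1": (1.0, 3.0), "d2": (2.0, 1.0),
          "d3": (2.5, 2.5), "d4": (0.5, 4.0)}
front = pareto_front(list(decoys.values()))
names = [n for n, s in decoys.items() if s in front]
```

Here "d3" is dominated by "d2" (worse on both scores) and drops out, while decoys that trade off one score against another all survive, which is why the front typically retains a small but diverse subset of the decoys.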

  5. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method

    PubMed Central

    2010-01-01

    Background Accurate protein loop structure models are important to understand the functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. Results We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of ~20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD < 0.5 Å from the native) as top-ranked, and selecting at least one near-native model in the top-5-ranked models, respectively. Similar effectiveness of the POC method is also found in the decoy sets from membrane protein loops. Furthermore, the POC method outperforms other popularly used consensus strategies in model ranking, such as rank-by-number, rank-by-rank, rank-by-vote, and regression-based methods. 
Conclusions By integrating multiple knowledge- and physics-based scoring functions based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set. PMID:20642859

  6. Models of violently relaxed galaxies

    NASA Astrophysics Data System (ADS)

    Merritt, David; Tremaine, Scott; Johnstone, Doug

    1989-02-01

    The properties of spherical self-gravitating models derived from two distribution functions that incorporate, in a crude way, the physics of violent relaxation are investigated. The first distribution function is identical to the one discussed by Stiavelli and Bertin (1985) except for a change in the sign of the 'temperature', i.e., from exp(-aE) to exp(+aE). It is shown that these 'negative temperature' models provide a much better description of the end-state of violent relaxation than 'positive temperature' models. The second distribution function is similar to the first except for a different dependence on angular momentum. Both distribution functions yield single-parameter families of models with surface density profiles very similar to the R exp 1/4 law. Furthermore, the central concentration of models in both families increases monotonically with the velocity anisotropy, as expected in systems that formed through cold collapse.

  7. A K-BKZ Formulation for Soft-Tissue Viscoelasticity

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.; Diethelm, Kai

    2005-01-01

    A viscoelastic model of the K-BKZ (Kaye 1962; Bernstein et al. 1963) type is developed for isotropic biological tissues, and applied to the fat pad of the human heel. To facilitate this pursuit, a class of elastic solids is introduced through a novel strain-energy function whose elements possess strong ellipticity, and therefore lead to stable material models. The standard fractional-order viscoelastic (FOV) solid is used to arrive at the overall elastic/viscoelastic structure of the model, while the elastic potential via the K-BKZ hypothesis is used to arrive at the tensorial structure of the model. Candidate sets of functions are proposed for the elastic and viscoelastic material functions present in the model, including a regularized fractional derivative that was determined to be the best. The Akaike information criterion (AIC) is advocated for performing multi-model inference, enabling an objective selection of the best material function from within a candidate set.
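The multi-model inference step rests on the standard AIC formula, AIC = 2k − 2 ln L̂, with Akaike weights proportional to exp(−Δᵢ/2) expressing relative support for each candidate. A minimal sketch; the candidate names and log-likelihoods below are invented for illustration, not the paper's actual fits:

```python
import math

def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2 ln L."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of three candidate material functions:
# name -> (maximized log-likelihood, number of parameters)
candidates = {"power law": (-120.4, 2),
              "regularized fractional": (-112.1, 3),
              "exponential": (-118.0, 2)}
scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)

# Akaike weights: relative support for each candidate.
deltas = {n: s - min(scores.values()) for n, s in scores.items()}
raw = {n: math.exp(-0.5 * d) for n, d in deltas.items()}
total = sum(raw.values())
weights = {n: w / total for n, w in raw.items()}
```

The extra parameter of the "regularized fractional" candidate is penalized by the 2k term, but its likelihood gain outweighs the penalty, which is the kind of objective selection the abstract advocates.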

  8. A study of material damping in large space structures

    NASA Technical Reports Server (NTRS)

    Highsmith, A. L.; Allen, D. H.

    1989-01-01

    A constitutive model was developed for predicting damping as a function of damage in continuous fiber reinforced laminated composites. The damage model is a continuum formulation, and uses internal state variables to quantify damage and its subsequent effect on material response. The model is sensitive to the stacking sequence of the laminate. Given appropriate baseline data from unidirectional material, and damping as a function of damage in one crossply laminate, damage can be predicted as a function of damage in other crossply laminates. Agreement between theory and experiment was quite good. A micromechanics model was also developed for examining the influence of damage on damping. This model explicitly includes crack surfaces. The model provides reasonable predictions of bending stiffness as a function of damage. Damping predictions are not in agreement with the experiment. This is thought to be a result of dissipation mechanisms such as friction, which are not presently included in the analysis.

  9. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    PubMed Central

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
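    The two forms that emerged as best-fitting have standard closed-form expressions, which makes the discrimination problem concrete. A minimal sketch (parameter values are illustrative, not estimates from the study):

```python
import numpy as np

def prelec2(p, gamma, delta):
    """Prelec's two-parameter weighting function: w(p) = exp(-delta * (-ln p)**gamma)."""
    p = np.asarray(p, dtype=float)
    return np.exp(-delta * (-np.log(p)) ** gamma)

def lin_log_odds(p, gamma, delta):
    """Linear-in-log-odds form: w(p) = delta*p**gamma / (delta*p**gamma + (1-p)**gamma)."""
    p = np.asarray(p, dtype=float)
    num = delta * p ** gamma
    return num / (num + (1 - p) ** gamma)

ps = np.linspace(0.01, 0.99, 99)
# typical inverse-S shape: small probabilities overweighted, large ones underweighted
w1 = prelec2(ps, gamma=0.65, delta=1.0)
w2 = lin_log_odds(ps, gamma=0.6, delta=0.8)
```

    Both reduce to the identity w(p) = p when gamma = delta = 1, and their qualitative similarity over most of [0, 1] is exactly what makes them hard to discriminate without an adaptive design.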

  10. MODELING THE DYNAMICS OF THREE FUNCTIONAL GROUPS OF MACROALGAE IN TROPICAL SEAGRASS HABITATS. (R828677C004)

    EPA Science Inventory

    A model of three functional groups of macroalgae, drift algae, rhizophytic calcareous algae, and seagrass epiphytes, was developed to complement an existing seagrass production model for tropical habitats dominated by Thalassia testudinum (Turtle-grass). The current modeling e...

  11. Structural habitat predicts functional dispersal habitat of a large carnivore: how leopards change spots.

    PubMed

    Fattebert, Julien; Robinson, Hugh S; Balme, Guy; Slotow, Rob; Hunter, Luke

    2015-10-01

    Natal dispersal promotes inter-population linkage, and is key to the spatial distribution of populations. Degradation of suitable landscape structures beyond the specific threshold of an individual's ability to disperse can therefore lead to disruption of functional landscape connectivity and impact metapopulation function. Because it ignores the behavioral responses of individuals, structural connectivity is easier to assess than functional connectivity and is often used as a surrogate in landscape connectivity modeling. However, using structural resource selection models as surrogates for modeling functional connectivity through dispersal could be erroneous. We tested how well a second-order resource selection function (RSF) model (structural connectivity), based on GPS telemetry data from resident adult leopards (Panthera pardus L.), could predict subadult habitat use during dispersal (functional connectivity). We created eight non-exclusive subsets of the subadult data, based on differing definitions of dispersal, to assess the predictive ability of our adult-based RSF model extrapolated over a broader landscape. Dispersing leopards used habitats in accordance with adult selection patterns, regardless of the definition of dispersal considered. We demonstrate that, for a wide-ranging apex carnivore, functional connectivity through natal dispersal corresponds to structural connectivity as modeled by a second-order RSF. Mapping of the adult-based habitat classes provides direct visualization of the potential linkages between populations, without the need to model paths between a priori starting and destination points. The use of such landscape-scale RSFs may provide insight into predicting suitable dispersal habitat peninsulas in human-dominated landscapes where mitigation of human-wildlife conflict should be focused. We recommend the use of second-order RSFs for landscape conservation planning and propose a similar approach to the conservation of other wide-ranging large carnivore species where landscape-scale resource selection data already exist.
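    A second-order RSF of the kind described is commonly fit as a used-versus-available logistic regression. The sketch below fits such a model by plain gradient ascent on synthetic data; the covariates ("cover", "human density") and coefficients are invented, not the leopard dataset:

```python
import numpy as np

rng = np.random.default_rng(11)

def fit_logistic(X, y, lr=0.1, iters=3000):
    """Used (1) vs. available (0) logistic regression by full-batch gradient ascent."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # add intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))        # predicted use probability
        beta += lr * Xb.T @ (y - p) / len(y)        # log-likelihood gradient step
    return beta

# synthetic landscape covariates at sampled locations
n = 4000
X = rng.standard_normal((n, 2))                     # [vegetation cover, human density]
true = np.array([-0.5, 1.2, -0.9])                  # intercept, +cover, -human density
logit = true[0] + X @ true[1:]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

beta_hat = fit_logistic(X, y)                       # recovered selection coefficients
```

    The fitted coefficients give the relative selection strength for each landscape covariate; mapping the linear predictor over a raster of covariates is what produces the habitat-class maps the abstract describes.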

  12. Pharmacokinetic/Pharmacodynamic Modeling and Simulation of Cefiderocol, a Parenteral Siderophore Cephalosporin, for Dose Adjustment Based on Renal Function.

    PubMed

    Katsube, Takayuki; Wajima, Toshihiro; Ishibashi, Toru; Arjona Ferreira, Juan Camilo; Echols, Roger

    2017-01-01

    Cefiderocol, a novel parenteral siderophore cephalosporin, exhibits potent efficacy against most Gram-negative bacteria, including carbapenem-resistant strains. Since cefiderocol is excreted primarily via the kidneys, this study was conducted to develop a population pharmacokinetics (PK) model to determine dose adjustment based on renal function. Population PK models were developed based on data for cefiderocol concentrations in plasma, urine, and dialysate with a nonlinear mixed-effects model approach. Monte Carlo simulations were conducted to calculate the probability of target attainment (PTA) for the fraction of time during the dosing interval that the free drug concentration in plasma exceeds the MIC (fT>MIC), over an MIC range of 0.25 to 16 μg/ml. For the simulations, dose regimens were selected to compare cefiderocol exposure among groups with different levels of renal function. The developed models well described the PK of cefiderocol for each renal function group. A dose of 2 g every 8 h with 3-h infusions provided >90% PTA for 75% fT>MIC for an MIC of ≤4 μg/ml for patients with normal renal function, while a more frequent dose (every 6 h) could be used for patients with augmented renal function. A reduced dose and/or extended dosing interval was selected for patients with impaired renal function. A supplemental dose immediately after intermittent hemodialysis was proposed for patients requiring intermittent hemodialysis. The PK of cefiderocol could be adequately modeled, and the modeling-and-simulation approach suggested dose regimens based on renal function, ensuring drug exposure with adequate bactericidal effect. Copyright © 2016 American Society for Microbiology.
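    The fT>MIC target-attainment calculation described above can be sketched with a toy Monte Carlo simulation. Everything below (one-compartment kinetics, dose, clearance, volume, between-subject variability, and protein binding) is an invented placeholder, not the published cefiderocol population PK model; the sketch only illustrates how PTA is computed from simulated subjects.

```python
import numpy as np

rng = np.random.default_rng(42)

def conc_steady_state(t, dose, cl, v, t_inf, tau, n_doses=10):
    """One-compartment model, zero-order infusion; approximate steady state
    by superposing the last n_doses doses."""
    ke = cl / v
    r0 = dose / t_inf
    c = np.zeros_like(t)
    for n in range(n_doses):
        ts = t + n * tau                                        # time since an earlier dose
        rise = (r0 / cl) * (1 - np.exp(-ke * np.minimum(ts, t_inf)))
        decay = np.exp(-ke * np.clip(ts - t_inf, 0.0, None))
        c += rise * decay
    return c

def pta(mic, n_subj=1000, dose=2000.0, tau=8.0, t_inf=3.0,
        cl_typ=5.0, omega=0.3, v=18.0, fu=0.4, target=0.75):
    """Fraction of simulated subjects whose free-drug fT>MIC meets the target."""
    t = np.linspace(0.0, tau, 241)
    cl = cl_typ * np.exp(omega * rng.standard_normal(n_subj))   # lognormal clearance
    hits = 0
    for c in cl:
        conc = conc_steady_state(t, dose, c, v, t_inf, tau)
        ft = np.mean(fu * conc > mic)                           # fraction of interval above MIC
        hits += ft >= target
    return hits / n_subj

# PTA falls as the MIC rises
curve = {mic: pta(mic) for mic in (0.25, 1.0, 4.0, 16.0)}
```

    A regimen is judged adequate when the PTA at the clinically relevant MIC exceeds a threshold such as 90%; repeating the calculation for different dose/interval combinations per renal function group is the "modeling-and-simulation approach" the abstract refers to.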

  13. Froissart bound and self-similarity based models of proton structure functions

    NASA Astrophysics Data System (ADS)

    Choudhury, D. K.; Saikia, Baishali

    2018-03-01

    The Froissart bound implies that the total proton-proton cross-section (or equivalently the proton structure function) cannot rise faster than log^2 s ~ log^2(1/x). Compatibility of such behavior with the notion of self-similarity in the proton structure function was suggested by us some time ago. In the present work, we generalize and improve it further by considering more recent self-similarity based models of proton structure functions and compare with recent data as well as with the model of Block, Durand, Ha and McKay.

  14. Using Cultural Modeling to Inform a NEDSS-Compatible System Functionality Evaluation

    PubMed Central

    Anderson, Olympia; Torres-Urquidy, Miguel

    2013-01-01

    Objective The culture within which public health professionals work defines their organizational objectives, expectations, policies, and values. These aspects of culture are often intangible and difficult to qualify. The introduction of an information system could further complicate the culture of a jurisdiction if the intangibles of a culture are not clearly understood. This report describes how cultural modeling can be used to capture intangible elements or factors that may affect NEDSS-compatible (NC) system functionalities within the culture of public health jurisdictions. Introduction The National Notifiable Disease Surveillance System (NNDSS) comprises many activities including collaborations, processes, standards, and systems which support gathering data from US states and territories. As part of NNDSS, the National Electronic Disease Surveillance System (NEDSS) provides the standards, tools, and resources to support reporting public health jurisdictions (jurisdictions). The NEDSS Base System (NBS) is a CDC-developed software application available to jurisdictions to collect, manage, analyze and report national notifiable disease (NND) data. An evaluation of NEDSS with the objective of identifying the functionalities of NC systems and the impact of these features on the user's culture is underway. Methods We used cultural models to capture additional NC system functionality gaps within the culture of the user. Cultural modeling is a process of graphically depicting people and organizations (referred to as influencers) and the intangible factors that affect the user's operations or work (referred to as influences). Influencers are denoted as bubbles while influences are depicted as arrows penetrating the bubbles. In the cultural model, influence can be seen in the size and proximity (or lack thereof) of these elements. We restricted the models to secondary data sources and interviews of CDC programs (data users) and public health jurisdictions (data reporters).
Results Three cultural models were developed from the secondary information sources; these models include the NBS vendor, public health jurisdiction (jurisdiction) activities, and NEDSS technical consultants. The vendor cultural model identified channels of communication about functionalities flowing from the vendor and the NBS users with CDC as the approval mechanism. The jurisdiction activities model highlighted perceived issues external to the organization that had some impact on the organization. Key disconnecting issues in the jurisdiction model included situational awareness, data competency, and bureaucracy. This model also identified poor coordination as a major influencer of the jurisdiction's activities. The NEDSS technical model identified major issues and disconnects among data access, capture and reporting, processing, and ELR functionalities (Figure 1). The data processing functionality emerged as the largest negative influencer, with issues that included loss of data specificity, lengthy submission strategies, and risk of data use. Collectively, the models depict issues with the system functionality but mostly identify other factors that may influence how jurisdictions use the system and, moreover, determine the functionalities to be included. Conclusions By using the cultural model as a guide, we are able to clarify complex relationships using multiple data sources and improve our understanding of the impacts of the NC system functionalities on users' operations. Modeling the recipients of the data (e.g. CDC programs) will provide insight into additional factors that may inform the NEDSS evaluation.

  15. Multi-subject hierarchical inverse covariance modelling improves estimation of functional brain networks.

    PubMed

    Colclough, Giles L; Woolrich, Mark W; Harrison, Samuel J; Rojas López, Pedro A; Valdes-Sosa, Pedro A; Smith, Stephen M

    2018-05-07

    A Bayesian model for sparse, hierarchical, inverse-covariance estimation is presented, and applied to multi-subject functional connectivity estimation in the human brain. It enables simultaneous inference of the strength of connectivity between brain regions at both subject and population level, and is applicable to fMRI, MEG and EEG data. Two versions of the model can encourage sparse connectivity, either using continuous priors to suppress irrelevant connections, or using an explicit description of the network structure to estimate the connection probability between each pair of regions. A large evaluation of this model, and thirteen methods that represent the state of the art of inverse covariance modelling, is conducted using both simulated and resting-state functional imaging datasets. Our novel Bayesian approach has similar performance to the best extant alternative, Ng et al.'s Sparse Group Gaussian Graphical Model algorithm, which is also based on a hierarchical structure. Using data from the Human Connectome Project, we show that these hierarchical models are able to reduce the measurement error in MEG beta-band functional networks by 10%, producing concomitant increases in estimates of the genetic influence on functional connectivity. Copyright © 2018. Published by Elsevier Inc.
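    The connection between inverse covariance and functional networks can be illustrated without the paper's hierarchical Bayesian machinery: off-diagonal entries of the precision (inverse covariance) matrix, suitably scaled, are negative partial correlations, and a zero entry means two regions are conditionally independent given the rest. A minimal numpy sketch with three synthetic "regions":

```python
import numpy as np

rng = np.random.default_rng(1)

# ground-truth precision matrix: regions 0 and 1 directly connected, region 2 isolated
theta = np.array([[1.0, 0.4, 0.0],
                  [0.4, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

# simulate time series with covariance = inverse of the precision matrix
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(theta), size=20000)

# estimate the precision matrix and convert to partial correlations:
# rho_ij|rest = -Theta_ij / sqrt(Theta_ii * Theta_jj)
prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)
np.fill_diagonal(partial, 1.0)
```

    With enough samples the estimated partial correlation for the 0-1 pair approaches -0.4 while the entries involving region 2 shrink toward zero; the sparse priors in the paper's model exist precisely to drive such near-zero entries to exact zeros from far fewer samples.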

  16. Concerted and mosaic evolution of functional modules in songbird brains

    PubMed Central

    DeVoogd, Timothy J.

    2017-01-01

    Vertebrate brains differ in overall size, composition and functional capacities, but the evolutionary processes linking these traits are unclear. Two leading models offer opposing views: the concerted model ascribes major dimensions of covariation in brain structures to developmental events, whereas the mosaic model relates divergent structures to functional capabilities. The models are often cast as incompatible, but they must be unified to explain how adaptive changes in brain structure arise from pre-existing architectures and developmental mechanisms. Here we show that variation in the sizes of discrete neural systems in songbirds, a species-rich group exhibiting diverse behavioural and ecological specializations, supports major elements of both models. In accordance with the concerted model, most variation in nucleus volumes is shared across functional domains and allometry is related to developmental sequence. Per the mosaic model, residual variation in nucleus volumes is correlated within functional systems and predicts specific behavioural capabilities. These comparisons indicate that oscine brains evolved primarily as a coordinated whole but also experienced significant, independent modifications to dedicated systems from specific selection pressures. Finally, patterns of covariation between species and brain areas hint at underlying developmental mechanisms. PMID:28490627

  17. Development of a Conceptual Model for Smoking Cessation: Physical Activity, Neurocognition, and Executive Functioning.

    PubMed

    Loprinzi, Paul D; Herod, Skyla M; Walker, Jerome F; Cardinal, Bradley J; Mahoney, Sara E; Kane, Christy

    2015-01-01

    Considerable research has shown adverse neurobiological effects of chronic alcohol use, including long-term and potentially permanent changes in the structure and function of the brain; however, much less is known about the neurobiological consequences of chronic smoking, as it has largely been ignored until recently. In this article, we present a conceptual model proposing the effects of smoking on neurocognition and the role that physical activity may play in this relationship as well as its role in smoking cessation. Pertinent published peer-reviewed articles deposited in PubMed delineating the pathways in the proposed model were reviewed. The proposed model, which is supported by emerging research, demonstrates a bidirectional relationship between smoking and executive functioning. In support of our conceptual model, physical activity may moderate this relationship and indirectly influence smoking behavior through physical activity-induced changes in executive functioning. Our model may have implications for aiding smoking cessation efforts through the promotion of physical activity as a mechanism for preventing smoking-induced deficits in neurocognition and executive function.

  18. Intent inferencing with a model-based operator's associate

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.; Mitchell, Christine M.; Rubin, Kenneth S.

    1989-01-01

    A portion of the Operator Function Model Expert System (OFMspert) research project is described. OFMspert is an architecture for an intelligent operator's associate or assistant that can aid the human operator of a complex, dynamic system. Intelligent aiding requires both understanding and control. The understanding (i.e., intent inferencing) ability of the operator's associate is discussed. Understanding or intent inferencing requires a model of the human operator; the usefulness of an intelligent aid depends directly on the fidelity and completeness of its underlying model. The model chosen for this research is the operator function model (OFM). The OFM represents operator functions, subfunctions, tasks, and actions as a heterarchic-hierarchic network of finite state automata, where the arcs in the network are system triggering events. The OFM provides the structure for intent inferencing in that operator functions and subfunctions correspond to likely operator goals and plans. A blackboard system similar to that of the Human Associative Processor (HASP) is proposed as the implementation of the intent inferencing function. This system postulates operator intentions based on current system state and attempts to interpret observed operator actions in light of these hypothesized intentions.

  19. Directivity models produced for the Next Generation Attenuation West 2 (NGA-West 2) project

    USGS Publications Warehouse

    Spudich, Paul A.; Watson-Lamprey, Jennie; Somerville, Paul G.; Bayless, Jeff; Shahi, Shrey; Baker, Jack W.; Rowshandel, Badie; Chiou, Brian

    2012-01-01

    Five new directivity models are being developed for the NGA-West 2 project. All are based on the NGA-West 2 database, which is considerably expanded from the original NGA-West database, containing about 3,000 more records from earthquakes having finite-fault rupture models. All of the new directivity models have parameters based on fault dimension in km, not normalized fault dimension. This feature removes a peculiarity of previous models which made them inappropriate for modeling large magnitude events on long strike-slip faults. Two models are explicitly, and one is implicitly, 'narrowband' models, in which the effect of directivity does not monotonically increase with spectral period but instead peaks at a specific period that is a function of earthquake magnitude. These narrowband models' functional forms are capable of simulating directivity over a wider range of earthquake magnitude than previous models. The functional forms of the five models are presented.

  20. Predicting nucleic acid binding interfaces from structural models of proteins.

    PubMed

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2012-02-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acid binding interface on structural models of proteins. Employing a combined patch approach, we show that patches extracted from an ensemble of models better predict the real nucleic acid binding interfaces compared with patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure. Copyright © 2011 Wiley Periodicals, Inc.

  1. Investigating the effects of the fixed and varying dispersion parameters of Poisson-gamma models on empirical Bayes estimates.

    PubMed

    Lord, Dominique; Park, Peter Young-Jin

    2008-07-01

    Traditionally, transportation safety analysts have used the empirical Bayes (EB) method to improve the estimate of the long-term mean of individual sites; to correct for the regression-to-the-mean (RTM) bias in before-after studies; and to identify hotspots or high-risk locations. The EB method combines two different sources of information: (1) the expected number of crashes estimated via crash prediction models, and (2) the observed number of crashes at individual sites. Crash prediction models have traditionally been estimated using a negative binomial (NB) (or Poisson-gamma) modeling framework due to the over-dispersion commonly found in crash data. A weight factor is used to assign the relative influence of each source of information on the EB estimate. This factor is estimated using the mean and variance functions of the NB model. Given recent work showing that the dispersion parameter can depend on the covariates of NB models, especially for traffic flow-only models, and can vary as a function of time period, there is a need to determine how these models may affect EB estimates. The objectives of this study are to examine how commonly used functional forms as well as fixed and time-varying dispersion parameters affect the EB estimates. To accomplish the study objectives, several traffic flow-only crash prediction models were estimated using a sample of rural three-legged intersections located in California. Two types of aggregated and time-specific models were produced: (1) the traditional NB model with a fixed dispersion parameter, and (2) the generalized NB model (GNB) with a time-varying dispersion parameter, which is also dependent upon the covariates of the model. Several statistical methods were used to compare the fitting performance of the various functional forms.
The results of the study show that the selection of the functional form of NB models has an important effect on EB estimates, in terms of estimated values, weight factors, and dispersion parameters. Time-specific models with a varying dispersion parameter provide better statistical performance in terms of goodness-of-fit (GOF) than aggregated multi-year models. Furthermore, the identification of hazardous sites using the EB method can be significantly affected when a GNB model with a time-varying dispersion parameter is used. Thus, erroneously selecting a functional form may lead to selecting the wrong sites for treatment. The study concludes that transportation safety analysts should not automatically use an existing functional form for modeling motor vehicle crashes without conducting rigorous analyses to estimate the most appropriate functional form linking crashes with traffic flow.
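    For the Poisson-gamma (NB) case, the EB combination described above has a standard closed form: the weight given to the model prediction is w = 1/(1 + mu/phi), where phi is the inverse dispersion parameter, so the estimated dispersion directly controls how far the EB estimate shrinks the observed count toward the model. A minimal sketch with illustrative numbers (not values from the study):

```python
def eb_estimate(mu, y, phi):
    """Empirical Bayes estimate of a site's long-term mean crash frequency.

    mu  : crashes per period predicted by the NB crash prediction model
    y   : observed crashes at the site
    phi : NB inverse dispersion parameter (Var = mu + mu**2 / phi);
          equivalently w = 1/(1 + alpha*mu) with alpha = 1/phi

    EB = w*mu + (1 - w)*y, with w = 1/(1 + mu/phi).
    """
    w = 1.0 / (1.0 + mu / phi)
    return w * mu + (1.0 - w) * y

# a site the model predicts at 2 crashes/yr that recorded 6:
# smaller phi (more dispersion) shifts weight toward the observation
est_tight = eb_estimate(2.0, 6, phi=10.0)   # w = 10/12
est_loose = eb_estimate(2.0, 6, phi=1.0)    # w = 1/3
```

    This is why a misspecified or time-varying phi matters: it changes w, and with it both the EB estimate and the ranking of candidate hotspot sites.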

  2. Variation in habitat suitability does not always relate to variation in species' plant functional traits

    PubMed Central

    Thuiller, Wilfried; Albert, Cécile H.; Dubuis, Anne; Randin, Christophe; Guisan, Antoine

    2010-01-01

    Habitat suitability models, which relate species occurrences to environmental variables, are assumed to predict suitable conditions for a given species. If these models are reliable, they should relate to change in plant growth and function. In this paper, we ask whether habitat suitability models are able to predict variation in plant functional traits, often assumed to be a good surrogate for a species' overall health and vigour. Using a thorough sampling design, we show a tight link between variation in plant functional traits and habitat suitability for some species, but not for others. Our contrasting results pave the way towards a better understanding of how species cope with varying habitat conditions and demonstrate that habitat suitability models can provide meaningful descriptions of the functional niche in some cases, but not in others. PMID:19793738

  3. A Posteriori Comparison of Natural and Surgical Destabilization Models of Canine Osteoarthritis

    PubMed Central

    Pelletier, Jean-Pierre; d'Anjou, Marc-André; Blond, Laurent; Pelletier, Johanne-Martel; del Castillo, Jérôme R. E.

    2013-01-01

    For many years Canis familiaris, the domestic dog, has drawn particular interest as a model of osteoarthritis (OA). Here, we optimized the dog model of experimental OA induced by cranial cruciate ligament sectioning. The usefulness of noninvasive complementary outcome measures, such as gait analysis for the limb function and magnetic resonance imaging for structural changes, was demonstrated in this model. Relationships were established between the functional impairment and the severity of structural changes including the measurement of cartilage thinning. In the dog model of naturally occurring OA, excellent test-retest reliability was denoted for the measurement of the limb function. A criterion to identify clinically meaningful responders to therapy was determined for privately owned dogs undergoing clinical trials. In addition, the recording of accelerometer-based duration of locomotor activity showed strong and complementary agreement with the biomechanical limb function. The translation potential of these models to the human OA condition is underlined. A preclinical testing protocol which combines the dog model of experimental OA induced by cranial cruciate ligament transection and the dog model of naturally occurring OA offers the opportunity to further investigate the structural and functional benefits of disease-modifying strategies. Ultimately, this should allow better prediction of outcomes for human clinical trials. PMID:24288664

  4. Prospects of second generation artificial intelligence tools in calibration of chemical sensors.

    PubMed

    Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Ramam, Veluri Anantha; Rao, Gollapalli Nageswara; Rao, Vaddadi Venkata Panakala

    2005-05-01

    Multivariate data driven calibration models with neural networks (NNs) are developed for binary (Cu++ and Ca++) and quaternary (K+, Ca++, NO3- and Cl-) ion-selective electrode (ISE) data. The response profiles of ISEs with concentrations are non-linear and sub-Nernstian. This task represents function approximation of multi-variate, multi-response, correlated, non-linear data with unknown noise structure, i.e., multi-component calibration/prediction in chemometric parlance. Radial basis function (RBF) and Fuzzy-ARTMAP-NN models, implemented in the software packages TRAJAN and Professional II, are employed for the calibration. The optimum NN models reported are based on residuals in concentration space. Being a data-driven information technology, NN does not require a model, a prior or posterior distribution of data, or a noise structure. Missing information, spikes or newer trends in different concentration ranges can be modeled through novelty detection. Two simulated data sets generated from mathematical functions are modeled as a function of the number of data points and network parameters such as the number of neurons and nearest neighbors. The success of RBF and Fuzzy-ARTMAP-NNs in developing adequate calibration models for experimental data and function approximation models for more complex simulated data sets establishes AI2 (artificial intelligence, 2nd generation) as a promising technology in quantitation.
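    A radial basis function calibration of a non-linear, sub-Nernstian response can be sketched with plain least squares. The electrode response curve, noise level, and basis settings below are synthetic stand-ins, not the paper's ISE data or the TRAJAN/Professional II implementations:

```python
import numpy as np

rng = np.random.default_rng(7)

def rbf_design(x, centers, sigma):
    """Gaussian radial basis functions plus a linear tail (1, x)."""
    gauss = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))
    return np.hstack([np.ones((x.size, 1)), x[:, None], gauss])

# synthetic sub-Nernstian electrode response: EMF vs. log10 concentration
logc = np.linspace(-6.0, -2.0, 80)
emf = 45.0 * logc + 8.0 * np.tanh(logc + 4.0) + rng.normal(0.0, 0.5, logc.size)

# fit RBF weights by linear least squares (centers and width chosen by hand here;
# a real calibration would select them by cross-validation)
centers = np.linspace(-6.0, -2.0, 10)
Phi = rbf_design(logc, centers, sigma=0.6)
w, *_ = np.linalg.lstsq(Phi, emf, rcond=None)

fit = Phi @ w
rmse = np.sqrt(np.mean((fit - emf) ** 2))
```

    Because the weights enter linearly, the only non-linear design choices are the centers and widths, which is what makes RBF networks attractive for calibration problems with unknown noise structure.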

  5. Regression analysis and transfer function in estimating the parameters of central pulse waves from brachial pulse wave.

    PubMed

    Chai Rui; Li Si-Man; Xu Li-Sheng; Yao Yang; Hao Li-Ling

    2017-07-01

    This study analyzed parameters of the central pulse wave measured invasively and non-invasively, including ascending branch slope (A_slope), dicrotic notch height (Hn), diastolic area (Ad), systolic area (As), diastolic blood pressure (DBP), systolic blood pressure (SBP), pulse pressure (PP), subendocardial viability ratio (SEVR), waveform parameter (k), stroke volume (SV), cardiac output (CO), and peripheral resistance (RS). Parameters extracted from the invasively measured central pulse wave were compared with parameters estimated from the brachial pulse wave by a regression model and by a transfer function model, and the accuracies of the two estimation approaches were compared. Our findings showed that, apart from the k value, the above parameters of the central pulse wave and of the invasively measured brachial pulse wave were positively correlated. Both the regression model parameters (including A_slope, DBP, and SEVR) and the transfer function model parameters showed good, and comparable, consistency with the invasively measured parameters. The regression equations of the three parameters were expressed in the form Y' = a + bx. The SBP, PP, SV, and CO of the central pulse wave could be calculated through the regression model, but their accuracies were worse than those of the transfer function model.

  6. Objectively-Measured Physical Activity and Cognitive Functioning in Breast Cancer Survivors

    PubMed Central

    Marinac, Catherine R.; Godbole, Suneeta; Kerr, Jacqueline; Natarajan, Loki; Patterson, Ruth E.; Hartman, Sheri J.

    2015-01-01

    Purpose To explore the relationship between objectively measured physical activity and cognitive functioning in breast cancer survivors. Methods Participants were 136 postmenopausal breast cancer survivors. Cognitive functioning was assessed using a comprehensive computerized neuropsychological test. Physical activity over 7 days was assessed using hip-worn accelerometers. Linear regression models examined associations of minutes per day of physical activity at various intensities with individual cognitive functioning domains. The partially adjusted model controlled for primary confounders (model 1), and subsequent adjustments were made for chemotherapy history (model 2) and BMI (model 3). Interaction and stratified models examined BMI as an effect modifier. Results Moderate-to-vigorous physical activity (MVPA) was associated with Information Processing Speed. Specifically, ten minutes of MVPA was associated with a 1.35-point higher score (out of 100) on the Information Processing Speed domain in the partially adjusted model, and a 1.29-point higher score when chemotherapy was added to the model (both p<.05). There was a marginally significant BMI x MVPA interaction (p=.051). In models stratified by BMI (<25 vs. ≥25 kg/m2), the favorable association between MVPA and Information Processing Speed was stronger in the subsample of overweight and obese women (p<.05), but not statistically significant in the leaner subsample. Light-intensity physical activity was not significantly associated with any of the measured domains of cognitive function. Conclusions MVPA may have favorable effects on Information Processing Speed in breast cancer survivors, particularly among overweight or obese women. Implications for Cancer Survivors Interventions targeting increased physical activity may enhance aspects of cognitive function among breast cancer survivors. PMID:25304986

  7. Dynamic physiological modeling for functional diffuse optical tomography

    PubMed Central

    Diamond, Solomon Gilbert; Huppert, Theodore J.; Kolehmainen, Ville; Franceschini, Maria Angela; Kaipio, Jari P.; Arridge, Simon R.; Boas, David A.

    2009-01-01

    Diffuse optical tomography (DOT) is a noninvasive imaging technology that is sensitive to local concentration changes in oxy- and deoxyhemoglobin. When applied to functional neuroimaging, DOT measures hemodynamics in the scalp and brain that reflect competing metabolic demands and cardiovascular dynamics. The diffuse nature of near-infrared photon migration in tissue and the multitude of physiological systems that affect hemodynamics motivate the use of anatomical and physiological models to improve estimates of the functional hemodynamic response. In this paper, we present a linear state-space model for DOT analysis that models the physiological fluctuations present in the data with either static or dynamic estimation. We demonstrate the approach by using auxiliary measurements of blood pressure variability and heart rate variability as inputs to model the background physiology in DOT data. We evaluate the improvements accorded by modeling this physiology on ten human subjects with simulated functional hemodynamic responses added to the baseline physiology. Adding physiological modeling with a static estimator significantly improved estimates of the simulated functional response, and further significant improvements were achieved with a dynamic Kalman filter estimator (paired t tests, n = 10, P < 0.05). These results suggest that physiological modeling can improve DOT analysis. The further improvement with the Kalman filter encourages continued research into dynamic linear modeling of the physiology present in DOT. Cardiovascular dynamics also affect the blood-oxygen-level-dependent (BOLD) signal in functional magnetic resonance imaging (fMRI). This state-space approach to DOT analysis could be extended to BOLD fMRI analysis, multimodal studies and real-time analysis. PMID:16242967
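    The static-versus-dynamic estimation comparison above can be illustrated with a minimal scalar Kalman filter. This is only a sketch of the estimator family the paper uses: the function name, the single-state model, and the noise parameters q and r are illustrative assumptions, not the authors' multistate physiological model.

    ```python
    def kalman_filter(measurements, q=1e-4, r=0.01):
        """Scalar Kalman filter: track a slowly varying state from noisy samples."""
        x, p = 0.0, 1.0              # state estimate and its error variance
        estimates = []
        for z in measurements:
            p += q                   # predict: process noise inflates uncertainty
            k = p / (p + r)          # Kalman gain balances prior vs. measurement
            x += k * (z - x)         # correct with the measurement residual
            p *= (1.0 - k)           # posterior variance shrinks after the update
            estimates.append(x)
        return estimates
    ```

    Setting q near zero makes the filter behave like a static (recursive averaging) estimator; a larger q lets the state follow dynamic drift, which is the distinction the paper's static and dynamic estimators exploit.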

  8. On testing an unspecified function through a linear mixed effects model with multiple variance components

    PubMed Central

    Wang, Yuanjia; Chen, Huaihou

    2012-01-01

    Summary: We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801
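    The computational shortcut described above, replacing a full bootstrap with a spectral decomposition, amounts to simulating a weighted sum of independent chi-square(1) variables whose weights are eigenvalues of the relevant matrix. A hedged sketch of that simulation step only (function names and the example eigenvalues are assumptions; deriving the eigenvalues from the residual sum of squares is the part specific to the paper):

    ```python
    import random

    def weighted_chisq_null(eigvals, n_sim=20000, seed=0):
        """Simulate the null distribution of a quadratic form sum_i lambda_i * chi^2_1
        directly from the eigenvalues, avoiding a full model refit per replicate."""
        rng = random.Random(seed)
        return [sum(lam * rng.gauss(0.0, 1.0) ** 2 for lam in eigvals)
                for _ in range(n_sim)]

    def critical_value(samples, alpha=0.05):
        """Upper-alpha quantile of the simulated null distribution."""
        ordered = sorted(samples)
        return ordered[int((1 - alpha) * len(ordered))]
    ```

    Each replicate costs only a handful of Gaussian draws, which is why the spectral route scales to genome-wide critical values where a naive bootstrap does not.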

  9. On testing an unspecified function through a linear mixed effects model with multiple variance components.

    PubMed

    Wang, Yuanjia; Chen, Huaihou

    2012-12-01

    We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.

  10. Modelling population distribution using remote sensing imagery and location-based data

    NASA Astrophysics Data System (ADS)

    Song, J.; Prishchepov, A. V.

    2017-12-01

    Detailed spatial distribution of population density is essential for urban studies such as urban planning, environmental pollution assessment, and emergency response, as well as for estimating pressure on the environment and human exposure and health risks. However, most studies have relied on census data, because detailed dynamic population distributions are difficult to acquire, especially in microscale research. This research describes a method that uses remote sensing imagery and location-based data to model population distribution at the functional-zone level. First, urban functional zones within a city were mapped from high-resolution remote sensing images and points of interest (POIs). The workflow for functional zone extraction comprises five parts: (1) urban land use classification; (2) segmentation of images in the built-up area; (3) identification of functional segments using POIs; (4) identification of functional blocks from the functional segments and weight coefficients; (5) accuracy assessment with validation points (Fig. 1). Second, we applied ordinary least squares (OLS) and geographically weighted regression (GWR) to assess the spatially nonstationary relationship between night-time light digital number (DN) and population density at sampling points, and used both methods to predict population distribution over the study area. The R² of the GWR model was on the order of 0.7 and showed significant variation over the region compared with the traditional OLS model (Fig. 2). Validation with sampling points of population density demonstrated that the GWR predictions correlated well with light values (Fig. 3). Results showed that: (1) population density is not linearly correlated with light brightness in a global model; (2) VIIRS night-time light data can estimate population density when integrated with functional zones at the city level; (3) GWR is a robust model for mapping population distribution, as the adjusted R² of the GWR models was higher than that of the optimal OLS models, confirming better prediction accuracy. This method therefore provides detailed population density information for microscale urban studies.
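    The OLS-versus-GWR comparison rests on fitting, at every location, a weighted least-squares regression whose weights decay with distance from that location. Below is a minimal single-predictor sketch with an assumed Gaussian kernel and a fixed bandwidth; real GWR tools select the bandwidth by cross-validation, and all names here are illustrative:

    ```python
    import math

    def gwr_fit(coords, x, y, bandwidth):
        """Geographically weighted regression, one predictor: at each location,
        solve y ~ a + b*x by weighted least squares with Gaussian kernel weights."""
        params = []
        for cx, cy in coords:
            sw = swx = swy = swxx = swxy = 0.0
            for (px, py), xi, yi in zip(coords, x, y):
                d2 = (cx - px) ** 2 + (cy - py) ** 2
                w = math.exp(-d2 / (2.0 * bandwidth ** 2))   # kernel weight
                sw += w
                swx += w * xi
                swy += w * yi
                swxx += w * xi * xi
                swxy += w * xi * yi
            det = sw * swxx - swx * swx                      # 2x2 normal equations
            b = (sw * swxy - swx * swy) / det
            a = (swy - b * swx) / sw
            params.append((a, b))                            # local intercept, slope
        return params
    ```

    Because the coefficients (a, b) vary over space, the fit can capture the spatial nonstationarity between light DN and population density that a single global OLS line misses.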

  11. Representing Operational Modes for Situation Awareness

    NASA Astrophysics Data System (ADS)

    Kirchhübel, Denis; Lind, Morten; Ravn, Ole

    2017-01-01

    Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data, as well as operational knowledge about the state and design of the plant, are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator's task easier, but they require knowledge about the overall system in the form of a model. While tools for fault-tolerant control design based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering interactions at a plant-wide level. The alarm systems meant to support human operators in diagnosing the plant-wide situation, on the other hand, fail regularly when interactions between systems produce many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way that reduces the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through interconnected systems can be inferred, and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper, a consistent representation of possible configurations is deduced from the analysis of an exemplary start-up procedure using functional models. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of a diagnostics tool based on functional models. This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.

  12. Functional classification of skeletal muscle networks. I. Normal physiology

    PubMed Central

    Wang, Yu; Winters, Jack

    2012-01-01

    Extensive measurements of the parts list of human skeletal muscle through transcriptomics and other phenotypic assays offer the opportunity to reconstruct detailed functional models. Through integration of vast amounts of data present in databases and extant knowledge of muscle function combined with robust analyses that include a clustering approach, we present both a protein parts list and network models for skeletal muscle function. The model comprises the four key functional family networks that coexist within a functional space; namely, excitation-activation family (forward pathways that transmit a motoneuronal command signal into the spatial volume of the cell and then use Ca2+ fluxes to bind Ca2+ to troponin C sites on F-actin filaments, plus transmembrane pumps that maintain transmission capacity); mechanical transmission family (a sophisticated three-dimensional mechanical apparatus that bidirectionally couples the millions of actin-myosin nanomotors with external axial tensile forces at insertion sites); metabolic and bioenergetics family (pathways that supply energy for the skeletal muscle function under widely varying demands and provide for other cellular processes); and signaling-production family (which represents various sensing, signal transduction, and nuclear infrastructure that controls the turnover and structural integrity and regulates the maintenance, regeneration, and remodeling of the muscle). Within each family, we identify subfamilies that function as a unit through analysis of large-scale transcription profiles of muscle and other tissues. This comprehensive network model provides a framework for exploring functional mechanisms of the skeletal muscle in normal physiology and pathophysiology, as well as for quantitative modeling. PMID:23085959

  13. The Spinal Cord Injury- Functional Index: Item Banks to Measure Physical Functioning of Individuals with Spinal Cord Injury

    PubMed Central

    Tulsky, David S.; Jette, Alan; Kisala, Pamela A.; Kalpakjian, Claire; Dijkers, Marcel P.; Whiteneck, Gale; Ni, Pengsheng; Kirshblum, Steven; Charlifue, Susan; Heinemann, Allen W.; Forchheimer, Martin; Slavin, Mary; Houlihan, Bethlyn; Tate, Denise; Dyson-Hudson, Trevor; Fyffe, Denise; Williams, Steve; Zanca, Jeanne

    2012-01-01

    Objective: To develop a comprehensive set of patient-reported items to assess multiple aspects of physical functioning relevant to the lives of people with spinal cord injury (SCI), and to evaluate the underlying structure of physical functioning. Design: Cross-sectional. Setting: Inpatient and community. Participants: Item pools of physical functioning were developed, refined, and field tested in a large sample of 855 individuals with traumatic spinal cord injury stratified by diagnosis, severity, and time since injury. Interventions: None. Main Outcome Measure: SCI-FI measurement system. Results: Confirmatory factor analysis (CFA) indicated that a 5-factor model, including basic mobility, ambulation, wheelchair mobility, self-care, and fine motor function, had the best model fit and was most closely aligned conceptually with feedback received from individuals with SCI and SCI clinicians. When just the items making up basic mobility were tested in CFA, the fit statistics indicated strong support for a unidimensional model. Similar results were demonstrated for each of the other four factors, indicating unidimensional models. Conclusions: Though unidimensional or 2-factor (mobility and upper extremity) models of physical functioning make up outcome measures in the general population, the underlying structure of physical function in SCI is more complex. A 5-factor solution allows for comprehensive assessment of key domain areas of physical functioning. These results informed the structure and development of the SCI-FI measurement system of physical functioning. PMID:22609299

  14. The Bilinear Product Model of Hysteresis Phenomena

    NASA Astrophysics Data System (ADS)

    Kádár, György

    1989-01-01

    In ferromagnetic materials, non-reversible magnetization processes are represented by rather complex hysteresis curves. The phenomenological description of such curves requires multi-valued, yet unambiguous, deterministic functions. The history-dependent calculation of consecutive Everett integrals of the two-variable Preisach function can account for the main features of hysteresis curves in uniaxial magnetic materials. The traditional Preisach model has recently been modified on the basis of population dynamics considerations, removing the unrealistic congruency property of the model. The Preisach function was proposed to be a product of two factors of distinct physical significance: a magnetization-dependent function taking into account the overall magnetization state of the body, and a bilinear form of a single-variable, magnetic-field-dependent switching probability function. The most important statement of the bilinear product model is that the switching process of individual particles is to be separated from the book-keeping procedure of their states. This empirical model of hysteresis can easily be extended to other irreversible physical processes, such as first-order phase transitions.
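    The separation between the switching of individual particles and the book-keeping of their states is already explicit in the classical Preisach construction: a population of relay hysterons, each switching up at its own threshold alpha and down at beta. The following is a toy discretization for illustration (four assumed hysterons), not the bilinear product model itself:

    ```python
    def preisach_magnetization(field_history, thresholds):
        """Classical scalar Preisach model: each (alpha, beta) pair with alpha >= beta
        is a relay hysteron; magnetization is the average of the relay states."""
        states = [-1] * len(thresholds)          # start with all hysterons 'down'
        for h in field_history:
            for i, (alpha, beta) in enumerate(thresholds):
                if h >= alpha:
                    states[i] = +1               # switch up
                elif h <= beta:
                    states[i] = -1               # switch down
                # for beta < h < alpha the relay keeps its state (memory)
        return sum(states) / len(states)
    ```

    In the continuum limit the sum becomes an integral of the Preisach function over the half-plane alpha >= beta, which is what the Everett integrals mentioned above evaluate along the field history.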

  15. Nonlocal kinetic energy functional from the jellium-with-gap model: Applications to orbital-free density functional theory

    NASA Astrophysics Data System (ADS)

    Constantin, Lucian A.; Fabiano, Eduardo; Della Sala, Fabio

    2018-05-01

    Orbital-free density functional theory (OF-DFT) promises to describe the electronic structure of very large quantum systems, since its computational cost is linear in the system size. However, the accuracy of OF-DFT strongly depends on the approximation made for the kinetic energy (KE) functional. To date, the most accurate KE functionals are nonlocal functionals based on the linear-response kernel of the homogeneous electron gas, i.e., the jellium model. Here, we use the linear-response kernel of the jellium-with-gap model to construct a simple nonlocal KE functional (named KGAP) which depends on the band-gap energy. In the limit of vanishing energy gap (i.e., in the case of metals), KGAP is equivalent to the Smargiassi-Madden (SM) functional, which is accurate for metals. For a series of semiconductors (with different energy gaps), KGAP performs much better than SM, and results are close to those of state-of-the-art functionals with sophisticated density-dependent kernels.

  16. Effects of functional constraints and opportunism on the functional structure of a vertebrate predator assemblage.

    PubMed

    Farias, Ariel A; Jaksic, Fabian M

    2007-03-01

    1. Within mainstream ecological literature, functional structure has been viewed as resulting from the interplay of species interactions, resource levels and environmental variability. Classical models state that interspecific competition generates species segregation and guild formation in stable saturated environments, whereas opportunism causes species aggregation on abundant resources in variable unsaturated situations. 2. Nevertheless, intrinsic functional constraints may result in species-specific differences in resource-use capabilities. This could force some degree of functional structure without assuming other putative causes. However, the influence of such constraints has rarely been tested, and their relative contribution to observed patterns has not been quantified. 3. We used a multiple null-model approach to quantify the magnitude and direction (non-random aggregation or divergence) of the functional structure of a vertebrate predator assemblage exposed to variable prey abundance over an 18-year period. Observed trends were contrasted with predictions from null-models designed in an orthogonal fashion to account independently for the effects of functional constraints and opportunism. Subsequently, the unexplained variation was regressed against environmental variables to search for evidence of interspecific competition. 4. Overall, null-models accounting for functional constraints showed the best fit to the observed data, and suggested an effect of this factor in modulating predator opportunistic responses. However, regression models on residual variation indicated that such an effect was dependent on both total and relative abundance of principal (small mammals) and alternative (arthropods, birds, reptiles) prey categories. 5. In addition, no clear evidence for interspecific competition was found, but differential delays in predator functional responses could explain some of the unaccounted variation. 
Thus, we call for caution when interpreting empirical data in the context of classical models assuming synchronous responses of consumers to resource levels.

  17. Multiple metrics of diversity have different effects on temperate forest functioning over succession.

    PubMed

    Yuan, Zuoqiang; Wang, Shaopeng; Gazol, Antonio; Mellard, Jarad; Lin, Fei; Ye, Ji; Hao, Zhanqing; Wang, Xugao; Loreau, Michel

    2016-12-01

    Biodiversity can be measured by taxonomic, phylogenetic, and functional diversity. How ecosystem functioning depends on these measures of diversity can vary from site to site and depends on successional stage. Here, we measured taxonomic, phylogenetic, and functional diversity, and examined their relationship with biomass in two successional stages of the broad-leaved Korean pine forest in northeastern China. Functional diversity was calculated from six plant traits, and aboveground biomass (AGB) and coarse woody productivity (CWP) were estimated using data from three forest censuses (10 years) in two large fully mapped forest plots (25 and 5 ha). Eleven of the 12 regressions between biomass variables (AGB and CWP) and indices of diversity showed significant positive relationships, especially those with phylogenetic diversity. The mean of the tree diversity-biomass regressions increased from 0.11 in secondary forest to 0.31 in old-growth forest, implying a stronger biodiversity effect in more mature forest. Multi-model selection results showed that models including species richness, phylogenetic diversity, and single functional traits explained more variation in forest biomass than other candidate models. The models with a single functional trait, i.e., leaf area in secondary forest and wood density in mature forest, provided better explanations for forest biomass than models that combined all six functional traits. This finding may reflect different strategies in growth and resource acquisition in secondary and old-growth forests.

  18. Development of numerical model for predicting heat generation and temperatures in MSW landfills.

    PubMed

    Hanson, James L; Yeşiller, Nazli; Onnen, Michael T; Liu, Wei-Lien; Oettle, Nicolas K; Marinos, Janelle A

    2013-10-01

    A numerical modeling approach has been developed for predicting temperatures in municipal solid waste landfills. Model formulation and details of boundary conditions are described. Model performance was evaluated using field data from a landfill in Michigan, USA. The numerical approach was based on finite element analysis incorporating transient conductive heat transfer. Heat generation functions representing decomposition of wastes were empirically developed and incorporated into the formulation. Thermal properties of materials were determined using experimental testing, field observations, and data reported in the literature. The boundary conditions consisted of seasonal temperature cycles at the ground surface and constant temperatures at the far-field boundary. Heat generation functions were developed sequentially using varying degrees of conceptual complexity in modeling. First, a step function was developed to represent initial (aerobic) and residual (anaerobic) conditions. Second, an exponential growth-decay function was established. Third, the function was scaled for temperature dependency. Finally, an energy-expended function was developed to simulate heat generation with waste age as a function of temperature. Results are presented and compared to field data for the temperature-dependent growth-decay functions. The formulations developed can be used for prediction of temperatures within various components of landfill systems (liner, waste mass, cover, and surrounding subgrade), determination of frost depths, and determination of heat gain due to decomposition of wastes. Copyright © 2013 Elsevier Ltd. All rights reserved.
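    The exponential growth-decay stage can be illustrated with a double-exponential rate curve that rises as decomposition ramps up and decays as substrate is consumed. The functional form, units, and parameter values below are assumptions for illustration, not the calibrated heat generation functions developed in the paper:

    ```python
    import math

    def heat_generation(t, peak_rate=20.0, t_growth=1.0, t_decay=10.0):
        """Illustrative growth-decay heat generation rate (e.g., W/m^3) at waste age t
        (years), normalized so the maximum of the curve equals peak_rate."""
        shape = lambda u: math.exp(-u / t_decay) - math.exp(-u / t_growth)
        # the unnormalized shape peaks at this age (set the derivative to zero):
        t_peak = (t_decay * t_growth / (t_decay - t_growth)) * math.log(t_decay / t_growth)
        return peak_rate * shape(t) / shape(t_peak)
    ```

    A temperature-dependent variant, as in the sequential formulations described above, would additionally scale this rate by a factor evaluated at the local waste temperature.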

  19. The Two-Dimensional Gabor Function Adapted to Natural Image Statistics: A Model of Simple-Cell Receptive Fields and Sparse Structure in Images.

    PubMed

    Loxley, P N

    2017-10-01

    The two-dimensional Gabor function is adapted to natural image statistics, leading to a tractable probabilistic generative model that can be used to model simple-cell receptive field profiles, or generate basis functions for sparse coding applications. Learning is found to be most pronounced in three Gabor function parameters representing the size and spatial frequency of the two-dimensional Gabor function and characterized by a nonuniform probability distribution with heavy tails. All three parameters are found to be strongly correlated, resulting in a basis of multiscale Gabor functions with similar aspect ratios and size-dependent spatial frequencies. A key finding is that the distribution of receptive-field sizes is scale invariant over a wide range of values, so there is no characteristic receptive field size selected by natural image statistics. The Gabor function aspect ratio is found to be approximately conserved by the learning rules and is therefore not well determined by natural image statistics. This allows for three distinct solutions: a basis of Gabor functions with sharp orientation resolution at the expense of spatial-frequency resolution, a basis of Gabor functions with sharp spatial-frequency resolution at the expense of orientation resolution, or a basis with unit aspect ratio. Arbitrary mixtures of all three cases are also possible. Two parameters controlling the shape of the marginal distributions in a probabilistic generative model fully account for all three solutions. The best-performing probabilistic generative model for sparse coding applications is found to be a Gaussian copula with Pareto marginal probability density functions.
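    The two-dimensional Gabor function itself is simply a Gaussian envelope multiplying a sinusoidal carrier. A standard parameterization is sketched below; the parameter names and default values are arbitrary, and this is not the adapted generative model of the paper:

    ```python
    import math

    def gabor_2d(x, y, sigma_x=2.0, sigma_y=1.0, freq=0.25, theta=0.0, phase=0.0):
        """2-D Gabor function: Gaussian envelope times a cosine carrier.
        theta rotates the filter; sigma_x/sigma_y is the aspect ratio."""
        xr = x * math.cos(theta) + y * math.sin(theta)    # rotate into filter frame
        yr = -x * math.sin(theta) + y * math.cos(theta)
        envelope = math.exp(-(xr ** 2 / (2.0 * sigma_x ** 2) +
                              yr ** 2 / (2.0 * sigma_y ** 2)))
        carrier = math.cos(2.0 * math.pi * freq * xr + phase)
        return envelope * carrier
    ```

    The size parameters (sigma_x, sigma_y) and the spatial frequency freq correspond to the three strongly learned, correlated parameters reported above, while the aspect ratio sigma_x/sigma_y is the quantity left nearly unconstrained by natural image statistics.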

  20. Activity Diagrams for DEVS Models: A Case Study Modeling Health Care Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J

    Discrete Event Systems Specification (DEVS) is a widely used formalism for modeling and simulation of discrete and continuous systems. While DEVS provides a sound mathematical representation of discrete systems, its practical use can suffer when models become complex. Five main functions, which construct the core of atomic modules in DEVS, can realize the behaviors that modelers want to represent. The integration of these functions is handled by the simulation routine; however, modelers can implement each function in various ways. Therefore, there is a need for graphical representations of complex models to simplify their implementation and facilitate their reproduction. In this work, we illustrate the use of activity diagrams for this purpose in the context of a health care behavior model, which is developed with an agent-based modeling paradigm.

  1. Development of a Dynamic Visco-elastic Vehicle-Soil Interaction Model for Rut Depth, and Power Determinations

    DTIC Science & Technology

    2011-09-06

    Presentation outline: A) Review of soil model governing equations; B) Development of pedo-transfer functions (terrain database to engineering properties); C) ... (lateral earth pressure). Development of pedo-transfer functions: engineering parameters needed by the soil model include the compression index, rebound ... inches, RCI for fine-grained soils, CI for coarse-grained soils. Pedo-transfer function: need to transfer the existing terrain database

  2. Mathematical Models to Determine Stable Behavior of Complex Systems

    NASA Astrophysics Data System (ADS)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of such a complex dynamic system is described in terms of chaotic states, self-organized criticality, and bifurcations. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models, while taking strange attractors into account.

  3. Integral formulae of the canonical correlation functions for the one dimensional transverse Ising model

    NASA Astrophysics Data System (ADS)

    Inoue, Makoto

    2017-12-01

    Some new formulae for the canonical correlation functions of the one-dimensional quantum transverse Ising model are found by the ST-transformation method, using Morita's sum rule and its extensions for the two-dimensional classical Ising model. As a consequence, we obtain a time-independent term of the dynamical correlation functions. Differences between the quantum and classical versions of these formulae are also discussed.

  4. Executive functions, impulsivity, and inhibitory control in adolescents: A structural equation model

    PubMed Central

    Fino, Emanuele; Melogno, Sergio; Iliceto, Paolo; D’Aliesio, Sara; Pinto, Maria Antonietta; Candilera, Gabriella; Sabatello, Ugo

    2014-01-01

    Background. Adolescence represents a critical period for brain development, which neurodevelopmental models link to frontal, subcortical-limbic, and striatal activation, a pattern associated with a rise in impulsivity and deficits in inhibitory control. The present study aimed to examine the association of self-report measures of impulsivity and inhibitory control with executive function in adolescents, employing structural equation modeling. Method. Tests were administered to 434 high school students. Acting without thinking was measured through the Barratt Impulsiveness Scale and the Dickman Impulsivity Inventory, reward sensitivity through the Behavioral Activation System, and sensation seeking through the Zuckerman–Kuhlman–Aluja Personality Questionnaire. Inhibitory control was assessed through the Behavioral Inhibition System. Performance on the Wisconsin Card Sorting Task indicated executive function. Three models were specified using the sample covariance matrix, and the parameters were estimated using maximum likelihood. Results. In the final model, impulsivity and inhibitory control predicted executive function, but sensation seeking did not. The fit of the model to the data was excellent. Conclusions. The hypothesis that inhibitory control and impulsivity are predictors of executive function was supported. Our results speak to the validity of self-report measures for examining the relation of impulsivity traits, rather than others, to the regulatory function of cognition and behavior. PMID:25157298

  5. Physical models have gender-specific effects on student understanding of protein structure-function relationships.

    PubMed

    Forbes-Lorman, Robin M; Harris, Michelle A; Chang, Wesley S; Dent, Erik W; Nordheim, Erik V; Franzen, Margaret A

    2016-07-08

    Understanding how basic structural units influence function is identified as a foundational/core concept for undergraduate biological and biochemical literacy. It is essential for students to understand this concept at all size scales, but it is often more difficult for students to understand structure-function relationships at the molecular level, which they cannot as effectively visualize. Students need to develop accurate, 3-dimensional mental models of biomolecules to understand how biomolecular structure affects cellular functions at the molecular level, yet most traditional curricular tools such as textbooks include only 2-dimensional representations. We used a controlled, backward design approach to investigate how hand-held physical molecular model use affected students' ability to logically predict structure-function relationships. Brief (one class period) physical model use increased quiz scores for females, whereas there was no significant increase in scores for males using physical models. Females also self-reported higher learning gains in their understanding of context-specific protein function. Gender differences in spatial visualization may explain the gender-specific benefits of physical model use observed. © 2016 The Authors Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of International Union of Biochemistry and Molecular Biology, 44(4):326-335, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.

  6. Functional mixture regression.

    PubMed

    Yao, Fang; Fu, Yuejiao; Lee, Thomas C M

    2011-04-01

    In functional linear models (FLMs), the relationship between the scalar response and the functional predictor process is often assumed to be identical for all subjects. Motivated by both practical and methodological considerations, we relax this assumption and propose a new class of functional regression models that allow the regression structure to vary for different groups of subjects. By projecting the predictor process onto its eigenspace, the new functional regression model is simplified to a framework that is similar to classical mixture regression models. This leads to the proposed approach, named functional mixture regression (FMR). The estimation of FMR can be readily carried out using existing software implemented for functional principal component analysis and mixture regression. The practical necessity and performance of FMR are illustrated through applications to a longevity analysis of female medflies and a human growth study. Theoretical investigations concerning the consistent estimation and prediction properties of FMR, along with simulation experiments illustrating its empirical properties, are presented in the supplementary material available at Biostatistics online. Corresponding results demonstrate that the proposed approach could potentially achieve substantial gains over traditional FLMs.

  7. Experimental testing of Mackay's model for functional antagonism in the isolated costo-uterus of the rat.

    PubMed Central

    Henry, P. J.; Lulich, K. M.; Paterson, J. W.

    1985-01-01

Several key predictions of a recently developed model for functional antagonism (Mackay, 1981) were experimentally tested using the rat isolated costo-uterine preparation. In the presence of the functional antagonist fenoterol (Fen), the functional constants (KAF) for carbachol and oxotremorine (Oxo) were respectively 9.9- and 3.4-fold greater than their corresponding affinity constants (KA). According to Mackay's model for functional antagonism, the higher KAF/KA ratio for carbachol indicates that this cholinoceptor agonist has a greater efficacy than Oxo. This was confirmed using conventional pharmacological methods. As predicted by the model of functional antagonism, the plot of KAF/KA − 1 against the fraction of cholinoceptors not irreversibly blocked by phenoxybenzamine (Pbz) was linear for both carbachol and Oxo, and the lines of best fit crossed the axes at a point not significantly different from the origin. The value of 4.6 for the relative efficacy of carbachol to Oxo estimated from functional antagonism studies was comparable to the value of 5.6 calculated using the method of irreversible antagonism proposed by Furchgott (1966). PMID:3840396

  8. Modeling corneal surfaces with rational functions for high-speed videokeratoscopy data compression.

    PubMed

    Schneider, Martin; Iskander, D Robert; Collins, Michael J

    2009-02-01

    High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this new technique results in a very large amount of digital data for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions that have been traditionally used for modeling corneal surfaces may not necessarily correctly represent given corneal surface data in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error as well as the point spread function cross-correlation. The parameters of approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
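As a simplified, hypothetical analogue of the rational-function fitting step: a low-order 1-D rational function can be estimated by multiplying through by its denominator, which turns the fit into ordinary least squares. The paper itself fits Zernike-polynomial-based rational surfaces with nonlinear Levenberg-Marquardt iterations; this linearized 1-D toy, with invented coefficients, only illustrates the parsimony idea:

```python
import numpy as np

# Synthetic "surface profile" data from a known low-order rational function
# f(x) = (a0 + a1*x) / (1 + b1*x); coefficient values are illustrative only.
x = np.linspace(0.0, 1.0, 50)
a0_true, a1_true, b1_true = 1.0, 2.0, 0.5
y = (a0_true + a1_true * x) / (1.0 + b1_true * x)

# Multiplying through by the denominator linearizes the problem:
#   y * (1 + b1*x) = a0 + a1*x   =>   y = a0 + a1*x - b1*(x*y)
# so the three coefficients solve an ordinary least-squares system.
A = np.column_stack([np.ones_like(x), x, -x * y])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a0, a1, b1 = coef
```

With noisy data this linearization biases the estimates, which is one reason a nonlinear refinement such as Levenberg-Marquardt is used in practice.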

  9. Diffusion in different models of active Brownian motion

    NASA Astrophysics Data System (ADS)

    Lindner, B.; Nicola, E. M.

    2008-04-01

    Active Brownian particles (ABP) have served as phenomenological models of self-propelled motion in biology. We study the effective diffusion coefficient of two one-dimensional ABP models (simplified depot model and Rayleigh-Helmholtz model) differing in their nonlinear friction functions. Depending on the choice of the friction function the diffusion coefficient does or does not attain a minimum as a function of noise intensity. We furthermore discuss the case of an additional bias breaking the left-right symmetry of the system. We show that this bias induces a drift and that it generally reduces the diffusion coefficient. For a finite range of values of the bias, both models can exhibit a maximum in the diffusion coefficient vs. noise intensity.
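A sketch of how an effective diffusion coefficient is estimated from simulated trajectories. For linear friction (an Ornstein-Uhlenbeck velocity process, not the nonlinear depot or Rayleigh-Helmholtz friction functions studied in the paper) the answer D = Q/gamma^2 is known exactly, which makes it a convenient check of the long-time mean-squared-displacement estimator; all parameter values here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Langevin dynamics with linear friction as a baseline: for
#   dv = -gamma*v*dt + sqrt(2*Q)*dW
# the effective diffusion coefficient is exactly D = Q / gamma**2, which lets
# us validate the MSD estimator one would also apply to nonlinear ABP models.
gamma, Q = 1.0, 0.5
dt, steps, n = 0.01, 5000, 500   # time step, step count, number of particles

x = np.zeros(n)
v = np.zeros(n)
for _ in range(steps):
    v += -gamma * v * dt + np.sqrt(2.0 * Q * dt) * rng.normal(size=n)
    x += v * dt

t_total = steps * dt
D_est = np.mean(x**2) / (2.0 * t_total)   # long-time MSD estimator
D_theory = Q / gamma**2
```

Replacing the `-gamma * v` term with a nonlinear friction function (and adding a bias force) is the kind of modification the paper analyzes.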

  10. On the numbers of images of two stochastic gravitational lensing models

    NASA Astrophysics Data System (ADS)

    Wei, Ang

    2017-02-01

We study two gravitational lensing models with Gaussian randomness: the continuous mass fluctuation model and the floating black hole model. The lens equations of these models are related to certain random harmonic functions. Using Rice's formula and Gaussian techniques, we obtain the expected numbers of zeros of these functions, which give the expected numbers of images in the corresponding lens systems.
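Rice's formula can be checked numerically on a simple Gaussian example. For a random trigonometric polynomial with iid standard normal coefficients, the expected number of zeros on [0, 2π] follows in closed form from the covariance at lag zero; a Monte Carlo sketch (the paper's lensing models involve random harmonic functions well beyond this toy case):

```python
import numpy as np

rng = np.random.default_rng(2)

# Rice's formula for a stationary Gaussian process gives the expected zero
# count on [0, 2*pi] as 2*sqrt(-r''(0)/r(0)).  For the random trigonometric
# polynomial sum_k (a_k cos(kt) + b_k sin(kt)) with iid N(0,1) coefficients,
# r(0) = n and -r''(0) = sum_k k**2, so the expectation is known exactly.
n = 5
ks = np.arange(1, n + 1)
expected = 2.0 * np.sqrt((ks**2).sum() / n)

# Monte Carlo check: count sign changes on a fine grid, average over trials.
t = np.linspace(0.0, 2.0 * np.pi, 2000)
counts = []
for _ in range(300):
    a = rng.normal(size=n)
    b = rng.normal(size=n)
    f = a @ np.cos(np.outer(ks, t)) + b @ np.sin(np.outer(ks, t))
    counts.append(np.sum(np.sign(f[:-1]) != np.sign(f[1:])))
mc_mean = float(np.mean(counts))
```

The empirical mean zero count converges to the Rice prediction as the number of trials grows.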

  11. Reorganization of the Connectivity between Elementary Functions – A Model Relating Conscious States to Neural Connections

    PubMed Central

    Mogensen, Jesper; Overgaard, Morten

    2017-01-01

In the present paper it is argued that the “neural correlate of consciousness” (NCC) does not appear to be a separate “module” – but an aspect of information processing within the neural substrate of various cognitive processes. Consequently, NCC can only be addressed adequately within frameworks that model the general relationship between neural processes and mental states – and take into account the dynamic connectivity of the brain. We presently offer the REFGEN (general reorganization of elementary functions) model as such a framework. This model builds upon and expands the REF (reorganization of elementary functions) and REFCON (reorganization of elementary functions and consciousness) models. All three models integrate the relationship between the neural and mental layers of description via the construction of an intermediate level dealing with computational states. The importance of experience-based organization of neural and cognitive processes is stressed. The models assume that the mechanisms of consciousness are in principle the same as the basic mechanisms of all aspects of cognition – when information is processed to a sufficiently “high level” it becomes available to conscious experience. Within the REFGEN model, the NCC is seen as an aspect of the dynamic and experience-driven reorganizations of the synaptic connectivity between the neurocognitive “building blocks” of the model – the elementary functions. PMID:28473797

  12. A generalization of the Becker model in linear viscoelasticity: creep, relaxation and internal friction

    NASA Astrophysics Data System (ADS)

    Mainardi, Francesco; Masina, Enrico; Spada, Giorgio

    2018-02-01

    We present a new rheological model depending on a real parameter ν \\in [0,1], which reduces to the Maxwell body for ν =0 and to the Becker body for ν =1. The corresponding creep law is expressed in an integral form in which the exponential function of the Becker model is replaced and generalized by a Mittag-Leffler function of order ν . Then the corresponding non-dimensional creep function and its rate are studied as functions of time for different values of ν in order to visualize the transition from the classical Maxwell body to the Becker body. Based on the hereditary theory of linear viscoelasticity, we also approximate the relaxation function by solving numerically a Volterra integral equation of the second kind. In turn, the relaxation function is shown versus time for different values of ν to visualize again the transition from the classical Maxwell body to the Becker body. Furthermore, we provide a full characterization of the new model by computing, in addition to the creep and relaxation functions, the so-called specific dissipation Q^{-1} as a function of frequency, which is of particular relevance for geophysical applications.
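The transition from the Maxwell body to the Becker body hinges on the one-parameter Mittag-Leffler function E_ν. A minimal numerical sketch using a truncated power series (adequate for small to moderate arguments; the model's full creep integral and the Volterra equation for the relaxation function are not reproduced here):

```python
import math

def mittag_leffler(nu, z, terms=100):
    """One-parameter Mittag-Leffler function
        E_nu(z) = sum_{k>=0} z**k / Gamma(nu*k + 1),
    evaluated by truncated power series (fine for moderate |z|)."""
    return sum(z**k / math.gamma(nu * k + 1.0) for k in range(terms))

# Familiar special cases tying the generalization back to known functions:
# E_1(z) = exp(z)  (the exponential of the classical Becker kernel), and
# E_0(z) = 1/(1 - z) for |z| < 1 (the geometric series).
val_exp_like = mittag_leffler(1.0, -0.5)
val_geometric = mittag_leffler(0.0, 0.5)
```

Evaluating E_ν on a time grid for several ν in [0, 1] is then enough to visualize the Maxwell-to-Becker transition described in the abstract.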

  13. Modeling Limited Foresight in Water Management Systems

    NASA Astrophysics Data System (ADS)

    Howitt, R.

    2005-12-01

The inability to forecast future water supplies means that their management inevitably occurs under situations of limited foresight. Three modeling problems arise: first, what type of objective function is a manager with limited foresight optimizing? Second, how can we measure these objectives? Third, can objective functions that incorporate uncertainty be integrated within the structure of optimizing water management models? The paper reviews the concepts of relative risk aversion and intertemporal substitution that underlie stochastic dynamic preference functions. Some initial results from the estimation of such functions for four different dam operations in northern California are presented and discussed. It appears that the path of previous water decisions and states influences the decision-maker's willingness to trade off water supplies between periods. A compromise modeling approach that incorporates carry-over value functions under limited foresight within a broader network optimal water management model is developed. The approach uses annual carry-over value functions derived from small-dimension stochastic dynamic programs embedded within a larger-dimension water allocation network. The disaggregation of the carry-over value functions to the broader network is extended using the space rule concept. Initial results suggest that the solution of such annual nonlinear network optimizations is comparable to, or faster than, the solution of linear network problems over long time series.

  14. Hemorrhage and Hemorrhagic Shock in Swine: A Review

    DTIC Science & Technology

    1989-11-01

Report excerpt (table-of-contents fragments: Temperature Regulation; Blood Gas and Acid-Base Status; Electrolytes; Renal Function; Hepatic Function; Central Nervous System Function). MODELS: Most porcine hemorrhage models are based on concepts and procedures previously developed in other species, especially the dog. As a consequence

  15. Assessment of triglyceride and cholesterol in overweight people based on multiple linear regression and artificial intelligence model.

    PubMed

    Ma, Jing; Yu, Jiong; Hao, Guangshu; Wang, Dan; Sun, Yanni; Lu, Jianxin; Cao, Hongcui; Lin, Feiyan

    2017-02-20

The prevalence of hyperlipidemia is increasing around the world. Our aims were to analyze the relationship of triglyceride (TG) and cholesterol (TC) with indexes of liver function and kidney function, and to develop a prediction model of TG and TC in overweight people. A total of 302 healthy adult subjects and 273 overweight subjects were enrolled in this study. The levels of fasting TG (fs-TG), fasting TC (fs-TC), blood glucose, and indexes of liver and kidney function were measured and analyzed by correlation analysis and multiple linear regression (MLR). A back propagation artificial neural network (BP-ANN) was applied to develop prediction models of fs-TG and fs-TC. The results showed significant differences in biochemical indexes between healthy and overweight people. The correlation analysis showed that fs-TG was related to weight, height, blood glucose, and indexes of liver and kidney function, while fs-TC was correlated with age and indexes of liver function (P < 0.01). The MLR analysis indicated that the regression equations of fs-TG and fs-TC were both statistically significant (P < 0.01) with the included independent variables. The BP-ANN model of fs-TG reached its training goal at 59 epochs, while the fs-TC model achieved high prediction accuracy after 1000 training epochs. In conclusion, fs-TG and fs-TC were strongly related to weight, height, age, blood glucose, and indexes of liver and kidney function. Based on these related variables, fs-TG and fs-TC can be predicted by BP-ANN models in overweight people.
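The MLR step can be illustrated on synthetic data. Everything below is invented for the sketch (the column meanings, coefficient values, and sample size are not from the study); it only shows the ordinary-least-squares machinery, not the study's fitted model or the BP-ANN:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative stand-in for the MLR step: predict a lipid index from a few
# covariates (columns standing in for weight, height, glucose, ALT, creatinine).
n = 200
X = rng.normal(size=(n, 5))
beta_true = np.array([0.4, -0.2, 0.3, 0.5, 0.1])   # arbitrary illustrative values
y = 1.5 + X @ beta_true + 0.05 * rng.normal(size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, beta_hat = coef[0], coef[1:]
```

A BP-ANN would replace this single linear map with a trained multilayer network, at the cost of epochs of iterative fitting as described in the abstract.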

  16. Simultaneous optimization of biomolecular energy function on features from small molecules and macromolecules

    PubMed Central

    Park, Hahnbeom; Bradley, Philip; Greisen, Per; Liu, Yuan; Mulligan, Vikram Khipple; Kim, David E.; Baker, David; DiMaio, Frank

    2017-01-01

Most biomolecular modeling energy functions for structure prediction, sequence design, and molecular docking have been parameterized using existing macromolecular structural data; this contrasts with molecular mechanics force fields, which are largely optimized using small-molecule data. In this study, we describe an integrated method that enables optimization of a biomolecular modeling energy function simultaneously against small-molecule thermodynamic data and high-resolution macromolecular structural data. We use this approach to develop a next-generation Rosetta energy function that utilizes a new anisotropic implicit solvation model and improved electrostatics and Lennard-Jones models, illustrating how energy functions can be considerably improved in their ability to describe large-scale energy landscapes by incorporating both small-molecule and macromolecule data. The energy function improves performance in a wide range of protein structure prediction challenges, including monomeric structure prediction, protein-protein and protein-ligand docking, protein sequence design, and prediction of free energy changes upon mutation, while reasonably recapitulating small-molecule thermodynamic properties. PMID:27766851

  17. Features of spatial and functional segregation and integration of the primate connectome revealed by trade-off between wiring cost and efficiency

    PubMed Central

    Chen, Yuhan; Wang, Shengjun

    2017-01-01

The primate connectome, possessing a characteristic global topology and specific regional connectivity profiles, is well organized to support both segregated and integrated brain function. However, the organization mechanisms shaping the characteristic connectivity and its relationship to functional requirements remain unclear. The primate brain connectome is shaped by metabolic economy as well as functional values. Here, we explored the influence of two competing factors and additional advanced functional requirements on the primate connectome, employing an optimal trade-off model between neural wiring cost and the representative functional requirement of processing efficiency. Moreover, we compared this model with a generative model combining spatial distance and topological similarity, with the objective of statistically reproducing multiple topological features of the network. We found that the primate connectome indeed displays a cost-efficiency trade-off and that up to 67% of the connections were recovered by an optimal combination of the two basic factors of wiring economy and processing efficiency, clearly higher than the proportion of connections (56%) explained by the generative model. While not explicitly aimed for, the trade-off model captured several key topological features of the real connectome, as the generative model did, yet better explained the connectivity of most regions. The majority of the remaining 33% of connections unexplained by the best trade-off model were long-distance links, which are concentrated on a few cortical areas, termed long-distance connectors (LDCs). The LDCs are mainly non-hubs, but form a densely connected group overlapping spatially segregated functional modalities. LDCs are crucial for both functional segregation and integration across different scales.
These organization features revealed by the optimization analysis provide evidence that the demands of advanced functional segregation and integration among spatially distributed regions may play a significant role in shaping the cortical connectome, in addition to the basic cost-efficiency trade-off. These findings also shed light on inherent vulnerabilities of brain networks in diseases. PMID:28961235

  18. Features of spatial and functional segregation and integration of the primate connectome revealed by trade-off between wiring cost and efficiency.

    PubMed

    Chen, Yuhan; Wang, Shengjun; Hilgetag, Claus C; Zhou, Changsong

    2017-09-01

The primate connectome, possessing a characteristic global topology and specific regional connectivity profiles, is well organized to support both segregated and integrated brain function. However, the organization mechanisms shaping the characteristic connectivity and its relationship to functional requirements remain unclear. The primate brain connectome is shaped by metabolic economy as well as functional values. Here, we explored the influence of two competing factors and additional advanced functional requirements on the primate connectome, employing an optimal trade-off model between neural wiring cost and the representative functional requirement of processing efficiency. Moreover, we compared this model with a generative model combining spatial distance and topological similarity, with the objective of statistically reproducing multiple topological features of the network. We found that the primate connectome indeed displays a cost-efficiency trade-off and that up to 67% of the connections were recovered by an optimal combination of the two basic factors of wiring economy and processing efficiency, clearly higher than the proportion of connections (56%) explained by the generative model. While not explicitly aimed for, the trade-off model captured several key topological features of the real connectome, as the generative model did, yet better explained the connectivity of most regions. The majority of the remaining 33% of connections unexplained by the best trade-off model were long-distance links, which are concentrated on a few cortical areas, termed long-distance connectors (LDCs). The LDCs are mainly non-hubs, but form a densely connected group overlapping spatially segregated functional modalities. LDCs are crucial for both functional segregation and integration across different scales.
These organization features revealed by the optimization analysis provide evidence that the demands of advanced functional segregation and integration among spatially distributed regions may play a significant role in shaping the cortical connectome, in addition to the basic cost-efficiency trade-off. These findings also shed light on inherent vulnerabilities of brain networks in diseases.

  19. Specifications of insilicoML 1.0: a multilevel biophysical model description language.

    PubMed

    Asai, Yoshiyuki; Suzuki, Yasuyuki; Kido, Yoshiyuki; Oka, Hideki; Heien, Eric; Nakanishi, Masao; Urai, Takahito; Hagihara, Kenichi; Kurachi, Yoshihisa; Nomura, Taishin

    2008-12-01

An extensible markup language format, insilicoML (ISML), version 0.1, describing multi-level biophysical models has been developed and is available in the public domain. ISML is fully compatible with CellML 1.0, a model description standard developed by the IUPS Physiome Project, for enhancing knowledge integration and model sharing. This article illustrates the new specifications of ISML 1.0, which largely extend the capability of ISML 0.1. ISML 1.0 can describe various types of mathematical models, including ordinary/partial differential/difference equations representing the dynamics of physiological functions and the geometry of living organisms underlying the functions. ISML 1.0 describes a model using a set of functional elements (modules), each of which can specify mathematical expressions of the functions. Structural and logical relationships between any two modules are specified by edges, which allow modular, hierarchical, and/or network representations of the model. The role of edge relationships is enriched by keywords for use in constructing a physiological ontology. The ontology is further improved by the traceability of the history of the model's development and by linking between different ISML models stored in the model database using meta-information. ISML 1.0 is designed to operate with a model database and integrated environments for model development and simulation, for knowledge integration and discovery.

  20. Functional insights from proteome-wide structural modeling of Treponema pallidum subspecies pallidum, the causative agent of syphilis.

    PubMed

    Houston, Simon; Lithgow, Karen Vivien; Osbak, Kara Krista; Kenyon, Chris Richard; Cameron, Caroline E

    2018-05-16

    Syphilis continues to be a major global health threat with 11 million new infections each year, and a global burden of 36 million cases. The causative agent of syphilis, Treponema pallidum subspecies pallidum, is a highly virulent bacterium, however the molecular mechanisms underlying T. pallidum pathogenesis remain to be definitively identified. This is due to the fact that T. pallidum is currently uncultivatable, inherently fragile and thus difficult to work with, and phylogenetically distinct with no conventional virulence factor homologs found in other pathogens. In fact, approximately 30% of its predicted protein-coding genes have no known orthologs or assigned functions. Here we employed a structural bioinformatics approach using Phyre2-based tertiary structure modeling to improve our understanding of T. pallidum protein function on a proteome-wide scale. Phyre2-based tertiary structure modeling generated high-confidence predictions for 80% of the T. pallidum proteome (780/978 predicted proteins). Tertiary structure modeling also inferred the same function as primary structure-based annotations from genome sequencing pipelines for 525/605 proteins (87%), which represents 54% (525/978) of all T. pallidum proteins. Of the 175 T. pallidum proteins modeled with high confidence that were not assigned functions in the previously annotated published proteome, 167 (95%) were able to be assigned predicted functions. Twenty-one of the 175 hypothetical proteins modeled with high confidence were also predicted to exhibit significant structural similarity with proteins experimentally confirmed to be required for virulence in other pathogens. Phyre2-based structural modeling is a powerful bioinformatics tool that has provided insight into the potential structure and function of the majority of T. pallidum proteins and helped validate the primary structure-based annotation of more than 50% of all T. pallidum proteins with high confidence. 
This work represents the first T. pallidum proteome-wide structural modeling study and is one of few studies to apply this approach for the functional annotation of a whole proteome.

  1. Including Effects of Water Stress on Dead Organic Matter Decay to a Forest Carbon Model

    NASA Astrophysics Data System (ADS)

    Kim, H.; Lee, J.; Han, S. H.; Kim, S.; Son, Y.

    2017-12-01

Decay of dead organic matter is a key process of carbon (C) cycling in forest ecosystems. The change in decay rate depends on temperature sensitivity and moisture conditions. The Forest Biomass and Dead organic matter Carbon (FBDC) model includes a decay sub-model considering temperature sensitivity, yet does not consider moisture conditions as a driver of decay rate change. This study aimed to improve the FBDC model by adding a water stress function to the decay sub-model, and to simulate soil C sequestration under climate change with the improved model. The water stress functions, based on the ratio of precipitation to potential evapotranspiration, were determined with data from a decomposition study of Quercus variabilis and Pinus densiflora forests of Korea, and adjustment parameters of the functions were determined for both species. Including the water stress function increased the explained variance of the decay rate by 19% for the Q. variabilis forests and 7% for the P. densiflora forests, respectively. The increase in explained variance resulted from the large difference in precipitation relative to temperature across the decomposition study plots: during the experiment, the mean annual temperature range was less than 3°C, while annual precipitation ranged from 720 mm to 1466 mm. Application of the water stress functions to the FBDC model constrained the increasing trend of temperature sensitivity under climate change, and thus increased the model-estimated soil C sequestration (Mg C ha-1) by 6.6 for the Q. variabilis forests and by 3.1 for the P. densiflora forests, respectively. The addition of water stress functions increased the reliability of decay rate estimation and could contribute to reducing bias in estimating soil C sequestration under varying moisture conditions.
Acknowledgement: This study was supported by Korea Forest Service (2017044B10-1719-BB01)
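A hedged sketch of how such a moisture modifier might enter a decay-rate calculation. The saturating functional form and every parameter value below are assumptions for illustration only; the study derives its own water stress functions from the P/PET ratio and fits species-specific adjustment parameters to decomposition data:

```python
def water_stress_modifier(p_over_pet, half_sat=0.5):
    """Illustrative water stress scalar in [0, 1) driven by the P/PET ratio.
    The Michaelis-Menten-like form and half_sat value are assumptions of this
    sketch, not the fitted FBDC functions."""
    r = max(p_over_pet, 0.0)
    return r / (half_sat + r)

def decay_rate(k_base, q10, temp_c, p_over_pet, ref_temp_c=10.0):
    """Base decay rate scaled by a Q10 temperature term and the moisture term."""
    temp_factor = q10 ** ((temp_c - ref_temp_c) / 10.0)
    return k_base * temp_factor * water_stress_modifier(p_over_pet)
```

Under this structure, wetter sites (higher P/PET) decay faster at a given temperature, which is the qualitative behavior the added water stress function contributes to the FBDC decay sub-model.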

  2. Elucidation of spin echo small angle neutron scattering correlation functions through model studies.

    PubMed

    Shew, Chwen-Yang; Chen, Wei-Ren

    2012-02-14

Several single-modal Debye correlation functions approximating parts of the overall Debye correlation function of liquids are closely examined to elucidate their behavior in the corresponding spin echo small angle neutron scattering (SESANS) correlation functions. We find that the maximum length scale of a Debye correlation function is identical to that of its SESANS correlation function. For discrete Debye correlation functions, the peak of the SESANS correlation function emerges at their first discrete point, whereas for continuous Debye correlation functions with greater width, the peak position shifts to a greater value. In both cases, the intensity and shape of the peak of the SESANS correlation function are determined by the width of the Debye correlation functions. Furthermore, we mimic the intramolecular and intermolecular Debye correlation functions of liquids composed of interacting particles based on a simple model to elucidate their competition in the SESANS correlation function. Our calculations show that the first local minimum of a SESANS correlation function can be either negative or positive. By adjusting the spatial distribution of the intermolecular Debye function in the model, the calculated SESANS spectra exhibit profiles consistent with those of hard-sphere and sticky-hard-sphere liquids predicted by more sophisticated liquid-state theory and computer simulation. © 2012 American Institute of Physics.

  3. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  4. Experience with k-epsilon turbulence models for heat transfer computations in rotating

    NASA Technical Reports Server (NTRS)

    Tekriwal, Prabbat

    1995-01-01

    This viewgraph presentation discusses geometry and flow configuration, effect of y+ on heat transfer computations, standard and extended k-epsilon turbulence model results with wall function, low-Re model results (the Lam-Bremhorst model without wall function), a criterion for flow reversal in a radially rotating square duct, and a summary.

  5. Z/sub n/ Baxter model: symmetries and the Belavin parametrization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richey, M.P.; Tracy, C.A.

    1986-02-01

    The Z/sub n/ Baxter model is an exactly solvable lattice model in the special case of the Belavin parametrization. For this parametrization the authors calculate the partition function in an antiferromagnetic region and the order parameter in a ferromagnetic region. They find that the order parameter is expressible in terms of a modular function of level n which for n=2 is the Onsager-Yang-Baxter result. In addition they determine the symmetry group of the finite lattice partition function for the general Z/sub n/ Baxter model.

  6. Formulation of consumables management models: Mission planning processor payload interface definition

    NASA Technical Reports Server (NTRS)

    Torian, J. G.

    1977-01-01

Consumables models required for the mission planning and scheduling function are formulated. The relation of the models to prelaunch, onboard, ground support, and postmission functions for the space transportation systems is established. An analytical model consisting of an orbiter planning processor with a consumables database is developed. A method of recognizing potential constraint violations in both the planning and flight operations functions, and a flight data file for storage/retrieval of information over an extended period which interfaces with a flight operations processor for monitoring of the actual flights, is presented.

  7. A Cross-Validation Approach to Approximate Basis Function Selection of the Stall Flutter Response of a Rectangular Wing in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

Kukreja, Sunil L.; Vio, Gareth A.; Andrianne, Thomas; Razak, Norizham Abdul; Dimitriadis, Grigorios

    2012-01-01

The stall flutter response of a rectangular wing in a low speed wind tunnel is modelled using a nonlinear difference equation description. Static and dynamic tests are used to select a suitable model structure and basis function. Bifurcation criteria such as the Hopf condition and vibration amplitude variation with airspeed were used to ensure the model was representative of experimentally measured stall flutter phenomena. Dynamic test data were used to estimate model parameters and identify an approximate basis function.

  8. Multipoint Green's functions in 1 + 1 dimensional integrable quantum field theories

    DOE PAGES

    Babujian, H. M.; Karowski, M.; Tsvelik, A. M.

    2017-02-14

We calculate the multipoint Green functions in 1+1 dimensional integrable quantum field theories. We use the crossing formula for general models and calculate the 3- and 4-point functions, taking into account only the lower nontrivial intermediate-state contributions. We then apply the general results to the examples of the scaling Z2 Ising model, the sinh-Gordon model, and the Z3 scaling Potts model, and demonstrate these calculations explicitly. The results can be applied to physical phenomena such as Raman scattering.

  9. Incorporating an enzymatic model of effects of temperature, moisture, and substrate supply on soil respiration into an ecosystem model for two forests of northeastern USA

    NASA Astrophysics Data System (ADS)

    Sihi, Debjani; Davidson, Eric; Chen, Min; Savage, Kathleen; Richardson, Andrew; Keenan, Trevor; Hollinger, David

    2017-04-01

    Soils represent the largest terrestrial carbon (C) pool, and microbial decomposition of soil organic matter (SOM) to carbon dioxide, also called heterotrophic respiration (Rh), is an important component of the global C cycle. Temperature sensitivity of Rh is often represented with a simple Q10 function in ecosystem models and earth system models (ESMs), sometimes accompanied by an empirical soil moisture modifier. More explicit representation of the effects of soil moisture, substrate supply, and their interactions with temperature has been proposed to disentangle the confounding factors of apparent temperature sensitivity of SOM decomposition and improve performance of ecosystem models and ESMs. The objective of this work was to incorporate into an ecosystem model a more mechanistic, but still parsimonious, model of environmental factors controlling Rh. The Dual Arrhenius and Michaelis-Menten (DAMM) model simulates Rh using Michaelis-Menten, Arrhenius, and diffusion functions. Soil moisture affects Rh and its apparent temperature sensitivity in DAMM by regulating the diffusion of oxygen and soluble carbon substrates to the enzymatic reaction site. However, in its current configuration, DAMM depends on assumptions or inputs from other models regarding soil C inputs. Here we merged the DAMM soil flux model with a parsimonious ecosystem flux model, FöBAAR (Forest Biomass, Assimilation, Allocation and Respiration) by replacing FöBAAR's algorithms for Rh with those of DAMM. Classical root trenching experiments provided data to partition soil CO2 efflux into Rh (trenched plot) and root respiration (untrenched minus trenched plots). 
We used three years of high-frequency soil flux data from automated soil chambers (trenched and untrenched plots) and landscape-scale ecosystem fluxes from eddy covariance towers from two mid-latitude forests (Harvard Forest, MA and Howland Forest, ME) of northeastern USA to develop and validate the merged model and to quantify the uncertainties in a multiple constraints approach. The optimized DAMM-FöBAAR model better captured the seasonal dynamics of Rh compared to the FöBAAR-only model for the Harvard Forest, as indicated by lower cost functions (model-data mismatch). However, DAMM-FöBAAR showed less improvement over FöBAAR-only for the boreal transition forest at Howland. The frequency of droughts is lower at Howland, due to a shallow water table, resulting in only brief water limitation affecting Rh in some years. At both sites, the declining trend of soil respiration during drought episodes was captured by the DAMM-FöBAAR model, but not the FöBAAR-only model, which simulates Rh using only a Q10 type function. Greater confidence in model prediction resulting from the inclusion of mechanistic simulation of moisture limitation on substrate availability, an emergent property of DAMM, depends on site conditions, climate, and the temporal scale of interest. While the DAMM functions require a few more parameters than a simple Q10 function, we have demonstrated that they can be included in an ecosystem model and reduce the cost function. Moreover, the mechanistic structure of the soil moisture effects using DAMM functions should be more generalizable than other commonly used empirical functions.
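The contrast between a bare Q10 response and a DAMM-style flux can be sketched as follows. The structure (an Arrhenius temperature term times a Michaelis-Menten term, with soil moisture scaling the delivery of soluble substrate) follows the abstract's description of DAMM, but every parameter value and the cubic moisture scaling below are placeholders, not the published DAMM calibration:

```python
import math

def rh_q10(alpha, q10, temp_c, ref_temp_c=10.0):
    """Simple Q10 heterotrophic respiration response (temperature only)."""
    return alpha * q10 ** ((temp_c - ref_temp_c) / 10.0)

def rh_damm(alpha, ea, temp_c, sol_c, theta, km_c=0.1, d_liq=3.0, r_gas=8.314e-3):
    """DAMM-style flux sketch: Arrhenius Vmax times Michaelis-Menten uptake of
    substrate at the reaction site, with soluble-C delivery scaled by soil
    moisture theta.  All parameter values are illustrative placeholders."""
    t_k = temp_c + 273.15
    vmax = alpha * math.exp(-ea / (r_gas * t_k))   # Arrhenius (ea in kJ/mol)
    substrate = sol_c * d_liq * theta**3           # diffusion-limited supply
    return vmax * substrate / (km_c + substrate)
```

Unlike the Q10 form, the DAMM-style flux shuts down as theta falls, which is exactly the drought behavior the merged DAMM-FöBAAR model captured and the Q10-only FöBAAR did not.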

  10. Estimation of the characteristic parameters of the multilayered film model using the Patterson differential function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astaf'ev, S. B., E-mail: webmaster@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.

    The possibility of estimating the layered film structural parameters by constructing the autocorrelation function P_F(z) (referred to as the Patterson differential function) for the derivative dρ/dz of electron density along the normal to the sample surface has been considered. An analytical expression for P_F(z) is presented for a multilayered film within the box model of the electron density profile. The possibilities of extracting structural information about layered films by analyzing the features of this function are demonstrated with model and real examples, in particular by applying the method of shifted systems of peaks for the function P_F(z).

  11. Dynamics of functional failures and recovery in complex road networks

    NASA Astrophysics Data System (ADS)

    Zhan, Xianyuan; Ukkusuri, Satish V.; Rao, P. Suresh C.

    2017-11-01

    We propose a new framework for modeling the evolution of functional failures and recoveries in complex networks, with traffic congestion on road networks as the case study. Unlike conventional approaches, we transform the evolution of functional states into an equivalent dynamic structural process: dual-vertex splitting and coalescing embedded within the original network structure. The proposed model successfully explains traffic congestion and recovery patterns at the city scale based on high-resolution data from two megacities. Numerical analysis shows that certain network structural attributes can amplify or suppress cascading functional failures. Our approach represents a new general framework to model functional failures and recoveries in flow-based networks and allows understanding of the interplay between structure and function for flow-induced failure propagation and recovery.

  12. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from the empirical and theoretical models showed excellent agreement, within acceptable error, with experimental results reported in other references.
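    The Zener-Hollomon approach mentioned above combines strain rate and temperature into a single parameter, Z = ε̇·exp(Q/RT), from which flow stress follows by inverting a hyperbolic-sine constitutive law. A minimal sketch of that relationship, with illustrative constants rather than the article's fitted values:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def zener_hollomon(strain_rate, temp_k, q_act=350e3):
    """Zener-Hollomon parameter Z = eps_dot * exp(Q / (R*T)).
    q_act is an illustrative activation energy, not a fitted value."""
    return strain_rate * math.exp(q_act / (R * temp_k))

def flow_stress(z, a_const=1.0e13, alpha=0.012, n=5.0):
    """Invert the hyperbolic-sine law eps_dot = A*sinh(alpha*sigma)^n*exp(-Q/RT),
    giving sigma = (1/alpha)*asinh((Z/A)^(1/n)). Constants are illustrative."""
    return math.asinh((z / a_const) ** (1.0 / n)) / alpha
```

    Lower temperature or higher strain rate raises Z and hence the predicted flow stress, which is the qualitative behavior the empirical model captures.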

  13. Model driver screening and evaluation program. Volume 1, Project summary and model program recommendations

    DOT National Transportation Integrated Search

    2003-05-01

    This research project studied the feasibility as well as the scientific validity and utility of performing functional capacity screening with older drivers. A Model Program was described encompassing procedures to detect functionally impaired drivers...

  14. PROSTATE REGULATION: MODELING ENDOGENOUS ...

    EPA Pesticide Factsheets

    Prostate function is an important indicator of androgen status in toxicological studies, making predictive modeling of the relevant pharmacokinetics and pharmacodynamics desirable.

  15. Consistent Parameter and Transfer Function Estimation using Context Free Grammars

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten

    2017-04-01

    This contribution presents a method for the inference of transfer functions for rainfall-runoff models. Here, transfer functions are defined as parametrized (functional) relationships between a set of spatial predictors (e.g. elevation, slope or soil texture) and model parameters. They are ultimately used to estimate consistent, spatially distributed model parameters from a limited number of lumped global parameters. Additionally, they provide a straightforward method for parameter extrapolation from one set of basins to another and can even be used to derive parameterizations for multi-scale models [see: Samaniego et al., 2010]. Yet current practice often implicitly assumes that the transfer functions are known. In fact, in most cases the hypothesized transfer functions cannot be measured and remain unknown. Therefore, this contribution presents a general method for the concurrent estimation of the structure of transfer functions and their respective (global) parameters. Note that, as a consequence, the distributed parameters of the rainfall-runoff model are also estimated. The method combines two steps to achieve this. The first generates different possible transfer functions. The second then estimates the respective global transfer function parameters. The structural estimation of the transfer functions is based on the concept of context-free grammars, first introduced in linguistics by Chomsky [Chomsky, 1956]. Since then, context-free grammars have been widely applied in computer science but, to the knowledge of the authors, have so far not been used in hydrology. The contribution therefore gives an introduction to context-free grammars and shows how they can be constructed and used for the structural inference of transfer functions. 
This is enabled by new methods from evolutionary computation, such as grammatical evolution [O'Neill, 2001], which make it possible to exploit the constructed grammar as a search space for equations. The parametrization of the transfer functions is then achieved through a second optimization routine. The contribution explores different aspects of the described procedure through a set of experiments. These experiments can be divided into three categories: (1) the inference of transfer functions from directly measurable parameters; (2) the estimation of global parameters for given transfer functions from runoff data; and (3) the estimation of sets of completely unknown transfer functions from runoff data. The conducted tests reveal different potentials and limits of the procedure. Concretely, it is shown that experiments (1) and (2) work remarkably well, whereas experiment (3) is much more dependent on the setup: in that case considerably more data are needed to infer the transfer functions, even for simple models and setups. References: - Chomsky, N. (1956): Three Models for the Description of Language. IRE Transactions on Information Theory, 2(3), pp. 113-124 - O'Neill, M. (2001): Grammatical Evolution. IEEE Transactions on Evolutionary Computation, Vol. 5, No. 4 - Samaniego, L.; Kumar, R.; Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale. Water Resour. Res., Vol. 46, W05523, doi:10.1029/2008WR007327
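    The core idea of using a context-free grammar as a search space for transfer functions can be illustrated with a toy grammar. The predictor names (slope, elev, sand) and coefficient placeholders below are illustrative only, not the contribution's actual grammar:

```python
import random

# A toy context-free grammar whose language is a space of candidate
# transfer-function expressions over spatial predictors.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"],
               ["<expr>", "*", "<expr>"],
               ["<coef>", "*", "<pred>"],
               ["<coef>"]],
    "<pred>": [["slope"], ["elev"], ["sand"]],
    "<coef>": [["c0"], ["c1"], ["c2"]],
}

def derive(symbol="<expr>", rng=None, depth=0, max_depth=4):
    """Randomly expand the grammar into one transfer-function expression string."""
    rng = rng or random.Random(0)
    if symbol not in GRAMMAR:
        return symbol                                   # terminal symbol
    rules = GRAMMAR[symbol]
    if depth >= max_depth:                              # force termination at the depth limit
        rules = [r for r in rules if "<expr>" not in r] or rules
    rule = rng.choice(rules)
    return " ".join(derive(s, rng, depth + 1, max_depth) for s in rule)
```

    A search procedure such as grammatical evolution then explores the set of derivable expressions, while a second optimization loop fits the numeric coefficients (c0, c1, c2) of each candidate.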

  16. A strategy for improved computational efficiency of the method of anchored distributions

    NASA Astrophysics Data System (ADS)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability that a set of similar model parametrizations (a "bundle") replicates field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.

  17. Computational Models for Calcium-Mediated Astrocyte Functions.

    PubMed

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted to find that hundreds of models have been developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes.

  18. Computational Models for Calcium-Mediated Astrocyte Functions

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted to find that hundreds of models have been developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes. PMID:29670517

  19. Multitrophic functional diversity predicts ecosystem functioning in experimental assemblages of estuarine consumers.

    PubMed

    Lefcheck, Jonathan S; Duffy, J Emmett

    2015-11-01

    The use of functional traits to explain how biodiversity affects ecosystem functioning has attracted intense interest, yet few studies have a priori altered functional diversity, especially in multitrophic communities. Here, we manipulated multivariate functional diversity of estuarine grazers and predators within multiple levels of species richness to test how species richness and functional diversity predicted ecosystem functioning in a multitrophic food web. Community functional diversity was a better predictor than species richness for the majority of ecosystem properties, based on generalized linear mixed-effects models. Combining inferences from eight traits into a single multivariate index increased prediction accuracy of these models relative to any individual trait. Structural equation modeling revealed that functional diversity of both grazers and predators was important in driving final biomass within trophic levels, with stronger effects observed for predators. We also show that different species drove different ecosystem responses, with evidence for both sampling effects and complementarity. Our study extends experimental investigations of functional trait diversity to a multilevel food web, and demonstrates that functional diversity can be more accurate and effective than species richness in predicting community biomass in a food web context.

  20. Kullback-Leibler information function and the sequential selection of experiments to discriminate among several linear models. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    A sequential adaptive experimental design procedure for a related problem is studied. It is assumed that a finite set of potential linear models relating certain controlled variables to an observed variable is postulated, and that exactly one of these models is correct. The problem is to sequentially design most informative experiments so that the correct model equation can be determined with as little experimentation as possible. Discussion includes: structure of the linear models; prerequisite distribution theory; entropy functions and the Kullback-Leibler information function; the sequential decision procedure; and computer simulation results. An example of application is given.
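    The role of the Kullback-Leibler information function in this design problem can be illustrated with a minimal sketch: pick the design point at which two rival linear models make the most distinguishable (Gaussian) predictions. The candidate models and noise variance below are illustrative, not from the thesis:

```python
import math

def kl_gaussian(mu1, var1, mu2, var2):
    """Kullback-Leibler divergence KL(N(mu1,var1) || N(mu2,var2))."""
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def most_informative_x(candidate_xs, models, noise_var=1.0):
    """Choose the design point where predictive distributions of two rival
    linear models y = a + b*x differ most, in symmetrized KL divergence."""
    def j_divergence(x):
        (a1, b1), (a2, b2) = models
        m1, m2 = a1 + b1 * x, a2 + b2 * x
        return (kl_gaussian(m1, noise_var, m2, noise_var)
                + kl_gaussian(m2, noise_var, m1, noise_var))
    return max(candidate_xs, key=j_divergence)
```

    Running such an experiment, updating model posteriors, and repeating gives the flavor of the sequential adaptive procedure: each new experiment is placed where the remaining candidate models disagree most.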

  1. Blurred image restoration using knife-edge function and optimal window Wiener filtering.

    PubMed

    Wang, Min; Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion blur, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects.
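    The blur-then-restore pipeline described above can be sketched with a textbook frequency-domain Wiener filter. This is a classical constant noise-to-signal-ratio Wiener filter, not the paper's optimal-window variant, and the linear-motion PSF is a simple stand-in for a knife-edge-derived one:

```python
import numpy as np

def motion_psf(length, size):
    """Horizontal linear-motion point spread function on a size x size grid."""
    psf = np.zeros((size, size))
    row, start = size // 2, size // 2 - length // 2
    psf[row, start:start + length] = 1.0
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, k=0.01):
    """Classical Wiener filter W = H* / (|H|^2 + k), applied in the
    frequency domain; k is a constant noise-to-signal ratio."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + k)   # regularized inverse of the blur
    return np.real(np.fft.ifft2(W * G))
```

    With a small k the filter approaches direct inverse filtering; larger k suppresses noise amplification at frequencies where |H| is small, at the cost of some sharpness.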

  2. Blurred image restoration using knife-edge function and optimal window Wiener filtering

    PubMed Central

    Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion blur, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects. PMID:29377950

  3. Understanding Complex Natural Systems by Articulating Structure-Behavior-Function Models

    ERIC Educational Resources Information Center

    Vattam, Swaroop S.; Goel, Ashok K.; Rugaber, Spencer; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Gray, Steven; Sinha, Suparna

    2011-01-01

    Artificial intelligence research on creative design has led to Structure-Behavior-Function (SBF) models that emphasize functions as abstractions for organizing understanding of physical systems. Empirical studies on understanding complex systems suggest that novice understanding is shallow, typically focusing on their visible structures and…

  4. Linking biodiversity to ecosystem function: Implications for conservation ecology

    USGS Publications Warehouse

    Schwartz, M.W.; Brigham, C.A.; Hoeksema, J.D.; Lyons, K.G.; Mills, M.H.; van Mantgem, P.

    2000-01-01

    We evaluate the empirical and theoretical support for the hypothesis that a large proportion of native species richness is required to maximize ecosystem stability and sustain function. This assessment is important for conservation strategies because sustenance of ecosystem functions has been used as an argument for the conservation of species. If ecosystem functions are sustained at relatively low species richness, then arguing for the conservation of ecosystem function, no matter how important in its own right, does not strongly argue for the conservation of species. Additionally, for this to be a strong conservation argument the link between species diversity and ecosystem functions of value to the human community must be clear. We review the empirical literature to quantify the support for two hypotheses: (1) species richness is positively correlated with ecosystem function, and (2) ecosystem functions do not saturate at low species richness relative to the observed or experimental diversity. Few empirical studies demonstrate improved function at high levels of species richness. Second, we analyze recent theoretical models in order to estimate the level of species richness required to maintain ecosystem function. Again we find that, within a single trophic level, most mathematical models predict saturation of ecosystem function at a low proportion of local species richness. We also analyze a theoretical model linking species number to ecosystem stability. This model predicts that species richness beyond the first few species does not typically increase ecosystem stability. One reason that high species richness may not contribute significantly to function or stability is that most communities are characterized by strong dominance such that a few species provide the vast majority of the community biomass. Rapid turnover of species may rescue the concept that diversity leads to maximum function and stability. 
The role of turnover in ecosystem function and stability has not been investigated. Despite the recent rush to embrace the linkage between biodiversity and ecosystem function, we find little support for the hypothesis that there is a strong dependence of ecosystem function on the full complement of diversity within sites. Given this observation, the conservation community should take a cautious view of endorsing this linkage as a model to promote conservation goals.

  5. Incorporation of an evolutionary algorithm to estimate transfer-functions for a parameter regionalization scheme of a rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten

    2016-04-01

    This contribution presents a framework which enables the use of an Evolutionary Algorithm (EA) for the calibration and regionalization of the hydrological model COSEROreg. COSEROreg uses an updated version of the HBV-type model COSERO (Kling et al. 2014) for the modelling of hydrological processes and is embedded in a parameter regionalization scheme based on Samaniego et al. (2010). The latter uses subscale information to estimate model parameters via a-priori chosen transfer functions (often derived from pedotransfer functions). However, the transferability of the regionalization scheme to different model concepts and the integration of new forms of subscale information is not straightforward. (i) The usefulness of (new) single sub-scale information layers is unknown beforehand. (ii) Additionally, the establishment of functional relationships between these (possibly meaningless) sub-scale information layers and the distributed model parameters remains a central challenge in the implementation of a regionalization procedure. The proposed method theoretically provides a framework to overcome this challenge. The implementation of the EA encompasses the following procedure: First, a formal grammar is specified (Ryan et al., 1998). The construction of the grammar thereby defines the set of possible transfer functions and also allows hydrological domain knowledge to be incorporated into the search itself. The EA iterates over the given space by combining parameterized basic functions (e.g. linear or exponential functions) and sub-scale information layers into transfer functions, which are then used in COSEROreg. However, a pre-selection model is applied beforehand to sort out infeasible proposals by the EA and to reduce the number of necessary model runs. A second optimization routine is used to optimize the parameters of the transfer functions proposed by the EA. 
This concept, namely using two nested optimization loops, is inspired by the ideas of Lamarckian evolution and the Baldwin effect (Whitley et al., 1994), which might be understood as the idea that characteristics acquired during the lifetime of an individual can be transferred between generations. A hierarchical objective function is used for the model evaluation. This enables model preemption (Tolson et al., 2010) and reduces the number of model evaluations in the early stages of optimization. References: • Samaniego, L., Kumar, R., Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resour. Res., doi: 10.1029/2008WR007327 • Kling, H., Stanzel, P., Fuchs, M., and Nachtnebel, H. P. (2014): Performance of the COSERO precipitation-runoff model under non-stationary conditions in basins with different climates, Hydrolog. Sci. J., doi:10.1080/02626667.2014.959956. • C. Ryan, J.J. Collins, M. O'Neill (1998): Evolving Programs for an Arbitrary Language, Lecture Notes in Computer Science 1391, Proceedings of the First European Workshop on Genetic Programming. • B.A. Tolson, S. Razavi, L.S. Matott, N.R. Thomson, A. MacLean, F.R. Seglenieks (2010): Reducing the computational cost of automatic calibration through model preemption, Water Resour. Res., 46, W11523, doi:10.1029/2009WR008957. • D. Whitley, S. Gordon, K. Mathias (1994): Lamarckian evolution, the Baldwin effect, and function optimization, in Parallel Problem Solving from Nature (PPSN) III, Y. Davidor, H.-P. Schwefel, and R. Manner, Eds. Berlin: Springer-Verlag, pp. 6-15.

  6. Successful Aging and Subjective Well-Being Among Oldest-Old Adults

    PubMed Central

    Cho, Jinmyoung; Martin, Peter; Poon, Leonard W.

    2015-01-01

    Purpose of the Study: This research integrates successful aging and developmental adaptation models to empirically define the direct and indirect effects of 2 distal (i.e., education and past life experiences) and 5 proximal influences (i.e., physical functioning, cognitive functioning, physical health impairment, social resources, and perceived economic status) on subjective well-being. The proximal influences involved predictors outlined in most extant models of successful aging (e.g., Rowe & Kahn, 1998 [Rowe, J. W., & Kahn, R. L. (1998). Successful aging. New York: Pantheon Books.]). Our model extends such models by including distal impact as well as interactions between distal and proximal impacts. Design and Methods: Data were obtained from 234 centenarians and 72 octogenarians in the Georgia Centenarian Study. Structural equation modeling was conducted with Mplus 6.1. Results: Results showed significant direct effects of physical health impairment and social resources on positive aspects of subjective well-being among oldest-old adults. We also found significant indirect effects of cognitive functioning and education on positive affect among oldest-old adults. Social resources mediated the relationship between cognitive functioning and positive affect; and cognitive functioning and social resources mediated the relationship between education and positive affect. In addition, physical health impairment mediated the relationship between cognitive functioning and positive affect; and cognitive functioning and physical health impairment mediated the relationship between education and positive affect. Implications: Integrating 2 different models (i.e., successful aging and developmental adaptation) provided a comprehensive view of adaptation from a developmental perspective. PMID:25112594

  7. Coupling Spatiotemporal Community Assembly Processes to Changes in Microbial Metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, Emily B.; Crump, Alex R.; Resch, Charles T.

    Community assembly processes govern shifts in species abundances in response to environmental change, yet our understanding of assembly remains largely decoupled from ecosystem function. Here, we test hypotheses regarding assembly and function across space and time using hyporheic microbial communities as a model system. We pair sampling of two habitat types through hydrologic fluctuation with null modeling and multivariate statistics. We demonstrate that dual selective pressures assimilate to generate compositional changes at distinct timescales among habitat types, resulting in contrasting associations of Betaproteobacteria and Thaumarchaeota with selection and with seasonal changes in aerobic metabolism. Our results culminate in a conceptual model in which selection from contrasting environments regulates taxon abundance and ecosystem function through time, with increases in function when oscillating selection opposes stable selective pressures. Our model is applicable within both macrobial and microbial ecology and presents an avenue for assimilating community assembly processes into predictions of ecosystem function.

  8. Aerodynamic parameter estimation via Fourier modulating function techniques

    NASA Technical Reports Server (NTRS)

    Pearson, A. E.

    1995-01-01

    Parameter estimation algorithms are developed in the frequency domain for systems modeled by input/output ordinary differential equations. The approach is based on Shinbrot's method of moment functionals utilizing Fourier-based modulating functions. Assuming white measurement noises for linear multivariable system models, an adaptive weighted least squares algorithm is developed which approximates a maximum likelihood estimate and cannot be biased by unknown initial or boundary conditions in the data, owing to a special property of Shinbrot-type modulating functions. Application is made to perturbation equation modeling of the longitudinal and lateral dynamics of a high performance aircraft using flight-test data. Comparative studies are included which demonstrate potential advantages of the algorithm relative to some well established techniques for parameter identification. Deterministic least squares extensions of the approach are made to the frequency transfer function identification problem for linear systems and to the parameter identification problem for a class of nonlinear time-varying differential system models.
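    The special property behind the claim that estimates "cannot be biased by unknown initial or boundary conditions" is integration by parts against modulating functions that vanish at both ends of the data record. In generic notation (not necessarily the paper's symbols), for a signal y(t) on [0, T]:

```latex
\int_{0}^{T} \phi_m(t)\,\dot{y}(t)\,\mathrm{d}t
  = \bigl[\phi_m(t)\,y(t)\bigr]_{0}^{T}
    - \int_{0}^{T} \dot{\phi}_m(t)\,y(t)\,\mathrm{d}t
  = -\int_{0}^{T} \dot{\phi}_m(t)\,y(t)\,\mathrm{d}t ,
\qquad \phi_m(0)=\phi_m(T)=0 .
```

    Applying this to every derivative term of the model differential equation shifts all differentiation onto the known modulating functions, so the unknown boundary terms drop out of the resulting estimation equations.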

  9. Stochastic derivative-free optimization using a trust region framework

    DOE PAGES

    Larson, Jeffrey; Billups, Stephen C.

    2016-02-17

    This study presents a trust region algorithm to minimize a function f when one has access only to noise-corrupted function values f¯. The model-based algorithm dynamically adjusts its step length, taking larger steps when the model and function agree and smaller steps when the model is less accurate. The method does not require the user to specify a fixed pattern of points used to build local models and does not repeatedly sample points. If f is sufficiently smooth and the noise is independent and identically distributed with mean zero and finite variance, we prove that our algorithm produces iterates such that the corresponding function gradients converge in probability to zero. As a result, we present a prototype of our algorithm that, while simplistic in its management of previously evaluated points, solves benchmark problems in fewer function evaluations than do existing stochastic approximation methods.
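    The adjust-step-by-model-agreement idea can be illustrated with a toy one-dimensional sketch: fit a local quadratic to noisy samples, step to its minimizer within the region, then expand or shrink the region according to how well the model predicted the observed decrease. This is only an illustration of the general principle, not the authors' algorithm:

```python
import random

def noisy_tr_minimize(f_noisy, x0, delta=1.0, iters=60):
    """Toy 1-D stochastic trust-region sketch: quadratic model from three
    noisy samples, step clipped to the region, radius updated by agreement."""
    x = x0
    for _ in range(iters):
        fm, f0, fp = f_noisy(x - delta), f_noisy(x), f_noisy(x + delta)
        g = (fp - fm) / (2 * delta)            # model gradient (central difference)
        h = (fp - 2 * f0 + fm) / delta ** 2    # model curvature
        step = -g / h if h > 1e-9 else -delta * (1 if g > 0 else -1)
        step = max(-delta, min(delta, step))   # stay inside the trust region
        pred = g * step + 0.5 * h * step ** 2  # model-predicted change
        actual = f_noisy(x + step) - f0
        if pred < 0 and actual < 0.5 * pred:   # good agreement: accept, expand
            x, delta = x + step, min(2 * delta, 1.0)
        else:                                  # poor agreement: shrink the region
            delta *= 0.5
    return x
```

    Note the toy version re-samples the current point each iteration; the paper's contribution includes convergence guarantees and avoiding exactly that kind of repeated sampling.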

  10. Derivation of Hunt equation for suspension distribution using Shannon entropy theory

    NASA Astrophysics Data System (ADS)

    Kundu, Snehasis

    2017-12-01

    In this study, the Hunt equation for computing suspension concentration in sediment-laden flows is derived using Shannon entropy theory. Considering the inverse of the void ratio as a random variable and using the principle of maximum entropy, the probability density function and cumulative distribution function of suspension concentration are derived. A new and more general cumulative distribution function for the flow domain is proposed, which includes several other specific CDF models reported in the literature. This general form of the cumulative distribution function also allows the Rouse equation to be derived. The entropy-based approach makes it possible to estimate model parameters from measured sediment concentration data, which shows the advantage of using entropy theory. Finally, the parameters of the entropy-based model are expressed as functions of the Rouse number to establish a link between the parameters of the deterministic and probabilistic approaches.
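The maximum-entropy step can be sketched generically (illustrative only; the normalization-plus-mean constraint set shown here is the textbook choice, not necessarily the exact constraints used in the paper). Maximizing the Shannon entropy of the concentration density f(c) via Lagrange multipliers yields an exponential-family density, from which the cumulative distribution function follows by integration:

```latex
\max_{f}\; H[f] = -\int_{0}^{c_{0}} f(c)\,\ln f(c)\,\mathrm{d}c
\quad\text{s.t.}\quad
\int_{0}^{c_{0}} f(c)\,\mathrm{d}c = 1,\qquad
\int_{0}^{c_{0}} c\, f(c)\,\mathrm{d}c = \bar{c}
\;\;\Longrightarrow\;\;
f(c) = e^{-\lambda_{0}-\lambda_{1}c},\qquad
F(c) = \int_{0}^{c} e^{-\lambda_{0}-\lambda_{1}s}\,\mathrm{d}s
     = \frac{e^{-\lambda_{0}}}{\lambda_{1}}\left(1 - e^{-\lambda_{1}c}\right).
```

The multipliers λ₀, λ₁ are fixed by the constraints, which is how the model parameters become estimable from concentration data.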

  11. Evaluation of computing systems using functionals of a Stochastic process

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Wu, L. T.

    1980-01-01

    An intermediate model was used to represent the probabilistic nature of a total system at a level which is higher than the base model and thus closer to the performance variable. A class of intermediate models, generally referred to as functionals of a Markov process, was considered. A closed form solution of performability for the case where performance is identified with the minimum value of a functional was developed.

  12. From inflammation to wound healing: using a simple model to understand the functional versatility of murine macrophages.

    PubMed

    Childs, Lauren M; Paskow, Michael; Morris, Sidney M; Hesse, Matthias; Strogatz, Steven

    2011-11-01

    Macrophages are fundamental cells of the innate immune system. Their activation is essential for such distinct immune functions as inflammation (pathogen-killing) and tissue repair (wound healing). An open question has been the functional stability of an individual macrophage cell: whether it can change its functional profile between different immune responses such as between the repair pathway and the inflammatory pathway. We studied this question theoretically by constructing a rate equation model for the key substrate, enzymes and products of the pathways; we then tested the model experimentally. Both our model and experiments show that individual macrophages can switch from the repair pathway to the inflammation pathway but that the reverse switch does not occur.
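A deliberately generic rate-equation sketch of two pathways competing for a shared substrate (purely illustrative; the paper's actual enzyme kinetics, and the asymmetry that makes the switch one-way, are not reproduced here):

```python
def simulate(k1=1.0, k2=0.5, s0=10.0, dt=0.01, steps=2000):
    """Euler integration of a toy two-pathway competition.

    A shared substrate S is consumed by pathway 1 (rate k1*S) and
    pathway 2 (rate k2*S); P1 and P2 accumulate the respective products.
    The hypothetical rate constants k1, k2 stand in for the relative
    enzyme activities of the two pathways.
    """
    s, p1, p2 = s0, 0.0, 0.0
    for _ in range(steps):
        r1, r2 = k1 * s, k2 * s
        s = max(s - (r1 + r2) * dt, 0.0)   # substrate depletion
        p1 += r1 * dt                      # pathway-1 product
        p2 += r2 * dt                      # pathway-2 product
    return s, p1, p2
```

With k1 > k2, pathway 1 captures proportionally more of the substrate; in a model of the paper's kind, an asymmetry in how the enzymes are induced is what makes one direction of switching possible and the reverse not.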

  13. From Inflammation to Wound Healing: Using a Simple Model to Understand the Functional Versatility of Murine Macrophages

    PubMed Central

    Paskow, Michael; Morris, Sidney M.; Hesse, Matthias; Strogatz, Steven

    2011-01-01

    Macrophages are fundamental cells of the innate immune system. Their activation is essential for such distinct immune functions as inflammation (pathogen-killing) and tissue repair (wound healing). An open question has been the functional stability of an individual macrophage cell: whether it can change its functional profile between different immune responses such as between the repair pathway and the inflammatory pathway. We studied this question theoretically by constructing a rate equation model for the key substrate, enzymes and products of the pathways; we then tested the model experimentally. Both our model and experiments show that individual macrophages can switch from the repair pathway to the inflammation pathway but that the reverse switch does not occur. PMID:21347813

  14. Research and exploration of product innovative design for function

    NASA Astrophysics Data System (ADS)

    Wang, Donglin; Wei, Zihui; Wang, Youjiang; Tan, Runhua

    2009-07-01

    Product innovation is premised on realizing a new function, and realizing a new function requires resolving contradictions. A new process model for product innovative design was proposed based on Axiomatic Design (AD) Theory and Functional Structure Analysis (FSA), with the Principle of Solving Contradiction embedded. In this model, AD Theory guides FSA and identifies the contradictions that must be resolved to obtain the principle solution. To provide strong tool support for innovative design at the principle-solution stage, the Principle of Solving Contradiction was embedded in the model, strengthening the innovativeness of the principle solution. As a case study, an innovative design of a button-battery separator-paper punching machine was achieved by applying the proposed model.

  15. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dean, Jamie A., E-mail: jamie.dean@icr.ac.uk; Wong, Kee H.; Gay, Hiram

    Purpose: Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue–sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. Methods and Materials: FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares–logistic regression [FPLS-LR] and functional principal component–logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate–response associations, assessed using bootstrapping. Results: The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/−0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/−0.96, 0.79/−0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. Conclusions: FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling.

  16. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy.

    PubMed

    Dean, Jamie A; Wong, Kee H; Gay, Hiram; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Oh, Jung Hun; Apte, Aditya; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Deasy, Joseph O; Nutting, Christopher M; Gulliford, Sarah L

    2016-11-15

    Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue-sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares-logistic regression [FPLS-LR] and functional principal component-logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate-response associations, assessed using bootstrapping. The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/-0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/-0.96, 0.79/-0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  17. Comparison and Contrast of Two General Functional Regression Modeling Frameworks

    PubMed Central

    Morris, Jeffrey S.

    2017-01-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502

  18. Comparison and Contrast of Two General Functional Regression Modeling Frameworks.

    PubMed

    Morris, Jeffrey S

    2017-02-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable.

  19. Individual-tree diameter growth model for managed, even-aged, upland oak stands

    Treesearch

    Donald E. Hilt

    1983-01-01

    A distance-independent, individual-tree diameter growth model was developed for managed, even-aged, upland oak stands. The 5-year basal-area growth of individual trees is first modeled as a function of dbh squared for given stands. Parameters from these models are then modeled as a function of mean stand diameter, percent stocking of the stand, and site index. A...
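The two-stage fitting strategy described above (per-stand growth against dbh squared, then stand-level regression of the fitted parameters) can be sketched with toy numbers (all data below are hypothetical, and only mean stand diameter is used as the stand-level regressor; stocking and site index would enter the same way as extra regressors):

```python
def fit_through_origin(x, y):
    """Least-squares slope b for the model y ≈ b * x (no intercept)."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Stage 1: fit per-stand 5-year growth ≈ b * dbh^2 (toy data).
stands = {
    "A": ([4.0, 6.0, 8.0], [1.6, 3.6, 6.4]),   # behaves like b = 0.1
    "B": ([5.0, 7.0, 9.0], [5.0, 9.8, 16.2]),  # behaves like b = 0.2
}
slopes = {name: fit_through_origin([d * d for d in dbh], growth)
          for name, (dbh, growth) in stands.items()}

# Stage 2: model the fitted parameter b as a function of a stand-level
# variable (two stands give an exact two-point line fit).
mean_diam = {"A": 6.0, "B": 7.0}
b = (slopes["B"] - slopes["A"]) / (mean_diam["B"] - mean_diam["A"])
a = slopes["A"] - b * mean_diam["A"]   # b_stand ≈ a + b * mean_diam
```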

  20. Systems Operation Studies for Automated Guideway Transit Systems : Availability Model Functional Specification

    DOT National Transportation Integrated Search

    1981-01-01

    The System Availability Model (SAM) is a system-level model which provides measures of vehicle and passenger availability. The SAM will be used to evaluate the system-level influence of availability concepts employed in AGT systems. This functional s...

  1. Intranet Model and Metrics

    DTIC Science & Technology

    2007-02-01

    organization: to add sufficient value for its customers to create a sustainable business model. It takes its features and functionality from the mandate to operate at world-class efficiency and...

  2. An individual-tree basal area growth model for loblolly pine stands

    Treesearch

    Paul A. Murphy; Michael G. Shelton

    1996-01-01

    Tree basal area growth has been modeled as a combination of a potential growth function and a modifier function, in which the potential function is fitted separately from open-grown tree data or a subset of the data and the modifier function includes stand and site variables. We propose a modification of this by simultaneously fitting both a growth component and a...

  3. Data Reduction Functions for the Langley 14- by 22-Foot Subsonic Tunnel

    NASA Technical Reports Server (NTRS)

    Boney, Andy D.

    2014-01-01

    The Langley 14- by 22-Foot Subsonic Tunnel's data reduction software utilizes six major functions to compute the acquired data. These functions calculate engineering units, tunnel parameters, flowmeters, jet exhaust measurements, balance loads/model attitudes, and model/wall pressures. The input (required) variables, the output (computed) variables, and the equations and/or subfunction(s) associated with each major function are discussed.

  4. Optimizing global liver function in radiation therapy treatment planning

    NASA Astrophysics Data System (ADS)

    Wu, Victor W.; Epelman, Marina A.; Wang, Hesheng; Romeijn, H. Edwin; Feng, Mary; Cao, Yue; Ten Haken, Randall K.; Matuszak, Martha M.

    2016-09-01

    Liver stereotactic body radiation therapy (SBRT) patients differ in both pre-treatment liver function (e.g. due to degree of cirrhosis and/or prior treatment) and radiosensitivity, leading to high variability in potential liver toxicity with similar doses. This work investigates three treatment planning optimization models that minimize risk of toxicity: two consider both voxel-based pre-treatment liver function and local-function-based radiosensitivity with dose; one considers only dose. Each model optimizes a different objective function (varying in how fully it captures the influence of dose on liver function) subject to the same dose constraints, and the models are tested on 2D synthesized and 3D clinical cases. The normal-liver-based objective functions are the linearized equivalent uniform dose (ℓEUD) (conventional 'ℓEUD model'), the so-called perfusion-weighted ℓEUD (fEUD) (proposed 'fEUD model'), and post-treatment global liver function (GLF) (proposed 'GLF model'), predicted by a new liver-perfusion-based dose-response model. The resulting ℓEUD, fEUD, and GLF plans delivering the same target ℓEUD are compared with respect to their post-treatment function and various dose-based metrics. Voxel-based portal venous liver perfusion, used as a measure of local function, is computed using DCE-MRI. In the cases used in our experiments, the GLF plan preserves up to 4.6% (7.5%) more liver function than the fEUD (ℓEUD) plan does in 2D cases, and up to 4.5% (5.6%) in 3D cases. The GLF and fEUD plans worsen the ℓEUD of functional liver on average by 1.0 Gy and 0.5 Gy in 2D and 3D cases, respectively. Liver perfusion information can be used during treatment planning to minimize the risk of toxicity by improving expected GLF; the degree of benefit varies with perfusion pattern. Although fEUD model optimization is computationally inexpensive and often achieves better GLF than ℓEUD model optimization does, the GLF model directly optimizes a more clinically relevant metric and can further improve fEUD plan quality.
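One plausible reading of the perfusion weighting (an assumed form for illustration; the paper's exact linearized and perfusion-weighted EUD definitions may differ) is a generalized EUD in which each voxel's contribution is weighted by its local perfusion:

```python
def geud(dose, a):
    """Generalized equivalent uniform dose: (mean of d_i^a)^(1/a)."""
    return (sum(d ** a for d in dose) / len(dose)) ** (1.0 / a)

def perfusion_weighted_geud(dose, perf, a):
    """Same dose summary, but each voxel is weighted by its local
    perfusion, so poorly perfused (low-function) tissue counts for less
    in the objective."""
    total = sum(perf)
    return (sum(p * d ** a for d, p in zip(dose, perf)) / total) ** (1.0 / a)
```

With uniform perfusion the two summaries coincide; concentrating dose in low-perfusion voxels lowers the perfusion-weighted value, which is the intuition behind sparing well-functioning tissue.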

  5. An optimal generic model for multi-parameters and big data optimizing: a laboratory experimental study

    NASA Astrophysics Data System (ADS)

    Utama, D. N.; Ani, N.; Iqbal, M. M.

    2018-03-01

    Optimization is the process of finding the parameter (or parameters) that delivers an optimal value of an objective function. The search for an optimal generic model for optimization is a computer science problem that has been pursued by numerous researchers. A generic model is one that can be operated to solve many varieties of optimization problem. Using an object-oriented method, a generic model for optimization was constructed. Two optimization methods, simulated annealing and hill climbing, were implemented in the model and compared to determine which is more effective. The results showed that both methods produced the same objective function value, and that the hill-climbing-based model had the shortest running time.
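The two methods compared in the study can be illustrated with minimal one-dimensional implementations (a sketch under generic assumptions; the paper's object-oriented model, objective function, and parameter settings are not specified here):

```python
import math
import random

def hill_climb(f, x0, step=0.1, iters=2000, seed=0):
    """Greedy local search: propose a random nearby point, accept only
    if it improves the objective."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def simulated_annealing(f, x0, step=0.5, iters=2000, t0=1.0, seed=0):
    """Like hill climbing, but worse moves are accepted with Boltzmann
    probability exp(-Δ/T) under a linearly cooling temperature, letting
    the search escape local minima early on."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9   # linear cooling schedule
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
    return x, fx
```

On a convex objective both converge to essentially the same value, consistent with the paper's observation, with hill climbing doing less work per iteration.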

  6. Ocean biogeochemistry modeled with emergent trait-based genomics

    NASA Astrophysics Data System (ADS)

    Coles, V. J.; Stukel, M. R.; Brooks, M. T.; Burd, A.; Crump, B. C.; Moran, M. A.; Paul, J. H.; Satinsky, B. M.; Yager, P. L.; Zielinski, B. L.; Hood, R. R.

    2017-12-01

    Marine ecosystem models have advanced to incorporate metabolic pathways discovered with genomic sequencing, but direct comparisons between models and “omics” data are lacking. We developed a model that directly simulates metagenomes and metatranscriptomes for comparison with observations. Model microbes were randomly assigned genes for specialized functions, and communities of 68 species were simulated in the Atlantic Ocean. Unfit organisms were replaced, and the model self-organized to develop community genomes and transcriptomes. Emergent communities from simulations that were initialized with different cohorts of randomly generated microbes all produced realistic vertical and horizontal ocean nutrient, genome, and transcriptome gradients. Thus, the library of gene functions available to the community, rather than the distribution of functions among specific organisms, drove community assembly and biogeochemical gradients in the model ocean.
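The replace-the-unfit selection loop can be caricatured in a few lines (a toy with made-up gene counts and a binary environment mask, not the biogeochemical model itself): each "species" is a random gene-presence vector, fitness counts genes matching the environment, and each generation the least-fit species is replaced by a fresh random one, so the community gene pool self-organizes.

```python
import random

def evolve(n_species=20, n_genes=8, env=None, gens=200, seed=0):
    """Toy community-assembly loop: replace the least-fit random genome
    each generation and return the final mean fitness (max = n_genes)."""
    rng = random.Random(seed)
    env = env or [1, 1, 1, 0, 0, 0, 1, 0]   # hypothetical required genes
    pop = [[rng.randint(0, 1) for _ in range(n_genes)]
           for _ in range(n_species)]
    fit = lambda g: sum(1 for gi, ei in zip(g, env) if gi == ei)
    for _ in range(gens):
        worst = min(range(n_species), key=lambda i: fit(pop[i]))
        pop[worst] = [rng.randint(0, 1) for _ in range(n_genes)]
    return sum(fit(g) for g in pop) / n_species
```

Because only the worst member is ever replaced, the community's mean fitness drifts above the random-draw expectation of n_genes/2, mirroring the abstract's point that the available gene library, not any particular organism, drives assembly.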

  7. Theory of reflectivity blurring in seismic depth imaging

    NASA Astrophysics Data System (ADS)

    Thomson, C. J.; Kitchenside, P. W.; Fletcher, R. P.

    2016-05-01

    A subsurface extended image gather obtained during controlled-source depth imaging yields a blurred kernel of an interface reflection operator. This reflectivity kernel or reflection function is comprised of the interface plane-wave reflection coefficients and so, in principle, the gather contains amplitude versus offset or angle information. We present a modelling theory for extended image gathers that accounts for variable illumination and blurring, under the assumption of a good migration-velocity model. The method involves forward modelling as well as migration or back propagation so as to define a receiver-side blurring function, which contains the effects of the detector array for a given shot. Composition with the modelled incident wave and summation over shots then yields an overall blurring function that relates the reflectivity to the extended image gather obtained from field data. The spatial evolution or instability of blurring functions is a key concept and there is generally not just spatial blurring in the apparent reflectivity, but also slowness or angle blurring. Gridded blurring functions can be estimated with, for example, a reverse-time migration modelling engine. A calibration step is required to account for ad hoc band limitedness in the modelling and the method also exploits blurring-function reciprocity. To demonstrate the concepts, we show numerical examples of various quantities using the well-known SIGSBEE test model and a simple salt-body overburden model, both for 2-D. The moderately strong slowness/angle blurring in the latter model suggests that the effect on amplitude versus offset or angle analysis should be considered in more realistic structures. Although the description and examples are for 2-D, the extension to 3-D is conceptually straightforward. The computational cost of overall blurring functions implies their targeted use for the foreseeable future, for example, in reservoir characterization. The description is for scalar waves, but the extension to elasticity is foreseeable and we emphasize the separation of the overburden and survey-geometry blurring effects from the nature of the target scatterer.

  8. Animal Models for the Study of Female Sexual Dysfunction

    PubMed Central

    Marson, Lesley; Giamberardino, Maria Adele; Costantini, Raffaele; Czakanski, Peter; Wesselmann, Ursula

    2017-01-01

    Introduction Significant progress has been made in elucidating the physiological and pharmacological mechanisms of female sexual function through preclinical animal research. The continued development of animal models is vital for the understanding and treatment of the many diverse disorders that occur in women. Aim To provide an updated review of the experimental models evaluating female sexual function that may be useful for clinical translation. Methods Review of English written, peer-reviewed literature, primarily from 2000 to 2012, that described studies on female sexual behavior related to motivation, arousal, physiological monitoring of genital function and urogenital pain. Main Outcomes Measures Analysis of supporting evidence for the suitability of the animal model to provide measurable indices related to desire, arousal, reward, orgasm, and pelvic pain. Results The development of female animal models has provided important insights in the peripheral and central processes regulating sexual function. Behavioral models of sexual desire, motivation, and reward are well developed. Central arousal and orgasmic responses are less well understood, compared with the physiological changes associated with genital arousal. Models of nociception are useful for replicating symptoms and identifying the neurobiological pathways involved. While in some cases translation to women correlates with the findings in animals, the requirement of circulating hormones for sexual receptivity in rodents and the multifactorial nature of women’s sexual function requires better designed studies and careful analysis. The current models have studied sexual dysfunction or pelvic pain in isolation; combining these aspects would help to elucidate interactions of the pathophysiology of pain and sexual dysfunction. Conclusions Basic research in animals has been vital for understanding the anatomy, neurobiology, and physiological mechanisms underlying sexual function and urogenital pain. These models are important for understanding the etiology of female sexual function and for future development of pharmacological treatments for sexual dysfunctions with or without pain. PMID:27784584

  9. Modelling the Impact of Soil Management on Soil Functions

    NASA Astrophysics Data System (ADS)

    Vogel, H. J.; Weller, U.; Rabot, E.; Stößel, B.; Lang, B.; Wiesmeier, M.; Urbanski, L.; Wollschläger, U.

    2017-12-01

    Due to increasing soil loss and an increasing demand for food and energy, there is enormous pressure on soils as the central resource for agricultural production. Besides the importance of soils for biomass production there are other essential soil functions, i.e. filter and buffer for water, carbon sequestration, provision and recycling of nutrients, and habitat for biological activity. All these functions have a direct feedback to biogeochemical cycles and climate. To render agricultural production efficient and sustainable we need to develop model tools that are capable of quantitatively predicting the impact of a multitude of management measures on these soil functions. These functions are considered emergent properties produced by soils as complex systems. The major challenge is to handle the multitude of physical, chemical and biological processes interacting in a non-linear manner. A large number of validated models for specific soil processes are available. However, it is not possible to simulate soil functions by coupling all the relevant processes at the detailed (i.e. molecular) level where they are well understood. A new systems perspective is required to evaluate the ensemble of soil functions and their sensitivity to external forcing. Another challenge is that soils are spatially heterogeneous systems by nature. Soil processes are highly dependent on the local soil properties and, hence, any model to predict soil functions needs to account for the site-specific conditions. For upscaling towards regional scales, the spatial distribution of functional soil types needs to be taken into account. We propose a new systemic model approach based on a thorough analysis of the interactions between physical, chemical and biological processes considering their site-specific characteristics. It is demonstrated for the example of soil compaction and the recovery of soil structure, water capacity and carbon stocks as a result of plant growth and biological activity. Coupling of the observed nonlinear interactions allows for modeling the stability and resilience of soil systems in terms of their essential functions.

  10. Losing function through wetland mitigation in central Pennsylvania, USA.

    PubMed

    Hoeltje, S M; Cole, C A

    2007-03-01

    In the United States, the Clean Water Act requires mitigation for wetlands that are negatively impacted by dredging and filling activities. During the mitigation process, there generally is little effort to assess function for mitigation sites and function is usually inferred based on vegetative cover and acreage. In our study, hydrogeomorphic (HGM) functional assessment models were used to compare predicted and potential levels of functional capacity in created and natural reference wetlands. HGM models assess potential function by measurement of a suite of structural variables and these modeled functions can then be compared to those in natural, reference wetlands. The created wetlands were built in a floodplain setting of a valley in central Pennsylvania to replace natural ridge-side slope wetlands. Functional assessment models indicated that the created sites differed significantly from natural wetlands that represented the impacted sites for seven of the ten functions assessed. This was expected because the created wetlands were located in a different geomorphic setting than the impacted sites, which would affect the type and degree of functions that occur. However, functional differences were still observed when the created sites were compared with a second set of reference wetlands that were located in a similar geomorphic setting (floodplain). Most of the differences observed in both comparisons were related to unnatural hydrologic regimes and to the characteristics of the surrounding landscape. As a result, the created wetlands are not fulfilling the criteria for successful wetland mitigation.

  11. Improving the diagnosis related grouping model's ability to explain length of stay of elderly medical inpatients by incorporating function-linked variables.

    PubMed

    Sahadevan, S; Earnest, A; Koh, Y L; Lee, K M; Soh, C H; Ding, Y Y

    2004-09-01

    This study first aimed to determine the adequacy of the Diagnosis Related Grouping (DRG) model's ability to explain (1) the variance in the actual length of stay (LOS) of elderly medical inpatients and (2) the LOS difference in the same cohort between the departments of Geriatric Medicine (GRM) and General Medicine (GM). We then looked at how these explanatory abilities of the DRG changed when patients' function-linked variables (ignored by DRG) were incorporated into the model. Basic demographic data of a consecutively hospitalised cohort of elderly medical inpatients from GRM and GM, as well as their actual LOS, discharge DRG codes [with their corresponding trimmed average length of stay (ALOS)] and selected function-linked variables (including premorbid functional status, change in functional profile during hospitalisation and number of therapists seen) were recorded. Beginning with ALOS, function-linked variables that were significantly associated with LOS were then added into two multiple linear regression models so as to quantify how the functional dimension improved the DRGs' abilities to explain LOS variances and interdepartmental LOS differences. Forward selection was employed to determine the final models. For the interdepartmental analysis, the study sample was restricted to patients who shared common DRG codes. 114 GRM and 118 GM patients were studied. Trimmed ALOS alone explained 8% of the actual LOS variance. With the addition of function-linked variables, the adjusted R2 of the final model increased to 28%. Due to common code restrictions, the data of 79 GRM and 78 GM patients were available for the analysis of interdepartmental LOS differences. At the unadjusted stage, the median stay of GRM patients was 4.3 days longer than GM's, and with adjustments made for the DRGs, this difference was reduced to 3.9 days. Additionally adjusting for the patients' functional features diminished the interdepartmental LOS discrepancy even further, to 2.1 days. This study demonstrates that for elderly medical inpatients, the incorporation of patients' functional status significantly improves the DRG model's ability to predict the patients' actual LOS as well as to explain interdepartmental LOS differences between GRM and GM.

  12. Weighted functional linear regression models for gene-based association analysis.

    PubMed

    Belonogova, Nadezhda M; Svishcheva, Gulnara R; Wilson, James F; Campbell, Harry; Axenovich, Tatiana I

    2018-01-01

    Functional linear regression models are effectively used in gene-based association analysis of complex traits. These models combine information about individual genetic variants, taking into account their positions and reducing the influence of noise and/or observation errors. To increase the power of methods, where several differently informative components are combined, weights are introduced to give the advantage to more informative components. Allele-specific weights have been introduced to collapsing and kernel-based approaches to gene-based association analysis. Here we have for the first time introduced weights to functional linear regression models adapted for both independent and family samples. Using data simulated on the basis of GAW17 genotypes and weights defined by allele frequencies via the beta distribution, we demonstrated that type I errors correspond to declared values and that increasing the weights of causal variants allows the power of functional linear models to be increased. We applied the new method to real data on blood pressure from the ORCADES sample. Five of the six known genes with P < 0.1 in at least one analysis had lower P values with weighted models. Moreover, we found an association between diastolic blood pressure and the VMP1 gene (P = 8.18×10-6), when we used a weighted functional model. For this gene, the unweighted functional and weighted kernel-based models had P = 0.004 and 0.006, respectively. The new method has been implemented in the program package FREGAT, which is freely available at https://cran.r-project.org/web/packages/FREGAT/index.html.
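The idea of allele-frequency-based weights via the beta distribution can be sketched as follows (the a=1, b=25 parameterization is a common rare-variant choice, assumed here for illustration rather than taken from the paper):

```python
from math import gamma

def beta_pdf(x, a=1.0, b=25.0):
    """Beta(a, b) density evaluated at minor allele frequency x."""
    const = gamma(a + b) / (gamma(a) * gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

def variant_weights(mafs, a=1.0, b=25.0):
    """Per-variant weights for a weighted functional regression: with
    a=1, b=25 the density is monotonically decreasing in frequency, so
    rarer variants receive larger weights."""
    return [beta_pdf(m, a, b) for m in mafs]

w = variant_weights([0.001, 0.01, 0.1])   # rarest variant weighted most
```

The weights would then scale each variant's contribution to the functional regression, giving the advantage to the (presumably more informative) rare variants.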

  13. Protein loop modeling using a new hybrid energy function and its application to modeling in inaccurate structural environments.

    PubMed

    Park, Hahnbeom; Lee, Gyu Rie; Heo, Lim; Seok, Chaok

    2014-01-01

    Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.

  14. Breeding value accuracy estimates for growth traits using random regression and multi-trait models in Nelore cattle.

    PubMed

    Boligon, A A; Baldi, F; Mercadante, M E Z; Lobo, R B; Pereira, R J; Albuquerque, L G

    2011-06-28

    We quantified the potential increase in accuracy of expected breeding value for weights of Nelore cattle, from birth to mature age, using multi-trait and random regression models on Legendre polynomials and B-spline functions. A total of 87,712 weight records from 8144 females were used, recorded every three months from birth to mature age from the Nelore Brazil Program. For random regression analyses, all female weight records from birth to eight years of age (data set I) were considered. From this general data set, a subset was created (data set II), which included only nine weight records: at birth, weaning, 365 and 550 days of age, and 2, 3, 4, 5, and 6 years of age. Data set II was analyzed using random regression and multi-trait models. The model of analysis included the contemporary group as a fixed effect and age of dam as a linear and quadratic covariable. In the random regression analyses, average growth trends were modeled using a cubic regression on orthogonal polynomials of age. Residual variances were modeled by a step function with five classes. Legendre polynomials of fourth and sixth order were utilized to model the direct genetic and animal permanent environmental effects, respectively, while third-order Legendre polynomials were considered for maternal genetic and maternal permanent environmental effects. Quadratic polynomials were applied to model all random effects in random regression models on B-spline functions. Direct genetic and animal permanent environmental effects were modeled using three segments or five coefficients, and genetic maternal and maternal permanent environmental effects were modeled with one segment or three coefficients in the random regression models on B-spline functions. For both data sets (I and II), animals ranked differently according to expected breeding value obtained by random regression or multi-trait models. With random regression models, the highest gains in accuracy were obtained at ages with a low number of weight records. The results indicate that random regression models provide more accurate expected breeding values than the traditional finite multi-trait models. Thus, higher genetic responses are expected for beef cattle growth traits by replacing a multi-trait model with random regression models for genetic evaluation. B-spline functions could be applied as an alternative to Legendre polynomials to model covariance functions for weights from birth to mature age.
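    The Legendre covariables used in such random regression models are simply orthogonal polynomials evaluated at ages standardized to [-1, 1]. A minimal sketch (the ages and the cubic order are illustrative, not the exact design of the study):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_basis(ages, order):
    """Evaluate Legendre polynomials P_0..P_order at ages standardized to [-1, 1]."""
    a_min, a_max = min(ages), max(ages)
    t = 2.0 * (np.asarray(ages, dtype=float) - a_min) / (a_max - a_min) - 1.0
    # One column per polynomial degree: the random-regression covariables
    return np.column_stack([legendre.Legendre.basis(k)(t) for k in range(order + 1)])

ages_days = [0, 240, 365, 550, 730, 1095, 1460, 1825, 2190]  # birth to ~6 years
X = legendre_basis(ages_days, order=3)  # cubic regression on age, as for the mean trend
```

    Each animal's genetic and permanent environmental deviations are then modeled as random regressions on these columns, so covariances between weights at any two ages follow from the coefficient (co)variances.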

  15. An Alternative Derivation of the Energy Levels of the "Particle on a Ring" System

    NASA Astrophysics Data System (ADS)

    Vincent, Alan

    1996-10-01

    All acceptable wave functions must be continuous mathematical functions. This criterion limits the acceptable functions for a particle in a linear one-dimensional box to sine functions. If, however, the linear box is bent round into a ring, the acceptable wave functions are those which are continuous at the 'join'. On this model, some sine functions that are acceptable in the linear box become unacceptable on the ring, and some cosine functions that were unacceptable become acceptable. This approach can be used to produce a straightforward derivation of the energy levels and wave functions of the particle on a ring. These simple wave-mechanical systems can be used as models of linear and cyclic delocalised systems such as conjugated hydrocarbons or the benzene ring. The promotion energy of an electron can then be used to calculate the wavelength of absorption of UV light. The simple model gives results of the correct order of magnitude and shows that, as the chain length increases, the UV maximum moves to longer wavelengths, as found experimentally.
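    In standard notation (not spelled out in the abstract), the continuity-at-the-join argument is the periodic boundary condition, and the resulting levels can be stated compactly:

```latex
% Ring of radius r: single-valuedness requires \psi(\phi) = \psi(\phi + 2\pi), so
\psi_n(\phi) = \frac{1}{\sqrt{2\pi}}\, e^{i n \phi},
\qquad n = 0, \pm 1, \pm 2, \ldots,
\qquad
E_n = \frac{n^2 \hbar^2}{2 m r^2}
% Compare the 1-D box of length L:
% \psi_n(x) = \sqrt{2/L}\,\sin(n\pi x/L), \qquad
% E_n = \frac{n^2 h^2}{8 m L^2}, \qquad n = 1, 2, \ldots
```

    Note the qualitative difference the abstract exploits: the ring admits n = 0 and doubly degenerate levels for ±n, whereas the box levels start at n = 1 and are nondegenerate, which changes the promotion energies used to estimate UV absorption wavelengths.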

  16. FOAM (Functional Ontology Assignments for Metagenomes): A Hidden Markov Model (HMM) database with environmental focus

    DOE PAGES

    Prestat, Emmanuel; David, Maude M.; Hultman, Jenni; ...

    2014-09-26

    A new functional gene database, FOAM (Functional Ontology Assignments for Metagenomes), was developed to screen environmental metagenomic sequence datasets. FOAM provides a new functional ontology dedicated to classify gene functions relevant to environmental microorganisms based on Hidden Markov Models (HMMs). Sets of aligned protein sequences (i.e. ‘profiles’) were tailored to a large group of target KEGG Orthologs (KOs) from which HMMs were trained. The alignments were checked and curated to make them specific to the targeted KO. Within this process, sequence profiles were enriched with the most abundant sequences available to maximize the yield of accurate classifier models. An associated functional ontology was built to describe the functional groups and hierarchy. FOAM allows the user to select the target search space before HMM-based comparison steps and to easily organize the results into different functional categories and subcategories. FOAM is publicly available at http://portal.nersc.gov/project/m1317/FOAM/.

  17. Work Functions for Models of Scandate Surfaces

    NASA Technical Reports Server (NTRS)

    Mueller, Wolfgang

    1997-01-01

    The electronic structure, surface dipole properties, and work functions of scandate surfaces have been investigated using the fully relativistic scattered-wave cluster approach. Three different types of model surfaces are considered: (1) a monolayer of Ba-Sc-O on W(100), (2) Ba or BaO adsorbed on Sc2O3 + W, and (3) BaO on Sc2O3 + WO3. Changes in the work function due to Ba or BaO adsorption on the different surfaces are calculated by employing the depolarization model of interacting surface dipoles. The largest work function change and the lowest work function of 1.54 eV are obtained for Ba adsorbed on the Sc-O monolayer on W(100). The adsorption of Ba on Sc2O3 + W does not lead to a low work function, but the adsorption of BaO results in a work function of about 1.6-1.9 eV. BaO adsorbed on Sc2O3 + WO3, or scandium tungstates, may also lead to low work functions.

  18. Use of selection indices to model the functional response of predators

    USGS Publications Warehouse

    Joly, D.O.; Patterson, B.R.

    2003-01-01

    The functional response of a predator to changing prey density is an important determinant of stability of predator-prey systems. We show how Manly's selection indices can be used to distinguish between hyperbolic and sigmoidal models of a predator functional response to primary prey density in the presence of alternative prey. Specifically, an inverse relationship between prey density and preference for that prey results in a hyperbolic functional response while a positive relationship can yield either a hyperbolic or sigmoidal functional response, depending on the form and relative magnitudes of the density-dependent preference model, attack rate, and handling time. As an example, we examine wolf (Canis lupus) functional response to moose (Alces alces) density in the presence of caribou (Rangifer tarandus). The use of selection indices to evaluate the form of the functional response has significant advantages over previous attempts to fit Holling's functional response curves to killing-rate data directly, including increased sensitivity, use of relatively easily collected data, and consideration of other explanatory factors (e.g., weather, seasons, productivity).
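    The hyperbolic and sigmoidal forms contrasted above are the classic Holling type II and type III responses, which can be written down directly (the attack rate and handling time below are illustrative values, not estimates from the wolf-moose system):

```python
def holling_type2(N, a, h):
    """Hyperbolic (type II) response: kill rate aN/(1 + ahN) saturates with prey density N."""
    return a * N / (1.0 + a * h * N)

def holling_type3(N, a, h):
    """Sigmoidal (type III) response: attack success scales with N, giving aN^2/(1 + ahN^2)."""
    return a * N ** 2 / (1.0 + a * h * N ** 2)

# The per-capita risk f(N)/N falls monotonically for type II, but rises at low N
# for type III -- the density-dependent-preference signature discussed above.
```

    Both curves approach the same plateau 1/h at high prey density; they differ only at low density, which is why distinguishing them from killing-rate data alone is hard and selection indices are useful.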

  19. Does Functional Neuroimaging Solve the Questions of Neurolinguistics?

    ERIC Educational Resources Information Center

    Sidtis, Diana Van Lancker

    2006-01-01

    Neurolinguistic research has been engaged in evaluating models of language using measures from brain structure and function, and/or in investigating brain structure and function with respect to language representation using proposed models of language. While the aphasiological strategy, which classifies aphasias based on performance modality and a…

  20. A perturbative approach to the redshift space correlation function: beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Bose, Benjamin; Koyama, Kazuya

    2017-08-01

    We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with <= 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h <= s <= 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.

  1. Extra permeability is required to model dynamic oxygen measurements: evidence for functional recruitment?

    PubMed Central

    Barrett, Matthew JP; Suresh, Vinod

    2013-01-01

    Neural activation triggers a rapid, focal increase in blood flow and thus oxygen delivery. Local oxygen consumption also increases, although not to the same extent as oxygen delivery. This 'uncoupling' enables a number of widely-used functional neuroimaging techniques; however, the physiologic mechanisms that govern oxygen transport under these conditions remain unclear. Here, we explore this dynamic process using a new mathematical model. Motivated by experimental observations and previous modeling, we hypothesized that functional recruitment of capillaries has an important role during neural activation. Using conventional mechanisms alone, the model predictions were inconsistent with in vivo measurements of oxygen partial pressure. However, dynamically increasing net capillary permeability, a simple description of functional recruitment, led to predictions consistent with the data. Increasing permeability in all vessel types had the same effect, but two alternative mechanisms were unable to produce predictions consistent with the data. These results are further evidence that conventional models of oxygen transport are not sufficient to predict dynamic experimental data. The data and modeling suggest that it is necessary to include a mechanism that dynamically increases net vascular permeability. While the model cannot distinguish between the different possibilities, we speculate that functional recruitment could have this effect in vivo. PMID:23673433

  2. Stability and Optimal Harvesting of Modified Leslie-Gower Predator-Prey Model

    NASA Astrophysics Data System (ADS)

    Toaha, S.; Azis, M. I.

    2018-03-01

    This paper studies the dynamics of a modified Leslie-Gower predator-prey population model. The model is stated as a system of first order differential equations and consists of one predator and one prey. The Holling type II function is considered as the predation function in this model. The predator and prey populations are assumed to be beneficial, and the two populations are harvested with constant efforts. Existence and stability of the interior equilibrium point are analysed. The linearization method is used to obtain the linearized model, and the eigenvalues are used to justify the stability of the interior equilibrium point. From the analyses, we show that under certain conditions the interior equilibrium point exists and is locally asymptotically stable. For the model with constant harvesting efforts, a cost function, revenue function, and profit function are considered. The stable interior equilibrium point is then related to the maximum profit problem as well as the net present value of revenues problem. We show that there exists a value of the efforts that maximizes the profit function and the net present value of revenues while the interior equilibrium point remains stable. This means that the populations can live in coexistence for a long time and also maximize the benefit even though the populations are harvested with constant efforts.
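    One common form of a modified Leslie-Gower model with Holling type II predation and constant-effort harvesting can be simulated directly. The specific equations and every parameter value below are illustrative assumptions, not taken from the paper; the sketch only shows the qualitative claim that the harvested populations settle to a positive interior equilibrium:

```python
def simulate(x0=1.0, y0=1.0, dt=0.01, steps=40_000):
    """Euler integration of an illustrative modified Leslie-Gower model:
    prey x with logistic growth, Holling II predation, and harvest effort E1;
    predator y with Leslie-Gower growth limited by prey, and harvest effort E2."""
    r1, b, a1, h = 1.0, 0.2, 0.6, 0.4   # prey growth, self-limitation, attack, handling
    r2, a2, k = 0.4, 0.5, 0.5           # predator growth, self-limitation, refuge constant
    E1, E2 = 0.1, 0.05                  # constant harvesting efforts
    x, y = x0, y0
    for _ in range(steps):
        dx = x * (r1 - b * x) - a1 * x * y / (1.0 + h * x) - E1 * x
        dy = y * (r2 - a2 * y / (x + k)) - E2 * y
        x, y = x + dt * dx, y + dt * dy
    return x, y

x_eq, y_eq = simulate()
# The trajectory settles at a positive interior equilibrium: coexistence under harvesting
```

    For these parameters the interior equilibrium can also be found in closed form (set both bracketed growth rates to zero), giving x ≈ 1.73 and y ≈ 1.56, which the simulation reproduces.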

  3. Spatial and functional modeling of carnivore and insectivore molariform teeth.

    PubMed

    Evans, Alistair R; Sanson, Gordon D

    2006-06-01

    The interaction between the two main competing geometric determinants of teeth (the geometry of function and the geometry of occlusion) was investigated through the construction of three-dimensional spatial models of several mammalian tooth forms (carnassial, insectivore premolar, zalambdodont, dilambdodont, and tribosphenic). These models aim to emulate the shape and function of mammalian teeth. The geometric principles of occlusion relating to single- and double-crested teeth are reviewed. Function was considered using engineering principles that relate tooth shape to function. Substantial similarity between the models and mammalian teeth was achieved. Differences between the two indicate the influence of tooth strength, geometric relations between upper and lower teeth (including the presence of the protocone), and wear on tooth morphology. The concept of "autocclusion" is expanded to include any morphological features that ensure proper alignment of cusps on the same tooth and other teeth in the tooth row. It is concluded that the tooth forms examined are auto-aligning, and do not require additional morphological guides for correct alignment. The model of therian molars constructed by Crompton and Sita-Lumsden ([1970] Nature 227:197-199) is reconstructed in 3D space to show that their hypothesis of crest geometry is erroneous, and that their model is a special case of a more general class of models. (c) 2004 Wiley-Liss, Inc.

  4. A Prototype Symbolic Model of Canonical Functional Neuroanatomy of the Motor System

    PubMed Central

    Rubin, Daniel L.; Halle, Michael; Musen, Mark; Kikinis, Ron

    2008-01-01

    Recent advances in bioinformatics have opened entire new avenues for organizing, integrating and retrieving neuroscientific data, in a digital, machine-processable format, which can be at the same time understood by humans, using ontological, symbolic data representations. Declarative information stored in ontological format can be perused and maintained by domain experts, interpreted by machines, and serve as basis for a multitude of decision-support, computerized simulation, data mining, and teaching applications. We have developed a prototype symbolic model of canonical neuroanatomy of the motor system. Our symbolic model is intended to support symbolic lookup, logical inference and mathematical modeling by integrating descriptive, qualitative and quantitative functional neuroanatomical knowledge. Furthermore, we show how our approach can be extended to modeling impaired brain connectivity in disease states, such as common movement disorders. In developing our ontology, we adopted a disciplined modeling approach, relying on a set of declared principles, a high-level schema, Aristotelian definitions, and a frame-based authoring system. These features, along with the use of the Unified Medical Language System (UMLS) vocabulary, enable the alignment of our functional ontology with an existing comprehensive ontology of human anatomy, and thus allow for combining the structural and functional views of neuroanatomy for clinical decision support and neuroanatomy teaching applications. Although the scope of our current prototype ontology is limited to a particular functional system in the brain, it may be possible to adapt this approach for modeling other brain functional systems as well. PMID:18164666

  5. Evaluation of atmospheric density models and preliminary functional specifications for the Langley Atmospheric Information Retrieval System (LAIRS)

    NASA Technical Reports Server (NTRS)

    Lee, T.; Boland, D. F., Jr.

    1980-01-01

    This document presents the results of an extensive survey and comparative evaluation of current atmosphere and wind models for inclusion in the Langley Atmospheric Information Retrieval System (LAIRS). It includes recommended models for use in LAIRS, estimated accuracies for the recommended models, and functional specifications for the development of LAIRS.

  6. Tactile Teaching: Exploring Protein Structure/Function Using Physical Models

    ERIC Educational Resources Information Center

    Herman, Tim; Morris, Jennifer; Colton, Shannon; Batiza, Ann; Patrick, Michael; Franzen, Margaret; Goodsell, David S.

    2006-01-01

    The technology now exists to construct physical models of proteins based on atomic coordinates of solved structures. We review here our recent experiences in using physical models to teach concepts of protein structure and function at both the high school and the undergraduate levels. At the high school level, physical models are used in a…

  7. From quantum affine groups to the exact dynamical correlation function of the Heisenberg model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bougourzi, A.H.; Couture, M.; Kacir, M.

    1997-01-20

    The exact form factors of the XXX and XXZ Heisenberg models have been recently computed through the quantum affine symmetry of the XXZ model in the thermodynamic limit. The authors use them to derive an exact formula for the contribution of two spinons to the dynamical correlation function of the XXX model at zero temperature.

  8. Asymptotic behaviour of two-point functions in multi-species models

    NASA Astrophysics Data System (ADS)

    Kozlowski, Karol K.; Ragoucy, Eric

    2016-05-01

    We extract the long-distance asymptotic behaviour of two-point correlation functions in massless quantum integrable models containing multi-species excitations. For such a purpose, we extend to these models the method of a large-distance regime re-summation of the form factor expansion of correlation functions. The key feature of our analysis is a technical hypothesis on the large-volume behaviour of the form factors of local operators in such models. We check the validity of this hypothesis on the example of the SU(3)-invariant XXX magnet by means of the determinant representations for the form factors of local operators in this model. Our approach confirms the structure of the critical exponents obtained previously for numerous models solvable by the nested Bethe Ansatz.

  9. Team functioning as a predictor of patient outcomes in early medical home implementation.

    PubMed

    Wu, Frances M; Rubenstein, Lisa V; Yoon, Jean

    New models of patient-centered primary care such as the patient-centered medical home (PCMH) depend on high levels of interdisciplinary primary care team functioning to achieve improved outcomes. A few studies have qualitatively assessed barriers and facilitators to optimal team functioning; however, we know of no prior study that assesses PCMH team functioning in relationship to patient health outcomes. The aim of the study was to assess the relationships between primary care team functioning, patients' use of acute care, and mortality. Retrospective longitudinal cohort analysis of patient outcomes measured at two time points (2012 and 2013) after PCMH implementation began in Veterans Health Administration practices. Multilevel models examined practice-level measures of team functioning in relationship to patient outcomes (all-cause and ambulatory care-sensitive condition-related hospitalizations, emergency department visits, and mortality). We controlled for practice-level factors likely to affect team functioning, including leadership support, provider and staff burnout, and staffing sufficiency, as well as for individual patient characteristics. We also tested the model among a subgroup of vulnerable patients (homeless, mentally ill, or with dementia). In adjusted analyses, higher team functioning was associated with lower mortality (OR = 0.92, p = .04) among all patients and with fewer all-cause admissions (incidence rate ratio [IRR] = 0.90, p < 0.01), ambulatory care-sensitive condition-related admissions (IRR = 0.91, p = .04), and emergency department visits (IRR = 0.91, p = .03) in the vulnerable patient subgroup. These early findings give support for the importance of team functioning within PCMH models for achieving improved patient outcomes. A focus on team functioning is important especially in the early implementation of team-based primary care models.

  10. Relationship of amotivation to neurocognition, self-efficacy and functioning in first-episode psychosis: a structural equation modeling approach.

    PubMed

    Chang, W C; Kwong, V W Y; Hui, C L M; Chan, S K W; Lee, E H M; Chen, E Y H

    2017-03-01

    Better understanding of the complex interplay among key determinants of functional outcome is crucial to promoting recovery in psychotic disorders. However, this is understudied in the early course of illness. We aimed to examine the relationships among negative symptoms, neurocognition, general self-efficacy and global functioning in first-episode psychosis (FEP) patients using structural equation modeling (SEM). Three hundred and twenty-one Chinese patients aged 26-55 years presenting with FEP to an early intervention program in Hong Kong were recruited. Assessments encompassing symptom profiles, functioning, perceived general self-efficacy and a battery of neurocognitive tests were conducted. Negative symptom measurement was subdivided into amotivation and diminished expression (DE) domain scores based on the ratings in the Scale for the Assessment of Negative Symptoms. An initial SEM model showed no significant association between functioning and DE which was removed from further analysis. A final trimmed model yielded very good model fit (χ2 = 15.48, p = 0.63; comparative fit index = 1.00; root mean square error of approximation <0.001) and demonstrated that amotivation, neurocognition and general self-efficacy had a direct effect on global functioning. Amotivation was also found to mediate a significant indirect effect of neurocognition and general self-efficacy on functioning. Neurocognition was not significantly related to general self-efficacy. Our results indicate a critical intermediary role of amotivation in linking neurocognitive impairment to functioning in FEP. General self-efficacy may represent a promising treatment target for improvement of motivational deficits and functional outcome in the early illness stage.

  11. Integrating research, clinical care, and education in academic health science centers.

    PubMed

    King, Gillian; Thomson, Nicole; Rothstein, Mitchell; Kingsnorth, Shauna; Parker, Kathryn

    2016-10-10

    Purpose One of the major issues faced by academic health science centers (AHSCs) is the need for mechanisms to foster the integration of research, clinical, and educational activities to achieve the vision of evidence-informed decision making (EIDM) and optimal client care. The paper aims to discuss this issue. Design/methodology/approach This paper synthesizes literature on organizational learning and collaboration, evidence-informed organizational decision making, and learning-based organizations to derive insights concerning the nature of effective workplace learning in AHSCs. Findings An evidence-informed model of collaborative workplace learning is proposed to aid the alignment of research, clinical, and educational functions in AHSCs. The model articulates relationships among AHSC academic functions and sub-functions, cross-functional activities, and collaborative learning processes, emphasizing the importance of cross-functional activities in enhancing collaborative learning processes and optimizing EIDM and client care. Cross-functional activities involving clinicians, researchers, and educators are hypothesized to be a primary vehicle for integration, supported by a learning-oriented workplace culture. These activities are distinct from interprofessional teams, which are clinical in nature. Four collaborative learning processes are specified that are enhanced in cross-functional activities or teamwork: co-constructing meaning, co-learning, co-producing knowledge, and co-using knowledge. Practical implications The model provides an aspirational vision and insight into the importance of cross-functional activities in enhancing workplace learning. The paper discusses the conceptual and empirical basis to the model, its contributions and limitations, and implications for AHSCs. Originality/value The model's potential utility for health care is discussed, with implications for organizational culture and the promotion of cross-functional activities.

  12. Chaotic simulated annealing by a neural network with a variable delay: design and application.

    PubMed

    Chen, Shyan-Shiou

    2011-10-01

    In this paper, we have three goals: the first is to delineate the advantages of a variably delayed system, the second is to find a more intuitive Lyapunov function for a delayed neural network, and the third is to design a delayed neural network for a quadratic cost function. For delayed neural networks, most researchers construct a Lyapunov function based on the linear matrix inequality (LMI) approach. However, that approach is not intuitive. We provide an alternative candidate Lyapunov function for a delayed neural network. On the other hand, if we are first given a quadratic cost function, we can construct a delayed neural network by suitably dividing the second-order term into two parts: a self-feedback connection weight and a delayed connection weight. To demonstrate the advantage of a variably delayed neural network, we propose a transiently chaotic neural network with variable delay and show numerically that the model should possess a better searching ability than Chen-Aihara's model, Wang's model, and Zhao's model. We discuss both the chaotic and the convergent phases. During the chaotic phase, we simply present bifurcation diagrams for a single neuron with a constant delay and with a variable delay. We show that the variably delayed model possesses the stochastic property and chaotic wandering. During the convergent phase, we not only provide a novel Lyapunov function for neural networks with a delay (the Lyapunov function is independent of the LMI approach) but also establish a correlation between the Lyapunov function for a delayed neural network and an objective function for the traveling salesman problem. © 2011 IEEE
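    The transient chaos mechanism underlying such networks, a self-feedback term annealed toward zero, can be sketched for a single neuron. This is a Chen-Aihara-style neuron with the paper's variable delay omitted, and every parameter value is a typical literature choice assumed here for illustration, not taken from this paper:

```python
import math

def tcnn_neuron(steps=20_000):
    """Single transiently chaotic neuron: the self-feedback gain z anneals to zero,
    moving the dynamics from a searching (chaotic) phase to a convergent phase."""
    eps, k, beta = 1.0 / 250.0, 0.9, 0.005    # output steepness, damping, annealing rate
    z, I0, alpha, I = 0.08, 0.65, 0.015, 0.5  # self-feedback, bias, scaling, external input
    y = 0.283
    xs = []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))  # neuron output
        xs.append(x)
        y = k * y + alpha * I - z * (x - I0)  # internal state update
        z *= (1.0 - beta)                     # simulated annealing of the chaos term
    return xs

xs = tcnn_neuron()
# Once z has annealed away, the map is a damped linear recursion and the output settles
```

    In an optimization setting, the annealed self-feedback plays the role of the temperature schedule: large z lets the state wander, and the decaying z freezes it onto a fixed point of the underlying network.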

  13. Connectotyping: Model Based Fingerprinting of the Functional Connectome

    PubMed Central

    Miranda-Dominguez, Oscar; Mills, Brian D.; Carpenter, Samuel D.; Grant, Kathleen A.; Kroenke, Christopher D.; Nigg, Joel T.; Fair, Damien A.

    2014-01-01

    A better characterization of how an individual’s brain is functionally organized will likely bring dramatic advances to many fields of study. Here we show a model-based approach toward characterizing resting state functional connectivity MRI (rs-fcMRI) that is capable of identifying a so-called “connectotype”, or functional fingerprint in individual participants. The approach rests on a simple linear model that proposes the activity of a given brain region can be described by the weighted sum of its functional neighboring regions. The resulting coefficients correspond to a personalized model-based connectivity matrix that is capable of predicting the timeseries of each subject. Importantly, the model itself is subject specific and has the ability to predict an individual at a later date using a limited number of non-sequential frames. While we show that there is a significant amount of shared variance between models across subjects, the model’s ability to discriminate an individual is driven by unique connections in higher order control regions in frontal and parietal cortices. Furthermore, we show that the connectotype is present in non-human primates as well, highlighting the translational potential of the approach. PMID:25386919
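    The model described above reduces to a per-region least-squares fit: each region's timeseries is regressed on all the others, and the coefficients form the subject's model-based connectivity matrix. A minimal sketch on synthetic data (the region count and the planted mixture are fabricated for illustration):

```python
import numpy as np

def connectotype(ts):
    """Fit B so that ts[:, i] ~= ts[:, others] @ B[i, others] for every region i.

    ts: (timepoints, regions) array. Returns a (regions, regions) coefficient
    matrix with zero diagonal -- the subject-specific connectivity matrix."""
    T, R = ts.shape
    B = np.zeros((R, R))
    for i in range(R):
        others = [j for j in range(R) if j != i]
        coef, *_ = np.linalg.lstsq(ts[:, others], ts[:, i], rcond=None)
        B[i, others] = coef
    return B

rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 5))
ts[:, 0] = 0.5 * ts[:, 1] + 0.3 * ts[:, 2]  # region 0 is a known mixture of regions 1 and 2
B = connectotype(ts)
```

    Because region 0 was constructed as an exact linear combination, the fit recovers the planted weights; with real rs-fcMRI data the rows of B instead summarize each region's functional neighborhood, and comparing matrices across sessions gives the fingerprinting described above.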

  14. A class of stochastic optimization problems with one quadratic & several linear objective functions and extended portfolio selection model

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Li, Jun

    2002-09-01

    In this paper a class of stochastic multiple-objective programming problems with one quadratic, several linear objective functions and linear constraints has been introduced. The former model is transformed into a deterministic multiple-objective nonlinear programming model by means of the introduction of random variables' expectation. The reference direction approach is used to deal with linear objectives and results in a linear parametric optimization formula with a single linear objective function. This objective function is combined with the quadratic function using the weighted sums. The quadratic problem is transformed into a linear (parametric) complementary problem, the basic formula for the proposed approach. The sufficient and necessary conditions for (properly, weakly) efficient solutions and some construction characteristics of (weakly) efficient solution sets are obtained. An interactive algorithm is proposed based on reference direction and weighted sums. Varying the parameter vector on the right-hand side of the model, the DM can freely search the efficient frontier with the model. An extended portfolio selection model is formed when liquidity is considered as another objective to be optimized besides expectation and risk. The interactive approach is illustrated with a practical example.

  15. Characterization and dynamic charge dependent modeling of conducting polymer trilayer bending

    NASA Astrophysics Data System (ADS)

    Farajollahi, Meisam; Sassani, Farrokh; Naserifar, Naser; Fannir, Adelyne; Plesse, Cédric; Nguyen, Giao T. M.; Vidal, Frédéric; Madden, John D. W.

    2016-11-01

    Trilayer bending actuators are charge-driven devices that have the ability to function in air and provide large mechanical amplification. The electronic and mechanical properties of these actuators are known to be functions of their charge state, making prediction of their responses more difficult when they operate over their full range of deformation. In this work, a combination of state-space representation and a two-dimensional RC transmission line model is used to implement a nonlinear time-variant model for conducting polymer-based trilayer actuators. The electrical conductivity and Young’s modulus of electromechanically active PEDOT-based conducting polymer films were measured as functions of applied voltage and incorporated into the model. A 16% drop in Young’s modulus and a 24-fold increase in conductivity are observed upon oxidizing the PEDOT. A closed-form formulation for the radius of curvature of trilayer actuators, considering asymmetric and location-dependent Young’s modulus and conductivity in the conducting polymer layers, is derived and implemented in the model. The nonlinear model is able to predict the radius of curvature as a function of time and position with reasonable consistency (within 4%). The formulation is applicable to general trilayer configurations for calculating the radius of curvature as a function of time. The proposed electrochemical modeling approach may also be useful for modeling energy storage devices.

  16. Evolution and development of model membranes for physicochemical and functional studies of the membrane lateral heterogeneity.

    PubMed

    Morigaki, Kenichi; Tanimoto, Yasushi

    2018-03-14

    One of the main questions in membrane biology concerns the functional roles of membrane heterogeneity and molecular localization. Although segregation and local enrichment of protein/lipid components (rafts) have been extensively studied, the presence and functions of such membrane domains still remain elusive. Along with biochemical, cell observation, and simulation studies, model membranes are emerging as an important tool for understanding the biological membrane, providing quantitative information on the physicochemical properties of membrane proteins and lipids. Segregation of the fluid lipid bilayer into liquid-ordered (Lo) and liquid-disordered (Ld) phases has been studied as a simplified model of rafts in model membranes, including giant unilamellar vesicles (GUVs), giant plasma membrane vesicles (GPMVs), and supported lipid bilayers (SLBs). Partition coefficients of membrane proteins between the Lo and Ld phases have been measured to gauge their affinities for lipid rafts (raftophilicity). One important development in model membranes is the patterned SLB, based on microfabrication technology. Patterned Lo/Ld phases have been applied to study the partition and function of membrane-bound molecules. Quantitative information on individual molecular species attained by model membranes is critical for elucidating molecular functions in the complex web of molecular interactions. The present review gives a short account of the model membranes developed for studying lateral heterogeneity, especially focusing on patterned model membranes on solid substrates. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    PubMed

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed with a focus on protein structure. In this study, we built an MRF model whose graphical topology is determined by residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weights of the MRF model. This structure-based MRF method was evaluated on three data sets, annotating catalytic site, allosteric site, and comprehensively determined functional site information, respectively. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information than the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computational complexity.

  18. Estimating soil hydraulic properties from soil moisture time series by inversion of a dual-permeability model

    NASA Astrophysics Data System (ADS)

    Dalla Valle, Nicolas; Wutzler, Thomas; Meyer, Stefanie; Potthast, Karin; Michalzik, Beate

    2017-04-01

    Dual-permeability models are widely used to simulate water fluxes and solute transport in structured soils. These models contain two spatially overlapping flow domains with different parameterizations or even entirely different conceptual descriptions of the flow processes. They are usually able to capture preferential flow phenomena, but they require a large set of parameters, which are laborious to obtain or cannot be measured at all. Therefore, model inversions are often used to derive the necessary parameters. Although these require sufficient input data themselves, they can use measurements of state variables instead, which are often easier to obtain and can be monitored by automated measurement systems. In this work we show a method to estimate soil hydraulic parameters from high-frequency soil moisture time series gathered at two different measurement depths by inversion of a simple one-dimensional dual-permeability model. The model uses an advection equation based on kinematic wave theory to describe flow in the fracture domain and a Richards equation for flow in the matrix domain. The soil moisture time series were measured in mesocosms during sprinkling experiments. The inversion consists of three consecutive steps. First, the parameters of the water retention function were assessed using vertical soil moisture profiles in hydraulic equilibrium; this was done using two different exponential retention functions and the Campbell function. Third, the parameters governing flow in the fracture domain were determined using the whole soil moisture time series. Second, the soil sorptivity and diffusivity functions were estimated from Boltzmann-transformed soil moisture data, which allowed calculation of the hydraulic conductivity function.
The resulting retention functions were within the range of values predicted by pedotransfer functions, except under very dry conditions, where all retention functions predicted lower matric potentials. The diffusivity function predicted values in a range similar to that reported in other studies. Overall, the model was able to emulate the soil moisture time series at shallow measurement depths, but deviated increasingly at larger depths. This indicates that some of the model parameters are not constant throughout the profile. However, overall seepage fluxes were still predicted correctly. In the near future we will apply the inversion method to lower-frequency soil moisture data from different sites to evaluate the model's ability to predict preferential flow seepage fluxes at the field scale.
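
The first inversion step can be illustrated with the Campbell retention function. The soil values and the two-point fitting shortcut below are illustrative assumptions, not the authors' calibration procedure.

```python
# Minimal sketch of step one: recovering Campbell retention parameters from a
# soil moisture profile in hydraulic equilibrium. All numbers are toy values.
import math

def campbell_psi(theta, theta_s, psi_e, b):
    """Campbell retention function: matric potential from water content."""
    return psi_e * (theta / theta_s) ** (-b)

# In hydraulic equilibrium, two (theta, psi) pairs from the profile are enough
# to determine the two unknowns b and psi_e.
theta_s = 0.45
pairs = [(0.40, -3.0), (0.30, -15.0)]  # (water content, matric potential [kPa])
(t1, p1), (t2, p2) = pairs
b = math.log(p2 / p1) / math.log(t1 / t2)  # slope in log-log space
psi_e = p1 / (t1 / theta_s) ** (-b)        # air-entry potential

print(b, psi_e, campbell_psi(0.35, theta_s, psi_e, b))
```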

  19. Optimal Linking Design for Response Model Parameters

    ERIC Educational Resources Information Center

    Barrett, Michelle D.; van der Linden, Wim J.

    2017-01-01

    Linking functions adjust for differences between identifiability restrictions used in different instances of the estimation of item response model parameters. These adjustments are necessary when results from those instances are to be compared. As linking functions are derived from estimated item response model parameters, parameter estimation…

  20. Evolution of optimal Hill coefficients in nonlinear public goods games.

    PubMed

    Archetti, Marco; Scheuring, István

    2016-10-07

    In evolutionary game theory, the effect of public goods like diffusible molecules has been modelled using linear, concave, sigmoid and step functions. The observation that biological systems often have sigmoid input-output functions, as described by the Hill equation, suggests that a sigmoid function is more realistic. The Michaelis-Menten model of enzyme kinetics, however, predicts a concave function, and while mechanistic explanations of sigmoid kinetics exist, we lack an adaptive explanation: what is the evolutionary advantage of a sigmoid benefit function? We analyse public goods games in which the shape of the benefit function can evolve, in order to determine the optimal and evolutionarily stable Hill coefficients. We find that, while the dynamics depend on whether output is controlled at the level of the individual or of the population, intermediate or high Hill coefficients often evolve, leading to sigmoid input-output functions that for some parameters are so steep as to resemble a step function (an on-off switch). Our results suggest that, even when the shape of the benefit function is unknown, biological public goods should be modelled using a sigmoid or step function rather than a linear or concave function. Copyright © 2016 Elsevier Ltd. All rights reserved.
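
The role of the Hill coefficient can be seen directly from the Hill equation; the parameter values below are illustrative, not taken from the paper.

```python
# Sketch of the Hill-equation benefit function: concave for h = 1
# (Michaelis-Menten-like), sigmoid for h > 1, approaching a step function
# (an on-off switch at x = k) as h grows large.
def hill(x, h, k=1.0):
    return x**h / (k**h + x**h)

for h in (1, 4, 20):
    print(h, [round(hill(x, h), 3) for x in (0.5, 1.0, 1.5)])
```

At the half-saturation point x = k the output is 0.5 for every h; the coefficient only controls how sharply the curve switches around that point.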

  1. Symmetric functions and wavefunctions of XXZ-type six-vertex models and elliptic Felderhof models by Izergin-Korepin analysis

    NASA Astrophysics Data System (ADS)

    Motegi, Kohei

    2018-05-01

    We present a method to analyze the wavefunctions of six-vertex models by extending the Izergin-Korepin analysis originally developed for domain wall boundary partition functions. First, we apply the method to the case of the basic wavefunctions of the XXZ-type six-vertex model. By giving the Izergin-Korepin characterization of the wavefunctions, we show that these wavefunctions can be expressed as multiparameter deformations of the quantum group deformed Grothendieck polynomials. As a second example, we show that the Izergin-Korepin analysis is effective for analysis of the wavefunctions for a triangular boundary and present the explicit forms of the symmetric functions representing these wavefunctions. As a third example, we apply the method to the elliptic Felderhof model which is a face-type version and an elliptic extension of the trigonometric Felderhof model. We show that the wavefunctions can be expressed as one-parameter deformations of an elliptic analog of the Vandermonde determinant and elliptic symmetric functions.

  2. Semi-parametric regression model for survival data: graphical visualization with R

    PubMed Central

    2016-01-01

    The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox proportional hazards model is that (I) assuming a specific parametric form for the underlying hazard function is stringent and unrealistic, and (II) researchers are often interested only in how the hazard changes with covariates (the relative hazard). A Cox regression model can easily be fit with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in the population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package. Such curves help to better understand the effect that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function. PMID:28090517
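
The quantity that coxph() maximizes is the Cox partial likelihood, which can be written out directly for a toy dataset. The data below (one binary covariate, no ties or censoring) and the crude grid search in place of Newton-Raphson are illustrative assumptions.

```python
# Sketch of the Cox partial likelihood for a single binary covariate.
# Each event time contributes exp(beta*x_i) / sum over the risk set.
import math

# (event_time, covariate): x = 1 subjects tend to fail earlier in this toy data.
data = [(1.0, 1), (2.0, 0), (3.0, 1), (4.0, 1), (5.0, 0), (6.0, 0)]

def log_partial_likelihood(beta):
    times = sorted(data)
    total = 0.0
    for i, (t_i, x_i) in enumerate(times):
        risk_set = times[i:]  # everyone still at risk at t_i
        denom = sum(math.exp(beta * x_j) for _, x_j in risk_set)
        total += beta * x_i - math.log(denom)
    return total

# Crude grid search over beta in [-3, 3].
grid = [b / 10 for b in range(-30, 31)]
beta_hat = max(grid, key=log_partial_likelihood)
print(beta_hat, math.exp(beta_hat))  # log hazard ratio and hazard ratio
```

Because the baseline hazard cancels out of every ratio, no parametric form for it is ever needed, which is exactly the semi-parametric character described above.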

  3. Analytical Model for Thermal Elastoplastic Stresses of Functionally Graded Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhai, P. C.; Chen, G.; Liu, L. S.

    2008-02-15

    A modified analytical model is presented for the thermal elastoplastic stresses of functionally graded materials subjected to thermal loading. The model follows the analytical scheme presented by Y. L. Shen and S. Suresh [6]. In the present model, the functionally graded material is treated as a multilayered material, with each layer consisting of metal and ceramic in a different volume fraction. The ceramic layer and the FGM interlayers are treated as elastic brittle materials, and the metal layer as an elastic-perfectly plastic ductile material. Closed-form solutions for the different characteristic temperatures under thermal loading are presented as functions of the structure geometries and the thermomechanical properties of the materials. A main advance of the present model is that it accounts for the possibility of plasticity initiating and spreading from the two sides of the ductile layers. Comparing the analytical results with results from finite element analysis, the thermal stresses and deformations from the present model are in good agreement with the numerical ones.

  4. Valid approximation of spatially distributed grain size distributions - A priori information encoded to a feedforward network

    NASA Astrophysics Data System (ADS)

    Berthold, T.; Milbradt, P.; Berkhahn, V.

    2018-04-01

    This paper presents a model for the approximation of multiple, spatially distributed grain size distributions based on a feedforward neural network. Since a classical feedforward network does not guarantee valid cumulative distribution functions, a priori information is incorporated into the model by applying weight and architecture constraints. The model is derived in two steps. First, a model is presented that is able to produce a valid distribution function for a single sediment sample. Although initially developed for sediment samples, the model is not limited to this application; it can also be used to approximate any other multimodal continuous distribution function. In the second part, the network is extended in order to capture the spatial variation of the sediment samples, which were obtained from 48 locations in the investigation area. Results show that the model provides an adequate approximation of grain size distributions, satisfying the requirements of a cumulative distribution function.
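
The constraint idea can be sketched with a one-layer mixture of sigmoids: non-negative mixture weights that sum to one, combined with positive slopes, guarantee a valid CDF by construction. The architecture and parameter values below are illustrative assumptions, not the authors' network.

```python
# With w_k >= 0, sum w_k = 1 (output bounded in [0, 1]) and a_k > 0
# (monotonically non-decreasing), the network output is always a valid CDF.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

weights = [0.6, 0.4]   # non-negative, sum to 1
slopes = [2.0, 5.0]    # strictly positive
centers = [-1.0, 2.0]  # two modes, i.e. a bimodal grain-size-like CDF

def cdf(x):
    return sum(w * sigmoid(a * (x - b))
               for w, a, b in zip(weights, slopes, centers))

xs = [-5 + 0.1 * i for i in range(101)]
values = [cdf(x) for x in xs]
assert all(v1 <= v2 for v1, v2 in zip(values, values[1:]))  # monotone
print(cdf(-5.0), cdf(5.0))  # tails approach 0 and 1
```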

  5. Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.

    2003-01-01

    This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators, using a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilevered aluminum beam with a surface-mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are presented in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.

  6. Quantum random oracle model for quantum digital signature

    NASA Astrophysics Data System (ADS)

    Shang, Tao; Lei, Qi; Liu, Jianwei

    2016-10-01

    The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.

  7. Random medium model for cusping of plane waves.

    PubMed

    Li, Jia; Korotkova, Olga

    2017-09-01

    We introduce a model for a three-dimensional (3D) Schell-type stationary medium whose degree of potential correlation satisfies the Fractional Multi-Gaussian (FMG) function. Compared with the scattered profile produced by the Gaussian Schell-model (GSM) medium, the Fractional Multi-Gaussian Schell-model (FMGSM) medium gives rise to a sharp concave intensity apex in the scattered field. This implies that the FMGSM medium also accounts for a larger than Gaussian power in the bucket (PIB) in the forward scattering direction, hence being a better candidate than the GSM medium for generating highly focused (cusp-like) scattered profiles in the far zone. Compared to other mathematical models for the medium's correlation function that can produce similar cusped scattered profiles, the FMG function offers unprecedented tractability, being a weighted superposition of Gaussian functions. Our results provide useful applications to energy counter problems and particle manipulation by weakly scattered fields.

  8. [Mathematic concept model of accumulation of functional disorders associated with environmental factors].

    PubMed

    Zaĭtseva, N V; Trusov, P V; Kir'ianov, D A

    2012-01-01

    The mathematical concept model presented describes the accumulation of functional disorders associated with environmental factors, plays a predictive role, and is designed for assessing the possible effects caused by heterogeneous factors with variable exposures. Considering exposure changes together with the self-restoration process opens prospects for using the model to evaluate, analyse and manage occupational risks. To develop current theoretical approaches, the authors suggest a model that considers age-related features of the body, systemic interactions of organs including neuro-humoral regulation, the accumulation of functional disorders due to external factors, and the rehabilitation of functions during treatment. The general problem setting involves defining over a hundred unknown coefficients that characterize the speed of various processes within the body. To solve this problem, the authors used an iterative, successive-identification approach that starts from a certain initial approximation of the model parameters and subsequently updates it on the basis of new theoretical and empirical knowledge.
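
The accumulation-with-self-restoration dynamic described above can be sketched as a simple balance equation. The linear form dD/dt = k_in * E(t) - k_out * D and all coefficient values are illustrative assumptions, not the authors' model.

```python
# Minimal sketch: functional-disorder damage D grows with exposure E(t) and
# decays through self-restoration. Forward-Euler integration; toy coefficients.

def simulate(exposure, k_in=0.05, k_out=0.02, dt=1.0, steps=200):
    damage = 0.0
    history = []
    for t in range(steps):
        damage += dt * (k_in * exposure(t) - k_out * damage)
        history.append(damage)
    return history

constant = simulate(lambda t: 1.0)
# With constant exposure, damage approaches the equilibrium k_in / k_out = 2.5.
print(constant[-1])

removed = simulate(lambda t: 1.0 if t < 100 else 0.0)
# After exposure ends, self-restoration drives damage back toward zero.
print(removed[99], removed[-1])
```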

  9. Task Analytic Models to Guide Analysis and Design: Use of the Operator Function Model to Represent Pilot-Autoflight System Mode Problems

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Mitchell, Christine M.; Chappell, Alan R.; Shafto, Mike (Technical Monitor)

    1995-01-01

    Task-analytic models structure essential information about operator interaction with complex systems, in this case pilot interaction with the autoflight system. Such models serve two purposes: (1) they allow researchers and practitioners to understand pilots' actions; and (2) they provide a compact, computational representation needed to design 'intelligent' aids, e.g., displays, assistants, and training systems. This paper demonstrates the use of the operator function model to trace the process of mode engagements while a pilot is controlling an aircraft via the autoflight system. The operator function model is a normative and nondeterministic model of how a well-trained, well-motivated operator manages multiple concurrent activities for effective real-time control. For each function, the model links the pilot's actions with the required information. Using the operator function model, this paper describes several mode engagement scenarios. These scenarios were observed and documented during a field study that focused on mode engagements and mode transitions during normal line operations. Data including time, ATC clearances, altitude, system states, active modes and sub-modes, and mode engagements were recorded during sixty-six flights. Using these data, seven prototypical mode engagement scenarios were extracted. One scenario details the decision of the crew to disengage a fully automatic mode in favor of a semi-automatic mode, and the consequences of this action. Another describes a mode error involving updating aircraft speed following the engagement of a speed submode. Other scenarios detail mode confusion at various phases of the flight. This analysis uses the operator function model to identify three aspects of mode engagement: (1) the progress of pilot-aircraft-autoflight system interaction; (2) the control/display information required to perform mode management activities; and (3) the potential cause(s) of mode confusion.
The goal of this paper is twofold: (1) to demonstrate the use of the operator function model methodology to describe pilot-system interaction while engaging modes and monitoring the system, and (2) to initiate a discussion of how task-analytic models might inform design processes. While the operator function model is only one type of task-analytic representation, the hypothesis of this paper is that some type of task-analytic structure is a prerequisite for the design of effective human-automation interaction.

  10. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning

    PubMed Central

    Bath, Kevin G.; Daw, Nathaniel D.; Frank, Michael J.

    2016-01-01

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by “model-free” learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by “model-based” learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. 
SIGNIFICANCE STATEMENT Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes. Research implicates a dopamine-dependent striatal learning mechanism in the former type of choice. Although recent work has indicated that dopamine is also involved in flexible, goal-directed decision-making, it remains unclear whether it also contributes via striatum or via the dopamine-dependent working memory function of prefrontal cortex. We examined genetic indices of dopamine function in these regions and their relation to the two choice strategies. We found that striatal dopamine function related most clearly to the reflexive strategy, as previously shown, and that prefrontal dopamine related most clearly to the flexible strategy. These findings suggest that dissociable brain regions support dissociable choice strategies. PMID:26818509

  11. Penalized nonparametric scalar-on-function regression via principal coordinates

    PubMed Central

    Reiss, Philip T.; Miller, David L.; Wu, Pei-Shien; Hua, Wen-Yu

    2016-01-01

    A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This paper introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. PMID:29217963

  12. Analysis of the Yule-Nielsen effect with the multiple-path point spread function in a frequency-modulated halftone.

    PubMed

    Rogers, Geoffrey

    2018-06-01

    The Yule-Nielsen effect is an influence on halftone color caused by the diffusion of light within the paper upon which the halftone ink is printed. The diffusion can be characterized by a point spread function. In this paper, a point spread function for paper is derived using the multiple-path model of reflection. This model treats the interaction of light with turbid media as a random walk. Using the multiple-path point spread function, a general expression is derived for the average reflectance of light from a frequency-modulated halftone, in which dot size is constant and the number of dots is varied, with the arrangement of dots random. It is also shown that the line spread function derived from the multiple-path model has the form of a Lorentzian function.
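
For context, the Yule-Nielsen effect is commonly summarized by the Yule-Nielsen-modified Murray-Davies equation, in which an empirical n-factor absorbs the optical dot gain caused by lateral light spread in the paper. The reflectance values below are illustrative assumptions.

```python
# Yule-Nielsen-modified Murray-Davies equation: average halftone reflectance
# for fractional ink coverage a; n = 1 recovers the naive Murray-Davies
# average, n > 1 models light diffusion in the paper.
def yule_nielsen_reflectance(a, r_ink, r_paper, n):
    return (a * r_ink ** (1 / n) + (1 - a) * r_paper ** (1 / n)) ** n

r_murray_davies = yule_nielsen_reflectance(0.5, 0.05, 0.9, n=1.0)  # no scattering
r_yule_nielsen = yule_nielsen_reflectance(0.5, 0.05, 0.9, n=2.0)   # with spreading
# Light diffusion (n > 1) darkens the halftone relative to the naive average.
print(r_murray_davies, r_yule_nielsen)
```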

  13. Functional interaction-based nonlinear models with application to multiplatform genomics data.

    PubMed

    Davenport, Clemontina A; Maity, Arnab; Baladandayuthapani, Veerabhadran

    2018-05-07

    Functional regression allows for a scalar response to be dependent on a functional predictor; however, not much work has been done when a scalar exposure that interacts with the functional covariate is introduced. In this paper, we present 2 functional regression models that account for this interaction and propose 2 novel estimation procedures for the parameters in these models. These estimation methods allow for a noisy and/or sparsely observed functional covariate and are easily extended to generalized exponential family responses. We compute standard errors of our estimators, which allows for further statistical inference and hypothesis testing. We compare the performance of the proposed estimators to each other and to one found in the literature via simulation and demonstrate our methods using a real data example. Copyright © 2018 John Wiley & Sons, Ltd.

  14. A numerical study of the string function using a primitive equation ocean model

    NASA Astrophysics Data System (ADS)

    Tyler, R. H.; Käse, R.

    We use results from a primitive-equation ocean numerical model (SCRUM) to test a theoretical 'string function' formulation put forward by Tyler and Käse in another article in this issue. The string function acts as a stream function for the large-scale potential energy flow under the combined beta and topographic effects. The model results verify that large-scale anomalies propagate along the string function contours with a speed correctly given by the cross-string gradient. For anomalies having a scale similar to the Rossby radius, material rates of change in the layer mass following the string velocity are balanced by material rates of change in relative vorticity following the flow velocity. It is shown that large-amplitude anomalies can be generated when wind stress is resonant with the string function configuration.

  15. Improvements to Fidelity, Generation and Implementation of Physics-Based Lithium-Ion Reduced-Order Models

    NASA Astrophysics Data System (ADS)

    Rodriguez Marco, Albert

    Battery management systems (BMS) require computationally simple but highly accurate models of the battery cells they are monitoring and controlling. Historically, empirical equivalent-circuit models have been used, but increasingly researchers are focusing their attention on physics-based models due to their greater predictive capabilities. These models have high intrinsic computational complexity and so must undergo some kind of order-reduction process to make their use by a BMS feasible: we favor methods based on a transfer-function approach to battery-cell dynamics. In prior works, transfer functions have been found from full-order PDE models via two simplifying assumptions: (1) a linearization assumption, which is a fundamental necessity in order to make transfer functions, and (2) an assumption made out of expedience that decouples the electrolyte-potential and electrolyte-concentration PDEs in order to render an approach to solve for the transfer functions from the PDEs. This dissertation improves the fidelity of physics-based models by eliminating the need for the second assumption and by linearizing the nonlinear dynamics around different constant currents. Electrochemical transfer functions are infinite-order and cannot be expressed as a ratio of polynomials in the Laplace variable s. Thus, for practical use, these systems need to be approximated using reduced-order models that capture the most significant dynamics. This dissertation improves the generation of physics-based reduced-order models by introducing different realization algorithms, which produce a low-order model from the infinite-order electrochemical transfer functions. Physics-based reduced-order models are linear and describe cell dynamics well if operated near the setpoint at which they were generated. Hence, multiple physics-based reduced-order models need to be generated at different setpoints (i.e., state-of-charge, temperature and C-rate) in order to extend the cell operating range.
This dissertation improves the implementation of physics-based reduced-order models by introducing different blending approaches that combine the pre-computed models generated (offline) at different setpoints in order to produce good electrochemical estimates (online) along the cell state-of-charge, temperature and C-rate range.
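The setpoint-blending idea described above can be sketched as follows. This is a minimal, hypothetical illustration, not the dissertation's method: the scalar model matrices, setpoints, and linear output interpolation are all invented for demonstration, whereas a real BMS would blend full state-space ROMs.

```python
# Hypothetical illustration: blending the outputs of two pre-computed
# linear reduced-order models (ROMs), each valid near its own
# state-of-charge (SOC) setpoint. All numbers below are made up.

def rom_step(x, u, A, B):
    """One discrete-time state update x[k+1] = A*x[k] + B*u[k] (scalar ROM)."""
    return A * x + B * u

def blend_outputs(y_low, y_high, soc, soc_low=0.2, soc_high=0.8):
    """Linearly interpolate between the two setpoint-model outputs by SOC."""
    w = (soc - soc_low) / (soc_high - soc_low)
    w = min(max(w, 0.0), 1.0)          # clamp outside the setpoint range
    return (1.0 - w) * y_low + w * y_high

# Two scalar ROMs (A, B, C) linearized at SOC = 0.2 and SOC = 0.8.
rom_lo = dict(A=0.95, B=0.10, C=1.0, x=0.0)
rom_hi = dict(A=0.90, B=0.12, C=1.0, x=0.0)

u = 1.0   # constant applied current (normalized)
for rom in (rom_lo, rom_hi):
    rom["x"] = rom_step(rom["x"], u, rom["A"], rom["B"])

# Midway between the setpoints, both models get equal weight.
y = blend_outputs(rom_lo["C"] * rom_lo["x"],
                  rom_hi["C"] * rom_hi["x"],
                  soc=0.5)
```

The same weighting can be applied to model parameters rather than outputs; which blending variant works best is exactly the design question the dissertation studies.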

  16. AptRank: an adaptive PageRank model for protein function prediction on bi-relational graphs.

    PubMed

    Jiang, Biaobin; Kloster, Kyle; Gleich, David F; Gribskov, Michael

    2017-06-15

    Diffusion-based network models are widely used for protein function prediction from protein network data and have been shown to outperform neighborhood-based and module-based methods. Recent studies have shown that integrating the hierarchical structure of the Gene Ontology (GO) data dramatically improves prediction accuracy. However, previous methods usually either used the GO hierarchy to refine the prediction results of multiple classifiers, or flattened the hierarchy into a function-function similarity kernel. No study has combined the GO hierarchy with the protein network in a two-layer network model. We first construct a bi-relational graph (Birg) model comprising both the protein-protein association network and the function-function hierarchical network. We then propose two diffusion-based methods, BirgRank and AptRank, both of which use PageRank to diffuse information on this two-layer graph model. BirgRank is a direct application of traditional PageRank with fixed decay parameters, whereas AptRank uses an adaptive diffusion mechanism to improve on BirgRank. We evaluate the ability of both methods to predict protein function on yeast, fly, and human protein datasets, and compare them with four previous methods: GeneMANIA, TMC, ProteinRank, and clusDCA. We design four validation strategies (missing function prediction, de novo function prediction, guided function prediction, and newly discovered function prediction) to comprehensively evaluate the predictive performance of all six methods. We find that both BirgRank and AptRank outperform the previous methods, especially in missing function prediction when using only 10% of the data for training. The MATLAB code is available at https://github.rcac.purdue.edu/mgribsko/aptrank. Supplementary data are available at Bioinformatics online.
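The diffusion step shared by both BirgRank and AptRank is, at its core, PageRank on a two-layer graph. The sketch below shows plain personalized PageRank by power iteration on a tiny hand-made graph whose nodes stand for proteins and GO terms; the adjacency, restart vector, and parameters are illustrative only, not the paper's actual construction.

```python
# Minimal sketch: personalized PageRank by power iteration on a small
# two-layer graph (protein-protein edges, function-function edges, and
# one protein-to-function annotation edge). Illustrative only.

def pagerank(adj, restart, alpha=0.85, iters=200):
    """Power iteration: p <- alpha * (random-walk step of p) + (1-alpha) * restart."""
    n = len(adj)
    out_deg = [sum(adj[i]) for i in range(n)]   # row sums for row-normalization
    p = restart[:]
    for _ in range(iters):
        new = [(1.0 - alpha) * restart[j] for j in range(n)]
        for i in range(n):
            if out_deg[i] == 0:
                continue                         # dangling node: drop its mass
            for j in range(n):
                if adj[i][j]:
                    new[j] += alpha * p[i] * adj[i][j] / out_deg[i]
        p = new
    return p

# Nodes 0-1: proteins (protein-protein layer); nodes 2-3: GO terms
# (function-function hierarchy); the 1-2 edge is a known annotation.
adj = [[0, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 0]]
restart = [1.0, 0.0, 0.0, 0.0]   # diffuse from protein 0
scores = pagerank(adj, restart)  # GO-term scores rank candidate functions
```

AptRank's contribution is to adapt the diffusion weights instead of fixing alpha, but the underlying propagation is the same.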

  17. Bayesian inference of galaxy formation from the K-band luminosity function of galaxies: tensions between theory and observation

    NASA Astrophysics Data System (ADS)

    Lu, Yu; Mo, H. J.; Katz, Neal; Weinberg, Martin D.

    2012-04-01

    We conduct Bayesian model inferences from the observed K-band luminosity function of galaxies in the local Universe, using the semi-analytic model (SAM) of galaxy formation introduced in Lu et al. The prior distributions for the 14 free parameters include a large range of possible models. We find that some of the free parameters, e.g. the characteristic scales for quenching star formation in both high-mass and low-mass haloes, are already tightly constrained by this single data set. The posterior distribution includes the model parameters adopted in other SAMs. By marginalizing over the posterior distribution, we make predictions that include the full inferential uncertainties for the colour-magnitude relation, the Tully-Fisher relation, the conditional stellar mass function of galaxies in haloes of different masses, the H I mass function, the redshift evolution of the stellar mass function of galaxies and the global star formation history. Using posterior predictive checking with the available observational results, we find that the model family (i) predicts a Tully-Fisher relation that is curved; (ii) significantly overpredicts the satellite fraction; (iii) vastly overpredicts the H I mass function; (iv) predicts high-z stellar mass functions that have too many low-mass galaxies and too few high-mass ones and (v) predicts a redshift evolution of the stellar mass density and the star formation history that are in moderate disagreement with observations. These results suggest that some important processes are still missing in the current model family, and we discuss a number of possible solutions to solve the discrepancies, such as interactions between galaxies and dark matter haloes, tidal stripping, the bimodal accretion of gas, preheating and a redshift-dependent initial mass function.

  18. Wavelet-based functional linear mixed models: an application to measurement error-corrected distributed lag models.

    PubMed

    Malloy, Elizabeth J; Morris, Jeffrey S; Adar, Sara D; Suh, Helen; Gold, Diane R; Coull, Brent A

    2010-07-01

    Frequently, exposure data are measured over time on a grid of discrete values that collectively define a functional observation. In many applications, researchers are interested in using these measurements as covariates to predict a scalar response in a regression setting, with interest focusing on the most biologically relevant time window of exposure. One example is in panel studies of the health effects of particulate matter (PM), where particle levels are measured over time. In such studies, there are many more values of the functional data than observations in the data set so that regularization of the corresponding functional regression coefficient is necessary for estimation. Additional issues in this setting are the possibility of exposure measurement error and the need to incorporate additional potential confounders, such as meteorological or co-pollutant measures, that themselves may have effects that vary over time. To accommodate all these features, we develop wavelet-based linear mixed distributed lag models that incorporate repeated measures of functional data as covariates into a linear mixed model. A Bayesian approach to model fitting uses wavelet shrinkage to regularize functional coefficients. We show that, as long as the exposure error induces fine-scale variability in the functional exposure profile and the distributed lag function representing the exposure effect varies smoothly in time, the model corrects for the exposure measurement error without further adjustment. Both these conditions are likely to hold in the environmental applications we consider. We examine properties of the method using simulations and apply the method to data from a study examining the association between PM, measured as hourly averages for 1-7 days, and markers of acute systemic inflammation. We use the method to fully control for the effects of confounding by other time-varying predictors, such as temperature and co-pollutants.
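The regularization idea at the heart of the method above, shrinking wavelet coefficients of a functional coefficient toward zero, can be illustrated generically. This is a hedged sketch of wavelet shrinkage with a plain Haar transform and soft thresholding, not the paper's Bayesian mixed-model implementation; signal values and the threshold are invented.

```python
# Generic illustration of wavelet shrinkage: Haar decomposition of a
# length-2^k signal followed by soft thresholding of detail coefficients.

def haar_forward(x):
    """Full Haar decomposition; returns (final approximation, detail levels)."""
    coeffs = []
    approx = list(x)
    while len(approx) > 1:
        half = len(approx) // 2
        detail = [(approx[2*i] - approx[2*i+1]) / 2.0 for i in range(half)]
        approx = [(approx[2*i] + approx[2*i+1]) / 2.0 for i in range(half)]
        coeffs.append(detail)          # finest level first, coarsest last
    return approx, coeffs

def soft_threshold(c, lam):
    """Shrink a coefficient toward zero by lam (soft thresholding)."""
    if c > lam:
        return c - lam
    if c < -lam:
        return c + lam
    return 0.0

# Small fine-scale wiggles are suppressed; the large coarse jump survives.
approx, details = haar_forward([4.0, 4.1, 3.9, 4.0, 9.0, 9.1, 8.9, 9.0])
shrunk = [[soft_threshold(c, 0.06) for c in level] for level in details]
```

In the paper's setting, this kind of shrinkage is what lets fine-scale exposure measurement error be absorbed while a smooth distributed lag function is retained.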

  19. 3D finite element model of the chinchilla ear for characterizing middle ear functions

    PubMed Central

    Wang, Xuelin; Gan, Rong Z.

    2016-01-01

    Chinchilla is a commonly used animal model for research of sound transmission through the ear. Experimental measurements of the middle ear transfer function in chinchillas have shown that the middle ear cavity greatly affects the tympanic membrane (TM) and stapes footplate (FP) displacements. However, there is no finite element (FE) model of the chinchilla ear available in the literature to characterize the middle ear functions with the anatomical features of the chinchilla ear. This paper reports a recently completed 3D FE model of the chinchilla ear based on X-ray micro-computed tomography images of a chinchilla bulla. The model consisted of the ear canal, TM, middle ear ossicles and suspensory ligaments, and the middle ear cavity. Two boundary conditions of the middle ear cavity wall were simulated in the model, as a rigid structure and as a partially flexible surface, and acoustic-mechanical coupled analysis was conducted with these two conditions to characterize the middle ear function. The model results were compared with experimental measurements reported in the literature, including the TM and FP displacements and the middle ear input admittance in the chinchilla ear. An application of this model was presented to identify the acoustic role of the middle ear septa, a unique feature of the chinchilla middle ear cavity. This study provides the first 3D FE model of the chinchilla ear for characterizing the middle ear functions through acoustic-mechanical coupled FE analysis. PMID:26785845

  20. Effective model hierarchies for dynamic and static classical density functional theories

    NASA Astrophysics Data System (ADS)

    Majaniemi, S.; Provatas, N.; Nonomura, M.

    2010-09-01

    The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.

  1. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.

  2. Free-energy-based lattice Boltzmann model for the simulation of multiphase flows with density contrast.

    PubMed

    Shao, J Y; Shu, C; Huang, H B; Chew, Y T

    2014-03-01

    A free-energy-based phase-field lattice Boltzmann method is proposed in this work to simulate multiphase flows with density contrast. The present method is to improve the Zheng-Shu-Chew (ZSC) model [Zheng, Shu, and Chew, J. Comput. Phys. 218, 353 (2006)] for correct consideration of density contrast in the momentum equation. The original ZSC model uses the particle distribution function in the lattice Boltzmann equation (LBE) for the mean density and momentum, which cannot properly consider the effect of local density variation in the momentum equation. To correctly consider it, the particle distribution function in the LBE must be for the local density and momentum. However, when the LBE of such distribution function is solved, it will encounter a severe numerical instability. To overcome this difficulty, a transformation, which is similar to the one used in the Lee-Lin (LL) model [Lee and Lin, J. Comput. Phys. 206, 16 (2005)] is introduced in this work to change the particle distribution function for the local density and momentum into that for the mean density and momentum. As a result, the present model still uses the particle distribution function for the mean density and momentum, and in the meantime, considers the effect of local density variation in the LBE as a forcing term. Numerical examples demonstrate that both the present model and the LL model can correctly simulate multiphase flows with density contrast, and the present model has an obvious improvement over the ZSC model in terms of solution accuracy. In terms of computational time, the present model is less efficient than the ZSC model, but is much more efficient than the LL model.

  3. A Scalable Heuristic for Viral Marketing Under the Tipping Model

    DTIC Science & Technology

    2013-09-01

    removal of high-degree nodes. The rest of the paper is organized as follows. In Section 2, we provide formal definitions of the tipping model. This is...that must be activated for it to become active as well. Definition 1 (Threshold...returns a set of active nodes after one time step. Definition 2 (Activation Function) Given a threshold function, θ, an activation function Aθ maps
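The activation function Aθ mentioned in the fragment above is the standard linear threshold step, which can be sketched directly. The graph, thresholds, and seed set below are illustrative, not from the report.

```python
# One synchronous step of the tipping (linear threshold) model: a node
# activates when the number of its already-active neighbors reaches its
# threshold. Graph and thresholds are invented for illustration.

def activation_step(adj, theta, active):
    """Apply A_theta once: return the active set after one time step."""
    new_active = set(active)
    for node, neighbors in adj.items():
        if node in active:
            continue                   # activation is monotone: stays active
        if sum(1 for n in neighbors if n in active) >= theta[node]:
            new_active.add(node)
    return new_active

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
theta = {0: 1, 1: 1, 2: 2, 3: 1}
seed = {0}
step1 = activation_step(adj, theta, seed)    # node 1 tips first
step2 = activation_step(adj, theta, step1)   # then node 2, and later node 3
```

Iterating `activation_step` to a fixed point gives the final activated set for a given seed, which is the quantity a viral-marketing heuristic tries to maximize.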

  4. An object model and database for functional genomics.

    PubMed

    Jones, Andrew; Hunt, Ela; Wastling, Jonathan M; Pizarro, Angel; Stoeckert, Christian J

    2004-07-10

    Large-scale functional genomics analysis is now feasible and presents significant challenges in data analysis, storage and querying. Data standards are required to enable the development of public data repositories and to improve data sharing. There is an established data format for microarrays (microarray gene expression markup language, MAGE-ML) and a draft standard for proteomics (PEDRo). We believe that all types of functional genomics experiments should be annotated in a consistent manner, and we hope to open up new ways of comparing multiple datasets used in functional genomics. We have created a functional genomics experiment object model (FGE-OM), developed from the microarray model, MAGE-OM and two models for proteomics, PEDRo and our own model (Gla-PSI-Glasgow Proposal for the Proteomics Standards Initiative). FGE-OM comprises three namespaces representing (i) the parts of the model common to all functional genomics experiments; (ii) microarray-specific components; and (iii) proteomics-specific components. We believe that FGE-OM should initiate discussion about the contents and structure of the next version of MAGE and the future of proteomics standards. A prototype database called RNA And Protein Abundance Database (RAPAD), based on FGE-OM, has been implemented and populated with data from microbial pathogenesis. FGE-OM and the RAPAD schema are available from http://www.gusdb.org/fge.html, along with a set of more detailed diagrams. RAPAD can be accessed by registration at the site.

  5. Balance Confidence: A Predictor of Perceived Physical Function, Perceived Mobility, and Perceived Recovery 1 Year After Inpatient Stroke Rehabilitation.

    PubMed

    Torkia, Caryne; Best, Krista L; Miller, William C; Eng, Janice J

    2016-07-01

    To estimate the effect of balance confidence measured at 1 month poststroke rehabilitation on perceived physical function, mobility, and stroke recovery 12 months later. Longitudinal study (secondary analysis). Multisite, community-based. Community-dwelling individuals (N=69) with stroke living in a home setting. Not applicable. Activities-specific Balance Confidence scale; physical function and mobility subscales of the Stroke Impact Scale 3.0; and a single item from the Stroke Impact Scale for perceived recovery. Balance confidence at 1 month postdischarge from inpatient rehabilitation predicts perceived physical function (model 1), mobility (model 2), and recovery (model 3) 12 months later after adjusting for important covariates. The covariates included in model 1 were age, sex, basic mobility, and depression. The covariates selected for model 2 were age, sex, balance capacity, and anxiety, and the covariates in model 3 were age, sex, walking capacity, and social support. The amount of variance in perceived physical function, perceived mobility, and perceived recovery that balance confidence accounted for was 12%, 9%, and 10%, respectively. After discharge from inpatient rehabilitation poststroke, balance confidence predicts individuals' perceived physical function, mobility, and recovery 12 months later. There is a need to address balance confidence at discharge from inpatient stroke rehabilitation. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  6. N-3 fatty acids and membrane microdomains: from model membranes to lymphocyte function.

    PubMed

    Shaikh, Saame Raza; Teague, Heather

    2012-12-01

    This article summarizes the authors' research on fish oil derived n-3 fatty acids, plasma membrane organization and B cell function. We first cover basic model membrane studies that investigated how docosahexaenoic acid (DHA) targeted the organization of sphingolipid-cholesterol enriched lipid microdomains. A key finding here was that DHA had a relatively poor affinity for cholesterol. This work led to a model that predicted DHA acyl chains in cells would manipulate lipid-protein microdomain organization and thereby function. We then review how the predictions of the model were tested with B cells in vitro, followed by experiments using mice fed fish oil. These studies reveal a highly complex picture of how n-3 fatty acids target lipid-protein organization and B cell function. Key findings are as follows: (1) n-3 fatty acids target not just the plasma membrane but also endomembrane organization; (2) DHA, but not eicosapentaenoic acid (EPA), disrupts microdomain spatial distribution (i.e. clustering); (3) DHA alters protein lateral organization; and (4) changes in membrane organization are accompanied by functional effects on both innate and adaptive B cell function. Altogether, the research over the past 10 years has led to an evolution of the original model of how DHA reorganizes membrane microdomains. The work raises the intriguing possibility of testing the model at the human level to target health and disease. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A quantitative systems physiology model of renal function and blood pressure regulation: Model description.

    PubMed

    Hallow, K M; Gebremichael, Y

    2017-06-01

    Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  8. Volterra-series-based nonlinear system modeling and its engineering applications: A state-of-the-art review

    NASA Astrophysics Data System (ADS)

    Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.

    2017-03-01

    Nonlinear problems have drawn great interest and extensive attention from engineers, physicists and mathematicians and many other scientists because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency domain concepts derived from it, including the general frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationship between the Volterra series and other nonlinear system models and nonlinear problem solving methods is discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. The challenging problems, and the state of the art, in series convergence and kernel identification studies are comprehensively introduced. In addition, a detailed review is given of the applications of the Volterra series in mechanical engineering, aeroelasticity, control engineering, and electronic and electrical engineering.
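The object being reviewed above has a simple discrete-time form: a truncated second-order Volterra series y[n] = Σᵢ h1[i] x[n-i] + Σᵢⱼ h2[i][j] x[n-i] x[n-j]. The sketch below evaluates it directly; the kernel values are arbitrary toy numbers, not taken from any system in the review.

```python
# Evaluate a discrete-time Volterra series truncated at second order.
# Kernels h1 (linear) and h2 (quadratic) are illustrative toy values.

def volterra2(x, h1, h2):
    """Output of a second-order truncated Volterra series for input x."""
    M = len(h1)
    y = []
    for n in range(len(x)):
        # first-order term: ordinary linear convolution
        acc = sum(h1[i] * x[n - i] for i in range(M) if n - i >= 0)
        # second-order term: bilinear combination of past inputs
        for i in range(M):
            for j in range(M):
                if n - i >= 0 and n - j >= 0:
                    acc += h2[i][j] * x[n - i] * x[n - j]
        y.append(acc)
    return y

h1 = [1.0, 0.5]                        # linear (first-order) kernel
h2 = [[0.1, 0.0], [0.0, 0.0]]          # quadratic kernel, only h2[0][0] != 0
y = volterra2([1.0, 2.0, 0.0], h1, h2)
```

The frequency-domain concepts in the review (GFRF, NOFRF, OFRF) are multidimensional Fourier transforms of exactly these kernels h1, h2, ...; kernel identification means estimating them from input-output data.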

  9. On the Effects of Artificial Feeding on Bee Colony Dynamics: A Mathematical Model

    PubMed Central

    Paiva, Juliana Pereira Lisboa Mohallem; Paiva, Henrique Mohallem; Esposito, Elisa; Morais, Michelle Manfrini

    2016-01-01

    This paper proposes a new mathematical model to evaluate the effects of artificial feeding on bee colony population dynamics. The proposed model is based on a classical framework and contains differential equations that describe the changes in the number of hive bees, forager bees, and brood cells, as a function of amounts of natural and artificial food. The model includes the following elements to characterize the artificial feeding scenario: a function to model the preference of the bees for natural food over artificial food; parameters to quantify the quality and palatability of artificial diets; a function to account for the efficiency of the foragers in gathering food under different environmental conditions; and a function to represent different approaches used by the beekeeper to feed the hive with artificial food. Simulated results are presented to illustrate the main characteristics of the model and its behavior under different scenarios. The model results are validated with experimental data from the literature involving four different artificial diets. A good match between simulated and experimental results was achieved. PMID:27875589
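A hedged sketch in the spirit of the classical hive/forager framework the paper builds on: two coupled ODEs integrated with forward Euler. The recruitment function, eclosion rate, and all parameter values here are invented for illustration and are not the paper's model, which additionally tracks brood cells and natural/artificial food.

```python
# Toy two-compartment bee colony model: H = hive bees, F = foragers.
# Illustrative parameters; forward Euler integration.

def step(H, F, dt, L=1000.0, a=0.25, s=0.75, m=0.1):
    """One Euler step. Eclosion L feeds H; hive bees become foragers at
    rate a - s*F/(H+F) (social inhibition); foragers die at rate m."""
    N = H + F
    recruit = (a - s * F / N) * H if N > 0 else 0.0
    dH = L - recruit
    dF = recruit - m * F
    return H + dt * dH, F + dt * dF

H, F = 10000.0, 2000.0
for _ in range(100):                    # simulate 10 time units
    H, F = step(H, F, dt=0.1)
```

The paper's artificial-feeding extensions enter through terms like this recruitment function: food availability and diet palatability modulate how readily hive bees are recruited to foraging.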

  10. Empirical models for fitting of oral concentration time curves with and without an intravenous reference.

    PubMed

    Weiss, Michael

    2017-06-01

    Appropriate model selection is important when fitting oral concentration-time data because of the complex character of the absorption process. When IV reference data are available, the problem is the selection of an empirical input function (absorption model). In the present examples, a weighted sum of inverse Gaussian density functions (IGs) was found most useful. It is shown that alternative models (gamma and Weibull density) are only valid if the input function is log-concave. Furthermore, it is demonstrated for the first time that the sum-of-IGs model can also be applied to fit oral data directly (without IV data). In the present examples, a weighted sum of two or three IGs was sufficient. From the parameters of this function, the model-independent measures AUC and mean residence time can be calculated. It turned out that a good fit of the data in the terminal phase is essential to avoid biased parameter estimates. The time course of the fractional elimination rate and the concept of log-concavity have proved to be useful tools in model selection.
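The building block of the absorption model above is the inverse Gaussian density, f(t; μ, λ) = √(λ/(2πt³)) exp(−λ(t−μ)²/(2μ²t)), and a weighted sum of such densities. The sketch below evaluates that mixture; the component parameters are illustrative, and fitting them to concentration data is a separate nonlinear estimation step not shown here.

```python
# Weighted sum of inverse Gaussian (IG) densities, the empirical input
# function used above. Parameter values are illustrative only.
import math

def inv_gauss_pdf(t, mu, lam):
    """Inverse Gaussian density f(t; mu, lam) for t > 0."""
    if t <= 0:
        return 0.0
    return math.sqrt(lam / (2.0 * math.pi * t**3)) * \
        math.exp(-lam * (t - mu)**2 / (2.0 * mu**2 * t))

def ig_mixture(t, weights, mus, lams):
    """Weighted sum of IG densities; weights should sum to 1."""
    return sum(w * inv_gauss_pdf(t, m, l)
               for w, m, l in zip(weights, mus, lams))

# Two-component input function: a fast and a slow absorption phase.
f = ig_mixture(1.0, weights=[0.7, 0.3], mus=[0.5, 3.0], lams=[1.0, 2.0])
```

Because each IG integrates to one, the mixture weights carry the fraction of dose absorbed in each phase, which is what makes AUC and mean residence time recoverable directly from the fitted parameters.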

  11. Improved Monitoring of Vegetation Productivity using Continuous Assimilation of Radiometric Data

    NASA Astrophysics Data System (ADS)

    Baret, F.; Lauvernet, C.; Weiss, M.; Prevot, L.; Rochdi, N.

    Canopy functioning models describe crop production from meteorological and soil inputs. However, because of the large number of variables and parameters used, and the poor knowledge of the actual values of some of them, the time course of the canopy, and thus the final production simulated by these models, is often not very accurate. Satellite sensors allow the simulations to be constrained through assimilation of the radiometric data within radiative transfer models coupled to canopy functioning models. An assimilation scheme is presented with application to wheat crops. The coupling between radiative transfer models and canopy functioning models is described. The assimilation scheme is then applied to an experiment carried out within the ReSeDA project. Several issues relative to the assimilation process are discussed: the type of canopy functioning model used, the possibility of assimilating biophysical products rather than radiances, and the use of ancillary information. Considerations associated with the problems posed by high spatial and temporal resolution data are then listed and illustrated by preliminary results acquired within the ADAM project. The required temporal sampling for space observations is also discussed.

  12. Three friendly walkers

    NASA Astrophysics Data System (ADS)

    Jensen, Iwan

    2017-01-01

    More than 15 years ago Guttmann and Vöge (2002 J. Stat. Plan. Inference 101 107) introduced a model of friendly walkers. Since then it has remained unsolved. In this paper we provide the exact solution to a closely allied model which essentially differs only in the boundary conditions. The exact solution is expressed in terms of the reciprocal of the generating function for vicious walkers, which is a D-finite function. However, ratios of D-finite functions are inherently not D-finite, and in this case we prove that the friendly walkers generating function is the solution to a non-linear differential equation with polynomial coefficients; it is, in other words, D-algebraic. Using numerically exact calculations, we find a conjectured expression for the generating function of the original model as a ratio of a D-finite function and the generating function for vicious walkers. We obtain an expression for this D-finite function in terms of a 2F1 hypergeometric function with a rational pullback and its first and second derivatives. Dedicated to Tony Guttmann on the occasion of his 70th birthday.

  13. Quark fragmentation functions in NJL-jet model

    NASA Astrophysics Data System (ADS)

    Bentz, Wolfgang; Matevosyan, Hrayr; Thomas, Anthony

    2014-09-01

    We report on our studies of quark fragmentation functions in the Nambu-Jona-Lasinio (NJL) - jet model. The results of Monte-Carlo simulations for the fragmentation functions to mesons and nucleons, as well as to pion and kaon pairs (dihadron fragmentation functions), are presented. The important role of intermediate vector meson resonances for those semi-inclusive deep inelastic production processes is emphasized. Our studies are very relevant for the extraction of transverse momentum dependent quark distribution functions from measured scattering cross sections. Supported by Grant in Aid for Scientific Research, Japanese Ministry of Education, Culture, Sports, Science and Technology, Project No. 20168769.

  14. Error Propagation in a System Model

    NASA Technical Reports Server (NTRS)

    Schloegel, Kirk (Inventor); Bhatt, Devesh (Inventor); Oglesby, David V. (Inventor); Madl, Gabor (Inventor)

    2015-01-01

    Embodiments of the present subject matter can enable the analysis of signal value errors for system models. In an example, signal value errors can be propagated through the functional blocks of a system model to analyze possible effects as the signal value errors impact incident functional blocks. This propagation of the errors can be applicable to many models of computation including avionics models, synchronous data flow, and Kahn process networks.
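The propagation idea described above can be sketched as a taint analysis over a dataflow model: an error annotation on a signal spreads to the outputs of every functional block that consumes it. The block names and graph below are invented for illustration and are not from the patent.

```python
# Sketch: propagate a per-signal error annotation through the functional
# blocks of a dataflow model, in topological order. Illustrative model.

def propagate_errors(blocks, inputs_in_error):
    """blocks: list of (name, inputs, outputs) in topological order.
    Returns the set of signals that may carry the error."""
    tainted = set(inputs_in_error)
    for name, ins, outs in blocks:
        if any(sig in tainted for sig in ins):
            tainted.update(outs)       # this block's outputs are now suspect
    return tainted

model = [
    ("sensor_filter", ["raw"], ["filtered"]),
    ("controller",    ["filtered", "setpoint"], ["command"]),
    ("monitor",       ["setpoint"], ["status"]),
]
affected = propagate_errors(model, {"raw"})   # which signals feel a bad sensor?
```

A real analysis would also model how each block transforms the error (attenuating, amplifying, or masking it) rather than treating taint as all-or-nothing.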

  15. Using transfer functions to quantify El Niño Southern Oscillation dynamics in data and models.

    PubMed

    MacMartin, Douglas G; Tziperman, Eli

    2014-09-08

    Transfer function tools commonly used in engineering control analysis can be used to better understand the dynamics of El Niño Southern Oscillation (ENSO), compare data with models and identify systematic model errors. The transfer function describes the frequency-dependent input-output relationship between any pair of causally related variables, and can be estimated from time series. This can be used first to assess whether the underlying relationship is or is not frequency dependent, and if so, to diagnose the underlying differential equations that relate the variables, and hence describe the dynamics of individual subsystem processes relevant to ENSO. Estimating process parameters allows the identification of compensating model errors that may lead to a seemingly realistic simulation in spite of incorrect model physics. This tool is applied here to the TAO array ocean data, the GFDL-CM2.1 and CCSM4 general circulation models, and to the Cane-Zebiak ENSO model. The delayed oscillator description is used to motivate a few relevant processes involved in the dynamics, although any other ENSO mechanism could be used instead. We identify several differences in the processes between the models and data that may be useful for model improvement. The transfer function methodology is also useful in understanding the dynamics and evaluating models of other climate processes.
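The basic estimate behind the approach above is the empirical transfer function H(f_k) = Y(f_k)/X(f_k) computed from DFTs of a causally related input/output pair. The sketch below shows the bare single-record version on synthetic data; a practical ENSO analysis would use spectral averaging and real observational time series.

```python
# Empirical transfer function estimate H[k] = Y[k]/X[k] from DFTs of an
# input/output time-series pair. Synthetic data for illustration.
import cmath, math

def dft(x):
    """Naive discrete Fourier transform (fine for short illustrative records)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def transfer_estimate(x, y):
    """H[k] = Y[k]/X[k] wherever the input has energy at frequency bin k."""
    X, Y = dft(x), dft(y)
    return [Y[k] / X[k] if abs(X[k]) > 1e-9 else None
            for k in range(len(x))]

# Synthetic causally related pair: y is x scaled by 2 and delayed by 1 step.
n = 8
x = [math.sin(2 * math.pi * t / n) for t in range(n)]
y = [2.0 * x[(t - 1) % n] for t in range(n)]
H = transfer_estimate(x, y)   # gain |H| = 2 and a phase lag at the excited bin
```

Fitting a rational (differential-equation) model to such frequency-dependent gain and phase is what allows the individual subsystem processes, e.g. in the delayed-oscillator picture, to be diagnosed and compared between data and models.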

  16. Functional response and capture timing in an individual-based model: predation by northern squawfish (Ptychocheilus oregonensis) on juvenile salmonids in the Columbia River

    USGS Publications Warehouse

    Petersen, James H.; DeAngelis, Donald L.

    1992-01-01

    The behavior of individual northern squawfish (Ptychocheilus oregonensis) preying on juvenile salmonids was modeled to address questions about capture rate and the timing of prey captures (random versus contagious). Prey density, predator weight, prey weight, temperature, and diel feeding pattern were first incorporated into predation equations analogous to Holling Type 2 and Type 3 functional response models. Type 2 and Type 3 equations fit field data from the Columbia River equally well, and both models predicted predation rates on five of seven independent dates. Selecting a functional response type may be complicated by variable predation rates, analytical methods, and assumptions of the model equations. Using the Type 2 functional response, random versus contagious timing of prey capture was tested using two related models. In the simpler model, salmon captures were assumed to be controlled by a Poisson renewal process; in the second model, several salmon captures were assumed to occur during brief "feeding bouts", modeled with a compound Poisson process. Salmon captures by individual northern squawfish were clustered through time, rather than random, based on comparison of model simulations and field data. The contagious-feeding result suggests that salmonids may be encountered as patches or schools in the river.
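The Holling disc equations referenced above have standard closed forms: Type 2 is f(N) = aN/(1 + ahN) and Type 3 is f(N) = aN²/(1 + ahN²), with attack rate a and handling time h. The sketch below writes them as plain functions; the parameter values are illustrative, not fitted to the Columbia River data.

```python
# Holling Type 2 and Type 3 functional responses (capture rate as a
# function of prey density N). Illustrative parameters only.

def holling_type2(N, a, h):
    """Type 2: a*N / (1 + a*h*N); a = attack rate, h = handling time."""
    return a * N / (1.0 + a * h * N)

def holling_type3(N, a, h):
    """Type 3: sigmoid response a*N**2 / (1 + a*h*N**2)."""
    return a * N**2 / (1.0 + a * h * N**2)

# At high prey density both saturate near 1/h (the handling-time limit),
# which is one reason the two forms can fit the same field data equally well.
r2 = holling_type2(1e6, a=0.5, h=2.0)
r3 = holling_type3(1e6, a=0.5, h=2.0)
```

The two forms differ mainly at low prey density (Type 3 is sigmoid there), so data concentrated at moderate-to-high densities, as in this study, cannot easily distinguish them.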

  17. Faces of matrix models

    NASA Astrophysics Data System (ADS)

    Morozov, A.

    2012-08-01

    Partition functions of eigenvalue matrix models possess a number of very different descriptions: as matrix integrals, as solutions to linear and nonlinear equations, as τ-functions of integrable hierarchies, as special-geometry prepotentials, as the result of the action of W-operators and of various recursions on elementary input data, and as the gluing of certain elementary building blocks. All this explains the central role of such matrix models in modern mathematical physics: they provide the basic "special functions" to express the answers and the relations between them, and they serve as a dream model of what one should try to achieve in any other field.

  18. Beta functions in Chirally deformed supersymmetric sigma models in two dimensions

    NASA Astrophysics Data System (ADS)

    Vainshtein, Arkady

    2016-10-01

    We study two-dimensional sigma models where the chiral deformation diminished the original 𝒩 = (2, 2) supersymmetry to the chiral one, 𝒩 = (0, 2). Such heterotic models were discovered previously on the world sheet of non-Abelian stringy solitons supported by certain four-dimensional 𝒩 = 1 theories. We study geometric aspects and holomorphic properties of these models, and derive a number of exact expressions for the β functions in terms of the anomalous dimensions analogous to the NSVZ β function in four-dimensional Yang-Mills. Instanton calculus provides a straightforward method for the derivation.

  19. Beta Functions in Chirally Deformed Supersymmetric Sigma Models in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Vainshtein, Arkady

    We study two-dimensional sigma models where the chiral deformation diminished the original 𝒩 =(2, 2) supersymmetry to the chiral one, 𝒩 =(0, 2). Such heterotic models were discovered previously on the world sheet of non-Abelian stringy solitons supported by certain four-dimensional 𝒩 = 1 theories. We study geometric aspects and holomorphic properties of these models, and derive a number of exact expressions for the β functions in terms of the anomalous dimensions analogous to the NSVZ β function in four-dimensional Yang-Mills. Instanton calculus provides a straightforward method for the derivation.

  20. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  1. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  2. Regularity Results for a Class of Functionals with Non-Standard Growth

    NASA Astrophysics Data System (ADS)

    Acerbi, Emilio; Mingione, Giuseppe

    We consider the integral functional under non-standard growth assumptions that we call p(x) type: the integrand grows like a variable power p(x) of the gradient, a relevant model case being the p(x)-energy functional. Under sharp assumptions on the continuous function p(x)>1 we prove regularity of minimizers. Energies exhibiting this growth appear in several models from mathematical physics.

  3. Research on an augmented Lagrangian penalty function algorithm for nonlinear programming

    NASA Technical Reports Server (NTRS)

    Frair, L.

    1978-01-01

    The augmented Lagrangian (ALAG) Penalty Function Algorithm for optimizing nonlinear mathematical models is discussed. The mathematical models of interest are deterministic in nature and finite dimensional optimization is assumed. A detailed review of penalty function techniques in general and the ALAG technique in particular is presented. Numerical experiments are conducted utilizing a number of nonlinear optimization problems to identify an efficient ALAG Penalty Function Technique for computer implementation.
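The core ALAG idea, an inner unconstrained minimization followed by a multiplier update, can be sketched on a toy equality-constrained problem. This is a generic textbook version with an inner gradient-descent solve, not the specific technique identified in the report:

```python
import numpy as np

def alag_minimize(f_grad, c, c_grad, x0, rho=10.0, outer=20, inner=200, lr=0.01):
    """Minimize f subject to c(x) = 0 via the augmented Lagrangian
    L(x) = f(x) + lam*c(x) + (rho/2)*c(x)**2."""
    x, lam = np.asarray(x0, float), 0.0
    for _ in range(outer):
        for _ in range(inner):     # inner unconstrained solve (gradient descent)
            g = f_grad(x) + (lam + rho * c(x)) * c_grad(x)
            x = x - lr * g
        lam += rho * c(x)          # multiplier update drives c(x) -> 0
    return x

# Example: min x1^2 + x2^2 subject to x1 + x2 = 1; solution is (0.5, 0.5)
x = alag_minimize(lambda x: 2 * x,
                  lambda x: x[0] + x[1] - 1.0,
                  lambda x: np.ones(2),
                  x0=[0.0, 0.0])
```

Unlike a pure quadratic penalty, the multiplier update lets the constraint be satisfied without driving rho to infinity.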

  4. The Jena Diversity Model: Towards a Richer Representation of the Terrestrial Biosphere for Earth System Modelling

    NASA Astrophysics Data System (ADS)

    Pavlick, R.; Reu, B.; Bohn, K.; Dyke, J.; Kleidon, A.

    2010-12-01

    The terrestrial biosphere is a complex, self-organizing system which is continually both adapting to and altering its global environment. It also exhibits a vast diversity of vegetation forms and functioning. However, the terrestrial biosphere components within current state-of-the-art Earth System Models abstract this diversity in to a handful of relatively static plant functional types. These coarse and static representations of functional diversity might contribute to overly pessimistic projections regarding terrestrial ecosystem responses to scenarios of global change (e.g. Amazonian and boreal forest diebacks). In the Jena Diversity (JeDi) model, we introduce a new approach to vegetation modelling with a richer representation of functional diversity, based not on plant functional types, but on unavoidable plant ecophysiological trade-offs, which we hypothesize should be more stable in time. The JeDi model tests a large number of plant growth strategies. Each growth strategy is simulated using a set of randomly generated parameter values, which characterize its functioning in terms of carbon allocation, ecophysiology, and phenology, which are then linked to the growing conditions at the land surface. The model is constructed in such a way that these parameters inherently lead to ecophysiological trade-offs, which determine whether a growth strategy is able to survive and reproduce under the prevalent climatic conditions. Kleidon and Mooney (2000) demonstrated that this approach is capable of reproducing the geographic distribution of species richness. More recently, we have shown the JeDi model can explain other biogeographical phenomena including the present-day global pattern of biomes (Reu et al., accepted), ecosystem evenness (Kleidon et al. 2009), and possible mechanisms for biome shifts and biodiversity changes under scenarios of global warming (Reu et al., submitted). 
We have also evaluated the simulated biogeochemical fluxes from JeDi against a variety of site, field, and satellite observations (Pavlick et al., submitted) following a protocol established by the Carbon-Land Model Intercomparison Project (Randerson et al. 2009). We found that the global patterns of biogeochemical fluxes and land surface properties are reasonably well simulated using this bottom-up trade-off approach and compare favorably with other state of the art terrestrial biosphere models. Here, we present some results from JeDi simulations, wherein we varied the modelled functional diversity to quantify its impact on terrestrial biogeochemical fluxes under both present-day conditions and projected scenarios of global change. We also present results from a set of simulations wherein we vary the ability of the modelled ecosystems to adapt through changes in functional composition, leading to different projection responses of the carbon cycle to global warming. This plant functional tradeoff approach sets the foundation for many applications, including exploring the emergence and climatic impacts of major vegetation transitions throughout the last 400 million years as well as quantifying the significance of preserving functional diversity to hedge against uncertain climates in the future.

  5. Modeling procedures for handling qualities evaluation of flexible aircraft

    NASA Technical Reports Server (NTRS)

    Govindaraj, K. S.; Eulrich, B. J.; Chalk, C. R.

    1981-01-01

    This paper presents simplified modeling procedures to evaluate the impact of flexible modes and the unsteady aerodynamic effects on the handling qualities of Supersonic Cruise Aircraft (SCR). The modeling procedures involve obtaining reduced order transfer function models of SCR vehicles, including the important flexible mode responses and unsteady aerodynamic effects, and conversion of the transfer function models to time domain equations for use in simulations. The use of the modeling procedures is illustrated by a simple example.

  6. Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J. 2011. Component-based development and sensitivity analyses of an air pollutant dry deposition model. Environmental Modelling & Software. 26(6): 804-816.

    Treesearch

    Satoshi Hirabayashi; Chuck Kroll; David Nowak

    2011-01-01

    The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...

  7. Conceptualization of the Sexual Response Models in Men: Are there Differences Between Sexually Functional and Dysfunctional Men?

    PubMed

    Connaughton, Catherine; McCabe, Marita; Karantzas, Gery

    2016-03-01

    Research empirically validating models of sexual response in men with and without male sexual dysfunction (MSD), as currently defined, is limited. This study explored the extent to which the traditional linear model or the Basson circular model best represents sexual response for men with MSD and for sexually functional men. In total, 573 men completed an online questionnaire to assess sexual function and aspects of the models of sexual response; 42.2% of the men (242) were sexually functional, and 57.8% (331) had at least one MSD. Models were built and tested using bootstrapping and structural equation modeling, and model fit was compared for men with and without MSD. The linear model and the initial circular model were a poor fit for men with and without MSD. A modified version of the circular model demonstrated adequate fit for the two groups and showed important interactions between psychological factors and sexual response for men with and without MSD. Male sexual response was not represented by the linear model for men with or without MSD, excluding possible healthy responsive desire. The circular model provided a better fit for the two groups of men but demonstrated that the relations between psychological factors and phases of sexual response were different for men with and without MSD as currently defined. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  8. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of the inputs are changed, with emphasis on the mean and variance ratio functions with respect to the variances of the model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction of the model output mean and variance by operating on the variances of the model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure for achieving a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
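The single-sample-set idea behind a variance ratio function can be illustrated on a toy additive model. This sketch assumes Gaussian inputs, for which rescaling the stored samples rescales the input variance, so no re-sampling is needed; it is not the authors' unbiased estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.standard_normal(n)          # input X1 ~ N(0, 1)
x2 = rng.standard_normal(n)          # input X2 ~ N(0, 1)

def model(a, b):
    """Toy model Y = X1 + X2, so Var(Y) = Var(X1) + Var(X2)."""
    return a + b

base_var = model(x1, x2).var()

def variance_ratio(lam):
    """Output-variance ratio when Var(X1) is scaled by lam, reusing the
    one base sample set (Gaussian samples scaled by sqrt(lam) have
    variance scaled by lam)."""
    return model(np.sqrt(lam) * x1, x2).var() / base_var
```

For this model the exact ratio is (lam + 1) / 2, e.g. 0.75 at lam = 0.5, so the analyst can read off directly how much input-variance reduction buys a targeted output-variance reduction.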

  9. Subgrid spatial variability of soil hydraulic functions for hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kreye, Phillip; Meon, Günter

    2016-07-01

    State-of-the-art hydrological applications require a process-based, spatially distributed hydrological model. Runoff characteristics are demanded to be well reproduced by the model. Despite that, the model should be able to describe the processes at a subcatchment scale in a physically credible way. The objective of this study is to present a robust procedure to generate various sets of parameterisations of soil hydraulic functions for the description of soil heterogeneity on a subgrid scale. Relations between Rosetta-generated values of saturated hydraulic conductivity (Ks) and van Genuchten's parameters of soil hydraulic functions were statistically analysed. An universal function that is valid for the complete bandwidth of Ks values could not be found. After concentrating on natural texture classes, strong correlations were identified for all parameters. The obtained regression results were used to parameterise sets of hydraulic functions for each soil class. The methodology presented in this study is applicable on a wide range of spatial scales and does not need input data from field studies. The developments were implemented into a hydrological modelling system.

  10. Accelerated Testing and Modeling of Potential-Induced Degradation as a Function of Temperature and Relative Humidity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hacke, Peter; Spataru, Sergiu; Terwilliger, Kent

    2015-06-14

    An acceleration model based on the Peck equation was applied to the power performance of crystalline silicon cell modules as a function of time and of temperature and humidity, the two main environmental stress factors that promote potential-induced degradation. This model was derived from module power degradation data obtained semi-continuously and statistically by in-situ dark current-voltage measurements in an environmental chamber. The modeling enables prediction of degradation rates and times as functions of temperature and humidity. Power degradation could be modeled linearly as a function of time to the second power; additionally, we found that the charge (in coulombs) transferred from the active cell circuit to ground during the stress test is approximately linear with time. Therefore, the power loss could be linearized as a function of coulombs squared. With this result, we observed that when the module face was completely grounded with a condensed-phase conductor, leakage current exceeded the anticipated corresponding degradation rate relative to the other tests performed in damp heat.
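The Peck acceleration model itself is compact enough to sketch. The prefactor, humidity exponent n, and activation energy Ea below are illustrative placeholders, not the values fitted in this study:

```python
import math

K_BOLTZ = 8.617e-5   # Boltzmann constant, eV/K

def peck_rate(rh_percent, temp_c, prefactor=1.0, n=3.0, ea=0.7):
    """Peck model: rate = A * RH^n * exp(-Ea / (k*T)).
    n and ea are illustrative placeholders, not fitted PID values."""
    t_kelvin = temp_c + 273.15
    return prefactor * rh_percent ** n * math.exp(-ea / (K_BOLTZ * t_kelvin))

# Acceleration factor of an 85 degC / 85% RH chamber test
# relative to 25 degC / 50% RH field conditions
af = peck_rate(85.0, 85.0) / peck_rate(50.0, 25.0)
```

The acceleration factor is the ratio of the two rates; with these placeholder parameters a damp-heat chamber test runs several hundred times faster than field exposure.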

  11. Geographically weighted regression model on poverty indicator

    NASA Astrophysics Data System (ADS)

    Slamet, I.; Nugroho, N. F. T. A.; Muslich

    2017-12-01

    In this research, we applied geographically weighted regression (GWR) to analyze poverty in Central Java, using a Gaussian kernel as the weight function. The GWR uses the diagonal matrix obtained by evaluating the Gaussian kernel function as weights in the regression model. The kernel weights handle spatial effects in the data so that a model can be obtained for each location. The purpose of this paper is to model poverty percentage data in Central Java province using GWR with a Gaussian kernel weight function and to determine the influencing factors in each regency/city of Central Java province. Based on the research, we obtained a geographically weighted regression model with a Gaussian kernel weight function for poverty percentage data in Central Java province. We found that the percentage of the population working as farmers, the population growth rate, the percentage of households with regular sanitation, and the number of BPJS beneficiaries are the variables that affect the percentage of poverty in Central Java province. The coefficient of determination R2 is 68.64%. The districts fall into two categories influenced by different significant factors.
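The GWR fit described above amounts to a weighted least-squares regression at every location, with Gaussian kernel weights decaying with distance. A minimal sketch on synthetic data whose true slope varies across space (the bandwidth and data are illustrative, not the Central Java dataset):

```python
import numpy as np

def gwr_fit(coords, x, y, bandwidth):
    """At each location, fit weighted least squares with Gaussian kernel
    weights w = exp(-0.5 * (d / bandwidth)**2) on inter-point distances d."""
    X = np.column_stack([np.ones_like(x), x])   # intercept + one regressor
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)
        XtW = X.T * w                            # same as X.T @ diag(w)
        betas.append(np.linalg.solve(XtW @ X, XtW @ y))
    return np.array(betas)                       # (intercept, slope) per location

# Synthetic data whose true slope rises from 1 in the west to 3 in the east
rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 1.0, size=(300, 2))
x = rng.standard_normal(300)
y = (1.0 + 2.0 * coords[:, 0]) * x
betas = gwr_fit(coords, x, y, bandwidth=0.15)

west = betas[np.argmin(coords[:, 0]), 1]   # local slope near the west edge
east = betas[np.argmax(coords[:, 0]), 1]   # local slope near the east edge
```

A global regression would return one averaged slope; GWR recovers the west-to-east gradient, which is how spatially varying significant factors per district can emerge.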

  12. Optimization of finite difference forward modeling for elastic waves based on optimum combined window functions

    NASA Astrophysics Data System (ADS)

    Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang

    2017-03-01

    Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimal combination of window functions was designed based on the finite difference operator, using a truncated approximation of the spatial convolution series in pseudo-spectrum space, to normalize the outcomes of existing window functions for different orders. The proposed combined window functions not only inherit the characteristics of the various window functions, providing better truncation results, but also allow the truncation error of the finite difference operator to be controlled manually and visually by adjusting the combinations and analyzing the characteristics of the main and side lobes of the amplitude response. The error level and elastic forward modeling under the proposed combined scheme were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified-binomial-window and conventional finite-difference schemes. Numerical simulation verifies the reliability of the proposed method.

  13. Fast computation of the electrolyte-concentration transfer function of a lithium-ion cell model

    NASA Astrophysics Data System (ADS)

    Rodríguez, Albert; Plett, Gregory L.; Trimboli, M. Scott

    2017-08-01

    One approach to creating physics-based reduced-order models (ROMs) of battery-cell dynamics requires first generating linearized Laplace-domain transfer functions of all cell internal electrochemical variables of interest. Then, the resulting infinite-dimensional transfer functions can be reduced by various means in order to find an approximate low-dimensional model. These methods include Padé approximation or the Discrete-Time Realization algorithm. In a previous article, Lee and colleagues developed a transfer function of the electrolyte concentration for a porous-electrode pseudo-two-dimensional lithium-ion cell model. Their approach used separation of variables and Sturm-Liouville theory to compute an infinite-series solution to the transfer function, which they then truncated to a finite number of terms for reasons of practicality. Here, we instead use a variation-of-parameters approach to arrive at a different representation of the identical solution that does not require a series expansion. The primary benefits of the new approach are speed of computation of the transfer function and the removal of the requirement to approximate the transfer function by truncating the number of terms evaluated. Results show that the speedup of the new method can be more than 3800.

  14. Adaptive non-linear control for cancer therapy through a Fokker-Planck observer.

    PubMed

    Shakeri, Ehsan; Latif-Shabgahi, Gholamreza; Esmaeili Abharian, Amir

    2018-04-01

    In recent years, many efforts have been made to present optimal strategies for cancer therapy through the mathematical modelling of tumour-cell population dynamics and optimal control theory. In many cases, the therapy effect is included in the drift term of the stochastic Gompertz model. By fitting the model to empirical data, the parameters of the therapy function are estimated. The reported research works have not presented any algorithm to determine the optimal parameters of the therapy function. In this study, a logarithmic therapy function is entered into the drift term of the Gompertz model. Using the proposed control algorithm, the therapy function parameters are predicted and adaptively adjusted. To control the growth of the tumour-cell population, its moments must be manipulated. This study employs the probability density function (PDF) control approach because of its ability to control all the moments of the process. A Fokker-Planck-based non-linear stochastic observer is used to determine the PDF of the process. A cost function based on the difference between a predefined desired PDF and the PDF of the tumour-cell population is defined. Using the proposed algorithm, the therapy function parameters are adjusted in such a manner that the cost function is minimised. The existence of an optimal therapy function is also proved. Numerical results are finally given to demonstrate the effectiveness of the proposed method.
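The underlying stochastic Gompertz model with a therapy term in the drift can be simulated directly by Euler-Maruyama. The sketch below uses illustrative parameters and a constant-dose therapy (rather than the paper's logarithmic, adaptively tuned function) simply to show how therapy in the drift shifts the tumour-cell population:

```python
import numpy as np

def simulate_gompertz(u, x0=1.0, alpha=1.0, beta=0.5, sigma=0.1,
                      horizon=10.0, steps=1000, paths=400, seed=0):
    """Euler-Maruyama paths of the stochastic Gompertz model with therapy
    u(t) in the drift: dX = [X*(alpha - beta*ln X) - u(t)*X] dt + sigma*X dW.
    All parameter values are illustrative, not fitted."""
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    x = np.full(paths, x0)
    for i in range(steps):
        t = i * dt
        dw = rng.normal(0.0, np.sqrt(dt), size=paths)
        drift = x * (alpha - beta * np.log(x)) - u(t) * x
        x = np.maximum(x + drift * dt + sigma * x * dw, 1e-8)  # keep X > 0
    return x

final_untreated = simulate_gompertz(lambda t: 0.0)
final_treated = simulate_gompertz(lambda t: 0.8)   # constant-dose therapy
```

Without therapy the population settles near exp(alpha/beta); the drift-term therapy lowers that carrying level, which is exactly the handle a PDF controller manipulates through the therapy-function parameters.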

  15. Accurate mass and velocity functions of dark matter haloes

    NASA Astrophysics Data System (ADS)

    Comparat, Johan; Prada, Francisco; Yepes, Gustavo; Klypin, Anatoly

    2017-08-01

    N-body cosmological simulations are an essential tool to understand the observed distribution of galaxies. We use the MultiDark simulation suite, run with the Planck cosmological parameters, to revisit the mass and velocity functions. At redshift z = 0, the simulations cover four orders of magnitude in halo mass from ~10^11 M⊙, with 8,783,874 distinct haloes and 532,533 subhaloes. The total volume used is ~515 Gpc^3, more than eight times larger than in previous studies. We measure and model the halo mass function, its covariance matrix with respect to halo mass, and the large-scale halo bias. With the formalism of the excursion-set mass function, we make explicit the tight interconnection between the covariance matrix, the bias, and the halo mass function. We obtain a very accurate (<2 per cent level) model of the distinct halo mass function. We also model the subhalo mass function and its relation to the distinct halo mass function. The set of models obtained provides a complete and precise framework for the description of haloes in the concordance Planck cosmology. Finally, we provide precise analytical fits of the Vmax maximum velocity function up to redshift z < 2.3 to push for the development of halo occupation distributions using Vmax. The data and the analysis code are made publicly available in the Skies and Universes data base.

  16. Equal Area Logistic Estimation for Item Response Theory

    NASA Astrophysics Data System (ADS)

    Lo, Shih-Ching; Wang, Kuo-Chang; Chang, Hsin-Li

    2009-08-01

    Item response theory (IRT) models use logistic functions exclusively as item response functions (IRFs). Applications of IRT models require obtaining the set of values for the logistic function parameters that best fits an empirical data set. However, success in obtaining such a set of values does not guarantee that the constructs they represent actually exist, for the adequacy of a model is not sustained by the possibility of estimating parameters. In this study, an equal-area-based estimation algorithm for the two-parameter logistic model is proposed. Two theorems are given to prove that the results of the algorithm are equivalent to the results of fitting the data with the logistic model. Numerical results are presented to show the stability and accuracy of the algorithm.
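The two-parameter logistic IRF being fitted, and the symmetry that makes equal-area arguments work, can be sketched directly. This illustrates the response function and its symmetry property, not the proposed estimation algorithm:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRF: P(theta) = 1 / (1 + exp(-a*(theta - b))),
    with discrimination a and difficulty b. P(b) = 0.5 by construction."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Equal-area property: because P(b + t) + P(b - t) = 1, the area under the
# IRF over any symmetric interval [b - c, b + c] equals c, independent of
# the discrimination a. Checked here by trapezoidal integration.
a, b, c = 1.7, 0.3, 3.0
n = 3001
h = 2 * c / (n - 1)
grid = [b - c + i * h for i in range(n)]
area = sum(irt_2pl(t, a, b) for t in grid) * h - \
       0.5 * h * (irt_2pl(grid[0], a, b) + irt_2pl(grid[-1], a, b))
```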

  17. Models of subjective response to in-flight motion data

    NASA Technical Reports Server (NTRS)

    Rudrapatna, A. N.; Jacobson, I. D.

    1973-01-01

    Mathematical relationships between subjective comfort and environmental variables in an air transportation system are investigated. As a first step in model building, only the motion variables are incorporated and sensitivities are obtained using stepwise multiple regression analysis. The data for these models have been collected from commercial passenger flights. Two models are considered. In the first, subjective comfort is assumed to depend on rms values of the six-degrees-of-freedom accelerations. The second assumes a Rustenburg type human response function in obtaining frequency weighted rms accelerations, which are used in a linear model. The form of the human response function is examined and the results yield a human response weighting function for different degrees of freedom.

  18. Modeling and control of flexible space structures

    NASA Technical Reports Server (NTRS)

    Wie, B.; Bryson, A. E., Jr.

    1981-01-01

    The effects of actuator and sensor locations on transfer function zeros are investigated, using uniform bars and beams as generic models of flexible space structures. It is shown how finite element codes may be used directly to calculate transfer function zeros. The impulse response predicted by finite-dimensional models is compared with the exact impulse response predicted by the infinite dimensional models. It is shown that some flexible structures behave as if there were a direct transmission between actuator and sensor (equal numbers of zeros and poles in the transfer function). Finally, natural damping models for a vibrating beam are investigated since natural damping has a strong influence on the appropriate active control logic for a flexible structure.

  19. Ensemble modeling with pedotransfer functions in the hydropedological context

    USDA-ARS?s Scientific Manuscript database

    Uncertainty of soil water content and/or soil water flux estimates with soil water models has recently become of a particular interest in various applications. This work provides examples of using pedotransfer functions (PTFs) to build ensembles of models to characterize the uncertainty of simulatio...

  20. Systems Operation Studies for Automated Guideway Transit Systems: Feeder Systems Model Functional Specification

    DOT National Transportation Integrated Search

    1981-01-01

    This document specifies the functional requirements for the AGT-SOS Feeder Systems Model (FSM), the type of hardware required, and the modeling techniques employed by the FSM. The objective of the FSM is to map the zone-to-zone transit patronage dema...

  1. Simulating bimodal tall fescue growth with a degree-day-based process-oriented plant model

    USDA-ARS?s Scientific Manuscript database

    Plant growth simulation models have a temperature response function driving development, with a base temperature and an optimum temperature defined. Such growth simulation models often function well when plant development rate shows a continuous change throughout the growing season. This approach ...

  2. Optimal observation network design for conceptual model discrimination and uncertainty reduction

    NASA Astrophysics Data System (ADS)

    Pham, Hai V.; Tsai, Frank T.-C.

    2016-02-01

    This study expands the Box-Hill discrimination function to design an optimal observation network to discriminate conceptual models and, in turn, identify a most favored model. The Box-Hill discrimination function measures the expected decrease in Shannon entropy (for model identification) before and after the optimal design for one additional observation. This study modifies the discrimination function to account for multiple future observations that are assumed spatiotemporally independent and Gaussian-distributed. Bayesian model averaging (BMA) is used to incorporate existing observation data and quantify future observation uncertainty arising from conceptual and parametric uncertainties in the discrimination function. In addition, the BMA method is adopted to predict future observation data in a statistical sense. The design goal is to find optimal locations and least data via maximizing the Box-Hill discrimination function value subject to a posterior model probability threshold. The optimal observation network design is illustrated using a groundwater study in Baton Rouge, Louisiana, to collect additional groundwater heads from USGS wells. The sources of uncertainty creating multiple groundwater models are geological architecture, boundary condition, and fault permeability architecture. Impacts of considering homoscedastic and heteroscedastic future observation data and the sources of uncertainties on potential observation areas are analyzed. Results show that heteroscedasticity should be considered in the design procedure to account for various sources of future observation uncertainty. After the optimal design is obtained and the corresponding data are collected for model updating, total variances of head predictions can be significantly reduced by identifying a model with a superior posterior model probability.
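The entropy-decrease criterion at the heart of the design can be illustrated with a toy Bayesian update of model probabilities from one Gaussian-error observation. The model predictions and noise level below are made up for illustration, not taken from the Baton Rouge study:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (nats) of a discrete probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def posterior_model_probs(prior, predictions, obs, sigma):
    """Bayes update of model probabilities after one observation with
    Gaussian error of standard deviation sigma."""
    lik = np.exp(-0.5 * ((obs - np.asarray(predictions)) / sigma) ** 2)
    post = np.asarray(prior) * lik
    return post / post.sum()

# Three rival conceptual models predict different heads at a candidate well
prior = np.array([1 / 3, 1 / 3, 1 / 3])
preds = [10.0, 12.0, 15.0]
post = posterior_model_probs(prior, preds, obs=12.2, sigma=1.0)
entropy_drop = shannon_entropy(prior) - shannon_entropy(post)
```

Observation locations where the models disagree most (relative to the observation noise) yield the largest expected entropy drop, which is what the Box-Hill criterion maximizes.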

  3. Optically inspired biomechanical model of the human eyeball.

    PubMed

    Sródka, Wieslaw; Iskander, D Robert

    2008-01-01

    Currently available biomechanical models of the human eyeball focus mainly on the geometries and material properties of its components, while little attention has been given to its optics, the eye's primary function. We postulate that in the evolution process, the mechanical structure of the eyeball has been influenced by its optical functions. We develop a numerical finite element analysis-based model in which the eyeball geometry and its material properties are linked to the optical functions of the eye. This is achieved by controlling in the model all essential optical functions while still choosing material properties from a range of clinically available data. In particular, it is assumed that in a certain range of intraocular pressures, the eye is able to maintain focus. This so-called property of optical self-adjustment provides a more constrained set of numerical solutions in which the number of free model parameters significantly decreases, leading to models that are more robust. Further, we investigate two specific cases of a model that satisfies optical self-adjustment: (1) a full model in which the cornea is flexibly attached to the sclera at the limbus, and (2) a fixed-cornea model in which the cornea is not allowed to move at the limbus. We conclude that for a biomechanical model of the eyeball to mimic the optical function of a real eye, it is crucial that the cornea is allowed to move at the limbal junction, that the materials used for the cornea and sclera are strongly nonlinear, and that their moduli of elasticity remain in a very close relationship.

  4. SU-E-T-39: A Logistic Function-Based Model to Predict Organ-At-Risk (OAR) DVH in IMRT Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S; Zhang, H; Zhang, B

    2015-06-15

    Purpose: To investigate the feasibility of a logistic function-based model to predict organ-at-risk (OAR) DVH for IMRT planning. The predicted DVHs are compared to achieved DVHs by expert treatment planners. Methods: A logistic function is used to model the OAR dose-gradient function. This function describes the percentage of the prescription dose as a function of the normal distance to PTV surface. The slope of dose-gradient function is function of relative spatial orientation of the PTV and OARs. The OAR DVH is calculated using the OAR dose-gradient function assuming that the dose is same for voxels with same normal distance tomore » PTV. Ten previously planned prostate IMRT plans were selected to build the model, and the following plan parameters were calculated as possible features to the model: the PTV maximum/minimum dose, PTV volume, bladder/rectum volume in the radiation field, percentage of bladder/rectum overlapping with PTV, and the distance between the bladder/rectum centroid and PTV. The bladder/rectum dose-gradient function was modeled and applied on 10 additional test cases, and the predicted and achieved clinical bladder/rectum DVHs were compared: V70 (percentage of volume receiving 70Gy and above), V65, V60, V55, V50, V45, V40. Results: The following parameters were selected as model features: PTV volume, and distance of centroid of rectum/bladder to PTV. The model was tested with 10 additional patients. For bladder, the absolute difference (mean±standard deviation) between predicted and clinical DVHs is V70=−0.3±3.2, V65=−0.8±3.9, V60=1.5±4.3, V55=1.7±5.3, V50=−0.6±6.4, V45=0.6±6.5, and V40=0.9±5.7, the correlation coefficient is 0.96; for rectum, the difference is V70=1.5±3.8, V65=1.2±4.2, V60=−0.1±5.3, V55=1.0±6.6, V50=1.6±8.7, V45=1.9±9.8, and V40=1.5±10.1, and the correlation coefficient is 0.87. Conclusion: The OAR DVH can be accurately predicted using the OAR dose-gradient function in IMRT plans. 
This approach may be used as a quality control tool and help less experienced planners determine benchmarks for plan quality.
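
    The dose-gradient idea in this record can be sketched with a generic two-parameter logistic curve. The parameter values (d50, k) and the equal-dose-at-equal-distance DVH construction below are illustrative assumptions, not the authors' fitted model:

    ```python
    import math

    def dose_gradient(distance_mm, d50=10.0, k=0.5):
        # Percentage of the prescription dose as a function of the normal
        # distance to the PTV surface: ~100% at the surface, 50% at d50 mm.
        # d50 and k are illustrative; in the abstract the slope depends on
        # the relative PTV/OAR geometry.
        return 100.0 / (1.0 + math.exp(k * (distance_mm - d50)))

    def predicted_dvh(voxel_distances_mm, dose_levels_gy, prescription_gy=78.0):
        # Crude DVH under the abstract's assumption that all voxels at the
        # same normal distance to the PTV receive the same dose.
        doses = [dose_gradient(d) * prescription_gy / 100.0 for d in voxel_distances_mm]
        n = len(doses)
        return {lvl: 100.0 * sum(1 for x in doses if x >= lvl) / n
                for lvl in dose_levels_gy}
    ```

    For example, predicted_dvh([0, 5, 10, 15, 20], [40, 70]) returns the percent volume receiving at least 40 Gy and 70 Gy for five voxels spaced along the dose gradient.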

  5. PDF modeling of near-wall turbulent flows

    NASA Astrophysics Data System (ADS)

    Dreeben, Thomas David

    1997-06-01

Pdf methods are extended to include modeling of wall-bounded turbulent flows. For flows in which resolution of the viscous sublayer is desired, a Pdf near-wall model is developed in which the Generalized Langevin model is combined with an exact model for viscous transport. Durbin's method of elliptic relaxation is used to incorporate the wall effects into the governing equations without the use of wall functions or damping functions. Close to the wall, the Generalized Langevin model provides an analogy to the effect of the fluctuating continuity equation. This enables accurate modeling of the near-wall turbulent statistics. Demonstrated accuracy for fully-developed channel flow is achieved with a Pdf/Monte Carlo simulation, and with its related Reynolds-stress closure. For flows in which the details of the viscous sublayer are not important, a Pdf wall-function method is developed with the Simplified Langevin model.

  6. Nonlinear and Digital Man-machine Control Systems Modeling

    NASA Technical Reports Server (NTRS)

    Mekel, R.

    1972-01-01

An adaptive modeling technique is examined by which controllers can be synthesized to provide corrective dynamics to a human operator's mathematical model in closed loop control systems. The technique utilizes a class of Liapunov functions formulated for this purpose, Liapunov's stability criterion, and a model-reference system configuration. The Liapunov function is formulated to possess variable characteristics to take into consideration the identification dynamics. The time derivative of the Liapunov function generates the identification and control laws for the mathematical model system. These laws permit the realization of a controller which updates the human operator's mathematical model parameters so that the model and the human operator produce the same response when subjected to the same stimulus. A very useful feature is the development of a digital computer program which is easily implemented and modified concurrent with experimentation. The program permits the modeling process to interact with the experimentation process in a mutually beneficial way.

  7. The Effects of the Green House Nursing Home Model on ADL Function Trajectory: A Retrospective Longitudinal Study

    PubMed Central

    YOON, Ju Young; BROWN, Roger L.; BOWERS, Barbara J.; SHARKEY, Siobhan S.; HORN, Susan D.

    2015-01-01

Background Growing attention in the past few decades has focused on improving care quality and quality of life for nursing home residents. Many traditional nursing homes have attempted to transform themselves to become more homelike, emphasizing individualized care. This trend is referred to as nursing home culture change in the U.S. A promising culture change nursing home model, the Green House (GH) nursing home model, has shown positive psychological outcomes. However, little is known about whether the GH nursing home model has positive effects on physical function compared to traditional nursing homes. Objectives To examine the longitudinal effects of the GH nursing home model by comparing change patterns of ADL function over time between GH home residents and traditional nursing home residents. Design A retrospective longitudinal study. Settings Four GH organizations (nine GH units and four traditional units). Participants A total of 242 residents (93 GH residents and 149 traditional home residents) who had stayed in the nursing home for at least six months from admission. Methods The outcome was ADL function, and the main independent variable was the facility type in which the resident stayed: a GH or traditional unit. Age, gender, comorbidity score, cognitive function, and depressive symptoms at baseline were controlled. All of these measures were from the Minimum Data Set. Growth curve modeling and growth mixture modeling were employed in this study for longitudinal analyses. Results The mean ADL function showed deterioration over time, and the rates of deterioration between GH and traditional home residents were not different over time. Four different ADL function trajectories were identified over 18 months, but there was no statistical difference in the likelihood of being in one of the four trajectory classes between the two groups. 
Conclusions Although GH nursing homes are considered an innovative model that makes the nursing home environment more person-centered, this study did not demonstrate significant differences in ADL function changes for residents in the GH nursing homes compared to traditional nursing homes. Given that the GH model continues to evolve as it is being implemented and variations within and across GH homes are identified, large-scale longitudinal studies are needed to provide further relevant information on the effects of the GH model. PMID:26260709

  8. A representation of an NTCP function for local complication mechanisms

    NASA Astrophysics Data System (ADS)

    Alber, M.; Nüsslin, F.

    2001-02-01

    A mathematical formalism was tailored for the description of mechanisms complicating radiation therapy with a predominantly local component. The functional representation of an NTCP function was developed based on the notion that it has to be robust against population averages in order to be applicable to experimental data. The model was required to be invariant under scaling operations of the dose and the irradiated volume. The NTCP function was derived from the model assumptions that the complication is a consequence of local tissue damage and that the probability of local damage in a small reference volume is independent of the neighbouring volumes. The performance of the model was demonstrated with an animal model which has been published previously (Powers et al 1998 Radiother. Oncol. 46 297-306).
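
    The independence assumption in this record (local damage in a small reference volume is independent of neighbouring volumes) leads to a simple product form for the NTCP. The sketch below is that textbook construction, not the paper's full scale-invariant formalism:

    ```python
    def ntcp_local(damage_probs):
        # A complication occurs if any reference subvolume is locally damaged.
        # With independent subvolumes: NTCP = 1 - prod_i (1 - p_i), where p_i
        # is the local damage probability of subvolume i.
        no_damage = 1.0
        for p in damage_probs:
            no_damage *= (1.0 - p)
        return 1.0 - no_damage
    ```

    Note how the product form makes the NTCP rise quickly with the number of subvolumes at risk, which is the hallmark of a predominantly local complication mechanism.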

  9. Bread dough rheology: Computing with a damage function model

    NASA Astrophysics Data System (ADS)

    Tanner, Roger I.; Qi, Fuzhong; Dai, Shaocong

    2015-01-01

We describe an improved damage function model for bread dough rheology. The model has relatively few parameters, all of which can easily be found from simple experiments. Small deformations in the linear region are described by a gel-like power-law memory function. A set of large non-reversing deformations - stress relaxation after a step of shear, steady shearing and elongation beginning from rest, and biaxial stretching - is used to test the model. With the introduction of a revised strain measure which includes a Mooney-Rivlin term, all of these motions can be well described by the damage function described in previous papers. For reversing step strains, larger-amplitude oscillatory shearing and recoil, reasonable predictions have been found. The numerical methods used are discussed and we give some examples.

  10. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of the parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of the 24 parameters used in HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions, L1-L4 - Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM) - are considered informal, whereas the remaining three (L5-L7) fall into the formal category. L5 builds on the relationship between traditional least-squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of the residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters strongly depend on the likelihood function and vary across likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have an almost identical effect on parameter sensitivity. The 95% total prediction uncertainty bounds bracketed most of the observed data. 
Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, the results showed that the DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7, but likelihood function L5 may result in biased and unreliable estimation of parameters due to violation of the residual-error assumptions. Thus, likelihood function L7 provides a credible posterior distribution of the model parameters and can therefore be employed for further applications.
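
    One of the informal likelihoods named in this record, the Nash-Sutcliffe (NS) efficiency, has a standard closed form. A minimal sketch (not the authors' DREAM(ZS) implementation):

    ```python
    def nash_sutcliffe(observed, simulated):
        # NS = 1 - SSE / total sum of squares about the observed mean.
        # 1.0 is a perfect fit; values <= 0 mean the model does no better
        # than predicting the observed mean everywhere.
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - sse / ss_tot
    ```

    Because NS compares the model against the observed-mean baseline, a flat simulation equal to the mean scores exactly 0, which is why it is treated as an informal (goodness-of-fit) rather than a formal (statistically derived) likelihood.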

  11. Exact diagonalization library for quantum electron models

    NASA Astrophysics Data System (ADS)

    Iskakov, Sergei; Danilov, Michael

    2018-04-01

    We present an exact diagonalization C++ template library (EDLib) for solving quantum electron models, including the single-band finite Hubbard cluster and the multi-orbital impurity Anderson model. The observables that can be computed using EDLib are single particle Green's functions and spin-spin correlation functions. This code provides three different types of Hamiltonian matrix storage that can be chosen based on the model.

  12. Modeling Sodium Iodide Detector Response Using Parametric Equations

    DTIC Science & Technology

    2013-03-22

MCNP particle current and pulse height tally functions, backscattering photons are quantified as a function of material thickness and energy...source – detector – scattering medium arrangements were modeled in MCNP using the pulse height tally functions, integrated over a 70 keV – 360 keV energy...

  13. Applications of the Functional Writing Model in Technical and Professional Writing.

    ERIC Educational Resources Information Center

    Brostoff, Anita

    The functional writing model is a method by which students learn to devise and organize a written argument. Salient features of functional writing include the organizing idea (a component that logically unifies a paragraph or sequence of paragraphs), the reader's frame of reference, forecasting (prediction of the sequence by which the organizing…

  14. A New MRI-Based Model of Heart Function with Coupled Hemodynamics and Application to Normal and Diseased Canine Left Ventricles

    PubMed Central

    Choi, Young Joon; Constantino, Jason; Vedula, Vijay; Trayanova, Natalia; Mittal, Rajat

    2015-01-01

    A methodology for the simulation of heart function that combines an MRI-based model of cardiac electromechanics (CE) with a Navier–Stokes-based hemodynamics model is presented. The CE model consists of two coupled components that simulate the electrical and the mechanical functions of the heart. Accurate representations of ventricular geometry and fiber orientations are constructed from the structural magnetic resonance and the diffusion tensor MR images, respectively. The deformation of the ventricle obtained from the electromechanical model serves as input to the hemodynamics model in this one-way coupled approach via imposed kinematic wall velocity boundary conditions and at the same time, governs the blood flow into and out of the ventricular volume. The time-dependent endocardial surfaces are registered using a diffeomorphic mapping algorithm, while the intraventricular blood flow patterns are simulated using a sharp-interface immersed boundary method-based flow solver. The utility of the combined heart-function model is demonstrated by comparing the hemodynamic characteristics of a normal canine heart beating in sinus rhythm against that of the dyssynchronously beating failing heart. We also discuss the potential of coupled CE and hemodynamics models for various clinical applications. PMID:26442254

  15. An exponential decay model for mediation.

    PubMed

    Fritz, Matthew S

    2014-10-01

Mediation analysis is often used to investigate mechanisms of change in prevention research. Results finding mediation are strengthened when longitudinal data are used because of the need for temporal precedence. Current longitudinal mediation models have focused mainly on linear change, but many variables in prevention change nonlinearly across time. The most common solution to nonlinearity is to add a quadratic term to the linear model, but this can lead to the use of the quadratic function to explain all nonlinearity, regardless of theory and the characteristics of the variables in the model. The current study describes the problems that arise when quadratic functions are used to describe all nonlinearity and how the use of nonlinear functions, such as exponential decay, addresses many of these problems. In addition, nonlinear models provide several advantages over polynomial models including usefulness of parameters, parsimony, and generalizability. The effects of using nonlinear functions for mediation analysis are then discussed and a nonlinear growth curve model for mediation is presented. An empirical example using data from a randomized intervention study is then provided to illustrate the estimation and interpretation of the model. Implications, limitations, and future directions are also discussed.

  16. An Exponential Decay Model for Mediation

    PubMed Central

    Fritz, Matthew S.

    2013-01-01

    Mediation analysis is often used to investigate mechanisms of change in prevention research. Results finding mediation are strengthened when longitudinal data are used because of the need for temporal precedence. Current longitudinal mediation models have focused mainly on linear change, but many variables in prevention change nonlinearly across time. The most common solution to nonlinearity is to add a quadratic term to the linear model, but this can lead to the use of the quadratic function to explain all nonlinearity, regardless of theory and the characteristics of the variables in the model. The current study describes the problems that arise when quadratic functions are used to describe all nonlinearity and how the use of nonlinear functions, such as exponential decay, addresses many of these problems. In addition, nonlinear models provide several advantages over polynomial models including usefulness of parameters, parsimony, and generalizability. The effects of using nonlinear functions for mediation analysis are then discussed and a nonlinear growth curve model for mediation is presented. An empirical example using data from a randomized intervention study is then provided to illustrate the estimation and interpretation of the model. Implications, limitations, and future directions are also discussed. PMID:23625557
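
    The exponential decay growth function discussed in the two records above has a simple closed form. The parameterization below (asymptote, initial value, rate) is one common convention, not necessarily the paper's:

    ```python
    import math

    def exponential_decay(t, asymptote, initial, rate):
        # Value approaches `asymptote` from `initial` at speed `rate`:
        # y(t) = asymptote + (initial - asymptote) * exp(-rate * t).
        return asymptote + (initial - asymptote) * math.exp(-rate * t)
    ```

    Unlike a quadratic, this curve is monotone and levels off, and each parameter is directly interpretable (starting value, final level, speed of change) - the parameter-usefulness advantage the abstracts point to.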

  17. Site-occupation embedding theory using Bethe ansatz local density approximations

    NASA Astrophysics Data System (ADS)

    Senjean, Bruno; Nakatani, Naoki; Tsuchiizu, Masahisa; Fromager, Emmanuel

    2018-06-01

    Site-occupation embedding theory (SOET) is an alternative formulation of density functional theory (DFT) for model Hamiltonians where the fully interacting Hubbard problem is mapped, in principle exactly, onto an impurity-interacting (rather than a noninteracting) one. It provides a rigorous framework for combining wave-function (or Green function)-based methods with DFT. In this work, exact expressions for the per-site energy and double occupation of the uniform Hubbard model are derived in the context of SOET. As readily seen from these derivations, the so-called bath contribution to the per-site correlation energy is, in addition to the latter, the key density functional quantity to model in SOET. Various approximations based on Bethe ansatz and perturbative solutions to the Hubbard and single-impurity Anderson models are constructed and tested on a one-dimensional ring. The self-consistent calculation of the embedded impurity wave function has been performed with the density-matrix renormalization group method. It has been shown that promising results are obtained in specific regimes of correlation and density. Possible further developments have been proposed in order to provide reliable embedding functionals and potentials.

  18. On the contribution of height to predict lung volumes, capacities and diffusion in healthy school children of 10-17 years.

    PubMed

    Gupta, C K; Mishra, G; Mehta, S C; Prasad, J

    1993-01-01

Lung volumes, capacities, diffusion and alveolar volumes, together with physical characteristics (age, height and weight), were recorded for 186 healthy school children (96 boys and 90 girls) in the 10-17 years age group. The objective was to study the relative importance of the physical characteristics as regressor variables in regression models to estimate lung functions. We observed that height is best correlated with all the lung functions. Inclusion of all physical characteristics in the models gives little gain compared to models having just height as the regressor variable. We also find that exponential models were not only statistically valid but also fared better than the linear ones. We conclude that lung functions covary with height and other physical characteristics but do not depend upon them. The rate of increase in the functions depends upon the initial lung functions. Further, we propose models and provide ready reckoners to give estimates of lung functions with 95 per cent confidence limits based on heights from 125 to 170 cm for the age group of 10 to 17 years.
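
    The exponential height models this record favours can be fitted by ordinary least squares on log-transformed values. A sketch with illustrative coefficients (the study's actual regression equations are not reproduced here):

    ```python
    import math

    def fit_exponential(heights_cm, lung_values):
        # Fit value = exp(a + b * height) by OLS on log(value) vs. height.
        logs = [math.log(v) for v in lung_values]
        n = len(heights_cm)
        mh = sum(heights_cm) / n
        ml = sum(logs) / n
        sxx = sum((h - mh) ** 2 for h in heights_cm)
        b = sum((h - mh) * (l - ml) for h, l in zip(heights_cm, logs)) / sxx
        a = ml - b * mh
        return a, b

    def predict(a, b, height_cm):
        # Back-transform the log-linear fit to the original scale.
        return math.exp(a + b * height_cm)
    ```

    Fitting on the log scale keeps predictions positive and lets the percent change per cm of height (b) stay constant, which is the sense in which an exponential model can outperform a linear one.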

  19. Lumped parametric model of the human ear for sound transmission.

    PubMed

    Feng, Bin; Gan, Rong Z

    2004-09-01

A lumped parametric model of the human auditory periphery consisting of six masses suspended with six springs and ten dashpots was proposed. This model will provide the quantitative basis for the construction of a physical model of the human middle ear. The lumped model parameters were first identified using published anatomical data, and then determined through a parameter optimization process. The transfer function of the middle ear obtained from human temporal bone experiments with laser Doppler interferometers was used to create the target function during the optimization process. It was found that, among the 14 spring and dashpot parameters, five had pronounced effects on the dynamic behavior of the model. A detailed discussion of the sensitivity of these parameters is provided, with applications to sound transmission in the ear. We expect that the methods for characterizing the lumped model of the human ear and the model parameters will be useful for theoretical modeling of ear function and construction of a physical model of the ear.
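
    Each mass-spring-dashpot element of a lumped model like the one above contributes a second-order transfer function. A single-degree-of-freedom sketch with purely illustrative parameter values (the paper identifies six masses and fourteen spring/dashpot parameters, none of which are reproduced here):

    ```python
    import math

    def sdof_magnitude(freq_hz, mass=0.05, stiffness=2000.0, damping=0.5):
        # |X/F| for one mass-spring-dashpot element:
        # 1 / sqrt((k - m*w^2)^2 + (c*w)^2), peaking near the resonance
        # w_n = sqrt(k/m) (about 31.8 Hz for these illustrative values).
        w = 2.0 * math.pi * freq_hz
        return 1.0 / math.sqrt((stiffness - mass * w * w) ** 2 + (damping * w) ** 2)
    ```

    Chaining several such elements (with coupling springs) yields the multi-peaked middle-ear transfer function that the optimization in the abstract matches against laser Doppler measurements.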

  20. Hydrological-niche models predict water plant functional group distributions in diverse wetland types.

    PubMed

    Deane, David C; Nicol, Jason M; Gehrig, Susan L; Harding, Claire; Aldridge, Kane T; Goodman, Abigail M; Brookes, Justin D

    2017-06-01

Human use of water resources threatens environmental water supplies. If resource managers are to develop policies that avoid unacceptable ecological impacts, some means to predict ecosystem response to changes in water availability is necessary. This is difficult to achieve at spatial scales relevant for water resource management because of the high natural variability in ecosystem hydrology and ecology. Water plant functional groups classify species with similar hydrological niche preferences together, allowing a qualitative means to generalize community responses to changes in hydrology. We tested the potential of functional groups for making quantitative predictions of water plant functional group distributions across diverse wetland types over a large geographical extent. We sampled wetlands covering a broad range of hydrogeomorphic and salinity conditions in South Australia, collecting both hydrological and floristic data from 687 quadrats across 28 wetland hydrological gradients. We built hydrological-niche models for eight water plant functional groups using a range of candidate models combining different surface inundation metrics. We then tested the predictive performance of top-ranked individual and averaged models for each functional group. Cross-validation showed that models achieved acceptable predictive performance, with correct classification rates in the range 0.68-0.95. Model predictions can be made at any spatial scale for which hydrological data are available and could be implemented in a geographical information system. We show the response of water plant functional groups to inundation is consistent enough across diverse wetland types to quantify the probability of hydrological impacts over regional spatial scales. © 2017 by the Ecological Society of America.

  1. Modelling the maximum voluntary joint torque/angular velocity relationship in human movement.

    PubMed

    Yeadon, Maurice R; King, Mark A; Wilson, Cassie

    2006-01-01

The force exerted by a muscle is a function of the activation level and the maximum (tetanic) muscle force. In "maximum" voluntary knee extensions, muscle activation is lower for eccentric muscle velocities than for concentric velocities. The aim of this study was to model this "differential activation" in order to calculate the maximum voluntary knee extensor torque as a function of knee angular velocity. Torque data were collected on two subjects during maximal eccentric-concentric knee extensions using an isovelocity dynamometer with crank angular velocities ranging from 50 to 450 degrees s(-1). The theoretical tetanic torque/angular velocity relationship was modelled using a four-parameter function comprising two rectangular hyperbolas, while the activation/angular velocity relationship was modelled using a three-parameter function that rose from submaximal activation for eccentric velocities to full activation for high concentric velocities. The product of these two functions gave a seven-parameter function which was fitted to the joint torque/angular velocity data, giving unbiased root mean square differences of 1.9% and 3.3% of the maximum torques achieved. Differential activation accounts for the non-hyperbolic behaviour of the torque/angular velocity data for low concentric velocities. The maximum voluntary knee extensor torque that can be exerted may be modelled accurately as the product of functions defining the maximum torque and the maximum voluntary activation level. Failure to include differential activation considerations when modelling maximal movements will lead to errors in the estimation of joint torque in the eccentric phase and low velocity concentric phase.
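
    The product structure described in this record (voluntary torque = tetanic torque × activation) can be sketched as follows. The functional forms and constants here are generic stand-ins (a single Hill-type hyperbola plus a logistic activation), not the paper's fitted four- and three-parameter functions:

    ```python
    import math

    def tetanic_torque(omega, t0=300.0, omega_max=800.0, gamma=4.0, ecc_plateau=1.4):
        # Generic tetanic torque/angular velocity curve. omega > 0 is
        # concentric, omega < 0 eccentric; units N*m and deg/s (illustrative).
        if omega >= 0:
            if omega >= omega_max:
                return 0.0
            # Hill-type hyperbola falling from t0 at omega = 0 to 0 at omega_max.
            return t0 * (omega_max - omega) / (omega_max + gamma * omega)
        # Eccentric branch: rises continuously from t0 toward ecc_plateau * t0.
        return t0 * (ecc_plateau - (ecc_plateau - 1.0) * omega_max / (omega_max - gamma * omega))

    def activation(omega, a_min=0.85, k=0.02):
        # Logistic voluntary activation: submaximal (a_min) for eccentric
        # velocities, rising toward full activation for fast concentric ones.
        return a_min + (1.0 - a_min) / (1.0 + math.exp(-k * omega))

    def voluntary_torque(omega):
        # Maximum voluntary torque = tetanic torque x activation, the product
        # structure fitted in the abstract.
        return tetanic_torque(omega) * activation(omega)
    ```

    The submaximal eccentric activation is what flattens the measured torque curve relative to the tetanic curve at eccentric and low concentric velocities.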

  2. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

Computerized survival prediction in healthcare, by identifying the risk of disease mortality, helps healthcare providers to effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large error and small error instances. The new loss function used in the algorithm outperforms other functions used in previous studies, yielding a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging with existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations with their diagnosis and treatment. 
Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: Logistic risk models often make systematic prediction errors, and it is prudent to use subgroup based prediction models such as those given by CPXR(Log) when investigating heterogeneous diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Reconciling mass functions with the star-forming main sequence via mergers

    NASA Astrophysics Data System (ADS)

    Steinhardt, Charles L.; Yurk, Dominic; Capak, Peter

    2017-06-01

We combine star formation along the 'main sequence', quiescence and clustering and merging to produce an empirical model for the evolution of individual galaxies. Main-sequence star formation alone would significantly steepen the stellar mass function towards low redshift, in sharp conflict with observation. However, a combination of star formation and merging produces a consistent result for the correct choice of merger rate function. As a result, we are motivated to propose a model in which hierarchical merging is disconnected from environmentally independent star formation. This model can be tested via correlation functions and would produce new constraints on clustering and merging.

  4. Employee subjective well-being and physiological functioning: An integrative model

    PubMed Central

    Tay, Louis

    2015-01-01

    Research shows that worker subjective well-being influences physiological functioning—an early signal of poor health outcomes. While several theoretical perspectives provide insights on this relationship, the literature lacks an integrative framework explaining the relationship. We develop a conceptual model explaining the link between subjective well-being and physiological functioning in the context of work. Integrating positive psychology and occupational stress perspectives, our model explains the relationship between subjective well-being and physiological functioning as a result of the direct influence of subjective well-being on physiological functioning and of their common relationships with work stress and personal resources, both of which are influenced by job conditions. PMID:28070359

  5. Parametric regression model for survival data: Weibull regression model as an example

    PubMed Central

    2016-01-01

The Weibull regression model is one of the most popular parametric regression models because it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared to the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge about it and then illustrates how to fit the model with the R software. The SurvRegCensCov package is useful in converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by a categorical variable. The eha package provides an alternative way to fit the Weibull regression model. The check.dist() function helps to assess the goodness-of-fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way for model development. Visualizing the Weibull regression model after model development provides another way to report the findings. PMID:28149846
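
    The Weibull quantities this record refers to (baseline hazard, HR, ETR) follow from the standard parametrization. A language-neutral sketch in Python (the article itself works in R with the SurvRegCensCov and eha packages):

    ```python
    import math

    def weibull_survival(t, scale, shape):
        # S(t) = exp(-(t/scale)^shape)
        return math.exp(-((t / scale) ** shape))

    def weibull_hazard(t, scale, shape):
        # h(t) = (shape/scale) * (t/scale)^(shape-1); constant when shape = 1.
        return (shape / scale) * (t / scale) ** (shape - 1)

    def aft_to_hr_etr(beta, shape):
        # In the Weibull accelerated-failure-time parametrization, a covariate
        # coefficient beta on log time converts to a hazard ratio
        # HR = exp(-beta * shape) and an event time ratio ETR = exp(beta) -
        # the kind of conversion SurvRegCensCov performs on survreg output.
        return math.exp(-beta * shape), math.exp(beta)
    ```

    The sign flip in the HR conversion is the usual source of confusion: a positive AFT coefficient lengthens event times (ETR > 1) and therefore lowers the hazard (HR < 1) when shape > 0.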

  6. Predicting cognitive function of the Malaysian elderly: a structural equation modelling approach.

    PubMed

    Foong, Hui Foh; Hamid, Tengku Aizan; Ibrahim, Rahimah; Haron, Sharifah Azizah; Shahar, Suzana

    2018-01-01

The aim of this study was to identify the predictors of the elderly's cognitive function based on biopsychosocial and cognitive reserve perspectives. The study included 2322 community-dwelling elderly in Malaysia, randomly selected through a multi-stage proportional cluster random sampling from Peninsular Malaysia. The elderly were surveyed on socio-demographic information, biomarkers, psychosocial status, disability, and cognitive function. A biopsychosocial model of cognitive function was developed to test the variables' predictive power on cognitive function. Statistical analyses were performed using SPSS (version 15.0) in conjunction with Analysis of Moment Structures Graphics (AMOS 7.0). The estimated theoretical model fitted the data well. Psychosocial stress and metabolic syndrome (MetS) negatively predicted cognitive function, and psychosocial stress appeared as the main predictor. Socio-demographic characteristics, except gender, also had significant effects on cognitive function. However, disability failed to predict cognitive function. Several factors together may predict cognitive function in the Malaysian elderly population, and the variance accounted for is large enough to be considered substantial. The key factor associated with the elderly's cognitive function seems to be psychosocial well-being. Thus, psychosocial well-being should be included in elderly assessment, apart from medical conditions, in both clinical and community settings.

  7. Item Purification in Differential Item Functioning Using Generalized Linear Mixed Models

    ERIC Educational Resources Information Center

    Liu, Qian

    2011-01-01

    For this dissertation, four item purification procedures were implemented onto the generalized linear mixed model for differential item functioning (DIF) analysis, and the performance of these item purification procedures was investigated through a series of simulations. Among the four procedures, forward and generalized linear mixed model (GLMM)…

  8. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    ERIC Educational Resources Information Center

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  9. Latent Partially Ordered Classification Models and Normal Mixtures

    ERIC Educational Resources Information Center

    Tatsuoka, Curtis; Varadi, Ferenc; Jaeger, Judith

    2013-01-01

    Latent partially ordered sets (posets) can be employed in modeling cognitive functioning, such as in the analysis of neuropsychological (NP) and educational test data. Posets are cognitively diagnostic in the sense that classification states in these models are associated with detailed profiles of cognitive functioning. These profiles allow for…

  10. Curriculum Development: A Philosophical Model.

    ERIC Educational Resources Information Center

    Bruening, William H.

    Presenting models based on the philosophies of Carl Rogers, John Dewey, Erich Fromm, and Jean-Paul Sartre, this paper proposes a philosophical approach to education and concludes with pragmatic suggestions concerning teaching based on a fully-functioning-person model. The fully-functioning person is characterized as being open to experience,…

  11. Model for nucleon valence structure functions at all x, all p⊥ and all Q² from the correspondence between QCD and DTU

    NASA Astrophysics Data System (ADS)

    Cohen-Tannoudji, G.; El Hassouni, A.; Mantrach, A.; Oudrhiri-Safiani, E. G.

    1982-09-01

    We propose a simple parametrization of the nucleon valence structure functions at all x, all p⊥ and all Q². We use the DTU parton model to fix the parametrization at a reference point (Q₀² = 3 GeV²) and we mimic the QCD evolution by replacing the dimensioned parameters of the DTU parton model by functions depending on Q². Excellent agreement is obtained with existing data.

  12. Binder model system to be used for determination of prepolymer functionality

    NASA Technical Reports Server (NTRS)

    Martinelli, F. J.; Hodgkin, J. H.

    1971-01-01

    Development of a method for determining the functionality distribution of prepolymers used for rocket binders is discussed. Research has been concerned with accurately determining the gel point of a model polyester system containing a single trifunctional crosslinker, and the application of these methods to more complicated model systems containing a second trifunctional crosslinker, monofunctional ingredients, or a higher functionality crosslinker. Correlations of observed with theoretical gel points for these systems would allow the methods to be applied directly to prepolymers.

  13. Glomerular structural-functional relationship models of diabetic nephropathy are robust in type 1 diabetic patients.

    PubMed

    Mauer, Michael; Caramori, Maria Luiza; Fioretto, Paola; Najafian, Behzad

    2015-06-01

    Studies of structural-functional relationships have improved understanding of the natural history of diabetic nephropathy (DN). However, in order to consider structural end points for clinical trials, the robustness of the resultant models needs to be verified. This study examined whether structural-functional relationship models derived from a large cohort of type 1 diabetic (T1D) patients with a wide range of renal function are robust. The predictability of models derived from multiple regression analysis and piecewise linear regression analysis was also compared. T1D patients (n = 161) with research renal biopsies were divided into two equal groups matched for albumin excretion rate (AER). Models to explain AER and glomerular filtration rate (GFR) by classical DN lesions in one group (T1D-model, or T1D-M) were applied to the other group (T1D-test, or T1D-T) and regression analyses were performed. T1D-M-derived models explained 70 and 63% of AER variance and 32 and 21% of GFR variance in T1D-M and T1D-T, respectively, supporting the substantial robustness of the models. Piecewise linear regression analyses substantially improved predictability of the models with 83% of AER variance and 66% of GFR variance explained by classical DN glomerular lesions alone. These studies demonstrate that DN structural-functional relationship models are robust, and if appropriate models are used, glomerular lesions alone explain a major proportion of AER and GFR variance in T1D patients. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
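The piecewise linear regression step can be illustrated with a minimal sketch (synthetic data and a hand-picked breakpoint, not the study's renal data): a continuous two-segment fit is obtained by adding a hinge basis term to ordinary least squares.

```python
import numpy as np

def fit_piecewise_linear(x, y, breakpoint):
    """Fit a continuous two-segment piecewise linear model with a fixed
    breakpoint c:  y ≈ b0 + b1*x + b2*max(0, x - c)."""
    X = np.column_stack([np.ones_like(x), x, np.maximum(0.0, x - breakpoint)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic example: slope changes from 1.0 to 3.0 at x = 5
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = 2.0 + 1.0 * x + 2.0 * np.maximum(0.0, x - 5.0) + rng.normal(0.0, 0.1, x.size)
coef = fit_piecewise_linear(x, y, breakpoint=5.0)
```

In practice the breakpoint itself can be estimated by scanning candidate values and keeping the one with the smallest residual sum of squares.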

  14. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
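The likelihood ratio machinery invoked here is standard for nested models; as a minimal sketch (with hypothetical log-likelihood values, not the paper's semi-nonparametric likelihood), twice the log-likelihood gap is referred to a chi-squared distribution:

```python
from scipy.stats import chi2

def likelihood_ratio_test(llf_restricted, llf_full, df_diff):
    """Classic LR test for nested models: the restricted model is rejected
    when twice the log-likelihood gap is large relative to a chi-squared
    distribution with df_diff degrees of freedom."""
    lr_stat = 2.0 * (llf_full - llf_restricted)
    p_value = chi2.sf(lr_stat, df_diff)
    return lr_stat, p_value

# Hypothetical log-likelihoods from two nested choice models
lr_stat, p = likelihood_ratio_test(llf_restricted=-1050.0, llf_full=-1041.2, df_diff=3)
```

Here the nesting model plays the role of the semi-nonparametric likelihood in the paper: a small p-value is evidence against the restricted (standard Gumbel) specification.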

  15. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  16. Identification of functional differences in metabolic networks using comparative genomics and constraint-based models.

    PubMed

    Hamilton, Joshua J; Reed, Jennifer L

    2012-01-01

    Genome-scale network reconstructions are useful tools for understanding cellular metabolism, and comparisons of such reconstructions can provide insight into metabolic differences between organisms. Recent efforts toward comparing genome-scale models have focused primarily on aligning metabolic networks at the reaction level and then looking at differences and similarities in reaction and gene content. However, these reaction comparison approaches are time-consuming and do not identify the effect network differences have on the functional states of the network. We have developed a bilevel mixed-integer programming approach, CONGA, to identify functional differences between metabolic networks by comparing network reconstructions aligned at the gene level. We first identify orthologous genes across two reconstructions and then use CONGA to identify conditions under which differences in gene content give rise to differences in metabolic capabilities. By seeking genes whose deletion in one or both models disproportionately changes flux through a selected reaction (e.g., growth or by-product secretion) in one model over another, we are able to identify structural metabolic network differences enabling unique metabolic capabilities. Using CONGA, we explore functional differences between two metabolic reconstructions of Escherichia coli and identify a set of reactions responsible for chemical production differences between the two models. We also use this approach to aid in the development of a genome-scale model of Synechococcus sp. PCC 7002. Finally, we propose potential antimicrobial targets in Mycobacterium tuberculosis and Staphylococcus aureus based on differences in their metabolic capabilities. Through these examples, we demonstrate that a gene-centric approach to comparing metabolic networks allows for a rapid comparison of metabolic models at a functional level. 
Using CONGA, we can identify differences in reaction and gene content which give rise to different functional predictions. Because CONGA provides a general framework, it can be applied to find functional differences across models and biological systems beyond those presented here.
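CONGA itself solves a bilevel mixed-integer program, but the flux-balance calculations it builds on can be sketched on a toy network. The example below (an assumed three-reaction network, not one of the paper's reconstructions) maximizes a growth flux subject to steady-state mass balance:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: rows = metabolites (A, B),
# columns = reactions (uptake: -> A, conversion: A -> B, growth: B ->)
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capacity caps growth

# Flux balance analysis: maximize growth flux v[2] subject to S v = 0
c = np.array([0.0, 0.0, -1.0])  # linprog minimizes, so negate the objective
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
growth = -res.fun  # optimal growth flux, limited by the uptake bound (10)
```

Gene-deletion effects of the kind CONGA probes can be mimicked by forcing the bounds of a reaction to zero and re-solving, then comparing the optimal objective across two networks.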

  17. Identification of Functional Differences in Metabolic Networks Using Comparative Genomics and Constraint-Based Models

    PubMed Central

    Hamilton, Joshua J.; Reed, Jennifer L.

    2012-01-01

    Genome-scale network reconstructions are useful tools for understanding cellular metabolism, and comparisons of such reconstructions can provide insight into metabolic differences between organisms. Recent efforts toward comparing genome-scale models have focused primarily on aligning metabolic networks at the reaction level and then looking at differences and similarities in reaction and gene content. However, these reaction comparison approaches are time-consuming and do not identify the effect network differences have on the functional states of the network. We have developed a bilevel mixed-integer programming approach, CONGA, to identify functional differences between metabolic networks by comparing network reconstructions aligned at the gene level. We first identify orthologous genes across two reconstructions and then use CONGA to identify conditions under which differences in gene content give rise to differences in metabolic capabilities. By seeking genes whose deletion in one or both models disproportionately changes flux through a selected reaction (e.g., growth or by-product secretion) in one model over another, we are able to identify structural metabolic network differences enabling unique metabolic capabilities. Using CONGA, we explore functional differences between two metabolic reconstructions of Escherichia coli and identify a set of reactions responsible for chemical production differences between the two models. We also use this approach to aid in the development of a genome-scale model of Synechococcus sp. PCC 7002. Finally, we propose potential antimicrobial targets in Mycobacterium tuberculosis and Staphylococcus aureus based on differences in their metabolic capabilities. Through these examples, we demonstrate that a gene-centric approach to comparing metabolic networks allows for a rapid comparison of metabolic models at a functional level. 
Using CONGA, we can identify differences in reaction and gene content which give rise to different functional predictions. Because CONGA provides a general framework, it can be applied to find functional differences across models and biological systems beyond those presented here. PMID:22666308

  18. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
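One representative algorithm of the kind PV_LIB collects is the Sandia module temperature model; a sketch follows in Python (PV_LIB itself is Matlab, and the coefficient values below are assumed open-rack defaults rather than values taken from the toolbox):

```python
import math

def sapm_module_temp(poa_irradiance, temp_air, wind_speed, a=-3.56, b=-0.075):
    """Sandia Array Performance Model back-of-module temperature:
        T_m = E * exp(a + b * WS) + T_a
    where E is plane-of-array irradiance (W/m^2), WS is wind speed (m/s),
    and a, b are empirically fitted mounting-specific coefficients."""
    return poa_irradiance * math.exp(a + b * wind_speed) + temp_air

# 1000 W/m^2 irradiance, 25 C ambient air, 1 m/s wind
t_mod = sapm_module_temp(1000.0, 25.0, 1.0)  # roughly 50 C
```

The exponential form captures the convective cooling effect of wind: higher wind speed shrinks the temperature rise above ambient.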

  19. A study of structural properties of gene network graphs for mathematical modeling of integrated mosaic gene networks.

    PubMed

    Petrovskaya, Olga V; Petrovskiy, Evgeny D; Lavrik, Inna N; Ivanisenko, Vladimir A

    2017-04-01

    Gene network modeling is one of the most widely used approaches in systems biology. It allows for the study of the function of complex genetic systems, including so-called mosaic gene networks, which consist of functionally interacting subnetworks. We studied a method for modeling mosaic gene networks based on integrating models of gene subnetworks via linear control functionals. Automatic modeling of 10,000 synthetic mosaic gene regulatory networks was carried out using computer experiments on gene knockdowns/knockouts. Structural analysis of the graphs of the generated mosaic gene regulatory networks revealed that, among the factors analyzed, the most important for building accurate integrated mathematical models is expression data for the genes corresponding to vertices with high centrality.
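The centrality notion referred to above can be made concrete with a small sketch; the example below computes degree centrality on a toy directed regulation graph (the gene names are hypothetical):

```python
def degree_centrality(adj):
    """Degree centrality on a directed gene-regulation graph given as
    {gene: set(targets)}: (in-degree + out-degree) / (n - 1)."""
    nodes = set(adj) | {t for targets in adj.values() for t in targets}
    n = len(nodes)
    indeg = {v: 0 for v in nodes}
    for targets in adj.values():
        for t in targets:
            indeg[t] += 1
    return {v: (len(adj.get(v, ())) + indeg[v]) / (n - 1) for v in nodes}

# g2 regulates two genes and is itself regulated, so it is most central
net = {"g1": {"g2"}, "g2": {"g3", "g4"}, "g3": set(), "g4": set()}
cent = degree_centrality(net)
```

In the study's terms, a vertex like g2 is the kind of high-centrality gene whose expression data most improves the integrated model.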

  20. Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes

    PubMed Central

    Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

    2011-01-01

    Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559
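The quoted 2.72-Å Cα RMSD is computed after optimal rigid-body superposition; a minimal sketch of that calculation (the Kabsch algorithm, run here on synthetic coordinates rather than the CT296 structures) is:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between matched (N, 3) coordinate sets after optimal
    rigid-body superposition (Kabsch algorithm)."""
    P = P - P.mean(axis=0)          # remove translation
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                     # 3x3 covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])      # guard against improper rotation
    R = Vt.T @ D @ U.T              # optimal rotation
    diff = P @ R.T - Q
    return np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))

# Synthetic check: Q is a rigidly rotated and translated copy of P,
# so the superposed RMSD should be essentially zero.
rng = np.random.default_rng(1)
P = rng.normal(size=(50, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
Q = P @ Rz.T + np.array([1.0, 2.0, 3.0])
rmsd = kabsch_rmsd(P, Q)
```

For real model-versus-crystal comparisons the two coordinate sets must first be put in residue correspondence (e.g., matched Cα atoms), which is assumed here.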
