Sample records for model analysis demonstrated

  1. Nonlinear structural analysis of a turbine airfoil using the Walker viscoplastic material model for B1900 + Hf

    NASA Technical Reports Server (NTRS)

    Meyer, T. G.; Hill, J. T.; Weber, R. M.

    1988-01-01

    A viscoplastic material model for the high temperature turbine airfoil material B1900 + Hf was developed and was demonstrated in a three dimensional finite element analysis of a typical turbine airfoil. The demonstration problem is a simulated flight cycle and includes the appropriate transient thermal and mechanical loads typically experienced by these components. The Walker viscoplastic material model was shown to be efficient, stable and easily used. The demonstration is summarized and the performance of the material model is evaluated.

  2. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
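
    The efficiency claim is easy to see with a generic example. Below is a minimal sketch (plain SciPy, not PyCycle or OpenMDAO) comparing a gradient-based optimization driven by an analytic derivative against the same optimization using finite-difference approximations; the function-evaluation counts illustrate the kind of savings the paper reports.

    ```python
    # Minimal sketch: analytic derivatives vs. finite differences in
    # gradient-based optimization (toy Rosenbrock problem, not an engine model).
    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.zeros(8)

    # Optimization with an analytic gradient supplied to the optimizer.
    analytic = minimize(rosen, x0, jac=rosen_der, method="BFGS")

    # Same problem, but the gradient is approximated by finite differences.
    fd = minimize(rosen, x0, jac=None, method="BFGS")

    print("analytic:    ", analytic.nfev, "function evaluations")
    print("finite diff: ", fd.nfev, "function evaluations")
    ```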

  3. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives, allowing for efficient use of gradient-based optimization methods on engine cycle models without requiring finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.

  4. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  5. Characterizing observed circulation patterns within a bay using HF radar and numerical model simulations

    NASA Astrophysics Data System (ADS)

    O'Donncha, Fearghal; Hartnett, Michael; Nash, Stephen; Ren, Lei; Ragnoli, Emanuele

    2015-02-01

In this study, High Frequency Radar (HFR) observations are used in conjunction with numerical model simulations to investigate surface flow dynamics in a tidally active, wind-driven bay: Galway Bay, on the west coast of Ireland. Comparisons against ADCP sensor data permit an independent assessment of HFR and model performance, respectively. Results show root-mean-square (rms) differences in the range 10-12 cm/s for the HFR, while model rms differences equalled 12-14 cm/s. Subsequent analysis focuses on a detailed comparison of HFR and model output. Harmonic analysis decomposes both sets of surface currents into distinct flow processes, enabling a correlation analysis between the resultant output and the dominant forcing parameters. Comparisons of barotropic model simulations and the HFR tidal signal demonstrate consistently high agreement, particularly for the dominant M2 tidal signal. Analysis of residual flows demonstrates considerably poorer agreement, with the model failing to replicate complex flows. A number of hypotheses explaining this discrepancy are discussed, namely: discrepancies between regional-scale, coastal-ocean models and globally influenced bay-scale dynamics; model uncertainties arising from highly variable wind-driven flows across a large body of water forced by point measurements of wind vectors; and the high dependence of model simulations on empirical wind-stress coefficients. The research demonstrates that an advanced, widely used hydro-environmental model does not accurately reproduce aspects of surface flow processes, particularly with regard to wind forcing. Considering the significance of surface boundary conditions in both coastal and open ocean dynamics, the viability of using a systematic analysis of results to improve model predictions is discussed.
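
    For readers unfamiliar with the harmonic-analysis step, the sketch below (synthetic data, not the Galway Bay records) shows a least-squares fit of the dominant M2 constituent to a surface-current series, separating the tidal signal from the residual, wind-driven component.

    ```python
    # Minimal sketch of tidal harmonic analysis: least-squares fit of the M2
    # constituent to a synthetic surface-current time series.
    import numpy as np

    t = np.arange(0, 30 * 24, 0.5)                  # hours, 30 days half-hourly
    omega = 2 * np.pi / 12.4206                     # M2 frequency (rad/hour)
    rng = np.random.default_rng(7)
    u = 0.25 * np.cos(omega * t - 1.0) + 0.05 * rng.standard_normal(t.size)

    # Design matrix: cosine and sine at the M2 frequency plus a mean term.
    A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones(t.size)])
    coef, *_ = np.linalg.lstsq(A, u, rcond=None)

    amplitude = np.hypot(coef[0], coef[1])          # fitted M2 amplitude (m/s)
    residual = u - A @ coef                         # non-tidal (residual) flow
    print("M2 amplitude:", round(float(amplitude), 3), "m/s")
    ```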

  6. 77 FR 41132 - Air Quality Implementation Plans; Alabama; Attainment Plan for the Alabama Portion of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-12

... modeling demonstration should include supporting technical analyses and descriptions of all relevant....5 and NOX. The attainment demonstration includes: Technical analyses that locate, identify, and... modeling analysis is a complex technical evaluation that began with selection of the modeling system. The...

  7. Modeling abundance using multinomial N-mixture models

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 to allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as Mb and Mh, and other classes of models that are only possible to describe within the multinomial N-mixture framework.
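
    As a minimal illustration of the multinomial N-mixture idea (a toy three-pass removal design, not the book's BUGS or unmarked code): with Poisson abundance N ~ Poisson(lambda) and multinomial cell probabilities pi_j, the counts marginalize to independent Poissons with means lambda*pi_j, which makes maximum-likelihood estimation straightforward.

    ```python
    # Minimal sketch: likelihood analysis of a 3-pass removal-sampling
    # multinomial N-mixture model via the Poisson marginalization.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    rng = np.random.default_rng(1)
    true_lam, true_p, J = 20.0, 0.4, 3
    pi = true_p * (1 - true_p) ** np.arange(J)       # removal cell probabilities
    y = rng.poisson(true_lam * pi, size=(50, J))     # counts at 50 sites

    def nll(theta):
        # log/logit transforms keep lambda > 0 and 0 < p < 1
        lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
        cell = p * (1 - p) ** np.arange(J)
        return -poisson.logpmf(y, lam * cell).sum()

    fit = minimize(nll, x0=[np.log(10), 0.0], method="Nelder-Mead")
    lam_hat, p_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
    print(round(float(lam_hat), 2), round(float(p_hat), 3))
    ```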

  8. A LISREL Model for the Analysis of Repeated Measures with a Patterned Covariance Matrix.

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    1998-01-01

    Presents a LISREL model for the estimation of the repeated measures analysis of variance (ANOVA) with a patterned covariance matrix. The model is demonstrated for a 5 x 2 (Time x Group) ANOVA in which the data are assumed to be serially correlated. Similarities with the Statistical Analysis System PROC MIXED model are discussed. (SLD)

  9. Contrastive Analysis of English and Japanese Demonstratives from the Perspective of L1 and L2 Acquisition.

    ERIC Educational Resources Information Center

    Niimura, Tomomi; Hayashi, Brenda

    1996-01-01

    Presents a contrastive analysis of English and Japanese demonstratives based on the first- (L1) and second-language (L2) data of an earlier study. First, the traditional explanations and their alternative models for English and Japanese are presented, then, all models are tested with the L1 and L2 data, which leads to a discussion of the different…

  10. Demonstrating Rapid Qualitative Elemental Analyses of Participant-Supplied Objects at a Public Outreach Event

    ERIC Educational Resources Information Center

    Schwarz, Gunnar; Burger, Marcel; Guex, Kevin; Gundlach-Graham, Alexander; Ka¨ser, Debora; Koch, Joachim; Velicsanyi, Peter; Wu, Chung-Che; Gu¨nther, Detlef; Hattendorf, Bodo

    2016-01-01

    A public demonstration of laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) for fast and sensitive qualitative elemental analysis of solid everyday objects is described. This demonstration served as a showcase model for modern instrumentation (and for elemental analysis, in particular) to the public. Several steps were made to…

  11. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
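
    The information-theoretic quantities involved are standard; the sketch below (hypothetical distributions, not the authors' Ising-model pipeline) computes the Shannon entropy of per-region methylation-level distributions and the Jensen-Shannon distance between a test and a reference sample.

    ```python
    # Minimal sketch: Shannon entropy and Jensen-Shannon distance for two
    # hypothetical distributions over methylation levels (0..4 of 4 CpGs).
    import numpy as np
    from scipy.stats import entropy
    from scipy.spatial.distance import jensenshannon

    ref = np.array([0.70, 0.15, 0.08, 0.05, 0.02])   # reference sample
    test = np.array([0.30, 0.20, 0.20, 0.15, 0.15])  # test sample

    print("Shannon entropy (ref): ", entropy(ref, base=2))
    print("Shannon entropy (test):", entropy(test, base=2))
    print("Jensen-Shannon distance:", jensenshannon(ref, test, base=2))
    ```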

  12. Analysis of SMA Hybrid Composite Structures in MSC.Nastran and ABAQUS

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2005-01-01

A thermoelastic constitutive model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures was recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analysis of SMAHC beams of two types: a beam clamped at each end and a cantilever beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilever beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.

  13. Analysis of SMA Hybrid Composite Structures using Commercial Codes

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2004-01-01

A thermomechanical model for shape memory alloy (SMA) actuators and SMA hybrid composite (SMAHC) structures has been recently implemented in the commercial finite element codes MSC.Nastran and ABAQUS. The model may be easily implemented in any code that has the capability for analysis of laminated composite structures with temperature dependent material properties. The model is also relatively easy to use and requires input of only fundamental engineering properties. A brief description of the model is presented, followed by discussion of implementation and usage in the commercial codes. Results are presented from static and dynamic analysis of SMAHC beams of two types: a beam clamped at each end and a cantilevered beam. Nonlinear static (post-buckling) and random response analyses are demonstrated for the first specimen. Static deflection (shape) control is demonstrated for the cantilevered beam. Approaches for modeling SMAHC material systems with embedded SMA in ribbon and small round wire product forms are demonstrated and compared. The results from the commercial codes are compared to those from a research code as validation of the commercial implementations; excellent correlation is achieved in all cases.

  14. Validation of the Integrated Medical Model Using Historical Space Flight Data

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric L.; Minard, Charles G.; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2010-01-01

The Integrated Medical Model (IMM) utilizes Monte Carlo methodologies to predict the occurrence of medical events, utilization of resources, and clinical outcomes during space flight. Real-world data may be used to demonstrate the accuracy of the model. For this analysis, IMM predictions were compared to data from historical shuttle missions, not yet included as model source input. Initial goodness of fit testing on International Space Station data suggests that the IMM may overestimate the number of occurrences for three of the 83 medical conditions in the model. The IMM did not underestimate the occurrence of any medical condition. Initial comparisons with shuttle data demonstrate the importance of understanding crew preference (i.e., preferred analgesic) for accurately predicting the utilization of resources. The initial analysis demonstrates the validity of the IMM for its intended use and highlights areas for improvement.

  15. A Generalized Partial Credit Model: Application of an EM Algorithm.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1992-01-01

    The partial credit model with a varying slope parameter is developed and called the generalized partial credit model (GPCM). Analysis results for simulated data by this and other polytomous item-response models demonstrate that the rating formulation of the GPCM is adaptable to the analysis of polytomous item responses. (SLD)

  16. Global/local stress analysis of composite panels

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Knight, Norman F., Jr.

    1989-01-01

A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independently of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
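
    A minimal sketch of the interpolation step follows (an ordinary cubic spline standing in for the plate-bending splines the method actually uses): displacements from a coarse global model are splined along the interface and evaluated, together with rotations, at the refined local model's boundary nodes.

    ```python
    # Minimal sketch: global-model interface displacements interpolated to a
    # refined local model's boundary nodes (illustrative, not the paper's code).
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Coarse global model: displacements at a few interface nodes
    # (arc-length coordinate s, transverse displacement w).
    s_global = np.linspace(0.0, 1.0, 6)
    w_global = 1e-3 * np.sin(np.pi * s_global)   # placeholder global solution

    # Spline through the global solution; its derivative supplies rotations.
    spline = CubicSpline(s_global, w_global)

    # Refined local model: many more boundary nodes on the same interface.
    s_local = np.linspace(0.0, 1.0, 41)
    w_bc = spline(s_local)                       # displacement boundary conditions
    theta_bc = spline(s_local, 1)                # rotation boundary conditions (dw/ds)
    print(float(w_bc[20]), float(theta_bc[20]))
    ```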

  17. Global/local stress analysis of composite structures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    1989-01-01

A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independently of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.

  18. Rapid Energy Modeling Workflow Demonstration Project

    DTIC Science & Technology

    2014-01-01

…Conditioning Engineers; BIM: Building Information Model; BLCC: building life cycle costs; BPA: Building Performance Analysis; CAD: computer-assisted… invited to enroll in the Autodesk Building Performance Analysis (BPA) Certificate Program under a group specifically for DoD installation…

  19. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis with multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations, and its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.

  20. A Latent Transition Analysis Model for Assessing Change in Cognitive Skills

    ERIC Educational Resources Information Center

    Li, Feiming; Cohen, Allan; Bottge, Brian; Templin, Jonathan

    2016-01-01

    Latent transition analysis (LTA) was initially developed to provide a means of measuring change in dynamic latent variables. In this article, we illustrate the use of a cognitive diagnostic model, the DINA model, as the measurement model in a LTA, thereby demonstrating a means of analyzing change in cognitive skills over time. An example is…

  1. Discriminative Nonlinear Analysis Operator Learning: When Cosparse Model Meets Image Classification.

    PubMed

    Wen, Zaidao; Hou, Biao; Jiao, Licheng

    2017-05-03

The linear-synthesis-model-based dictionary learning framework has achieved remarkable performance in image classification over the last decade. As a generative feature model, however, it suffers from some intrinsic deficiencies. In this paper, we propose a novel parametric nonlinear analysis cosparse model (NACM) with which a unique feature vector can be extracted much more efficiently. Additionally, we demonstrate that NACM is capable of simultaneously learning the task-adapted feature transformation and regularization to encode our preferences, domain prior knowledge, and task-oriented supervised information into the features. The proposed NACM is applied to the classification task as a discriminative feature model and yields a novel discriminative nonlinear analysis operator learning framework (DNAOL). Theoretical analysis and experimental performance clearly demonstrate that DNAOL not only achieves better, or at least competitive, classification accuracies than state-of-the-art algorithms, but also dramatically reduces the time complexity of both the training and testing phases.

  2. Meta-Analysis in Higher Education: An Illustrative Example Using Hierarchical Linear Modeling

    ERIC Educational Resources Information Center

    Denson, Nida; Seltzer, Michael H.

    2011-01-01

    The purpose of this article is to provide higher education researchers with an illustrative example of meta-analysis utilizing hierarchical linear modeling (HLM). This article demonstrates the step-by-step process of meta-analysis using a recently-published study examining the effects of curricular and co-curricular diversity activities on racial…

  3. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    ERIC Educational Resources Information Center

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  4. Optimizing Biorefinery Design and Operations via Linear Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talmadge, Michael; Batan, Liaw; Lamers, Patrick

The ability to assess and optimize economics of biomass resource utilization for the production of fuels, chemicals and power is essential for the ultimate success of a bioenergy industry. The team of authors, consisting of members from the National Renewable Energy Laboratory (NREL) and the Idaho National Laboratory (INL), has developed simple biorefinery linear programming (LP) models to enable the optimization of theoretical or existing biorefineries. The goal of this analysis is to demonstrate how such models can benefit the developing biorefining industry. It focuses on a theoretical multi-pathway, thermochemical biorefinery configuration and demonstrates how the biorefinery can use LP models for operations planning and optimization in comparable ways to the petroleum refining industry. Using LP modeling tools developed under U.S. Department of Energy's Bioenergy Technologies Office (DOE-BETO) funded efforts, the authors investigate optimization challenges for the theoretical biorefineries such as (1) optimal feedstock slate based on available biomass and prices, (2) breakeven price analysis for available feedstocks, (3) impact analysis for changes in feedstock costs and product prices, (4) optimal biorefinery operations during unit shutdowns / turnarounds, and (5) incentives for increased processing capacity. These biorefinery examples are comparable to crude oil purchasing and operational optimization studies that petroleum refiners perform routinely using LPs and other optimization models. It is important to note that the analyses presented in this article are strictly theoretical and they are not based on current energy market prices. The pricing structure assigned for this demonstrative analysis is consistent with $4 per gallon gasoline, which clearly assumes an economic environment that would favor the construction and operation of biorefineries. The analysis approach and examples provide valuable insights into the usefulness of analysis tools for maximizing the potential benefits of biomass utilization for production of fuels, chemicals and power.
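
    As a toy illustration of the LP formulation (assumed margins and capacities, not NREL/INL data), the sketch below selects an optimal feedstock slate subject to throughput and availability limits; the capacity constraint's dual value is the kind of marginal information used in breakeven and capacity-incentive analyses.

    ```python
    # Minimal sketch: optimal feedstock slate as a linear program
    # (all numbers are illustrative assumptions).
    import numpy as np
    from scipy.optimize import linprog

    margin = np.array([18.0, 24.0, 15.0])      # $/ton net margin per feedstock
    c = -margin                                # linprog minimizes, so negate

    A_ub = [[1.0, 1.0, 1.0]]                   # total throughput constraint
    b_ub = [2000.0]                            # tons/day processing capacity
    bounds = [(0, 1200), (0, 800), (0, 1500)]  # availability of each feedstock

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("optimal slate (tons/day):", res.x)
    # Dual value of the capacity constraint (HiGHS exposes it via res.ineqlin);
    # its magnitude is the $/ton value of one more ton of throughput.
    print("capacity dual value:", res.ineqlin.marginals)
    ```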

  5. A Watershed-based spatially-explicit demonstration of an Integrated Environmental Modeling Framework for Ecosystem Services in the Coal River Basin (WV, USA)

    EPA Science Inventory

    We demonstrate a spatially-explicit regional assessment of current condition of aquatic ecoservices in the Coal River Basin (CRB), with limited sensitivity analysis for the atmospheric contaminant mercury. The integrated modeling framework (IMF) forecasts water quality and quant...

  6. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    ERIC Educational Resources Information Center

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…

7. TRAC-PD2 posttest analysis of the CCTF Evaluation-Model Test C1-19 (Run 38). [PWR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Motley, F.

The results of a Transient Reactor Analysis Code posttest analysis of the Cylindrical Core Test Facility Evaluation-Model Test agree very well with the results of the experiment. The good agreement obtained verifies the multidimensional analysis capability of the TRAC code. Because of the steep radial power profile, the importance of using fine noding in the core region was demonstrated (as compared with poorer results obtained from an earlier pretest prediction that used a coarsely noded model).

  8. The Sixth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

The Sixth Annual Thermal and Fluids Analysis Workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  9. Using Runtime Analysis to Guide Model Checking of Java Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
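
    A minimal sketch of the Eraser-style lockset discipline that data-race detectors of this kind build on follows (illustrative, not the PathFinder implementation): each shared variable's candidate lockset is intersected with the locks held at every access, and an empty lockset triggers a warning.

    ```python
    # Minimal sketch of a lockset-based data race check.
    locksets = {}  # variable -> set of locks that protected every access so far

    def access(var, locks_held):
        """Record an access to `var` while holding `locks_held`."""
        if var not in locksets:
            locksets[var] = set(locks_held)
        else:
            locksets[var] &= set(locks_held)   # intersect candidate lockset
        if not locksets[var]:
            print(f"warning: potential data race on {var!r}")

    access("counter", {"L1"})
    access("counter", {"L1", "L2"})  # still consistently protected by L1
    access("counter", {"L2"})        # L1 not held -> empty lockset -> warning
    ```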

  10. Low-Dimensional Statistics of Anatomical Variability via Compact Representation of Image Deformations.

    PubMed

    Zhang, Miaomiao; Wells, William M; Golland, Polina

    2016-10-01

    Using image-based descriptors to investigate clinical hypotheses and therapeutic implications is challenging due to the notorious "curse of dimensionality" coupled with a small sample size. In this paper, we present a low-dimensional analysis of anatomical shape variability in the space of diffeomorphisms and demonstrate its benefits for clinical studies. To combat the high dimensionality of the deformation descriptors, we develop a probabilistic model of principal geodesic analysis in a bandlimited low-dimensional space that still captures the underlying variability of image data. We demonstrate the performance of our model on a set of 3D brain MRI scans from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Our model yields a more compact representation of group variation at substantially lower computational cost than models based on the high-dimensional state-of-the-art approaches such as tangent space PCA (TPCA) and probabilistic principal geodesic analysis (PPGA).

  11. Thermal analysis of combinatorial solid geometry models using SINDA

    NASA Technical Reports Server (NTRS)

    Gerencser, Diane; Radke, George; Introne, Rob; Klosterman, John; Miklosovic, Dave

    1993-01-01

    Algorithms have been developed using Monte Carlo techniques to determine the thermal network parameters necessary to perform a finite difference analysis on Combinatorial Solid Geometry (CSG) models. Orbital and laser fluxes as well as internal heat generation are modeled to facilitate satellite modeling. The results of the thermal calculations are used to model the infrared (IR) images of targets and assess target vulnerability. Sample analyses and validation are presented which demonstrate code products.

  12. Project Organization and Management; Analysis of a Model. Satellite Technology Demonstration, Technical Report No. 0127.

    ERIC Educational Resources Information Center

    Lokey, Kenneth R.

    The Satellite Technology Demonstration (STD), a project of the Federation of Rocky Mountain States, Inc. (FRMS), employed a project management model for its organizational structure. The organization and management system utilized by the STD was designed to accomplish a predetermined set of objectives with the highest quality possible within a…

  13. An Example for Integrated Gas Turbine Engine Testing and Analysis Using Modeling and Simulation

    DTIC Science & Technology

    2006-12-01

…USAF Academy in a joint test and analysis effort of the F109 turbofan engine. This process uses a swirl investigation as a vehicle to exercise and demonstrate the approach…

  14. Causal modeling in international migration research: a methodological prolegomenon.

    PubMed

    Papademetriou, D G; Hopple, G W

    1982-10-01

    The authors examine the value of using models to study the migration process. In particular, they demonstrate the potential utility of a partial least squares modeling approach to the causal analysis of international migration.

  15. Design and analysis of forward and reverse models for predicting defect accumulation, defect energetics, and irradiation conditions

    DOE PAGES

    Stewart, James A.; Kohnert, Aaron A.; Capolungo, Laurent; ...

    2018-03-06

The complexity of radiation effects in a material’s microstructure makes developing predictive models a difficult task. In principle, a complete list of all possible reactions between the defect species being considered can be used to elucidate damage evolution mechanisms and the associated impact on microstructure evolution. However, a central limitation is that many models use a limited and incomplete catalog of defect energetics and associated reactions. Even for a given model, estimating its input parameters remains a challenge, especially for complex material systems. Here, we present a computational analysis to identify the extent to which defect accumulation, energetics, and irradiation conditions can be determined via forward and reverse regression models constructed and trained from large data sets produced by cluster dynamics simulations. A global sensitivity analysis, via Sobol’ indices, concisely characterizes parameter sensitivity and demonstrates how this can be connected to variability in defect evolution. Based on this analysis, and depending on the definition of what constitutes the input and output spaces, forward and reverse regression models are constructed that allow for the direct calculation of defect accumulation, defect energetics, and irradiation conditions. This computational analysis, exercised on a simplified cluster dynamics model, demonstrates the ability to design predictive surrogate and reduced-order models, and provides guidelines for improving model predictions within the context of forward and reverse engineering of mathematical models for radiation effects in a material’s microstructure.
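
    A minimal sketch of the forward/reverse regression idea follows (toy linear surrogates, not the cluster-dynamics simulations): a forward model maps defect energetics to defect accumulation, while a reverse model trained on the same data recovers energetics from observed accumulation.

    ```python
    # Minimal sketch: forward (inputs -> outputs) and reverse (outputs -> inputs)
    # linear regression surrogates trained on synthetic simulation data.
    import numpy as np

    rng = np.random.default_rng(6)
    E = rng.uniform(0.5, 2.0, (5000, 3))            # "energetics" inputs (eV)
    W = np.array([[1.2, -0.4], [0.3, 0.9], [-0.7, 0.5]])
    D = E @ W + rng.normal(0, 0.01, (5000, 2))      # "defect accumulation" outputs

    def fit_linear(X, Y):
        Xb = np.column_stack([X, np.ones(len(X))])  # add intercept column
        beta, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
        return beta

    forward = fit_linear(E, D)                      # energetics -> accumulation
    reverse = fit_linear(D, E)                      # accumulation -> energetics

    e_new = np.array([1.0, 1.5, 0.8])
    d_pred = np.append(e_new, 1.0) @ forward        # forward prediction
    e_back = np.append(d_pred, 1.0) @ reverse       # reverse recovery of inputs
    print(d_pred, e_back)
    ```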

  16. Design and analysis of forward and reverse models for predicting defect accumulation, defect energetics, and irradiation conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James A.; Kohnert, Aaron A.; Capolungo, Laurent

The complexity of radiation effects in a material’s microstructure makes developing predictive models a difficult task. In principle, a complete list of all possible reactions between the defect species being considered can be used to elucidate damage evolution mechanisms and the associated impact on microstructure evolution. However, a central limitation is that many models use a limited and incomplete catalog of defect energetics and associated reactions. Even for a given model, estimating its input parameters remains a challenge, especially for complex material systems. Here, we present a computational analysis to identify the extent to which defect accumulation, energetics, and irradiation conditions can be determined via forward and reverse regression models constructed and trained from large data sets produced by cluster dynamics simulations. A global sensitivity analysis, via Sobol’ indices, concisely characterizes parameter sensitivity and demonstrates how this can be connected to variability in defect evolution. Based on this analysis, and depending on the definition of what constitutes the input and output spaces, forward and reverse regression models are constructed that allow for the direct calculation of defect accumulation, defect energetics, and irradiation conditions. This computational analysis, exercised on a simplified cluster dynamics model, demonstrates the ability to design predictive surrogate and reduced-order models, and provides guidelines for improving model predictions within the context of forward and reverse engineering of mathematical models for radiation effects in a material’s microstructure.

  17. Factorial Based Response Surface Modeling with Confidence Intervals for Optimizing Thermal Optical Transmission Analysis of Atmospheric Black Carbon

    EPA Science Inventory

    We demonstrate how thermal-optical transmission analysis (TOT) for refractory light-absorbing carbon in atmospheric particulate matter was optimized with empirical response surface modeling. TOT employs pyrolysis to distinguish the mass of black carbon (BC) from organic carbon (...

  18. A General Approach to Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Keele, Luke; Tingley, Dustin

    2010-01-01

    Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…

  19. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
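
    The effect of coverage on reliability can be seen in a toy duplex-system calculation (assumed failure rate and mission time, not the F18 FCS figures): even a small probability of an uncovered fault can dominate the failure probability of an otherwise redundant system.

    ```python
    # Minimal sketch: a duplex system with imperfect fault coverage c.
    # An uncovered first fault (probability 1 - c) defeats the redundancy, so
    # system failure ~ P(uncovered fault) + P(both components fail, covered).
    import math

    lam, t = 1e-4, 1000.0          # per-hour failure rate, mission hours (assumed)
    p = 1 - math.exp(-lam * t)     # probability one component fails in [0, t]

    for c in (1.0, 0.99, 0.90):
        p_any = 1 - (1 - p) ** 2   # at least one component fault occurs
        p_sys = (1 - c) * p_any + c * p ** 2
        print(f"coverage={c:.2f}  P(system failure) ~ {p_sys:.2e}")
    ```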

  20. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders.

    PubMed

    van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-08-13

It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods; their practical application is demonstrated using Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analysis, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For the value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfactory success and uptake of the eHealth technology.

  1. WASP TRANSPORT MODELING AND WASP ECOLOGICAL MODELING

    EPA Science Inventory

A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...

  2. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection.
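
    Of the four techniques, the Morris screening step is the simplest to sketch. Below is a minimal one-at-a-time elementary-effects implementation on a toy function (not the stroke model); mu* ranks parameter influence and sigma flags nonlinearity or interaction.

    ```python
    # Minimal sketch of Morris elementary-effects screening on [0, 1]^k.
    import numpy as np

    def f(x):
        # toy model standing in for an expensive simulation
        return 4.0 * x[0] + 2.0 * x[1] ** 2 + 0.1 * x[2] + x[0] * x[1]

    k, r, delta = 3, 20, 0.25
    rng = np.random.default_rng(0)
    effects = [[] for _ in range(k)]

    for _ in range(r):                          # r one-at-a-time trajectories
        x = rng.uniform(0, 1 - delta, size=k)
        for i in rng.permutation(k):
            x_step = x.copy()
            x_step[i] += delta
            effects[i].append((f(x_step) - f(x)) / delta)
            x = x_step

    for i, e in enumerate(effects):
        e = np.asarray(e)
        # mu* (mean |effect|) ranks influence; sigma flags nonlinearity
        print(f"x{i}: mu*={np.abs(e).mean():.2f} sigma={e.std():.2f}")
    ```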

  3. Geomagnetic field models for satellite angular motion studies

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, M. Yu.; Penkov, V. I.; Roldugin, D. S.; Pichuzhkina, A. V.

    2018-03-01

Four geomagnetic field models are discussed: IGRF, inclined, direct, and simplified dipoles. Geomagnetic induction vector expressions are provided in different reference frames, and induction vector behavior is compared across the models. The models' applicability for the analysis of satellite motion is studied from theoretical and engineering perspectives. Relevant satellite dynamics analysis cases using analytical and numerical techniques are provided. These cases demonstrate the benefit of a certain model for a specific dynamics study. Recommendations for model usage are summarized at the end.

  4. Dynamic Modeling, Model-Based Control, and Optimization of Solid Oxide Fuel Cells

    NASA Astrophysics Data System (ADS)

    Spivey, Benjamin James

    2011-07-01

    Solid oxide fuel cells are a promising option for distributed stationary power generation that offers efficiencies ranging from 50% in stand-alone applications to greater than 80% in cogeneration. To advance SOFC technology for widespread market penetration, the SOFC should demonstrate improved cell lifetime and load-following capability. This work seeks to improve lifetime through dynamic analysis of critical lifetime variables and advanced control algorithms that permit load-following while remaining in a safe operating zone based on stress analysis. Control algorithms typically have addressed SOFC lifetime operability objectives using unconstrained, single-input-single-output control algorithms that minimize thermal transients. Existing SOFC controls research has not considered maximum radial thermal gradients or limits on absolute temperatures in the SOFC. In particular, as stress analysis demonstrates, the minimum cell temperature is the primary thermal stress driver in tubular SOFCs. This dissertation presents a dynamic, quasi-two-dimensional model for a high-temperature tubular SOFC combined with ejector and prereformer models. The model captures dynamics of critical thermal stress drivers and is used as the physical plant for closed-loop control simulations. A constrained, MIMO model predictive control algorithm is developed and applied to control the SOFC. Closed-loop control simulation results demonstrate effective load-following, constraint satisfaction for critical lifetime variables, and disturbance rejection. Nonlinear programming is applied to find the optimal SOFC size and steady-state operating conditions to minimize total system costs.

  5. Applications of Response Surface-Based Methods to Noise Analysis in the Conceptual Design of Revolutionary Aircraft

    NASA Technical Reports Server (NTRS)

    Hill, Geoffrey A.; Olson, Erik D.

    2004-01-01

Due to the growing problem of noise in today's air transportation system, the need has arisen to incorporate noise considerations into the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include single-point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
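
    A minimal sketch of the response-surface conversion follows (synthetic data, not the BWB noise model): an expensive noise analysis is replaced by a quadratic polynomial fit over the design variables, which can then be evaluated cheaply for parametric studies, optimization, or probabilistic analysis.

    ```python
    # Minimal sketch: fit a quadratic response surface to samples of an
    # expensive model, then evaluate the cheap polynomial surrogate.
    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.uniform(-1, 1, (200, 2))            # two normalized design variables
    y = (80 + 5 * X[:, 0] - 3 * X[:, 1]
         + 2 * X[:, 0] * X[:, 1] + rng.normal(0, 0.1, 200))

    # Quadratic basis: [1, x1, x2, x1^2, x2^2, x1*x2]
    A = np.column_stack([np.ones(200), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surface(x1, x2):
        return coef @ np.array([1, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

    print("surrogate noise response at (0.2, -0.5):", surface(0.2, -0.5))
    ```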

  6. Mixture Rasch Models with Joint Maximum Likelihood Estimation

    ERIC Educational Resources Information Center

    Willse, John T.

    2011-01-01

    This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…

  7. Creating opportunities to influence self-efficacy through modeling instruction

    NASA Astrophysics Data System (ADS)

    Sawtelle, Vashti; Brewe, Eric; Goertzen, Renee Michelle; Kramer, Laird H.

    2012-02-01

In this paper we present an initial analysis connecting key elements of Modeling Instruction (MI) to self-efficacy experience opportunities. Previously, we demonstrated that MI has positive effects on self-efficacy when compared with traditional lecture instruction [1]. We also found a particularly strong positive effect on the social persuasion source of self-efficacy for women in the MI class. Our current study seeks to understand through what mechanisms MI influences self-efficacy. We demonstrate this connection through an in-depth analysis of video chosen to exemplify Modeling techniques used in a problem-solving episode by three female participants enrolled in an MI introductory physics class. We provide a rich and descriptive analysis of the self-efficacy experience opportunities within this context and discuss how these opportunities provide a potential explanation of how MI influences self-efficacy.

  8. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  9. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty sources can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with different uncertainty components represented as uncertain nodes in the network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers formulating policies and strategies.
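
    A minimal sketch of the underlying variance-based measure follows (a toy function, not the reactive-transport model): the first-order index Var(E[Y|Xi])/Var(Y), estimated here by binning; applying the same ratio to groups of inputs is what allows uncertainty components to be combined flexibly.

    ```python
    # Minimal sketch: first-order variance-based sensitivity indices
    # estimated by conditional means over bins of each input.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000
    x1, x2, x3 = rng.uniform(0, 1, (3, n))
    y = np.sin(2 * np.pi * x1) + 0.5 * x2 ** 2 + 0.1 * x3

    def first_order(xi, y, bins=50):
        edges = np.linspace(0, 1, bins + 1)[1:-1]
        idx = np.digitize(xi, edges)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        return cond_means.var() / y.var()       # Var(E[Y|Xi]) / Var(Y)

    for name, xi in [("x1", x1), ("x2", x2), ("x3", x3)]:
        print(name, round(first_order(xi, y), 3))
    ```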

  10. Fuzzy logic application for modeling man-in-the-loop space shuttle proximity operations. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Brown, Robert B.

    1994-01-01

    A software pilot model for Space Shuttle proximity operations is developed, utilizing fuzzy logic. The model is designed to emulate a human pilot during the terminal phase of a Space Shuttle approach to the Space Station. The model uses the same sensory information available to a human pilot and is based upon existing piloting rules and techniques determined from analysis of human pilot performance. Such a model is needed to generate numerous rendezvous simulations to various Space Station assembly stages for analysis of current NASA procedures and plume impingement loads on the Space Station. The advantages of a fuzzy logic pilot model are demonstrated by comparing its performance with NASA's man-in-the-loop simulations and with a similar model based upon traditional Boolean logic. The fuzzy model is shown to respond well from a number of initial conditions, with results typical of an average human. In addition, the ability to model different individual piloting techniques and new piloting rules is demonstrated.
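
    A minimal sketch of the fuzzy-logic mechanism follows (illustrative memberships and rules, not NASA's pilot model): a range error is fuzzified with triangular membership functions, piloting-style rules are applied, and a weighted (centroid-style) defuzzification yields a crisp braking command.

    ```python
    # Minimal sketch of a fuzzy rule evaluation for a closing-range controller.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

    def brake_command(range_err):
        near = tri(range_err, -1, 0, 30)     # membership grades (assumed shapes)
        far = tri(range_err, 20, 60, 100)
        # Rules: IF near THEN brake hard (1.0); IF far THEN coast (0.1)
        return (near * 1.0 + far * 0.1) / (near + far + 1e-9)

    for err in (5.0, 25.0, 80.0):
        print(f"range error {err:4.0f} m -> brake {brake_command(err):.2f}")
    ```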

  11. Meeting in Turkey: WASP Transport Modeling and WASP Ecological Modeling

    EPA Science Inventory

A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...

  12. Meeting in Korea: WASP Transport Modeling and WASP Ecological Modeling

    EPA Science Inventory

A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...

  13. Analytic methods for questions pertaining to a randomized pretest, posttest, follow-up design.

    PubMed

    Rausch, Joseph R; Maxwell, Scott E; Kelley, Ken

    2003-09-01

    Delineates 5 questions regarding group differences that are likely to be of interest to researchers within the framework of a randomized pretest, posttest, follow-up (PPF) design. These 5 questions are examined from a methodological perspective by comparing and discussing analysis of variance (ANOVA) and analysis of covariance (ANCOVA) methods and briefly discussing hierarchical linear modeling (HLM) for these questions. This article demonstrates that the pretest should be utilized as a covariate in the model rather than as a level of the time factor or as part of the dependent variable within the analysis of group differences. It is also demonstrated that how the posttest and the follow-up are utilized in the analysis of group differences is determined by the specific question asked by the researcher.
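
    A minimal sketch of the recommended analysis (simulated data): the pretest enters as a covariate in an ANCOVA-style regression, rather than as a level of the time factor or as part of a change score.

    ```python
    # Minimal sketch: ANCOVA via regression, with pretest as a covariate.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 100
    group = np.repeat([0, 1], n // 2)
    pre = rng.normal(50, 10, n)
    post = 5 + 0.8 * pre + 4.0 * group + rng.normal(0, 5, n)  # true effect = 4

    df = pd.DataFrame({"group": group, "pre": pre, "post": post})
    fit = smf.ols("post ~ C(group) + pre", data=df).fit()
    print(fit.params)  # adjusted group difference at equal pretest scores
    ```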

  14. An Analysis of Cross Racial Identity Scale Scores Using Classical Test Theory and Rasch Item Response Models

    ERIC Educational Resources Information Center

    Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie

    2013-01-01

    Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…

  15. Quantitative assessment of myocardial blood flow in coronary artery disease by cardiovascular magnetic resonance: comparison of Fermi and distributed parameter modeling against invasive methods.

    PubMed

    Papanastasiou, Giorgos; Williams, Michelle C; Dweck, Marc R; Alam, Shirjel; Cooper, Annette; Mirsadraee, Saeed; Newby, David E; Semple, Scott I

    2016-09-13

Mathematical modeling of perfusion cardiovascular magnetic resonance (CMR) data allows absolute quantification of myocardial blood flow and can potentially improve the diagnosis and prognostication of obstructive coronary artery disease (CAD), against the current clinical standard of visual assessment. This study compares the diagnostic performance of distributed parameter (DP) modeling against the standard Fermi model for the detection of obstructive CAD, in per-vessel and per-patient analysis. A pilot cohort of 28 subjects (24 included in the final analysis) with known or suspected CAD underwent adenosine stress-rest perfusion CMR at 3T. Data were analysed using Fermi and DP modeling against invasive coronary angiography and fractional flow reserve, acquired in all subjects. Obstructive CAD was defined as luminal stenosis of ≥70 % alone, or luminal stenosis ≥50 % and fractional flow reserve ≤0.80. On ROC analysis, DP modeling outperformed the standard Fermi model in both per-vessel and per-patient analysis. In per-patient analysis, DP modeling-derived myocardial blood flow at stress demonstrated the highest sensitivity and specificity (0.96, 0.92) in detecting obstructive CAD, against Fermi modeling (0.78, 0.88) and visual assessment (0.79, 0.88), respectively. DP modeling demonstrated consistently increased diagnostic performance against Fermi modeling and showed that it may have merit for stratifying patients with at least one vessel with obstructive CAD. Clinicaltrials.gov NCT01368237, registered 6 June 2011. URL: https://clinicaltrials.gov/ct2/show/NCT01368237.
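
    A minimal sketch of Fermi-model quantification under one common parameterization follows (synthetic curves, not the study's pipeline): the tissue enhancement curve is modeled as the arterial input function (AIF) convolved with a Fermi impulse response, and the fitted amplitude tracks myocardial blood flow.

    ```python
    # Minimal sketch: Fermi-model deconvolution of a synthetic perfusion curve.
    import numpy as np
    from scipy.optimize import curve_fit

    dt = 1.0                                  # s, sampling interval
    t = np.arange(0, 60, dt)
    aif = np.exp(-((t - 10) / 4.0) ** 2)      # synthetic arterial input function

    def fermi_response(t, F, k, tau0):
        # Fermi impulse response; F sets the flow-related amplitude
        return F / (1.0 + np.exp(k * (t - tau0)))

    def tissue_model(t, F, k, tau0):
        # tissue curve = AIF convolved with the impulse response
        return np.convolve(aif, fermi_response(t, F, k, tau0))[: t.size] * dt

    true = (0.02, 0.3, 8.0)
    ct = tissue_model(t, *true) + np.random.default_rng(3).normal(0, 1e-4, t.size)

    popt, _ = curve_fit(tissue_model, t, ct, p0=(0.01, 0.1, 5.0))
    print("estimated flow-related amplitude F:", popt[0])
    ```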

  16. Development of an Uncertainty Quantification Predictive Chemical Reaction Model for Syngas Combustion

    DOE PAGES

    Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...

    2017-01-24

    An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.

  17. Development of an Uncertainty Quantification Predictive Chemical Reaction Model for Syngas Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.

    An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.

  18. Effect of rhBMP-2 on tibial plateau fractures in a canine model.

    PubMed

    Schaefer, Susan L; Lu, Yan; Seeherman, Howard; Li, X Jian; Lopez, Mandi J; Markel, Mark D

    2009-04-01

    The purpose of this study was to determine the efficacy of recombinant human bone morphogenetic protein-2 (rhBMP-2)/calcium phosphate matrix (CPX) paste to accelerate healing in a canine articular fracture model with an associated subchondral defect. rhBMP-2/CPX (BMP), CPX alone (CPX), or autogenous bone graft (ABG) was administered to a canine articular tibial plateau osteotomy with a subchondral defect in each of 21 female dogs. The unoperated contralateral limbs served as controls. Ground reaction forces, synovial fluid, radiographic changes, mechanical testing, bone density, and histology of bone and synovium were analyzed at 6 weeks after surgery. Radiographic analysis demonstrated that the BMP and CPX groups showed improved bony healing compared to the ABG group at week 6. Histomorphometric analysis demonstrated that the BMP group had significantly increased trabecular bone volume compared to the CPX and ABG groups. Mechanical testing revealed that the BMP group had significantly greater maximum failure loads than the ABG group. Histological analysis demonstrated that the BMP group had significantly less sub-synovial inflammation than the CPX group. This study demonstrated that rhBMP-2/CPX accelerated healing of articular fractures with subchondral defects compared to ABG in most of the parameters evaluated, and produced less subsynovial inflammation than CPX alone in a canine model.

  19. Cammp Team

    NASA Technical Reports Server (NTRS)

    Evertt, Shonn F.; Collins, Michael; Hahn, William

    2008-01-01

    The International Space Station (ISS) Configuration Analysis Modeling and Mass Properties (CAMMP) Team is presenting a demo of certain CAMMP capabilities at a Booz Allen Hamilton conference in San Antonio. The team will be showing pictures of low-fidelity, simplified ISS models, but no dimensions or technical data. The presentation will include a brief description of the contract and task, a description and picture of the Topology, a description of Generic Ground Rules and Constraints (GGR&C), a description of Stage Analysis with constraints applied, and will wrap up with a description of other tasks such as Special Studies, Cable Routing, etc. The models include conceptual Crew Exploration Vehicle (CEV) and Lunar Lander images and animations created for promotional purposes, which are based entirely on public domain conceptual images from public NASA web sites and publicly available magazine articles and are not based on any actual designs, measurements, or 3D models. The Mars rover and lander images are entirely conceptual and are not based on any NASA designs or data. The demonstration includes High Fidelity Computer Aided Design (CAD) models of ISS provided by the ISS 3D CAD Team, which will be used in a visual display to demonstrate the capabilities of the Teamcenter Visualization software. The demonstration will include 3D views of the CAD models, including random measurements taken to demonstrate the measurement tool. A 3D PDF file of the Blue Book fidelity assembly-complete model with no vehicles attached will be demonstrated. The 3D zoom and rotation will be displayed, as well as random measurements from the measurement tool. The External Configuration Analysis and Tracking Tool (ExCATT) Microsoft Access database will be demonstrated to show its capabilities to organize and track hardware on ISS. The data included will be part numbers, serial numbers, and historical, current, and future locations of external hardware components on station. It includes dates of all external ISS events and flights and the associated hardware changes for each event. The hardware location information does not always reveal the exact location of the hardware, only the general location. In some cases the location is a module or carrier; in other cases it is a WIF socket, handrail, or attach point. Only small portions of the data will be displayed for demonstration purposes.

  20. The Importance of Statistical Modeling in Data Analysis and Inference

    ERIC Educational Resources Information Center

    Rollins, Derrick, Sr.

    2017-01-01

    Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…

  1. Error quantification of a high-resolution coupled hydrodynamic-ecosystem coastal-ocean model: Part 2. Chlorophyll-a, nutrients and SPM

    NASA Astrophysics Data System (ADS)

    Allen, J. Icarus; Holt, Jason T.; Blackford, Jerry; Proctor, Roger

    2007-12-01

    Marine systems models are becoming increasingly complex and sophisticated, but far too little attention has been paid to model errors and the extent to which model outputs actually relate to ecosystem processes. Here we describe the application of summary error statistics to a complex 3D model (POLCOMS-ERSEM) run for the period 1988-1989 in the southern North Sea utilising information from the North Sea Project, which collected a wealth of observational data. We demonstrate that to understand model data misfit and the mechanisms creating errors, we need to use a hierarchy of techniques, including simple correlations, model bias, model efficiency, binary discriminator analysis and the distribution of model errors to assess model errors spatially and temporally. We also demonstrate that a linear cost function is an inappropriate measure of misfit. This analysis indicates that the model has some skill for all variables analysed. A summary plot of model performance indicates that model performance deteriorates as we move through the ecosystem from the physics, to the nutrients and plankton.
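
    The summary statistics in such a hierarchy are cheap to compute once model-observation pairs are assembled; a minimal sketch with synthetic data (not POLCOMS-ERSEM output):

    ```python
    # Hedged sketch: common model-skill statistics (bias, RMSE, model efficiency,
    # correlation) for paired model/observation series. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    obs = rng.lognormal(mean=1.0, sigma=0.5, size=500)        # e.g. chlorophyll-a
    mod = obs * rng.lognormal(mean=0.1, sigma=0.3, size=500)  # imperfect model

    bias = np.mean(mod - obs)
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    # Nash-Sutcliffe model efficiency: 1 is perfect, <0 is worse than the obs mean.
    mef = 1.0 - np.sum((mod - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    r = np.corrcoef(mod, obs)[0, 1]
    print(f"bias={bias:.3f}, RMSE={rmse:.3f}, MEF={mef:.3f}, r={r:.3f}")
    ```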

  2. A Trustworthy Internet Auction Model with Verifiable Fairness.

    ERIC Educational Resources Information Center

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  3. ADM Analysis of gravity models within the framework of bimetric variational formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovnev, Alexey; Karčiauskas, Mindaugas; Nyrhinen, Hannu J., E-mail: agolovnev@yandex.ru, E-mail: mindaugas.karciauskas@helsinki.fi, E-mail: hannu.nyrhinen@helsinki.fi

    2015-05-01

    Bimetric variational formalism was recently employed to construct novel bimetric gravity models. In these models an affine connection is generated by an additional tensor field which is independent of the physical metric. In this work we demonstrate how the ADM decomposition can be applied to study such models and provide some technical intermediate details. Using ADM decomposition we are able to prove that a linear model is unstable as has previously been indicated by perturbative analysis. Moreover, we show that it is also very difficult if not impossible to construct a non-linear model which is ghost-free within the framework of bimetric variational formalism. However, we demonstrate that viable models are possible along similar lines of thought. To this end, we consider a set up in which the affine connection is a variation of the Levi-Civita one. As a proof of principle we construct a gravity model with a massless scalar field obtained this way.
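
    For reference, the ADM (3+1) decomposition used in such analyses writes the line element in terms of a lapse N, shift N^i, and spatial metric h_ij; this is the standard textbook form, not a result specific to this paper:

    ```latex
    % Standard ADM line element: lapse N, shift N^i, spatial metric h_{ij}.
    ds^2 = -N^2\,dt^2 + h_{ij}\,\bigl(dx^i + N^i\,dt\bigr)\bigl(dx^j + N^j\,dt\bigr)
    ```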

  4. Utility of texture analysis for quantifying hepatic fibrosis on proton density MRI.

    PubMed

    Yu, HeiShun; Buch, Karen; Li, Baojun; O'Brien, Michael; Soto, Jorge; Jara, Hernan; Anderson, Stephan W

    2015-11-01

    To evaluate the potential utility of texture analysis of proton density maps for quantifying hepatic fibrosis in a murine model of hepatic fibrosis. Following Institutional Animal Care and Use Committee (IACUC) approval, a dietary model of hepatic fibrosis was used and 15 ex vivo murine liver tissues were examined. All images were acquired using a 30 mm bore 11.7T magnetic resonance imaging (MRI) scanner with a multiecho spin-echo sequence. A texture analysis was employed extracting multiple texture features including histogram-based, gray-level co-occurrence matrix-based (GLCM), gray-level run-length-based (GLRL), gray-level gradient matrix (GLGM), and Laws' features. Texture features were correlated with histopathologic and digital image analysis of hepatic fibrosis. Histogram features demonstrated very weak to moderate correlations (r = -0.29 to 0.51) with hepatic fibrosis. The GLCM features correlation and contrast demonstrated moderate-to-strong correlations (r = -0.71 and 0.59, respectively) with hepatic fibrosis. Moderate correlations were seen between hepatic fibrosis and the GLRL feature short run low gray-level emphasis (SRLGE) (r = -0.51). GLGM features demonstrated very weak to weak correlations with hepatic fibrosis (r = -0.27 to 0.09). Moderate correlations were seen between hepatic fibrosis and Laws' features L6 and L7 (r = 0.58). This study demonstrates the utility of texture analysis applied to proton density MRI in a murine liver fibrosis model and validates the potential utility of texture-based features for the noninvasive, quantitative assessment of hepatic fibrosis. © 2015 Wiley Periodicals, Inc.
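
    A hedged sketch of extracting the GLCM contrast and correlation features named above, using scikit-image on a synthetic image; the function names assume scikit-image >= 0.19 (older releases spell them greycomatrix/greycoprops):

    ```python
    # Hedged sketch: GLCM texture features (contrast, correlation) from an image,
    # as used in the study; the image here is synthetic, not a proton density map.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(3)
    img = (rng.random((128, 128)) * 63).astype(np.uint8)   # 64 gray levels

    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast").mean()
    correlation = graycoprops(glcm, "correlation").mean()
    print(f"contrast={contrast:.2f}, correlation={correlation:.3f}")
    # In the study, features like these were then correlated (Pearson r)
    # against histopathologic measures of fibrosis across specimens.
    ```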

  5. Evidence-based economic analysis demonstrates that ecosystem service benefits of water hyacinth management greatly exceed research and control costs

    PubMed Central

    Harms, Nathan E.; Magen, Cedric; Liang, Dong; Nesslage, Genevieve M.; McMurray, Anna M.; Cofrancesco, Al F.

    2018-01-01

    Invasive species management can be a victim of its own success when decades of effective control cause memories of past harm to fade and raise questions of whether programs should continue. Economic analysis can be used to assess the efficiency of investing in invasive species control by comparing ecosystem service benefits to program costs, but only if appropriate data exist. We used a case study of water hyacinth (Eichhornia crassipes (Mart.) Solms), a nuisance floating aquatic plant, in Louisiana to demonstrate how comprehensive record-keeping supports economic analysis. Using long-term data sets, we developed empirical and spatio-temporal simulation models of intermediate complexity to project invasive species growth for control and no-control scenarios. For Louisiana, we estimated that peak plant cover would be 76% higher without the substantial growth rate suppression (84% reduction) that appeared due primarily to biological control agents. Our economic analysis revealed that combined biological and herbicide control programs, monitored over an unusually long time period (1975–2013), generated a benefit-cost ratio of about 34:1 derived from the relatively modest costs of $124 million ($2013) compared to the $4.2 billion ($2013) in benefits to anglers, waterfowl hunters, boating-dependent businesses, and water treatment facilities over the 38-year analysis period. This work adds to the literature by: (1) providing evidence of the effectiveness of water hyacinth biological control; (2) demonstrating use of parsimonious spatio-temporal models to estimate benefits of invasive species control; and (3) incorporating activity substitution into economic benefit transfer to avoid overstating benefits. Our study suggests that robust and cost-effective economic analysis is enabled by good record keeping and generalizable models that can demonstrate management effectiveness and promote social efficiency of invasive species control. PMID:29844976

  6. Evidence-based economic analysis demonstrates that ecosystem service benefits of water hyacinth management greatly exceed research and control costs.

    PubMed

    Wainger, Lisa A; Harms, Nathan E; Magen, Cedric; Liang, Dong; Nesslage, Genevieve M; McMurray, Anna M; Cofrancesco, Al F

    2018-01-01

    Invasive species management can be a victim of its own success when decades of effective control cause memories of past harm to fade and raise questions of whether programs should continue. Economic analysis can be used to assess the efficiency of investing in invasive species control by comparing ecosystem service benefits to program costs, but only if appropriate data exist. We used a case study of water hyacinth (Eichhornia crassipes (Mart.) Solms), a nuisance floating aquatic plant, in Louisiana to demonstrate how comprehensive record-keeping supports economic analysis. Using long-term data sets, we developed empirical and spatio-temporal simulation models of intermediate complexity to project invasive species growth for control and no-control scenarios. For Louisiana, we estimated that peak plant cover would be 76% higher without the substantial growth rate suppression (84% reduction) that appeared due primarily to biological control agents. Our economic analysis revealed that combined biological and herbicide control programs, monitored over an unusually long time period (1975-2013), generated a benefit-cost ratio of about 34:1 derived from the relatively modest costs of $124 million ($2013) compared to the $4.2 billion ($2013) in benefits to anglers, waterfowl hunters, boating-dependent businesses, and water treatment facilities over the 38-year analysis period. This work adds to the literature by: (1) providing evidence of the effectiveness of water hyacinth biological control; (2) demonstrating use of parsimonious spatio-temporal models to estimate benefits of invasive species control; and (3) incorporating activity substitution into economic benefit transfer to avoid overstating benefits. Our study suggests that robust and cost-effective economic analysis is enabled by good record keeping and generalizable models that can demonstrate management effectiveness and promote social efficiency of invasive species control.
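
    The headline ratio follows directly from the reported totals; a one-line sanity check using the figures quoted above (2013 dollars):

    ```python
    # Check of the reported benefit-cost ratio: $4.2B benefits vs $124M costs.
    benefits = 4.2e9   # $2013: anglers, hunters, businesses, water treatment
    costs = 124e6      # $2013: combined biological + herbicide control programs
    print(f"benefit-cost ratio ~ {benefits / costs:.0f}:1")   # ~34:1
    ```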

  7. Equivalent plate modeling for conceptual design of aircraft wing structures

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1995-01-01

    This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings, along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.

  8. Industry Application ECCS / LOCA Integrated Cladding/Emergency Core Cooling System Performance: Demonstration of LOTUS-Baseline Coupled Analysis of the South Texas Plant Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Epiney, Aaron

    Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both South Texas Project (STP) and Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model including reactor core, clad/fuel design and systems thermal hydraulics based on the South Texas Project (STP) nuclear power plant, a 4-Loop Westinghouse PWR. A RISMC toolkit, named LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule, Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.

  9. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
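
    The static (lower-bound) formulation described here is a linear program: maximize the load factor subject to equilibrium and yield constraints. A hedged sketch for the textbook fixed-ended beam with a central point load, not one of the paper's frame problems:

    ```python
    # Hedged sketch: static-theorem limit analysis as a linear program, for a
    # fixed-ended beam of span L with central point load lam*P. Variables are the
    # load factor lam and the (magnitudes of) end and midspan moments mA, mC.
    # Collapse equilibrium: mA + mC = lam*P*L/4; yield: each moment <= Mp.
    from scipy.optimize import linprog

    Mp, P, L = 100.0, 10.0, 4.0          # kN*m, kN, m (illustrative values)
    c = [-1.0, 0.0, 0.0]                 # maximize lam  ->  minimize -lam
    A_eq = [[-P * L / 4.0, 1.0, 1.0]]    # -lam*P*L/4 + mA + mC = 0
    b_eq = [0.0]
    bounds = [(0, None), (0, Mp), (0, Mp)]

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    lam, mA, mC = res.x
    print(f"collapse load factor = {lam:.2f}"
          f"  (analytic 8*Mp/(P*L) = {8 * Mp / (P * L):.2f})")
    ```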

  10. Tutorial: Advanced fault tree applications using HARP

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.

    1993-01-01

    Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.
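
    For the static portions of a fault tree, the top-event probability follows from simple gate algebra under an independence assumption; a minimal sketch (the dynamic gates that motivate HARP require Markov models and are not captured here):

    ```python
    # Hedged sketch: top-event probability for a small static fault tree with
    # independent basic events. Dynamic gates (HARP's focus) need Markov models
    # and are not captured by this simple algebra.
    def AND(*p):   # all inputs must fail
        out = 1.0
        for q in p:
            out *= q
        return out

    def OR(*p):    # any input failing suffices
        out = 1.0
        for q in p:
            out *= (1.0 - q)
        return 1.0 - out

    p_cpu, p_bus, p_mem1, p_mem2 = 1e-4, 5e-5, 1e-3, 1e-3
    # System fails if the CPU or the bus fails, or if both memory banks fail.
    p_top = OR(p_cpu, p_bus, AND(p_mem1, p_mem2))
    print(f"P(top event) = {p_top:.3e}")
    ```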

  11. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
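
    A hedged sketch of the margin-of-exposure ranking idea: MoE is a hazard benchmark divided by estimated exposure, and smaller margins rank as higher priority. All names and numbers below are invented placeholders, not values from the paper:

    ```python
    # Hedged sketch: margin-of-exposure (MoE) risk prioritisation across exposure
    # scenarios. MoE = hazard benchmark dose / estimated exposure; a smaller MoE
    # means a smaller safety margin, hence higher priority. Values are invented.
    scenarios = {
        # name: (benchmark dose, estimated exposure), arbitrary consistent units
        "TiO2 bag filling":   (10.0, 0.50),
        "ZnO manual dumping": ( 2.0, 0.40),
        "Ag spraying":        ( 5.0, 0.02),
        "MWCNT harvesting":   ( 1.0, 0.01),
    }

    moe = {name: bench / exp for name, (bench, exp) in scenarios.items()}
    for name in sorted(moe, key=moe.get):       # ascending MoE = descending risk
        print(f"{name:22s} MoE = {moe[name]:7.1f}")
    ```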

  12. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    NASA Technical Reports Server (NTRS)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.
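
    A minimal OpenMDAO sketch in this spirit, using the framework's simple paraboloid example rather than the aircraft/engine sizing problem; the API calls reflect recent OpenMDAO releases and should be checked against the installed version:

    ```python
    # Hedged sketch: a small gradient-based optimization in OpenMDAO. This is the
    # framework's toy paraboloid problem, not the aircraft/engine sizing model.
    import openmdao.api as om

    prob = om.Problem()
    prob.model.add_subsystem(
        "parab", om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
        promotes=["*"])

    prob.driver = om.ScipyOptimizeDriver()
    prob.driver.options["optimizer"] = "SLSQP"

    prob.model.add_design_var("x", lower=-50.0, upper=50.0)
    prob.model.add_design_var("y", lower=-50.0, upper=50.0)
    prob.model.add_objective("f")

    prob.setup()
    prob.set_val("x", 3.0)
    prob.set_val("y", -4.0)
    prob.run_driver()
    print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))
    ```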

  13. Tolerance analysis through computational imaging simulations

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  14. Human factors phase IV : risk analysis tool for new train control technology.

    DOT National Transportation Integrated Search

    2005-01-31

    This report covers the theoretical development of the safety state model for railroad operations. Using data from a train control technology experiment, experimental application of the model is demonstrated. A stochastic model of system behavior is d...

  15. Human factors phase IV : risk analysis tool for new train control technology

    DOT National Transportation Integrated Search

    2005-01-01

    This report covers the theoretical development of the safety state model for railroad operations. Using data from a train control technology experiment, experimental application of the model is demonstrated. A stochastic model of system behavior is d...

  16. Application of the actor model to large scale NDE data analysis

    NASA Astrophysics Data System (ADS)

    Coughlin, Chris

    2018-03-01

    The Actor model of concurrent computation discretizes a problem into a series of independent units or actors that interact only through the exchange of messages. Without direct coupling between individual components, an Actor-based system is inherently concurrent and fault-tolerant. These traits lend themselves to so-called "Big Data" applications in which the volume of data to analyze requires a distributed multi-system design. For a practical demonstration of the Actor computational model, a system was developed to assist with the automated analysis of Nondestructive Evaluation (NDE) datasets using the open source Myriad Data Reduction Framework. A machine learning model trained to detect damage in two-dimensional slices of C-Scan data was deployed in a streaming data processing pipeline. To demonstrate the flexibility of the Actor model, the pipeline was deployed on a local system and re-deployed as a distributed system without recompiling, reconfiguring, or restarting the running application.
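
    A toy sketch of the Actor pattern in Python: each actor owns a mailbox and interacts only by message passing, which is what makes the design concurrent and easy to distribute. This illustrates the computational model only and does not depict the Myriad framework's API:

    ```python
    # Hedged sketch of the Actor model: independent workers communicating only
    # via message queues. Stands in for the idea behind the NDE pipeline; it
    # does not use the Myriad Data Reduction Framework's API.
    import threading
    import queue

    class Actor(threading.Thread):
        def __init__(self, handler):
            super().__init__(daemon=True)
            self.mailbox = queue.Queue()
            self.handler = handler

        def send(self, msg):
            self.mailbox.put(msg)

        def run(self):
            while True:
                msg = self.mailbox.get()
                if msg is None:          # poison pill: shut the actor down
                    break
                self.handler(msg)

    # Pipeline: a "classifier" actor consumes 2D data slices, a "reporter" actor
    # consumes its results -- no shared state, only messages.
    reporter = Actor(lambda r: print("result:", r))
    classifier = Actor(
        lambda s: reporter.send(f"slice {s}: {'flaw' if s % 3 == 0 else 'ok'}"))
    for a in (reporter, classifier):
        a.start()

    for i in range(6):                   # feed C-scan slice IDs into the pipeline
        classifier.send(i)
    classifier.send(None)
    classifier.join()
    reporter.send(None)
    reporter.join()
    ```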

  17. Stability analysis and application of a mathematical cholera model.

    PubMed

    Liao, Shu; Wang, Jin

    2011-07-01

    In this paper, we conduct a dynamical analysis of the deterministic cholera model proposed in [9]. We study the stability of both the disease-free and endemic equilibria so as to explore the complex epidemic and endemic dynamics of the disease. We demonstrate a real-world application of this model by investigating the recent cholera outbreak in Zimbabwe. Meanwhile, we present numerical simulation results to verify the analytical predictions.
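
    A hedged sketch of a cholera-type model with an environmental pathogen reservoir, in the general style of such systems (not necessarily the exact model of [9]), integrated with SciPy:

    ```python
    # Hedged sketch: SIR dynamics coupled to an environmental vibrio reservoir B,
    # a common cholera model structure (not necessarily the exact model of [9]).
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, kappa, gamma, xi, delta = 0.5, 1e4, 0.2, 10.0, 0.3

    def rhs(t, y):
        S, I, B = y
        infection = beta * S * B / (kappa + B)   # saturating dose-response
        return [-infection,
                infection - gamma * I,
                xi * I - delta * B]

    sol = solve_ivp(rhs, (0.0, 200.0), [9999.0, 1.0, 0.0],
                    t_eval=np.linspace(0.0, 200.0, 401))
    S, I, B = sol.y
    print(f"epidemic peak: I={I.max():.0f} at t={sol.t[I.argmax()]:.1f} days")
    ```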

  18. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  19. Spike-train spectra and network response functions for non-linear integrate-and-fire neurons.

    PubMed

    Richardson, Magnus J E

    2008-11-01

    Reduced models have long been used as a tool for the analysis of the complex activity taking place in neurons and their coupled networks. Recent advances in experimental and theoretical techniques have further demonstrated the usefulness of this approach. Despite the often gross simplification of the underlying biophysical properties, reduced models can still present significant difficulties in their analysis, with the majority of exact and perturbative results available only for the leaky integrate-and-fire model. Here an elementary numerical scheme is demonstrated which can be used to calculate a number of biologically important properties of the general class of non-linear integrate-and-fire models. Exact results for the first-passage-time density and spike-train spectrum are derived, as well as the linear response properties and emergent states of recurrent networks. Given that the exponential integrate-and-fire model has recently been shown to agree closely with the experimentally measured response of pyramidal cells, the methodology presented here promises to provide a convenient tool to facilitate the analysis of cortical-network dynamics.
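
    The exponential integrate-and-fire model mentioned above is straightforward to simulate directly; a minimal forward-Euler sketch with typical illustrative parameter values:

    ```python
    # Hedged sketch: forward-Euler simulation of the exponential integrate-and-
    # fire neuron, C dV/dt = -gL(V-EL) + gL*DT*exp((V-VT)/DT) + I, with reset.
    # Parameter values are typical illustrative choices, not fitted to data.
    import numpy as np

    C, gL, EL = 200.0, 10.0, -70.0     # pF, nS, mV
    VT, DT = -50.0, 2.0                # mV: spike-onset threshold and sharpness
    Vre, Vspike = -65.0, 0.0           # mV: reset and numerical spike cutoff
    I = 350.0                          # pA: constant input current
    dt, T = 0.01, 1000.0               # ms

    V, t_spikes = EL, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) + I) / C
        V += dt * dV
        if V >= Vspike:                # spike: record time and reset
            t_spikes.append(step * dt)
            V = Vre
    print(f"{len(t_spikes)} spikes -> rate = {1000.0 * len(t_spikes) / T:.1f} Hz")
    ```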

  20. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  1. The Computation of Orthogonal Independent Cluster Solutions and Their Oblique Analogs in Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Richard J.

    A very general model for the computation of independent cluster solutions in factor analysis is presented. The model is discussed as being either orthogonal or oblique. Furthermore, it is demonstrated that for every orthogonal independent cluster solution there is an oblique analog. Using three illustrative examples, certain generalities are made…

  2. Quantitative model of diffuse speckle contrast analysis for flow measurement.

    PubMed

    Liu, Jialin; Zhang, Hongchao; Lu, Jian; Ni, Xiaowu; Shen, Zhonghua

    2017-07-01

    Diffuse speckle contrast analysis (DSCA) is a noninvasive optical technique capable of monitoring deep tissue blood flow. However, a detailed study of the speckle contrast model for DSCA has yet to be presented. We deduced the theoretical relationship between speckle contrast and exposure time and further simplified it to a linear approximation model. The feasibility of this linear model was validated with liquid phantoms, which demonstrated that the slope of this linear approximation was able to rapidly determine the Brownian diffusion coefficient of the turbid media at multiple distances using multiexposure speckle imaging. Furthermore, we theoretically quantified the influence of optical properties on measurements of the Brownian diffusion coefficient, a consequence of the fact that the slope of this linear approximation was demonstrated to be equal to the inverse of the correlation time of the speckle.

  3. What do we mean by sensitivity analysis? The need for comprehensive characterization of "global" sensitivity in Earth and Environmental systems models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2015-05-01

    Sensitivity analysis is an essential paradigm in Earth and Environmental Systems modeling. However, the term "sensitivity" has a clear definition, based on partial derivatives, only when specified locally around a particular point (e.g., optimal solution) in the problem space. Accordingly, no unique definition exists for "global sensitivity" across the problem space, when considering one or more model responses to different factors such as model parameters or forcings. A variety of approaches have been proposed for global sensitivity analysis, based on different philosophies and theories, and each of these formally characterizes a different "intuitive" understanding of sensitivity. These approaches focus on different properties of the model response at a fundamental level and may therefore lead to different (even conflicting) conclusions about the underlying sensitivities. Here we revisit the theoretical basis for sensitivity analysis, summarize and critically evaluate existing approaches in the literature, and demonstrate their flaws and shortcomings through conceptual examples. We also demonstrate the difficulty involved in interpreting "global" interaction effects, which may undermine the value of existing interpretive approaches. With this background, we identify several important properties of response surfaces that are associated with the understanding and interpretation of sensitivities in the context of Earth and Environmental System models. Finally, we highlight the need for a new, comprehensive framework for sensitivity analysis that effectively characterizes all of the important sensitivity-related properties of model response surfaces.
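
    The point that different notions of sensitivity can conflict is easy to show numerically; a sketch on the standard Ishigami test function, where a purely local derivative at the domain center reports zero sensitivity for a factor that is clearly influential across the space:

    ```python
    # Hedged sketch: local derivative-based sensitivity vs a simple global
    # (Morris-style mean |elementary effect|) measure, on the Ishigami function.
    import numpy as np

    a, b = 7.0, 0.1
    def f(x):
        return np.sin(x[0]) + a * np.sin(x[1])**2 + b * x[2]**4 * np.sin(x[0])

    rng = np.random.default_rng(4)
    h = 1e-4

    def abs_grad(x0):
        g = np.zeros(3)
        for i in range(3):
            e = np.zeros(3); e[i] = h
            g[i] = (f(x0 + e) - f(x0 - e)) / (2 * h)
        return np.abs(g)

    print("local |df/dx| at center: ", abs_grad(np.zeros(3)).round(3))

    # Global screening: mean absolute elementary effect over random base points.
    ee = np.zeros(3)
    for _ in range(2000):
        ee += abs_grad(rng.uniform(-np.pi, np.pi, 3))
    print("mean |elementary effect|:", (ee / 2000).round(3))
    # x2's local sensitivity at the center is ~0, yet it is clearly influential
    # across the space -- local and "global" measures can conflict.
    ```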

  4. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis is very limited due to lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to CG model with more general topology and the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  5. A Simulation Model Articulation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  6. A Model for Analyzing Disability Policy

    ERIC Educational Resources Information Center

    Turnbull, Rud; Stowe, Matthew J.

    2017-01-01

    This article describes a 12-step model that can be used for policy analysis. The model encompasses policy development, implementation, and evaluation; takes into account structural foundations of policy; addresses both legal formalism and legal realism; demonstrates contextual sensitivity; and addresses application issues and different…

  7. Geometric Analyses of Rotational Faults.

    ERIC Educational Resources Information Center

    Schwert, Donald Peters; Peck, Wesley David

    1986-01-01

    Describes the use of analysis of rotational faults in undergraduate structural geology laboratories to provide students with applications of both orthographic and stereographic techniques. A demonstration problem is described, and an orthographic/stereographic solution and a reproducible black model demonstration pattern are provided. (TW)

  8. Rotor design optimization using a free wake analysis

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat

    1993-01-01

    The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of objective functions involving the performance. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.

  9. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
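
    At the core of each simulated cohort comparison is an incremental cost-effectiveness ratio; a toy calculation with invented numbers (not outputs of the model described above):

    ```python
    # Hedged sketch: incremental cost-effectiveness ratio (ICER) for a disease
    # management program vs usual care. All numbers are invented placeholders.
    cost_program, cost_usual = 52_000.0, 48_000.0   # discounted lifetime costs, $
    qaly_program, qaly_usual = 4.90, 4.65           # discounted QALYs

    icer = (cost_program - cost_usual) / (qaly_program - qaly_usual)
    print(f"ICER = ${icer:,.0f} per QALY gained")   # compare to a WTP threshold
    ```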

  10. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    PubMed

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
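
    For flavor, a hedged sketch of the kind of fit such a framework automates: a generic two-state equilibrium denaturation model fit with SciPy. This shows the underlying thermodynamics, not PyFolding's own interface:

    ```python
    # Hedged sketch: fitting a two-state equilibrium unfolding curve. The
    # fraction unfolded is fU = K/(1+K) with K = exp(-dG/RT) and
    # dG = m*(D50 - [D]). Illustrates the model, not PyFolding's actual API.
    import numpy as np
    from scipy.optimize import curve_fit

    RT = 0.593  # kcal/mol at 25 C

    def signal(D, yN, yU, m, D50):
        K = np.exp(-m * (D50 - D) / RT)      # unfolding equilibrium constant
        fU = K / (1.0 + K)
        return yN + (yU - yN) * fU

    rng = np.random.default_rng(5)
    D = np.linspace(0.0, 8.0, 40)                        # denaturant, M
    y = signal(D, 1.0, 0.1, 1.8, 4.2) + rng.normal(0, 0.01, D.size)

    popt, pcov = curve_fit(signal, D, y, p0=[1.0, 0.0, 1.0, 4.0])
    yN, yU, m, D50 = popt
    print(f"m = {m:.2f} kcal/mol/M, D50 = {D50:.2f} M, "
          f"dG(H2O) = {m * D50:.2f} kcal/mol")
    ```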

  11. Application of Interface Technology in Progressive Failure Analysis of Composite Panels

    NASA Technical Reports Server (NTRS)

    Sleight, D. W.; Lotts, C. G.

    2002-01-01

    A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.

  12. Static analysis of a sonar dome rubber window

    NASA Technical Reports Server (NTRS)

    Lai, J. L.

    1978-01-01

    The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.

  13. FRAP Analysis: Accounting for Bleaching during Image Capture

    PubMed Central

    Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.

    2012-01-01

    The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
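
    A hedged sketch of the central idea: fold acquisition bleaching into the recovery model as a multiplicative decay and fit both rates simultaneously. The single-exponential recovery below is an illustrative assumption, not the paper's exact model:

    ```python
    # Hedged sketch: FRAP recovery with bleaching during image capture modeled
    # as a multiplicative exponential decay, F(t) = exp(-kb*t) * recovery(t).
    # A single-exponential recovery is assumed for illustration only.
    import numpy as np
    from scipy.optimize import curve_fit

    def frap(t, F0, Finf, k_rec, k_bleach):
        recovery = Finf - (Finf - F0) * np.exp(-k_rec * t)
        return np.exp(-k_bleach * t) * recovery

    rng = np.random.default_rng(6)
    t = np.linspace(0.0, 60.0, 120)                       # s, post-bleach frames
    y = frap(t, 0.2, 0.9, 0.15, 0.005) + rng.normal(0, 0.01, t.size)

    popt, _ = curve_fit(frap, t, y, p0=[0.3, 1.0, 0.1, 0.01])
    print("F0={:.2f} Finf={:.2f} k_rec={:.3f}/s k_bleach={:.4f}/s".format(*popt))
    ```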

  14. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    NASA Astrophysics Data System (ADS)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  15. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy.

    PubMed

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A; Kapur, Tina; Wells, William M; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-02-11

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  16. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy

    PubMed Central

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-01-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose “DeepInfer” – an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections. PMID:28615794

  17. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders

    PubMed Central

    Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-01-01

    Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective In this paper, we aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation; the potential of several stakeholder-oriented analysis methods is demonstrated using Infectionmanager as an example case. Methods We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary co-creative approach for implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfactory success and uptake of the eHealth technology. PMID:26272510

  18. A watershed-based spatially-explicit demonstration of an integrated environmental modeling framework for ecosystem services in the Coal River Basin (WV, USA)

    Treesearch

    John M. Johnston; Mahion C. Barber; Kurt Wolfe; Mike Galvin; Mike Cyterski; Rajbir Parmar; Luis Suarez

    2016-01-01

    We demonstrate a spatially-explicit regional assessment of current condition of aquatic ecoservices in the Coal River Basin (CRB), with limited sensitivity analysis for the atmospheric contaminant mercury. The integrated modeling framework (IMF) forecasts water quality and quantity, habitat suitability for aquatic biota, fish biomasses, population densities, ...

  19. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    A viewgraph presentation to develop models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps

  20. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
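
    To make the frugal end of this spectrum concrete, the sketch below computes scaled local sensitivities by one-at-a-time finite differences, needing only n+1 model runs for n parameters (and the perturbation runs are independent, hence parallelizable). The model function and parameter values are hypothetical placeholders, not the paper's examples.

    ```python
    import numpy as np

    def model(p):
        # Hypothetical stand-in for an expensive environmental model:
        # returns a scalar prediction from a parameter vector p.
        return p[0] * np.exp(-p[1]) + p[2] ** 2

    def scaled_local_sensitivities(model, p0, rel_step=0.01):
        """One-at-a-time finite differences: n+1 runs for n parameters.
        Returns dimensionless sensitivities s_i = (dy/dp_i) * p_i / y."""
        p0 = np.asarray(p0, dtype=float)
        y0 = model(p0)
        sens = np.empty_like(p0)
        for i in range(p0.size):
            p = p0.copy()
            dp = rel_step * p0[i] if p0[i] != 0 else rel_step
            p[i] += dp
            sens[i] = (model(p) - y0) / dp * p0[i] / y0  # scaled derivative
        return sens

    print(scaled_local_sensitivities(model, [2.0, 0.5, 1.0]))
    ```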

  1. The effects of videotape modeling on staff acquisition of functional analysis methodology.

    PubMed

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape.

  2. The Effects of Videotape Modeling on Staff Acquisition of Functional Analysis Methodology

    PubMed Central

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape. PMID:17471805

  3. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  4. Use of direct gradient analysis to uncover biological hypotheses in 16s survey data and beyond.

    PubMed

    Erb-Downward, John R; Sadighi Akha, Amir A; Wang, Juan; Shen, Ning; He, Bei; Martinez, Fernando J; Gyetko, Margaret R; Curtis, Jeffrey L; Huffnagle, Gary B

    2012-01-01

    This study investigated the use of direct gradient analysis of bacterial 16S pyrosequencing surveys to identify relevant bacterial community signals in the midst of a "noisy" background, and to facilitate hypothesis-testing both within and beyond the realm of ecological surveys. The results, utilizing 3 different real world data sets, demonstrate the utility of adding direct gradient analysis to any analysis that draws conclusions from indirect methods such as Principal Component Analysis (PCA) and Principal Coordinates Analysis (PCoA). Direct gradient analysis produces testable models, and can identify significant patterns in the midst of noisy data. Additionally, we demonstrate that direct gradient analysis can be used with other kinds of multivariate data sets, such as flow cytometric data, to identify differentially expressed populations. The results of this study demonstrate the utility of direct gradient analysis in microbial ecology and in other areas of research where large multivariate data sets are involved.

  5. Electric train energy consumption modeling

    DOE PAGES

    Wang, Jinghui; Rakha, Hesham A.

    2017-05-01

    For this paper we develop an electric train energy consumption modeling framework considering instantaneous regenerative braking efficiency in support of a rail simulation system. The model is calibrated with data from Portland, Oregon using an unconstrained non-linear optimization procedure, and validated using data from Chicago, Illinois by comparing model predictions against the National Transit Database (NTD) estimates. The results demonstrate that regenerative braking efficiency varies as an exponential function of the deceleration level, rather than an average constant as assumed in previous studies. The model predictions are demonstrated to be consistent with the NTD estimates, producing prediction errors of 1.87% and -2.31%. The paper demonstrates that energy recovery reduces the overall power consumption by 20% for the tested Chicago route. Furthermore, the paper demonstrates that the proposed modeling approach is able to capture energy consumption differences associated with train, route and operational parameters, and thus is applicable for project-level analysis. The model can be easily implemented in traffic simulation software, used in smartphone applications and eco-transit programs given its fast execution time and easy integration in complex frameworks.
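
    As a rough illustration of how a deceleration-dependent regenerative efficiency can enter a train energy balance, consider the minimal sketch below; the exponential form, coefficients, train mass, and speed trace are illustrative assumptions, not the calibrated Portland/Chicago model.

    ```python
    import numpy as np

    def regen_efficiency(decel, alpha=0.65, beta=1.0):
        # Hypothetical exponential dependence on deceleration (m/s^2); the
        # paper reports an exponential relationship, but these coefficients
        # are illustrative, not the calibrated values.
        return alpha * (1.0 - np.exp(-beta * np.abs(decel)))

    def trip_energy(speeds, dt=1.0, mass=2.0e5, eff_traction=0.85):
        """Net electrical energy (J) over a speed trace sampled every dt s."""
        v = np.asarray(speeds, dtype=float)
        a = np.diff(v) / dt
        v_mid = 0.5 * (v[1:] + v[:-1])
        power = mass * a * v_mid            # tractive power at the wheels
        e = 0.0
        for p, acc in zip(power, a):
            if p >= 0:                      # traction: grid supplies wheel power
                e += p * dt / eff_traction
            else:                           # braking: recover a decel-dependent share
                e += p * dt * regen_efficiency(acc)
        return e

    print(trip_energy([0, 5, 10, 15, 15, 10, 5, 0]))
    ```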

  6. Electric train energy consumption modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jinghui; Rakha, Hesham A.

    For this paper we develop an electric train energy consumption modeling framework considering instantaneous regenerative braking efficiency in support of a rail simulation system. The model is calibrated with data from Portland, Oregon using an unconstrained non-linear optimization procedure, and validated using data from Chicago, Illinois by comparing model predictions against the National Transit Database (NTD) estimates. The results demonstrate that regenerative braking efficiency varies as an exponential function of the deceleration level, rather than an average constant as assumed in previous studies. The model predictions are demonstrated to be consistent with the NTD estimates, producing prediction errors of 1.87% and -2.31%. The paper demonstrates that energy recovery reduces the overall power consumption by 20% for the tested Chicago route. Furthermore, the paper demonstrates that the proposed modeling approach is able to capture energy consumption differences associated with train, route and operational parameters, and thus is applicable for project-level analysis. The model can be easily implemented in traffic simulation software, used in smartphone applications and eco-transit programs given its fast execution time and easy integration in complex frameworks.

  7. Significance testing of clinical data using virus dynamics models with a Markov chain Monte Carlo method: application to emergence of lamivudine-resistant hepatitis B virus.

    PubMed Central

    Burroughs, N J; Pillay, D; Mutimer, D

    1999-01-01

    Bayesian analysis using a virus dynamics model is demonstrated to facilitate hypothesis testing of patterns in clinical time-series. Our Markov chain Monte Carlo implementation demonstrates that the viraemia time-series observed in two sets of hepatitis B patients on antiviral (lamivudine) therapy, chronic carriers and liver transplant patients, are significantly different, overcoming clinical trial design differences that question the validity of non-parametric tests. We show that lamivudine-resistant mutants grow faster in transplant patients than in chronic carriers, which probably explains the differences in emergence times and failure rates between these two sets of patients. Incorporation of dynamic models into Bayesian parameter analysis is of general applicability in medical statistics. PMID:10643081
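
    The core machinery can be sketched with random-walk Metropolis sampling of a single viral-decline rate; the time series, the one-parameter exponential-decline model, and the flat prior below are invented for illustration and are far simpler than the paper's virus dynamics model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented viraemia time series (log10 copies/mL) under therapy.
    t = np.array([0.0, 7.0, 14.0, 21.0, 28.0])
    y = np.array([8.0, 7.1, 6.3, 5.4, 4.6])

    def log_posterior(c, v0=8.0, sigma=0.2):
        """Exponential viral decline: log10 v(t) = v0 - c*t/ln(10).
        Flat prior on c > 0; Gaussian measurement error."""
        if c <= 0:
            return -np.inf
        pred = v0 - c * t / np.log(10)
        return -0.5 * np.sum((y - pred) ** 2) / sigma ** 2

    # Random-walk Metropolis.
    c, logp = 0.1, log_posterior(0.1)
    samples = []
    for _ in range(20000):
        c_new = c + 0.02 * rng.normal()
        logp_new = log_posterior(c_new)
        if np.log(rng.uniform()) < logp_new - logp:
            c, logp = c_new, logp_new
        samples.append(c)

    post = np.array(samples[5000:])      # discard burn-in
    print(post.mean(), np.percentile(post, [2.5, 97.5]))
    ```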

  8. Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
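
    One listed ingredient can be sketched compactly: a maximum-stress initiation check followed by ply-discount stiffness degradation. The allowables, ply properties, and degradation factor below are placeholders, not values from the implemented user-defined material model.

    ```python
    import numpy as np

    # Hypothetical ply strength allowables (MPa): Xt, Xc, Yt, Yc, S.
    ALLOW = dict(Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0)

    def max_stress_failure(s11, s22, s12):
        """Maximum-stress initiation check; returns the failure modes triggered."""
        modes = []
        if s11 >= ALLOW['Xt'] or s11 <= -ALLOW['Xc']:
            modes.append('fiber')
        if s22 >= ALLOW['Yt'] or s22 <= -ALLOW['Yc']:
            modes.append('matrix')
        if abs(s12) >= ALLOW['S']:
            modes.append('shear')
        return modes

    def degrade(props, modes, factor=1e-3):
        """Ply discounting: knock down stiffness terms tied to each failed mode."""
        out = dict(props)
        if 'fiber' in modes:
            out['E1'] *= factor
        if 'matrix' in modes:
            out['E2'] *= factor
        if 'shear' in modes or 'matrix' in modes:
            out['G12'] *= factor
        return out

    ply = dict(E1=140e3, E2=10e3, G12=5e3)   # moduli in MPa
    modes = max_stress_failure(s11=900.0, s22=60.0, s12=30.0)
    print(modes, degrade(ply, modes))
    ```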

  9. A physiological model for interpretation of arterial spin labeling reactive hyperemia of calf muscles.

    PubMed

    Chen, Hou-Jen; Wright, Graham A

    2017-01-01

    To characterize and interpret arterial spin labeling (ASL) reactive hyperemia of calf muscles for a better understanding of the microcirculation in peripheral arterial disease (PAD), we present a physiological model incorporating oxygen transport, tissue metabolism, and vascular regulation mechanisms. The model demonstrated distinct effects between arterial stenoses and microvascular dysfunction on reactive hyperemia, and indicated a higher sensitivity of 2-minute thigh cuffing to microvascular dysfunction than 5-minute cuffing. The recorded perfusion responses in PAD patients (n = 9) were better differentiated from the normal subjects (n = 7) using the model-based analysis rather than characterization using the apparent peak and time-to-peak of the responses. The analysis results suggested different amounts of microvascular disease within the patient group. Overall, this work demonstrates a novel analysis method and facilitates understanding of the physiology involved in ASL reactive hyperemia. ASL reactive hyperemia with model-based analysis may be used as a noninvasive microvascular assessment in the presence of arterial stenoses, allowing us to look beyond the macrovascular disease in PAD. A subgroup who will have a poor prognosis after revascularization in the patients with critical limb ischemia may be associated with more severe microvascular diseases, which may potentially be identified using ASL reactive hyperemia.

  10. Linking population viability, habitat suitability, and landscape simulation models for conservation planning

    Treesearch

    Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley

    2004-01-01

    Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...

  11. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  12. Using integrated models to minimize environmentally induced wavefront error in optomechanical design and analysis

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate design goal of an optical system subjected to dynamic loads is to minimize system level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  13. Efficiency evaluation with feedback for regional water use and wastewater treatment

    NASA Astrophysics Data System (ADS)

    Hu, Zhineng; Yan, Shiyu; Yao, Liming; Moudi, Mahdi

    2018-07-01

    Clean water is crucial for sustainable economic and social development; however, around the world low water use efficiency and increasing water pollution have become serious problems. To comprehensively evaluate water use and wastewater treatment, this paper integrated bi-level programming (BLP) and Data Envelopment Analysis (DEA) with a feedback variable to deal with poor output, ranking DMUs using a super-efficiency DEA. The proposed model was applied to a case study of 10 cities in the Minjiang River Basin to demonstrate its applicability and effectiveness; it was found that a water system can only be cost-efficient when both the water use and wastewater treatment subsystems are cost-efficient. The comparison analysis demonstrated that the proposed model was more discriminating and stable than traditional DEA models and better improved total water system cost efficiencies than a BLP-DEA model.
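
    For readers unfamiliar with DEA, the sketch below solves the classical input-oriented CCR envelopment LP with scipy; the two-input/one-output data are fabricated, and the paper's feedback variable and bi-level structure are not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Fabricated data: one column per DMU (e.g., city).
    X = np.array([[20., 30., 40., 25.],      # input 1 (e.g., water intake)
                  [ 5., 10.,  8.,  6.]])     # input 2 (e.g., treatment cost)
    Y = np.array([[100., 120., 150., 90.]])  # output (e.g., treated volume)

    def ccr_efficiency(j0):
        """Input-oriented CCR LP for DMU j0:
        min theta s.t. X@lam <= theta*x0, Y@lam >= y0, lam >= 0."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]              # minimize theta
        A_in = np.c_[-X[:, [j0]], X]             # X@lam - theta*x0 <= 0
        A_out = np.c_[np.zeros((s, 1)), -Y]      # -Y@lam <= -y0
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    for j in range(X.shape[1]):
        print(f"DMU {j}: efficiency = {ccr_efficiency(j):.3f}")
    ```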

  14. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.

    PubMed

    Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-04-01

    To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with a narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.
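
    A minimal sketch of the recommended approach, a random-intercept mixed-effects model so that both eyes can be analyzed while accounting for inter-eye correlation, is shown below using statsmodels rather than SAS; the data frame and column names are hypothetical, with effect sizes loosely echoing the CNV example.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)

    # Hypothetical paired-eye data: one CNV eye and one fellow eye per patient.
    n = 200
    patient = np.repeat(np.arange(n), 2)
    cnv = np.tile([1, 0], n)
    b = rng.normal(0, 0.8, n)[patient]   # shared patient effect -> inter-eye correlation
    refraction = 0.15 * cnv + b + rng.normal(0, 0.5, 2 * n)
    df = pd.DataFrame(dict(patient=patient, cnv=cnv, refraction=refraction))

    # Mixed model: the eye is the unit of analysis, the patient the grouping factor.
    fit = smf.mixedlm("refraction ~ cnv", df, groups=df["patient"]).fit()
    print(fit.summary())
    ```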

  15. Turbulence model development and application at Lockheed Fort Worth Company

    NASA Technical Reports Server (NTRS)

    Smith, Brian R.

    1995-01-01

    This viewgraph presentation demonstrates that computationally efficient k-l and k-kl turbulence models have been developed and implemented at Lockheed Fort Worth Company. Many years of experience have been gained applying two equation turbulence models to complex three-dimensional flows for design and analysis.

  16. Ambiguities in model-independent partial-wave analysis

    NASA Astrophysics Data System (ADS)

    Krinner, F.; Greenwald, D.; Ryabchikov, D.; Grube, B.; Paul, S.

    2018-06-01

    Partial-wave analysis is an important tool for analyzing large data sets in hadronic decays of light and heavy mesons. It commonly relies on the isobar model, which assumes multihadron final states originate from successive two-body decays of well-known undisturbed intermediate states. Recently, analyses of heavy-meson decays and diffractively produced states have attempted to overcome the strong model dependences of the isobar model. These analyses have overlooked that model-independent, or freed-isobar, partial-wave analysis can introduce mathematical ambiguities in results. We show how these ambiguities arise and present general techniques for identifying their presence and for correcting for them. We demonstrate these techniques with specific examples in both heavy-meson decay and pion-proton scattering.

  17. X-ray and neutron total scattering analysis of Hy·(Bi0.2Ca0.55Sr0.25)(Ag0.25Na0.75)Nb3O10·xH2O perovskite nanosheet booklets with stacking disorder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metz, Peter; Koch, Robert; Cladek, Bernadette

    Ion-exchanged Aurivillius materials form perovskite nanosheet booklets wherein well-defined bi-periodic sheets, with ~11.5 Å thickness, exhibit extensive stacking disorder. The perovskite layer contents were defined initially using combined synchrotron X-ray and neutron Rietveld refinement of the parent Aurivillius structure. The structure of the subsequently ion-exchanged material, which is disordered in its stacking sequence, is analyzed using both pair distribution function (PDF) analysis and recursive method simulations of the scattered intensity. Combined X-ray and neutron PDF refinement of supercell stacking models demonstrates sensitivity of the PDF to both perpendicular and transverse stacking vector components. Further, hierarchical ensembles of stacking models weighted by a standard normal distribution are demonstrated to improve PDF fit over 1–25 Å. Recursive method simulations of the X-ray scattering profile demonstrate agreement between the real space stacking analysis and more conventional reciprocal space methods. The local structure of the perovskite sheet is demonstrated to relax only slightly from the Aurivillius structure after ion exchange.

  18. 77 FR 47077 - Statement of Organization, Functions, and Delegations of Authority; Office of Planning, Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ...; surveys, research and evaluation methodologies; demonstration testing and model development; synthesis and..., policy and program analysis; surveys, research and evaluation methodologies; demonstration testing and... Organization, Functions, and Delegations of Authority; Office of Planning, Research and Evaluation AGENCY...

  19. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
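
    The flavor of such a loop can be sketched as follows: a kriging (Gaussian-process) surrogate is refined with the U learning function until it confidently predicts the sign of the performance function over a Monte Carlo population. The toy performance function and thresholds below are illustrative, and the interval/evidence-theory layer of the paper is omitted.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(2)

    def g(x):
        # Toy performance function: failure when g < 0.
        return x[:, 0] ** 2 + x[:, 1] - 3.0

    X_mc = rng.normal(size=(10000, 2))           # Monte Carlo population
    idx = rng.choice(len(X_mc), 12, replace=False)
    X_train, y_train = X_mc[idx], g(X_mc[idx])

    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    for _ in range(30):                          # active-learning iterations
        gp.fit(X_train, y_train)
        mu, sd = gp.predict(X_mc, return_std=True)
        U = np.abs(mu) / np.maximum(sd, 1e-12)   # U-function: sign confidence
        if U.min() > 2.0:                        # all signs known with high confidence
            break
        j = int(np.argmin(U))                    # most ambiguous point
        X_train = np.vstack([X_train, X_mc[[j]]])
        y_train = np.append(y_train, g(X_mc[[j]]))

    print("failure probability estimate:", float(np.mean(gp.predict(X_mc) < 0.0)))
    ```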

  20. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held from August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  1. Estimating animal resource selection from telemetry data using point process models

    USGS Publications Warehouse

    Johnson, Devin S.; Hooten, Mevin B.; Kuhn, Carey E.

    2013-01-01

    To demonstrate the analysis of telemetry data with the point process approach, we analysed a data set of telemetry locations from northern fur seals (Callorhinus ursinus) in the Pribilof Islands, Alaska. Both a space–time and an aggregated space-only model were fitted. At the individual level, the space–time analysis showed little selection relative to the habitat covariates. However, at the study area level, the space-only model showed strong selection relative to the covariates.

  2. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  3. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  4. A Complex Network Approach to Distributional Semantic Models

    PubMed Central

    Utsumi, Akira

    2015-01-01

    A number of studies on network analysis have focused on language networks based on free word association, which reflects human lexical knowledge, and have demonstrated the small-world and scale-free properties in the word association network. Nevertheless, there have been very few attempts at applying network analysis to distributional semantic models, despite the fact that these models have been studied extensively as computational or cognitive models of human lexical knowledge. In this paper, we analyze three network properties, namely, small-world, scale-free, and hierarchical properties, of semantic networks created by distributional semantic models. We demonstrate that the created networks generally exhibit the same properties as word association networks. In particular, we show that the distribution of the number of connections in these networks follows the truncated power law, which is also observed in an association network. This indicates that distributional semantic models can provide a plausible model of lexical knowledge. Additionally, the observed differences in the network properties of various implementations of distributional semantic models are consistently explained or predicted by considering the intrinsic semantic features of a word-context matrix and the functions of matrix weighting and smoothing. Furthermore, to simulate a semantic network with the observed network properties, we propose a new growing network model based on the model of Steyvers and Tenenbaum. The idea underlying the proposed model is that both preferential and random attachments are required to reflect different types of semantic relations in the network growth process. We demonstrate that this model provides a better explanation of network behaviors generated by distributional semantic models. PMID:26295940
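
    The basic diagnostics involved can be sketched on a toy graph: average clustering for small-world structure and the log-log tail slope of the degree distribution as a crude power-law check. A real analysis would build the network from a distributional semantic space (e.g., by connecting nearest neighbours) rather than the generator used here.

    ```python
    import networkx as nx
    import numpy as np

    # Toy stand-in for a semantic network; a real analysis would derive the
    # graph from nearest neighbours in a distributional semantic space.
    G = nx.barabasi_albert_graph(2000, 3, seed=0)

    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    deg = np.array([d for _, d in G.degree()])

    # Log-log slope of the degree distribution tail as a crude power-law
    # check; a truncated power law would bend away at high degree.
    vals, counts = np.unique(deg, return_counts=True)
    mask = vals >= 3
    slope = np.polyfit(np.log(vals[mask]), np.log(counts[mask]), 1)[0]

    print(f"clustering C = {C:.3f}, mean path length L = {L:.2f}, tail slope ~ {slope:.2f}")
    ```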

  5. Identification of visual evoked response parameters sensitive to pilot mental state

    NASA Technical Reports Server (NTRS)

    Zacharias, G. L.

    1988-01-01

    Systems analysis techniques were developed and demonstrated for modeling the electroencephalographic (EEG) steady state visual evoked response (ssVER), for use in EEG data compression and as an indicator of mental workload. The study focused on steady state frequency domain stimulation and response analysis, implemented with a sum-of-sines (SOS) stimulus generator and an off-line describing function response analyzer. Three major tasks were conducted: (1) VER related systems identification material was reviewed; (2) Software for experiment control and data analysis was developed and implemented; and (3) ssVER identification and modeling was demonstrated, via a mental loading experiment. It was found that a systems approach to ssVER functional modeling can serve as the basis for eventual development of a mental workload indicator. The review showed how transient visual evoked response (tVER) and ssVER research are related at the functional level, the software development showed how systems techniques can be used for ssVER characterization, and the pilot experiment showed how a simple model can be used to capture the basic dynamic response of the ssVER, under varying loads.

  6. The Use of Mouse Models of Breast Cancer and Quantitative Image Analysis to Evaluate Hormone Receptor Antigenicity after Microwave-assisted Formalin Fixation

    PubMed Central

    Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.

    2014-01-01

    Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322

  7. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  8. Modelling and analysis of the sugar cataract development process using stochastic hybrid systems.

    PubMed

    Riley, D; Koutsoukos, X; Riley, K

    2009-05-01

    Modelling and analysis of biochemical systems such as sugar cataract development (SCD) are critical because they can provide new insights into systems, which cannot be easily tested with experiments; however, they are challenging problems due to the highly coupled chemical reactions that are involved. The authors present a stochastic hybrid system (SHS) framework for modelling biochemical systems and demonstrate the approach for the SCD process. A novel feature of the framework is that it allows modelling the effect of drug treatment on the system dynamics. The authors validate the three sugar cataract models by comparing trajectories computed by two simulation algorithms. Further, the authors present a probabilistic verification method for computing the probability of sugar cataract formation for different chemical concentrations using safety and reachability analysis methods for SHSs. The verification method employs dynamic programming based on a discretisation of the state space and therefore suffers from the curse of dimensionality. To analyse the SCD process, a parallel dynamic programming implementation that can handle large, realistic systems was developed. Although scalability is a limiting factor, this work demonstrates that the proposed method is feasible for realistic biochemical systems.

  9. Finite element modeling of truss structures with frequency-dependent material damping

    NASA Technical Reports Server (NTRS)

    Lesieutre, George A.

    1991-01-01

    A physically motivated modelling technique for structural dynamic analysis that accommodates frequency dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (ATF) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The ATF method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.

  10. Functional recognition imaging using artificial neural networks: applications to rapid cellular identification via broadband electromechanical response

    NASA Astrophysics Data System (ADS)

    Nikiforov, M. P.; Reukov, V. V.; Thompson, G. L.; Vertegel, A. A.; Guo, S.; Kalinin, S. V.; Jesse, S.

    2009-10-01

    Functional recognition imaging in scanning probe microscopy (SPM) using artificial neural network identification is demonstrated. This approach utilizes statistical analysis of complex SPM responses at a single spatial location to identify the target behavior, which is reminiscent of associative thinking in the human brain, obviating the need for analytical models. We demonstrate, as an example of recognition imaging, rapid identification of cellular organisms using the difference in electromechanical activity over a broad frequency range. Single-pixel identification of model Micrococcus lysodeikticus and Pseudomonas fluorescens bacteria is achieved, demonstrating the viability of the method.

  11. Brain MRI analysis for Alzheimer's disease diagnosis using an ensemble system of deep convolutional neural networks.

    PubMed

    Islam, Jyoti; Zhang, Yanqing

    2018-05-31

    Alzheimer's disease is an incurable, progressive neurological brain disorder. Earlier detection of Alzheimer's disease can help with proper treatment and prevent brain tissue damage. Several statistical and machine learning models have been exploited by researchers for Alzheimer's disease diagnosis. Analyzing magnetic resonance imaging (MRI) is a common practice for Alzheimer's disease diagnosis in clinical research. Detection of Alzheimer's disease is challenging due to the similarity between Alzheimer's disease MRI data and standard healthy MRI data of older people. Recently, advanced deep learning techniques have successfully demonstrated human-level performance in numerous fields including medical image analysis. We propose a deep convolutional neural network for Alzheimer's disease diagnosis using brain MRI data analysis. While most of the existing approaches perform binary classification, our model can identify different stages of Alzheimer's disease and obtains superior performance for early-stage diagnosis. We conducted ample experiments to demonstrate that our proposed model outperformed comparative baselines on the Open Access Series of Imaging Studies dataset.
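
    As a minimal sketch of the general model family (not the authors' ensemble), a tiny 3D convolutional network for multi-class staging from MRI volumes could look like the following; the architecture, input shape, and class count are illustrative.

    ```python
    import torch
    import torch.nn as nn

    class TinyCNN(nn.Module):
        """Illustrative 3D CNN for multi-class staging from MRI volumes."""
        def __init__(self, n_classes=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
                nn.Conv3d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            )
            self.head = nn.Sequential(nn.Flatten(), nn.LazyLinear(n_classes))

        def forward(self, x):                    # x: (batch, 1, D, H, W)
            return self.head(self.features(x))

    logits = TinyCNN()(torch.randn(2, 1, 32, 32, 32))
    print(logits.shape)                          # torch.Size([2, 4])
    ```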

  12. Characterization of bone microstructure using photoacoustic spectrum analysis

    NASA Astrophysics Data System (ADS)

    Feng, Ting; Kozloff, Kenneth M.; Xu, Guan; Du, Sidan; Yuan, Jie; Deng, Cheri X.; Wang, Xueding

    2015-03-01

    Osteoporosis is a progressive bone disease that is characterized by a decrease in bone mass and deterioration in microarchitecture. This study investigates the feasibility of characterizing bone microstructure by analyzing the frequency spectrum of the photoacoustic signals from the bone. Modeling and numerical simulation of photoacoustic signals and their frequency-domain analysis were performed on trabecular bones with different mineral densities. The resulting quasilinear photoacoustic spectra were fit by linear regression, from which the spectral parameter slope can be quantified. The modeling demonstrates that, at an optical wavelength of 685 nm, bone specimens with lower mineral densities exhibit a higher slope. A preliminary experiment on osteoporotic rat tibia bones with different mineral contents has also been conducted. The finding from the experiment is in good agreement with the modeling, both demonstrating that frequency-domain analysis of photoacoustic signals can provide objective assessment of bone microstructure and deterioration. Considering that photoacoustic measurement is non-ionizing, non-invasive, and has sufficient penetration in both calcified and noncalcified tissues, this new technology holds unique potential for clinical translation.
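
    The described frequency-domain analysis reduces to fitting a line over the usable band of the signal's power spectrum; in the sketch below, the synthetic RF signal, sampling rate, and band limits are placeholder assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs = 50e6                                    # sampling rate (Hz), assumed
    t = np.arange(2048) / fs
    signal = rng.normal(size=t.size) * np.exp(-t * 2e5)   # synthetic PA RF trace

    spec = np.fft.rfft(signal * np.hanning(t.size))
    freq = np.fft.rfftfreq(t.size, 1 / fs)
    power_db = 20 * np.log10(np.abs(spec) + 1e-12)

    band = (freq > 1e6) & (freq < 10e6)          # usable bandwidth, assumed
    slope, intercept = np.polyfit(freq[band] / 1e6, power_db[band], 1)
    print(f"spectral slope = {slope:.2f} dB/MHz")
    ```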

  13. Multi-Scale Computational Modeling of Two-Phased Metal Using GMC Method

    NASA Technical Reports Server (NTRS)

    Moghaddam, Masoud Ghorbani; Achuthan, A.; Bednarcyk, B. A.; Arnold, S. M.; Pineda, E. J.

    2014-01-01

    A multi-scale computational model for determining the plastic behavior of two-phased CMSX-4 Ni-based superalloys is developed on a finite element analysis (FEA) framework employing a crystal plasticity constitutive model that can capture the microstructural scale stress field. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. First, stand-alone GMC is validated by analyzing a repeating unit cell (RUC) as a two-phased sample with 72.9% volume fraction of gamma'-precipitate in the gamma-matrix phase and comparing the results with those predicted by FEA models incorporating the same crystal plasticity constitutive model. The global stress-strain behavior and the local field quantity distributions predicted by GMC demonstrated good agreement with FEA. High computational savings, at the expense of some accuracy in the components of local tensor field quantities, were obtained with GMC. Finally, the capability of the developed multi-scale model linking FEA and GMC to solve real-life-sized structures is demonstrated by analyzing an engine disc component and determining the microstructural scale details of the field quantities.

  14. A multi-level simulation platform of natural gas internal reforming solid oxide fuel cell-gas turbine hybrid generation system - Part II. Balancing units model library and system simulation

    NASA Astrophysics Data System (ADS)

    Bao, Cheng; Cai, Ningsheng; Croiset, Eric

    2011-10-01

    Following our integrated hierarchical modeling framework of a natural gas internal reforming solid oxide fuel cell (IRSOFC), this paper first introduces the model libraries of the main balancing units, including some state-of-the-art achievements and our specific work. Based on gPROMS programming code, flexible configuration and modular design are fully realized by specifying graphically all unit models in each level. Via comparison with the steady-state experimental data of the Siemens-Westinghouse demonstration system, the in-house multi-level SOFC-gas turbine (GT) simulation platform is validated to be more accurate than the advanced power system analysis tool (APSAT). Moreover, some units of the demonstration system are reverse-designed for analysis of a typical part-load transient process. The framework of distributed and dynamic modeling in most of the units is significant for the future development of control strategies.

  15. Mathematical Analysis for Non-reciprocal-interaction-based Model of Collective Behavior

    NASA Astrophysics Data System (ADS)

    Kano, Takeshi; Osuka, Koichi; Kawakatsu, Toshihiro; Ishiguro, Akio

    2017-12-01

    In many natural and social systems, collective behaviors emerge as a consequence of non-reciprocal interaction between their constituents. As a first step towards understanding the core principle that underlies these phenomena, we previously proposed a minimal model of collective behavior based on non-reciprocal interactions by drawing inspiration from friendship formation in human society, and demonstrated via simulations that various non-trivial patterns emerge by changing parameters. In this study, a mathematical analysis of the proposed model wherein the system size is small is performed. Through the analysis, the mechanism of the transition between several patterns is elucidated.

  16. Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Morelli, Eugene A.

    2014-01-01

    Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
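
    The abstract does not spell out the estimator; as one plausible building block for onboard identification, here is a recursive least-squares update with a forgetting factor, a standard streaming technique that stands in for, rather than reproduces, the fuzzy-logic and multivariate orthogonal function methods used in the work.

    ```python
    import numpy as np

    def rls_update(theta, P, x, y, lam=0.995):
        """One recursive least-squares step with forgetting factor lam.
        theta: (n,1) coefficients; P: (n,n) covariance; x: (n,) regressors."""
        x = x.reshape(-1, 1)
        k = P @ x / (lam + float(x.T @ P @ x))   # gain vector
        err = y - float(x.T @ theta)             # prediction residual
        theta = theta + k * err
        P = (P - k @ x.T @ P) / lam
        return theta, P

    # Toy usage: identify y = 2*a + 0.5*a**2 from streaming samples.
    rng = np.random.default_rng(5)
    theta, P = np.zeros((2, 1)), np.eye(2) * 1e3
    for _ in range(500):
        a = rng.uniform(-1, 1)
        x = np.array([a, a ** 2])
        y = 2.0 * a + 0.5 * a ** 2 + 0.01 * rng.normal()
        theta, P = rls_update(theta, P, x, y)
    print(theta.ravel())                         # approximately [2.0, 0.5]
    ```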

  17. Sobol' sensitivity analysis of NAPL-contaminated aquifer remediation process based on multiple surrogates

    NASA Astrophysics Data System (ADS)

    Luo, Jiannan; Lu, Wenxi

    2014-06-01

    Sobol' sensitivity analyses based on different surrogates were performed on a trichloroethylene (TCE)-contaminated aquifer to assess the sensitivity of the design variables of remediation duration, surfactant concentration, and injection rates at four wells to remediation efficiency. First, the surrogate models of a multi-phase flow simulation model were constructed by applying radial basis function artificial neural network (RBFANN) and Kriging methods, and the two models were then compared. Based on the developed surrogate models, the Sobol' method was used to calculate the sensitivity indices of the design variables which affect the remediation efficiency. The coefficient of determination (R2) and the mean square error (MSE) of these two surrogate models demonstrated that both models had acceptable approximation accuracy; furthermore, the approximation accuracy of the Kriging model was slightly better than that of the RBFANN model. Sobol' sensitivity analysis results demonstrated that the remediation duration was the most important variable influencing remediation efficiency, followed by rates of injection at wells 1 and 3, while rates of injection at wells 2 and 4 and the surfactant concentration had negligible influence on remediation efficiency. In addition, high-order sensitivity indices were all smaller than 0.01, which indicates that interaction effects of these six factors were practically insignificant. The proposed surrogate-based Sobol' sensitivity analysis is an effective tool for calculating sensitivity indices, because it shows the relative contribution of the design variables (individuals and interactions) to the output performance variability with a limited number of runs of a computationally expensive simulation model. The sensitivity analysis results lay a foundation for optimization of the groundwater remediation process.
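
    A minimal sketch of Sobol' index estimation on a surrogate, using the SALib package, is shown below; the variable bounds and the cheap analytic stand-in for the Kriging/RBFANN surrogate are illustrative only.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # Six design variables as in the study; bounds are illustrative.
    problem = {
        "num_vars": 6,
        "names": ["duration", "conc", "q1", "q2", "q3", "q4"],
        "bounds": [[30, 180], [5, 50], [0, 10], [0, 10], [0, 10], [0, 10]],
    }

    def surrogate(X):
        # Cheap analytic stand-in for the Kriging/RBFANN surrogate of the
        # multi-phase flow model: duration dominates, q1 and q3 matter.
        return 0.6 * X[:, 0] + 2.0 * X[:, 2] + 1.5 * X[:, 4] + 0.01 * X[:, 1]

    X = saltelli.sample(problem, 1024)    # N * (2D + 2) parameter sets
    Y = surrogate(X)
    Si = sobol.analyze(problem, Y)
    print(dict(zip(problem["names"], np.round(Si["S1"], 3))))
    ```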

  18. Analysis of Composite Skin-Stiffener Debond Specimens Using Volume Elements and a Shell/3D Modeling Technique

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Minguet, Pierre J.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The debonding of a skin/stringer specimen subjected to tension was studied using three-dimensional volume element modeling and computational fracture mechanics. Mixed mode strain energy release rates were calculated from finite element results using the virtual crack closure technique. The simulations revealed an increase in total energy release rate in the immediate vicinity of the free edges of the specimen. Correlation of the computed mixed-mode strain energy release rates along the delamination front contour with a two-dimensional mixed-mode interlaminar fracture criterion suggested that in spite of peak total energy release rates at the free edge the delamination would not advance at the edges first. The qualitative prediction of the shape of the delamination front was confirmed by X-ray photographs of a specimen taken during testing. The good correlation between prediction based on analysis and experiment demonstrated the efficiency of a mixed-mode failure analysis for the investigation of skin/stiffener separation due to delamination in the adherents. The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to three-point bending is also demonstrated. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front was used to capture the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlations of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherents.

  19. Remotely piloted vehicle: Application of the GRASP analysis method

    NASA Technical Reports Server (NTRS)

    Andre, W. L.; Morris, J. B.

    1981-01-01

    The application of General Reliability Analysis Simulation Program (GRASP) to the remotely piloted vehicle (RPV) system is discussed. The model simulates the field operation of the RPV system. By using individual component reliabilities, the overall reliability of the RPV system is determined. The results of the simulations are given in operational days. The model represented is only a basis from which more detailed work could progress. The RPV system in this model is based on preliminary specifications and estimated values. The use of GRASP from basic system definition, to model input, and to model verification is demonstrated.

  20. DigR: a generic model and its open source simulation software to mimic three-dimensional root-system architecture diversity.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Griffon, Sébastien; Jourdan, Christophe

    2018-04-18

    Many studies exist in the literature dealing with mathematical representations of root systems, categorized, for example, as pure structure description, partial derivative equations or functional-structural plant models. However, in these studies, root architecture modelling has seldom been carried out at the organ level with the inclusion of environmental influences that can be integrated into a whole plant characterization. We have conducted a multidisciplinary study on root systems including field observations, architectural analysis, and formal and mathematical modelling. This integrative and coherent approach leads to a generic model (DigR) and its software simulator. Architecture analysis applied to root systems helps at root type classification and architectural unit design for each species. Roots belonging to a particular type share dynamic and morphological characteristics which consist of topological and geometric features. The DigR simulator is integrated into the Xplo environment, with a user interface to input parameter values and make output ready for dynamic 3-D visualization, statistical analysis and saving to standard formats. DigR is simulated in a quasi-parallel computing algorithm and may be used either as a standalone tool or integrated into other simulation platforms. The software is open-source and free to download at http://amapstudio.cirad.fr/soft/xplo/download. DigR is based on three key points: (1) a root-system architectural analysis, (2) root type classification and modelling and (3) a restricted set of 23 root type parameters with flexible values indexed in terms of root position. Genericity and botanical accuracy of the model is demonstrated for growth, branching, mortality and reiteration processes, and for different root architectures. Plugin examples demonstrate the model's versatility at simulating plastic responses to environmental constraints. Outputs of the model include diverse root system structures such as tap-root, fasciculate, tuberous, nodulated and clustered root systems. DigR is based on plant architecture analysis which leads to specific root type classification and organization that are directly linked to field measurements. The open source simulator of the model has been included within a friendly user environment. DigR accuracy and versatility are demonstrated for growth simulations of complex root systems for both annual and perennial plants.

  1. Analysis of the stochastic excitability in the flow chemical reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  2. UV spectroscopy including ISM line absorption of the exciting star of Abell 35

    NASA Astrophysics Data System (ADS)

    Ziegler, M.; Rauch, T.; Werner, K.; Kruk, J. W.

    Reliable spectral analysis that is based on high-resolution UV observations requires an adequate, simultaneous modeling of the interstellar line absorption and reddening. In the case of the central star of the planetary nebula Abell 35, BD-22 3467, we demonstrate our current standard spectral-analysis method, which is based on the Tübingen NLTE Model-Atmosphere Package (TMAP). We present an ongoing spectral analysis of FUSE and HST/STIS observations of BD-22 3467.

  3. Analysis of the stochastic excitability in the flow chemical reactor

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2015-11-01

    A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  4. The business value and cost-effectiveness of genomic medicine.

    PubMed

    Crawford, James M; Aspinall, Mara G

    2012-05-01

    Genomic medicine offers the promise of more effective diagnosis and treatment of human diseases. Genome sequencing early in the course of disease may enable more timely and informed intervention, with reduced healthcare costs and improved long-term outcomes. However, genomic medicine strains current models for demonstrating value, challenging efforts to achieve fair payment for services delivered, both for laboratory diagnostics and for use of molecular information in clinical management. Current models of healthcare reform stipulate that care must be delivered at equal or lower cost, with better patient and population outcomes. To achieve demonstrated value, genomic medicine must overcome many uncertainties: the clinical relevance of genomic variation; potential variation in technical performance and/or computational analysis; management of massive information sets; and must have available clinical interventions that can be informed by genomic analysis, so as to attain more favorable cost management of healthcare delivery and demonstrate improvements in cost-effectiveness.

  5. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    PubMed

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.
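
    As a small illustration of the exploratory step in such a workflow (not the paper's confirmatory bifactor modeling), an exploratory factor analysis with varimax rotation can be run on item-level data as sketched below; the simulated two-factor item responses are purely illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(6)

    # Simulated item responses driven by two latent factors.
    n = 500
    f = rng.normal(size=(n, 2))
    loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                         [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
    items = f @ loadings.T + 0.5 * rng.normal(size=(n, 6))

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
    print(np.round(fa.components_.T, 2))   # estimated loadings, items x factors
    ```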

  6. Pelagic Habitat Analysis Module (PHAM) for GIS Based Fisheries Decision Support

    NASA Technical Reports Server (NTRS)

    Kiefer, D. A.; Armstrong, Edward M.; Harrison, D. P.; Hinton, M. G.; Kohin, S.; Snyder, S.; O'Brien, F. J.

    2011-01-01

    We have assembled a system that integrates satellite and model output with fisheries data. We have developed tools that allow analysis of the interaction between species and key environmental variables, and we have demonstrated the capacity to accurately map habitat of the thresher sharks Alopias vulpinus and A. pelagicus. Their seasonal migration along the California Current is at least partly driven by the seasonal migration of sardine, key prey of the sharks.

  7. Incorporating principal component analysis into air quality model evaluation

    EPA Science Inventory

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...

  8. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  9. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
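
    The two measures named above are straightforward to compute on a thresholded image. Below is a minimal numpy sketch of box-counting fractal dimension and gliding-box lacunarity; the random binary image stands in for a thresholded collagen micrograph, and none of this is the authors' pipeline.

    ```python
    # Minimal box-counting fractal dimension and gliding-box lacunarity for a
    # binary image (illustrative sketch, not the authors' pipeline).
    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view

    def box_count(img, size):
        """Count size x size boxes containing at least one foreground pixel."""
        h, w = img.shape
        hh, ww = h - h % size, w - w % size  # trim to a multiple of the box size
        blocks = img[:hh, :ww].reshape(hh // size, size, ww // size, size)
        return np.count_nonzero(blocks.any(axis=(1, 3)))

    def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
        counts = [box_count(img, s) for s in sizes]
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope

    def lacunarity(img, size):
        """Gliding-box lacunarity: second moment over squared first moment of box mass."""
        masses = sliding_window_view(img, (size, size)).sum(axis=(2, 3)).ravel()
        return masses.var() / masses.mean() ** 2 + 1.0

    img = np.random.rand(256, 256) > 0.7   # stand-in for a thresholded collagen image
    print(fractal_dimension(img), lacunarity(img, 8))
    ```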

  10. Psychometric properties of the college survey for students with brain injury: individuals with and without traumatic brain injury.

    PubMed

    Kennedy, Mary R T; Krause, Miriam O; O'Brien, Katy H

    2014-01-01

    The psychometric properties of the college challenges sub-set from The College Survey for Students with Brain Injury (CSS-BI) were investigated with adults with and without traumatic brain injury (TBI). Adults with and without TBI completed the CSS-BI. A sub-set of participants with TBI were interviewed, intentional and convergent validity were investigated, and the internal structure of the college challenges was analysed with exploratory factor analysis/principal component analysis. Respondents with TBI understood the items describing college challenges with evidence of intentional validity. More individuals with TBI than controls endorsed eight of the 13 college challenges. Those who reported more health issues endorsed more college challenges, demonstrating preliminary convergent validity. Cronbach's alphas of >0.85 demonstrated acceptable internal reliability. Factor analysis revealed a four-factor model for those with TBI: studying and learning (Factor 1), time management and organization (Factor 2), social (Factor 3) and nervousness/anxiety (Factor 4). This model explained 72% and 69% of the variance for those with and without TBI, respectively. The college challenges sub-set from the CSS-BI identifies challenges that individuals with TBI face when going to college. Some challenges were related to two factors in the model, demonstrating the inter-connections of these experiences.

  11. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2018-02-01

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.

  12. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry. © 2017 Society for Risk Analysis.
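
    A generic way to capture the source-to-source variability described above is a hierarchical beta-binomial model, sketched below with the PyMC library. The counts and priors are fabricated for illustration; this shows the model class, not the authors' specific model.

    ```python
    # Hedged sketch: hierarchical beta-binomial for failure probabilities pooled
    # across data sources (generic illustration of source-to-source variability,
    # not the authors' model). Assumes the PyMC library.
    import numpy as np
    import pymc as pm

    n = np.array([120, 85, 40, 200])   # demands per source (fake data)
    k = np.array([3, 1, 2, 4])         # failures per source (fake data)

    with pm.Model():
        # Hyperpriors let each source inform, but not dictate, the others.
        a = pm.Gamma("a", alpha=1.0, beta=1.0)
        b = pm.Gamma("b", alpha=1.0, beta=1.0)
        p = pm.Beta("p", alpha=a, beta=b, shape=len(n))  # per-source failure prob
        pm.Binomial("obs", n=n, p=p, observed=k)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    print(idata.posterior["p"].mean(dim=("chain", "draw")).values)
    ```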

  13. Clonal analysis of synovial fluid stem cells to characterize and identify stable mesenchymal stromal cell/mesenchymal progenitor cell phenotypes in a porcine model: a cell source with enhanced commitment to the chondrogenic lineage.

    PubMed

    Ando, Wataru; Kutcher, Josh J; Krawetz, Roman; Sen, Arindom; Nakamura, Norimasa; Frank, Cyril B; Hart, David A

    2014-06-01

    Previous studies have demonstrated that porcine synovial membrane stem cells can adhere to a cartilage defect in vivo through the use of a tissue-engineered construct approach. To optimize this model, we wanted to compare the effectiveness of tissue sources to determine whether porcine synovial fluid, synovial membrane, bone marrow and skin sources replicate our understanding of synovial fluid mesenchymal stromal cells or mesenchymal progenitor cells from humans, both at the population level and the single-cell level. Synovial fluid clones were subsequently isolated and characterized to identify cells with a highly characterized optimal phenotype. The chondrogenic, osteogenic and adipogenic potentials were assessed in vitro for skin, bone marrow, adipose, synovial fluid and synovial membrane-derived stem cells. Synovial fluid cells then underwent limiting dilution analysis to isolate single clonal populations. These clonal populations were assessed for proliferative and differentiation potential by use of standardized protocols. Porcine-derived cells demonstrated the same relationship between cell sources as that demonstrated previously for humans, suggesting that the pig may be an ideal preclinical animal model. Synovial fluid cells demonstrated the highest chondrogenic potential, which was further characterized, demonstrating the existence of a unique clonal phenotype with enhanced chondrogenic potential. Porcine stem cells demonstrate characteristics similar to those of human-derived mesenchymal stromal cells from the same sources. Synovial fluid-derived stem cells contain an inherent phenotype that may be optimal for cartilage repair. This must be more fully investigated for future use in the in vivo tissue-engineered construct approach in this physiologically relevant preclinical porcine model. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  14. Nonlinear dynamic mechanism of vocal tremor from voice analysis and model simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Jiang, Jack J.

    2008-09-01

    Nonlinear dynamic analysis and model simulations are used to study the nonlinear dynamic characteristics of vocal folds with vocal tremor, which can typically be characterized by low-frequency modulation and aperiodicity. Tremor voices from patients with disorders such as paresis, Parkinson's disease, hyperfunction, and adductor spasmodic dysphonia show low-dimensional characteristics, differing from random noise. Correlation dimension analysis statistically distinguishes tremor voices from normal voices. Furthermore, a nonlinear tremor model is proposed to study the vibrations of the vocal folds with vocal tremor. Fractal dimensions and positive Lyapunov exponents demonstrate the evidence of chaos in the tremor model, where amplitude and frequency play important roles in governing vocal fold dynamics. Nonlinear dynamic voice analysis and vocal fold modeling may provide a useful set of tools for understanding the dynamic mechanism of vocal tremor in patients with laryngeal diseases.

  15. Sensitivity analysis of automatic flight control systems using singular value concepts

    NASA Technical Reports Server (NTRS)

    Herrera-Vaillard, A.; Paduano, J.; Downing, D.

    1985-01-01

    A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
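
    The key analytic fact behind this kind of sensitivity analysis is that, for a distinct singular value, the gradient with respect to the matrix entries is the outer product of the corresponding singular vectors. A short numpy check with illustrative numbers (not the paper's aircraft model):

    ```python
    # Sketch: for a distinct singular value sigma_i of A, d(sigma_i)/dA = u_i v_i^T.
    # Verified here against a finite difference; illustrative numbers only.
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 4))            # stand-in for a closed-loop system matrix
    U, s, Vt = np.linalg.svd(A)

    i = len(s) - 1                         # smallest singular value: the usual
    grad = np.outer(U[:, i], Vt[i, :])     # robustness margin in multivariable analysis

    # Finite-difference check on one entry of A.
    eps = 1e-7
    A2 = A.copy()
    A2[2, 3] += eps
    fd = (np.linalg.svd(A2, compute_uv=False)[i] - s[i]) / eps
    print(grad[2, 3], fd)                  # should agree to ~1e-6
    ```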

  16. Infinite von Mises-Fisher Mixture Modeling of Whole Brain fMRI Data.

    PubMed

    Røge, Rasmus E; Madsen, Kristoffer H; Schmidt, Mikkel N; Mørup, Morten

    2017-10-01

    Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises-Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians subsequently normalized. Thus, when performing model selection, the two models are not in agreement. Analyzing multisubject whole brain resting-state fMRI data from healthy adult subjects, we find that the vMF mixture model is considerably more reliable than the gaussian mixture model when comparing solutions across models trained on different groups of subjects, and again we find that the two models disagree on the optimal number of components. The analysis indicates that the fMRI data support more than a thousand clusters, and we confirm this is not a result of overfitting by demonstrating better prediction on data from held-out subjects. Our results highlight the utility of using directional statistics to model standardized fMRI data and demonstrate that whole brain segmentation of fMRI data requires a very large number of functional units in order to adequately account for the discernible statistical patterns in the data.

  17. Model based inversion of ultrasound data in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R. A.

    2018-04-01

    Work is reported on model-based defect characterization in CFRP composites. The work utilizes computational models of ultrasound interaction with defects in composites, to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of defect properties from analysis of measured ultrasound signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing multi-ply impact-induced delamination, in laminates displaying irregular surface geometry (roughness), as well as internal elastic heterogeneity (varying fiber density, porosity). Inversion of ultrasound data is demonstrated showing the quantitative extraction of delamination geometry and surface transmissivity. Additionally, data inversion is demonstrated for determination of surface roughness and internal heterogeneity, and the influence of these features on delamination characterization is examined. Estimation of porosity volume fraction is demonstrated when internal heterogeneity is attributed to porosity.

  18. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data

    PubMed Central

    Ying, Gui-shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-01-01

    Purpose To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. Methods We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field data in the elderly. Results When refractive errors from both eyes were analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI −0.03 to 0.32D, P=0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with a narrower 95% CI (0.01 to 0.28D, P=0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller P-values, while analysis of the worse eye provided larger P-values than mixed effects models and marginal models. Conclusion In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision. PMID:28102741
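
    The tutorial demonstrates these models in SAS; the same two approaches can be sketched in Python with statsmodels. The column names and simulated effect below are illustrative, not the study's data.

    ```python
    # Sketch of the two recommended approaches with statsmodels (illustrative
    # column names and simulated data; not the tutorial's SAS code).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200                                   # patients, two eyes each
    subj = np.repeat(np.arange(n), 2)
    cnv = np.tile([1, 0], n)                  # CNV eye vs fellow eye
    refraction = (0.15 * cnv + rng.normal(size=n)[subj]       # shared person effect
                  + rng.normal(scale=0.5, size=2 * n))         # eye-level noise
    df = pd.DataFrame({"subj": subj, "cnv": cnv, "refraction": refraction})

    # Mixed-effects model: a random intercept per patient absorbs inter-eye correlation.
    mixed = smf.mixedlm("refraction ~ cnv", df, groups=df["subj"]).fit()
    print(mixed.params["cnv"], mixed.bse["cnv"])

    # Marginal (GEE) model with an exchangeable working correlation.
    gee = smf.gee("refraction ~ cnv", groups="subj", data=df,
                  cov_struct=sm.cov_struct.Exchangeable(),
                  family=sm.families.Gaussian()).fit()
    print(gee.params["cnv"], gee.bse["cnv"])
    ```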

  19. Simulation technique for modeling flow on floodplains and in coastal wetlands

    USGS Publications Warehouse

    Schaffranek, Raymond W.; Baltzer, Robert A.

    1988-01-01

    The system design is premised on a proven, areal two-dimensional, finite-difference flow/transport model which is supported by an operational set of computer programs for input data management and model output interpretation. The purposes of the project are (1) to demonstrate the utility of the model for providing useful highway design information, (2) to develop guidelines and procedures for using the simulation system for evaluation, analysis, and optimal design of highway crossings of floodplain and coastal wetland areas, and (3) to identify improvements which can be effected in the simulation system to better serve the needs of highway design engineers. Two case study model implementations, being conducted to demonstrate the simulation system and modeling procedure, are presented and discussed briefly.

  20. Application of the Shell/3D Modeling Technique for the Analysis of Skin-Stiffener Debond Specimens

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; O'Brien, T. Kevin; Minguet, Pierre J.

    2002-01-01

    The application of a shell/3D modeling technique for the simulation of skin/stringer debond in a specimen subjected to three-point bending is demonstrated. The global structure was modeled with shell elements. A local three-dimensional model, extending to about three specimen thicknesses on either side of the delamination front, was used to capture the details of the damaged section. Computed total strain energy release rates and mixed-mode ratios obtained from shell/3D simulations were in good agreement with results obtained from full solid models. The good correlation of the results demonstrated the effectiveness of the shell/3D modeling technique for the investigation of skin/stiffener separation due to delamination in the adherends.

  1. Bayesian Finite Mixtures for Nonlinear Modeling of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    A Bayesian approach for finding latent classes in data is discussed. The approach uses finite mixture models to describe the underlying structure in the data and demonstrates that the possibility of using full joint probability models raises interesting new prospects for exploratory data analysis. The concepts and methods discussed are illustrated…
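
    A minimal latent-class sketch in the same spirit, using scikit-learn's gaussian mixtures with an information criterion to choose the number of classes. The data are fabricated, and BIC is only a stand-in here; the paper's approach is fully Bayesian rather than BIC-based.

    ```python
    # Fit finite gaussian mixtures of increasing size and let an information
    # criterion pick the number of latent classes (illustrative only).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (150, 2)),    # fake measurements from
                   rng.normal(4, 1, (100, 2))])   # two latent classes

    for k in range(1, 5):
        gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
        print(k, round(gm.bic(X), 1))
    # The k with the lowest BIC is the preferred number of latent classes.
    ```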

  2. Analysis of Some Properties of the Nonlinear Schrödinger Equation Used for Filamentation Modeling

    NASA Astrophysics Data System (ADS)

    Zemlyanov, A. A.; Bulygin, A. D.

    2018-06-01

    Properties of the integral of motion and the evolution of the effective light beam radius are analyzed for the stationary model of the nonlinear Schrödinger equation describing filamentation. It is demonstrated that, within the limits of such a model, filamentation is limited only by the dissipation mechanisms.
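
    For orientation, one common form of the equation this entry refers to is sketched below. The exact coefficients and the form of the dissipation term vary between filamentation models, so this is a generic textbook form rather than the authors' specific equation.

    ```latex
    % A common form of the nonlinear Schroedinger equation for filamentation:
    % envelope E(x,y,z), propagation coordinate z, transverse Laplacian for
    % diffraction, Kerr term for self-focusing, and a generic dissipation term
    % alpha (the kind of mechanism the abstract identifies as limiting filamentation).
    \[
      i\,\frac{\partial E}{\partial z}
      + \frac{1}{2 k_0}\,\Delta_{\perp} E
      + \frac{k_0 n_2}{n_0}\,\lvert E \rvert^{2} E
      + \frac{i}{2}\,\alpha\!\left(\lvert E \rvert^{2}\right) E
      = 0
    \]
    ```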

  3. A comprehensive pharmacokinetic/pharmacodynamics analysis of the novel IGF1R/INSR inhibitor BI 893923 applying in vitro, in vivo and in silico modeling techniques.

    PubMed

    Titze, Melanie I; Schaaf, Otmar; Hofmann, Marco H; Sanderson, Michael P; Zahn, Stephan K; Quant, Jens; Lehr, Thorsten

    2016-06-01

    BI 893923 is a novel IGF1R/INSR tyrosine kinase inhibitor demonstrating anti-tumor efficacy and good tolerability. We aimed to characterize the relationship between BI 893923 plasma concentration, tumor biomarker modulation, tumor growth and hyperglycemia in mice using in silico modeling analyses. In vitro molecular and cellular assays were used to establish the potency and selectivity of BI 893923. Diverse in vitro DMPK assays were used to characterize the compound's drug-like properties. Mice xenografted with human GEO tumors were treated with different doses of BI 893923 to assess the compound's efficacy, biomarker modulation and tolerability. PK/PD analyses were performed using nonlinear mixed-effects modeling. BI 893923 showed potent and selective molecular inhibition of IGF1R and INSR together with attractive drug-like properties (permeability, bioavailability). BI 893923 dose-dependently reduced GEO tumor growth and was well tolerated, with transient hyperglycemia and normal body weight gain. A population PK/PD model was developed that established relationships between BI 893923 pharmacokinetics, hyperglycemia, pIGF1R reduction and tumor growth. BI 893923 exhibits molecular properties consistent with a highly attractive IGF1R/INSR inhibitor. A generic PK/PD model was developed to support preclinical drug development and dose finding in mice.
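
    The model class used in this kind of analysis can be sketched as a small ODE system: one-compartment pharmacokinetics with first-order absorption driving an Emax-type tumor growth inhibition. All parameters below are fabricated; this is a hedged illustration of the model class, not the published BI 893923 model.

    ```python
    # Generic PK/PD sketch: one-compartment PK with first-order absorption
    # driving exposure-dependent tumor growth inhibition (illustrative only).
    import numpy as np
    from scipy.integrate import solve_ivp

    ka, ke, V = 1.0, 0.3, 1.0         # absorption/elimination rates, volume (fake)
    kg, kmax, ec50 = 0.05, 0.12, 2.0  # tumor growth/kill parameters (fake)

    def rhs(t, y):
        gut, central, tumor = y
        conc = central / V
        dgut = -ka * gut                        # drug leaving the gut depot
        dcentral = ka * gut - ke * central      # central compartment balance
        dtumor = kg * tumor - kmax * conc / (ec50 + conc) * tumor  # Emax inhibition
        return [dgut, dcentral, dtumor]

    sol = solve_ivp(rhs, (0, 96), y0=[10.0, 0.0, 100.0], dense_output=True)
    t = np.linspace(0, 96, 5)
    print(sol.sol(t)[2])              # tumor size over time after a single dose
    ```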

  4. Pyrotechnic Shock Analysis Using Statistical Energy Analysis

    DTIC Science & Technology

    2015-10-23

    SEA subsystems. A couple of validation examples are provided to demonstrate the new approach. KEY WORDS: peak ratio, phase perturbation. Cited report: "Ballistic Shock Prediction Models and Techniques for Use in the Crusader Combat Vehicle Program," 11th Annual US Army Ground Vehicle Survivability…

  5. A framework for longitudinal data analysis via shape regression

    NASA Astrophysics Data System (ADS)

    Fishbaugh, James; Durrleman, Stanley; Piven, Joseph; Gerig, Guido

    2012-02-01

    Traditional longitudinal analysis begins by extracting desired clinical measurements, such as volume or head circumference, from discrete imaging data. Typically, the continuous evolution of a scalar measurement is estimated by choosing a 1D regression model, such as kernel regression or fitting a polynomial of fixed degree. This type of analysis not only leads to separate models for each measurement, but there is no clear anatomical or biological interpretation to aid in the selection of the appropriate paradigm. In this paper, we propose a consistent framework for the analysis of longitudinal data by estimating the continuous evolution of shape over time as twice differentiable flows of deformations. In contrast to 1D regression models, one model is chosen to realistically capture the growth of anatomical structures. From the continuous evolution of shape, we can simply extract any clinical measurements of interest. We demonstrate on real anatomical surfaces that volume extracted from a continuous shape evolution is consistent with a 1D regression performed on the discrete measurements. We further show how the visualization of shape progression can aid in the search for significant measurements. Finally, we present an example on a shape complex of the brain (left hemisphere, right hemisphere, cerebellum) that demonstrates a potential clinical application for our framework.

  6. Methods utilized in evaluating the profitability of commercial space processing

    NASA Technical Reports Server (NTRS)

    Bloom, H. L.; Schmitt, P. T.

    1976-01-01

    Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.

  7. A generalized spatiotemporal covariance model for stationary background in analysis of MEG data.

    PubMed

    Plis, S M; Schmidt, D M; Jun, S C; Ranken, D M

    2006-01-01

    A noise covariance model based on a single Kronecker product of spatial and temporal covariance was previously demonstrated to improve the spatiotemporal analysis of MEG data relative to the commonly used diagonal noise covariance model. In this paper we present a model that is a generalization of all of the above models. It describes models based on a single Kronecker product of spatial and temporal covariance as well as more complicated multi-pair models, together with any intermediate form expressed as a sum of Kronecker products of spatial component matrices of reduced rank and their corresponding temporal covariance matrices. The model provides a framework for controlling the trade-off between the described complexity of the background and the computational demand of the analysis using this model. Ways to estimate the value of the parameter controlling this trade-off are also discussed.
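
    The covariance family described above is easy to construct numerically. A toy numpy sketch (sizes and components fabricated for illustration):

    ```python
    # Sketch of the covariance family: a sum of Kronecker products of low-rank
    # spatial components with temporal covariances (toy sizes, fake data).
    import numpy as np

    rng = np.random.default_rng(0)
    n_space, n_time, n_pairs = 6, 10, 2

    C = np.zeros((n_space * n_time, n_space * n_time))
    for _ in range(n_pairs):
        u = rng.normal(size=(n_space, 1))
        S = u @ u.T                      # rank-1 spatial component
        L = rng.normal(size=(n_time, n_time))
        T = L @ L.T                      # corresponding temporal covariance (PSD)
        C += np.kron(S, T)

    # n_pairs = 1 recovers the single-Kronecker model; more pairs (or higher
    # spatial ranks) trade computational cost for a richer background model.
    print(C.shape, np.allclose(C, C.T))
    ```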

  8. Rasch analysis on OSCE data : An illustrative example.

    PubMed

    Tor, E; Steketee, C

    2011-01-01

    The Objective Structured Clinical Examination (OSCE) is a widely used tool for the assessment of clinical competence in health professional education. The goal of the OSCE is to make reproducible decisions on pass/fail status as well as students' levels of clinical competence according to their demonstrated abilities based on the scores. This paper explores the use of the polytomous Rasch model in evaluating the psychometric properties of OSCE scores through a case study. The authors analysed an OSCE data set (comprising 11 stations) for 80 fourth-year medical students based on the polytomous Rasch model in an effort to answer two research questions: (1) Do the clinical tasks assessed in the 11 OSCE stations map onto a common underlying construct, namely clinical competence? (2) What other insights can Rasch analysis offer in terms of scaling, item analysis and instrument validation over and above the conventional analysis based on classical test theory? The OSCE data set demonstrated a sufficient degree of fit to the Rasch model (χ² = 17.060, df = 22, p = 0.76), indicating that the 11 OSCE station scores have sufficient psychometric properties to form a measure of a common underlying construct, i.e. clinical competence. Individual OSCE station scores with good fit to the Rasch model (p > 0.1 for all χ² statistics) further corroborated the unidimensionality of the OSCE scale for clinical competence. A Person Separation Index (PSI) of 0.704 indicates a sufficient level of reliability for the OSCE scores. Other useful findings from the Rasch analysis that provide insights over and above the analysis based on classical test theory are also exemplified using the data set. The polytomous Rasch model provides a useful and supplementary approach to the calibration and analysis of OSCE examination data.

  9. A superconducting nanowire can be modeled by using SPICE

    NASA Astrophysics Data System (ADS)

    Berggren, Karl K.; Zhao, Qing-Yuan; Abebe, Nathnael; Chen, Minjie; Ravindran, Prasana; McCaughan, Adam; Bardin, Joseph C.

    2018-05-01

    Modeling of superconducting nanowire single-photon detectors typically requires custom simulations or finite-element analysis in one or two dimensions. Here, we demonstrate two simplified one-dimensional SPICE models of a superconducting nanowire that can quickly and efficiently describe its electrical characteristics. These models may be of particular use in understanding alternative architectures for nanowire detectors and readouts.

  10. Predictive and mechanistic multivariate linear regression models for reaction development

    PubMed Central

    Santiago, Celine B.; Guo, Jing-Yao

    2018-01-01

    Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711

  11. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    AUTHOR(S): Blake M. Wanier. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models … as benchmarks for future exploration of more sophisticated swarm search scenarios. SUBJECT TERMS: swarm search, search theory, modeling framework.

  12. A comprehensive equivalent circuit model of all-vanadium redox flow battery for power system analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Zhao, Jiyun; Wang, Peng; Skyllas-Kazacos, Maria; Xiong, Binyu; Badrinarayanan, Rajagopalan

    2015-09-01

    Electrical equivalent circuit models demonstrate excellent adaptability and simplicity in predicting the electrical dynamic response of the all-vanadium redox flow battery (VRB) system. However, only a few publications focus on this topic. This paper presents a comprehensive equivalent circuit model of the VRB for system-level analysis. The least-squares method is used to identify both steady-state and dynamic characteristics of the VRB. The inherent features of the flow battery, such as shunt current, ion diffusion and pumping energy consumption, are also considered. The proposed model consists of an open-circuit voltage source, two parasitic shunt bypass circuits, a first-order resistor-capacitor network and a hydraulic circuit model. Validated against experimental data, the proposed model demonstrates excellent accuracy: the mean errors of terminal voltage and pump consumption are 0.09 V and 0.49 W, respectively. Based on the proposed model, self-discharge and system efficiency are studied, and an optimal flow rate that maximizes the system efficiency is identified. Finally, the dynamic responses of the proposed VRB model under step current profiles are presented. Variables such as SOC and stack terminal voltage can be provided.
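
    The electrical core of such a model, a first-order resistor-capacitor (Thevenin) network, is small enough to simulate in a few lines. The parameters and open-circuit-voltage curve below are made up for illustration; this is not the paper's identified VRB model.

    ```python
    # Sketch of a first-order equivalent-circuit cell model: open-circuit voltage
    # minus ohmic drop and one RC relaxation branch (illustrative parameters).
    import numpy as np

    dt, R0, R1, C1 = 1.0, 0.02, 0.05, 500.0    # s, ohm, ohm, farad (fake)
    capacity = 3600.0                           # ampere-seconds

    def ocv(soc):
        return 1.25 + 0.2 * soc                 # toy open-circuit voltage vs SOC

    soc, v_rc = 0.8, 0.0
    for step in range(60):
        i = 5.0                                  # constant discharge current (A)
        v_rc += dt * (i / C1 - v_rc / (R1 * C1)) # RC branch relaxation
        soc -= dt * i / capacity                 # coulomb counting
        v_term = ocv(soc) - i * R0 - v_rc
        if step % 20 == 0:
            print(f"t={step * dt:4.0f}s  SOC={soc:.3f}  V={v_term:.3f}")
    ```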

  13. Competitive assessment of aerospace systems using system dynamics

    NASA Astrophysics Data System (ADS)

    Pfaender, Jens Holger

    Aircraft design has recently experienced a trend away from performance-centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside of the control of the designer, these factors traditionally have been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market; instead, they focus on modeling and predicting market share through logit regression models, and the resulting models exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such an integration, which had not been shown previously, is one of the primary contributions of this work. The integration is achieved through the use of surrogate models, in this case neural networks. This enabled not only the practical integration of analysis techniques but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market, exemplified by the Boeing 767-400ER and the Airbus A330-200. Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability compared with conventional logit regression models. An additional advantage of this dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was integrated into a prediction-profiler environment with a time-variant Monte Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing exploration of potential market success under varying external market conditions and scenarios. The resulting method reduces decision-support uncertainty and identifies robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. Furthermore, it was possible to demonstrate the increased importance of design and technology choices for competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Finally, the existing outputs of the Monte Carlo analysis were displayed on a multivariate scatter plot, which, by appropriate grouping of variables, was shown to enable the top-down definition of an aircraft design, also known as inverse design. In other words, this enables the designer to define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet those goals.

  14. Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.

    PubMed

    Kis, Maria

    2005-01-01

    In this paper we demonstrate the application of time series models to medical research. Hungarian mortality rates were analysed by autoregressive integrated moving average (ARIMA) models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as ARIMA modelling. This method is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancer of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: the standard normal approximation, the estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the intervals obtained by the continuous-time estimation model were much smaller than those of the other estimations. We also present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia, decomposing the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons, supporting a seasonal occurrence of childhood leukaemia in Hungary.
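
    A minimal ARIMA fit of the kind described can be sketched with statsmodels. The synthetic series below stands in for a mortality-rate series; it is not the Hungarian data.

    ```python
    # Minimal ARIMA sketch with statsmodels (synthetic series for illustration).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    y = 50 + np.cumsum(rng.normal(size=200))   # integrated (trending) series

    fit = ARIMA(y, order=(1, 1, 0)).fit()      # ARIMA(p=1, d=1, q=0)
    print(fit.summary())                       # AR coefficient with its confidence interval
    print(fit.forecast(steps=12))              # 12-step-ahead forecast
    ```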

  15. Quantitative petri net model of gene regulated metabolic networks in the cell.

    PubMed

    Chen, Ming; Hofestädt, Ralf

    2011-01-01

    A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene-regulated metabolic networks is demonstrated. A global kinetic modeling strategy and a Petri net modeling algorithm are applied to simulate bioprocess functioning and perform model analysis. With the model, the interrelations between pathway analysis and the metabolic control mechanism are outlined. Diagrammatic results of the dynamics of metabolites are simulated and observed by implementing an HPN tool, Visual Object Net++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.

  16. Three-parameter error analysis method based on rotating coordinates in rotating birefringent polarizer system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Junjie; Jia, Hongzhi, E-mail: hzjia@usst.edu.cn

    2015-11-15

    We propose an error analysis using a rotating coordinate system with three parameters of linearly polarized light (incidence angle, azimuth angle on the front surface, and angle between the incidence and vibration planes) and demonstrate the method on a rotating birefringent prism system. The transmittance and angles are calculated plane-by-plane using a birefringence ellipsoid model, and the final transmitted intensity equation is deduced. The effects of oblique incidence, light interference, beam convergence, and misalignment of the rotation and prism axes are discussed. We simulate the entire error model using MATLAB and conduct experiments on a custom-built polarimeter. The simulation and experimental results are consistent and demonstrate the rationality and validity of this method.

  17. Item-Level Psychometrics of the Glasgow Outcome Scale: Extended Structured Interviews.

    PubMed

    Hong, Ickpyo; Li, Chih-Ying; Velozo, Craig A

    2016-04-01

    The Glasgow Outcome Scale-Extended (GOSE) structured interview captures critical components of activities and participation, including home, shopping, work, leisure, and family/friend relationships. Eighty-nine community-dwelling adults with mild-moderate traumatic brain injury (TBI) were recruited (average = 2.7 years post injury). Nine of the 19 items were used for the psychometric analysis. Factor analysis and item-level psychometrics were investigated using the Rasch partial-credit model. Although the principal components analysis of residuals suggests that a single measurement factor dominates the measure, the instrument did not meet the factor analysis criteria. Five items met the rating scale criteria. Eight items fit the Rasch model. The instrument demonstrated low person reliability (0.63), low person strata (2.07), and a slight ceiling effect. The GOSE demonstrated limitations in precisely measuring activities/participation for individuals after TBI. Future studies should examine the impact of the low precision of the GOSE on effect size. © The Author(s) 2016.

  18. Bifurcation analysis of dengue transmission model in Baguio City, Philippines

    NASA Astrophysics Data System (ADS)

    Libatique, Criselda P.; Pajimola, Aprimelle Kris J.; Addawe, Joel M.

    2017-11-01

    In this study, we formulate a deterministic model for the transmission dynamics of dengue fever in Baguio City, Philippines. We analyzed the existence of the equilibria of the dengue model and obtained conditions for the existence of the equilibrium states. Stability analysis is carried out for the disease-free equilibrium, and we showed that the system becomes stable under certain conditions on the parameters. Taking a particular parameter and using centre manifold theory, the proposed model demonstrates a bifurcation phenomenon. We performed numerical simulations to verify the analytical results.

  19. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  20. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  1. Qualitative analysis of pure and adulterated canola oil via SIMCA

    NASA Astrophysics Data System (ADS)

    Basri, Katrul Nadia; Khir, Mohd Fared Abdul; Rani, Rozina Abdul; Sharif, Zaiton; Rusop, M.; Zoolfakar, Ahmad Sabirin

    2018-05-01

    This paper demonstrates the utilization of near infrared (NIR) spectroscopy to classify pure and adulterated samples of canola oil. The Soft Independent Modeling of Class Analogies (SIMCA) algorithm was implemented to discriminate the samples into their classes. The spectral data obtained were divided into training and validation datasets using the Kennard-Stone algorithm at a fixed ratio of 7:3. The accuracy of the resulting classification model is 0.99, with a sensitivity of 0.92 and a precision of 1.00. The results showed the classification model is robust enough to perform qualitative analysis of canola oil in future applications.
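
    The Kennard-Stone split mentioned above is a simple greedy algorithm: start from the two most distant samples, then repeatedly add the sample farthest (in a min-distance sense) from those already chosen. A numpy sketch with fabricated spectra:

    ```python
    # Minimal Kennard-Stone split (illustrative): greedily pick calibration
    # samples that maximize the minimum distance to those already chosen.
    import numpy as np

    def kennard_stone(X, n_train):
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
        i, j = np.unravel_index(np.argmax(d), d.shape)
        chosen = [i, j]                    # start from the two most distant samples
        while len(chosen) < n_train:
            rest = [k for k in range(len(X)) if k not in chosen]
            # next sample: farthest, in the min-distance sense, from the chosen set
            nxt = rest[int(np.argmax(d[np.ix_(rest, chosen)].min(axis=1)))]
            chosen.append(nxt)
        return chosen

    X = np.random.rand(50, 10)                 # stand-in for NIR spectra
    train_idx = kennard_stone(X, n_train=35)   # ~7:3 split as in the paper
    test_idx = [k for k in range(len(X)) if k not in train_idx]
    print(len(train_idx), len(test_idx))
    ```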

  2. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  3. A Measure for Evaluating the Effectiveness of Teen Pregnancy Prevention Programs.

    ERIC Educational Resources Information Center

    Somers, Cheryl L.; Johnson, Stephanie A.; Sawilowksy, Shlomo S.

    2002-01-01

    The Teen Attitude Pregnancy Scale (TAPS) was developed to measure teen attitudes and intentions regarding teenage pregnancy. The model demonstrated good internal consistency and concurrent validity for the samples in this study. Analysis revealed evidence of validity for this model. (JDM)

  4. Model Of Neural Network With Creative Dynamics

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Barhen, Jacob

    1993-01-01

    Paper presents analysis of mathematical model of one-neuron/one-synapse neural network featuring coupled activation and learning dynamics and parametrical periodic excitation. Demonstrates self-programming, partly random behavior of suitably designed neural network; believed to be related to spontaneity and creativity of biological neural networks.

  5. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
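
    Of the methods listed, block bootstrap resampling is the easiest to sketch: resampling whole blocks of a series preserves the short-range autocorrelation that ordinary bootstrapping would destroy. The series below is fabricated, not the study's groundwater data.

    ```python
    # Sketch of block-bootstrap resampling for a time series (illustrative data).
    import numpy as np

    def block_bootstrap(series, block_len, rng):
        n = len(series)
        starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
        blocks = [series[s:s + block_len] for s in starts]
        return np.concatenate(blocks)[:n]

    rng = np.random.default_rng(0)
    recharge = np.sin(np.arange(120) / 6.0) + rng.normal(scale=0.3, size=120)  # fake series

    # Distribution of the mean under resampling that respects autocorrelation.
    means = [block_bootstrap(recharge, block_len=12, rng=rng).mean() for _ in range(1000)]
    print(np.percentile(means, [2.5, 97.5]))
    ```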

  6. Versatile Micromechanics Model for Multiscale Analysis of Composite Structures

    NASA Astrophysics Data System (ADS)

    Kwon, Y. W.; Park, M. S.

    2013-08-01

    A general-purpose micromechanics model was developed so that the model could be applied to various composite materials, such as those reinforced by particles, long fibers and short fibers, as well as those containing micro voids. Additionally, the model can be used with hierarchical composite materials. The micromechanics model can be used to compute effective material properties such as elastic moduli, shear moduli, Poisson's ratios, and coefficients of thermal expansion for the various composite materials. The model can also calculate the strains and stresses at the constituent material level (fibers, particles, and whiskers) from the composite-level stresses and strains. The model was implemented into ABAQUS using the UMAT option for multiscale analysis. An extensive set of examples is presented to demonstrate the reliability and accuracy of the developed micromechanics model for different kinds of composite materials. Another set of examples is provided to study the multiscale analysis of composite structures.

  7. A metabolite-centric view on flux distributions in genome-scale metabolic models

    PubMed Central

    2013-01-01

    Background Genome-scale metabolic models are important tools in systems biology. They permit the in-silico prediction of cellular phenotypes via mathematical optimisation procedures, most importantly flux balance analysis. Current studies on metabolic models mostly consider reaction fluxes in isolation. Based on a recently proposed metabolite-centric approach, we here describe a set of methods that enable the analysis and interpretation of flux distributions in an integrated metabolite-centric view. We demonstrate how this framework can be used for the refinement of genome-scale metabolic models. Results We applied the metabolite-centric view developed here to the most recent metabolic reconstruction of Escherichia coli. By compiling the balance sheets of a small number of currency metabolites, we were able to fully characterise the energy metabolism as predicted by the model and to identify a possibility for model refinement in NADPH metabolism. Selected branch points were examined in detail in order to demonstrate how a metabolite-centric view allows identifying functional roles of metabolites. Fructose 6-phosphate aldolase and the sedoheptulose bisphosphate bypass were identified as enzymatic reactions that can carry high fluxes in the model but are unlikely to exhibit significant activity in vivo. Performing a metabolite essentiality analysis, unconstrained import and export of iron ions could be identified as potentially problematic for the quality of model predictions. Conclusions The system-wide analysis of split ratios and branch points allows a much deeper insight into the metabolic network than reaction-centric analyses. Extending an earlier metabolite-centric approach, the methods introduced here establish an integrated metabolite-centric framework for the interpretation of flux distributions in genome-scale metabolic networks that can complement the classical reaction-centric framework. Analysing fluxes and their metabolic context simultaneously opens the door to systems biological interpretations that are not apparent from isolated reaction fluxes. Particularly powerful demonstrations of this are the analyses of the complete metabolic contexts of energy metabolism and the folate-dependent one-carbon pool presented in this work. Finally, a metabolite-centric view on flux distributions can guide the refinement of metabolic reconstructions for specific growth scenarios. PMID:23587327
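
    The general idea of inspecting a flux balance analysis solution from the metabolite side can be sketched with the COBRApy package, assuming its bundled small E. coli "textbook" model is available; this only illustrates the general workflow, not the authors' framework.

    ```python
    # Hedged sketch of a metabolite-centric look at an FBA solution with COBRApy.
    from cobra.io import load_model

    model = load_model("textbook")   # small E. coli core model bundled with COBRApy
    solution = model.optimize()      # flux balance analysis
    print(solution.objective_value)

    # Balance sheet of a currency metabolite: which reactions produce/consume ATP?
    atp = model.metabolites.get_by_id("atp_c")
    print(atp.summary())
    ```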

  8. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
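
    The reverse-solution idea mentioned at the end can be illustrated generically: given a subroutine that computes y = f(x), recover the input x for a target output by Newton-Raphson iteration. The function below is a made-up stand-in, not one of the flight control subroutines.

    ```python
    # Sketch: invert a subroutine's equation y = f(x) by Newton-Raphson iteration.
    def f(x):                          # stand-in for a software subroutine's equation
        return x**3 + 2.0 * x - 5.0

    def fprime(x):
        return 3.0 * x**2 + 2.0

    def newton_inverse(f, fprime, y_target, x0, tol=1e-10, max_iter=50):
        x = x0
        for _ in range(max_iter):      # typically converges in a few steps, as noted above
            step = (f(x) - y_target) / fprime(x)
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("Newton-Raphson did not converge")

    print(newton_inverse(f, fprime, y_target=10.0, x0=1.0))  # x such that f(x) = 10
    ```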

  9. Interpreting cost of ownership for mix-and-match lithography

    NASA Astrophysics Data System (ADS)

    Levine, Alan L.; Bergendahl, Albert S.

    1994-05-01

    Cost of ownership modeling is a critical and emerging tool that provides significant insight into the ways to optimize device manufacturing costs. The development of a model to deal with a particular application, mix-and-match lithography, was performed in order to determine the level of cost savings and the optimum ways to create these savings. The use of sensitivity analysis with cost of ownership allows the user to make accurate trade-offs between technology and cost. The use and interpretation of the model results are described in this paper. Parameters analyzed include several manufacturing considerations -- depreciation, maintenance, engineering and operator labor, floorspace, resist, consumables and reticles. Inherent in this study is the ability to customize this analysis for a particular operating environment. Results demonstrate the clear advantages of a mix-and-match approach for three different operating environments. These case studies also demonstrate various methods to efficiently optimize cost savings strategies.

  10. Automated Bayesian model development for frequency detection in biological time series.

    PubMed

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.

  11. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910
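
    A crude stand-in for the idea in these two records: score candidate frequencies by how well a sinusoid-plus-trend model explains a short, noisy series, using BIC as a rough proxy for Bayesian model comparison. This illustrates frequency inference on trended data; it is not the authors' Bayesian Spectrum Analysis implementation.

    ```python
    # Score candidate frequencies of a sinusoid-plus-trend model by BIC, a rough
    # proxy for Bayesian model comparison (illustrative synthetic data).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 48, 97)                  # two "days" of samples
    y = 0.05 * t + np.sin(2 * np.pi * t / 24) + rng.normal(scale=0.4, size=t.size)

    def bic_for_freq(f):
        X = np.column_stack([np.ones_like(t), t,   # background trend terms
                             np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        n, k = len(y), X.shape[1]
        return n * np.log(resid @ resid / n) + k * np.log(n)

    freqs = np.linspace(0.01, 0.1, 300)         # candidate cycles per hour
    best = freqs[np.argmin([bic_for_freq(f) for f in freqs])]
    print(f"best period ~ {1 / best:.1f} h")    # should be near 24 h
    ```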

  12. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming

    2010-05-01

    Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data is available in the form of polarization curves or local distribution of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on using a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.

  13. Music and Cultural Analysis in the Classroom: Introducing Sociology through Heavy Metal.

    ERIC Educational Resources Information Center

    Ahlkvist, Jarl A.

    1999-01-01

    Demonstrates that popular music's potential as a tool for teaching interactive introductory sociology courses is enhanced when a cultural analysis of a specific music genre is incorporated into the classroom. Presents a two-part model for integrating a cultural analysis of heavy metal music and its subculture into the introductory course. Includes…

  14. Reduction method with system analysis for multiobjective optimization-based design

    NASA Technical Reports Server (NTRS)

    Azarm, S.; Sobieszczanski-Sobieski, J.

    1993-01-01

    An approach for reducing the number of variables and constraints, which is combined with System Analysis Equations (SAE), for multiobjective optimization-based design is presented. In order to develop a simplified analysis model, the SAE is computed outside an optimization loop and then approximated for use by an operator. Two examples are presented to demonstrate the approach.

  15. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PM2.5 violations”) must be based on quantitative analysis using the applicable air quality models... either: (i) Quantitative methods that represent reasonable and common professional practice; or (ii) A...) The hot-spot demonstration required by § 93.116 must be based on quantitative analysis methods for the...

  16. Invariant density analysis: modeling and analysis of the postural control system using Markov chains.

    PubMed

    Hur, Pilwon; Shorter, K Alex; Mehta, Prashant G; Hsiao-Wecksler, Elizabeth T

    2012-04-01

    In this paper, a novel analysis technique, invariant density analysis (IDA), is introduced. IDA quantifies steady-state behavior of the postural control system using center of pressure (COP) data collected during quiet standing. IDA relies on the analysis of a reduced-order finite Markov model to characterize stochastic behavior observed during postural sway. Five IDA parameters characterize the model and offer physiological insight into the long-term dynamical behavior of the postural control system. Two studies were performed to demonstrate the efficacy of IDA. Study 1 showed that multiple short trials can be concatenated to create a dataset suitable for IDA. Study 2 demonstrated that IDA was effective at distinguishing age-related differences in postural control behavior between young, middle-aged, and older adults. These results suggest that the postural control system of young adults converges more quickly to their steady-state behavior while maintaining COP nearer an overall centroid than either the middle-aged or older adults. Additionally, larger entropy values for older adults indicate that their COP follows a more stochastic path, while smaller entropy values for young adults indicate a more deterministic path. These results illustrate the potential of IDA as a quantitative tool for the assessment of the quiet-standing postural control system.
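
    A minimal sketch of the Markov-chain machinery behind IDA, under the assumption that the COP trajectory is reduced to its radial distance from an overall centroid and discretized into a finite set of states; the invariant density is then the stationary distribution of the empirical transition matrix, and its entropy summarizes how stochastic the steady-state sway is. Bin count, lag, and names are illustrative, not the authors' exact choices.

    ```python
    import numpy as np

    def invariant_density(cop_radius, n_states=20, lag=1):
        # Discretize the COP radius into n_states bins -> finite state sequence
        edges = np.linspace(cop_radius.min(), cop_radius.max(), n_states + 1)
        states = np.clip(np.digitize(cop_radius, edges) - 1, 0, n_states - 1)

        # Empirical transition matrix P[i, j] = Pr(next = j | current = i)
        P = np.zeros((n_states, n_states))
        for i, j in zip(states[:-lag], states[lag:]):
            P[i, j] += 1
        P /= np.maximum(P.sum(axis=1, keepdims=True), 1)  # row-normalize counts

        # Invariant density = left eigenvector of P for eigenvalue 1
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi = np.abs(pi) / np.abs(pi).sum()

        # Entropy of the stationary distribution: larger -> more stochastic sway
        entropy = -np.sum(pi[pi > 0] * np.log(pi[pi > 0]))
        return pi, entropy
    ```

    Concatenating several short quiet-standing trials into one state sequence, as Study 1 suggests, simply means accumulating their transition counts into the same matrix before normalizing.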

  17. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    NASA Technical Reports Server (NTRS)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
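
    A minimal sketch of the Guyan (static) reduction named above, assuming full stiffness and mass matrices K and M and a list of master (test-instrumented) DOFs; the slave DOFs are condensed out under the usual static-condensation assumption that inertia loads at the slave DOFs are negligible.

    ```python
    import numpy as np

    def guyan_reduce(K, M, master):
        """Reduce stiffness K and mass M to the 'master' (test-measured) DOFs."""
        master = np.asarray(master)
        n = K.shape[0]
        slave = np.setdiff1d(np.arange(n), master)
        Kss = K[np.ix_(slave, slave)]
        Ksm = K[np.ix_(slave, master)]

        # Static condensation: u_slave = -Kss^{-1} Ksm u_master
        T = np.zeros((n, len(master)))
        T[master, np.arange(len(master))] = 1.0
        T[np.ix_(slave, np.arange(len(master)))] = -np.linalg.solve(Kss, Ksm)

        # Reduced (TAM-sized) matrices, exact for stiffness, approximate for mass
        return T.T @ K @ T, T.T @ M @ T
    ```

    The IRS and Hybrid methods mentioned above refine the same transformation T with inertia correction terms, so a TAM pipeline can share this structure and swap only the construction of T.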

  18. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  19. Aeroelastic stability analyses of two counter rotating propfan designs for a cruise missile model

    NASA Technical Reports Server (NTRS)

    Mahajan, Aparajit J.; Lucero, John M.; Mehmed, Oral; Stefko, George L.

    1992-01-01

    A modal aeroelastic analysis combining structural and aerodynamic models is applied to counterrotating propfans to evaluate their structural integrity for wind-tunnel testing. The aeroelastic analysis code is an extension of the 2D analysis code called the Aeroelastic Stability and Response of Propulsion Systems. Rotational speed and freestream Mach number are the parameters for calculating the stability of the two blade designs with a modal method combining a finite-element structural model with 2D steady and unsteady cascade aerodynamic models. The model demonstrates convergence to the least stable aeroelastic mode, describes the effects of a nonuniform inflow, and permits the modification of geometry and rotation. The analysis shows that the propfan designs are suitable for the wind-tunnel test and confirms that the propfans should be flutter-free under the range of conditions of the testing.

  20. Usage of Parameterized Fatigue Spectra and Physics-Based Systems Engineering Models for Wind Turbine Component Sizing: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons, Taylor; Guo, Yi; Veers, Paul

    Software models that use design-level input variables and physics-based engineering analysis for estimating the mass and geometrical properties of components in large-scale machinery can be very useful for analyzing design trade-offs in complex systems. This study uses DriveSE, an OpenMDAO-based drivetrain model that uses stress and deflection criteria to size drivetrain components within a geared, upwind wind turbine. Because a full lifetime fatigue load spectrum can only be defined using computationally-expensive simulations in programs such as FAST, a parameterized fatigue loads spectrum that depends on wind conditions, rotor diameter, and turbine design life has been implemented. The parameterized fatigue spectrum is only used in this paper to demonstrate the proposed fatigue analysis approach. This paper details a three-part investigation of the parameterized approach and a comparison of the DriveSE model with and without fatigue analysis on the main shaft system. It compares loads from three turbines of varying size and determines if and when fatigue governs drivetrain sizing compared to extreme load-driven design. It also investigates the model's sensitivity to shaft material parameters. The intent of this paper is to demonstrate how fatigue considerations in addition to extreme loads can be brought into a system engineering optimization.

  1. Integrated modeling approach for optimal management of water, energy and food security nexus

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Vesselinov, Velimir V.

    2017-03-01

    Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.

  2. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: McDonnell-Douglas Helicopter Company achievements

    NASA Technical Reports Server (NTRS)

    Toossi, Mostafa; Weisenburger, Richard; Hashemi-Kia, Mostafa

    1993-01-01

    This paper presents a summary of some of the work performed by McDonnell Douglas Helicopter Company under the NASA Langley-sponsored rotorcraft structural dynamics program known as DAMVIBS (Design Analysis Methods for VIBrationS). A set of guidelines applicable to dynamic modeling, analysis, testing, and correlation of both helicopter airframes and a large variety of structural finite element models is presented. Utilization of these guidelines and the key features of their applications to vibration modeling of helicopter airframes are discussed. Correlation studies with the test data, together with the development and applications of a set of efficient finite element model checkout procedures, are demonstrated on a large helicopter airframe finite element model. Finally, the lessons learned and the benefits resulting from this program are summarized.

  3. Stability analysis of the phytoplankton effect model on changes in nitrogen concentration on integrated multi-trophic aquaculture systems

    NASA Astrophysics Data System (ADS)

    Widowati; Putro, S. P.; Silfiana

    2018-05-01

    Integrated Multi-Trophic Aquaculture (IMTA) is a polyculture in which several biota are maintained together to optimize waste recycling as a food source. The interaction between phytoplankton and the nitrogen compounds produced as waste in fish cultivation, including ammonia, nitrite, and nitrate, is studied in the form of a mathematical model. The model is a non-linear system of differential equations in four variables. Analytical methods were used to study the dynamic behavior of this model. Local stability analysis is performed at the equilibrium point: the model is first linearized using a Taylor series expansion, and the Jacobian matrix is then determined. If all eigenvalues of the Jacobian have negative real parts, then the equilibrium of the system is locally asymptotically stable. Some numerical simulations were also carried out to verify the analytical results.
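
    A minimal sketch of that linearized stability test for a generic four-variable system, with the phytoplankton-nitrogen right-hand side left as a placeholder: form a numerical Jacobian at the equilibrium and check whether every eigenvalue has a negative real part.

    ```python
    import numpy as np

    def jacobian(f, x_eq, eps=1e-6):
        """Central-difference Jacobian of f at the equilibrium x_eq."""
        n = len(x_eq)
        J = np.zeros((n, n))
        for j in range(n):
            dx = np.zeros(n)
            dx[j] = eps
            J[:, j] = (f(x_eq + dx) - f(x_eq - dx)) / (2 * eps)
        return J

    def is_locally_stable(f, x_eq):
        # Locally asymptotically stable iff all eigenvalues have Re < 0
        return bool(np.all(np.linalg.eigvals(jacobian(f, x_eq)).real < 0))
    ```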

  4. Effects of angle of model-demonstration on learning of motor skill.

    PubMed

    Ishikura, T; Inomata, K

    1995-04-01

    The purpose was to examine the effects of three different demonstrations by a model on acquisition and retention of a sequential gross movement task. The second purpose was to examine the relationship between reversal processing of visual information about skills and coding of skill information. Thirty undergraduates (15 men and 15 women) were assigned to one of three conditions: the Objective condition, in which the task was demonstrated with the model facing the subject; the Looking-glass condition, in which the skill was demonstrated with the model facing the subject so that left and right were reversed relative to the subject's own execution of the task; and the Subjective condition, in which the subject observed the model from the rear. The number of immediate recall tests required to accomplish the sequential movements completely, and the sum of the performance points for reproduced movements at each delayed recall test (1 day, 7 days, and 5 mo. after the immediate recall test), were employed as measures. Analysis indicated the Subjective condition produced a significantly greater modeling effect in immediate recall of the movements than the Looking-glass condition. Retention of the acquired skills was almost equal under the three conditions.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huff, Kathryn D.

    Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geology repository concepts. A proof of principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogenous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)

  6. Closing the loop: modelling of heart failure progression from health to end-stage using a meta-analysis of left ventricular pressure-volume loops.

    PubMed

    Warriner, David R; Brown, Alistair G; Varma, Susheel; Sheridan, Paul J; Lawford, Patricia; Hose, David R; Al-Mohammad, Abdallah; Shi, Yubing

    2014-01-01

    The American Heart Association (AHA)/American College of Cardiology (ACC) guidelines for the classification of heart failure (HF) are descriptive but lack precise and objective measures which would assist in categorising such patients. Our aim was twofold: firstly, to demonstrate quantitatively the progression of HF through each stage using a meta-analysis of existing left ventricular (LV) pressure-volume (PV) loop data; and secondly, to use the LV PV loop data to create stage-specific HF models. A literature search yielded 31 papers with PV data, representing over 200 patients in different stages of HF. The raw pressure and volume data were extracted from the papers using a digitising software package and the means were calculated. The data demonstrated that, as HF progressed, stroke volume (SV) and ejection fraction (EF%) decreased while LV volumes increased. A 2-element lumped parameter model was employed to model the mean loops, and the error between the modelled and mean loops was calculated, demonstrating a close fit. The only parameter that was consistently and statistically different across all the stages was the elastance (Emax). For the first time, the authors have created a visual and quantitative representation of the AHA/ACC stages of LVSD-HF, from normal to end-stage. The study demonstrates that robust, load-independent and reproducible parameters, such as elastance, can be used to categorise and model HF, complementing the existing classification. The modelled PV loops establish previously unknown physiological parameters for each AHA/ACC stage of LVSD-HF, such as LV elastance, and highlight that it is this parameter alone, in lumped parameter models, that determines the severity of HF. Such information will enable cardiovascular modellers with an interest in HF to create more accurate models of the heart as it fails.
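
    A minimal sketch of how an elastance value can be read off a digitized PV loop, assuming the common approximation E(t) = P(t) / (V(t) - V0) with an assumed unstressed volume V0; the paper's exact lumped-parameter fitting procedure is not reproduced here.

    ```python
    import numpy as np

    def peak_elastance(pressure_mmHg, volume_ml, v0_ml=10.0):
        """Emax from one digitized loop; v0_ml is an illustrative assumption."""
        elastance = pressure_mmHg / (volume_ml - v0_ml)  # mmHg/mL per sample
        return float(elastance.max())                    # the stage marker Emax
    ```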

  7. A Functional Model of the Digital Extensor Mechanism: Demonstrating Biomechanics with Hair Bands

    ERIC Educational Resources Information Center

    Cloud, Beth A.; Youdas, James W.; Hellyer, Nathan J.; Krause, David A.

    2010-01-01

    The action of muscles about joints can be explained through analysis of their spatial relationship. A functional model of these relationships can be valuable in learning and understanding the muscular action about a joint. A model can be particularly helpful when examining complex actions across multiple joints such as in the digital extensor…

  8. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: • Integration with a high fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients • Consideration of a PRA context • Incorporation of a solid psychological basis for operator performance • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response. This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  9. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    PubMed

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of applying a Markov Chain Monte Carlo (MCMC) method based on Bayesian theory for the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, the erosion and the transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. Calibration results showed large variability in the model's responses as a function of the initial conditions. They demonstrated that the model's predictive capacity is very low.
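
    A minimal sketch of the MCMC calibration loop described above: a random-walk Metropolis sampler over the model parameters, where log_post is a placeholder for the log posterior combining the priors with a likelihood comparing simulated and observed pollutant loads (the paper's specific water quality model is not reproduced).

    ```python
    import numpy as np

    def metropolis(log_post, theta0, n_iter=10000, step=0.1, seed=0):
        """Random-walk Metropolis: returns a chain of posterior samples."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        lp = log_post(theta)
        chain = np.empty((n_iter, theta.size))
        for i in range(n_iter):
            prop = theta + step * rng.standard_normal(theta.size)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
                theta, lp = prop, lp_prop
            chain[i] = theta
        return chain

    # Usage with a toy 2-parameter posterior; in the paper's setting log_post
    # would run the build-up/wash-off model and score it against observations.
    toy = lambda th: -0.5 * np.sum((th - np.array([1.0, 2.0])) ** 2)
    samples = metropolis(toy, theta0=[0.0, 0.0], n_iter=5000)
    ```

    The spread of the chain for each parameter, under each assumed initial in-sewer deposit condition, is what quantifies the calibration uncertainty the abstract reports.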

  11. Decision science and cervical cancer.

    PubMed

    Cantor, Scott B; Fahs, Marianne C; Mandelblatt, Jeanne S; Myers, Evan R; Sanders, Gillian D

    2003-11-01

    Mathematical modeling is an effective tool for guiding cervical cancer screening, diagnosis, and treatment decisions for patients and policymakers. This article describes the use of mathematical modeling as outlined in five presentations from the Decision Science and Cervical Cancer session of the Second International Conference on Cervical Cancer held at The University of Texas M. D. Anderson Cancer Center, April 11-14, 2002. The authors provide an overview of mathematical modeling, especially decision analysis and cost-effectiveness analysis, and examples of how it can be used for clinical decision making regarding the prevention, diagnosis, and treatment of cervical cancer. Included are applications as well as theory regarding decision science and cervical cancer. Mathematical modeling can answer such questions as the optimal frequency for screening, the optimal age to stop screening, and the optimal way to diagnose cervical cancer. Results from one mathematical model demonstrated that a vaccine against high-risk strains of human papillomavirus was a cost-effective use of resources, and discussion of another model demonstrated the importance of collecting direct non-health care costs and time costs for cost-effectiveness analysis. Research presented indicated that care must be taken when applying the results of population-wide, cost-effectiveness analyses to reduce health disparities. Mathematical modeling can encompass a variety of theoretical and applied issues regarding decision science and cervical cancer. The ultimate objective of using decision-analytic and cost-effectiveness models is to identify ways to improve women's health at an economically reasonable cost. Copyright 2003 American Cancer Society.

  12. Distributed Finite Element Analysis Using a Transputer Network

    NASA Technical Reports Server (NTRS)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.

  13. Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Tong, Xin; Zhang, Zhiyong

    2012-01-01

    Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…

  14. International workshop on ITS benefits : how evaluation results are used in transportation decision-making, November 9, 2000 Turin, Italy

    DOT National Transportation Integrated Search

    1998-09-16

    This paper demonstrates application of the principles of economic analysis to evaluate highway capacity expansion in an urban setting, using a sketch-planning model called Spreadsheet Model for Induced Travel Estimation (SMITE). The application takes...

  15. Modeling Individual Differences in Unfolding Preference Data: A Restricted Latent Class Approach.

    ERIC Educational Resources Information Center

    Bockenholt, Ulf; Bockenholt, Ingo

    1990-01-01

    A latent-class scaling approach is presented for modeling paired comparison and "pick any/t" data obtained in preference studies. The utility of this approach is demonstrated through analysis of data from studies involving consumer preference and preference for political candidates. (SLD)

  16. Mixed Models and Reduction Techniques for Large-Rotation, Nonlinear Analysis of Shells of Revolution with Application to Tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.; Tanner, J. A.

    1984-01-01

    An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.

  17. Incorporating principal component analysis into air quality ...

    EPA Pesticide Factsheets

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variations (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be made. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO₄²⁻) and ammonium (NH₄⁺) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation thereby hastening and facilitating understanding of the prob
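
    A minimal sketch of the PCA step described above, assuming the CMAQ-minus-CASTNet bias has been arranged as a (stations x weeks) matrix; each mode pairs a spatial pattern of bias with a time series of its amplitude. Names and layout are illustrative.

    ```python
    import numpy as np

    def pca_bias(bias):
        """bias: (n_stations, n_weeks) array of model-minus-observation bias."""
        X = bias - bias.mean(axis=1, keepdims=True)     # center each station
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        explained = s ** 2 / np.sum(s ** 2)             # variance per mode
        spatial_modes = U                               # station loadings
        temporal_scores = np.diag(s) @ Vt               # mode amplitude in time
        return spatial_modes, temporal_scores, explained
    ```

    Stations with large loadings on the leading modes are natural candidates for the "indicator" station-grid cell pairs the abstract mentions, since they carry most of the systematic bias signal.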

  18. Exploring the measurement properties of the osteopathy clinical teaching questionnaire using Rasch analysis.

    PubMed

    Vaughan, Brett

    2018-01-01

    Clinical teaching evaluations are common in health profession education programs to ensure students are receiving a quality clinical education experience. Questionnaires students use to evaluate their clinical teachers have been developed in professions such as medicine and nursing. The development of a questionnaire that is specifically for the osteopathy on-campus, student-led clinic environment is warranted. Previous work developed the 30-item Osteopathy Clinical Teaching Questionnaire. The current study utilised Rasch analysis to investigate the construct validity of the Osteopathy Clinical Teaching Questionnaire and provide evidence for the validity argument through fit to the Rasch model. Senior osteopathy students at four institutions in Australia, New Zealand and the United Kingdom rated their clinical teachers using the Osteopathy Clinical Teaching Questionnaire. Three hundred and ninety-nine valid responses were received and the data were evaluated for fit to the Rasch model. Reliability estimations (Cronbach's alpha and McDonald's omega) were also evaluated for the final model. The initial analysis demonstrated the data did not fit the Rasch model. Accordingly, modifications to the questionnaire were made including removing items, removing person responses, and rescoring one item. The final model contained 12 items and fit to the Rasch model was adequate. Support for unidimensionality was demonstrated through both the Principal Components Analysis/t-test, and the Cronbach's alpha and McDonald's omega reliability estimates. Analysis of the questionnaire using McDonald's omega hierarchical supported a general factor (quality of clinical teaching in osteopathy). The evidence for unidimensionality and the presence of a general factor support the calculation of a total score for the questionnaire as a sufficient statistic. Further work is now required to investigate the reliability of the 12-item Osteopathy Clinical Teaching Questionnaire to provide evidence for the validity argument.

  19. Investigation of Antigen-Antibody Interactions of Sulfonamides with a Monoclonal Antibody in a Fluorescence Polarization Immunoassay Using 3D-QSAR Models

    PubMed Central

    Wang, Zhanhui; Kai, Zhenpeng; Beier, Ross C.; Shen, Jianzhong; Yang, Xinling

    2012-01-01

    A three-dimensional quantitative structure-activity relationship (3D-QSAR) model of sulfonamide analogs binding a monoclonal antibody (MAbSMR) produced against sulfamerazine was developed using Distance Comparison (DISCOtech), comparative molecular field analysis (CoMFA), and comparative molecular similarity indices analysis (CoMSIA). The affinities of the MAbSMR, expressed as log10(IC50), for 17 sulfonamide analogs were determined by competitive fluorescence polarization immunoassay (FPIA). The results demonstrated that the proposed pharmacophore model containing two hydrogen-bond acceptors, two hydrogen-bond donors and two hydrophobic centers characterized the structural features of the sulfonamides necessary for MAbSMR binding. Removal of two outliers from the initial set of 17 sulfonamide analogs improved the predictability of the models. The 3D-QSAR models of 15 sulfonamides based on CoMFA and CoMSIA resulted in cross-validated q² values of 0.600 and 0.523, and r² values of 0.995 and 0.994, respectively, which indicates that both methods have significant predictive capability. Connolly surface analysis, which mainly focused on steric force fields, was performed to complement the results from CoMFA and CoMSIA. This novel study combining FPIA with pharmacophore modeling demonstrates that multidisciplinary research is useful for investigating antigen-antibody interactions and also may provide information required for the design of new haptens. PMID:22754368

  20. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
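
    A minimal sketch of the resistance-capacitance analogy described above, with node pressures playing the role of capacitor voltages and tube conductances the role of resistors; conductances, volumes, and pumping speed below are illustrative placeholders, and the SINDA'85/FLUINT user subroutines themselves are not reproduced.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def vacuum_network(t, P, C, V, S):
        """P: node pressures (torr); C[i, j]: tube conductance i<->j (L/s);
        V: node volumes (L); S: pumping speed applied at each node (L/s)."""
        flow = C * (P[None, :] - P[:, None])   # throughput into node i from j
        return (flow.sum(axis=1) - S * P) / V  # dP/dt = net throughput / volume

    # Usage: three chambers in series, pump attached to the last node
    C = np.array([[0.0, 2.0, 0.0],
                  [2.0, 0.0, 5.0],
                  [0.0, 5.0, 0.0]])
    V = np.array([10.0, 1.0, 0.5])
    S = np.array([0.0, 0.0, 20.0])
    sol = solve_ivp(vacuum_network, (0.0, 30.0), [760.0, 760.0, 760.0],
                    args=(C, V, S), method='LSODA')
    ```

    In the flight system the conductances themselves depend on pressure (viscous, transition, or molecular flow regimes), which is exactly what the user subroutines described above supply to the network solver.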

  1. Aligning Event Logs to Task-Time Matrix Clinical Pathways in BPMN for Variance Analysis.

    PubMed

    Yan, Hui; Van Gorp, Pieter; Kaymak, Uzay; Lu, Xudong; Ji, Lei; Chiau, Choo Chiap; Korsten, Hendrikus H M; Duan, Huilong

    2018-03-01

    Clinical pathways (CPs) are popular healthcare management tools to standardize care and ensure quality. Analyzing CP compliance levels and variances is known to be useful for training and CP redesign purposes. The flexible semantics of the business process model and notation (BPMN) language has been shown to be useful for the modeling and analysis of complex protocols. However, in practical cases one may want to exploit the fact that CPs often have the form of task-time matrices. This paper presents a new method for parsing complex BPMN models and aligning traces to the models heuristically. A case study on variance analysis is undertaken, where a CP from practice and two large sets of patient data from an electronic medical record (EMR) database are used. The results demonstrate that automated variance analysis between BPMN task-time models and real-life EMR data is feasible, whereas that was not the case for the existing analysis techniques. We also provide meaningful insights for further improvement.

  2. Promoting motivation through mode of instruction: The relationship between use of affective teaching techniques and motivation to learn science

    NASA Astrophysics Data System (ADS)

    Sanchez Rivera, Yamil

    The purpose of this study is to add to what we know about the affective domain and to create a valid instrument for future studies. The Motivation to Learn Science (MLS) Inventory is based on Krathwohl's Taxonomy of Affective Behaviors (Krathwohl et al., 1964). The results of the Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) demonstrated that the MLS Inventory is a valid and reliable instrument: a unidimensional instrument composed of 9 items with convergent validity (no divergence). The instrument had a high Cronbach's alpha value of .898 in the EFA and .919 in the CFA. Factor loadings on the 9 items ranged from .617 to .800. Standardized regression weights ranged from .639 to .835 in the CFA. Various indices (RMSEA = .033; NFI = .987; GFI = .985; CFI = 1.000) demonstrated a good fit of the proposed model. Hierarchical linear modeling was used to statistically analyze the data, with students' motivation to learn science scores (level-1) nested within teachers (level-2). The analysis was geared toward identifying whether teachers' use of affective behavior (a level-2 classroom variable) was significantly related to students' MLS scores (the level-1 criterion variable). Model testing proceeded in three phases: an intercept-only model, a means-as-outcomes model, and a random-regression-coefficient model. The intercept-only model revealed an intra-class correlation coefficient of .224 with an estimated reliability of .726, suggesting that only 22.4% of the variance in MLS scores is between classes and the remaining 77.6% is at the student level. Due to the significant between-class variance in MLS scores, χ²(62.756, p < .0001), teachers' TAB scores were added as a level-2 predictor. The regression coefficient was non-significant (p > .05). Therefore, the teachers' self-reported use of affective behaviors was not a significant predictor of students' motivation to learn science.
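
    As a small worked check of the variance partition reported above: the intraclass correlation from the intercept-only model is the between-classroom variance over the total variance. A minimal sketch, where tau00 and sigma2 are the standard HLM variance components (values below are hypothetical, chosen to match the reported ratio):

    ```python
    def icc(tau00, sigma2):
        """tau00: between-classroom variance; sigma2: student-level residual."""
        return tau00 / (tau00 + sigma2)

    # Any components in a 0.224 : 0.776 ratio reproduce the reported ICC.
    print(icc(0.224, 0.776))   # -> 0.224
    ```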

  3. Abstraction and model evaluation in category learning.

    PubMed

    Vanpaemel, Wolf; Storms, Gert

    2010-05-01

    Thirty previously published data sets, from seminal category learning tasks, are reanalyzed using the varying abstraction model (VAM). Unlike a prototype-versus-exemplar analysis, which focuses on extreme levels of abstraction only, a VAM analysis also considers the possibility of partial abstraction. Whereas most data sets support no abstraction when only the extreme possibilities are considered, we show that evidence for abstraction can be provided using the broader view on abstraction provided by the VAM. The present results generalize earlier demonstrations of partial abstraction (Vanpaemel & Storms, 2008), in which only a small number of data sets was analyzed. Following the dominant modus operandi in category learning research, Vanpaemel and Storms evaluated the models on their best fit, a practice known to ignore the complexity of the models under consideration. In the present study, in contrast, model evaluation not only relies on the maximal likelihood, but also on the marginal likelihood, which is sensitive to model complexity. Finally, using a large recovery study, it is demonstrated that, across the 30 data sets, complexity differences between the models in the VAM family are small. This indicates that a (computationally challenging) complexity-sensitive model evaluation method is uncalled for, and that the use of a (computationally straightforward) complexity-insensitive model evaluation method is justified.

  4. Adaptive Immunity Restricts Replication of Novel Murine Astroviruses

    PubMed Central

    Yokoyama, Christine C.; Loh, Joy; Zhao, Guoyan; Stappenbeck, Thaddeus S.; Wang, David; Huang, Henry V.

    2012-01-01

    The mechanisms of astrovirus pathogenesis are largely unknown, in part due to a lack of a small-animal model of disease. Using shotgun sequencing and a custom analysis pipeline, we identified two novel astroviruses capable of infecting research mice, murine astrovirus (MuAstV) STL1 and STL2. Subsequent analysis revealed the presence of at least two additional viruses (MuAstV STL3 and STL4), suggestive of a diverse population of murine astroviruses in research mice. Complete genomic characterization and subsequent phylogenetic analysis showed that MuAstV STL1 to STL4 are members of the mamastrovirus genus and are likely members of a new mamastrovirus genogroup. Using Rag1−/− mice deficient in B and T cells, we demonstrate that adaptive immunity is required to control MuAstV infection. Furthermore, using Stat1−/− mice deficient in innate signaling, we demonstrate a role for the innate immune response in the control of MuAstV replication. Our results demonstrate that MuAstV STL permits the study of the mechanisms of astrovirus infection and host-pathogen interactions in a genetically manipulable small-animal model. Finally, we detected MuAstV in commercially available mice, suggesting that these viruses may be present in academic and commercial research mouse facilities, with possible implications for interpretation of data generated in current mouse models of disease. PMID:22951832

  5. Analysis of reversed torsion of FCC metals using polycrystal plasticity models

    DOE PAGES

    Guo, Xiao Qian; Wang, Huamiao; Wu, Pei Dong; ...

    2015-06-19

    The large strain behavior of FCC polycrystals during reversed torsion is investigated through a special-purpose finite element approach based on the classical Taylor model and on the elastic-viscoplastic self-consistent (EVPSC) model with various self-consistent schemes (SCSs). It is found that the response in both fixed-end and free-end torsion is very sensitive to the constitutive model. The models are assessed by comparing their predictions to the corresponding experiments in terms of the stress and strain curves, the Swift effect and texture evolution. It is demonstrated that none of the models examined can precisely predict all the experimental results. However, more careful observation reveals that, among the models considered, the tangent model gives the worst overall performance. It is also demonstrated that the intensity of the residual texture during reverse twisting depends on the amount of pre-shear strain during forward twisting and on the model used.

  6. The Fifth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Fifth Annual Thermal and Fluids Analysis Workshop was held at the Ohio Aerospace Institute, Brook Park, Ohio, cosponsored by NASA Lewis Research Center and the Ohio Aerospace Institute, 16-20 Aug. 1993. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluid analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  7. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are nowadays under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for efficient communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the computational modelling and data analysis centers existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, and 3) continuously developing and rapidly upgraded computational and data analysis service to support and intensify European planetary research. At the beginning EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for distant interactive communication between planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Provision of demonstrators of the coordinated use of high performance computing facilities (super-computer networks), done in cooperation with the European HPC Grid DEISA.

  8. Computer assisted spirometry.

    PubMed

    Hansen, D J; Toy, V M; Deininger, R A; Collopy, T K

    1983-06-01

    Three of the most popular microcomputers, the TRS-80 Model I, the APPLE II+, and the IBM Personal Computer, were connected to a spirometer for data acquisition and analysis. Simple programs were written which allow the collection, analysis and storage of the data produced during spirometry. Three examples demonstrate the relative ease of automating spirometers.

  9. Testing Specific Hypotheses Concerning Latent Group Differences in Multi-group Covariance Structure Analysis with Structured Means.

    ERIC Educational Resources Information Center

    Dolan, Conor V.; Molenaar, Peter C. M.

    1994-01-01

    In multigroup covariance structure analysis with structured means, the traditional latent selection model is formulated as a special case of phenotypic selection. Illustrations with real and simulated data demonstrate how one can test specific hypotheses concerning selection on latent variables. (SLD)

  10. Assessing the Impact of Financial Aid Offers on Enrollment Decisions.

    ERIC Educational Resources Information Center

    Somers, Patricia A.; St. John, Edward P.

    1993-01-01

    A study tested a model for assessing the impact of financial aid offers on 2,558 accepted students' college enrollment decisions. The analysis demonstrates that financial aid strategies have a substantial influence on enrollment and the systematic analysis of student enrollment decisions can help institutional administrators refine their financing…

  11. Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Hofmann, Rich

    1995-01-01

    Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)

  12. The design and development of a two-dimensional adaptive truss structure

    NASA Technical Reports Server (NTRS)

    Kuwao, Fumihiro; Motohashi, Shoichi; Yoshihara, Makoto; Takahara, Kenichi; Natori, Michihiro

    1987-01-01

    A functional model of a two-dimensional adaptive truss structure that can purposefully change its geometrical configuration is introduced. The details of design and fabrication, such as kinematic analysis and dynamic characteristics analysis, are presented together with some test results as a demonstration of this two-dimensional truss concept.

  13. Chemical control of rate and onset temperature of nadimide polymerization

    NASA Technical Reports Server (NTRS)

    Lauver, R. W.

    1985-01-01

    The chemistry of norbornenyl capped imide compounds (nadimides) is briefly reviewed with emphasis on the contribution of Diels-Alder reversion in controlling the rate and onset of the thermal polymerization reaction. Control of onset temperature of the cure exotherm by adjusting the concentration of maleimide is demonstrated using selected model compounds. The effects of nitrophenyl compounds as free radical retarders on nadimide reactivity are discussed. A simple copolymerization model is proposed for the overall nadimide cure reaction. An approximate numerical analysis is carried out to demonstrate the ability of the model to simulate the trends observed for both maleimide and nitrophenyl additions.

  14. Verification of an Analytical Method for Measuring Crystal Nucleation Rates in Glasses from DTA Data

    NASA Technical Reports Server (NTRS)

    Ranasinghe, K. S.; Wei, P. F.; Kelton, K. F.; Ray, C. S.; Day, D. E.

    2004-01-01

    A recently proposed analytical (DTA) method for estimating the nucleation rates in glasses has been evaluated by comparing experimental data with numerically computed nucleation rates for a model lithium disilicate glass. The time- and temperature-dependent nucleation rates were predicted using the model and compared with those values from an analysis of numerically calculated DTA curves. The validity of the numerical approach was demonstrated earlier by a comparison with experimental data. The excellent agreement between the nucleation rates from the model calculations and from the computer-generated DTA data demonstrates the validity of the proposed analytical DTA method.

  15. Controlled release of vancomycin from thin sol-gel films on implant surfaces successfully controls osteomyelitis.

    PubMed

    Adams, Christopher S; Antoci, Valentin; Harrison, Gerald; Patal, Payal; Freeman, Terry A; Shapiro, Irving M; Parvizi, Javad; Hickok, Noreen J; Radin, Shula; Ducheyne, Paul

    2009-06-01

    Peri-prosthetic infection remains a serious complication of joint replacement surgery. Herein, we demonstrate that a vancomycin-containing sol-gel film on Ti alloy rods can successfully treat bacterial infections in an animal model. The vancomycin-containing sol-gel films exhibited predictable release kinetics, while significantly inhibiting S. aureus adhesion. When evaluated in a rat osteomyelitis model, microbiological analysis indicated that the vancomycin-containing sol-gel film caused a profound decrease in S. aureus number. Radiologically, while the control side showed extensive bone degradation, including abscesses and an extensive periosteal reaction, rods coated with the vancomycin-containing sol-gel film resulted in minimal signs of infection. MicroCT analysis confirmed the radiological results, while demonstrating that the vancomycin-containing sol-gel film significantly protected dense bone from resorption and minimized remodeling. These results clearly demonstrate that this novel thin sol-gel technology can be used for the targeted delivery of antibiotics for the treatment of periprosthetic as well as other bone infections. Copyright 2008 Orthopaedic Research Society

  16. A local structure model for network analysis

    DOE PAGES

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    2017-04-01

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.
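
    A minimal sketch of the full-conditional idea behind an LSGM, assuming a simple neighbourhood rule (edges sharing an endpoint) and logistic conditionals; the parameters and neighbourhood definition are illustrative, not the authors' specification. Gibbs sampling over edges then simulates a network from these local conditionals.

    ```python
    import numpy as np

    def gibbs_lsgm(n_nodes, beta0=-1.0, beta1=0.5, sweeps=100, seed=0):
        """Simulate an adjacency matrix from edge-level full conditionals."""
        rng = np.random.default_rng(seed)
        A = rng.integers(0, 2, size=(n_nodes, n_nodes))
        A = np.triu(A, 1)
        A = A + A.T                                    # symmetric start
        for _ in range(sweeps):
            for i in range(n_nodes):
                for j in range(i + 1, n_nodes):
                    # neighbourhood statistic: edges sharing an endpoint with (i, j)
                    shared = A[i].sum() + A[j].sum() - 2 * A[i, j]
                    p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * shared)))
                    A[i, j] = A[j, i] = rng.random() < p
        return A
    ```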

  17. A local structure model for network analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.

  18. Modeling the Relationships between Subdimensions of Environmental Literacy

    ERIC Educational Resources Information Center

    Genc, Murat; Akilli, Mustafa

    2016-01-01

    The aim of this study is to demonstrate the relationships between subdimensions of environmental literacy using Structural Equation Modeling (SEM). The study was conducted by analyzing students' questionnaire responses with SEM. Initially, Kaiser-Meyer-Olkin and Bartlett's tests were done to test the appropriateness of subdimensions to…

  19. An Integrated Ecological Modeling System for Assessing Impacts of Multiple Stressors on Stream and Riverine Ecosystem Services Within River Basins

    EPA Science Inventory

    We demonstrate a novel, spatially explicit assessment of the current condition of aquatic ecosystem services, with limited sensitivity analysis for the atmospheric contaminant mercury. The Integrated Ecological Modeling System (IEMS) forecasts water quality and quantity, habitat ...

  20. An Ideological Analysis of Digital Reference Service Models.

    ERIC Educational Resources Information Center

    Dilevko, Juris

    2001-01-01

    Looks at some of the new paradigms for reference service, in particular the ideological implications of the digital reference call-center model, demonstrates how they lead to a "deprofessionalization" of reference work, and provides examples of how extensive reading can help reference librarians provide better service and become an…

  1. Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Bales, Ben; Pollock, Tresa; Petzold, Linda

    2017-06-01

    Segmentation based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three-dimensions are demonstrated.
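
    A minimal sketch of the kind of segmentation-free descriptor the paper uses, computed here with scikit-image's histogram-of-oriented-gradients implementation on a synthetic two-phase image; the image construction and all parameter values are placeholders rather than the paper's settings.

      import numpy as np
      from skimage.feature import hog
      from skimage.filters import gaussian

      rng = np.random.default_rng(1)
      # Synthetic stand-in for a two-phase micrograph: smoothed noise thresholded
      # into "precipitate" and "matrix" intensity levels, plus detector-like noise.
      field = gaussian(rng.normal(size=(128, 128)), sigma=4)
      image = np.where(field > 0, 0.8, 0.2) + 0.05 * rng.normal(size=field.shape)

      # Histogram of oriented gradients: orientation statistics are accumulated
      # per cell without ever segmenting the image into phases.
      descriptor = hog(image, orientations=9, pixels_per_cell=(8, 8),
                       cells_per_block=(2, 2), feature_vector=True)
      print(descriptor.shape)   # one fixed-length shape signature per micrograph

    Because the descriptor is accumulated from local gradients, no thresholding or phase labeling is ever performed, which is the point of the segmentation-free approach.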

  2. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  3. A biological compression model and its applications.

    PubMed

    Cao, Minh Duc; Dix, Trevor I; Allison, Lloyd

    2011-01-01

    A biological compression model, the expert model, is presented which is superior to existing compression algorithms in both compression performance and speed. The model is able to compress whole eukaryotic genomes. Most importantly, the model provides a framework for knowledge discovery from biological data. It can be used for repeat element discovery, sequence alignment and phylogenetic analysis. We demonstrate that the model can handle statistically biased sequences and distantly related sequences where conventional knowledge discovery tools often fail.

  4. Critical evaluation of mechanistic two-phase flow pipeline and well simulation models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhulesia, H.; Lopez, D.

    1996-12-31

    Mechanistic steady state simulation models, rather than empirical correlations, are used for the design of multiphase production systems including wells, pipelines and downstream installations. Among the available models, PEPITE, WELLSIM, OLGA, TACITE and TUFFP are widely used for this purpose and, consequently, a critical evaluation of these models is needed. An extensive validation methodology is proposed which consists of two distinct steps: first, validate the hydrodynamic point model using test loop data; then, validate the overall simulation model using data from real pipelines and wells. The test loop databank used in this analysis contains about 5952 data sets originating from four different test loops, and a majority of these data were obtained at high pressures (up to 90 bars) with real hydrocarbon fluids. Before performing the model evaluation, physical analysis of the test loop data is required to eliminate non-coherent data. The evaluation of these point models demonstrates that the TACITE and OLGA models can be applied to any configuration of pipes. The TACITE model performs better than the OLGA model because it uses the most appropriate closure laws from the literature, validated on a large number of data. The comparison of predicted and measured pressure drop for various real pipelines and wells demonstrates that the TACITE model is a reliable tool.

  5. Coupled electromechanical response of composite beams with embedded piezoelectric sensors and actuators

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.; Heyliger, P. R.

    1994-01-01

    Unified mechanics are developed with the capability to model both sensory and active composite laminates with embedded piezoelectric layers. A discrete-layer formulation enables analysis of both global and local electromechanical response. The mechanics include the contributions from elastic, piezoelectric, and dielectric components. The incorporation of electric potential into the state variables permits representation of general electromechanical boundary conditions. Approximate finite element solutions for the static and free-vibration analysis of beams are presented. Applications on composite beams demonstrate the capability to represent either sensory or active structures and to model the complicated stress-strain fields, the interactions between passive/active layers, interfacial phenomena between sensors and composite plies, and critical damage modes in the material. The capability to predict the dynamic characteristics under various electrical boundary conditions is also demonstrated.

  6. Sensitive Analysis of Protein Adsorption to Colloidal Gold by Differential Centrifugal Sedimentation

    PubMed Central

    2017-01-01

    It is demonstrated that the adsorption of bovine serum albumin (BSA) to aqueous gold colloids can be quantified with molecular resolution by differential centrifugal sedimentation (DCS). This method separates colloidal particles of comparable density by mass. When proteins adsorb to the nanoparticles, both their mass and their effective density change, which strongly affects the sedimentation time. A straightforward analysis allows quantification of the adsorbed layer. Most importantly, unlike many other methods, DCS can be used to detect chemisorbed proteins (“hard corona”) as well as physisorbed proteins (“soft corona”). The results for BSA on gold colloid nanoparticles can be modeled in terms of Langmuir-type adsorption isotherms (Hill model). The effects of surface modification with small thiol-PEG ligands on protein adsorption are also demonstrated. PMID:28513153
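
    As a sketch of the Langmuir/Hill-type modeling step (with made-up numbers, not the paper's DCS data), one can fit the Hill isotherm to adsorption measurements with SciPy:

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(c, n_max, k, h):
          # Hill-type Langmuir isotherm: bound protein vs. solution concentration.
          return n_max * c**h / (k**h + c**h)

      # Hypothetical adsorption data (BSA bound per particle vs. BSA concentration,
      # arbitrary units); the paper's actual DCS-derived values are not reproduced.
      c = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
      bound = np.array([8.0, 15.0, 27.0, 48.0, 62.0, 71.0, 76.0, 78.0])

      popt, pcov = curve_fit(hill, c, bound, p0=(80.0, 0.5, 1.0))
      n_max, k, h = popt
      print(f"capacity={n_max:.1f}, half-saturation={k:.2f}, Hill coefficient={h:.2f}")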

  7. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry.

    PubMed

    Tirion, Monique M; Ben-Avraham, Daniel

    2018-01-16

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  8. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry

    NASA Astrophysics Data System (ADS)

    Tirion, Monique M.; ben-Avraham, Daniel

    2018-03-01

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  9. A practical measure of workplace resilience: developing the resilience at work scale.

    PubMed

    Winwood, Peter C; Colon, Rochelle; McEwen, Kath

    2013-10-01

    The aim was to develop an effective measure of resilience at work for use in individual work-related performance and emotional distress contexts. Two separate cross-sectional studies were conducted: (1) an exploratory factor analysis of 45 items putatively underpinning workplace resilience among 397 participants and (2) a confirmatory factor analysis, among 194 participants, of the resilience measure derived from Study 1, demonstrating a credible model of interaction with performance outcome variables. A 20-item scale explaining 67% of variance, measuring seven aspects of workplace resilience that are teachable and capable of conscious development, was achieved. A credible model of relationships with work engagement, sleep, stress recovery, and physical health was demonstrated in the expected directions. The new scale shows considerable promise as a reliable instrument for use in the area of employee support and development.

  10. Focal activation of primary visual cortex following supra-choroidal electrical stimulation of the retina: Intrinsic signal imaging and linear model analysis.

    PubMed

    Cloherty, Shaun L; Hietanen, Markus A; Suaning, Gregg J; Ibbotson, Michael R

    2010-01-01

    We performed optical intrinsic signal imaging of cat primary visual cortex (Areas 17 and 18) while delivering bipolar electrical stimulation to the retina by way of a supra-choroidal electrode array. Using a general linear model (GLM) analysis we identified statistically significant (p < 0.01) activation in a localized region of cortex following supra-threshold electrical stimulation at a single retinal locus. These results (1) demonstrate that intrinsic signal imaging combined with linear model analysis provides a powerful tool for assessing cortical responses to prosthetic stimulation, and (2) confirm that supra-choroidal electrical stimulation can achieve localized activation of the cortex consistent with focal activation of the retina.
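
    A compact illustration of a per-pixel GLM analysis of an imaging time series, in the spirit of the study but with an entirely synthetic movie and a hypothetical stimulation regressor:

      import numpy as np

      rng = np.random.default_rng(2)
      T, H, W = 120, 16, 16                          # frames, image size (toy scale)
      stim = (np.arange(T) % 20 < 5).astype(float)   # hypothetical stimulation regressor

      # Synthetic intrinsic-signal movie: a small "activated" patch follows the
      # stimulus; everything else is noise.
      movie = 0.5 * rng.normal(size=(T, H, W))
      movie[:, 4:8, 4:8] += 0.4 * stim[:, None, None]

      X = np.column_stack([np.ones(T), stim])        # design matrix: intercept + stimulus
      Y = movie.reshape(T, -1)
      beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
      dof = T - X.shape[1]
      sigma2 = ((Y - X @ beta) ** 2).sum(axis=0) / dof
      se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
      tmap = (beta[1] / se).reshape(H, W)            # t-statistic for the stimulus effect
      print("peak |t|:", np.abs(tmap).max())

    Thresholding the resulting t-map (e.g., at the t-value corresponding to p < 0.01, corrected as appropriate) identifies significantly activated pixels.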

  11. Modeling potential future individual tree-species distributions in the eastern United States under a climate change scenario: a case study with Pinus virginiana

    Treesearch

    Louis R. Iverson; Anantha Prasad; Mark W. Schwartz; Mark W. Schwartz

    1999-01-01

    We are using a deterministic regression tree analysis model (DISTRIB) and a stochastic migration model (SHIFT) to examine potential distributions of ~66 individual species of eastern US trees under a 2 x CO2 climate change scenario. This process is demonstrated for Virginia pine (Pinus virginiana).

  12. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.

  13. Model analysis of the link between interest rates and crashes

    NASA Astrophysics Data System (ADS)

    Broga, Kristijonas M.; Viegas, Eduardo; Jensen, Henrik Jeldtoft

    2016-09-01

    We analyse the effect of distinct levels of interest rates on the stability of the financial network under our modelling framework. We demonstrate that banking failures are likely to emerge early on under sustained high interest rates, and at a much later stage, with higher probability, under a sustained low interest rate scenario. Moreover, we demonstrate that those bank failures are of a different nature: high interest rates tend to result in significantly more bankruptcies associated with credit losses, whereas lack of liquidity tends to be the primary cause of failures under lower rates.

  14. Using the weighted area under the net benefit curve for decision curve analysis.

    PubMed

    Talluri, Rajesh; Shete, Sanjay

    2016-07-18

    Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
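
    A sketch of the quantities involved, using simulated outcomes and a hypothetical threshold-probability density; the Gaussian weight function below stands in for the distribution that the authors estimate from the data themselves.

      import numpy as np

      def net_benefit(y, p, pt):
          # Net benefit of treating everyone whose predicted risk is >= pt.
          n = len(y)
          treat = p >= pt
          tp = np.sum(treat & (y == 1)) / n
          fp = np.sum(treat & (y == 0)) / n
          return tp - fp * pt / (1.0 - pt)

      rng = np.random.default_rng(3)
      y = rng.integers(0, 2, 500)                              # binary outcomes
      p = np.clip(0.5 * y + 0.3 * rng.random(500), 0.0, 0.99)  # toy risk model

      t = np.linspace(0.05, 0.5, 46)                           # threshold range of interest
      nb = np.array([net_benefit(y, p, ti) for ti in t])

      def area(f, x):                                          # trapezoidal rule
          return float(np.sum((f[1:] + f[:-1]) * np.diff(x)) / 2.0)

      # The plain area treats every threshold as equally relevant; the weighted
      # version integrates against an (assumed) density of patient thresholds.
      w = np.exp(-0.5 * ((t - 0.2) / 0.05) ** 2)               # hypothetical density
      w /= area(w, t)                                          # normalize to unit mass
      print("area under net benefit curve:", area(nb, t))
      print("weighted area               :", area(nb * w, t))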

  15. Latent Class Analysis of Incomplete Data via an Entropy-Based Criterion

    PubMed Central

    Larose, Chantal; Harel, Ofer; Kordas, Katarzyna; Dey, Dipak K.

    2016-01-01

    Latent class analysis is used to group categorical data into classes via a probability model. Model selection criteria then judge how well the model fits the data. When addressing incomplete data, the current methodology restricts the imputation to a single, pre-specified number of classes. We seek to develop an entropy-based model selection criterion that does not restrict the imputation to one number of clusters. Simulations show the new criterion performing well against the current standards of AIC and BIC, while a family studies application demonstrates how the criterion provides more detailed and useful results than AIC and BIC. PMID:27695391

  16. Modeling of power electronic systems with EMTP

    NASA Technical Reports Server (NTRS)

    Tam, Kwa-Sur; Dravid, Narayan V.

    1989-01-01

    In view of the potential impact of power electronics on power systems, there is a need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components and to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System with the ElectroMagnetic Transients Program (EMTP) is described, and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.

  17. Application of Interface Technology in Nonlinear Analysis of a Stitched/RFI Composite Wing Stub Box

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Ransom, Jonathan B.

    1997-01-01

    A recently developed interface technology was successfully employed in the geometrically nonlinear analysis of a full-scale stitched/RFI composite wing box loaded in bending. The technology allows mismatched finite element models to be joined in a variationally consistent manner and reduces the modeling complexity by eliminating transition meshing. In the analysis, local finite element models of nonlinearly deformed wide bays of the wing box are refined without the need for transition meshing to the surrounding coarse mesh. The COMET-AR finite element code, which has the interface technology capability, was used to perform the analyses. The COMET-AR analysis is compared to both a NASTRAN analysis and to experimental data. The interface technology solution is shown to be in good agreement with both. The viability of interface technology for coupled global/local analysis of large scale aircraft structures is demonstrated.

  18. Best (but oft-forgotten) practices: mediation analysis.

    PubMed

    Fairchild, Amanda J; McDaniel, Heather L

    2017-06-01

    This contribution in the "Best (but Oft-Forgotten) Practices" series considers mediation analysis. A mediator (sometimes referred to as an intermediate variable, surrogate endpoint, or intermediate endpoint) is a third variable that explains how or why ≥2 other variables relate in a putative causal pathway. The current article discusses mediation analysis with the ultimate intention of helping nutrition researchers to clarify the rationale for examining mediation, avoid common pitfalls when using the model, and conduct well-informed analyses that can contribute to improving causal inference in evaluations of underlying mechanisms of effects on nutrition-related behavioral and health outcomes. We give specific attention to underevaluated limitations inherent in common approaches to mediation. In addition, we discuss how to conduct a power analysis for mediation models and offer an applied example to demonstrate mediation analysis. Finally, we provide an example write-up of mediation analysis results as a model for applied researchers. © 2017 American Society for Nutrition.

  19. Best (but oft-forgotten) practices: mediation analysis

    PubMed Central

    Fairchild, Amanda J; McDaniel, Heather L

    2017-01-01

    This contribution in the “Best (but Oft-Forgotten) Practices” series considers mediation analysis. A mediator (sometimes referred to as an intermediate variable, surrogate endpoint, or intermediate endpoint) is a third variable that explains how or why ≥2 other variables relate in a putative causal pathway. The current article discusses mediation analysis with the ultimate intention of helping nutrition researchers to clarify the rationale for examining mediation, avoid common pitfalls when using the model, and conduct well-informed analyses that can contribute to improving causal inference in evaluations of underlying mechanisms of effects on nutrition-related behavioral and health outcomes. We give specific attention to underevaluated limitations inherent in common approaches to mediation. In addition, we discuss how to conduct a power analysis for mediation models and offer an applied example to demonstrate mediation analysis. Finally, we provide an example write-up of mediation analysis results as a model for applied researchers. PMID:28446497
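
    For readers who want a concrete starting point, the following sketch computes the classic a*b indirect effect with a percentile bootstrap on simulated data; it is a generic illustration of mediation analysis, not the authors' example or write-up.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 300
      x = rng.normal(size=n)                        # e.g., an intervention/exposure
      m = 0.5 * x + rng.normal(size=n)              # mediator
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome

      def indirect(x, m, y):
          # a-path: regress M on X; b-path: regress Y on M controlling for X.
          a = np.polyfit(x, m, 1)[0]
          X = np.column_stack([np.ones_like(x), m, x])
          b = np.linalg.lstsq(X, y, rcond=None)[0][1]
          return a * b

      boots = []
      for _ in range(2000):                         # percentile bootstrap of a*b
          idx = rng.integers(0, n, n)
          boots.append(indirect(x[idx], m[idx], y[idx]))
      lo, hi = np.percentile(boots, [2.5, 97.5])
      print(f"indirect effect = {indirect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")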

  20. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
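
    The core of such a model is straightforward to sketch: draw the branch events of the tree for many simulated ascents and tally the end states. The failure and abort-success probabilities below are placeholders for illustration only, not NASA estimates (echoing the abstract's own caveat).

      import numpy as np

      rng = np.random.default_rng(4)
      N = 1_000_000                 # Monte Carlo trials of a single ascent

      # Hypothetical per-flight failure probabilities for propulsion elements only.
      p_ssme_fail = 0.005           # any main-engine failure during ascent
      p_srb_fail  = 0.001           # solid booster failure (assumed non-survivable)
      p_abort_ok  = 0.90            # probability an intact abort completes successfully

      srb_fail  = rng.random(N) < p_srb_fail
      ssme_fail = ~srb_fail & (rng.random(N) < p_ssme_fail)
      abort_ok  = rng.random(N) < p_abort_ok

      nominal = ~srb_fail & ~ssme_fail
      saved   = ssme_fail & abort_ok
      lost    = srb_fail | (ssme_fail & ~abort_ok)

      print("nominal ascent  :", nominal.mean())
      print("successful abort:", saved.mean())
      print("loss of vehicle :", lost.mean())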

  1. A pilot modeling technique for handling-qualities research

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  2. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  3. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  4. A Bayesian Multinomial Probit Model for the Analysis of Panel Choice Data.

    PubMed

    Fong, Duncan K H; Kim, Sunghoon; Chen, Zhe; DeSarbo, Wayne S

    2016-03-01

    A new Bayesian multinomial probit model is proposed for the analysis of panel choice data. Using a parameter expansion technique, we are able to devise a Markov Chain Monte Carlo algorithm to compute our Bayesian estimates efficiently. We also show that the proposed procedure enables the estimation of individual level coefficients for the single-period multinomial probit model even when the available prior information is vague. We apply our new procedure to consumer purchase data and reanalyze a well-known scanner panel dataset that reveals new substantive insights. In addition, we delineate a number of advantageous features of our proposed procedure over several benchmark models. Finally, through a simulation analysis employing a fractional factorial design, we demonstrate that the results from our proposed model are quite robust with respect to differing factors across various conditions.

  5. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Jun

    2012-10-01

    The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; here, the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.

  6. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses maximum likelihood estimation to obtain parameter estimates, under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
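
    A minimal sketch of the statistical core of such a tool: maximum likelihood fitting of the linear-quadratic model to Poisson-distributed colony counts. The data and starting values are invented, and RAD-ADAPT itself is R/ADAPT-based, so this Python version is only an illustration of the method.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical clonogenic data: cells plated and colonies counted per dose (Gy).
      dose     = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
      plated   = np.array([100, 100, 200, 500, 2000, 10000])
      colonies = np.array([55, 40, 52, 58, 48, 35])

      def negloglik(params):
          # Poisson negative log-likelihood (up to an additive constant)
          # for the linear-quadratic survival model.
          log_pe, alpha, beta = params          # pe: plating efficiency
          mu = plated * np.exp(log_pe) * np.exp(-(alpha * dose + beta * dose**2))
          return np.sum(mu - colonies * np.log(mu))

      fit = minimize(negloglik, x0=(np.log(0.5), 0.3, 0.03), method="Nelder-Mead")
      log_pe, alpha, beta = fit.x
      print(f"alpha={alpha:.3f} /Gy, beta={beta:.4f} /Gy^2, PE={np.exp(log_pe):.2f}")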

  7. The design, analysis, and testing of a low-budget wind-tunnel flutter model with active aerodynamic controls

    NASA Technical Reports Server (NTRS)

    Bolding, R. M.; Stearman, R. O.

    1976-01-01

    A low-budget flutter model incorporating active aerodynamic controls for flutter suppression studies was designed as both an educational and research tool to study the interfering lifting surface flutter phenomenon in the form of a swept wing-tail configuration. A flutter suppression mechanism was demonstrated on a simple semirigid three-degree-of-freedom flutter model of this configuration employing an active stabilator control, and was then verified analytically using a doublet lattice lifting surface code and the model's measured mass, mode shapes, and frequencies in a flutter analysis. Preliminary studies were sufficiently encouraging to extend the analysis to the larger-degree-of-freedom AFFDL wing-tail flutter model, where additional analytical flutter suppression studies indicated significant gains in flutter margins could be achieved. The analytical and experimental design of a flutter suppression system for the AFFDL model is presented along with the results of a preliminary passive flutter test.

  8. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model - and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.

  9. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    DOE PAGES

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    2017-06-19

    Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model - and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.
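
    To make the PCA idea concrete, the following self-contained sketch generates 2D Ising configurations with a crude Metropolis sampler at one temperature below and one above the critical point, and shows that the leading principal component (which tracks magnetization) separates the two phases; the lattice size, temperatures and sweep counts are illustrative only.

      import numpy as np

      rng = np.random.default_rng(5)
      L = 16

      def metropolis_samples(T, n_samples, burn=300, thin=5):
          # Crude single-flip Metropolis sampler for the 2D Ising model.
          s = rng.choice([-1, 1], size=(L, L))
          def sweep():
              for _ in range(L * L):
                  i, j = rng.integers(L, size=2)
                  nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
                  dE = 2 * s[i, j] * nb
                  if dE <= 0 or rng.random() < np.exp(-dE / T):
                      s[i, j] = -s[i, j]
          out = []
          for _ in range(burn):
              sweep()
          for _ in range(n_samples):
              for _ in range(thin):
                  sweep()
              out.append(s.flatten().astype(float).copy())
          return out

      low = metropolis_samples(T=1.5, n_samples=150)    # ordered phase (T < Tc ~ 2.27)
      high = metropolis_samples(T=3.5, n_samples=150)   # disordered phase
      data = np.array(low + high)
      data -= data.mean(axis=0)                         # PCA on raw, centred spins
      _, _, vt = np.linalg.svd(data, full_matrices=False)
      pc1 = data @ vt[0]
      print("mean PC1, low T :", pc1[:150].mean())
      print("mean PC1, high T:", pc1[150:].mean())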

  10. Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)

    NASA Astrophysics Data System (ADS)

    Kasibhatla, P.

    2004-12-01

    In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
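
    A stripped-down illustration of the approach (not the TransCom3 setup): random-walk Metropolis sampling of source strengths in a linear tracer-transport model, where a positivity-enforcing lognormal prior is exactly the kind of non-Gaussian ingredient that has no closed-form posterior. The transport matrix, noise level and prior are all invented.

      import numpy as np

      rng = np.random.default_rng(6)
      n_obs, n_src = 30, 3
      H = rng.random((n_obs, n_src))                 # toy transport (Jacobian) matrix
      x_true = np.array([2.0, 0.5, 1.5])             # hypothetical source strengths
      y = H @ x_true + 0.1 * rng.normal(size=n_obs)  # synthetic observations

      def log_post(x):
          if np.any(x <= 0):
              return -np.inf                         # positivity: non-Gaussian prior
          resid = y - H @ x
          log_lik = -0.5 * np.sum(resid**2) / 0.1**2
          log_prior = -0.5 * np.sum(np.log(x)**2)    # lognormal prior, median 1
          return log_lik + log_prior

      x = np.ones(n_src)
      lp = log_post(x)
      chain = []
      for it in range(20000):                        # random-walk Metropolis
          prop = x + 0.05 * rng.normal(size=n_src)
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:
              x, lp = prop, lp_prop
          chain.append(x.copy())
      post = np.array(chain[5000:])                  # discard burn-in
      print("posterior means:", post.mean(axis=0), " truth:", x_true)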

  11. Structural analysis consultation using artificial intelligence

    NASA Technical Reports Server (NTRS)

    Melosh, R. J.; Marcal, P. V.; Berke, L.

    1978-01-01

    The primary goal of consultation is definition of the best strategy to deal with a structural engineering analysis objective. The knowledge base built to meet this need is designed to identify the type of numerical analysis, the needed modeling detail, and the specific analysis data required. Decisions are constructed on the basis of the data in the knowledge base - material behavior, relations between geometry and structural behavior, measures of the importance of time and temperature changes - and user-supplied specifics: characteristics of the spectrum of analysis types, the relation between accuracy and model detail, and details of the structure, its mechanical loadings, and its temperature states. Existing software demonstrated the feasibility of the approach, encompassing the 36 analysis classes spanning nonlinear, temperature-affected, incremental analyses which track the behavior of structural systems.

  12. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  13. ISPAN (Interactive Stiffened Panel Analysis): A tool for quick concept evaluation and design trade studies

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.

    1993-01-01

    Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy-to-use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by users who are not finite element specialists. The results of a parametric study of a blade-stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. In addition, a nonlinear analysis of a test panel was conducted and the results compared to measured data and a previous correlation analysis.

  14. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  15. Algorithmic detectability threshold of the stochastic block model

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.

  16. COI Structural Analysis Presentation

    NASA Technical Reports Server (NTRS)

    Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2001-01-01

    This report discusses the structural analysis of the Next Generation Space Telescope Mirror System Demonstrator (NMSD) developed by Composite Optics Incorporated (COI) in support of the Next Generation Space Telescope (NGST) project. The mirror was submitted to Marshall Space Flight Center (MSFC) for cryogenic testing and evaluation. Once at MSFC, the mirror was cooled to approximately 40 K and the optical surface distortions were measured. Alongside this experiment, an analytical model was developed and used for comparison to the test results. A NASTRAN finite element model was provided by COI and a thermal model was developed from it. Using the thermal model, steady-state nodal temperatures were calculated based on the predicted environment of the large cryogenic test chamber at MSFC. This temperature distribution was applied in the structural analysis to solve for the deflections of the optical surface. Finally, these deflections were submitted for optical analysis and comparison to the interferometer test data.

  17. Comparing GWAS Results of Complex Traits Using Full Genetic Model and Additive Models for Revealing Genetic Architecture

    PubMed Central

    Monir, Md. Mamun; Zhu, Jun

    2017-01-01

    Most of the genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full genetic model and additive models, which illustrate the impact of ignoring these genetic effects on analysis results and demonstrate how genetic effects of multiple loci can differ across ethnic groups. The full model identified 15 quantitative trait loci (13 individual loci and 3 pairs of epistatic loci), whereas the multi-locus additive model identified only 14 loci (9 common loci and 5 different loci). Moreover, 4 loci detected by the full model were not detected using the multi-locus additive model. PLINK analysis identified two loci, and GCTA analysis detected only one locus, with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some of the new genes are related to cholesterol-associated chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive models in terms of detection power and unbiased estimation of genetic effects for complex traits. PMID:28079101
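
    The difference between the additive and full coding can be sketched in a few lines: simulate a variant with a genuine dominance effect and compare an additive-only regression with one that also carries a heterozygote indicator (a simplified stand-in for the paper's full genetic model; all effect sizes are invented).

      import numpy as np

      rng = np.random.default_rng(8)
      n = 2000
      g = rng.integers(0, 3, n)                    # genotypes coded 0/1/2
      add = g - 1.0                                # additive score
      dom = (g == 1).astype(float)                 # heterozygote (dominance) indicator
      y = 0.15 * add + 0.20 * dom + rng.normal(size=n)   # trait with real dominance

      def rss(X, y):
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return np.sum((y - X @ beta) ** 2)

      ones = np.ones(n)
      rss_add  = rss(np.column_stack([ones, add]), y)
      rss_full = rss(np.column_stack([ones, add, dom]), y)
      # F-test for the dominance term that the additive model ignores;
      # compare to F(1, n-3), whose 1% critical value is about 6.6.
      F = (rss_add - rss_full) / (rss_full / (n - 3))
      print("F statistic for dominance:", F)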

  18. Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2005-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed by other related efforts.

  19. On the road to metallic nanoparticles by rational design: bridging the gap between atomic-level theoretical modeling and reality by total scattering experiments

    NASA Astrophysics Data System (ADS)

    Prasai, Binay; Wilson, A. R.; Wiley, B. J.; Ren, Y.; Petkov, Valeri

    2015-10-01

    The extent to which current theoretical modeling alone can reveal real-world metallic nanoparticles (NPs) at the atomic level was scrutinized and demonstrated to be insufficient, and how it can be improved by using a pragmatic approach involving straightforward experiments is shown. In particular, 4 to 6 nm in size silica-supported Au100-xPdx (x = 30, 46 and 58) explored for catalytic applications is characterized structurally by total scattering experiments including high-energy synchrotron X-ray diffraction (XRD) coupled to atomic pair distribution function (PDF) analysis. Atomic-level models for the NPs are built by molecular dynamics simulations based on the Sutton-Chen (SC) method, archetypal for current theoretical modeling. Models are matched against independent experimental data and are demonstrated to be inaccurate unless their theoretical foundation, i.e. the SC method, is supplemented with basic yet crucial information on the length and strength of metal-to-metal bonds and, when necessary, structural disorder in the actual NPs studied. An atomic PDF-based approach for accessing such information and implementing it in theoretical modeling is put forward. For completeness, the approach is concisely demonstrated on 15 nm in size water-dispersed Au particles explored for bio-medical applications and 16 nm in size hexane-dispersed Fe48Pd52 particles explored for magnetic applications as well. It is argued that when ``tuned up'' against experiments relevant to metals and alloys confined to nanoscale dimensions, such as total scattering coupled to atomic PDF analysis, rather than by mere intuition and/or against data for the respective solids, atomic-level theoretical modeling can provide a sound understanding of the synthesis-structure-property relationships in real-world metallic NPs. Ultimately this can help advance nanoscience and technology a step closer to producing metallic NPs by rational design.

  20. A Method for Evaluating the Safety Impacts of Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles

    1998-01-01

    This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.

  1. The Big-Five factor structure as an integrative framework: an analysis of Clarke's AVA model.

    PubMed

    Goldberg, L R; Sweeney, D; Merenda, P F; Hughes, J E

    1996-06-01

    Using a large (N = 3,629) sample of participants selected to be representative of U.S. working adults in the year 2000, we provide links between the constructs in 2 personality models that have been derived from quite different rationales. We demonstrate the use of a novel procedure for providing orthogonal Big-Five factor scores and use those scores to analyze the scales of the Activity Vector Analysis (AVA). We discuss the implications of our many findings both for the science of personality assessment and for future research using the AVA model.

  2. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  3. Guidance for the utility of linear models in meta-analysis of genetic association studies of binary phenotypes.

    PubMed

    Cook, James P; Mahajan, Anubha; Morris, Andrew P

    2017-02-01

    Linear mixed models are increasingly used for the analysis of genome-wide association studies (GWAS) of binary phenotypes because they can efficiently and robustly account for population stratification and relatedness through inclusion of random effects for a genetic relationship matrix. However, the utility of linear (mixed) models in the context of meta-analysis of GWAS of binary phenotypes has not been previously explored. In this investigation, we present simulations to compare the performance of linear and logistic regression models under alternative weighting schemes in a fixed-effects meta-analysis framework, considering designs that incorporate variable case-control imbalance, confounding factors and population stratification. Our results demonstrate that linear models can be used for meta-analysis of GWAS of binary phenotypes, without loss of power, even in the presence of extreme case-control imbalance, provided that one of the following schemes is used: (i) effective sample size weighting of Z-scores or (ii) inverse-variance weighting of allelic effect sizes after conversion onto the log-odds scale. Our conclusions thus provide essential recommendations for the development of robust protocols for meta-analysis of binary phenotypes with linear models.
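
    Scheme (i) can be sketched in a few lines; the per-study Z-scores and case/control counts below are invented, and the effective sample size uses the standard 4/(1/N_cases + 1/N_controls) approximation.

      import numpy as np

      # Hypothetical per-study GWAS summary statistics for one variant.
      z     = np.array([2.1, 1.4, 2.8, 0.9])           # association Z-scores
      cases = np.array([900, 150, 4000, 60])
      ctrls = np.array([1100, 5000, 4500, 9000])

      # Scheme (i): weight Z-scores by the square root of the effective sample
      # size, which corrects for case-control imbalance in linear-model GWAS.
      n_eff = 4.0 / (1.0 / cases + 1.0 / ctrls)
      w = np.sqrt(n_eff)
      z_meta = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
      print("effective-N weighted meta-analysis Z:", z_meta)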

  4. Air Force Reusable Booster System A Quick-look, Design Focused Modeling and Cost Analysis Study

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2011-01-01

    This presentation describes work supporting the Air Force Reusable Booster System (RBS) cost study. Its goals were to: support US launch system decision makers, especially with regard to the research, technology, and demonstration investments required for reusable systems to succeed; encourage operable directions in reusable booster and launch vehicle technology choices, system design, and product and process development; and perform a quick-look cost study while developing a cost model for more refined future analysis.

  5. Material characteristics and equivalent circuit models of stacked graphene oxide for capacitive humidity sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Kook In; Lee, In Gyu; Hwang, Wan Sik

    The oxidation properties of graphene oxide (GO) are systematically correlated with its chemical sensing properties. Based on an impedance analysis, equivalent circuit models of the capacitive sensors are established, and it is demonstrated that capacitive operation is related to the degree of oxidation. This is also confirmed by X-ray diffraction and Raman analysis. Finally, highly sensitive stacked GO sensors are shown to detect humidity in capacitive mode, which can be useful in various applications requiring low power consumption.
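
    As a sketch of what an equivalent-circuit description means in practice, the impedance of the simplest candidate element, a resistor in parallel with a capacitor, can be evaluated across frequency; the component values are illustrative, not the paper's fitted ones.

      import numpy as np

      # Hypothetical equivalent circuit for one GO film: resistor R in parallel
      # with capacitor C (values invented for illustration).
      R, C = 1e6, 5e-9                        # ohms, farads
      f = np.logspace(1, 6, 6)                # 10 Hz to 1 MHz
      w = 2 * np.pi * f
      Z = R / (1 + 1j * w * R * C)            # parallel R-C impedance

      for fi, zi in zip(f, Z):
          print(f"{fi:10.0f} Hz  |Z|={abs(zi):12.1f} ohm  "
                f"phase={np.degrees(np.angle(zi)):6.1f} deg")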

  6. Bayesian analysis of Jolly-Seber type models

    USGS Publications Warehouse

    Matechou, Eleni; Nicholls, Geoff K.; Morgan, Byron J. T.; Collazo, Jaime A.; Lyons, James E.

    2016-01-01

    We propose the use of finite mixtures of continuous distributions in modelling the process by which new individuals, that arrive in groups, become part of a wildlife population. We demonstrate this approach using a data set of migrating semipalmated sandpipers (Calidris pusilla), for which we extend existing stopover models to allow individuals to differ in their stopover duration at the site. We demonstrate the use of reversible jump MCMC methods to derive posterior distributions for the model parameters and the models simultaneously. The algorithm moves between models with different numbers of arrival groups as well as between models with different numbers of behavioural groups. The approach is shown to provide new ecological insights about the stopover behaviour of semipalmated sandpipers but is generally applicable to any population in which animals arrive in groups and potentially exhibit heterogeneity in terms of one or more other processes.

  7. Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaodong; Vesselinov, Velimir Valentinov

    We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.

  8. Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus

    DOE PAGES

    Zhang, Xiaodong; Vesselinov, Velimir Valentinov

    2016-12-28

    We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.

  9. Confirmatory factor analysis and measurement invariance of the Child Feeding Questionnaire in low-income Hispanic and African-American mothers with preschool-age children.

    PubMed

    Kong, Angela; Vijayasiri, Ganga; Fitzgibbon, Marian L; Schiffer, Linda A; Campbell, Richard T

    2015-07-01

    Validation work of the Child Feeding Questionnaire (CFQ) in low-income minority samples suggests a need for further conceptual refinement of this instrument. Using confirmatory factor analysis, this study evaluated 5- and 6-factor models on a large sample of African-American and Hispanic mothers with preschool-age children (n = 962). The 5-factor model included: 'perceived responsibility', 'concern about child's weight', 'restriction', 'pressure to eat', and 'monitoring' and the 6-factor model also tested 'food as a reward'. Multi-group analysis assessed measurement invariance by race/ethnicity. In the 5-factor model, two low-loading items from 'restriction' and one low-variance item from 'perceived responsibility' were dropped to achieve fit. Only removal of the low-variance item was needed to achieve fit in the 6-factor model. Invariance analyses demonstrated differences in factor loadings. This finding suggests African-American and Hispanic mothers may vary in their interpretation of some CFQ items and use of cognitive interviews could enhance item interpretation. Our results also demonstrated that 'food as a reward' is a plausible construct among a low-income minority sample and adds to the evidence that this factor resonates conceptually with parents of preschoolers; however, further testing is needed to determine the validity of this factor with older age groups.

  10. An Economic Analysis of the Demand for Scientific Journals

    ERIC Educational Resources Information Center

    Berg, Sanford V.

    1972-01-01

    The purpose of this study is to demonstrate that economic analysis can be useful in modeling the scientific journal market. Of particular interest is the efficiency of pricing and page policies. To calculate losses due to inefficiencies, demand parameters are statistically estimated and used in a discussion of market efficiency. (3 references)…

  11. The Three Rs of Education Finance Reform: Re-Thinking, Re-Tooling, and Re-Evaluating School-Site Information.

    ERIC Educational Resources Information Center

    Speakman, Sheree T.; And Others

    1997-01-01

    Examines the need for new financial reporting and analysis, starting with rethinking the school finance field, retooling the management information systems for school finance, and re-evaluating knowledge about school-site management, accounting, and reporting. Demonstrates a new reporting methodology, the Financial Analysis Model, that traces…

  12. An Approximation of an Instructional Model for Developing Home Living Skills in Severely Handicapped Students.

    ERIC Educational Resources Information Center

    Hamre, S.

    The author discusses the need for severely handicapped students to acquire basic home living skills, reviews task analysis principles, and provides sample instructional programs. Listed are basic grooming, dressing, domestic maintenance, and cooking skills. A sample task analysis procedure is demonstrated for the skill of brushing teeth. Reported…

  13. The Shock and Vibration Digest. Volume 16, Number 3

    DTIC Science & Technology

    1984-03-01

    [Extraction-garbled digest excerpt. Recoverable content: entries on statistical energy analysis (SEA) modeling of aircraft fuselage sidewall noise under fluid-induced excitation and wind tunnel testing (V.R. Miller and L.L. Faulkner, Flight Dynamics Lab., Air Force), probabilistic fracture analysis, and an SEA model of structural systems.]

  14. A pulsed jumping ring apparatus for demonstration of Lenz's law

    NASA Astrophysics Data System (ADS)

    Tanner, Paul; Loebach, Jeff; Cook, James; Hallen, H. D.

    2001-08-01

    Lenz's law is often demonstrated in classrooms by the use of Elihu Thomson's jumping ring. Ironically, a thorough analysis of the physics of the ac jumping ring reveals that its operation is due mainly to a phase difference, not Lenz's law, and a complete analysis of the ac ring is difficult for the introductory student. We present a design for a pulsed jumping ring that can be fully described by the application of Lenz's law. A further advantage of this system is that it lends itself to a rigorous analysis of the force balances and energy flow. The simple jumping ring apparatus closely resembles Thomson's, but is powered by a capacitor bank. The jump heights were measured for several rings as a function of the energy stored in the capacitors, and a simple model describes the data well. Currents in both the drive coil and the ring are measured, and the drive-coil current is modeled to illuminate some properties of the capacitors. An analysis of the energy flow in the system explains the higher jump heights, up to 2 m, when the ring is cooled.
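
    The energy-flow analysis invites a quick order-of-magnitude check. A minimal sketch follows; every number is hypothetical, chosen only to show the arithmetic, not a value measured by the authors:

    ```python
    # Illustrative energy balance for a pulsed jumping ring (all numbers hypothetical).
    C = 0.1     # capacitor bank capacitance, farads
    V = 100.0   # charging voltage, volts
    m = 0.05    # ring mass, kilograms
    g = 9.81    # gravitational acceleration, m/s^2

    E_stored = 0.5 * C * V ** 2     # energy stored in the capacitors: 500 J
    eta = 0.002                     # assumed overall electromechanical efficiency
    h = eta * E_stored / (m * g)    # jump height from eta * E = m * g * h

    print(f"stored energy: {E_stored:.0f} J, predicted jump height: {h:.2f} m")
    # Cooling the ring lowers its resistance and raises the induced current, which
    # increases eta; this is consistent with the higher jumps of a cooled ring.
    ```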

  15. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems, and applying only one or two may make us feel satisfied but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  16. Fault Detection and Severity Analysis of Servo Valves Using Recurrence Quantification Analysis

    DTIC Science & Technology

    2014-10-02

    Samadani, M.; Kitio Kwuimy, C. A.; Nataraj, C.

    A detailed nonlinear mathematical model of a servo electro-hydraulic system has been used to demonstrate a recurrence quantification analysis procedure for the diagnostics of nonlinear systems. Two faults associated with the servo valve have been considered, including the increased friction between spool and sleeve and the degradation of the…

  17. Total protein analysis as a reliable loading control for quantitative fluorescent Western blotting.

    PubMed

    Eaton, Samantha L; Roche, Sarah L; Llavero Hurtado, Maica; Oldknow, Karla J; Farquharson, Colin; Gillingwater, Thomas H; Wishart, Thomas M

    2013-01-01

    Western blotting has been a key technique for determining the relative expression of proteins within complex biological samples since the first publications in 1979. Recent developments in sensitive fluorescent labels, with truly quantifiable linear ranges and greater limits of detection, have allowed biologists to probe tissue specific pathways and processes with higher resolution than ever before. However, the application of quantitative Western blotting (QWB) to a range of healthy tissues and those from degenerative models has highlighted a problem with significant consequences for quantitative protein analysis: how can researchers conduct comparative expression analyses when many of the commonly used reference proteins (e.g. loading controls) are differentially expressed? Here we demonstrate that common controls, including actin and tubulin, are differentially expressed in tissues from a wide range of animal models of neurodegeneration. We highlight the prevalence of such alterations through examination of published "-omics" data, and demonstrate similar responses in sensitive QWB experiments. For example, QWB analysis of spinal cord from a murine model of Spinal Muscular Atrophy using an Odyssey scanner revealed that β-actin expression was decreased by 19.3±2% compared to healthy littermate controls. Thus, normalising QWB data to β-actin in these circumstances could result in 'skewing' of all data by ∼20%. We further demonstrate that differential expression of commonly used loading controls was not restricted to the nervous system, but was also detectable across multiple tissues, including bone, fat and internal organs. Moreover, expression of these "control" proteins was not consistent between different portions of the same tissue, highlighting the importance of careful and consistent tissue sampling for QWB experiments. Finally, having illustrated the problem of selecting appropriate single protein loading controls, we demonstrate that normalisation using total protein analysis on samples run in parallel with stains such as Coomassie blue provides a more robust approach.
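
    The normalisation pitfall the authors describe is easy to see numerically. A minimal sketch with hypothetical band intensities (not the paper's data) contrasts normalising to a differentially expressed loading control against normalising to total lane protein:

    ```python
    import numpy as np

    # Hypothetical quantification for 4 control and 4 disease samples.
    target = np.array([1.00, 1.05, 0.95, 1.00, 0.80, 0.85, 0.78, 0.82])  # protein of interest
    actin = np.array([1.00, 1.02, 0.98, 1.00, 0.81, 0.79, 0.80, 0.80])   # "loading control", itself down ~20%
    total = np.array([1.00, 1.01, 0.99, 1.00, 1.00, 0.99, 1.01, 1.00])   # total-protein stain

    is_disease = np.array([False] * 4 + [True] * 4)

    def fold_change(norm):
        x = target / norm
        return x[is_disease].mean() / x[~is_disease].mean()

    # Normalising to a differentially expressed control hides the real change;
    # normalising to total protein recovers it.
    print(f"fold change vs actin:         {fold_change(actin):.2f}")   # ~1.0 (artifactual)
    print(f"fold change vs total protein: {fold_change(total):.2f}")   # ~0.81 (real decrease)
    ```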

  18. Structural Biology of Tumor Necrosis Factor Demonstrated for Undergraduate Instruction by Computer Simulation

    ERIC Educational Resources Information Center

    Roy, Urmi

    2016-01-01

    This work presents a three-dimensional (3D) modeling exercise for undergraduate students in chemistry and health sciences disciplines, focusing on a protein group linked to immune system regulation. Specifically, the exercise involves molecular modeling and structural analysis of tumor necrosis factor (TNF) proteins, both wild type and mutant. The…

  19. A Bayesian Approach for Nonlinear Structural Equation Models with Dichotomous Variables Using Logit and Probit Links

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Song, Xin-Yuan; Cai, Jing-Heng

    2010-01-01

    Analysis of ordered binary and unordered binary data has received considerable attention in social and psychological research. This article introduces a Bayesian approach, which has several nice features in practical applications, for analyzing nonlinear structural equation models with dichotomous data. We demonstrate how to use the software…

  20. Water quality modeling based on landscape analysis: Importance of riparian hydrology

    Treesearch

    Thomas Grabs

    2010-01-01

    Several studies in high-latitude catchments have demonstrated the importance of near-stream riparian zones as hydrogeochemical hotspots with a substantial influence on stream chemistry. An adequate representation of the spatial variability of riparian-zone processes and characteristics is the key for modeling spatiotemporal variations of stream-water quality. This...

  1. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML(-e) to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  2. Multi-scale Modeling of the Impact Response of a Strain Rate Sensitive High-Manganese Austenitic Steel

    NASA Astrophysics Data System (ADS)

    Önal, Orkun; Ozmenci, Cemre; Canadinc, Demircan

    2014-09-01

    A multi-scale modeling approach was applied to predict the impact response of a strain rate sensitive high-manganese austenitic steel. The roles of texture, geometry and strain rate sensitivity were successfully taken into account all at once by coupling crystal plasticity and finite element (FE) analysis. Specifically, crystal plasticity was utilized to obtain the multi-axial flow rule at different strain rates based on the experimental deformation response under uniaxial tensile loading. The equivalent stress - equivalent strain response was then incorporated into the FE model for the sake of a more representative hardening rule under impact loading. The current results demonstrate that reliable predictions can be obtained by proper coupling of crystal plasticity and FE analysis even if the experimental flow rule of the material is acquired under uniaxial loading and at moderate strain rates that are significantly slower than those attained during impact loading. Furthermore, the current findings also demonstrate the need for an experiment-based multi-scale modeling approach for the sake of reliable predictions of the impact response.

  3. Use of Forest Inventory and Analysis information in wildlife habitat modeling: a process for linking multiple scales

    Treesearch

    Thomas C. Edwards; Gretchen G. Moisen; Tracey S. Frescino; Joshua L. Lawler

    2002-01-01

    We describe our collective efforts to develop and apply methods for using FIA data to model forest resources and wildlife habitat. Our work demonstrates how flexible regression techniques, such as generalized additive models, can be linked with spatially explicit environmental information for the mapping of forest type and structure. We illustrate how these maps of...

  4. The choice of prior distribution for a covariance matrix in multivariate meta-analysis: a simulation study.

    PubMed

    Hurtado Rúa, Sandra M; Mazumdar, Madhu; Strawderman, Robert L

    2015-12-30

    Bayesian meta-analysis is an increasingly important component of clinical research, with multivariate meta-analysis a promising tool for studies with multiple endpoints. Model assumptions, including the choice of priors, are crucial aspects of multivariate Bayesian meta-analysis (MBMA) models. In a given model, two different prior distributions can lead to different inferences about a particular parameter. A simulation study was performed in which the impact of families of prior distributions for the covariance matrix of a multivariate normal random effects MBMA model was analyzed. Inferences about effect sizes were not particularly sensitive to prior choice, but the related covariance estimates were. A few families of prior distributions with small relative biases, tight mean squared errors, and close to nominal coverage for the effect size estimates were identified. Our results demonstrate the need for sensitivity analysis and suggest some guidelines for choosing prior distributions in this class of problems. The MBMA models proposed here are illustrated in a small meta-analysis example from the periodontal field and a medium meta-analysis from the study of stroke.
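
    The paper's central point, that two priors for the covariance matrix can encode very different beliefs, can be illustrated directly. A minimal sketch (illustrative inverse-Wishart families, not necessarily those examined in the paper) compares the priors each implies for the between-study correlation:

    ```python
    import numpy as np
    from scipy.stats import invwishart

    # Draw 2x2 between-study covariance matrices from two inverse-Wishart priors
    # and inspect the implied prior on the correlation between the two endpoints.
    for df, scale in [(3, np.eye(2)), (10, 0.1 * np.eye(2))]:
        draws = invwishart.rvs(df=df, scale=scale, size=5000, random_state=0)
        corr = draws[:, 0, 1] / np.sqrt(draws[:, 0, 0] * draws[:, 1, 1])
        print(f"df={df}: sd of implied prior correlation = {corr.std():.2f}")
    ```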

  5. Frequencies and Flutter Speed Estimation for Damaged Aircraft Wing Using Scaled Equivalent Plate Analysis

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, Thiagarajan

    2010-01-01

    Equivalent plate analysis is often used to replace the computationally expensive finite element analysis in initial design stages or in conceptual design of aircraft wing structures. The equivalent plate model can also be used to design a wind tunnel model to match the stiffness characteristics of the wing box of a full-scale aircraft wing model while satisfying strength-based requirements. An equivalent plate analysis technique is presented to predict the static and dynamic response of an aircraft wing with or without damage. First, a geometric scale factor and a dynamic pressure scale factor are defined to relate the stiffness, load and deformation of the equivalent plate to the aircraft wing. A procedure using an optimization technique is presented to create scaled equivalent plate models from the full-scale aircraft wing using geometric and dynamic pressure scale factors. The scaled models are constructed by matching the stiffness of the scaled equivalent plate with the scaled aircraft wing stiffness. It is demonstrated that the scaled equivalent plate model can be used to predict the deformation of the aircraft wing accurately. Once the full equivalent plate geometry is obtained, any other scaled equivalent plate geometry can be obtained using the geometric scale factor. Next, an average frequency scale factor is defined as the average ratio of the frequencies of the aircraft wing to the frequencies of the full-scale equivalent plate. The average frequency scale factor combined with the geometric scale factor is used to predict the frequency response of the aircraft wing from the scaled equivalent plate analysis. A procedure is outlined to estimate the frequency response and the flutter speed of an aircraft wing from the equivalent plate analysis using the frequency scale factor and geometric scale factor. The equivalent plate analysis is demonstrated using an aircraft wing without damage and another with damage. Both of the problems show that the scaled equivalent plate analysis can be successfully used to predict the frequencies and flutter speed of a typical aircraft wing.

  6. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism.

    PubMed

    Birkel, Garrett W; Ghosh, Amit; Kumar, Vinay S; Weaver, Daniel; Ando, David; Backman, Tyler W H; Arkin, Adam P; Keasling, Jay D; Martín, Héctor García

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.
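
    Of the two flux methods the library unifies, Flux Balance Analysis is the easier to sketch: it is a linear program over the stoichiometric matrix. The toy example below (a made-up three-reaction network solved with scipy, not jQMM's own API) shows the core computation:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network (not a jQMM model): R1: uptake -> A, R2: A -> B, R3: B -> biomass.
    # Rows of S are metabolites A and B; columns are reactions R1-R3.
    S = np.array([[1.0, -1.0,  0.0],   # metabolite A
                  [0.0,  1.0, -1.0]])  # metabolite B

    # FBA: maximize the biomass flux v3 subject to steady state S v = 0 and bounds.
    c = np.array([0.0, 0.0, -1.0])     # linprog minimizes, so negate the objective
    bounds = [(0.0, 10.0)] * 3         # uptake capped at 10 units; fluxes irreversible

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes v1..v3:", res.x)   # expected: [10. 10. 10.]
    ```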

  7. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE PAGES

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.; ...

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  8. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  9. Modeling and analysis of a meso-hydraulic climbing robot with artificial muscle actuation.

    PubMed

    Chapman, Edward M; Jenkins, Tyler E; Bryant, Matthew

    2017-11-08

    This paper presents a fully coupled electro-hydraulic model of a bio-inspired climbing robot actuated by fluidic artificial muscles (FAMs). This analysis expands upon previous FAM literature by considering not only the force and contraction characteristics of the actuator, but the complete hydraulic and electromechanical circuits as well as the dynamics of the climbing robot. This analysis allows modeling of the time-varying applied pressure, electrical current, and actuator contraction for accurate prediction of the robot motion, energy consumption, and mechanical work output. The developed model is first validated against mechanical and electrical data collected from a proof-of-concept prototype robot. The model is then employed to study the system-level sensitivities of the robot locomotion efficiency and average climbing speed to several design and operating parameters. The results of this analysis demonstrate that considering only the transduction efficiency of the FAM actuators is insufficient to maximize the efficiency of the complete robot, and that a holistic approach can lead to significant improvements in performance.

  10. Hamiltonian Analysis of Subcritical Stochastic Epidemic Dynamics

    PubMed Central

    2017-01-01

    We extend a technique of approximation of the long-term behavior of a supercritical stochastic epidemic model, using the WKB approximation and a Hamiltonian phase space, to the subcritical case. The limiting behavior of the model and approximation are qualitatively different in the subcritical case, requiring a novel analysis of the limiting behavior of the Hamiltonian system away from its deterministic subsystem. This yields a novel, general technique of approximation of the quasistationary distribution of stochastic epidemic and birth-death models and may lead to techniques for analysis of these models beyond the quasistationary distribution. For a classic SIS model, the approximation found for the quasistationary distribution is very similar to published approximations but not identical. For a birth-death process without depletion of susceptibles, the approximation is exact. Dynamics on the phase plane similar to those predicted by the Hamiltonian analysis are demonstrated in cross-sectional data from trachoma treatment trials in Ethiopia, in which declining prevalences are consistent with subcritical epidemic dynamics. PMID:28932256

  11. Maximizing sinter plant operating flexibility through emissions trading and air modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schewe, G.J.; Wagner, J.A.; Heron, T.

    1998-12-31

    This paper provides details on the dispersion modeling analysis performed to demonstrate air quality impacts associated with an emission trading scheme for a sintering operation in Youngstown, Ohio. The emission trade was proposed to allow the sinter plant to expand its current allowable sulfur dioxide (SO2) emissions while being offset with SO2 emissions from boilers at a nearby shutdown steel mill. While the emission trade itself was feasible and the emissions required for the offset were available (the boiler shutdown and the subsequent SO2 emission credits were never claimed, banked, or used elsewhere), the second criterion for determining compliance was a demonstration of minimal air quality impact. The air analysis combined the increased ambient SO2 concentrations of the relaxed sinter plant emissions with the offsetting air quality of the shutdown boilers to yield the net air quality impacts. To test this net air impact, dispersion modeling was performed treating the sinter plant SO2 emissions as positive and the shutdown boiler SO2 emissions as negative. The results of the modeling indicated that the ambient air concentrations due to the proposed emissions increase will be offset by the nearby boiler emissions to levels acceptable under EPA's offset policy Level 2 significant impact concentrations. Therefore, the dispersion modeling demonstrated that the emission trading scheme would not result in significant air quality impacts and maximum operating flexibility was provided to the sintering facility.

  12. Hyperspectral Imaging and SPA-LDA Quantitative Analysis for Detection of Colon Cancer Tissue

    NASA Astrophysics Data System (ADS)

    Yuan, X.; Zhang, D.; Wang, Ch.; Dai, B.; Zhao, M.; Li, B.

    2018-05-01

    Hyperspectral imaging (HSI) has been demonstrated to provide a rapid, precise, and noninvasive method for cancer detection. However, because HSI generates large volumes of data, quantitative analysis is often necessary to distill information useful for distinguishing cancerous from normal tissue. To demonstrate that HSI with our proposed algorithm can make this distinction, we built a Vis-NIR HSI setup and acquired many spectral images of colon tissues, and then used a successive projection algorithm (SPA) to analyze the hyperspectral image data of the tissues. This was used to build an identification model based on linear discriminant analysis (LDA) using the relative reflectance values of the effective wavelengths. Other tissues were used as a prediction set to verify the reliability of the identification model. The results suggest that Vis-NIR hyperspectral images, together with the spectroscopic classification method, provide a new approach for reliable and safe diagnosis of colon cancer and could lead to advances in cancer diagnosis generally.
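
    The classification stage of the pipeline is straightforward to sketch. Below, synthetic spectra stand in for the Vis-NIR data, a simple class-separation ranking stands in for the successive projection algorithm's wavelength selection, and scikit-learn's LDA provides the identification model (all data and parameters are hypothetical):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Synthetic "spectra": 200 samples x 120 wavelengths; a few bands are informative.
    X = rng.normal(size=(200, 120))
    y = np.repeat([0, 1], 100)                # 0 = normal, 1 = cancerous
    X[np.ix_(y == 1, [30, 55, 90])] += 0.8    # class difference at three bands

    # Split before selecting wavelengths, to avoid information leakage.
    train = rng.permutation(200)[:100]
    test = np.setdiff1d(np.arange(200), train)

    # Crude wavelength selection (a stand-in for SPA): rank bands by class-mean gap.
    Xt, yt = X[train], y[train]
    gap = np.abs(Xt[yt == 1].mean(axis=0) - Xt[yt == 0].mean(axis=0))
    effective = np.argsort(gap)[-5:]          # keep the 5 most separating bands

    # Fit the LDA identification model and verify on the held-out prediction set.
    lda = LinearDiscriminantAnalysis().fit(Xt[:, effective], yt)
    print("held-out accuracy:", lda.score(X[np.ix_(test, effective)], y[test]))
    ```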

  13. Parametric studies and orbital analysis for an electric orbit transfer vehicle space flight demonstration

    NASA Astrophysics Data System (ADS)

    Avila, Edward R.

    The Electric Insertion Transfer Experiment (ELITE) is an Air Force Advanced Technology Transition Demonstration which is being executed as a cooperative Research and Development Agreement between the Phillips Lab and TRW. The objective is to build, test, and fly a solar-electric orbit transfer and orbit maneuvering vehicle, as a precursor to an operational electric orbit transfer vehicle (EOTV). This paper surveys some of the analysis tools used to do parametric studies and discusses the study results. The primary analysis tool was the Electric Vehicle Analyzer (EVA) developed by the Phillips Lab and modified by The Aerospace Corporation. It uses a simple orbit averaging approach to model low-thrust transfer performance, and runs in a PC environment. The assumptions used in deriving the EVA math model are presented. This tool and others surveyed were used to size the solar array power required for the spacecraft, and develop a baseline mission profile that meets the requirements of the ELITE mission.

  14. Modeling Pathways of Character Development across the First Three Decades of Life: An Application of Integrative Data Analysis Techniques to Understanding the Development of Hopeful Future Expectations.

    PubMed

    Callina, Kristina Schmid; Johnson, Sara K; Tirrell, Jonathan M; Batanova, Milena; Weiner, Michelle B; Lerner, Richard M

    2017-06-01

    There were two purposes of the present research: first, to add to scholarship about a key character virtue, hopeful future expectations; and second, to demonstrate a recent innovation in longitudinal methodology that may be especially useful in enhancing the understanding of the developmental course of hopeful future expectations and other character virtues that have been the focus of recent scholarship in youth development. Burgeoning interest in character development has led to a proliferation of short-term, longitudinal studies on character. These data sets are sometimes limited in their ability to model character development trajectories due to low power or relatively brief time spans assessed. However, the integrative data analysis approach allows researchers to pool raw data across studies in order to fit one model to an aggregated data set. The purpose of this article is to demonstrate the promises and challenges of this new tool for modeling character development. We used data from four studies evaluating youth character strengths in different settings to fit latent growth curve models of hopeful future expectations from participants aged 7 through 26 years. We describe the analytic strategy for pooling the data and modeling the growth curves. Implications for future research are discussed in regard to the advantages of integrative data analysis. Finally, we discuss issues researchers should consider when applying these techniques in their own work.

  15. Sorafenib metabolism, transport, and enterohepatic recycling: physiologically based modeling and simulation in mice.

    PubMed

    Edginton, Andrea N; Zimmerman, Eric I; Vasilyeva, Aksana; Baker, Sharyn D; Panetta, John C

    2016-05-01

    This study used uncertainty and sensitivity analysis to evaluate a physiologically based pharmacokinetic (PBPK) model of the complex mechanisms of sorafenib and its two main metabolites, sorafenib glucuronide and sorafenib N-oxide in mice. A PBPK model for sorafenib and its two main metabolites was developed to explain disposition in mice. It included relevant influx (Oatp) and efflux (Abcc2 and Abcc3) transporters, hepatic metabolic enzymes (CYP3A4 and UGT1A9), and intestinal β-glucuronidase. Parameterization of drug-specific processes was based on in vitro, ex vivo, and in silico data along with plasma and liver pharmacokinetic data from single and multiple transporter knockout mice. Uncertainty analysis demonstrated that the model structure and parameter values could explain the observed variability in the pharmacokinetic data. Global sensitivity analysis demonstrated the global effects of metabolizing enzymes on sorafenib and metabolite disposition and the local effects of transporters on their respective substrate exposures. In addition, through hypothesis testing, the model supported that the influx transporter Oatp is a weak substrate for sorafenib and a strong substrate for sorafenib glucuronide and that the efflux transporter Abcc2 is not the only transporter affected in the Abcc2 knockout mouse. Translation of the mouse model to humans for the purpose of explaining exceptionally high human pharmacokinetic variability and its relationship with exposure-dependent dose-limiting toxicities will require delineation of the importance of these processes on disposition.

  16. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    PubMed Central

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. Simulation study shows that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111

  17. Utility of a Systematic Approach to Teaching Photographic Nasal Analysis to Otolaryngology Residents.

    PubMed

    Robitschek, Jon; Dresner, Harley; Hilger, Peter

    2017-12-01

    Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. To discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. Graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and ratio of the ala to tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, with the mean (SD) postintervention test score doubling compared with their baseline performance (7.5 [2.7] vs 10.3 [2.5]; P < .001). At 10 weeks after the intervention, the mean comparative improvement in overall graded nasal analysis was 17% (95% CI, 10%-23%; P < .001). Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule ratio. Findings with this novel 10-minute teaching model should be validated at other teaching institutions, and the instruction model should be further enhanced to teach more sophisticated analysis to residents as they proceed through training.

  18. A combined experimental-modelling method for the detection and analysis of pollution in coastal zones

    NASA Astrophysics Data System (ADS)

    Limić, Nedzad; Valković, Vladivoj

    1996-04-01

    Pollution of coastal seas with toxic substances can be efficiently detected by examining toxic materials in sediment samples. These samples contain information on the overall pollution from surrounding sources such as yacht anchorages, nearby industries, sewage systems, etc. In an efficient analysis of pollution one must determine the contribution from each individual source. In this work it is demonstrated that a modelling method can be utilized for solving this latter problem. The modelling method is based on a unique interpretation of concentrations in sediments from all sampling stations. The proposed method is a synthesis consisting of the utilization of PIXE as an efficient method of pollution concentration determination and the code ANCOPOL (N. Limic and R. Benis, The computer code ANCOPOL, SimTel/msdos/geology, 1994 [1]) for the calculation of contributions from the main polluters. The efficiency and limits of the proposed method are demonstrated by discussing trace element concentrations in sediments of Punat Bay on the island of Krk in Croatia.

  19. Numerical analysis of the chimera states in the multilayered network model

    NASA Astrophysics Data System (ADS)

    Goremyko, Mikhail V.; Maksimenko, Vladimir A.; Makarov, Vladimir V.; Ghosh, Dibakar; Bera, Bidesh K.; Dana, Syamal K.; Hramov, Alexander E.

    2017-03-01

    We numerically study the interaction between ensembles of Hindmarsh-Rose (HR) neuron systems arranged in a multilayer network model. We show that fully identical layers, which individually demonstrate different chimera states due to a mismatch in initial conditions, converge to an identical chimera state as the inter-layer coupling increases. Within the multilayer model we also consider the case when one layer demonstrates a chimera state while another layer exhibits coherent or incoherent dynamics. We show that the chimera-coherent and chimera-incoherent interactions can lead both to the excitation of a chimera from an ensemble of fully coherent or incoherent oscillators and to the suppression of an initially stable chimera state.

  20. Cell motion predicts human epidermal stemness

    PubMed Central

    Toki, Fujio; Tate, Sota; Imai, Matome; Matsushita, Natsuki; Shiraishi, Ken; Sayama, Koji; Toki, Hiroshi; Higashiyama, Shigeki

    2015-01-01

    Image-based identification of cultured stem cells and noninvasive evaluation of their proliferative capacity advance cell therapy and stem cell research. Here we demonstrate that human keratinocyte stem cells can be identified in situ by analyzing cell motion during their cultivation. Modeling experiments suggested that the clonal type of cultured human clonogenic keratinocytes can be efficiently determined by analysis of early cell movement. Image analysis experiments demonstrated that keratinocyte stem cells indeed display a unique rotational movement that can be identified as early as the two-cell stage colony. We also demonstrate that α6 integrin is required for both rotational and collective cell motion. Our experiments provide, for the first time, strong evidence that cell motion and epidermal stemness are linked. We conclude that early identification of human keratinocyte stem cells by image analysis of cell movement is a valid parameter for quality control of cultured keratinocytes for transplantation. PMID:25897083

  1. Random vectors and spatial analysis by geostatistics for geotechnical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, D.S.

    1987-08-01

    Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitude of difference vectors. Many random variables in geotechnology are vectors rather than scalars, and their structural analysis requires interpolation of sampled variables to construct and characterize structural models. A better local estimator will result in greater quality of input models, and geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to spatial analysis of random vectors in geoscience as well as various geotechnical fields including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block cavings, petroleum engineering, and hydrologic and hydraulic modeling.

  2. Surrogate models for efficient stability analysis of brake systems

    NASA Astrophysics Data System (ADS)

    Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques

    2015-07-01

    This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). By considering a simplified brake system, the global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is enabled by the so-called Sobol indices, which discriminate design parameters with respect to their influence on the stability. Consequently, only the uncertainty of influential parameters is taken into account in the following step, namely, surrogate modelling based on kriging. The latter is then demonstrated to be an interesting alternative to FEMs since it allowed, at a lower cost, an accurate estimation of the system's proportions of instability corresponding to the influential parameters.
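
    To illustrate the two ingredients the study combines, the sketch below fits a kriging surrogate with scikit-learn and estimates first-order Sobol indices by simple Monte Carlo binning. The scalar response function is hypothetical and merely stands in for the brake-system CEA output; it is not the paper's model:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    # Hypothetical "stability" response of two design parameters.
    def response(x):
        return np.sin(3 * x[:, 0]) + 0.1 * x[:, 1]

    # Kriging (Gaussian process) surrogate trained on a small design of experiments.
    X_train = rng.uniform(0, 1, size=(40, 2))
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_train, response(X_train))

    # Brute-force first-order Sobol indices on the cheap surrogate:
    # S_i = Var(E[Y | X_i]) / Var(Y), estimated by conditioning on bins of X_i.
    X = rng.uniform(0, 1, size=(20000, 2))
    y = gp.predict(X)
    var_y = y.var()
    for i in range(2):
        bins = np.digitize(X[:, i], np.linspace(0, 1, 21))
        cond_means = np.array([y[bins == b].mean() for b in np.unique(bins)])
        weights = np.array([(bins == b).mean() for b in np.unique(bins)])
        S_i = np.sum(weights * (cond_means - y.mean()) ** 2) / var_y
        print(f"first-order Sobol index of parameter {i + 1}: {S_i:.2f}")
    ```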

  3. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    NASA Astrophysics Data System (ADS)

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    2017-06-01

    We apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models—the square- and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-1 Ising (BSI) model, and the two-dimensional XY model—and we examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow the exploration of different phases and symmetry-breaking, but they can distinguish phase-transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the "charge" correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the "autoencoder method," and we demonstrate that it too can be trained to capture phase transitions and critical points.
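
    The core PCA step is compact. The sketch below uses synthetic Ising-like snapshots (not the paper's Monte Carlo data) to show how the first principal component of raw spin configurations tracks the magnetization and separates ordered from disordered samples:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_spins = 400

    # Synthetic stand-ins for Monte Carlo snapshots: disordered samples are random
    # +/-1 spins; ordered samples are mostly aligned (positive or negative).
    disordered = rng.choice([-1, 1], size=(200, n_spins))
    sign = rng.choice([-1, 1], size=(200, 1))
    ordered = np.where(rng.random((200, n_spins)) < 0.9, sign, -sign)

    X = np.vstack([disordered, ordered]).astype(float)
    pca = PCA(n_components=2).fit(X)
    p1 = pca.transform(X)[:, 0]

    # The leading weight vector is nearly uniform, so p1 is proportional to the
    # magnetization of each snapshot: the "clear physical interpretation" above.
    m = X.mean(axis=1)
    print("corr(p1, magnetization):", abs(np.corrcoef(p1, m)[0, 1]).round(3))
    print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
    ```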

  4. Building toy models of proteins using coevolutionary information

    NASA Astrophysics Data System (ADS)

    Cheng, Ryan; Raghunathan, Mohit; Onuchic, Jose

    2015-03-01

    Recent developments in global statistical methodologies have advanced the analysis of large collections of protein sequences for coevolutionary information. Coevolution between amino acids in a protein arises from compensatory mutations that are needed to maintain the stability or function of a protein over the course of evolution. This gives rise to quantifiable correlations between amino acid positions within the multiple sequence alignment of a protein family. Here, we use Direct Coupling Analysis (DCA) to infer a Potts model Hamiltonian governing the correlated mutations in a protein family to obtain the sequence-dependent interaction energies of a toy protein model. We demonstrate that this methodology predicts residue-residue interaction energies that are consistent with experimental mutational changes in protein stabilities as well as other computational methodologies. Furthermore, we demonstrate with several examples that DCA could be used to construct a structure-based model that quantitatively agrees with experimental data on folding mechanisms. This work serves as a potential framework for generating models of proteins that are enriched by evolutionary data that can potentially be used to engineer key functional motions and interactions in protein systems. This research has been supported by the NSF INSPIRE award MCB-1241332 and by the CTBP sponsored by the NSF (Grant PHY-1427654).

  5. Agent Based Modeling: Fine-Scale Spatio-Temporal Analysis of Pertussis

    NASA Astrophysics Data System (ADS)

    Mills, D. A.

    2017-10-01

    In epidemiology, spatial and temporal variables are used to compute vaccination efficacy and effectiveness. The chosen resolution and scale of a spatial or spatio-temporal analysis will affect the results. When calculating vaccination efficacy, for example, a simple environment that offers various ideal outcomes is often modeled using coarse-scale data aggregated on an annual basis. In contrast to this inadequate aggregated method, this research uses agent-based modeling of fine-scale neighborhood data, centered on the interactions of infants in daycare and their families, to demonstrate an accurate reflection of vaccination capabilities. Recent studies suggest that, although it prevents major symptoms, the acellular pertussis vaccine does not prevent the colonization and transmission of Bordetella pertussis bacteria. After vaccination, a treated individual becomes a potential asymptomatic carrier of the pertussis bacteria, rather than an immune individual. Agent-based modeling enables the measurable depiction of asymptomatic carriers that are otherwise unaccounted for when calculating vaccination efficacy and effectiveness. Using empirical data from a Florida pertussis outbreak case study, the results of this model demonstrate that asymptomatic carriers bias the calculated vaccination efficacy and reveal a need to reconsider current methods that are widely used for calculating vaccination efficacy and effectiveness.
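
    A minimal agent-based sketch of the mechanism described, in which vaccinated agents escape symptoms but still carry and transmit, might look like the following (toy population and rates, not the study's calibrated model):

    ```python
    import random

    random.seed(42)

    # Toy agent-based simulation: vaccinated agents avoid symptoms but can still
    # carry and transmit, so they are invisible to symptom-based surveillance.
    N, DAYS, CONTACTS, P_TRANSMIT, P_RECOVER = 500, 120, 4, 0.04, 0.1
    agents = [{"vaccinated": random.random() < 0.8, "infected": False} for _ in range(N)]
    for a in random.sample(agents, 5):
        a["infected"] = True

    for day in range(DAYS):
        for a in [x for x in agents if x["infected"]]:
            for contact in random.sample(agents, CONTACTS):
                if not contact["infected"] and random.random() < P_TRANSMIT:
                    contact["infected"] = True  # vaccination does not block colonization here
            if random.random() < P_RECOVER:
                a["infected"] = False

    carriers = sum(a["infected"] for a in agents)
    symptomatic = sum(a["infected"] and not a["vaccinated"] for a in agents)
    print(f"true carriers: {carriers}, visible (symptomatic) cases: {symptomatic}")
    # Counting only symptomatic cases overstates vaccine efficacy: the bias the study measures.
    ```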

  6. Lessons learned: Optimization of a murine small bowel resection model

    PubMed Central

    Taylor, Janice A.; Martin, Colin A.; Nair, Rajalakshmi; Guo, Jun; Erwin, Christopher R.; Warner, Brad W.

    2008-01-01

    Background/Purpose Central to the use of murine models of disease is the ability to derive reproducible data. The purpose of this study was to determine factors contributing to variability in our murine model of small bowel resection (SBR). Methods Male C57Bl/6 mice were randomized to sham or 50% SBR. The effects of housing type (pathogen-free versus standard housing), nutrition (reconstituted powder versus tube feeding formulation), and correlates of intestinal morphology with gene expression changes were investigated. Multiple linear regression modeling or one-way ANOVA was used for data analysis. Results Pathogen-free mice had significantly shorter ileal villi at baseline and demonstrated greater villus growth after SBR compared to mice housed in standard rooms. Food type did not affect adaptation. Gene expression changes were more consistent and significant in isolated crypt cells that demonstrated adaptive growth when compared with crypts that did not deepen after SBR. Conclusion Maintenance of mice in pathogen-free conditions and restricting gene expression analysis to individual animals exhibiting morphologic adaptation enhances sensitivity and specificity of data derived from this model. These refinements will minimize experimental variability and lead to improved understanding of the complex process of intestinal adaptation. PMID:18558176

  7. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, the paper demonstrates that using HIS-DF and HL7 information models, the semantic quality of an HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture has been measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using the HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development suggests an increased quality of the final HIS, which implies an indirect positive impact on patient care.

  8. Model-Driven Safety Analysis of Closed-Loop Medical Systems

    PubMed Central

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2013-01-01

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure. PMID:24177176

  9. Model-Driven Safety Analysis of Closed-Loop Medical Systems.

    PubMed

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2012-10-26

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure.

  10. Mass Spec Studio for Integrative Structural Biology

    PubMed Central

    Rey, Martial; Sarpe, Vladimir; Burns, Kyle; Buse, Joshua; Baker, Charles A.H.; van Dijk, Marc; Wordeman, Linda; Bonvin, Alexandre M.J.J.; Schriemer, David C.

    2015-01-01

    The integration of biophysical data from multiple sources is critical for developing accurate structural models of large multiprotein systems and their regulators. Mass spectrometry (MS) can be used to measure the insertion location for a wide range of topographically sensitive chemical probes, and such insertion data provide a rich but disparate set of modeling restraints. We have developed a software platform that integrates the analysis of label-based MS data with protein modeling activities (Mass Spec Studio). Analysis packages can mine any labeling data from any mass spectrometer in a proteomics-grade manner, and link labeling methods with data-directed protein interaction modeling using HADDOCK. Support is provided for hydrogen/deuterium exchange (HX) and covalent labeling chemistries, including novel acquisition strategies such as targeted HX-tandem MS (MS2) and data-independent HX-MS2. The latter permits the modeling of highly complex systems, which we demonstrate by the analysis of microtubule interactions. PMID:25242457

  11. Validation of hierarchical cluster analysis for identification of bacterial species using 42 bacterial isolates

    NASA Astrophysics Data System (ADS)

    Ghebremedhin, Meron; Yesupriya, Shubha; Luka, Janos; Crane, Nicole J.

    2015-03-01

    Recent studies have demonstrated the potential advantages of the use of Raman spectroscopy in the biomedical field due to its rapidity and noninvasive nature. In this study, Raman spectroscopy is applied as a method for differentiating bacterial isolates by Gram status and by genus and species. We created models for identifying 28 bacterial isolates using spectra collected with a 785 nm laser excitation Raman spectroscopic system. In order to investigate the groupings of these samples, partial least squares discriminant analysis (PLSDA) and hierarchical cluster analysis (HCA) were implemented. In addition, cluster analyses of the isolates were performed using various data types consisting of biochemical tests, gene sequence alignment, high resolution melt (HRM) analysis and antimicrobial susceptibility tests of minimum inhibitory concentration (MIC) and degree of antimicrobial resistance (SIR). In order to evaluate the ability of these models to correctly classify bacterial isolates using solely Raman spectroscopic data, a set of 14 validation samples was tested using the PLSDA models and subsequently the HCA models. External cluster evaluation criteria of purity and Rand index were calculated at different taxonomic levels to compare the performance of clustering using Raman spectra as well as the other datasets. Results showed that Raman spectra performed comparably to, and in some cases better than, the other data types, with Rand index and purity values up to 0.933 and 0.947, respectively. This study clearly demonstrates that the discrimination of bacterial species using Raman spectroscopic data and hierarchical cluster analysis is possible and has the potential to be a powerful point-of-care tool in clinical settings.
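
    For readers unfamiliar with the workflow, the fragment below sketches agglomerative HCA on spectra together with the purity score used for external cluster evaluation. The "spectra" are synthetic stand-ins, and scipy's linkage/fcluster are used in place of whatever chemometrics package the authors employed.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Synthetic stand-ins for baseline-corrected Raman spectra of two "species".
    rng = np.random.default_rng(0)
    n_per_class, n_channels = 10, 200
    spectra = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_channels)),
                         rng.normal(1.5, 1.0, (n_per_class, n_channels))])
    labels = np.repeat([0, 1], n_per_class)

    Z = linkage(spectra, method="ward")             # agglomerative clustering
    clusters = fcluster(Z, t=2, criterion="maxclust")

    def purity(clusters, labels):
        """Fraction of samples assigned to their cluster's majority class."""
        total = 0
        for c in np.unique(clusters):
            members = labels[clusters == c]
            total += np.bincount(members).max()
        return total / len(labels)

    print(f"purity = {purity(clusters, labels):.3f}")
    ```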

  12. Reliability modelling and analysis of thermal MEMS

    NASA Astrophysics Data System (ADS)

    Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on a U-shaped electrothermal microactuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MatLab and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of operating cycles and specific operational conditions.

  13. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test-data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, and electrical. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be superior to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  14. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical-load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. Building on these techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
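
    The structural-model approach reduces, in its simplest linear form, to inverting a strain-load sensitivity matrix. The sketch below does this with ridge-regularized least squares; the sensitivity matrix is random here, whereas in practice its columns would come from the finite element model of the panel.

    ```python
    import numpy as np

    # Linear inverse problem behind distributed load identification:
    # strains = S @ loads, where column k of S holds the strain response
    # to a unit load at location k (synthetic stand-in below).
    rng = np.random.default_rng(0)
    n_sensors, n_loads = 40, 10
    S = rng.normal(size=(n_sensors, n_loads))

    true_loads = np.linspace(1.0, 2.0, n_loads)     # smooth "aerodynamic" profile
    strains = S @ true_loads + rng.normal(scale=0.01, size=n_sensors)

    # Ridge-regularized least squares keeps the estimate stable when S is
    # ill-conditioned, which is common for distributed-load inversion.
    lam = 1e-3
    loads_hat = np.linalg.solve(S.T @ S + lam * np.eye(n_loads), S.T @ strains)
    print(np.round(loads_hat, 2))
    ```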

  15. An extended car-following model to describe connected traffic dynamics under cyberattacks

    NASA Astrophysics Data System (ADS)

    Wang, Pengcheng; Yu, Guizhen; Wu, Xinkai; Qin, Hongmao; Wang, Yunpeng

    2018-04-01

    In this paper, the impacts of potential cyberattacks on vehicles are modeled through an extended car-following model. To better understand the mechanism of traffic disturbance under cyberattacks, linear and nonlinear stability analyses are conducted. In particular, the linear stability analysis is performed to obtain the neutral stability conditions for various parameters, and the nonlinear stability analysis is carried out using the reductive perturbation method to derive the soliton solution of the modified Korteweg-de Vries (mKdV) equation near the critical point, which is used to draw the coexisting stability lines. By combining the linear and nonlinear stability analyses, the traffic flow can be divided into three regimes, i.e., stable, metastable and unstable, which are useful for describing shockwave dynamics and driving behaviors under cyberattacks. The theoretical results show that the proposed car-following model successfully describes the car-following behavior of connected vehicles under cyberattacks. Finally, numerical simulation using realistic values confirms the validity of the theoretical analysis. The results further demonstrate that our model can be used to help avoid collisions and relieve traffic congestion under cybersecurity threats.
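
    To make the modeling idea concrete, here is a small numerical sketch of a full-velocity-difference-style ring simulation in which one vehicle's perceived headway is spoofed after some time. The optimal-velocity function and all constants are illustrative assumptions, not the paper's calibrated extended model.

    ```python
    import numpy as np

    # FVD-style car-following on a ring road with a crude "cyberattack"
    # that corrupts vehicle 0's perceived headway after t > 500 steps.
    N, T, DT = 20, 2000, 0.1
    ROAD = 400.0
    kappa, lam = 0.41, 0.5               # sensitivity gains (illustrative)

    def V(dh):                           # optimal-velocity function (illustrative)
        return 6.75 + 7.91 * np.tanh(0.13 * (dh - 5.0) - 1.57)

    x = np.cumsum(np.full(N, ROAD / N))  # evenly spaced start
    v = np.full(N, V(ROAD / N))
    for t in range(T):
        headway = (np.roll(x, -1) - x) % ROAD     # gap to the leader, wrapped
        if t > 500:
            headway[0] *= 0.7                     # spoofed (shrunken) gap
        dv = np.roll(v, -1) - v
        a = kappa * (V(headway) - v) + lam * dv   # FVD acceleration law
        v = np.clip(v + a * DT, 0.0, None)
        x = (x + v * DT) % ROAD

    print("speed spread after attack:", np.round(v.max() - v.min(), 3))
    ```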

  16. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

    The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can ever completely eliminate measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine Gas Path Analysis (GPA). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scattering caused by sensor uncertainties is made. The diagnostic tool used for the statistical analysis of the measurement noise impact is a model-based method utilizing non-linear GPA.

  17. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
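
    The core linear-optics step described above reduces to a least-squares fit of actuator influence functions to a measured wavefront map. The sketch below illustrates that step with a random influence matrix standing in for the output of an integrated mechanical/optical analysis such as SigFit.

    ```python
    import numpy as np

    # Fit actuator influence functions (columns of A) to a wavefront-error
    # map by least squares; the residual is the corrected system WFE.
    rng = np.random.default_rng(0)
    n_points, n_actuators = 500, 12
    A = rng.normal(size=(n_points, n_actuators))   # stand-in influence matrix
    wfe = rng.normal(size=n_points)                # measured wavefront error

    c, *_ = np.linalg.lstsq(A, wfe, rcond=None)    # actuator commands
    residual = wfe - A @ c
    print(f"RMS before: {wfe.std():.3f}, after: {residual.std():.3f}")
    ```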

  18. Collective Thomson scattering data analysis for Wendelstein 7-X

    NASA Astrophysics Data System (ADS)

    Abramovic, I.; Pavone, A.; Svensson, J.; Moseev, D.; Salewski, M.; Laqua, H. P.; Lopes Cardozo, N. J.; Wolf, R. C.

    2017-08-01

    A collective Thomson scattering (CTS) diagnostic is being installed on the Wendelstein 7-X stellarator to measure the bulk ion temperature in the upcoming experimental campaign. In order to prepare for the data analysis, a forward model of the diagnostic (eCTS) has been developed and integrated into the Bayesian data analysis framework Minerva. Synthetic spectra have been calculated with the forward model and inverted using Minerva in order to demonstrate the feasibility of measuring the ion temperature in the presence of nuisance parameters that also influence CTS spectra. In this paper we report the results of this analysis and discuss the main sources of uncertainty in the CTS data analysis.

  19. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, of gaps that exist in the literature, and of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
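
    A Monte Carlo baseline of the kind mentioned in the final sentences can be sketched in a few lines: sample the drivers of a satisfaction index from assumed distributions and summarize the resulting score distribution. The weights, scales, and distributions below are invented for illustration and do not reproduce the ACSI's published structure.

    ```python
    import numpy as np

    # Monte Carlo baseline for a satisfaction-index score (illustrative only).
    rng = np.random.default_rng(42)
    N = 100_000

    quality     = rng.triangular(6.5, 8.0, 9.5, N)   # 0-10 scales, assumed
    expectation = rng.triangular(5.0, 7.0, 9.0, N)
    value       = rng.triangular(5.5, 7.5, 9.0, N)

    # Weighted index rescaled to 0-100; weights are invented.
    score = 10 * (0.5 * quality + 0.2 * expectation + 0.3 * value)

    print(f"baseline mean: {score.mean():.1f}")
    print(f"90% interval: {np.percentile(score, [5, 95]).round(1)}")
    ```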

  20. Regression Model for Light Weight and Crashworthiness Enhancement Design of Automotive Parts in Frontal CAR Crash

    NASA Astrophysics Data System (ADS)

    Bae, Gihyun; Huh, Hoon; Park, Sungho

    This paper deals with a regression model for lightweight, crashworthiness-enhanced design of automotive parts in a frontal car crash. The ULSAB-AVC model is employed for the crash analysis, and effective parts are selected based on the amount of energy absorbed during the crash. Finite element analyses are carried out for designated design cases in order to investigate the crashworthiness and weight according to the material and thickness of the main energy-absorbing parts. Based on the simulation results, a regression analysis is performed to construct a regression model for lightweight, crashworthiness-enhanced design of automotive parts. An example of weight reduction of the main energy-absorbing parts demonstrates the validity of the constructed regression model.

  1. A computer program for predicting nonlinear uniaxial material responses using viscoplastic models

    NASA Technical Reports Server (NTRS)

    Chang, T. Y.; Thompson, R. L.

    1984-01-01

    A computer program was developed for predicting nonlinear uniaxial material responses using viscoplastic constitutive models. Four specific models are included, i.e., those due to Miller, Walker, Krieg-Swearengen-Rhode, and Robinson. Any other unified model can easily be implemented into the program in the form of subroutines. Analysis features include stress-strain cycling, creep response, stress relaxation, thermomechanical fatigue loops, or any combination of these responses. An outline is given of the theoretical background of uniaxial constitutive models, the analysis procedure, and the numerical integration methods for solving the nonlinear constitutive equations. In addition, a discussion of the computer program implementation is given. Finally, seven numerical examples are included to demonstrate the versatility of the computer program developed.
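
    In spirit, such a program integrates a stiff nonlinear constitutive ODE for each load history. The sketch below does this with explicit Euler for a generic power-law (Norton-type) flow rule under strain-controlled cycling; it is deliberately not one of the four named models, and the material constants are illustrative.

    ```python
    import numpy as np

    # Generic uniaxial viscoplastic integration (NOT Miller/Walker/K-S-R/Robinson).
    E = 150e3          # Young's modulus, MPa (illustrative)
    K, n = 800.0, 5.0  # viscous drag constant (MPa) and rate exponent
    DT = 1e-3

    def inelastic_rate(sigma):
        """Power-law flow rule: d(eps_in)/dt = (|sigma|/K)^n * sign(sigma)."""
        return np.sign(sigma) * (abs(sigma) / K) ** n

    t = np.arange(0.0, 8.0, DT)
    eps_total = 0.01 * np.sin(2 * np.pi * 0.25 * t)   # strain-controlled cycling
    sigma, eps_in = 0.0, 0.0
    history = []
    for e in eps_total:
        eps_in += inelastic_rate(sigma) * DT          # update inelastic strain
        sigma = E * (e - eps_in)                      # elastic stress-strain law
        history.append(sigma)

    print(f"peak stress over cycles: {max(history):.1f} MPa")
    ```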

  2. How can we reduce phosphorus export from lowland polders? Implications from a sensitivity analysis of a coupled model.

    PubMed

    Huang, Jiacong; Gao, Junfeng; Yan, Renhua

    2016-08-15

    Phosphorus (P) export from lowland polders has caused severe water pollution. Numerical models are an important resource that helps water managers control P export. This study coupled three models, i.e., the Phosphorus Dynamic model for Polders (PDP), the Integrated Catchments model of Phosphorus dynamics (INCA-P) and the Universal Soil Loss Equation (USLE), to describe the P dynamics in polders. Based on the coupled models and a dataset collected from Polder Jian in China, sensitivity analyses were carried out to analyze the cause-effect relationships between environmental factors and P export from Polder Jian. The sensitivity analysis results showed that P export from Polder Jian was strongly affected by air temperature, precipitation and fertilization. Proper fertilization management should be a strategic priority for reducing P export from Polder Jian. This study demonstrated the success of model coupling and its application in investigating potential strategies to support pollution control in polder systems. Copyright © 2016. Published by Elsevier B.V.
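
    The cause-effect screening described above is, at its simplest, a one-at-a-time (OAT) perturbation of model drivers. The sketch below applies that pattern to a made-up P-export response function standing in for the coupled PDP/INCA-P/USLE model; all values are illustrative.

    ```python
    # One-at-a-time (OAT) sensitivity screening on a placeholder response.
    def p_export(temp_C, precip_mm, fert_kg_ha):
        # Invented surrogate: increasing in all drivers, nonlinear in precipitation.
        return 0.02 * temp_C + 1e-4 * precip_mm ** 1.3 + 0.05 * fert_kg_ha

    base = {"temp_C": 16.0, "precip_mm": 1100.0, "fert_kg_ha": 40.0}
    base_out = p_export(**base)

    for name, value in base.items():
        perturbed = dict(base, **{name: value * 1.10})   # +10% on one factor
        delta = (p_export(**perturbed) - base_out) / base_out
        print(f"{name:>10}: {100 * delta:+.1f}% change in P export")
    ```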

  3. Creating system engineering products with executable models in a model-based engineering environment

    NASA Astrophysics Data System (ADS)

    Karban, Robert; Dekens, Frank G.; Herzig, Sebastian; Elaasar, Maged; Jankevičius, Nerijus

    2016-08-01

    Applying systems engineering across the life-cycle results in a number of products built from interdependent sources of information using different kinds of system level analysis. This paper focuses on leveraging the Executable System Engineering Method (ESEM) [1] [2], which automates requirements verification (e.g. power and mass budget margins and duration analysis of operational modes) using executable SysML [3] models. The particular value proposition is to integrate requirements, and executable behavior and performance models for certain types of system level analysis. The models are created with modeling patterns that involve structural, behavioral and parametric diagrams, and are managed by an open source Model Based Engineering Environment (named OpenMBEE [4]). This paper demonstrates how the ESEM is applied in conjunction with OpenMBEE to create key engineering products (e.g. operational concept document) for the Alignment and Phasing System (APS) within the Thirty Meter Telescope (TMT) project [5], which is under development by the TMT International Observatory (TIO) [5].

  4. Predicting Air Permeability of Handloom Fabrics: A Comparative Analysis of Regression and Artificial Neural Network Models

    NASA Astrophysics Data System (ADS)

    Mitra, Ashis; Majumdar, Prabal Kumar; Bannerjee, Debamalya

    2013-03-01

    This paper presents a comparative analysis of two modeling methodologies for the prediction of air permeability of plain woven handloom cotton fabrics. Four basic fabric constructional parameters, namely ends per inch, picks per inch, warp count and weft count, have been used as inputs for artificial neural network (ANN) and regression models. Of the four regression models tried, the interaction model showed very good prediction performance, with a mean absolute error of only 2.017 %. However, the ANN models demonstrated superiority over the regression models in terms of both correlation coefficient and mean absolute error. The ANN model with 10 nodes in the single hidden layer showed very good correlation coefficients of 0.982 and 0.929 and mean absolute errors of only 0.923 and 2.043 % for training and testing data, respectively.

  5. Investigation and Modeling of Capacitive Human Body Communication.

    PubMed

    Zhu, Xiao-Qi; Guo, Yong-Xin; Wu, Wen

    2017-04-01

    This paper presents a systematic investigation of capacitive human body communication (HBC). The measurement of HBC channels is performed using a novel battery-powered system to eliminate the effects of baluns, cables and instruments. To verify the measured results, a numerical model incorporating the entire HBC system is established. In addition, it is demonstrated that both the impedance and path gain bandwidths of HBC channels are affected by the electrode configuration. Based on the analysis of the simulated electric field distribution, an equivalent circuit model is proposed and the circuit parameters are extracted using the finite element method. The transmission capability along the human body is also studied. The simulated results using the numerical and circuit models coincide very well with the measurements, which demonstrates that the proposed circuit model can effectively interpret the operation mechanism of capacitive HBC.

  6. Variable selection for marginal longitudinal generalized linear models.

    PubMed

    Cantoni, Eva; Flemming, Joanna Mills; Ronchetti, Elvezio

    2005-06-01

    Variable selection is an essential part of any statistical analysis and yet has been somewhat neglected in the context of longitudinal data analysis. In this article, we propose a generalized version of Mallows's C(p) (GC(p)) suitable for use with both parametric and nonparametric models. GC(p) provides an estimate of a measure of a model's adequacy for prediction. We examine its performance with popular marginal longitudinal models (fitted using GEE) and contrast results with what is typically done in practice: variable selection based on Wald-type or score-type tests. An application to real data further demonstrates the merits of our approach while at the same time emphasizing some important robust features inherent to GC(p).
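
    As background for GC(p), the classical Mallows's C(p) for linear submodels can be computed directly, as sketched below on synthetic data; the article's generalization to GEE-fitted longitudinal models is not reproduced here.

    ```python
    import numpy as np
    from itertools import combinations

    # Classical Mallows's C_p = RSS_subset / sigma2_full - n + 2p over all
    # predictor subsets; good submodels have C_p close to their size p.
    rng = np.random.default_rng(0)
    n, p = 100, 4
    X = rng.normal(size=(n, p))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)  # only x0, x1 matter

    def rss(Xs):
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        r = y - Xs @ beta
        return float(r @ r)

    sigma2 = rss(X) / (n - p)          # error variance from the full model

    for k in range(1, p + 1):
        for subset in combinations(range(p), k):
            cp = rss(X[:, list(subset)]) / sigma2 - n + 2 * len(subset)
            print(subset, round(cp, 1))
    ```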

  7. Cosmological reconstruction and Om diagnostic analysis of Einstein-Aether theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasqua, Antonio; Chattopadhyay, Surajit; Momeni, Davood

    In this paper, we analyze the cosmological models in Einstein-Aether gravity, which is a modified theory of gravity in which a time-like vector field breaks the Lorentz symmetry. We use this formalism to analyze different cosmological models with different behaviors of the scale factor. In this analysis, we use a certain functional dependence of the Dark Energy (DE) on the Hubble parameter H. It is demonstrated that the Aether vector field has a non-trivial effect on these cosmological models. We also perform the Om diagnostic in Einstein-Aether gravity and fit the parameters of the cosmological models using recent observational data.

  8. Structural similitude and scaling laws for laminated beam-plates

    NASA Technical Reports Server (NTRS)

    Simitses, George J.; Rezaeepazhand, Jalil

    1992-01-01

    The establishment of similarity conditions between two structural systems is discussed. Similarity conditions provide the relationship between a scale model and its prototype and can be used to predict the behavior of the prototype by extrapolating the experimental data of the corresponding small-scale model. Since satisfying all the similarity conditions simultaneously is difficult or even impossible, distorted models with partial similarity (with at least one similarity condition relaxed) are more practical. Establishing similarity conditions based on both dimensional analysis and direct use of governing equations is discussed, and the possibility of designing distorted models is investigated. The method is demonstrated through analysis of the cylindrical bending of orthotropic laminated beam-plates subjected to transverse line loads.

  9. Self-reconfigurable ship fluid-network modeling for simulation-based design

    NASA Astrophysics Data System (ADS)

    Moon, Kyungjin

    Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy, which enable significant improvement of systems' robustness, efficiency, and performance with considerably reduced manning and maintenance costs; the U.S. Navy's DD(X), the next-generation destroyer program, is considered an extreme example of this trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through investigation of the Navy's approach to designing a more survivable ship system, it was found that the current naval simulation-based analysis environment is limited by capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain-specific models, especially fluid network models. As enablers for filling these gaps, two essential elements were identified in the formulation of the modeling method. The first is a graph-based topological modeling method, employed for rapid model reconstruction and damage modeling; the second is a recurrent neural network-based, component-level surrogate modeling method, used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S, creating an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration and evaluation of the developed method, a simulation model of a notional ship fluid system was created and a damage analysis was performed. Next, models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method were discussed based on the results of the demonstration.

  10. A cellular automata model of Ebola virus dynamics

    NASA Astrophysics Data System (ADS)

    Burkhead, Emily; Hawkins, Jane

    2015-11-01

    We construct a stochastic cellular automaton (SCA) model for the spread of the Ebola virus (EBOV). We make substantial modifications to an existing SCA model used for HIV, introduced by others and studied by the authors. We give a rigorous analysis of the similarities between models due to the spread of virus and the typical immune response to it, and the differences which reflect the drastically different timing of the course of EBOV. We demonstrate output from the model and compare it with clinical data.
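
    The sketch below shows the mechanics of a stochastic cellular automaton of this general kind: a lattice of cells with probabilistic infection of neighbors and probabilistic death. The states, neighborhood, and rates are illustrative and are not the published EBOV rule set.

    ```python
    import numpy as np

    # Toy SCA: 0 = healthy, 1 = infected, 2 = dead; von Neumann neighborhood.
    rng = np.random.default_rng(2)
    N, STEPS, P_INF, P_DIE = 100, 60, 0.25, 0.3
    grid = np.zeros((N, N), dtype=np.int8)
    grid[N // 2, N // 2] = 1                       # single infected cell

    for _ in range(STEPS):
        infected = grid == 1
        # Count infected neighbors of every site (periodic boundaries via roll).
        nbrs = (np.roll(infected, 1, 0).astype(int) + np.roll(infected, -1, 0)
                + np.roll(infected, 1, 1) + np.roll(infected, -1, 1))
        # Each infected neighbor independently transmits with probability P_INF.
        catch = (grid == 0) & (rng.random((N, N)) < 1 - (1 - P_INF) ** nbrs)
        die = infected & (rng.random((N, N)) < P_DIE)
        grid[catch] = 1
        grid[die] = 2

    print("infected:", int((grid == 1).sum()), "dead:", int((grid == 2).sum()))
    ```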

  11. Full velocity difference car-following model considering desired inter-vehicle distance

    NASA Astrophysics Data System (ADS)

    Xin, Tong; Yi, Liu; Rongjun, Cheng; Hongxia, Ge

    Based on the full velocity difference car-following model, an improved car-following model is put forward by considering the driver's desired inter-vehicle distance. The stability conditions are obtained by applying the control method. The results of the theoretical analysis demonstrate the advantages of our model. Numerical simulations show that traffic congestion can be alleviated when the desired inter-vehicle distance is considered in the full velocity difference car-following model.

  12. NASTRAN analysis of the 1/8-scale space shuttle dynamic model

    NASA Technical Reports Server (NTRS)

    Bernstein, M.; Mason, P. W.; Zalesak, J.; Gregory, D. J.; Levy, A.

    1973-01-01

    The space shuttle configuration has more complex structural dynamic characteristics than previous launch vehicles, primarily because of the high modal density at low frequencies and the high degree of coupling between the lateral and longitudinal motions. An accurate analytical representation of these characteristics is a primary means for treating structural dynamics problems during the design phase of the shuttle program. The 1/8-scale model program was developed to explore the adequacy of available analytical modeling technology and to provide the means for investigating problems which are more readily treated experimentally. The basic objectives of the 1/8-scale model program are: (1) to provide early verification of analytical modeling procedures on a shuttle-like structure, (2) to demonstrate important vehicle dynamic characteristics of a typical shuttle design, (3) to disclose any previously unanticipated structural dynamic characteristics, and (4) to provide for development and demonstration of cost-effective prototype testing procedures.

  13. Semiotic-conceptual analysis: a proposal

    NASA Astrophysics Data System (ADS)

    Priss, Uta

    2017-07-01

    This paper provides the basic definitions of Semiotic-conceptual analysis (SCA), which is a mathematical modelling of signs as elements of a triadic relation. FCA concept lattices are constructed for each of the three sign components. It is demonstrated how core linguistic and semiotic notions (such as synonymy and icon) can be represented with SCA. While the usefulness of SCA has already been demonstrated in a number of applications and several propositions are proven in this paper, there are still many open questions as to what to do next with SCA. Therefore, this paper is meant as a proposal and encouragement for further development.

  14. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

    Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. The uncertainties and methods described in the paper have general relevance for many GIS applications.

  15. Structural Finite Element Model Updating Using Vibration Tests and Modal Analysis for NPL footbridge - SHM demonstrator

    NASA Astrophysics Data System (ADS)

    Barton, E.; Middleton, C.; Koo, K.; Crocker, L.; Brownjohn, J.

    2011-07-01

    This paper presents the results from collaboration between the National Physical Laboratory (NPL) and the University of Sheffield on an ongoing research project at NPL. A 50-year-old reinforced concrete footbridge has been converted to a full-scale structural health monitoring (SHM) demonstrator. The structure is monitored using a variety of techniques; however, interrelating results and converting data to knowledge are not possible without a reliable numerical model. During the first stage of the project, the work concentrated on static loading, and an FE model of the undamaged bridge was created and updated under specified static loading and temperature conditions. This model was found to accurately represent the response under static loading, and it was used to identify locations for sensor installation. The next stage involves the evaluation of repair/strengthening patches under both static and dynamic loading. Therefore, before deliberately introducing significant damage, the first set of dynamic tests was conducted and modal properties were estimated. The measured modal properties did not match the modal analysis from the statically updated FE model; it was clear that the existing model required updating. This paper introduces the results of the dynamic testing and model updating. It is shown that the structure exhibits large non-linear, amplitude-dependent characteristics. This makes updating difficult, but we attempt to produce the best linear representation of the structure. A sensitivity analysis is performed to determine the most sensitive locations for the planned damage/repair scenarios and is used to decide whether additional sensors will be necessary.

  16. An intervention study to test Locker's conceptual framework of oral health in edentulous elders.

    PubMed

    Yamaga, Eijiro; Sato, Yusuke; Minakuchi, Shunsuke

    2018-06-01

    To test a previously described conceptual framework of oral health in edentulous elders using an intervention study that included complete denture replacement. Confirmatory factor analysis (CFA) was also conducted to substantiate construct validity. To date, the model proposed by Locker has been tested on edentulous elders using structural equation model (SEM) analysis. However, cross-sectional designs and the Short-Form Oral Health Impact Profile (OHIP-14) cannot adequately express cause-effect relationships and distribution in edentulous patients. Accordingly, the authors investigated Locker's model using an interventional design that included complete denture replacement using the OHIP for edentulous subjects (OHIP-EDENT). A total of 265 edentulous participants who visited the Dental Hospital of Tokyo Medical and Dental University (Tokyo, Japan) for new complete dentures were recruited. Locker's model was investigated, and CFA was performed using the change in subscale scores in the Japanese version of the OHIP-EDENT before and after complete denture replacement. CFA demonstrated an excellent model fit after adding several covariates. The Locker model also met the criteria of fit in all indices after 1 nonsignificant path was omitted. All path coefficients were significant. The findings of the present interventional study demonstrated an empirical fit to Locker's model in edentulous elders using SEM analysis, which included complete denture replacement. It is anticipated that clarification of causal mechanisms of oral health-related quality of life will lead to improvement of overall quality of life, thus maintaining or improving the activities of normal daily life for edentulous elders. © 2018 John Wiley & Sons A/S and The Gerodontology Association. Published by John Wiley & Sons Ltd.

  17. The NASA/Industry Design Analysis Methods for Vibrations (DAMVIBS) Program - A government overview. [of rotorcraft technology development using finite element method

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.

    1992-01-01

    An overview is presented of government contributions to the Design Analysis Methods for Vibrations (DAMVIBS) program, which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMVIBS program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.

  18. Modeling and Analysis of Wrinkled Membranes: An Overview

    NASA Technical Reports Server (NTRS)

    Yang, B.; Ding, H.; Lou, M.; Fang, H.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Thin-film membranes are basic elements of a variety of space inflatable/deployable structures. Wrinkling degrades the performance and reliability of these membrane structures, and hence has been a topic of continued interest. Wrinkling analysis of membranes with general geometry and arbitrary boundary conditions is quite challenging. The objective of this presentation is two-fold. First, the existing models of wrinkled membranes and related numerical solution methods are reviewed. The important issues discussed are: the capability of a membrane model to characterize the taut, wrinkled and slack states of membranes in a consistent and physically reasonable manner; the ability of a wrinkling analysis method to predict the formation and growth of wrinkled regions and to determine out-of-plane deformation and wrinkle waves; the convergence of a numerical solution method for wrinkling analysis; and the compatibility of a wrinkling analysis with general-purpose finite element codes. Based on this review, several open issues in the modeling and analysis of wrinkled membranes that are to be addressed in future research are summarized. The second objective of this presentation is to introduce a newly developed membrane model with two variable parameters (2-VP model) and an associated parametric finite element method (PFEM) for wrinkling analysis. The innovations and advantages of the proposed membrane model and PFEM-based wrinkling analysis are: (1) via a unified stress-strain relation, the 2-VP model treats the taut, wrinkled, and slack states of membranes consistently; (2) the PFEM-based wrinkling analysis has guaranteed convergence; (3) the 2-VP model along with the PFEM is capable of predicting membrane out-of-plane deformations; and (4) the PFEM can be integrated into any existing finite element code. Preliminary numerical examples are also included in this presentation to demonstrate the 2-VP model and the PFEM-based wrinkling analysis approach.

  19. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.

  20. A New Modular Approach for Tightly Coupled Fluid/Structure Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru

    2003-01-01

    Static aeroelastic computations are made using a C++ executive suitable for closely coupled fluid/structure interaction studies. The fluid flow is modeled using the Euler/Navier-Stokes equations and the structure is modeled using finite elements. FORTRAN-based fluids and structures codes are integrated in the C++ environment. The flow and structural solvers are treated as separate object files, and the data flow between fluids and structures is accomplished using I/O. Results are demonstrated for transonic flow over a partially flexible surface, a configuration important for aerospace vehicles. Use of this development to accurately predict flow-induced structural failure will be demonstrated.

  1. Two- and three-dimensional natural and mixed convection simulation using modular zonal models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, E.; Nataf, J.M.; Winkelmann, F.

    We demonstrate the use of the zonal model approach, which is a simplified method for calculating natural and mixed convection in rooms. Zonal models use a coarse grid and use balance equations, state equations, hydrostatic pressure drop equations, and power-law flow equations of the form m = C(ΔP)^n. The advantages of the zonal approach and its modular implementation are discussed. The zonal model resolution of nonlinear equation systems is demonstrated for three cases: a 2-D room, a 3-D room, and a pair of 3-D rooms separated by a partition with an opening. A sensitivity analysis with respect to physical parameters and grid coarseness is presented. Results are compared to computational fluid dynamics (CFD) calculations and experimental data.
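
    A toy version of the nonlinear system such a model solves is shown below: two zones exchanging mass through power-law connections, with zone pressures found by a root solver. The coefficients, pressures, and topology are illustrative assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    C, n = 0.8, 0.5                      # power-law coefficient and exponent

    def flow(dP):
        """Signed power-law mass flow m = C * |dP|^n * sign(dP)."""
        return C * np.sign(dP) * abs(dP) ** n

    def residuals(p, p_supply=10.0, p_exhaust=0.0):
        p1, p2 = p
        # Mass balance per zone: inflow minus outflow must vanish.
        r1 = flow(p_supply - p1) - flow(p1 - p2)       # zone 1
        r2 = flow(p1 - p2) - flow(p2 - p_exhaust)      # zone 2
        return [r1, r2]

    p1, p2 = fsolve(residuals, x0=[5.0, 2.0])
    print(f"zone pressures: {p1:.3f}, {p2:.3f}")       # expect ~6.667, ~3.333
    ```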

  2. Development and Demonstration of a Computational Tool for the Analysis of Particle Vitiation Effects in Hypersonic Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Perkins, Hugh Douglas

    2010-01-01

    In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one-dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code, a series of computations were performed for a model hypersonic propulsion test facility and scramjet. The parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.

  3. SCALE TSUNAMI Analysis of Critical Experiments for Validation of 233U Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T

    2009-01-01

    Oak Ridge National Laboratory (ORNL) staff used the SCALE TSUNAMI tools to provide a demonstration evaluation of critical experiments considered for use in validation of current and anticipated operations involving 233U at the Radiochemical Development Facility (RDF). This work was reported in ORNL/TM-2008/196, issued in January 2009. This paper presents the analysis of two representative safety analysis models provided by RDF staff.

  4. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T.

    Cyber physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber components and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses documented by the NESCOR Working Group study. From the Section 5 electric sector representative failure scenarios, we extracted the four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability) to the system. These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios, in terms of how those scenarios might impact the cyber physical infrastructure network with respect to confidentiality, integrity, and availability (CIA).
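
    At its simplest, the game-theoretic layer can be pictured as a loss matrix over defender postures and attack categories; the sketch below computes a worst-case (minimax) pure defender posture. The matrix entries are invented stand-ins for scenario-derived impact scores, not values from the NESCOR study.

    ```python
    import numpy as np

    # Rows = defender postures; columns = attack categories
    # (confidentiality, integrity, availability). Entries = defender losses.
    loss = np.array([[4.0, 7.0, 6.0],    # posture A
                     [5.0, 3.0, 8.0],    # posture B
                     [6.0, 5.0, 4.0]])   # posture C

    # Zero-sum reading: hedge by minimizing the worst-case loss over attacks.
    worst_case = loss.max(axis=1)
    best_posture = int(worst_case.argmin())
    print("worst-case loss per posture:", worst_case)
    print("minimax pure posture:", "ABC"[best_posture])
    ```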

  5. Midwives׳ clinical reasoning during second stage labour: Report on an interpretive study.

    PubMed

    Jefford, Elaine; Fahy, Kathleen

    2015-05-01

    Clinical reasoning was once thought to be the exclusive domain of medicine, setting it apart from 'non-scientific' occupations like midwifery. Poor assessment, clinical reasoning and decision-making skills are well-known contributors to adverse outcomes in maternity care. Midwifery decision-making models share a common deficit: they are insufficiently detailed to guide reasoning processes for midwives in practice. For these reasons we wanted to explore whether midwives actively engage in clinical reasoning processes within their clinical practice and, if so, to what extent. The study was conducted using post-structural, feminist methodology, addressing the question: to what extent do midwives engage in clinical reasoning processes when making decisions in second stage labour? Twenty-six practising midwives were interviewed. Feminist interpretive analysis was conducted by two researchers guided by the steps of a model of the clinical reasoning process. Six narratives were excluded from analysis because they did not sufficiently address the research question. The midwives' narratives were prepared via data reduction, and a theoretically informed analysis and interpretation was conducted. Using a feminist, interpretive approach we created a model of midwifery clinical reasoning grounded in the literature and consistent with the data. Thirteen of the 20 participant narratives demonstrate analytical clinical reasoning abilities, but only nine completed the process and implemented the decision. Seven midwives used non-analytical decision-making without adequately checking against assessment data. Over half of the participants demonstrated the ability to use clinical reasoning skills. Less than half of the midwives demonstrated clinical reasoning as their way of making decisions. The new model of Midwifery Clinical Reasoning includes 'intuition' as a valued way of knowing. Using intuition, however, should not replace clinical reasoning, which promotes decision-making that can be made transparent and consensually validated. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Organizational Analysis and Career Projections Based on a Level-of-Responsibility/Equitable Payment Model. Technical Report.

    ERIC Educational Resources Information Center

    Laner, Stephen; And Others

    Following an explanation of the Level of Responsibility/Equitable Pay Function, its applicability to the analysis and to the design and redesign of organizational hierarchies is demonstrated. It is shown how certain common dysfunctional anomalies can be avoided by structuring an organization along the principles outlined. A technique is then…

  7. Beyond Traditional School Value-Added Models: A Multilevel Analysis of Complex School Effects in Chile

    ERIC Educational Resources Information Center

    Troncoso, Patricio; Pampaka, Maria; Olsen, Wendy

    2016-01-01

    School value-added studies have largely demonstrated the effects of socioeconomic and demographic characteristics of the schools and the pupils on performance in standardised tests. Traditionally, these studies have assessed the variation coming only from the schools and the pupils. However, recent studies have shown that the analysis of academic…

  8. Analyzing Developmental Processes on an Individual Level Using Nonstationary Time Series Modeling

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Sinclair, Katerina O.; Rovine, Michael J.; Ram, Nilam; Corneal, Sherry E.

    2009-01-01

    Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in…

  9. Passing the Test: Ecological Regression Analysis in the Los Angeles County Case and Beyond.

    ERIC Educational Resources Information Center

    Lichtman, Allan J.

    1991-01-01

    Statistical analysis of racially polarized voting prepared for the Garza v County of Los Angeles (California) (1990) voting rights case is reviewed to demonstrate that ecological regression is a flexible, robust technique that illuminates the reality of ethnic voting, and superior to the neighborhood model supported by the defendants. (SLD)

  10. Optimization of Soft Tissue Management, Spacer Design, and Grafting Strategies for Large Segmental Bone Defects using the Chronic Caprine Tibial Defect Model

    DTIC Science & Technology

    2014-10-01

    histology, and microCT analysis. In the current phase of work he will receive more specialized training and orientation to microCT analysis... fibrous connective tissue. • Performed histology on goat autogenous bone graft, which demonstrated that the quantity and quality of cancellous bone graft

  11. Sage Simulation Model for Technology Demonstration Convertor by a Step-by-Step Approach

    NASA Technical Reports Server (NTRS)

    Demko, Rikako; Penswick, L. Barry

    2006-01-01

    The development of a Stirling model using the 1-D Sage design code was completed using a step-by-step approach. This is a method of gradually increasing the complexity of the Sage model while observing the energy balance and energy losses at each step of the development. This step-by-step model development and energy-flow analysis can clarify where the losses occur and their impact, and can suggest possible opportunities for design improvement.

  12. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    PubMed

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements in coevolution analysis; however, thus far, the performance of these models has mainly been assessed with a focus on protein structure. In this study, we built an MRF model whose graphical topology is determined by residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weights of the MRF model. This structure-based MRF method was evaluated on three data sets, which annotate catalytic sites, allosteric sites, and comprehensively determined functional sites, respectively. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can represent positional coevolution information more accurately than the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture can be an acceptable approximation for coevolution modeling with efficient computational complexity.

  13. Genome-scale metabolic modeling of Mucor circinelloides and comparative analysis with other oleaginous species.

    PubMed

    Vongsangnak, Wanwipa; Klanchui, Amornpan; Tawornsamretkit, Iyarest; Tatiyaborwornchai, Witthawin; Laoteng, Kobkul; Meechai, Asawin

    2016-06-01

    We present a novel genome-scale metabolic model iWV1213 of Mucor circinelloides, which is an oleaginous fungus for industrial applications. The model contains 1213 genes, 1413 metabolites and 1326 metabolic reactions across different compartments. We demonstrate that iWV1213 is able to accurately predict the growth rates of M. circinelloides on various nutrient sources and culture conditions using Flux Balance Analysis and Phenotypic Phase Plane analysis. Comparative analysis of three oleaginous genome-scale models, including M. circinelloides (iWV1213), Mortierella alpina (iCY1106) and Yarrowia lipolytica (iYL619_PCP), revealed that iWV1213 possesses a higher number of genes involved in carbohydrate, amino acid, and lipid metabolisms that might contribute to its versatility in nutrient utilization. Moreover, the identification of unique and common active reactions among the Zygomycetes oleaginous models using Flux Variability Analysis unveiled a set of gene/enzyme candidates as metabolic engineering targets for cellular improvement. Thus, iWV1213 offers a powerful metabolic engineering tool for multi-level omics analysis, enabling strain optimization as a cell factory platform for lipid-based production.
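
    The growth-rate predictions rest on Flux Balance Analysis, which is a linear program: maximize a biomass flux subject to the steady-state mass balance S v = 0 and flux bounds. A minimal sketch on a toy three-reaction network (not the iWV1213 reconstruction; all names and numbers are illustrative):

        # Toy flux balance analysis: maximize biomass flux v2 subject to S v = 0.
        import numpy as np
        from scipy.optimize import linprog

        # Stoichiometric matrix S (rows: metabolites A, B; columns: reactions
        # R1: -> A, R2: A -> B (biomass precursor), R3: B ->).
        S = np.array([[ 1, -1,  0],
                      [ 0,  1, -1]])
        bounds = [(0, 10), (0, None), (0, None)]   # uptake R1 capped at 10 units

        # linprog minimizes, so negate the biomass coefficient to maximize v2.
        c = np.array([0, -1, 0])
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal biomass flux:", res.x[1])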

  14. Diagnostik von Kontingenzerfahrungen in der fruehen Kindheit. Forschungsbericht Rapport Scientifique. Nr. 68. (Methods of Social Contingency Analysis. FB Nr. 68).

    ERIC Educational Resources Information Center

    Perrez, Meinrad

    Written in German, this article demonstrates the influence of different types of contingency information on the development of infant's locus of control and causal attribution, and discusses empirical models for calculating contingency parameters of the microsocial environment of infants, toddlers, and preschool children. Models discussed include:…

  15. Understanding the Common Elements of Evidence-Based Practice: Misconceptions and Clinical Examples

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Becker, Kimberly D.; Daleiden, Eric L.

    2007-01-01

    In this article, the authors proposed a distillation and matching model (DMM) that describes how evidence-based treatment operations can be conceptualized at a lower order level of analysis than simply by their manuals. Also referred to as the "common elements" approach, this model demonstrates the feasibility of coding and identifying the…

  16. High Speed Trimaran (HST) Seatrain Experiments, Model 5714

    DTIC Science & Technology

    2013-12-01

    Excerpt from the report's table of contents: Marine Highway; Historical Seatrains; Objectives; Hull & Model Description; Data Acquisition and Instrumentation; Operational Demonstration Measurement System; Experimental Procedures; Carriage II - Deep Water Basin Test; Calm Water Resistance; Deep Water Basin Analysis; Longitudinal Flow Through the Propeller Plane; Body Forces & Moments.

  17. Controlled impact demonstration airframe bending bridges

    NASA Technical Reports Server (NTRS)

    Soltis, S. J.

    1986-01-01

    The calibration of the KRASH and DYCAST models for transport aircraft is discussed. The FAA uses computer analysis techniques to predict the response of the controlled impact demonstration (CID) aircraft during impact. The moment bridges can provide a direct correlation between the loads or moments that the models predict and those experienced during the actual impact. Another goal is to examine structural failure mechanisms and correlate them with analytical predictions. The bending bridges did achieve their goals and objectives. The data traces provide some insight into airframe loads and structural response, and they demonstrate quite clearly what is happening to the airframe. A direct quantification of metal airframe loads was measured by the moment bridges, and the measured moments can be correlated with the KRASH and DYCAST computer models. The bending bridge data support airframe failure mechanism analysis and provide residual airframe strength estimation. It did not appear that any of the bending bridges on the airframe exceeded limit loads. (The observed airframe fracture was due to the fuselage encounter with the tomahawk, which tore out the keel beam.) The airframe bridges can be used to estimate the impact conditions, and those estimates correlate with some of the other data measurements. Structural response, frequency, and structural damping are readily measured by the moment bridges.

  18. Propulsion Powertrain Real-Time Simulation Using Hardware-in-the-Loop (HIL) for Aircraft Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Brown, Gerald V.

    2017-01-01

    It is essential to design a propulsion powertrain real-time simulator using a hardware-in-the-loop (HIL) system that emulates an electrified aircraft propulsion (EAP) system's power grid. Such a simulator would facilitate in-depth understanding of the system principles, validate system model analysis and performance prediction, and demonstrate the proof of concept of the EAP electrical system. This paper describes how subscale electrical machines with their controllers can mimic the power components in an EAP powertrain. In particular, three powertrain emulations are presented to mimic 1) a gas turbo-shaft engine driving a generator, consisting of two permanent magnet (PM) motors with brushless motor drives, coupled by a shaft, 2) a motor driving a propulsive fan, and 3) a turbo-shaft engine driven fan (turbofan engine) operation. As a first step toward the demonstration, experimental dynamic characterization of the two motor drive systems, coupled by a mechanical shaft, was performed. The previously developed analytical motor models were then replaced with the experimental motor models to perform the real-time demonstration over predefined flight path profiles. This technique can convert the plain motor system into a unique EAP power grid emulator that enables rapid analysis and real-time simulation performance using hardware-in-the-loop (HIL).

  19. Demonstration of frequency-sweep testing technique using a Bell 214-ST helicopter

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Diekmann, Vernon L.; Williams, Robert A.; Cason, Randall W.

    1987-01-01

    A demonstration of frequency-sweep testing using a Bell 214-ST single-rotor helicopter was completed in support of the Army's development of an updated MIL-H-8501A and an LHX (ADS-33) handling-qualities specification. Hover and level-flight (V_a = 0 knots and V_a = 90 knots) tests were conducted in 3 flight hours by Army test pilots at the Army Aviation Engineering Flight Activity (AEFA) at Edwards AFB, Calif. Bandwidth and phase-delay parameters were determined from the flight-extracted frequency responses as required by the proposed specifications. Transfer function modeling and verification demonstrate the validity of the frequency-response concept for characterizing the closed-loop flight dynamics of single-rotor helicopters -- even in hover. This report documents the frequency-sweep flight-testing technique and data-analysis procedures. Special emphasis is given to piloting and analysis considerations which are important for demonstrating frequency-domain specification compliance.

  20. Modelling land use change with generalized linear models--a multi-model analysis of change between 1860 and 2000 in Gallatin Valley, Montana.

    PubMed

    Aspinall, Richard

    2004-08-01

    This paper develops an approach to modelling land use change that links model selection and multi-model inference with empirical models and GIS. Land use change is frequently studied, and understanding gained, through a process of modelling that is an empirical analysis of documented changes in land cover or land use patterns. The approach here is based on analysis and comparison of multiple models of land use patterns using model selection and multi-model inference. The approach is illustrated with a case study of rural housing as it has developed for part of Gallatin County, Montana, USA. A GIS contains the location of rural housing on a yearly basis from 1860 to 2000. The database also documents a variety of environmental and socio-economic conditions. A general model of settlement development describes the evolution of drivers of land use change and their impacts in the region. This model is used to develop a series of different models reflecting drivers of change at different periods in the history of the study area. These period-specific models represent a series of multiple working hypotheses describing (a) the effects of spatial variables as a representation of social, economic and environmental drivers of land use change, and (b) temporal changes in the effects of the spatial variables as the drivers of change evolve over time. Logistic regression is used to calibrate and interpret these models, and the models are then compared and evaluated with model selection techniques. Results show that different models are 'best' for the different periods. The different models for different periods demonstrate that models are not invariant over time, which presents challenges for validation and testing of empirical models. The research demonstrates (i) model selection as a mechanism for selecting among many plausible models that describe land cover or land use patterns, (ii) inference from a set of models rather than from a single model, (iii) that models can be developed from hypothesised relationships based on consideration of underlying and proximate causes of change, and (iv) that models are not invariant over time.
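
    The calibrate-then-compare loop in the abstract maps onto a few lines of code. This sketch fits competing logistic models of housing presence on synthetic covariates and ranks them by AIC; the variable names are placeholders, not the study's GIS layers.

        # Hypothetical model-selection step: rank period-specific logistic models by AIC.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        dist_road = rng.exponential(2.0, n)      # placeholder driver variables
        elevation = rng.normal(1500, 200, n)
        logit_p = 1.0 - 0.8 * dist_road          # housing odds fall with distance
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        candidates = {
            "access only":  sm.add_constant(dist_road),
            "terrain only": sm.add_constant(elevation),
            "both":         sm.add_constant(np.column_stack([dist_road, elevation])),
        }
        fits = {name: sm.Logit(y, X).fit(disp=0) for name, X in candidates.items()}
        for name, fit in sorted(fits.items(), key=lambda kv: kv[1].aic):
            print(f"{name:12s} AIC = {fit.aic:.1f}")   # lowest AIC = preferred model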

  1. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic System T-MATS

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.

  2. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.

  3. A Stochastic-Variational Model for Soft Mumford-Shah Segmentation

    PubMed Central

    2006-01-01

    In contemporary image and vision analysis, stochastic approaches demonstrate great flexibility in representing and modeling complex phenomena, while variational-PDE methods gain enormous computational advantages over Monte Carlo or other stochastic algorithms. In combination, the two can lead to much more powerful novel models and efficient algorithms. In the current work, we propose a stochastic-variational model for soft (or fuzzy) Mumford-Shah segmentation of mixture image patterns. Unlike the classical hard Mumford-Shah segmentation, the new model allows each pixel to belong to each image pattern with some probability. Soft segmentation could lead to hard segmentation, and hence is more general. The modeling procedure, mathematical analysis on the existence of optimal solutions, and computational implementation of the new model are explored in detail, and numerical examples of both synthetic and natural images are presented. PMID:23165059

  4. A lumped parameter mathematical model for simulation of subsonic wind tunnels

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Cole, G. L.; Bruton, W. M.; Szuch, J. R.

    1986-01-01

    Equations for a lumped parameter mathematical model of a subsonic wind tunnel circuit are presented. The equation state variables are internal energy, density, and mass flow rate. The circuit model is structured to allow for integration and analysis of tunnel subsystem models which provide functions such as control of altitude pressure and temperature. Thus the model provides a useful tool for investigating the transient behavior of the tunnel and control requirements. The model was applied to the proposed NASA Lewis Altitude Wind Tunnel (AWT) circuit and included transfer function representations of the tunnel supply/exhaust air and refrigeration subsystems. Both steady state and frequency response data are presented for the circuit model indicating the type of results and accuracy that can be expected from the model. Transient data for closed loop control of the tunnel and its subsystems are also presented, demonstrating the model's use as a control analysis tool.

  5. A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence J.; Nadell, Shari-Beth

    1999-01-01

    A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.

  6. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  7. Modeling of multi-rotor torsional vibrations in rotating machinery using substructuring

    NASA Technical Reports Server (NTRS)

    Soares, Fola R.

    1986-01-01

    The application of FEM modeling techniques to the analysis of torsional vibrations in complex rotating systems is described and demonstrated, summarizing results reported by Soares (1985). A substructuring approach is used for determination of torsional natural frequencies and resonant-mode shapes, steady-state frequency-sweep analysis, identification of dynamically unstable speed ranges, and characterization of transient linear and nonlinear systems. Results for several sample problems are presented in diagrams, graphs, and tables. STORV, a computer code based on this approach, is in use as a preliminary design tool for drive-train torsional analysis in the High Altitude Wind Tunnel at NASA Lewis.
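
    The core computation behind such torsional analysis is a generalized eigenproblem K x = w^2 J x for the stiffness and inertia matrices of the shaft train. A minimal sketch on a three-inertia toy model (values are illustrative, not from STORV):

        # Torsional natural frequencies of a three-inertia shaft train.
        import numpy as np
        from scipy.linalg import eigh

        J = np.diag([2.0, 1.0, 1.5])                 # polar inertias, kg*m^2
        k1, k2 = 1.0e4, 2.0e4                        # shaft stiffnesses, N*m/rad
        K = np.array([[ k1, -k1,      0],
                      [-k1, k1 + k2, -k2],
                      [  0, -k2,      k2]])

        w2, modes = eigh(K, J)                       # solves K x = w^2 J x
        # Clip tiny negative round-off on the rigid-body (zero-frequency) mode.
        freqs_hz = np.sqrt(np.clip(w2, 0, None)) / (2 * np.pi)
        print("torsional natural frequencies [Hz]:", np.round(freqs_hz, 2))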

  8. Demonstration and Pantomime in the Evolution of Teaching.

    PubMed

    Gärdenfors, Peter

    2017-01-01

    Donald proposes that early Homo evolved mimesis as a new form of cognition. This article investigates the mimesis hypothesis in relation to the evolution of teaching. The fundamental capacities that distinguish hominin teaching from that of other animals are demonstration and pantomime. A conceptual analysis of the instructional and communicative functions of demonstration and pantomime is presented. Archaeological evidence that demonstration was used for transmitting the Oldowan technology is summarized. It is argued that pantomime develops out of demonstration so that the primary objective of pantomime is that the onlooker learns the motoric patterns shown in the pantomime. The communicative use of pantomime is judged to be secondary. This use of pantomime is also contrasted with other forms of gestures. A key feature of the analysis is that the meaning of a pantomime is characterized by the force patterns of the movements. These force patterns form the core of a model of the cognitive mechanism behind pantomime. Finally, the role of pantomime in the evolution of language is also discussed.

  9. Demonstration and Pantomime in the Evolution of Teaching

    PubMed Central

    Gärdenfors, Peter

    2017-01-01

    Donald proposes that early Homo evolved mimesis as a new form of cognition. This article investigates the mimesis hypothesis in relation to the evolution of teaching. The fundamental capacities that distinguish hominin teaching from that of other animals are demonstration and pantomime. A conceptual analysis of the instructional and communicative functions of demonstration and pantomime is presented. Archaeological evidence that demonstration was used for transmitting the Oldowan technology is summarized. It is argued that pantomime develops out of demonstration so that the primary objective of pantomime is that the onlooker learns the motoric patterns shown in the pantomime. The communicative use of pantomime is judged to be secondary. This use of pantomime is also contrasted with other forms of gestures. A key feature of the analysis is that the meaning of a pantomime is characterized by the force patterns of the movements. These force patterns form the core of a model of the cognitive mechanism behind pantomime. Finally, the role of pantomime in the evolution of language is also discussed. PMID:28382011

  10. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    NASA Astrophysics Data System (ADS)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach of reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37--38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling -- High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected by mostly only first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates including the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate between its own uncertainty bounds. Therefore this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
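
    The Principal Component Analysis step can be illustrated compactly: stack local sensitivity coefficients from many canonical cases into one matrix and rank reactions by their loadings on the leading components. The matrix below is random stand-in data, and the ranking rule is one common choice, not necessarily the dissertation's exact procedure.

        # PCA-based ranking of reactions from a stacked local-sensitivity matrix.
        import numpy as np

        rng = np.random.default_rng(2)
        n_cases, n_reactions = 200, 50
        S = rng.normal(size=(n_cases, n_reactions))  # rows: conditions, cols: reactions
        S[:, :5] *= 10.0                             # pretend 5 reactions dominate

        Sc = S - S.mean(axis=0)                      # center before PCA
        _, sing, Vt = np.linalg.svd(Sc, full_matrices=False)
        # Score each reaction by its loading on the leading components,
        # weighted by the component variances.
        scores = (sing[:3, None] ** 2 * Vt[:3] ** 2).sum(axis=0)
        print("top reactions by PCA score:", np.argsort(scores)[::-1][:5])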

  11. Marginal analysis in assessing factors contributing time to physician in the Emergency Department using operations data.

    PubMed

    Pathan, Sameer A; Bhutta, Zain A; Moinudheen, Jibin; Jenkins, Dominic; Silva, Ashwin D; Sharma, Yogdutt; Saleh, Warda A; Khudabakhsh, Zeenat; Irfan, Furqan B; Thomas, Stephen H

    2016-01-01

    Background: Standard Emergency Department (ED) operations goals include minimization of the time interval (tMD) between patients' initial ED presentation and initial physician evaluation. This study assessed factors known (or suspected) to influence tMD with a two-step goal. The first step was generation of a multivariate model identifying parameters associated with prolongation of tMD at a single study center. The second step was the use of a study center-specific multivariate tMD model as a basis for predictive marginal probability analysis; the marginal model allowed for prediction of the degree of ED operations benefit that would be effected with specific ED operations improvements. Methods: The study was conducted using one month (May 2015) of data obtained from an ED administrative database (EDAD) in an urban academic tertiary ED with an annual census of approximately 500,000; during the study month, the ED saw 39,593 cases. The EDAD data were used to generate a multivariate linear regression model assessing the various demographic and operational covariates' effects on the dependent variable tMD. Predictive marginal probability analysis was used to calculate the relative contributions of key covariates as well as demonstrate the likely tMD impact of modifying those covariates with operational improvements. Analyses were conducted with Stata 14MP, with significance defined at p < 0.05 and confidence intervals (CIs) reported at the 95% level. Results: In an acceptable linear regression model that accounted for just over half of the overall variance in tMD (adjusted r^2 = 0.51), important contributors to tMD included shift census (p = 0.008), shift time of day (p = 0.002), and physician coverage (p = 0.004). These strong associations remained even after adjusting for each other and other covariates. Marginal predictive probability analysis was used to predict the overall tMD impact (improvement from 50 to 43 minutes, p < 0.001) of consistent staffing with 22 physicians. Conclusions: The analysis identified expected variables contributing to tMD, with regression demonstrating the significance and effect magnitude of alterations in covariates including patient census, shift time of day, and number of physicians. Marginal analysis provided an operationally useful demonstration of the need to adjust physician coverage numbers, prompting changes at the study ED. The methods used in this analysis may prove useful in other EDs wishing to analyze operations information with the goal of predicting which interventions may have the most benefit.
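
    The marginal-analysis step has a simple skeleton: fit the regression, then average predictions with one covariate held at a counterfactual value. The sketch below uses synthetic data and placeholder covariates, not the EDAD fields.

        # Marginal prediction from a linear model: hold staffing at 22 everywhere.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 2000
        census = rng.poisson(60, n)                  # patients arriving per shift
        docs = rng.integers(16, 25, n)               # physicians on duty
        t_md = 20 + 0.6 * census - 1.2 * docs + rng.normal(0, 8, n)

        X = sm.add_constant(np.column_stack([census, docs]))
        fit = sm.OLS(t_md, X).fit()

        X_cf = X.copy()                              # counterfactual design matrix
        X_cf[:, 2] = 22                              # fix physician count at 22
        print("observed mean tMD:", round(fit.predict(X).mean(), 1))
        print("predicted tMD at 22 physicians:", round(fit.predict(X_cf).mean(), 1))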

  12. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  13. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
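
    The coupled non-linear differential equations of the bivalent analyte model are straightforward to integrate numerically. A sketch with arbitrary illustrative rate constants (not fitted values) follows; the SPR response is taken as proportional to occupied ligand sites.

        # Bivalent analyte kinetics: A + L <-> AL (ka1/kd1), AL + L <-> AL2 (ka2/kd2).
        import numpy as np
        from scipy.integrate import solve_ivp

        ka1, kd1, ka2, kd2 = 1e5, 1e-2, 1e-3, 1e-3   # illustrative constants
        A, Lmax = 100e-9, 1.0                        # analyte conc. [M], surface capacity

        def rhs(t, y):
            al, al2 = y
            free = Lmax - al - 2 * al2               # unoccupied ligand sites
            d_al = ka1 * A * free - kd1 * al - ka2 * al * free + kd2 * al2
            d_al2 = ka2 * al * free - kd2 * al2
            return [d_al, d_al2]

        sol = solve_ivp(rhs, (0, 600), [0.0, 0.0])
        signal = sol.y[0] + 2 * sol.y[1]             # response ~ bound sites
        print("response at t = 600 s:", round(float(signal[-1]), 3))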

  14. The development of advanced manufacturing systems

    NASA Astrophysics Data System (ADS)

    Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel

    Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.

  15. Electronic health record acceptance by physicians: testing an integrated theoretical model.

    PubMed

    Gagnon, Marie-Pierre; Ghandour, El Kebir; Talla, Pascaline Kengne; Simonyan, David; Godin, Gaston; Labrecque, Michel; Ouimet, Mathieu; Rousseau, Michel

    2014-04-01

    Several countries are in the process of implementing an Electronic Health Record (EHR), but limited physicians' acceptance of this technology presents a serious threat to its successful implementation. The aim of this study was to identify the main determinants of physician acceptance of EHR in a sample of general practitioners and specialists of the Province of Quebec (Canada). We sent an electronic questionnaire to physician members of the Quebec Medical Association. We tested four theoretical models (Technology acceptance model (TAM), Extended TAM, Psychosocial Model, and Integrated Model) using path analysis and multiple linear regression analysis in order to identify the main determinants of physicians' intention to use the EHR. We evaluated the modifying effect of sociodemographic characteristics using multi-group analysis of structural weights invariance. A total of 157 questionnaires were returned. The four models performed well and explained between 44% and 55% of the variance in physicians' intention to use the EHR. The Integrated Model performed the best and showed that perceived ease of use, professional norm, social norm, and demonstrability of the results are the strongest predictors of physicians' intention to use the EHR. Age, gender, previous experience and specialty modified the association between those determinants and intention. The proposed integrated theoretical model is useful in identifying which factors could motivate physicians from different backgrounds to use the EHR. Physicians who perceive the EHR to be easy to use, coherent with their professional norms, supported by their peers and patients, and able to demonstrate tangible results are more likely to accept this technology. Age, gender, specialty and experience should also be taken into account when developing EHR implementation strategies targeting physicians.

  16. Analysis of out-of-plane thermal microactuators

    NASA Astrophysics Data System (ADS)

    Atre, Amarendra

    2006-02-01

    Out-of-plane thermal microactuators find applications in optical switches, where they actuate micromirrors. Accurate analysis of such actuators is beneficial for improving existing designs and constructing more energy-efficient actuators. However, the analysis is complicated by the nonlinear deformation of the thermal actuators along with temperature-dependent properties of polysilicon. This paper describes the development, modeling issues and results of a three-dimensional multiphysics nonlinear finite element model of surface micromachined out-of-plane thermal actuators. The model includes conductive and convective cooling effects and takes into account the effect of variable air gap on the response of the actuator. The model is implemented to investigate the characteristics of two diverse MUMPs-fabricated out-of-plane thermal actuators. Reasonable agreement is observed between simulated and measured results for the model that considers the influence of air gap on actuator response. The usefulness of the model is demonstrated by implementing it to observe the effect of actuator geometry variation on steady-state deflection response.

  17. Minding the Cyber-Physical Gap: Model-Based Analysis and Mitigation of Systemic Perception-Induced Failure.

    PubMed

    Mordecai, Yaniv; Dori, Dov

    2017-07-17

    The cyber-physical gap (CPG) is the difference between the 'real' state of the world and the way the system perceives it. This discrepancy often stems from the limitations of sensing and data collection technologies and capabilities, and is inevitable to some degree in any cyber-physical system (CPS). Ignoring or misrepresenting such limitations during system modeling, specification, design, and analysis can potentially result in systemic misconceptions, disrupted functionality and performance, system failure, severe damage, and potential detrimental impacts on the system and its environment. We propose CPG-Aware Modeling & Engineering (CPGAME), a conceptual model-based approach to capturing, explaining, and mitigating the CPG. CPGAME enhances the systems engineer's ability to cope with CPGs, mitigate them by design, and prevent erroneous decisions and actions. We demonstrate CPGAME by applying it to modeling and analysis of the 1979 Three Mile Island 2 nuclear accident, and show how its meltdown could be mitigated. We use ISO 19450:2015 Object-Process Methodology as our conceptual modeling framework.

  18. How much detail and accuracy is required in plant growth sub-models to address questions about optimal management strategies in agricultural systems?

    PubMed Central

    Renton, Michael

    2011-01-01

    Background and aims: Simulations that integrate sub-models of important biological processes can be used to ask questions about optimal management strategies in agricultural and ecological systems. Building sub-models with more detail and aiming for greater accuracy and realism may seem attractive, but is likely to be more expensive and time-consuming and result in more complicated models that lack transparency. This paper illustrates a general integrated approach for constructing models of agricultural and ecological systems that is based on the principle of starting simple and then directly testing for the need to add additional detail and complexity. Methodology: The approach is demonstrated using LUSO (Land Use Sequence Optimizer), an agricultural system analysis framework based on simulation and optimization. A simple sensitivity analysis and functional perturbation analysis is used to test to what extent LUSO's crop–weed competition sub-model affects the answers to a number of questions at the scale of the whole farming system regarding optimal land-use sequencing strategies and resulting profitability. Principal results: The need for accuracy in the crop–weed competition sub-model within LUSO depended to a small extent on the parameter being varied, but more importantly and interestingly on the type of question being addressed with the model. Only a small part of the crop–weed competition model actually affects the answers to these questions. Conclusions: This study illustrates an example application of the proposed integrated approach for constructing models of agricultural and ecological systems based on testing whether complexity needs to be added to address particular questions of interest. We conclude that this example clearly demonstrates the potential value of the general approach. Advantages of this approach include minimizing costs and resources required for model construction, keeping models transparent and easy to analyse, and ensuring the model is well suited to address the question of interest. PMID:22476477

  19. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    NASA Astrophysics Data System (ADS)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.
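
    Numerically, the method reduces to probing derivatives of the microcanonical entropy S(E). The sketch below uses a smooth toy entropy rather than simulation data and flags energies where the inverse temperature beta(E) = dS/dE has an extremum, i.e. where a higher derivative of S(E) changes sign.

        # Toy microcanonical inflection-point analysis on a synthetic S(E).
        import numpy as np

        E = np.linspace(-2.0, 2.0, 2001)
        S = E - 0.2 * np.tanh(5 * E)                 # toy entropy with a soft kink

        beta = np.gradient(S, E)                     # inverse temperature dS/dE
        gamma = np.gradient(beta, E)                 # d(beta)/dE; its zero crossings
                                                     # mark extrema of beta, i.e.
                                                     # inflection points of S(E)
        sign_flips = np.where(np.diff(np.sign(gamma)) != 0)[0]
        print("candidate transition energies:", E[sign_flips])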

  20. Modeling antibiotic and cytotoxic effects of the dimeric isoquinoline IQ-143 on metabolism and its regulation in Staphylococcus aureus, Staphylococcus epidermidis and human cells

    PubMed Central

    2011-01-01

    Background: Xenobiotics represent an environmental stress and as such are a source for antibiotics, including the isoquinoline (IQ) compound IQ-143. Here, we demonstrate the utility of complementary analysis of both host and pathogen datasets in assessing bacterial adaptation to IQ-143, a synthetic analog of the novel type N,C-coupled naphthyl-isoquinoline alkaloid ancisheynine. Results: Metabolite measurements, gene expression data and functional assays were combined with metabolic modeling to assess the effects of IQ-143 on Staphylococcus aureus, Staphylococcus epidermidis and human cell lines, as a potential paradigm for novel antibiotics. Genome annotation and PCR validation identified novel enzymes in the primary metabolism of staphylococci. Gene expression response analysis and metabolic modeling demonstrated the adaptation of enzymes to IQ-143, including those not affected by significant gene expression changes. At lower concentrations, IQ-143 was bacteriostatic, and at higher concentrations bactericidal, while the analysis suggested that the mode of action was a direct interference in nucleotide and energy metabolism. Experiments in human cell lines supported the conclusions from pathway modeling and found that IQ-143 had low cytotoxicity. Conclusions: The data suggest that IQ-143 is a promising lead compound for antibiotic therapy against staphylococci. The combination of gene expression and metabolite analyses with in silico modeling of metabolite pathways allowed us to study metabolic adaptations in detail and can be used for the evaluation of metabolic effects of other xenobiotics. PMID:21418624

  1. SHARP Multiphysics Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.

    SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP is comprised of three physics modules currently including neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results for modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report aims at identifying the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex in advance of the SHARP release expected by Mar 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept for solving more complex demonstration examples such as EBR II assembly and ABTR full core. The workflow of preparing the input files, running the case and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the full heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some primary information on the configuration and build aspects for the SHARP toolkit, which includes capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.

  2. Reconfigurable and responsive droplet-based compound micro-lenses.

    PubMed

    Nagelberg, Sara; Zarzar, Lauren D; Nicolas, Natalie; Subramanian, Kaushikaram; Kalow, Julia A; Sresht, Vishnu; Blankschtein, Daniel; Barbastathis, George; Kreysing, Moritz; Swager, Timothy M; Kolle, Mathias

    2017-03-07

    Micro-scale optical components play a crucial role in imaging and display technology, biosensing, beam shaping, optical switching, wavefront-analysis, and device miniaturization. Herein, we demonstrate liquid compound micro-lenses with dynamically tunable focal lengths. We employ bi-phase emulsion droplets fabricated from immiscible hydrocarbon and fluorocarbon liquids to form responsive micro-lenses that can be reconfigured to focus or scatter light, form real or virtual images, and display variable focal lengths. Experimental demonstrations of dynamic refractive control are complemented by theoretical analysis and wave-optical modelling. Additionally, we provide evidence of the micro-lenses' functionality for two potential applications, integral micro-scale imaging devices and light field display technology, thereby demonstrating both the fundamental characteristics and the promising opportunities for fluid-based dynamic refractive micro-scale compound lenses.

  3. Reconfigurable and responsive droplet-based compound micro-lenses

    PubMed Central

    Nagelberg, Sara; Zarzar, Lauren D.; Nicolas, Natalie; Subramanian, Kaushikaram; Kalow, Julia A.; Sresht, Vishnu; Blankschtein, Daniel; Barbastathis, George; Kreysing, Moritz; Swager, Timothy M.; Kolle, Mathias

    2017-01-01

    Micro-scale optical components play a crucial role in imaging and display technology, biosensing, beam shaping, optical switching, wavefront-analysis, and device miniaturization. Herein, we demonstrate liquid compound micro-lenses with dynamically tunable focal lengths. We employ bi-phase emulsion droplets fabricated from immiscible hydrocarbon and fluorocarbon liquids to form responsive micro-lenses that can be reconfigured to focus or scatter light, form real or virtual images, and display variable focal lengths. Experimental demonstrations of dynamic refractive control are complemented by theoretical analysis and wave-optical modelling. Additionally, we provide evidence of the micro-lenses' functionality for two potential applications—integral micro-scale imaging devices and light field display technology—thereby demonstrating both the fundamental characteristics and the promising opportunities for fluid-based dynamic refractive micro-scale compound lenses. PMID:28266505

  4. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN-based Modal Correlation Tools

    NASA Technical Reports Server (NTRS)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
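
    The cross-orthogonality check that TACT hands off to NASTRAN has a compact definition: XOR = Phi_a^T M Phi_t for mass-normalized analysis (Phi_a) and test (Phi_t) mode shapes, with diagonal values near 1 indicating well-paired modes. A toy 3-DOF sketch (matrices illustrative, not from the LaRC model):

        # Cross-orthogonality between analysis and test mode shapes.
        import numpy as np

        M = np.diag([2.0, 1.0, 1.5])                              # mass matrix
        phi_a = np.array([[0.2, 0.5], [0.6, 0.1], [0.4, -0.5]])   # analysis shapes
        phi_t = phi_a + 0.02 * np.random.default_rng(4).normal(size=phi_a.shape)

        def mass_normalize(phi, M):
            """Scale each mode so its generalized mass phi^T M phi equals 1."""
            scale = np.sqrt(np.einsum("im,ij,jm->m", phi, M, phi))
            return phi / scale

        pa, pt = mass_normalize(phi_a, M), mass_normalize(phi_t, M)
        xor = pa.T @ M @ pt
        print(np.round(xor, 3))   # near-identity => good test/analysis correlation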

  5. Challenges in Visual Analysis of Ensembles

    DOE PAGES

    Crossno, Patricia

    2018-04-12

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  6. Challenges in Visual Analysis of Ensembles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossno, Patricia

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  7. Investigation of the Structural Relationships Between Social Support, Self-Compassion, and Subjective Well-Being in Korean Elite Student Athletes.

    PubMed

    Jeon, Hyunsoo; Lee, Keunchul; Kwon, Sungho

    2016-08-01

    The study examined whether self-compassion mediates the relationship between social support and subjective well-being, as perceived by athletes. It also investigated the structural relationships between these variables. Participants were 333 athletes attending high school or university. Structural equation analysis showed that self-compassion partially mediated the relationship between social support and subjective well-being. To test the stability of the model, a multiple group analysis was performed according to sex of participant and school level, and this demonstrated that the model had similar fit to the data regardless of group. The confirmation that self-compassion plays an intermediary role in the relationship between social support and subjective well-being demonstrates that self-compassionate attitudes can be fostered by social support, and that, in turn, has a positive effect on an individual's subjective well-being.

  8. Fleet DNA Phase 1 Refinement & Phase 2 Implementation; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Kenneth; Duran, Adam

    2015-06-11

    Fleet DNA acts as a secure data warehouse for medium- and heavy-duty vehicle data. It demonstrates that vehicle drive cycle data can be collected and stored for large-scale analysis and modeling applications. The data serve as a real-world data source for model development and validation. Storage of the results of past/present/future data collection efforts improves analysis efficiency through pooling of shared data and provides the opportunity for 'big data' type analyses. Fleet DNA shows it is possible to develop a common database structure that can store/analyze/report on data sourced from multiple parties, each with unique data formats/types. Data filtration and normalization algorithms developed for the project allow for a wide range of data types and inputs, expanding the project's potential. Fleet DNA demonstrates the power of integrating Big Data with existing and future tools and analyses: it provides an enhanced understanding and education of users, users can explore greenhouse gases and economic opportunities via AFLEET and ADOPT modeling, drive cycles can be characterized and visualized using DRIVE, high-level vehicle modeling can be performed using real-world drive cycles via FASTSim, and data reporting through Fleet DNA Phase 1 and 2 websites provides external users access to analysis results and gives the opportunity to explore on their own.

  9. Graph configuration model based evaluation of the education-occupation match

    PubMed Central

    2018-01-01

    To study education—occupation matchings, we developed a bipartite network model of education-to-work transition and a graph configuration model based metric. We studied the career paths of 15 thousand Hungarian students based on the integrated database of the National Tax Administration, the National Health Insurance Fund, and the higher education information system of the Hungarian Government. A brief analysis of the gender pay gap and the spatial distribution of over-education is presented to demonstrate the background of the research and the resulting open dataset. We highlighted the hierarchical and clustered structure of the career paths based on multi-resolution analysis of the graph modularity. The results of the cluster analysis can support policymakers in fine-tuning the fragmented program structure of higher education. PMID:29509783

  10. Graph configuration model based evaluation of the education-occupation match.

    PubMed

    Gadar, Laszlo; Abonyi, Janos

    2018-01-01

    To study education-occupation matchings, we developed a bipartite network model of education-to-work transition and a graph configuration model based metric. We studied the career paths of 15 thousand Hungarian students based on the integrated database of the National Tax Administration, the National Health Insurance Fund, and the higher education information system of the Hungarian Government. A brief analysis of the gender pay gap and the spatial distribution of over-education is presented to demonstrate the background of the research and the resulting open dataset. We highlighted the hierarchical and clustered structure of the career paths based on multi-resolution analysis of the graph modularity. The results of the cluster analysis can support policymakers in fine-tuning the fragmented program structure of higher education.
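
    The graph configuration model supplies the degree-preserving null against which observed education-occupation links can be judged. A toy sketch with made-up degree sequences (not the Hungarian data):

        # Degree-preserving bipartite null model for education-occupation links.
        from collections import Counter
        from networkx.algorithms import bipartite

        edu_degrees = [3, 2, 2]        # graduates per education program (toy)
        occ_degrees = [4, 2, 1]        # hires per occupation (toy); sums must match

        G = bipartite.configuration_model(edu_degrees, occ_degrees, seed=42)
        # Count how often each (program, occupation) pair is realized in the
        # null model; observed pair counts would be compared against this.
        pairs = Counter((min(u, v), max(u, v)) for u, v in G.edges())
        print(pairs)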

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuypers, Marshall A.; Lambert, Gregory Joseph; Moore, Thomas W.

    Chronic infection with Hepatitis C virus (HCV) results in cirrhosis, liver cancer and death. As the nation's largest provider of care for HCV, the US Veterans Health Administration (VHA) invests extensive resources in the diagnosis and treatment of the disease. This report documents modeling and analysis of HCV treatment dynamics performed for the VHA aimed at improving service delivery efficiency. System dynamics modeling of disease treatment demonstrated the benefits of early detection and the role of comorbidities in disease progress and patient mortality. Preliminary modeling showed that adherence to rigorous treatment protocols is a primary determinant of treatment success. In-depth meta-analysis revealed correlations between adherence and various psycho-social factors. This initial meta-analysis indicates areas where substantial improvement in patient outcomes can potentially result from VA programs which incorporate these factors into their design.

  12. Swarming behaviors in multi-agent systems with nonlinear dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Wenwu; Chen, Guanrong

    2013-12-15

    The dynamic analysis of a continuous-time multi-agent swarm model with nonlinear profiles is investigated in this paper. It is shown that, under mild conditions, all agents in a swarm can reach cohesion within a finite time, where the upper bounds of the cohesion are derived in terms of the parameters of the swarm model. The results are then generalized by considering stochastic noise and switching between nonlinear profiles. Furthermore, swarm models with limited sensing range inducing changing communication topologies and unbounded repulsive interactions between agents are studied by switching system and nonsmooth analysis. Here, the sensing range of each agent is limited and the possibility of collision among nearby agents is high. Finally, simulation results are presented to demonstrate the validity of the theoretical analysis.

  13. Dynamics of an HBV Model with Drug Resistance Under Intermittent Antiviral Therapy

    NASA Astrophysics Data System (ADS)

    Zhang, Ben-Gong; Tanaka, Gouhei; Aihara, Kazuyuki; Honda, Masao; Kaneko, Shuichi; Chen, Luonan

    2015-06-01

    This paper studies the dynamics of a hepatitis B virus (HBV) model and therapy regimens for HBV disease. First, we propose a new mathematical model of HBV with drug resistance, and then analyze its qualitative and dynamical properties. Combining clinical data and theoretical analysis, we demonstrate that our model is biologically plausible and computationally viable. Second, we demonstrate that the intermittent antiviral therapy regimen is one possible strategy for treating this complex disease. This regimen has two main advantages: it may not only delay the development of drug resistance but also reduce the duration of on-treatment time compared with long-term continuous medication. Moreover, such intermittent antiviral therapy can reduce adverse side effects. Our theoretical model and computational results provide qualitative insight into the progression of HBV, and also a possible new therapy for HBV disease.
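
    The intermittent regimen can be prototyped on a minimal virus-dynamics model with a drug efficacy that switches on and off in blocks. All parameter values below are illustrative round numbers, not the paper's fitted HBV parameters.

        # Minimal virus-dynamics model under block-wise intermittent therapy.
        from scipy.integrate import solve_ivp

        lam, d, beta, a, k, u = 1e4, 0.01, 2e-9, 0.05, 50.0, 0.7  # illustrative

        def eps(t, on=28.0, off=28.0, strength=0.95):
            """Drug efficacy: `strength` during on-blocks, 0 during rest blocks (days)."""
            return strength if (t % (on + off)) < on else 0.0

        def rhs(t, s):
            x, y, v = s                      # uninfected cells, infected cells, virus
            return [lam - d * x - beta * x * v,
                    beta * x * v - a * y,
                    (1 - eps(t)) * k * y - u * v]

        # max_step keeps the integrator from stepping over the on/off switches.
        sol = solve_ivp(rhs, (0, 365), [1e6, 1e4, 1e7], max_step=0.5)
        print("viral load after one year:", f"{sol.y[2, -1]:.3e}")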

  14. Optimizing Chemotherapy Dose and Schedule by Norton-Simon Mathematical Modeling

    PubMed Central

    Traina, Tiffany A.; Dugan, Ute; Higgins, Brian; Kolinsky, Kenneth; Theodoulou, Maria; Hudis, Clifford A.; Norton, Larry

    2011-01-01

    Background: To hasten and improve anticancer drug development, we created a novel approach to generating and analyzing preclinical dose-scheduling data so as to optimize benefit-to-toxicity ratios. Methods: We applied mathematical methods based upon Norton-Simon growth kinetic modeling to tumor-volume data from breast cancer xenografts treated with capecitabine (Xeloda®, Roche) at the conventional schedule of 14 days of treatment followed by a 7-day rest (14-7). Results: The model predicted that 7 days of treatment followed by a 7-day rest (7-7) would be superior. Subsequent preclinical studies demonstrated that this biweekly capecitabine schedule allowed for safe delivery of higher daily doses, improved tumor response, and prolonged animal survival. Conclusions: We demonstrated that the application of Norton-Simon modeling to the design and analysis of preclinical data predicts an improved capecitabine dosing schedule in xenograft models. This method warrants further investigation and application in clinical drug development. PMID:20519801

  15. Photogrammetry and remote sensing for visualization of spatial data in a virtual reality environment

    NASA Astrophysics Data System (ADS)

    Bhagawati, Dwipen

    2001-07-01

    Researchers in many disciplines have started using the tool of Virtual Reality (VR) to gain new insights into problems in their respective disciplines. Recent advances in computer graphics, software and hardware technologies have created many opportunities for VR systems, advanced scientific and engineering applications being among them. In Geometronics, photogrammetry and remote sensing are generally used for management of spatial data inventory. VR technology can be suitably used for management of spatial data inventory. This research demonstrates the usefulness of VR technology for inventory management by taking roadside features as a case study. Management of a roadside feature inventory involves positioning and visualization of the features. This research has developed a methodology to demonstrate how photogrammetric principles can be used to position the features using video-logging images and GPS camera positions, and how image analysis can help produce appropriate texture for building the VR scene, which then can be visualized in a Cave Augmented Virtual Environment (CAVE). VR modeling was implemented in two stages to demonstrate different approaches for modeling the VR scene. A simulated highway scene was implemented with the brute-force approach, while modeling software was used to model the real-world scene using feature positions produced in this research. The first approach implements the scene in C++ code and includes a multi-level wand menu that enables the user to interact with the scene. The interactions include editing the features inside the CAVE display, navigating inside the scene, and performing limited geographic analysis. The second approach demonstrates creation of a VR scene for a real roadway environment using feature positions determined in this research. The scene looks realistic, with textures from the real site mapped onto the geometry of the scene. Remote sensing and digital image processing techniques were used for texturing the roadway features in this scene.

  16. NASA's Earth Resources Laboratory - Seventeen years of using remotely sensed satellite data in land applications

    NASA Technical Reports Server (NTRS)

    Cashion, Kenneth D.; Whitehurst, Charles A.

    1987-01-01

    The activities of the Earth Resources Laboratory (ERL) for the past seventeen years are reviewed with particular reference to four typical applications demonstrating the use of remotely sensed data in a geobased information system context. The applications discussed are: a fire control model for the Olympic National Park; wildlife habitat modeling; a resource inventory system including a potential soil erosion model; and a corridor analysis model for locating routes between geographical locations. Some future applications are also discussed.

  17. A simple analytical model for signal amplification by reversible exchange (SABRE) process.

    PubMed

    Barskiy, Danila A; Pravdivtsev, Andrey N; Ivanov, Konstantin L; Kovtunov, Kirill V; Koptyug, Igor V

    2016-01-07

    We demonstrate an analytical model for the description of the signal amplification by reversible exchange (SABRE) process. The model relies on a combined analysis of chemical kinetics and the evolution of the nuclear spin system during the hyperpolarization process. For the first time, the presented model provides a rationale for deciding which system parameters (i.e., J-couplings, relaxation rates, reaction rate constants) have to be optimized in order to achieve higher signal enhancement for a substrate of interest in SABRE experiments.

  18. Geographic information system/watershed model interface

    USGS Publications Warehouse

    Fisher, Gary T.

    1989-01-01

    Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, that performs automated derivation of watershed parameters for hydrologic modeling.

  19. An analysis of the least-squares problem for the DSN systematic pointing error model

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.

    1991-01-01

    A systematic pointing error model is used to calibrate antennas in the Deep Space Network. The least-squares problem is described and analyzed, along with the solution methods used to determine the model's parameters. Specifically studied are the rank degeneracy problems resulting from beam pointing error measurement sets that incorporate inadequate sky coverage. A least-squares parameter subset selection method is described and its applicability to the systematic error modeling process is demonstrated on a Voyager 2 measurement distribution.
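
    The rank-degeneracy issue described above can be illustrated generically. The sketch below uses hypothetical pointing-model terms (not the DSN code) and a truncated singular value decomposition as a simple stand-in for parameter subset selection.

      # Generic sketch of rank-degeneracy diagnosis in a linear pointing-error model.
      import numpy as np

      rng = np.random.default_rng(0)
      az = rng.uniform(0, np.pi / 4, 50)   # poor sky coverage: narrow azimuth range
      el = rng.uniform(0.6, 0.7, 50)       # nearly constant elevation
      # Hypothetical design matrix with a few typical pointing-model terms.
      A = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az), np.cos(el)])
      y = A @ np.array([0.01, 0.002, -0.003, 0.005]) + 1e-4 * rng.standard_normal(50)

      # Small singular values flag parameter directions the data cannot resolve.
      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      print("condition number:", s[0] / s[-1])

      # Crude remedy: drop directions with tiny singular values, then solve.
      keep = s > 1e-3 * s[0]
      x = Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])
      print("estimated parameters:", x)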

  20. A two-step along-track spectral analysis for estimating the magnetic signals of magnetospheric ring current from Swarm data

    NASA Astrophysics Data System (ADS)

    Martinec, Zdeněk; Velímský, Jakub; Haagmans, Roger; Šachl, Libor

    2018-02-01

    This study deals with the analysis of Swarm vector magnetic field measurements in order to estimate the magnetic field of the magnetospheric ring current. For a single Swarm satellite, the magnetic measurements are processed by along-track spectral analysis on a track-by-track basis. The main and lithospheric magnetic fields are modelled by the CHAOS-6 field model and subtracted from the along-track Swarm magnetic data. The mid-latitude residual signal is then spectrally analysed and extrapolated to the polar regions. The resulting model of the magnetosphere (model MME) is compared to the existing Swarm Level 2 magnetospheric field model (MMA_SHA_2C). Differences of up to 10 nT are found in the nightside Swarm data from 2014 April 8 to May 10; these are due to different processing schemes used to construct the two magnetospheric magnetic field models. The forward-simulated magnetospheric magnetic field generated by the external part of model MME then demonstrates the consistency of the separation of the Swarm along-track signal into the external and internal parts by the two-step along-track spectral analysis.

  1. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components has been proposed. A practical and efficient method for the reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based engineering practice with finite element modeling.

  2. Game theoretic analysis of physical protection system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canion, B.; Schneider, E.; Bickel, E.

    The physical protection system (PPS) of a fictional small modular reactor (SMR) facility has been modeled as a platform for a game theoretic approach to security decision analysis. To demonstrate the game theoretic approach, a rational adversary with complete knowledge of the facility has been modeled attempting a sabotage attack. The adversary adjusts his decisions in response to investments made by the defender to enhance the security measures. This can lead to a conservative physical protection system design. Since defender upgrades are limited by a budget, cost-benefit analysis may be conducted on security upgrades. One approach to cost-benefit analysis is the efficient frontier, which depicts the reduction in expected consequence per incremental increase in the security budget.
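
    The efficient-frontier idea can be sketched with a toy enumeration. All upgrade names, costs, and consequence reductions below are hypothetical, and the independence assumption is for illustration only.

      # Sketch of an efficient frontier for security upgrades (all numbers hypothetical).
      from itertools import combinations

      # (name, cost in $k, expected-consequence reduction if installed)
      upgrades = [("cameras", 50, 0.10), ("delay barrier", 120, 0.25),
                  ("extra guard post", 200, 0.30), ("sensor fusion", 80, 0.15)]
      baseline = 1.0  # normalized expected consequence with no upgrades

      portfolios = []
      for r in range(len(upgrades) + 1):
          for combo in combinations(upgrades, r):
              cost = sum(u[1] for u in combo)
              consequence = baseline
              for u in combo:          # naive independence assumption
                  consequence *= (1 - u[2])
              portfolios.append((cost, consequence, [u[0] for u in combo]))

      # Efficient frontier: portfolios not dominated in both cost and consequence.
      frontier = [p for p in portfolios
                  if not any(q[0] <= p[0] and q[1] < p[1] for q in portfolios)]
      for cost, cons, names in sorted(frontier):
          print(f"${cost}k -> expected consequence {cons:.2f}: {names}")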

  3. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  4. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
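
    The PCA step at the core of such a statistical shape model can be sketched as follows, with synthetic landmark data standing in for the registered micro-CT surfaces (not the authors' pipeline).

      # Minimal sketch of a PCA-based statistical shape model (synthetic landmark data).
      import numpy as np

      rng = np.random.default_rng(1)
      n_samples, n_points = 20, 100
      base = rng.standard_normal((n_points, 3))        # stand-in for a baseline shape
      shapes = np.array([base + 0.05 * rng.standard_normal(base.shape)
                         for _ in range(n_samples)])   # already aligned/registered samples

      X = shapes.reshape(n_samples, -1)                # one row per specimen
      mean = X.mean(axis=0)
      U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)

      pc_weights = U * s                               # per-specimen PC scores
      explained = s**2 / np.sum(s**2)
      print("variance explained by PC1-3:", explained[:3])
      # Inter-sample comparison, as in the paper, then operates on rows of pc_weights.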

  5. Coexistence Analysis of Civil Unmanned Aircraft Systems at Low Altitudes

    NASA Astrophysics Data System (ADS)

    Zhou, Yuzhe

    2016-11-01

    The demand for unmanned aircraft systems in civil areas is growing. However, ensuring the flight efficiency and safety of unmanned aircraft places critical requirements on wireless communication spectrum resources. Current research mainly focuses on spectrum availability. In this paper, unmanned aircraft system communication models, including a coverage model and a data rate model, and two coexistence analysis procedures, i.e., the interference-to-noise ratio criterion and the frequency-distance-direction criterion, are proposed to analyze the spectrum requirements and interference of civil unmanned aircraft systems at low altitudes. Explicit explanations of the criteria are provided. The proposed coexistence analysis criteria are applied to assess unmanned aircraft systems' uplink and downlink interference performance and to support corresponding spectrum planning. Numerical results demonstrate that the proposed assessments and analysis procedures satisfy the requirements of flexible spectrum access and safe coexistence among multiple unmanned aircraft systems.
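
    A minimal sketch of an interference-to-noise (I/N) coexistence check of the kind referred to above, with every link parameter hypothetical and a commonly used -6 dB protection threshold assumed:

      # Sketch of an interference-to-noise (I/N) check (all values hypothetical).
      import math

      def fspl_db(d_km, f_mhz):
          # Free-space path loss
          return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

      tx_power_dbm = 30.0          # interfering UAS transmitter
      tx_gain_db, rx_gain_db = 3.0, 3.0
      f_mhz, d_km = 5030.0, 10.0   # hypothetical C-band control link, 10 km separation

      interference_dbm = tx_power_dbm + tx_gain_db + rx_gain_db - fspl_db(d_km, f_mhz)
      noise_dbm = -174 + 10 * math.log10(1e6) + 5   # kTB for 1 MHz + 5 dB noise figure
      i_over_n = interference_dbm - noise_dbm
      print(f"I/N = {i_over_n:.1f} dB ->", "OK" if i_over_n <= -6 else "too high")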

  6. The natural mathematics of behavior analysis.

    PubMed

    Li, Don; Hautus, Michael J; Elliffe, Douglas

    2018-04-19

    Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
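
    The paper embeds Approximate Bayesian Computation in an MCMC scheme; the simplest rejection-ABC variant, sketched below on a toy response-count model (not Catania's Operant Reserve), conveys the core idea of matching simulated to observed summaries.

      # Toy rejection-ABC sketch (the paper uses ABC within MCMC; this is the
      # simplest variant, with a stand-in Poisson response model).
      import numpy as np

      rng = np.random.default_rng(2)

      def simulate_rate(theta, n_sessions=30):
          # Hypothetical stand-in model: Poisson response counts per session.
          return rng.poisson(theta, n_sessions)

      observed = simulate_rate(12.0)     # pretend these are event-record summaries
      summary = observed.mean()

      accepted = []
      for _ in range(20000):
          theta = rng.uniform(0, 40)     # prior draw
          sim = simulate_rate(theta)
          if abs(sim.mean() - summary) < 0.5:   # accept if summaries are close
              accepted.append(theta)

      print("posterior mean ~", np.mean(accepted), "n accepted:", len(accepted))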

  7. Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sawyer, Darren Charles

    1994-01-01

    The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, not possible to model accurately with analytical approaches.

  8. Information Security Analysis Using Game Theory and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlicher, Bob G; Abercrombie, Robert K

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to also address the limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks; they assume that the state transition probabilities are fixed before the game and that the players' actions are always synchronous; and most are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  9. ID201202961, DOE S-124,539, Information Security Analysis Using Game Theory and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Schlicher, Bob G

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to also address the limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks; they assume that the state transition probabilities are fixed before the game and that the players' actions are always synchronous; and most are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  10. Recent Progress on Labfit: a Multispectrum Analysis Program for Fitting Lineshapes Including the Htp Model and Temperature Dependence

    NASA Astrophysics Data System (ADS)

    Cich, Matthew J.; Guillaume, Alexandre; Drouin, Brian; Benner, D. Chris

    2017-06-01

    Multispectrum analysis can be a challenge for a variety of reasons. It can be computationally intensive to fit a proper line shape model, especially for high resolution experimental data. Band-wide analyses, including many transitions along with interactions across many pressures and temperatures, are essential to accurately model, for example, atmospherically relevant systems. Labfit is a fast multispectrum analysis program originally developed by D. Chris Benner with a text-based interface. More recently, at JPL, a graphical user interface was developed with the goal of increasing the ease of use and also the number of potential users. The HTP lineshape model has been added to Labfit, keeping it up-to-date with community standards. Recent analyses using Labfit will be shown to demonstrate its ability to competently handle large experimental datasets, including high order lineshape effects, that are otherwise unmanageable.

  11. Browsing Space Weather Data and Models with the Integrated Space Weather Analysis (iSWA) System

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo M.; Mullinix, Richard E.; Berrios, David H.; Hesse, Michael; Rastaetter, Lutz; Pulkkinen, Antti; Hourcle, Joseph A.; Thompson, Barbara J.

    2011-01-01

    The Integrated Space Weather Analysis (iSWA) System is a comprehensive web-based platform for space weather information that combines data from solar, heliospheric and geospace observatories with forecasts based on the most advanced space weather models. The iSWA system collects, generates, and presents a wide array of space weather resources in an intuitive, user-configurable, and adaptable format - thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. iSWA currently provides over 200 data and modeling products, and features a variety of tools that allow the user to browse, combine, and examine data and models from various sources. This presentation will consist of a summary of the iSWA products and an overview of the customizable user interfaces, and will feature several tutorial demonstrations highlighting the interactive tools and advanced capabilities.

  12. Dynamic Analysis of Geared Rotors by Finite Elements

    NASA Technical Reports Server (NTRS)

    Kahraman, A.; Ozguven, H. Nevzat; Houser, D. R.; Zakrajsek, J. J.

    1992-01-01

    A finite element model of a geared rotor system on flexible bearings has been developed. The model includes the rotary inertia of shaft elements, the axial loading on shafts, flexibility and damping of bearings, material damping of shafts, and the stiffness and damping of the gear mesh. The coupling between the torsional and transverse vibrations of the gears was considered in the model. A constant mesh stiffness was assumed. The analysis procedure can be used for forced vibration analysis of geared rotors by calculating the critical speeds and determining the response of any point on the shafts to mass unbalances, geometric eccentricities of gears, and displacement transmission error excitation at the mesh point. The dynamic mesh forces due to these excitations can also be calculated. The model has been applied to several systems for the demonstration of its accuracy and for studying the effect of bearing compliances on system dynamics.

  13. NASA Langley developments in response calculations needed for failure and life prediction

    NASA Technical Reports Server (NTRS)

    Housner, Jerrold M.

    1993-01-01

    NASA Langley developments in response calculations needed for failure and life predictions are discussed. Topics covered include: structural failure analysis in concurrent engineering; accuracy of independent regional modeling demonstrated on a classical example; a functional interface method that accurately joins incompatible finite element models; extension of the interface method for insertion of local detail modeling to a curved pressurized fuselage window panel; an interface concept for joining structural regions; motivation for coupled 2D-3D analysis; a compression panel with discontinuous stiffener coupled 2D-3D model and axial surface strains at the middle of the hat stiffener; use of adaptive refinement with multiple methods; adaptive mesh refinement; and studies quantifying the effect of bow-type initial imperfections on the reliability of stiffened panels.

  14. Treated cabin acoustic prediction using statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Yoerkie, Charles A.; Ingraham, Steven T.; Moore, James A.

    1987-01-01

    The application of statistical energy analysis (SEA) to the modeling and design of helicopter cabin interior noise control treatment is demonstrated. The information presented here is obtained from work sponsored at NASA Langley for the development of analytic modeling techniques and the basic understanding of cabin noise. Utility and executive interior models are developed directly from existing S-76 aircraft designs. The relative importance of panel transmission loss (TL), acoustic leakage, and absorption to the control of cabin noise is shown using the SEA modeling parameters. It is shown that the major cabin noise improvement below 1000 Hz comes from increased panel TL, while above 1000 Hz it comes from reduced acoustic leakage and increased absorption in the cabin and overhead cavities.

  15. A Community Data Model for Hydrologic Observations

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Zaslavsky, I.; Maidment, D. R.; Valentine, D.; Jennings, B.

    2006-12-01

    The CUAHSI Hydrologic Information System project is developing information technology infrastructure to support hydrologic science. Hydrologic information science involves the description of hydrologic environments in a consistent way, using data models for information integration. This includes a hydrologic observations data model for the storage and retrieval of hydrologic observations in a relational database designed to facilitate data retrieval for integrated analysis of information collected by multiple investigators. It is intended to provide a standard format to facilitate the effective sharing of information between investigators and to facilitate analysis of information within a single study area or hydrologic observatory, or across hydrologic observatories and regions. The observations data model is designed to store hydrologic observations and sufficient ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and used, and to provide traceable heritage from raw measurements to usable information. The design is based on the premise that a relational database at the single observation level is most effective for providing querying capability and cross-dimension data retrieval and analysis. This premise is being tested through the implementation of a prototype hydrologic observations database, and the development of web services for the retrieval of data from and ingestion of data into the database. These web services, hosted by the San Diego Supercomputer Center, make data in the database accessible both through a Hydrologic Data Access System portal and directly from applications software such as Excel, Matlab and ArcGIS that have Simple Object Access Protocol (SOAP) capability. This paper will (1) describe the data model; (2) demonstrate the capability for representing diverse data in the same database; (3) demonstrate the use of the database from applications software for the performance of hydrologic analysis across different observation types.

  16. Thorough specification of the neurophysiologic processes underlying behavior and of their manifestation in EEG - demonstration with the go/no-go task.

    PubMed

    Shahaf, Goded; Pratt, Hillel

    2013-01-01

    In this work we demonstrate the principles of a systematic modeling approach of the neurophysiologic processes underlying a behavioral function. The modeling is based upon a flexible simulation tool, which enables parametric specification of the underlying neurophysiologic characteristics. While the impact of selecting specific parameters is of interest, in this work we focus on the insights which emerge from generally accepted assumptions regarding neuronal representation. We show that harnessing even such simple assumptions enables the derivation of significant insights regarding the nature of the neurophysiologic processes underlying behavior. We demonstrate our approach in some detail by modeling the behavioral go/no-go task. We further demonstrate the practical significance of this simplified modeling approach in interpreting experimental data: the manifestation of these processes in the EEG and ERP literature of normal and abnormal (ADHD) function, as well as in a comprehensive analysis of relevant ERP data. In fact, we show that from the model-based spatiotemporal segregation of the processes it is possible to derive simple, effective, theory-based EEG markers differentiating normal and ADHD subjects. We summarize by claiming that the neurophysiologic processes modeled for the go/no-go task are part of a limited set of neurophysiologic processes which underlie, in a variety of combinations, any behavioral function with a measurable operational definition. Such neurophysiologic processes could be sampled directly from EEG on the basis of model-based spatiotemporal segregation.

  17. Perceived Stress and Use of Coping Strategies as a Response to Rapidly Changing Student Demographics

    ERIC Educational Resources Information Center

    Lindner, Reinhard W.; Healy, Donald E., Jr.

    2004-01-01

    The authors present findings from an initial information gathering and analysis of a larger Office of Special Education Programs model demonstration project, "Connections to Success." The article is based on an analysis of a teacher survey conducted at four schools in the project's partner district focused on perceived problems, sources of stress,…

  18. Multilevel Latent Class Analysis for Large-Scale Educational Assessment Data: Exploring the Relation between the Curriculum and Students' Mathematical Strategies

    ERIC Educational Resources Information Center

    Fagginger Auer, Marije F.; Hickendorff, Marian; Van Putten, Cornelis M.; Béguin, Anton A.; Heiser, Willem J.

    2016-01-01

    A first application of multilevel latent class analysis (MLCA) to educational large-scale assessment data is demonstrated. This statistical technique addresses several of the challenges that assessment data offers. Importantly, MLCA allows modeling of the often ignored teacher effects and of the joint influence of teacher and student variables.…

  19. Construction and Validation of the Career and Educational Decision Self-Efficacy Inventory for Secondary Students (CEDSIS)

    ERIC Educational Resources Information Center

    Ho, Esther Sui Chu; Sum, Kwok Wing

    2018-01-01

    This study aims to construct and validate the Career and Educational Decision Self-Efficacy Inventory for Secondary Students (CEDSIS) by using a sample of 2,631 students in Hong Kong. Principal component analysis yielded a three-factor structure, which demonstrated good model fit in confirmatory factor analysis. High reliability was found for the…

  20. Observation uncertainty in reversible Markov chains.

    PubMed

    Metzner, Philipp; Weber, Marcus; Schütte, Christof

    2010-09-01

    In many applications one is interested in finding a simplified model which captures the essential dynamical behavior of a real-life process. If the essential dynamics can be assumed to be (approximately) memoryless then a reasonable choice for a model is a Markov model whose parameters are estimated by means of Bayesian inference from an observed time series. We propose an efficient Markov chain Monte Carlo framework to assess the uncertainty of the Markov model and related observables. The derived Gibbs sampler allows for sampling distributions of transition matrices subject to reversibility and/or sparsity constraints. The performance of the suggested sampling scheme is demonstrated and discussed for a variety of model examples. The uncertainty analysis of functions of the Markov model under investigation is discussed in application to the identification of conformations of the trialanine molecule via Robust Perron Cluster Analysis (PCCA+).
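
    The basic Bayesian treatment of transition-matrix uncertainty can be sketched with row-wise Dirichlet conjugacy. Note that the authors' Gibbs sampler additionally enforces reversibility (and optionally sparsity), which this simplified sketch does not.

      # Simplified sketch: posterior uncertainty of a transition matrix via
      # Dirichlet conjugacy (no reversibility constraint, unlike the paper).
      import numpy as np

      rng = np.random.default_rng(3)
      states = rng.choice(3, size=500, p=[0.3, 0.4, 0.3])  # stand-in observed series

      # Count observed transitions.
      C = np.zeros((3, 3))
      for i, j in zip(states[:-1], states[1:]):
          C[i, j] += 1

      # Sample transition matrices row-wise from Dirichlet(counts + uniform prior).
      samples = np.array([[rng.dirichlet(C[i] + 1.0) for i in range(3)]
                          for _ in range(1000)])
      print("posterior mean:\n", samples.mean(axis=0))
      print("posterior std of P[0,1]:", samples[:, 0, 1].std())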

  1. Interrupted time series regression for the evaluation of public health interventions: a tutorial.

    PubMed

    Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio

    2017-02-01

    Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
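
    A minimal segmented-regression sketch of the kind the tutorial walks through, on simulated data (assuming the statsmodels package; the tutorial's own worked example and data are not reproduced here):

      # Minimal segmented-regression sketch for an interrupted time series.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      t = np.arange(48)                      # months
      post = (t >= 24).astype(float)         # intervention at month 24
      y = 100 + 0.5 * t - 8 * post - 0.7 * post * (t - 24) + rng.normal(0, 3, 48)

      X = sm.add_constant(np.column_stack([t, post, post * (t - 24)]))
      fit = sm.OLS(y, X).fit()
      print(fit.params)   # [baseline level, pre-trend, level change, slope change]
      # In practice one would also check autocorrelation (e.g., Durbin-Watson)
      # and add seasonal terms, as the tutorial discusses.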

  2. Generation of an Atlas of the Proximal Femur and Its Application to Trabecular Bone Analysis

    PubMed Central

    Carballido-Gamio, Julio; Folkesson, Jenny; Karampinos, Dimitrios C.; Baum, Thomas; Link, Thomas M.; Majumdar, Sharmila; Krug, Roland

    2013-01-01

    Automatic placement of anatomically corresponding volumes of interest and comparison of parameters against a standard of reference are essential components in studies of trabecular bone. Only recently, in vivo MR images of the proximal femur, an important fracture site, could be acquired with high-spatial resolution. The purpose of this MRI trabecular bone study was two-fold: (1) to generate an atlas of the proximal femur to automatically place anatomically corresponding volumes of interest in a population study and (2) to demonstrate how mean models of geodesic topological analysis parameters can be generated to be used as potential standard of reference. Ten females were used to generate the atlas and geodesic topological analysis models, and 10 females were used to demonstrate the atlas-based trabecular bone analysis. All alignments were based on three-dimensional (3D) multiresolution affine transformations followed by 3D multiresolution free-form deformations. Mean distances less than 1 mm between aligned femora, and sharp edges in the atlas and in fused gray-level images of registered femora indicated that the anatomical variability was well accommodated and explained by the free-form deformations. PMID:21432904

  3. Interrupted time series regression for the evaluation of public health interventions: a tutorial

    PubMed Central

    Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio

    2017-01-01

    Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design. PMID:27283160

  4. HANFORD DST THERMAL & SEISMIC PROJECT ANSYS BENCHMARK ANALYSIS OF SEISMIC INDUCED FLUID STRUCTURE INTERACTION IN A HANFORD DOUBLE SHELL PRIMARY TANK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY, T.C.

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratories (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses". The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS. The overall model used for the seismic analysis of the DSTs includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis of the DSTs must address the fluid-structure interaction behavior and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but the capabilities and limitations of ANSYS to perform fluid-structure interaction are less well understood. The purpose of this study is to demonstrate the capabilities and investigate the limitations of ANSYS for performing a fluid-structure interaction analysis of the primary tank and contained waste. To this end, the ANSYS solutions are benchmarked against theoretical solutions appearing in BNL 1995, when such theoretical solutions exist. When theoretical solutions were not available, comparisons were made to theoretical solutions of similar problems and to the results from Dytran simulations. The capabilities and limitations of the finite element code Dytran for performing a fluid-structure interaction analysis of the primary tank and contained waste were explored in a parallel investigation (Abatt 2006). In conjunction with the results of the global ANSYS analysis reported in Carpenter et al. (2006), the results of the two investigations will be compared to help determine if a more refined sub-model of the primary tank is necessary to capture the important fluid-structure interaction effects in the tank and, if so, how to best utilize a refined sub-model of the primary tank. Both rigid tank and flexible tank configurations were analyzed with ANSYS. The response parameters of interest are total hydrodynamic reaction forces, impulsive and convective mode frequencies, waste pressures, and slosh heights. To a limited extent, tank stresses are also reported. The results of this study demonstrate that the ANSYS model has the capability to adequately predict global responses such as frequencies and overall reaction forces. Thus, the model is suitable for predicting the global response of the tank and contained waste. On the other hand, while the ANSYS model is capable of adequately predicting waste pressures and primary tank stresses in a large portion of the waste tank, the model does not accurately capture the convective behavior of the waste near the free surface, nor did the model give accurate predictions of slosh heights. Based on the ability of the ANSYS benchmark model to accurately predict frequencies and global reaction forces, and on the results presented in Abatt et al. (2006), the global ANSYS model described in Carpenter et al. (2006) is sufficient for the seismic evaluation of all tank components except for local areas of the primary tank.
Due to the limitations of the ANSYS model in predicting the convective response of the waste, the evaluation of primary tank stresses near the waste free surface should be supplemented by results from an ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions. However, the primary tank is expected to have low demand to capacity ratios in the upper wall. Moreover, due to the less than desired mesh resolution in the primary tank knuckle of the global ANSYS model, the evaluation of the primary tank stresses in the lower knuckle should be supplemented by results from a more refined ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions.

  5. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  6. Evaluation of relative response factor methodology for demonstrating attainment of ozone in Houston, Texas.

    PubMed

    Vizuete, William; Biton, Leiran; Jeffries, Harvey E; Couzo, Evan

    2010-07-01

    In 2007, the U.S. Environmental Protection Agency (EPA) released guidance on demonstrating attainment of the federal ozone (O3) standard. This guidance recommended a change from using air quality model (AQM) predictions in an absolute way to using them in a relative way. This was accomplished by using the ratio, rather than the absolute difference, of AQM O3 predictions from a historical year to an attainment year. This ratio of O3 concentrations, labeled the relative response factor (RRF), is multiplied by an average of observed concentrations at every monitor. This analysis investigated whether the methodology used to calculate RRFs severs the source-receptor relationship for a given monitor. Model predictions were generated with a regulatory AQM system used to support the 2004 Houston-Galveston-Brazoria State Implementation Plan. Following the procedures in the EPA guidance, an attainment demonstration was completed using regulatory AQM predictions and measurements from the Houston ground-monitoring network. Results show that the model predictions used for the RRF calculation were often based on model conditions that were geographically remote from observations and counter to wind flow. Many of the monitors used the same model predictions for an RRF, even if that O3 plume did not impact them. The RRF methodology resulted in severing the true source-receptor relationship for a monitor. This analysis also showed that model performance could influence RRF values, and values at monitoring sites appear to be sensitive to model bias. Results indicate an inverse linear correlation of RRFs with model bias at each monitor (R2 = 0.47), resulting in a change in future O3 design values of up to 5 parts per billion (ppb). These results suggest that the application of the RRF methodology in Houston, TX, should be changed from using all model predictions above 85 ppb to a method that removes any predictions that are not relevant to the observed source-receptor relationship.
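
    The RRF arithmetic itself is simple; a sketch with hypothetical concentrations (the EPA guidance adds details such as day-selection thresholds near each monitor):

      # Sketch of the relative response factor (RRF) arithmetic (hypothetical values).
      base_model_o3 = [92.0, 88.0, 95.0]      # modeled high-O3 days, baseline year (ppb)
      future_model_o3 = [84.0, 82.0, 86.0]    # same days, future emissions (ppb)

      rrf = sum(future_model_o3) / sum(base_model_o3)
      observed_design_value = 91.0            # monitor's current design value (ppb)
      future_design_value = rrf * observed_design_value
      print(f"RRF = {rrf:.3f}, projected design value = {future_design_value:.1f} ppb")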

  7. Loss of Coolant Accident (LOCA) / Emergency Core Coolant System (ECCS) Evaluation of Risk-Informed Margins Management Strategies for a Representative Pressurized Water Reactor (PWR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques

    A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of outcomes.

  8. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  9. Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

    NASA Astrophysics Data System (ADS)

    Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth

    2017-12-01

    The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GPs modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
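
    A minimal usage sketch, assuming the open-source celerite Python package that implements this method (synthetic data; parameter values arbitrary):

      # Sketch of O(N) Gaussian-process likelihood evaluation with celerite
      # (assumed installed; kernel parameters are arbitrary).
      import numpy as np
      import celerite
      from celerite import terms

      np.random.seed(42)
      t = np.sort(np.random.uniform(0, 10, 200))   # unevenly spaced times
      yerr = 0.1 * np.ones_like(t)
      y = np.sin(3.0 * t) + yerr * np.random.randn(len(t))

      # A single stochastically driven, damped harmonic oscillator term.
      kernel = terms.SHOTerm(log_S0=0.0, log_Q=np.log(2.0), log_omega0=np.log(3.0))
      gp = celerite.GP(kernel)
      gp.compute(t, yerr)        # O(N) factorization rather than O(N^3)
      print("log likelihood:", gp.log_likelihood(y))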

  10. A New Lease of Life for Thomson's Bonds Model of Intelligence

    ERIC Educational Resources Information Center

    Bartholomew, David J.; Deary, Ian J.; Lawn, Martin

    2009-01-01

    Modern factor analysis is the outgrowth of Spearman's original "2-factor" model of intelligence, according to which a mental test score is regarded as the sum of a general factor and a specific factor. As early as 1914, Godfrey Thomson realized that the data did not require this interpretation and he demonstrated this by proposing what became…

  11. A Multi-Faceted Analysis of a New Therapeutic Model of Linking Appraisals to Affective Experiences.

    ERIC Educational Resources Information Center

    McCarthy, Christopher; And Others

    I. Roseman, M. Spindel, and P. Jose (1990) had previously demonstrated that specific appraisals of events led to discrete emotional responses, but this model has not been widely tested by other research teams using alternative research methods. The present study utilized four qualitative research methods, taught by Patti Lather at the 1994…

  12. How Is Self-Determination Conceptualized in Special Education Literature? A Content Analysis of Model Demonstration Projects and Special Education Literature

    ERIC Educational Resources Information Center

    Rosser, Mariola Srednicka

    2010-01-01

    This study investigates the concept of self-determination in special education. The self-determination construct in special education is often described in terms of abilities and attitudes needed to achieve one's goals (Field & Hoffman, 1994a; Ward, 1988; Powers et al., 1996). Despite many similarities among definitions and models of…

  13. A global sensitivity analysis approach for morphogenesis models.

    PubMed

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operative mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
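
    A generic Sobol-index workflow of this kind can be sketched with the SALib package (an assumption here; the abstract does not name a library) and a stand-in scalar model in place of the cellular Potts simulation:

      # Global sensitivity sketch using Sobol indices (SALib assumed installed).
      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {"num_vars": 3,
                 "names": ["adhesion", "chemotaxis", "contact_inhibition"],  # hypothetical
                 "bounds": [[0, 1], [0, 1], [0, 1]]}

      X = saltelli.sample(problem, 1024)

      def model(x):   # stand-in scalar output with an interaction term
          return x[0] + 2 * x[1] * x[2]

      Y = np.apply_along_axis(model, 1, X)
      Si = sobol.analyze(problem, Y)
      print("first-order:", Si["S1"], "total:", Si["ST"])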

  14. Development of a model system to analyze chondrogenic differentiation of mesenchymal stem cells

    PubMed Central

    Ruedel, Anke; Hofmeister, Simone; Bosserhoff, Anja-Katrin

    2013-01-01

    High-density cell culture is widely used for the analysis of cartilage development of human mesenchymal stem cells (HMSCs) in vitro. Several cell culture systems, such as micromass, pellet culture and alginate culture, are applied by groups in the field to induce chondrogenic differentiation of HMSCs. A drawback of all these model systems is the large number of cells required for the experiments. Further, handling of large experimental approaches is difficult due to culturing in, e.g., 15 ml tubes. Therefore, we aimed to develop a new model system based on “hanging drop” cultures using 10- to 100-fold fewer cells. Here, we demonstrate that differentiation of chondrogenic cells was induced as previously shown in other model systems. Real-time RT-PCR analysis demonstrated that Collagen type II and MIA/CD-RAP were upregulated during culturing, whereas induction of hypertrophic markers like Collagen type X and AP-2 epsilon required treatment with TGF beta. To further test the system, siRNA against Sox9 was used and effects on chondrogenic gene expression were evaluated. In summary, the hanging drop culture system was determined to be a promising tool for in vitro chondrogenic studies. PMID:24294400

  15. A theory of planned behaviour-based analysis of TIMSS 2011 to determine factors influencing inquiry teaching practices in high-performing countries

    NASA Astrophysics Data System (ADS)

    Pongsophon, Pongprapan; Herman, Benjamin C.

    2017-07-01

    Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching through analysing data relating to high-performing countries retrieved from the 2011 Trends in International Mathematics and Science Study assessments. Data analysis was completed through structural equation modelling using a polychoric correlation matrix for data input and diagonally weighted least squares estimation. Adequate fit of the full model to the empirical data was realised. The model demonstrates that the extent the teachers participated in academic collaborations was positively related to their occupational satisfaction, confidence in teaching inquiry, and classroom inquiry practices. Furthermore, the teachers' confidence with implementing inquiry was positively related to their classroom inquiry implementation and occupational satisfaction. However, perceived student-generated constraints demonstrated a negative relationship with the teachers' confidence with implementing inquiry and occupational satisfaction. Implications from this study include supporting teachers through promoting collaborative opportunities that facilitate inquiry-based practices and occupational satisfaction.

  16. How well do simulated last glacial maximum tropical temperatures constrain equilibrium climate sensitivity?

    NASA Astrophysics Data System (ADS)

    Hopcroft, Peter O.; Valdes, Paul J.

    2015-07-01

    Previous work demonstrated a significant correlation between tropical surface air temperature and equilibrium climate sensitivity (ECS) in PMIP (Paleoclimate Modelling Intercomparison Project) phase 2 model simulations of the last glacial maximum (LGM). This implies that reconstructed LGM cooling in this region could provide information about the climate system ECS value. We analyze results from new simulations of the LGM performed as part of Coupled Model Intercomparison Project (CMIP5) and PMIP phase 3. These results show no consistent relationship between the LGM tropical cooling and ECS. A radiative forcing and feedback analysis shows that a number of factors are responsible for this decoupling, some of which are related to vegetation and aerosol feedbacks. While several of the processes identified are LGM specific and do not impact on elevated CO2 simulations, this analysis demonstrates one area where the newer CMIP5 models behave in a qualitatively different manner compared with the older ensemble. The results imply that so-called Earth System components such as vegetation and aerosols can have a significant impact on the climate response in LGM simulations, and this should be taken into account in future analyses.

  17. Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2016-01-01

    In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.

  18. Kernel analysis of partial least squares (PLS) regression models.

    PubMed

    Shinzawa, Hideyuki; Ritthiruangdej, Pitiporn; Ozaki, Yukihiro

    2011-05-01

    An analytical technique based on kernel matrix representation is demonstrated to provide further chemically meaningful insight into partial least squares (PLS) regression models. The kernel matrix condenses essential information about scores derived from PLS or principal component analysis (PCA). Thus, it becomes possible to establish the proper interpretation of the scores. A PLS model for the total nitrogen (TN) content in multiple Thai fish sauces is built with a set of near-infrared (NIR) transmittance spectra of the fish sauce samples. The kernel analysis of the scores effectively reveals that the variation of the spectral feature induced by the change in protein content is substantially associated with the total water content and the protein hydration. Kernel analysis is also carried out on a set of time-dependent infrared (IR) spectra representing transient evaporation of ethanol from a binary mixture solution of ethanol and oleic acid. A PLS model to predict the elapsed time is built with the IR spectra and the kernel matrix is derived from the scores. The detailed analysis of the kernel matrix provides penetrating insight into the interaction between the ethanol and the oleic acid.
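
    The kernel matrix construction from PLS scores can be sketched with scikit-learn (synthetic data standing in for the NIR spectra; not the authors' code):

      # Sketch of deriving a kernel matrix from PLS scores (synthetic spectra).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(5)
      X = rng.standard_normal((30, 200))     # stand-in NIR spectra
      y = X[:, :10].sum(axis=1) + 0.1 * rng.standard_normal(30)

      pls = PLSRegression(n_components=3).fit(X, y)
      T = pls.transform(X)                   # sample scores (x-scores)
      K = T @ T.T                            # kernel matrix condensing score information
      print(K.shape)                         # (30, 30); inspect structure across samples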

  19. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    nal Aeronautics and Space Administration (NASA) have tens of thousands of networked computer systems and applications. Software Security vulnerabilities present risks such as lost or corrupted data, information the3, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Trough an Integrated Approach '' offers, among its capabilities, formal verification of software security properties, through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provide automation of the mechanical portions of the analysis process. This paper will discuss: The need for formal analysis to assure software systems with respect to software and why testing alone cannot provide it. The means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task. An example of FMF style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  20. An information-based approach to change-point analysis with applications to biophysics and cell biology.

    PubMed

    Wiggins, Paul A

    2015-07-21

    This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
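
    The closed-form flavor of the approach can be sketched for the simplest case, a single change point in Gaussian noise; here a generic AIC-style penalty stands in for the paper's frequentist information criterion, purely for illustration.

        import numpy as np

        def gauss_loglik(x):
            # Maximized Gaussian log-likelihood of one segment (closed form).
            n = len(x)
            if n < 2 or x.var() == 0:
                return 0.0
            return -0.5 * n * (np.log(2 * np.pi * x.var()) + 1.0)

        def best_changepoint(x, penalty_per_param=2.0):
            n = len(x)
            null = gauss_loglik(x)
            # Scan all admissible splits; each split adds two parameters
            # (a second mean and variance), so the gain must beat the penalty.
            scores = [gauss_loglik(x[:k]) + gauss_loglik(x[k:]) for k in range(2, n - 1)]
            k = int(np.argmax(scores)) + 2
            return k if scores[k - 2] - null > 2.0 * penalty_per_param else None

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(0, 1, 150), rng.normal(1.5, 1, 150)])
        print(best_changepoint(x))   # expected near 150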

  1. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data

    PubMed Central

    Tian, Ting; McLachlan, Geoffrey J.; Dieters, Mark J.; Basford, Kaye E.

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach in terms of hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the row with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis but were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the best overall performance. PMID:26689369
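
    A hot-deck sketch of the clustering idea is given below: each imputation draws a random subset of observed attributes, finds the nearest complete row in that subspace, and copies its values, repeated m times to yield multiple imputations. This is an illustration in the spirit of the approach, not the authors' exact algorithm.

        import numpy as np

        def impute_once(data, rng):
            filled = data.copy()
            donors = data[~np.isnan(data).any(axis=1)]   # complete rows only
            for i, row in enumerate(data):
                miss = np.isnan(row)
                if not miss.any():
                    continue
                obs = np.flatnonzero(~miss)
                # Randomly selected attributes among the observed columns.
                cols = rng.choice(obs, size=max(1, len(obs) // 2), replace=False)
                d = np.linalg.norm(donors[:, cols] - row[cols], axis=1)
                filled[i, miss] = donors[np.argmin(d), miss]  # nearest-neighbour donor
            return filled

        def multiple_impute(data, m=5, seed=0):
            rng = np.random.default_rng(seed)
            return [impute_once(data, rng) for _ in range(m)]

        rng = np.random.default_rng(2)
        Y = rng.normal(size=(30, 6))
        Y[rng.random(Y.shape) < 0.1] = np.nan            # ~10% missing entries
        imputations = multiple_impute(Y)
        print(len(imputations), np.isnan(imputations[0]).sum())   # 5 complete datasets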

  2. A Multilayer Naïve Bayes Model for Analyzing User's Retweeting Sentiment Tendency.

    PubMed

    Wang, Mengmeng; Zuo, Wanli; Wang, Ying

    2015-01-01

    Today microblogging has increasingly become a means of information diffusion via users' retweeting behavior. Because retweeted content, as contextual information, reflects how users interpret microblogs, analysis of users' retweeting sentiment tendency has gradually become a hot research topic. Targeting online microblogging, a dynamic social network, we investigate how to exploit dynamic retweeting sentiment features in retweeting sentiment tendency analysis. On the basis of time series of users' network structure information and published text information, we first model dynamic retweeting sentiment features. Then we build Naïve Bayes models from profile-, relationship-, and emotion-based dimensions, respectively. Finally, we build a multilayer Naïve Bayes model on top of these multidimensional Naïve Bayes models to analyze a user's retweeting sentiment tendency towards a microblog. Experiments on a real-world dataset demonstrate the effectiveness of the proposed framework. Further experiments are conducted to understand the importance of dynamic retweeting sentiment features and temporal information in retweeting sentiment tendency analysis. Moreover, we provide a new line of thinking for retweeting sentiment tendency analysis in dynamic social networks.
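
    The layered construction can be sketched as a simple stack: one Naïve Bayes model per feature dimension, whose predicted probabilities feed a top-layer Naïve Bayes. The three-way feature split and Gaussian likelihoods below are illustrative assumptions; the paper's features and estimators differ.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 9))
        y = (X[:, 0] + X[:, 3] + X[:, 6] > 0).astype(int)    # retweet-sentiment label

        dims = [slice(0, 3), slice(3, 6), slice(6, 9)]       # profile / relationship / emotion
        base = [GaussianNB().fit(X[:, s], y) for s in dims]  # layer 1: one NB per dimension

        # Layer 2: per-dimension P(positive) scores become meta-features.
        # (A rigorous stack would use held-out predictions here.)
        meta = np.column_stack([m.predict_proba(X[:, s])[:, 1]
                                for m, s in zip(base, dims)])
        top = GaussianNB().fit(meta, y)
        print("training accuracy:", top.score(meta, y))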

  3. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data.

    PubMed

    Tian, Ting; McLachlan, Geoffrey J; Dieters, Mark J; Basford, Kaye E

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach in terms of hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned real values from the nearest neighbour to the row with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis but were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the best overall performance.

  4. A unifying framework for marginalized random intercept models of correlated binary outcomes

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.

    2013-01-01

    We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples is given to illustrate concepts. PMID:25342871
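
    One standard identity in this area makes the conditional-marginal link concrete: for a probit link with a normal random intercept, the marginal success probability is again probit with attenuated coefficients. The short check below verifies this numerically; it illustrates the general idea rather than the paper's full copula framework.

        import numpy as np
        from scipy.stats import norm

        # With b ~ N(0, s^2) and conditional P(Y=1 | b) = Phi(eta + b),
        # the marginal probability is exactly Phi(eta / sqrt(1 + s^2)).
        eta, s = 0.8, 1.5
        b = np.random.default_rng(0).normal(0, s, 200000)
        mc = norm.cdf(eta + b).mean()                  # Monte Carlo marginal
        closed = norm.cdf(eta / np.sqrt(1 + s**2))     # closed form
        print(mc, closed)                              # agree to ~3 decimals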

  5. Scripting MODFLOW model development using Python and FloPy

    USGS Publications Warehouse

    Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.

    2016-01-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
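
    A minimal FloPy script in the spirit of the paper's introductory example is sketched below: a one-layer steady-state MODFLOW model with a fixed-head boundary and a single well, written and run from Python. It assumes a MODFLOW-2005 executable named mf2005 is on the path; grid and property values are invented.

        import numpy as np
        import flopy

        m = flopy.modflow.Modflow("demo", exe_name="mf2005")
        flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                 delr=100.0, delc=100.0, top=10.0, botm=0.0)

        ibound = np.ones((1, 10, 10), dtype=int)
        ibound[:, :, 0] = -1                       # fixed heads along the west edge
        flopy.modflow.ModflowBas(m, ibound=ibound, strt=10.0)

        flopy.modflow.ModflowLpf(m, hk=10.0)       # hydraulic conductivity
        flopy.modflow.ModflowWel(m, stress_period_data={0: [[0, 4, 4, -500.0]]})
        flopy.modflow.ModflowPcg(m)                # solver
        flopy.modflow.ModflowOc(m)                 # output control

        m.write_input()
        success, _ = m.run_model(silent=True)
        print("converged:", success)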

  6. Validation of a Scalable Solar Sailcraft

    NASA Technical Reports Server (NTRS)

    Murphy, D. M.

    2006-01-01

    The NASA In-Space Propulsion (ISP) program sponsored intensive solar sail technology and systems design, development, and hardware demonstration activities over the past three years. Efforts to validate a scalable solar sail system by functional demonstration in relevant environments, together with test-analysis correlation activities on that system, have recently been successfully completed. A review of the program is presented, with descriptions of the design, results of testing, and analytical model validations of component and assembly functional, strength, stiffness, shape, and dynamic behavior. The scaled performance of the validated system is projected to demonstrate applicability to flight demonstration and important NASA roadmap missions.

  7. Development of solar wind shock models with tensor plasma pressure for data analysis

    NASA Technical Reports Server (NTRS)

    Abraham-Shrauner, B.

    1975-01-01

    The development of solar wind shock models with tensor plasma pressure and the comparison of some of the shock models with the satellite data from Pioneer 6 through Pioneer 9 are reported. Theoretically, difficulties were found in non-turbulent fluid shock models for tensor pressure plasmas. For microscopic shock theories nonlinear growth caused by plasma instabilities was frequently not clearly demonstrated to lead to the formation of a shock. As a result no clear choice for a shock model for the bow shock or interplanetary tensor pressure shocks emerged.

  8. Analysis of volatile compounds by open-air ionization mass spectrometry.

    PubMed

    Meher, Anil Kumar; Chen, Yu-Chie

    2017-05-08

    This study demonstrates a simple method for rapid and in situ identification of volatile and endogenous compounds in culinary spice samples through mass spectrometry (MS). The method only requires a holder for a solid spice sample (2-3 mm) that is placed close to a mass spectrometer inlet, to which a high voltage is applied. Volatile species responsible for the aroma of the spice samples can be readily detected by the mass spectrometer. Sample pretreatment is not required prior to MS analysis, and no solvent is used during MS analysis. The high voltage applied to the inlet of the mass spectrometer induces the ionization of volatile compounds released from the solid spice samples. Furthermore, moisture in the air also contributes to the ionization of volatile compounds. Dried spices including cinnamon and cloves are used as model samples to demonstrate this straightforward MS analysis, which can be completed within a few seconds. Furthermore, we also demonstrate the suitability of the current method for rapid screening of cinnamon quality through detection of the presence of a hepatotoxic agent, i.e. coumarin. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
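
    The model-averaging step can be sketched as pooling posterior BMD draws in proportion to the model weights and reading the BMDL off a lower percentile. The draws and weights below are fabricated stand-ins for the MCMC output and model weights described in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)
        bmd_logistic = rng.lognormal(mean=0.0, sigma=0.3, size=10000)  # model 1 draws
        bmd_quantal = rng.lognormal(mean=0.2, sigma=0.4, size=10000)   # model 2 draws
        w_logistic = 0.6                     # posterior model weight (stand-in value)

        # BMA: pool draws with probability proportional to the model weights.
        pick = rng.random(10000) < w_logistic
        bmd_bma = np.where(pick, bmd_logistic, bmd_quantal)

        bmd = np.median(bmd_bma)
        bmdl = np.percentile(bmd_bma, 5)     # lower bound from the pooled posterior
        print(f"BMD ~ {bmd:.3f}, BMDL ~ {bmdl:.3f}")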

  10. VARIATIONS IN SEASONAL PATTERNS OF GASTROINTESTINAL INFECTIONS ALONG A RIVER

    EPA Science Inventory

    Epidemiologic analysis of waterborne diseases typically considers socio-economic, demographic, and pathogen-specific characteristics. However, hydrological parameters may need to be considered as well. Fate and transport models of watersheds have demonstrated impairment due to li...

  11. Chemistry Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1982

    1982-01-01

    Presents laboratory procedures, classroom materials/activities, and demonstrations, including: vapor pressure of liquid mixtures and Raoult's law; preparation/analysis of transition metal complexes of ethylammonium chloride; atomic structure display using a ZX81 (includes complete program listing); "pop-up" models of molecules and ions;…

  12. EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.

    ERIC Educational Resources Information Center

    Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith

    2002-01-01

    Introduces the Enviroland computer program which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)

  13. Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid

    NASA Technical Reports Server (NTRS)

    VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)

    1997-01-01

    The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small-yacht racing. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).

  14. Reproducing Kernel Particle Method in Plasticity of Pressure-Sensitive Material with Reference to Powder Forming Process

    NASA Astrophysics Data System (ADS)

    Khoei, A. R.; Samimi, M.; Azami, A. R.

    2007-02-01

    In this paper, an application of the reproducing kernel particle method (RKPM) is presented for the plasticity behavior of pressure-sensitive materials. The RKPM technique is implemented in large deformation analysis of the powder compaction process. The RKPM shape function and its derivatives are constructed by imposing the consistency conditions. The essential boundary conditions are enforced by the use of the penalty approach. The support of the RKPM shape function covers the same set of particles during powder compaction, hence no instability is encountered in the large deformation computation. A double-surface plasticity model is developed for the numerical simulation of pressure-sensitive material. The plasticity model includes a failure surface and an elliptical cap, which closes the open space between the failure surface and the hydrostatic axis. The moving cap expands in the stress space according to a specified hardening rule. The cap model is presented within the framework of large deformation RKPM analysis in order to predict the non-uniform relative density distribution during powder die pressing. Numerical computations are performed to demonstrate the applicability of the algorithm in modeling of powder forming processes, and the results are compared to those obtained from finite element simulation to demonstrate the accuracy of the proposed model.
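
    The consistency-condition construction mentioned above can be shown in one dimension: a cubic-spline kernel is corrected by a moment matrix so the resulting shape functions reproduce constant and linear fields exactly. This is a 1D illustration only; the paper's implementation targets large-deformation compaction problems.

        import numpy as np

        def cubic_spline(z):
            z = np.abs(z)
            return np.where(z < 0.5, 2/3 - 4*z**2 + 4*z**3,
                   np.where(z < 1.0, 4/3 - 4*z + 4*z**2 - (4/3)*z**3, 0.0))

        def rkpm_shape(x, nodes, a):
            # Shape functions of all nodes at point x, with linear consistency.
            u = x - nodes
            phi = cubic_spline(u / a)                     # kernel values
            H = np.vstack([np.ones_like(u), u])           # linear basis [1, u]
            M = (H * phi) @ H.T                           # moment matrix
            c = np.linalg.solve(M, np.array([1.0, 0.0]))  # correction coefficients
            return (c @ H) * phi

        nodes = np.linspace(0.0, 1.0, 11)
        psi = rkpm_shape(0.37, nodes, a=0.25)
        # Consistency check: partition of unity and linear reproduction.
        print(psi.sum(), (psi * nodes).sum())             # -> ~1.0 and ~0.37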

  15. Binary Population and Spectral Synthesis Version 2.1: Construction, Observational Verification, and New Results

    NASA Astrophysics Data System (ADS)

    Eldridge, J. J.; Stanway, E. R.; Xiao, L.; McClelland, L. A. S.; Taylor, G.; Ng, M.; Greis, S. M. L.; Bray, J. C.

    2017-11-01

    The Binary Population and Spectral Synthesis suite of binary stellar evolution models and synthetic stellar populations provides a framework for the physically motivated analysis of both the integrated light from distant stellar populations and the detailed properties of those nearby. We present a new version 2.1 data release of these models, detailing the methodology by which Binary Population and Spectral Synthesis incorporates binary mass transfer and its effect on stellar evolution pathways, as well as the construction of simple stellar populations. We present key tests of the latest Binary Population and Spectral Synthesis model suite, demonstrating its ability to reproduce the colours and derived properties of resolved stellar populations, including well-constrained eclipsing binaries. We consider observational constraints on the ratio of massive star types and the distribution of stellar remnant masses. We describe the identification of supernova progenitors in our models, and demonstrate good agreement with the properties of observed progenitors. We also test our models against photometric and spectroscopic observations of unresolved stellar populations, both in the local and distant Universe, finding that binary models provide a self-consistent explanation for observed galaxy properties across a broad redshift range. Finally, we carefully describe the limitations of our models, and areas where we expect to see significant improvement in future versions.

  16. Formulation of the nonlinear analysis of shell-like structures, subjected to time-dependent mechanical and thermal loading

    NASA Technical Reports Server (NTRS)

    Simitses, George J.; Carlson, Robert L.; Riff, Richard

    1991-01-01

    The object of the research reported herein was to develop a general mathematical model and solution methodologies for analyzing the structural response of thin, metallic shell structures under large transient, cyclic, or static thermomechanical loads. Among the system responses associated with these loads and conditions are thermal buckling, creep buckling, and ratcheting. Thus geometric and material nonlinearities (of high order) can be anticipated and must be considered in developing the mathematical model. The methodology is demonstrated through problems involving extension, shear, and planar curved beams. Moreover, the importance of including large strains is clearly demonstrated through the chosen applications.

  17. Interactive computer graphics and its role in control system design of large space structures

    NASA Technical Reports Server (NTRS)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems that maintain the attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model, such as modeling the dynamics, modal analysis, and control system design methodology, are reviewed, and the need for interactive computer graphics is demonstrated. Typical constituent parts of large space structures, such as free-free beams and free-free plates, are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.

  18. Face Hallucination with Linear Regression Model in Semi-Orthogonal Multilinear PCA Method

    NASA Astrophysics Data System (ADS)

    Asavaskulkiet, Krissada

    2018-04-01

    In this paper, we propose a new face hallucination technique: face image reconstruction in HSV color space with a semi-orthogonal multilinear principal component analysis (SO-MPCA) method. This novel hallucination technique can operate directly on tensors via tensor-to-vector projection by imposing the orthogonality constraint in only one mode. In our experiments, we use facial images from the FERET database to test our hallucination approach in extensive experiments with high-quality hallucinated color faces. The experimental results clearly demonstrate that we can generate photorealistic color face images by using the SO-MPCA subspace with a linear regression model.

  19. Quasi-Linear Vacancy Dynamics Modeling and Circuit Analysis of the Bipolar Memristor

    PubMed Central

    Abraham, Isaac

    2014-01-01

    The quasi-linear transport equation is investigated for modeling the bipolar memory resistor. The solution accommodates vacancy and circuit level perspectives on memristance. For the first time in the literature, the component resistors that constitute the contemporary dual variable resistor circuit model are quantified using vacancy parameters and derived from a governing partial differential equation. The model describes known memristor dynamics even as it generates new insight about vacancy migration and bottlenecks to switching speed, and elucidates subtle relationships between switching resistance range and device parameters. The model is shown to comply with Chua's generalized equations for the memristor. Independent experimental results are used throughout to validate the insights obtained from the model. The paper concludes by implementing a memristor-capacitor filter and comparing its performance to a reference resistor-capacitor filter to demonstrate that the model is usable for practical circuit analysis. PMID:25390634

  20. A Study of Upgraded Phenolic Curing for RSRM Nozzle Rings

    NASA Technical Reports Server (NTRS)

    Smartt, Ziba

    2000-01-01

    A thermochemical cure model for predicting temperature and degree of cure profiles in curing phenolic parts was developed, validated and refined over several years. The model supports optimization of cure cycles and allows input of properties based upon the types of material and the process by which these materials are used to make nozzle components. The model has been refined to use sophisticated computer graphics to demonstrate the changes in temperature and degree of cure during the curing process. The effort discussed in the paper will be the conversion from an outdated solid modeling input program and SINDA analysis code to an integrated solid modeling and analysis package (I-DEAS solid model and TMG). Also discussed will be the incorporation of updated material properties obtained during full scale curing tests into the cure models and the results for all the Reusable Solid Rocket Motor (RSRM) nozzle rings.
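
    The kind of kinetics such a cure model integrates can be sketched with an nth-order Arrhenius rate for the degree of cure under a prescribed temperature ramp; all parameter values below are invented for illustration and are not RSRM phenolic data.

        import numpy as np
        from scipy.integrate import solve_ivp

        A, E, n = 1.0e7, 8.0e4, 1.5   # pre-exponential [1/s], activation energy [J/mol], order
        R = 8.314                     # gas constant [J/(mol K)]

        def temperature(t):
            # 1 K/min ramp from 300 K, then hold at 450 K.
            return min(300.0 + t / 60.0, 450.0)

        def rate(t, alpha):
            # nth-order cure kinetics: d(alpha)/dt = A exp(-E/RT) (1 - alpha)^n
            T = temperature(t)
            return A * np.exp(-E / (R * T)) * max(1.0 - alpha[0], 0.0) ** n

        sol = solve_ivp(rate, (0.0, 6 * 3600.0), [0.0], max_step=60.0)
        print(f"degree of cure after 6 h: {sol.y[0, -1]:.3f}")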

  1. Adaptive non-interventional heuristics for covariation detection in causal induction: model comparison and rational analysis.

    PubMed

    Hattori, Masasi; Oaksford, Mike

    2007-09-10

    In this article, 41 models of covariation detection from 2 × 2 contingency tables were evaluated against past data in the literature and against data from new experiments. A new model was also included based on a limiting case of the normative phi-coefficient under an extreme rarity assumption, which has been shown to be an important factor in covariation detection (McKenzie & Mikkelsen, 2007) and data selection (Hattori, 2002; Oaksford & Chater, 1994, 2003). The results were supportive of the new model. To investigate its explanatory adequacy, a rational analysis using two computer simulations was conducted. These simulations revealed the environmental conditions and the memory restrictions under which the new model best approximates the normative model of covariation detection in these tasks. They thus demonstrated the adaptive rationality of the new model. 2007 Cognitive Science Society, Inc.
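
    The limiting-case construction can be made concrete: the normative phi coefficient of a 2 × 2 table reduces, as the both-absent cell grows large (the rarity assumption), to an index that depends only on the three remaining cells. The small check below illustrates the convergence; it is a sketch of the general idea, not the paper's full model.

        import math

        def phi(a, b, c, d):
            # Cells: a = both present, b and c = exactly one present, d = both absent.
            return (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

        def rarity_limit(a, b, c):
            # Limit of phi as d -> infinity: the geometric mean of
            # P(effect | cause) and P(cause | effect).
            return a / math.sqrt((a + b) * (a + c))

        print(phi(20, 5, 5, 1000))     # ~0.795
        print(rarity_limit(20, 5, 5))  # 0.8, nearly equal once d is large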

  2. Quasi-linear vacancy dynamics modeling and circuit analysis of the bipolar memristor.

    PubMed

    Abraham, Isaac

    2014-01-01

    The quasi-linear transport equation is investigated for modeling the bipolar memory resistor. The solution accommodates vacancy and circuit level perspectives on memristance. For the first time in the literature, the component resistors that constitute the contemporary dual variable resistor circuit model are quantified using vacancy parameters and derived from a governing partial differential equation. The model describes known memristor dynamics even as it generates new insight about vacancy migration and bottlenecks to switching speed, and elucidates subtle relationships between switching resistance range and device parameters. The model is shown to comply with Chua's generalized equations for the memristor. Independent experimental results are used throughout to validate the insights obtained from the model. The paper concludes by implementing a memristor-capacitor filter and comparing its performance to a reference resistor-capacitor filter to demonstrate that the model is usable for practical circuit analysis.

  3. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that has features to model the decay phenomena along the flow and across the flow, with the capability to introduce a phase delay. The analytical results are compared using two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), and with experimentally observed strain gage data. The computer code NESSUS, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis. A fatigue code was used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, convection velocity coefficient, stress concentration factor, structural damping, and thickness of the inner and outer vanes. The need for an appropriate correlation model in addition to the magnitude of the PSD is emphasized. The study demonstrates that correlation characteristics, even under random pressure loads, are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the new redesign is less than the endurance limit for the material, the damage due to high-cycle fatigue is negligible.

  4. Goodness-Of-Fit Test for Nonparametric Regression Models: Smoothing Spline ANOVA Models as Example.

    PubMed

    Teran Hidalgo, Sebastian J; Wu, Michael C; Engel, Stephanie M; Kosorok, Michael R

    2018-06-01

    Nonparametric regression models do not require the specification of the functional form between the outcome and the covariates. Despite their popularity, the number of diagnostic statistics available for them, in comparison to their parametric counterparts, is small. We propose a goodness-of-fit test for nonparametric regression models with linear smoother form. In particular, we apply this testing framework to smoothing spline ANOVA models. The test can consider two sources of lack-of-fit: whether covariates that are not currently in the model need to be included, and whether the current model fits the data well. The proposed method derives estimated residuals from the model. Then, statistical dependence is assessed between the estimated residuals and the covariates using the Hilbert-Schmidt independence criterion (HSIC). If dependence exists, the model does not capture all the variability in the outcome associated with the covariates; otherwise the model fits the data well. The bootstrap is used to obtain p-values. Application of the method is demonstrated with a neonatal mental development data analysis. We demonstrate correct type I error as well as power performance through simulations.
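
    The residual-covariate dependence check at the heart of the test can be sketched with a biased empirical HSIC estimate using Gaussian kernels; the bootstrap calibration of p-values described in the abstract is omitted here.

        import numpy as np

        def rbf_gram(x, sigma=1.0):
            d2 = (x[:, None] - x[None, :]) ** 2
            return np.exp(-d2 / (2.0 * sigma**2))

        def hsic(x, y):
            # Biased empirical HSIC: trace(K H L H) / (n - 1)^2.
            n = len(x)
            K, L = rbf_gram(x), rbf_gram(y)
            H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
            return np.trace(K @ H @ L @ H) / (n - 1) ** 2

        rng = np.random.default_rng(0)
        z = rng.normal(size=200)                            # a covariate
        resid_dep = z**2 - 1 + 0.1 * rng.normal(size=200)   # residuals that depend on z
        resid_ind = rng.normal(size=200)                    # residuals independent of z
        print(hsic(z, resid_dep), hsic(z, resid_ind))       # first value is clearly larger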

  5. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

    The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification has typically been based on deterministic process conceptualization, which uses a single model to represent a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring this model uncertainty in process identification may lead to bias, in that processes identified as important may not be so in the real world. This study addresses the problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
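
    The variance-change idea can be sketched with a toy system of two processes, each with two alternative models: the first-order index of a process is the variance, across its conceptualizations, of the conditional mean output, divided by the total output variance. Equal model probabilities are assumed for simplicity, and the toy models are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        recharge = [lambda p: 0.20 * p, lambda p: 0.15 * p + 0.05]   # two recharge models
        geology = [lambda k: k, lambda k: 1.3 * k]                   # two parameterizations

        def head(r, g, n=20000):
            # Toy output combining both processes with parameter noise.
            p = rng.uniform(0.5, 1.5, n)      # recharge parameter
            k = rng.lognormal(0.0, 0.3, n)    # conductivity parameter
            return recharge[r](p) / geology[g](k)

        samples = {(r, g): head(r, g) for r in range(2) for g in range(2)}
        total = np.concatenate(list(samples.values()))

        # Conditional mean output with the recharge process fixed at model r,
        # averaged over the alternative parameterizations (equal weights).
        cond = [np.mean([samples[r, g].mean() for g in range(2)]) for r in range(2)]
        S_recharge = np.var(cond) / total.var()
        print(f"first-order process index, recharge: {S_recharge:.3f}")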

  6. Mindtagger: A Demonstration of Data Labeling in Knowledge Base Construction.

    PubMed

    Shin, Jaeho; Ré, Christopher; Cafarella, Michael

    2015-08-01

    End-to-end knowledge base construction systems using statistical inference are enabling more people to automatically extract high-quality domain-specific information from unstructured data. As a result of deploying the DeepDive framework across several domains, we found new challenges in debugging and improving such end-to-end systems to construct high-quality knowledge bases. DeepDive has an iterative development cycle in which users improve the data. To help our users, we needed to develop principles for analyzing the system's error as well as provide tooling for inspecting and labeling various data products of the system. We created guidelines for error analysis modeled after our colleagues' best practices, in which data labeling plays a critical role in every step of the analysis. To enable more productive and systematic data labeling, we created Mindtagger, a versatile tool that can be configured to support a wide range of tasks. In this demonstration, we show in detail what data labeling tasks are modeled in our error analysis guidelines and how each of them is performed using Mindtagger.

  7. High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis

    PubMed Central

    Daye, Z. John; Chen, Jinbo; Li, Hongzhe

    2011-01-01

    We consider the problem of high-dimensional regression under non-constant error variances. Despite being a common phenomenon in biological applications, heteroscedasticity has, so far, been largely ignored in high-dimensional analysis of genomic data sets. We propose a new methodology that allows non-constant error variances for high-dimensional estimation and model selection. Our method incorporates heteroscedasticity by simultaneously modeling both the mean and variance components via a novel doubly regularized approach. Extensive Monte Carlo simulations indicate that our proposed procedure can result in better estimation and variable selection than existing methods when heteroscedasticity arises from the presence of predictors explaining error variances and outliers. Further, we demonstrate the presence of heteroscedasticity in and apply our method to an expression quantitative trait loci (eQTL) study of 112 yeast segregants. The new procedure can automatically account for heteroscedasticity in identifying the eQTLs that are associated with gene expression variations and lead to smaller prediction errors. These results demonstrate the importance of considering heteroscedasticity in eQTL data analysis. PMID:22547833

  8. Determination of the botanical origin of honey by front-face synchronous fluorescence spectroscopy.

    PubMed

    Lenhardt, Lea; Zeković, Ivana; Dramićanin, Tatjana; Dramićanin, Miroslav D; Bro, Rasmus

    2014-01-01

    Front-face synchronous fluorescence spectroscopy combined with chemometrics is used to classify honey samples according to their botanical origin. Synchronous fluorescence spectra of three monofloral (linden, sunflower, and acacia), polyfloral (meadow mix), and fake (fake acacia and linden) honey types (109 samples) were collected in an excitation range of 240-500 nm for synchronous wavelength intervals of 30-300 nm. Chemometric analysis of the gathered data included principal component analysis and partial least squares discriminant analysis. Mean cross-validated classification errors of 0.2 and 4.8% were found for a model that accounts only for monofloral samples and for a model that includes both the monofloral and polyfloral groups, respectively. The results demonstrate that single synchronous fluorescence spectra of different honeys differ significantly because of their distinct physical and chemical characteristics and provide sufficient data for the clear differentiation among honey groups. The spectra of fake honey samples showed pronounced differences from those of genuine honey, and these samples are easily recognized on the basis of their synchronous fluorescence spectra. The study demonstrated that this method is a valuable and promising technique for honey authentication.

  9. PSAMM: A Portable System for the Analysis of Metabolic Models

    PubMed Central

    Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying

    2016-01-01

    The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591

  10. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.

  11. Multiscale Analysis of Delamination of Carbon Fiber-Epoxy Laminates with Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Riddick, Jaret C.; Frankland, SJV; Gates, TS

    2006-01-01

    A multi-scale analysis is presented to parametrically describe the Mode I delamination of a carbon fiber/epoxy laminate. In the midplane of the laminate, carbon nanotubes are included for the purpose of selectively enhancing the fracture toughness of the laminate. To analyze the carbon fiber/epoxy/carbon nanotube laminate, the multi-scale methodology presented here links a series of parameterizations taken at various length scales ranging from the atomistic through the micromechanical to the structural level. At the atomistic scale, molecular dynamics simulations are performed in conjunction with an equivalent continuum approach to develop constitutive properties for representative volume elements of the molecular structure of components of the laminate. The molecular-level constitutive results are then used in Mori-Tanaka micromechanics to develop bulk properties for the epoxy-carbon nanotube matrix system. In order to demonstrate a possible application of this multi-scale methodology, a double cantilever beam (DCB) specimen is modeled. An existing analysis is employed which uses discrete springs to model the fiber bridging effect during delamination propagation. In the absence of empirical data or a damage mechanics model describing the effect of CNTs on fracture toughness, several traction laws are postulated, linking CNT volume fraction to fiber bridging in a DCB specimen. Results from this demonstration are presented in terms of DCB specimen load-displacement responses.

  12. Multivariable harmonic balance analysis of the neuronal oscillator for leech swimming.

    PubMed

    Chen, Zhiyong; Zheng, Min; Friesen, W Otto; Iwasaki, Tetsuya

    2008-12-01

    Biological systems, and particularly neuronal circuits, embody a very high level of complexity. Mathematical modeling is therefore essential for understanding how large sets of neurons with complex multiple interconnections work as a functional system. With the increase in computing power, it is now possible to numerically integrate a model with many variables to simulate behavior. However, such analysis can be time-consuming and may not reveal the mechanisms underlying the observed phenomena. An alternative, complementary approach is mathematical analysis, which can demonstrate direct and explicit relationships between a property of interest and system parameters. This paper introduces a mathematical tool for analyzing neuronal oscillator circuits based on multivariable harmonic balance (MHB). The tool is applied to a model of the central pattern generator (CPG) for leech swimming, which comprises a chain of weakly coupled segmental oscillators. The results demonstrate the effectiveness of the MHB method and provide analytical explanations for some CPG properties. In particular, the intersegmental phase lag is estimated to be the sum of a nominal value and a perturbation, where the former depends on the structure and span of the neuronal connections and the latter is roughly proportional to the period gradient, communication delay, and the reciprocal of the intersegmental coupling strength.

  13. Advanced Nacelle Acoustic Lining Concepts Development

    NASA Technical Reports Server (NTRS)

    Bielak, G.; Gallman, J.; Kunze, R.; Murray, P.; Premo, J.; Kosanchick, M.; Hersh, A.; Celano, J.; Walker, B.; Yu, J.; hide

    2002-01-01

    The work reported in this document consisted of six distinct liner technology development subtasks: 1) Analysis of Model Scale ADP Fan Duct Lining Data (Boeing): An evaluation of an AST Milestone experiment to demonstrate 1995 liner technology superiority relative to that of 1992 was performed on 1:5.9 scale model fan rig (Advanced Ducted Propeller) test data acquired in the NASA Glenn 9 x 15 foot wind tunnel. The goal of 50% improvement was deemed satisfied. 2) Bias Flow Liner Investigation (Boeing, VCES): The ability to control liner impedance by low velocity bias flow through liner was demonstrated. An impedance prediction model to include bias flow was developed. 3) Grazing Flow Impedance Testing (Boeing): Grazing flow impedance tests were conducted for comparison with results achieved at four different laboratories. 4) Micro-Perforate Acoustic Liner Technology (BFG, HAE, NG): Proof of concept testing of a "linear liner." 5) Extended Reaction Liners (Boeing, NG): Bandwidth improvements for non-locally reacting liner were investigated with porous honeycomb core test liners. 6) Development of a Hybrid Active/Passive Lining Concept (HAE): Synergism between active and passive attenuation of noise radiated by a model inlet was demonstrated.

  14. A Study of Fundamental Shock Noise Mechanisms

    NASA Technical Reports Server (NTRS)

    Meadows, Kristine R.

    1997-01-01

    This paper investigates two mechanisms fundamental to sound generation in shocked flows: shock motion and shock deformation. Shock motion is modeled numerically by examining the interaction of a sound wave with a shock. This numerical approach is validated by comparison with results obtained by linear theory for a small-disturbance case. Analysis of the perturbation energy with Myers' energy corollary demonstrates that acoustic energy is generated by the interaction of acoustic disturbances with shocks. This analysis suggests that shock motion generates acoustic and entropy disturbance energy. Shock deformation is modeled numerically by examining the interaction of a vortex ring with a shock. These numerical simulations demonstrate the generation of both an acoustic wave and contact surfaces. The acoustic wave spreads cylindrically. The sound intensity is highly directional and the sound pressure increases with increasing shock strength. The numerically determined relationship between the sound pressure and the Mach number is found to be consistent with experimental observations of shock noise. This consistency implies that a dominant physical process in the generation of shock noise is modeled in this study.

  15. Can biomechanical variables predict improvement in crouch gait?

    PubMed Central

    Hicks, Jennifer L.; Delp, Scott L.; Schwartz, Michael H.

    2011-01-01

    Many patients respond positively to treatments for crouch gait, yet surgical outcomes are inconsistent and unpredictable. In this study, we developed a multivariable regression model to determine if biomechanical variables and other subject characteristics measured during a physical exam and gait analysis can predict which subjects with crouch gait will demonstrate improved knee kinematics on a follow-up gait analysis. We formulated the model and tested its performance by retrospectively analyzing 353 limbs of subjects who walked with crouch gait. The regression model was able to predict which subjects would demonstrate ‘improved’ and ‘unimproved’ knee kinematics with over 70% accuracy, and was able to explain approximately 49% of the variance in subjects’ change in knee flexion between gait analyses. We found that improvement in stance phase knee flexion was positively associated with three variables that were drawn from knowledge about the biomechanical contributors to crouch gait: i) adequate hamstrings lengths and velocities, possibly achieved via hamstrings lengthening surgery, ii) normal tibial torsion, possibly achieved via tibial derotation osteotomy, and iii) sufficient muscle strength. PMID:21616666

  16. H-aggregate analysis of P3HT thin films-Capability and limitation of photoluminescence and UV/Vis spectroscopy.

    PubMed

    Ehrenreich, Philipp; Birkhold, Susanne T; Zimmermann, Eugen; Hu, Hao; Kim, Kwang-Dae; Weickert, Jonas; Pfadler, Thomas; Schmidt-Mende, Lukas

    2016-09-01

    Polymer morphology and aggregation play an essential role for efficient charge carrier transport and charge separation in polymer-based electronic devices. It is a common method to apply the H-aggregate model to UV/Vis or photoluminescence spectra in order to analyze polymer aggregation. In this work we present strategies to obtain reliable and conclusive information on polymer aggregation and morphology based on the application of an H-aggregate analysis on UV/Vis and photoluminescence spectra. We demonstrate, with P3HT as model system, that thickness dependent reflection behavior can lead to misinterpretation of UV/Vis spectra within the H-aggregate model. Values for the exciton bandwidth can deviate by a factor of two for polymer thicknesses below 150 nm. In contrast, photoluminescence spectra are found to be a reliable basis for characterization of polymer aggregation due to their weaker dependence on the wavelength dependent refractive index of the polymer. We demonstrate this by studying the influence of surface characteristics on polymer aggregation for spin-coated thin-films that are commonly used in organic and hybrid solar cells.
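
    For orientation, the weak-coupling H-aggregate relation commonly applied to P3HT (the Spano model) extracts the free-exciton bandwidth W from the absorbance ratio of the 0-0 and 0-1 vibronic peaks; quoting the widely used form, with E_p the effective phonon energy (about 0.18 eV for the C=C stretch):

        \[
          \frac{A_{0\text{-}0}}{A_{0\text{-}1}} \approx
          \left( \frac{1 - 0.24\, W/E_p}{1 + 0.073\, W/E_p} \right)^{2}
        \]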

  17. Velocity analysis of simultaneous-source data using high-resolution semblance—coping with the strong noise

    NASA Astrophysics Data System (ADS)

    Gan, Shuwei; Wang, Shoudong; Chen, Yangkang; Qu, Shan; Zu, Shaohuan

    2016-02-01

    Direct imaging of simultaneous-source (or blended) data, without the need for deblending, requires a precise subsurface velocity model. In this paper, we focus on the velocity analysis of simultaneous-source data using the normal moveout-based velocity picking approach. We demonstrate that it is possible to obtain a precise velocity model directly from the blended data in the common-midpoint domain. The similarity-weighted semblance can help us obtain a much better velocity spectrum, with higher resolution and higher reliability, compared with the traditional semblance. The similarity-weighted semblance enforces an inherent noise attenuation solely in the semblance calculation stage; thus it is not sensitive to intense interference. We use both simulated synthetic and field data examples to demonstrate the performance of the similarity-weighted semblance in obtaining a reliable subsurface velocity model for direct migration of simultaneous-source data. The migrated image of blended field data using the prestack Kirchhoff time migration approach, based on the velocity picked from the similarity-weighted semblance, is very close to the migrated image of unblended data.
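
    The scanning computation behind such velocity spectra can be sketched as follows; the trace weighting here (correlation with the stacked reference) is a simplified stand-in for the paper's similarity-weighted semblance, shown only to convey the idea.

        import numpy as np

        def nmo_correct(gather, offsets, dt, v):
            # Flatten hyperbolic moveout t(x) = sqrt(t0^2 + (x/v)^2) at velocity v.
            nt = gather.shape[0]
            t0 = np.arange(nt) * dt
            out = np.zeros_like(gather)
            for i, x in enumerate(offsets):
                idx = np.rint(np.sqrt(t0**2 + (x / v) ** 2) / dt).astype(int)
                ok = idx < nt
                out[ok, i] = gather[idx[ok], i]
            return out

        def semblance(gather, offsets, dt, v, weighted=False):
            d = nmo_correct(gather, offsets, dt, v)
            if weighted:
                ref = d.mean(axis=1)   # stacked reference trace
                w = np.array([np.corrcoef(ref, tr)[0, 1] if tr.std() > 0 else 0.0
                              for tr in d.T])
                d = d * np.clip(w, 0.0, None)
            num = (d.sum(axis=1) ** 2).sum()
            den = d.shape[1] * (d**2).sum()
            return num / den if den > 0 else 0.0

        # Synthetic one-event gather with true velocity 2000 m/s.
        dt, offsets, nt = 0.004, np.arange(0.0, 1000.0, 100.0), 500
        gather = np.zeros((nt, len(offsets)))
        for i, x in enumerate(offsets):
            gather[int(np.sqrt(0.8**2 + (x / 2000.0) ** 2) / dt), i] = 1.0

        vels = np.arange(1500.0, 2600.0, 100.0)
        print(vels[np.argmax([semblance(gather, offsets, dt, v) for v in vels])])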

  18. First-Order Model Management With Variable-Fidelity Physics Applied to Multi-Element Airfoil Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, N. M.; Nielsen, E. J.; Lewis, R. M.; Anderson, W. K.

    2000-01-01

    First-order approximation and model management is a methodology for the systematic use of variable-fidelity models or approximations in optimization. The intent of model management is to attain convergence to high-fidelity solutions with minimal expense in high-fidelity computations. The savings in terms of computationally intensive evaluations depend on the ability of the available lower-fidelity model, or a suite of models, to predict the improvement trends for the high-fidelity problem. Variable-fidelity models can be represented by data-fitting approximations, variable-resolution models, variable-convergence models, or variable physical fidelity models. The present work considers the use of variable-fidelity physics models. We demonstrate the performance of model management on an aerodynamic optimization of a multi-element airfoil designed to operate in the transonic regime. Reynolds-averaged Navier-Stokes equations represent the high-fidelity model, while the Euler equations represent the low-fidelity model. An unstructured mesh-based analysis code, FUN2D, evaluates functions and sensitivity derivatives for both models. Model management for the present demonstration problem yields fivefold savings in terms of high-fidelity evaluations compared to optimization done with high-fidelity computations alone.
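
    One common way to achieve the first-order consistency that such model management relies on is an additive correction that forces the low-fidelity model to match the high-fidelity value and gradient at the current iterate; the 1-D functions below are invented stand-ins for the two flow solvers.

        import numpy as np

        # Toy 1-D stand-ins for the high- and low-fidelity analyses.
        f_hi = lambda x: np.sin(3 * x) + x**2
        f_lo = lambda x: x**2                       # misses the oscillatory physics
        g_hi = lambda x: 3 * np.cos(3 * x) + 2 * x  # high-fidelity gradient
        g_lo = lambda x: 2 * x                      # low-fidelity gradient

        def corrected(x, x0):
            # Additive first-order correction: matches f_hi and its gradient
            # at x0, the consistency needed for provable trust-region convergence.
            return (f_lo(x) + (f_hi(x0) - f_lo(x0))
                    + (g_hi(x0) - g_lo(x0)) * (x - x0))

        x0 = 0.4
        for dx in (0.0, 0.05, 0.2):
            x = x0 + dx
            print(dx, abs(corrected(x, x0) - f_hi(x)))   # error grows with step size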

  19. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules.
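
    The algebraic core of t-invariant computation can be sketched directly: a t-invariant is a nonnegative vector x with C·x = 0 for the place-by-transition incidence matrix C. The toy two-transition cycle below has a nonnegative null-space basis, so a plain null-space computation suffices; general nets (and the feasibility check) need dedicated algorithms.

        from sympy import Matrix

        # Toy cycle: transition t1 moves a token p1 -> p2, t2 moves it back.
        C = Matrix([[-1,  1],   # row for place p1
                    [ 1, -1]])  # row for place p2

        invariants = []
        for v in C.nullspace():
            v = v / min(x for x in v if x != 0)   # scale smallest nonzero entry to 1
            if all(x >= 0 for x in v):            # keep only nonnegative solutions
                invariants.append(list(v))

        print(invariants)   # [[1, 1]]: firing t1 then t2 reproduces the marking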

  20. Application of Petri net based analysis techniques to signal transduction pathways

    PubMed Central

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-01-01

    Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. Conclusion The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules. PMID:17081284
